[ref. v52628201] Technical Specialist 4

Startekk LLC – Columbus, OH
Note:
Location: Columbus, OH
C2C: Yes
Duration: 12+ Months
Primary Skill: SFDC

Visa: No OPT or CPT candidates

Job Description
The contract manager may change the hybrid work status based on criticality, prioritization, and project deadlines.
The Technical Specialist will be responsible for migrating the current data, framework, and programs from the ODM EDW IOP Big data environment to the ODM EDW Snowflake environment. The Technical Specialist will also be involved in Medicaid Enterprise Data Warehouse design, development, implementation, migration, maintenance, and operation activities.

Works closely with the Data Governance and Analytics team. Will be one of the key technical resources for ingesting data into the ODM EDW Snowflake environment and for building new, or supporting existing, Data Warehouses and Data Marts for data analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and works closely with the Business Intelligence & Data Analytics team.

Responsibilities:
Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.

Provide Snowflake database technical support in developing reliable, efficient, and scalable solutions for various projects on Snowflake.
Ingest the existing data, framework, and programs from the ODM EDW IOP Big data environment to the ODM EDW Snowflake environment using best practices.
Design and develop Snowpark features in Python; understand the requirements and iterate.
Interface with the open-source community and contribute to Snowflake's open-source libraries, including Snowpark Python and the Snowflake Python Connector.
Create, monitor, and maintain role-based access controls, virtual warehouses, Tasks, Snowpipe, and Streams on Snowflake databases to support different use cases.
Performance-tune Snowflake queries and procedures; recommend and document Snowflake best practices.
Explore new capabilities of Snowflake, perform POCs, and implement them based on business requirements.
Create and maintain Snowflake technical documentation, ensuring compliance with data governance and security policies.
Implement Snowflake user/query log analysis, history capture, and user email alert configuration.
Enable data governance in Snowflake, including row/column-level data security using secure views and dynamic data masking features.
Perform data analysis, data profiling, data quality checks, and data ingestion in various layers using big data/Hadoop/Hive/Impala queries, PySpark programs, and UNIX shell scripts.
Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.
Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
Create mock-up data, perform unit testing, and capture result sets for jobs developed in lower environments.
Update the production support runbook and Control-M schedule document for each production release.
Create and update design documents, providing detailed descriptions of workflows after every production release.
Continuously monitor production data loads, fix issues, update the issue tracker document, and identify performance issues.
Performance-tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and other standard approaches.
Perform quality assurance checks and post-load reconciliation, and communicate with the vendor to receive corrected data.
Participate in ETL/ELT code reviews and design reusable frameworks.
Create change requests, work plans, test results, and BCAB checklist documents for code deployment to the production environment, and perform post-deployment code validation.
Work with the Snowflake Admin, Hadoop Admin, ETL, and SAS admin teams on code deployments and health checks.
Create a reusable Audit Balance Control framework to capture reconciliation, mapping parameters, and variables, serving as a single point of reference for workflows.
Create Snowpark and PySpark programs to ingest historical and incremental data.
Create SQOOP scripts to ingest historical data from the EDW Oracle database to Hadoop IOP; create HIVE table and Impala view creation scripts for dimension tables.
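As an illustration of the reconciliation and Audit Balance Control duties above, the sketch below shows a minimal post-load row-count check in plain Python. All names here (reconcile_counts, the sample table counts) are hypothetical, not taken from the posting; a real framework would pull counts from the source and target databases and persist the report.

```python
def reconcile_counts(source_counts, target_counts):
    """Compare per-table row counts between source and target environments.

    Returns a list of (table, source_rows, target_rows, status) tuples,
    where status is "MATCH" or "MISMATCH".
    """
    report = []
    # Union of table names so tables missing on either side still get flagged.
    for table in sorted(set(source_counts) | set(target_counts)):
        src = source_counts.get(table, 0)
        tgt = target_counts.get(table, 0)
        status = "MATCH" if src == tgt else "MISMATCH"
        report.append((table, src, tgt, status))
    return report

# Hypothetical counts captured from the legacy and the new environment.
source = {"claims": 1_200_000, "providers": 45_210}
target = {"claims": 1_200_000, "providers": 45_209}

for table, src, tgt, status in reconcile_counts(source, target):
    print(f"{table}: source={src} target={tgt} -> {status}")
```

In practice the mismatch rows would drive the vendor communication step described above: only tables with a "MISMATCH" status need corrected data.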

Participate in meetings to continuously upgrade functional and technical expertise.

Required Skill Sets:
Proficiency in Data Warehousing, data migration, and Snowflake is essential for this role.

Strong experience in the implementation, execution, and maintenance of Data Integration technology solutions.
Minimum 4-6 years of hands-on experience with cloud databases.

Minimum 2-3 years of hands-on data migration experience from a Big data environment to a Snowflake environment.
