Data Engineer (ETL Expert)
Robert Half Tampa
Robert Half has a brand new opening for a Data Engineer (ETL Expert) with a client here in Tampa, FL.
Position is a full-time, 100% REMOTE ongoing contract. Compensation is flexible at $50-65/hr, depending on experience.
Interviews are actively being scheduled - Apply NOW!
Top Skill Sets Needed:
- Expertise in ETL Processes: Extensive experience in designing, building, and optimizing ETL pipelines to extract, transform, and load data from various sources, ensuring seamless data flow and reliability across the organization.
- Expertise in SQL and SQL Server: Advanced proficiency in SQL and SQL Server for managing, querying, and optimizing data within databases, supporting high-performance data operations and analytics.
- Experience with System Integrations and Enterprise Software: Skilled in working with complex system integrations and enterprise-level software, enabling smooth data connectivity and collaboration across different business functions and data sources.
Summary:
We’re seeking a skilled Data Engineer with deep expertise in ETL (Extract, Transform, Load) processes to join our team. In this role, you’ll be responsible for designing, building, and maintaining the data pipelines that power our business intelligence and analytics needs. You’ll work closely with data scientists, analysts, and other cross-functional teams to ensure the availability, accuracy, and accessibility of data across the organization.
Key Responsibilities:
- Develop and maintain ETL pipelines: Design, implement, and optimize ETL processes to extract data from diverse sources, transform it as required, and load it into our data warehouses.
- Data Integration: Collaborate with cross-functional teams to integrate various data sources and deliver reliable, consistent data to support analytics and reporting.
- Data Quality and Performance Optimization: Monitor and enhance data pipeline performance and data quality to ensure accuracy, reliability, and scalability of data workflows.
- Database Management: Manage and optimize data warehouses and databases, ensuring best practices in data architecture and storage solutions.
- Collaboration: Work closely with data scientists and analysts to support their data needs, deliver insights, and contribute to strategic data initiatives.
Qualifications:
- Proven expertise in ETL processes with hands-on experience in building, managing, and optimizing ETL pipelines.
- Experience with ETL tools and platforms such as Apache NiFi, Informatica, Talend, or equivalent.
- Strong SQL skills and proficiency in database systems such as SQL Server, MySQL, PostgreSQL, or others.
- Experience with cloud data platforms (e.g., AWS, GCP, Azure) and data warehousing solutions like Snowflake, Redshift, or BigQuery.
- Programming skills in Python, Java, or similar languages.
- Familiarity with data modeling, data warehousing concepts, and best practices.
- Experience with Microsoft Fabric or similar tools is a HUGE plus.
- Excellent communication and collaboration skills with an ability to work effectively across technical and non-technical teams.
Additional/Preferred Qualifications:
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent experience).
- Knowledge of data governance, data security, and compliance considerations.
- Prior experience in a similar role within a data-driven organization.