[ref. h5020122] GCP Analytics Engineer
Stefanini Group is hiring!
Stefanini is looking for a GCP Analytics Engineer, Remote
For quick apply, please reach out to Parul Singh at 248-582-6481 / parul.singh@stefanini.com
Open to W2 candidates only!
Position Description:
We are seeking an Analytics Engineer to join our team, responsible for fueling our Power BI dashboards with high-quality data. As an Analytics Engineer, you will design, develop, and maintain scalable data pipelines, architect data models, and transform complex data sets to drive business insights. You will work with large datasets from Adobe Analytics, integrating them with other data sources to create a unified view of customer behavior and preferences. Your work will enable our business stakeholders to make data-driven decisions by providing them with accurate, timely, and actionable data.
If you have a passion for data modeling and data transformation, and enjoy working with Power BI, Adobe Analytics, and GCP, we encourage you to apply for this exciting opportunity!
Skills Required:
- Build and maintain data models that transform raw data into clean, tested, and reusable datasets, making it easier for stakeholders to analyze the data using Power BI dashboards.
- Contribute to the development and maintenance of our Qlik and Power BI dashboards, ensuring they are powered by accurate and timely data.
- Design, develop, and maintain data transformation pipelines on GCP, handling complex business rules and high-volume data from various sources, including Adobe Analytics.
- Build and maintain batch and real-time data pipelines using GCP services such as BigQuery, Dataflow, Cloud Functions, and Pub/Sub to support our digital analytics needs.
- Create DAX (Data Analysis Expressions) calculations and measures to support data analysis.
- Develop and optimize ETL processes to ingest, transform, and load data from multiple sources into our data warehouse, ensuring data quality and consistency.
- Collaborate with the analytics team to translate business requirements into actionable data models and pipelines that power reports, dashboards, and other analytical outputs.
- Create and maintain comprehensive documentation for all data pipelines, ETL processes, and data models.
- Implement and maintain data quality checks and monitoring systems to ensure data accuracy, integrity, and reliability.
- Stay up to date with the latest technologies and best practices in data engineering, particularly within the GCP ecosystem.