GCP Data Engineer
Stefanini Group is hiring!
Stefanini is looking for a GCP Data Engineer, Dearborn, MI (Hybrid)
For quick apply, please reach out to Anmol Tyagi at 248-263-8628 / anmol.tyagi@stefanini.com
The Data Engineer is responsible for designing and developing the transformation and modernization of big data solutions on Google Cloud Platform (GCP), integrating native GCP services and third-party data technologies, and building new data products in GCP. We are looking for candidates with a broad set of technology skills across these areas who can demonstrate an ability to design and develop the right solutions, with the appropriate combination of GCP and third-party technologies, for deployment on GCP.
Responsibilities:
- Work as part of an implementation team from concept to operations, providing deep technical subject matter expertise for successful deployment of the Data Platform.
- Implement methods for automation of all parts of the pipeline to minimize labor in development and production.
- Identify, develop, evaluate, and summarize Proofs of Concept to prove out solutions.
- Test and compare competing solutions and report a point of view on the best solution.
- Apply experience with large-scale solutioning and operationalization of data warehouses, data lakes, and analytics platforms on GCP.
- Design and build production data engineering solutions to deliver our pipeline patterns using Google Cloud Platform (GCP) services: BigQuery, Dataflow (Apache Beam), Pub/Sub, Bigtable, Data Fusion, Dataproc, Cloud Composer (Apache Airflow), Cloud SQL, Compute Engine, Cloud Functions, and App Engine.
- Migrate existing big data pipelines to Google Cloud Platform and build new data products in GCP.