dbt / Snowflake / Azure Data Engineer (W2)

RulesIQ LLC — Pleasanton, CA

$80-85/hour (W2)

Candidates must be able to convert to FTE without sponsorship, now or at any point in the future

Hybrid in Pleasanton, CA (ONSITE 3x a week)

Relocation is okay, but the candidate must be in CA to start on day 1

3+ month contract-to-hire

Will be the onshore lead for two teams: Supply Chain and Merchandising

15 years of experience

Architect or Senior Engineer level, but the role involves hands-on development work

Hands-on experience required; not looking for a manager

Cloud data warehousing

Snowflake

Azure

AWS/GCP experience is secondary; Snowflake is primary

Python, SQL

dbt experience

Previous retail experience (prior Albertsons experience would be great!)

Communication is key; the role involves speaking with stakeholders

Will need to present to leadership

Python/SQL technical round

Candidates will be asked to explain how their code works and why they chose their approach

Snowflake

Communication skills: must be able to explain implementations to teams


Note: everyone's resume lists every cloud, so hands-on depth will be verified


Interview Process:

45 minutes with the Manager

1-hour technical round with a Senior Data Engineer

Leadership round with the Manager and VP

The work is functionally split within the retail analytics side

The candidate would work in a hub-and-spoke model covering supply chain and merchandising

Two domains: supply chain and store sites. Lead engineers review the work of others on the team

POSITION SUMMARY:

The Data Engineer III plays a critical role in the engineering of data solutions that support Ross reporting and analytic needs. As a key member of the Data Engineering team, this person will work with diverse data technologies such as StreamSets, dbt, DataOps tooling, and others to build insightful, scalable, and robust data pipelines that feed our various analytics platforms.

ESSENTIAL FUNCTIONS:

  • Design and model data engineering pipelines that support Ross reporting and analytic needs
  • Engineer efficient, adaptable, and scalable data pipelines for moving data from different sources into our Cloud Lakehouse
  • Understand and analyze business requirements and translate them into well-architected solutions on the modern BI & Analytics platform
  • Be a part of data modernization projects, providing direction on overall design and technical direction, and act as the primary driver toward establishing guidelines and approaches
  • Develop and deploy performance optimization methodologies
  • Drive timely and proactive issue identification, escalation & resolution
  • Collaborate effectively with Data Technology and Business Information teams to design and build optimized data flows from source to data visualization

QUALIFICATIONS AND SPECIAL SKILLS REQUIRED:

  • 12+ years of in-depth data engineering experience, including execution of data pipelines, DataOps, scripting, and SQL queries
  • 5+ years of proven data architecture experience - must have demonstrable experience being accountable for data standards and designing data models for data warehousing and modern analytics use cases (e.g., from operational data store to semantic models)
  • At least 3 years of experience with modern data architectures that support advanced analytics, including Snowflake, Azure, etc. Experience with Snowflake and other cloud data warehouse / data lake platforms preferred
  • Expert in engineering data pipelines using various data technologies (ETL/ELT) and big data technologies (Hive, Spark) on large-scale data sets, demonstrated through years of experience
  • 5+ years of hands-on experience with data warehouse design, development, and data modeling best practices for modern data architectures
  • Highly proficient in at least one of these programming languages: Java, Python
  • Experience with modern data modeling tools and data preparation tools
  • Experience adding data lineage and technical glossaries from data pipelines to data catalog tools
  • Highly proficient in data analysis: analyzing SQL, Python scripts, and ETL/ELT transformation scripts
  • Highly skilled in data orchestration, with experience in tools like Control-M and Apache Airflow. Hands-on DevOps/DataOps experience required
  • Knowledge of or working experience with reporting tools such as MicroStrategy or Power BI would be a plus
  • Self-driven individual with the ability to work independently or as part of a project team
