

Data Engineer (Snowflake Data Warehouse, GitHub CI/CD, Python, ETL), Contract-to-Permanent Job in Sunnyvale, CA
We have a 6-month contract position (with possible extension) for a Data Engineer. The position is preferably a hybrid role based in Sunnyvale, CA; remote candidates are also considered, but someone local who can come into the office from time to time is preferred.
As a Data Engineer, you will collaborate with the team to build data pipelines and to orchestrate and manage the Snowflake data warehouse. You will play a key role in ensuring that existing pipelines meet their SLAs and in integrating CI/CD practices into our operations.
Responsibilities:
• Work in the Snowflake environment: set up role-based access control (roles and users), warehouses, and resource monitors.
• Develop GitHub CI/CD pipelines and deploy to dev and production environments; set up CI/CD pipelines with Kubernetes and Docker.
• Write and maintain the scripts needed for data ingestion, orchestrated in Airflow (see the sketch after this list).
• Develop testing frameworks to validate data accuracy and data consistency at each layer.
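For illustration, here is a minimal sketch of the kind of Airflow orchestration this role involves. It assumes Airflow 2.x; the DAG name, schedule, and the extract_from_source / load_to_snowflake callables are hypothetical placeholders, not details from this posting.

```python
# Minimal ingestion DAG sketch (assumes Airflow 2.x); all names are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_from_source(**context):
    """Placeholder: pull raw records from the upstream system."""


def load_to_snowflake(**context):
    """Placeholder: stage and load the extracted data into Snowflake."""


with DAG(
    dag_id="example_ingestion",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",      # adjust to the pipeline's SLA
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_from_source)
    load = PythonOperator(task_id="load", python_callable=load_to_snowflake)

    extract >> load                  # load runs only after extract succeeds
```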
Required skills:
• Proficiency in programming languages such as Python.
• Strong analytical and problem-solving skills
• Familiarity with data modeling and ELT/ETL processes.
• Proficient in SQL.
• Familiar with setting up UDFs and stored procedures (see the sketch after this list)
• Familiarity with data ingestion tools (Fivetran, HVR, Airbyte)
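As a small illustration of the Snowflake skills above, the sketch below creates a role, a warehouse, and a simple SQL UDF through the snowflake-connector-python package. The account, credentials, and object names (ANALYST_ROLE, ETL_WH, FAHRENHEIT_TO_CELSIUS) are hypothetical placeholders, not details from this posting.

```python
# Sketch only: account, credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",     # placeholder account identifier
    user="my_user",           # placeholder user
    password="my_password",   # placeholder; use a secrets manager in practice
    role="SYSADMIN",
    database="MY_DB",         # placeholder database/schema for the UDF below
    schema="PUBLIC",
)

statements = [
    # Role-based access control: create a role and grant it to a user.
    "CREATE ROLE IF NOT EXISTS ANALYST_ROLE",
    "GRANT ROLE ANALYST_ROLE TO USER MY_USER",
    # A small warehouse that suspends itself after 60 seconds of idleness.
    "CREATE WAREHOUSE IF NOT EXISTS ETL_WH WITH WAREHOUSE_SIZE = 'XSMALL' AUTO_SUSPEND = 60",
    # A simple SQL UDF.
    "CREATE OR REPLACE FUNCTION FAHRENHEIT_TO_CELSIUS(F FLOAT) RETURNS FLOAT AS '(F - 32) * 5 / 9'",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)
finally:
    cur.close()
    conn.close()
```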