COVET IT INC

Python Snowflake Data Engineer - Only H1B Consultants

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Python Snowflake Data Engineer in Concord, CA, on a long-term contract. Required skills include 5+ years in ETL/ELT, Python, Snowflake, and AWS. Only H1B consultants are eligible; no remote work allowed.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 24, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Concord, CA
🧠 - Skills detailed
#Data Warehouse #Data Engineering #API (Application Programming Interface) #AWS (Amazon Web Services) #PySpark #SnowSQL #Data Ingestion #Snowflake #Snowpark #Scala #Python #JSON (JavaScript Object Notation) #Spark (Apache Spark) #Data Quality #Data Lake #Datasets #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Cloud #Airflow #Data Pipeline
Role description
Job: Python Snowflake Data Engineer
Location: Concord, CA
Type: Long-Term Contract
Only H1B consultants. No remote work; this is an onsite position.

Job Summary
We are seeking a skilled Python Data Engineer with strong expertise in Snowflake Cloud Data Warehouse and hands-on experience developing ETL/ELT data pipelines in a Python and AWS environment. The ideal candidate will design, build, and optimize scalable data solutions, leveraging Snowflake's advanced features to support high-performance analytics and data transformation initiatives.

Key Responsibilities
- Design, develop, and maintain ETL/ELT data pipelines using Python, PySpark, and Snowflake (see the ingestion sketch after this description).
- Perform data ingestion, transformation, and processing for both structured and semi-structured data sources.
- Develop and optimize Snowflake SQL scripts, Stored Procedures, Tasks, and Streams for complex data workflows (see the Tasks/Streams sketch below).
- Implement error handling, data validation, and data quality checks within the data pipelines.
- Collaborate with cross-functional teams to gather requirements and ensure data availability, accuracy, and reliability.
- Conduct SQL performance tuning and troubleshooting for large datasets in Snowflake.
- Work in an AWS cloud environment, integrating data from multiple sources.
- Schedule and orchestrate data workflows using tools such as Airflow, Control-M, or Autosys (see the Airflow sketch below).

Required Skills & Experience
- 5+ years of hands-on experience as a Python Data Engineer in ETL/ELT development and data warehousing.
- Strong coding proficiency in Python, PySpark, and the DataFrame API.
- Proven experience working with Snowflake Cloud Data Warehouse on an AWS data lake.
- Expertise in SnowSQL, Stored Procedures, Tasks/Streams, and Snowpark.
- Deep understanding of SQL, query optimization, and performance tuning.
- Experience handling structured and semi-structured data (JSON, Parquet, etc.).
- Excellent communication skills and the ability to work effectively in distributed teams.

Nice to Have
- Experience scheduling and automating data jobs in Airflow, Control-M, or Autosys.
- Implementation experience with data quality frameworks and error-handling mechanisms.
- SnowPro Core Certification is a strong plus.
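
To give candidates a concrete picture of the ingestion and data quality responsibilities above, here is a minimal sketch of a Python load step using the snowflake-connector-python and pandas packages. The connection parameters, source file, and ORDERS table are hypothetical placeholders, not details from this posting.

```python
import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

# Hypothetical connection parameters -- supply real account details,
# ideally from a secrets manager rather than hard-coded strings.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="...",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="RAW",
)

# Ingest a semi-structured JSON source into a DataFrame (hypothetical file).
df = pd.read_json("orders.json", lines=True)
df.columns = [c.upper() for c in df.columns]  # match Snowflake identifiers

# Basic data quality check: fail loudly if key columns contain nulls.
required = ["ORDER_ID", "AMOUNT"]
bad = df[df[required].isnull().any(axis=1)]
if not bad.empty:
    raise ValueError(f"{len(bad)} rows failed validation")

# Bulk-load the validated frame; assumes the ORDERS table already exists.
success, _, nrows, _ = write_pandas(conn, df, "ORDERS")
print(f"loaded={success} rows={nrows}")
conn.close()
```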
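
The Tasks and Streams requirement could look like the following sketch, which uses the same connector to create a change-capture stream and a scheduled task that consumes it. The table names, warehouse, and schedule are hypothetical.

```python
import snowflake.connector

# Same hypothetical connection parameters as the ingestion sketch.
conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="...",
    warehouse="ETL_WH", database="ANALYTICS", schema="RAW",
)
cur = conn.cursor()

# A stream captures change records (CDC) on the source table.
cur.execute("CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE ORDERS")

# A task consumes the stream on a schedule, but only when new data exists.
cur.execute("""
    CREATE OR REPLACE TASK PROCESS_ORDERS
      WAREHOUSE = ETL_WH
      SCHEDULE = '5 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO ORDERS_CLEAN
      SELECT ORDER_ID, AMOUNT FROM ORDERS_STREAM
      WHERE METADATA$ACTION = 'INSERT'
""")

# Tasks are created suspended; resume to start the schedule.
cur.execute("ALTER TASK PROCESS_ORDERS RESUME")
conn.close()
```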
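
For the orchestration responsibility, a minimal Airflow DAG might wire the two steps together as below. The dag_id, schedule, and callables are hypothetical; Control-M or Autosys would express the same dependency in their own configuration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest_orders():
    """Placeholder for the ingestion/validation step sketched above."""


def process_stream():
    """Placeholder for the Snowflake Tasks/Streams follow-up step."""


with DAG(
    dag_id="orders_etl",                # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",                  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = PythonOperator(task_id="ingest", python_callable=ingest_orders)
    process = PythonOperator(task_id="process", python_callable=process_stream)

    # Enforce ordering: validate and load before consuming the stream.
    ingest >> process
```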