

Baanyan Software Services, Inc.
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering a pay rate of "XXX" per hour. Key skills include 5+ years of Python and ETL/ELT experience, Snowflake on AWS, and SnowPro Core and SQL certifications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 4, 2025
🕒 - Duration
6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Concord, CA
-
🧠 - Skills detailed
#JSON (JavaScript Object Notation) #Snowflake #AWS (Amazon Web Services) #Spark (Apache Spark) #API (Application Programming Interface) #Data Warehouse #PySpark #Python #SnowSQL #SQL (Structured Query Language) #Airflow #ETL (Extract, Transform, Load) #Snowpark #Cloud #Data Engineering #Data Lake #Data Quality
Role description
Required Skills:
• 5+ years of hands-on experience as a Python Data Engineer in ETL/ELT development and Data Warehousing.
• Strong coding proficiency in Python, PySpark, and the DataFrame API (see the PySpark sketch after this list).
• Proven experience working with Snowflake Cloud Data Warehouse on AWS Data Lake.
• Expertise in SnowSQL, Stored Procedures, Tasks/Streams, and Snowpark (see the Snowpark sketch after this list).
• Deep understanding of SQL, query optimization, and performance tuning.
• Experience handling structured and semi-structured data (JSON, Parquet, etc.).
• Excellent communication skills and ability to work effectively in distributed teams.
• SnowPro Core Certification
• SQL certification
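For illustration only (not part of the posting), here is a minimal PySpark sketch of the kind of work the bullets above describe: reading semi-structured JSON with the DataFrame API and landing it as Parquet. All paths, column names, and the app name are hypothetical assumptions.

```python
# Minimal sketch, not production code. Reads newline-delimited JSON from a
# hypothetical S3 landing zone, flattens a nested field with the DataFrame
# API, and writes partitioned Parquet for downstream warehouse loads.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("events-etl-sketch").getOrCreate()

# Spark infers the (possibly nested) schema from the JSON documents.
raw = spark.read.json("s3://example-bucket/landing/events/")  # hypothetical path

events = (
    raw
    .withColumn("event_date", F.to_date("event_ts"))   # hypothetical column
    .withColumn("user_id", F.col("payload.user.id"))   # pull a nested field
    .filter(F.col("user_id").isNotNull())               # drop malformed rows
)

# Partitioned Parquet is a common handoff format into an AWS data lake that
# Snowflake then reads from.
events.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3://example-bucket/curated/events/"
)
```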
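Likewise, a hedged Snowpark sketch of the Streams/Tasks and DataFrame work named above. Connection values, object names, and the CRON schedule are placeholders, and a newly created task stays suspended until it is resumed.

```python
# Minimal sketch, assuming the snowflake-snowpark-python package and valid
# credentials. All connection values and object names are placeholders.
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col

session = Session.builder.configs({
    "account": "<account>",
    "user": "<user>",
    "password": "<password>",
    "warehouse": "ETL_WH",
    "database": "ANALYTICS",
    "schema": "RAW",
}).create()

# Streams capture row changes; a task replays them on a schedule (SnowSQL DDL).
session.sql("CREATE STREAM IF NOT EXISTS events_stream ON TABLE raw_events").collect()
session.sql("""
    CREATE TASK IF NOT EXISTS load_events_task
      WAREHOUSE = ETL_WH
      SCHEDULE = 'USING CRON 0 2 * * * UTC'
    AS
      INSERT INTO curated_events SELECT * FROM events_stream
""").collect()
# Note: a new task is suspended until ALTER TASK load_events_task RESUME is run.

# Snowpark DataFrame API: the filter/projection executes inside Snowflake.
recent = (
    session.table("curated_events")
    .filter(col("event_date") >= "2025-01-01")
    .select("user_id", "event_type", "event_date")
)
recent.write.save_as_table("recent_events", mode="overwrite")
```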
Nice to Have:
• Experience scheduling and automating data jobs in Airflow, Control-M, or Autosys (an Airflow sketch follows this list).
• Implementation experience with data quality frameworks and error handling mechanisms.
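As a hedged illustration of both nice-to-have items, a minimal Airflow DAG that schedules a nightly load, gates it with a toy data quality check, and relies on retries for error handling. The DAG id, schedule, and callables are assumptions, not details from the posting; the "schedule" argument requires Airflow 2.4+ (older versions use "schedule_interval").

```python
# Hypothetical DAG: nightly load with a toy data quality gate and retry-based
# error handling. Requires Airflow 2.4+ for the "schedule" argument.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_events():
    # Placeholder for the real ETL step (e.g., a Spark or Snowpark job).
    pass


def check_row_count():
    # Toy data quality check: raising an exception fails the task, which
    # Airflow retries per default_args before marking the run as failed.
    rows_loaded = 1000  # in practice, query the target table
    if rows_loaded == 0:
        raise ValueError("Data quality check failed: no rows loaded")


with DAG(
    dag_id="nightly_events_load",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",          # daily at 02:00
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    load = PythonOperator(task_id="load_events", python_callable=load_events)
    quality_gate = PythonOperator(
        task_id="check_row_count", python_callable=check_row_count
    )

    load >> quality_gate
```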






