

TekValue IT Solutions
Data Engineer (Python and Snowflake) W2
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Python and Snowflake) with over 10 years of experience with data engineering tools, including extensive Python development, Informatica, and AWS services. The contract is long-term and fully remote within the United States.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 3, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Pipeline #Data Lake #Spark (Apache Spark) #AWS Glue #Informatica #Python #Automation #ETL (Extract, Transform, Load) #IICS (Informatica Intelligent Cloud Services) #Data Ingestion #AWS EC2 (Amazon Elastic Compute Cloud) #AWS (Amazon Web Services) #EC2 #PySpark #S3 (Amazon Simple Storage Service) #Cloud #Snowflake #Data Engineering #Data Warehouse
Role description
Data Engineer (Python and Snowflake) W2
Remote - United States
Long Term
Expectations:
At least 10 years of experience working with a variety of data engineering tools. Works as an individual contributor who can run with requirements and handle end-to-end data engineering and data warehousing tasks in an AWS and Snowflake environment. Diverse experience with both ETL and ELT processes.
Python – Extensive development experience in Python automating ETL processes and building and managing data pipelines. Should understand Python best practices, error handling, code repositories, and code management (a minimal sketch follows this list).
Informatica – Experience handling both the on-premises and cloud (IICS) versions of the Informatica ETL tools.
Data warehouse skills – Experience with cloud-based data warehouse tools such as Snowflake, plus a good understanding of data ingestion, data warehouses, data lakes, and data analytics. Experience with Python- or PySpark-based ETL/ELT development and automation (see the Snowflake ELT sketch after this list).
AWS – Extensive working knowledge of the AWS data platform and services: S3, AWS Glue, AWS EC2, CloudWatch, etc. (see the ingestion sketch after this list).
Scheduling tool – Control-M preferred; other relevant tools considered.
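
To illustrate the kind of Python ETL automation described above, here is a minimal sketch of an extract-transform-load pipeline with logging and row-level error handling. It uses only the standard library, and the file, table, and column names are hypothetical; a real pipeline for this role would target Snowflake and S3 rather than SQLite.

```python
"""Minimal ETL sketch: CSV extract -> transform -> SQLite load."""
import csv
import logging
import sqlite3

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")
log = logging.getLogger("etl")

def extract(path: str) -> list[dict]:
    # Read raw rows from a source CSV file (hypothetical layout).
    with open(path, newline="") as fh:
        return list(csv.DictReader(fh))

def transform(rows: list[dict]) -> list[tuple]:
    # Normalize types and skip rows that fail validation, logging each one.
    clean = []
    for row in rows:
        try:
            clean.append((row["id"], row["name"].strip(), float(row["amount"])))
        except (KeyError, ValueError) as exc:
            log.warning("skipping bad row %r: %s", row, exc)
    return clean

def load(rows: list[tuple], db_path: str) -> None:
    # Idempotent load: recreate the target table, then bulk insert.
    with sqlite3.connect(db_path) as con:
        con.execute("DROP TABLE IF EXISTS orders")
        con.execute("CREATE TABLE orders (id TEXT, name TEXT, amount REAL)")
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    log.info("loaded %d rows into %s", len(rows), db_path)

if __name__ == "__main__":
    load(transform(extract("orders.csv")), "warehouse.db")
```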
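The Snowflake expectation pairs ETL with ELT: land data in the warehouse first, then transform it with SQL inside Snowflake. A sketch of that pattern follows, assuming the snowflake-connector-python package; the account, stage, and table names are placeholders, not values from the posting.

```python
"""ELT sketch: COPY staged S3 files into Snowflake, then transform in-warehouse."""
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical
    user="etl_user",        # hypothetical
    password="***",         # use a secrets manager in practice
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)
try:
    cur = conn.cursor()
    # E/L: ingest files from an external S3 stage into a raw landing table.
    cur.execute("""
        COPY INTO raw.orders
        FROM @raw.s3_orders_stage
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'CONTINUE'
    """)
    # T: transform inside Snowflake -- the "ELT" half of the requirement.
    cur.execute("""
        CREATE OR REPLACE TABLE analytics.orders_daily AS
        SELECT order_date, SUM(amount) AS total_amount
        FROM raw.orders
        GROUP BY order_date
    """)
finally:
    conn.close()
```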
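Finally, a data-ingestion sketch on the AWS services named above: push a file to S3, then start a Glue job over it. It assumes boto3 with AWS credentials configured in the environment; the bucket, key, and job names are hypothetical, and the Glue job itself must already exist.

```python
"""Ingestion sketch: land a file in S3, then trigger a Glue ETL job."""
import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")

# Land the raw file in the data lake bucket (hypothetical names).
s3.upload_file("orders.csv", "my-data-lake", "raw/orders/orders.csv")

# Start the pre-created Glue ETL job over the newly landed data.
run = glue.start_job_run(
    JobName="orders_etl",
    Arguments={"--input_path": "s3://my-data-lake/raw/orders/"},
)
print("started Glue run:", run["JobRunId"])
```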






