

TekValue IT Solutions
Data Engineer (Python and Snowflake)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Python and Snowflake) in Lewisville, TX, with a long-term contract. Requires 10+ years of experience in data engineering, proficiency in Python, Informatica, AWS, and cloud-based data warehousing, particularly Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 30, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Lewisville, TX
-
🧠 - Skills detailed
#Spark (Apache Spark) #EC2 #PySpark #Data Lake #Data Ingestion #Informatica #Automation #ETL (Extract, Transform, Load) #Python #Data Engineering #Data Warehouse #AWS EC2 (Amazon Elastic Compute Cloud) #AWS Glue #Data Pipeline #Cloud #AWS (Amazon Web Services) #IICS (Informatica Intelligent Cloud Services) #S3 (Amazon Simple Storage Service) #Snowflake
Role description
Data Engineer (Python and Snowflake)
Onsite - Lewisville, TX
Long Term
Expectations:
At least 10 years of experience working with various data engineering tools. Work as an individual contributor who can run with requirements and handle end-to-end data engineering and data warehousing tasks in an AWS and Snowflake environment. Diverse experience with both ETL and ELT processes.
Python – Should have done extensive development in Python to automate ETL processes and build/manage data pipelines. Should understand Python best practices, error handling, code repositories, code management, etc.
Informatica – Experience handling both on-premises and cloud versions of the Informatica (IICS) ETL tools.
Data warehouse skills – Should have worked with cloud-based data warehouse tools such as Snowflake, and have a good understanding of data ingestion, data warehouses, data lakes, and data analytics. Experience with Python- or PySpark-based ETL/ELT development and automation.
AWS – Extensive working knowledge of the AWS data platform and services: S3, AWS Glue, AWS EC2, CloudWatch, etc.
Scheduling tool – Control-M preferred, or other relevant tools.
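As a rough illustration of the Python expectations above (ETL automation with error handling), here is a minimal sketch of a pipeline step. The function names (`extract`, `transform`, `load`) and the CSV-string source are hypothetical stand-ins; in this role the source and target would be AWS services (e.g. S3) and Snowflake.

```python
# Minimal ETL sketch, assuming a CSV-in-a-string source for illustration;
# real pipelines would read from S3 and load into Snowflake.
import csv
import io
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")


def extract(source: str) -> list[dict]:
    """Parse rows from a CSV source (a string here; S3/file in practice)."""
    return list(csv.DictReader(io.StringIO(source)))


def transform(rows: list[dict]) -> list[dict]:
    """Normalize fields; log and skip malformed rows instead of failing the run."""
    out = []
    for row in rows:
        try:
            out.append({"id": int(row["id"]), "name": row["name"].strip().title()})
        except (KeyError, ValueError) as exc:
            log.warning("skipping malformed row %r: %s", row, exc)
    return out


def load(rows: list[dict]) -> int:
    """Stand-in for a warehouse load step; returns the row count loaded."""
    return len(rows)


if __name__ == "__main__":
    raw = "id,name\n1, alice \nbad,row\n2,bob\n"
    loaded = load(transform(extract(raw)))
    log.info("loaded %d rows", loaded)
```

The per-row try/except is one common pattern for the "error handling" expectation: bad records are logged and quarantined rather than aborting the whole batch.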






