

Snowflake Data Engineer
⭐ Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Snowflake Data Engineer in Houston, TX (hybrid), on a contract of unspecified duration. It requires 15+ years of experience and expertise in Snowflake, dbt, Matillion, AWS services, and data replication using Qlik.
🌎 Country: United States
💱 Currency: $ USD
💰 Day rate: -
🗓️ Date discovered: July 29, 2025
🕒 Project duration: Unknown
🏝️ Location type: Hybrid
📄 Contract type: Unknown
🔒 Security clearance: Unknown
📍 Location detailed: Houston, TX
🧠 Skills detailed: #DynamoDB #Data Pipeline #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #AWS Glue #Data Quality #Replication #Java #Matillion #AWS (Amazon Web Services) #Azure #dbt (data build tool) #Python #Data Replication #Scala #Qlik #GitHub #Oracle #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Batch #SQS (Simple Queue Service) #AWS Lambda #Azure SQL #Data Warehouse #SNS (Simple Notification Service) #Data Integrity #Snowflake #Data Integration #SQL (Structured Query Language)
Role description
Senior Snowflake Data Engineer
Location: Houston, TX (hybrid; 3 days/week onsite required)
Experience: 15+ years
Description:
At Client, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for Client and for the people who work here.
NOTE: The mandatory tech stack is listed below.
Tech stack (for reference):
- Data Warehouse: Snowflake
- ETL tools: dbt, Matillion
- CDC/Replication tools: Qlik Attunity
- AWS services for near-real-time and batch loading: AWS Lambda and AWS Glue (Python is the language of choice in these implementations; see the sketch after this list)
- Other AWS services used regularly: SNS, SQS, S3 (Kinesis streams and DynamoDB are used occasionally, depending on the integration)
- Versioning system: GitHub
We sometimes need to connect to other data sources such as Oracle, Azure SQL DB, and Postgres, so familiarity with these helps.
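To make the near-real-time loading pattern above concrete, here is a minimal sketch, assuming an SQS-triggered Lambda that loads message payloads into a hypothetical raw_events table via snowflake-connector-python. Every name here (table, environment variables, payload shape) is an illustrative assumption, not the client's actual implementation.

```python
# Hypothetical sketch: SQS-triggered AWS Lambda loading events into Snowflake.
# Table name, env vars, and message shape are illustrative assumptions.
import os

import snowflake.connector  # pip install snowflake-connector-python


def handler(event, context):
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse=os.environ["SNOWFLAKE_WAREHOUSE"],
        database=os.environ["SNOWFLAKE_DATABASE"],
        schema=os.environ["SNOWFLAKE_SCHEMA"],
    )
    try:
        with conn.cursor() as cur:
            for record in event["Records"]:  # standard SQS event shape
                # INSERT ... SELECT lets PARSE_JSON store the body as VARIANT.
                cur.execute(
                    "INSERT INTO raw_events (message_id, payload) "
                    "SELECT %s, PARSE_JSON(%s)",
                    (record["messageId"], record["body"]),
                )
        return {"loaded": len(event["Records"])}
    finally:
        conn.close()
```

At real volumes a Glue batch job or Snowpipe would likely take over; the row-at-a-time insert here is only to show the moving parts.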
Basic Qualifications:
• 10+ years of experience implementing and operationalizing large-scale data solutions in production environments, using a variety of technologies and frameworks in the cloud or on-premises.
• Designs and builds performant data pipelines to support a range of business consumption patterns, using current data integration technologies (cloud, on-premises, or a combination) or custom approaches in Python, Java, Scala, etc.
• Assists in adopting best practices in reporting and analysis, including data integrity, test design, validation, and data quality analysis.
• Experienced with data replication using Qlik; Snowflake, Oracle, SQL, and dbt skills are also expected.
Responsibilities:
The assigned resources will perform the following activities at the direction of the client:
• Design, build, test, and implement data pipelines and integrations with the client's Snowflake Cloud Data Warehouse, following best practices and the client's IT policies and procedures.
• Support client end-user testing.
• Support data validation, data quality, and analysis activities as requested by the client (a basic validation sketch follows below).
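As one concrete example of the validation work above, here is a minimal sketch, assuming a row-count reconciliation between a staging table and its modeled target in Snowflake; the connection details and table names are hypothetical.

```python
# Hypothetical sketch: reconcile row counts between two Snowflake tables
# as a basic data-validation check. Table names and env vars are assumptions.
import os

import snowflake.connector  # pip install snowflake-connector-python


def count_rows(conn, table: str) -> int:
    # table comes from trusted constants below, so an f-string is fine here.
    # COUNT(*) is answered from table metadata in Snowflake, so it is cheap.
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]


def main():
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        database=os.environ["SNOWFLAKE_DATABASE"],
    )
    try:
        source = count_rows(conn, "staging.orders_raw")  # hypothetical names
        target = count_rows(conn, "analytics.orders")
        if source != target:
            raise SystemExit(f"FAIL: row counts differ ({source} vs {target})")
        print(f"OK: {source} rows in both tables")
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```

In practice this kind of check would be one of several (null rates, key uniqueness, freshness), often expressed as dbt tests rather than a standalone script.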