

Senior Data Engineer (Only W2)
Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer (W2 contract, hybrid in Houston, TX) with a pay rate of "$X/hour." Key skills include Python, Snowflake, SQL, and ETL tools. Experience in data security and pipeline automation is essential.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
July 29, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Houston, TX
Skills detailed
#Data Science #Dimensional Data Models #Data Pipeline #Informatica #AI (Artificial Intelligence) #Scripting #Data Quality #Data Management #Automation #BI (Business Intelligence) #ML (Machine Learning) #Complex Queries #Jenkins #dbt (data build tool) #Airflow #Python #Debugging #Scala #Security #GitHub #Data Orchestration #Metadata #ETL (Extract, Transform, Load) #Data Engineering #Collibra #Batch #Observability #Data Warehouse #DMP (Data Management Platform) #Snowflake #Data Security #SQL (Structured Query Language)
Role description
This is a W2 contract position; C2C and bench sales candidates will not be considered.
Position Summary
We are looking for a highly skilled Senior Data Engineer to design, build, and optimize scalable data solutions in a hybrid work environment. The ideal candidate will have strong expertise in Python and Snowflake, with a focus on pipeline automation, data security, performance optimization, and analytics support.
Key Responsibilities
• Design, build, and maintain scalable batch and real-time data pipelines using Python, SQL, and Snowflake
• Develop and manage Snowflake data warehouses and data lakes to support business analytics
• Implement security features such as row-level access and dynamic masking for sensitive data
• Monitor Snowflake query performance, warehouse usage, and credit consumption for cost efficiency
• Collaborate with data science and analytics teams to support machine learning and analytical initiatives
• Create semantic layers and dimensional data models (star/snowflake schemas) for BI tools
• Automate workflows using CI/CD pipelines (e.g., Jenkins, GitHub) and orchestration tools (e.g., Airflow, dbt)
• Facilitate secure external data sharing through Snowflake shares and reader accounts
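For the dynamic masking responsibility above: in Snowflake this is enforced natively with masking policies, but the behavior can be illustrated in plain Python. The roles, columns, and masking rule below are invented for illustration only; this is a sketch of the concept, not a Snowflake implementation.

```python
# Illustrative sketch of role-based dynamic masking, similar in spirit to a
# Snowflake masking policy. Role names and columns here are hypothetical.

SENSITIVE_COLUMNS = {"email", "ssn"}
UNMASKED_ROLES = {"SECURITY_ADMIN", "DATA_STEWARD"}


def mask_value(value: str) -> str:
    """Replace all but the last two characters with asterisks."""
    return "*" * max(len(value) - 2, 0) + value[-2:]


def apply_masking(row: dict, current_role: str) -> dict:
    """Return a copy of the row with sensitive columns masked for unprivileged roles."""
    if current_role in UNMASKED_ROLES:
        return dict(row)
    return {
        col: mask_value(val) if col in SENSITIVE_COLUMNS else val
        for col, val in row.items()
    }


row = {"name": "Ada", "email": "ada@example.com"}
print(apply_masking(row, "ANALYST"))         # email masked
print(apply_masking(row, "SECURITY_ADMIN"))  # email visible
```

In Snowflake itself, the equivalent logic would live in a `CREATE MASKING POLICY` statement evaluated against `CURRENT_ROLE()`, applied to a column with `ALTER TABLE ... SET MASKING POLICY`.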
Required Technical Skills
• Strong Python skills for ETL scripting, automation, and data transformation
• Deep knowledge of Snowflake architecture, security policies (RBAC, masking), and performance tuning
• Advanced SQL skills for writing, optimizing, and debugging complex queries
• Hands-on experience with data orchestration and ETL tools such as Airflow, dbt, or Informatica
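A minimal, self-contained sketch of the Python + SQL ETL pattern these bullets describe, using the standard library's sqlite3 as a stand-in for a Snowflake warehouse (table name, columns, and sample values are made up for illustration):

```python
import sqlite3

# Extract: raw source records (stand-in for an upstream system).
raw_orders = [
    {"order_id": 1, "amount": "19.50", "region": " us-east "},
    {"order_id": 2, "amount": "5.25",  "region": "US-EAST"},
    {"order_id": 3, "amount": "42.50", "region": "eu-west"},
]

# Transform: cast types and normalize values in Python.
clean_orders = [
    (o["order_id"], float(o["amount"]), o["region"].strip().lower())
    for o in raw_orders
]

# Load: write into a warehouse table (sqlite3 stands in for Snowflake here).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, region TEXT)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", clean_orders)

# Analytics-style SQL over the loaded data.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(totals)  # [('eu-west', 42.5), ('us-east', 24.75)]
```

In practice the load and query steps would go through the Snowflake connector rather than sqlite3, and an orchestrator such as Airflow or dbt would schedule and chain the steps.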
Preferred Skills
• Experience integrating with ML/AI models, including LLMs
• Familiarity with data observability tools such as Great Expectations
• Exposure to metadata management platforms like DataHub or Collibra
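Great Expectations expresses data-quality checks as declarative "expectations" evaluated against a dataset. A bare-bones version of that idea can be sketched in plain Python; the check names and sample rows below are invented for illustration and do not use the Great Expectations API itself:

```python
def expect_column_values_not_null(rows, column):
    """Fail if any row has a null in the given column."""
    bad = [r for r in rows if r.get(column) is None]
    return {"success": not bad, "unexpected_count": len(bad)}


def expect_column_values_between(rows, column, low, high):
    """Fail if any non-null value falls outside [low, high]."""
    bad = [r for r in rows if r[column] is not None and not (low <= r[column] <= high)]
    return {"success": not bad, "unexpected_count": len(bad)}


rows = [{"amount": 10.0}, {"amount": None}, {"amount": -3.0}]
print(expect_column_values_not_null(rows, "amount"))
print(expect_column_values_between(rows, "amount", 0, 100))
```

A real observability setup would run checks like these inside the pipeline (e.g., as an Airflow task or dbt test) and alert on failures rather than just returning results.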
Soft Skills
• Excellent communication and collaboration skills
• Strong problem-solving abilities with a focus on data quality and pipeline scalability
Work Model
• Hybrid work arrangement: 3 to 4 days onsite per week in Houston, TX