

Third Republic
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a 12+ month contract, offering $120–140/hr. Remote work is available, focusing on cloud data platforms and AI/ML workloads. Key skills include Python, Spark, SQL, and experience with Databricks or Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
1120
-
🗓️ - Date
March 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Observability #Scala #Terraform #DataOps #Data Modeling #Consulting #Model Deployment #Snowflake #SQL (Structured Query Language) #Airflow #Cloud #ML (Machine Learning) #dbt (data build tool) #Spark SQL #Redshift #BigQuery #Data Quality #Data Pipeline #Kafka (Apache Kafka) #Data Processing #Spark (Apache Spark) #Data Engineering #AI (Artificial Intelligence) #Delta Lake #Deployment #Python #ETL (Extract, Transform, Load) #Databricks
Role description
Senior Contract Data Engineers Needed for High-Impact AI & Data Platform Transformation Program (Start April/May) – CONTRACT / FREELANCE ONLY
THE OPPORTUNITY
We’ve secured a major data platform and AI transformation engagement with a fast-growing consulting partner delivering enterprise-scale solutions across finance, healthcare, and technology.
This program focuses on building production-grade data platforms, scalable pipelines, and ML-ready architectures for critical client initiatives — not proof-of-concept work.
WHY THIS ROLE STANDS OUT
The Scale: Architect and deliver complex, cloud-native data platforms supporting enterprise AI/ML workloads
The Rates: $120–140/hr direct-to-customer for proven senior contractors
The Impact: Production systems, real clients, real data, real scale
WHO WE NEED — Senior Data Engineers Who Can Deliver Day-1 Value
Contract Type:
✅ 1099 or Corp-to-Corp (C2C) only
❌ No W2 / no third-party pass-throughs
Senior Data Engineering (Core Skills)
✔ Expert-level Python, Spark, SQL for large-scale data processing
✔ Strong experience with cloud data platforms (Databricks, Snowflake, BigQuery, Redshift)
✔ Real-time / streaming architectures (Kafka, Flink, Spark Streaming)
✔ Data modeling, warehousing & lakehouse design (Delta Lake, Iceberg, Parquet)
✔ Performance tuning, cost optimization, and scalable pipeline design
Modern Data Platform / DataOps
✔ End-to-end pipeline ownership (Airflow, dbt, Prefect, Dagster)
✔ CI/CD for data platforms
✔ Data quality, observability, and orchestration at scale
✔ Infrastructure-as-Code (Terraform preferred)
✔ Experience working in multi-team enterprise environments
AI / ML Platform Experience (Strong Plus)
✔ Preparing data for LLM / RAG / vector database workloads
✔ Feature stores and ML data pipelines
✔ MLOps / model deployment integration
✔ Experience supporting data for AI products in production
CONTRACT DETAILS
Duration: 12 months+ (extensions likely)
Rate: $120–140/hr
Location: Remote US (EST / PST preferred)
TIMELINE
Interviews: Happening now
Start dates: Rolling April / May
Offers moving quickly
HOW TO APPLY
Senior Data Engineers available on 1099 / C2C — DM resume now
Only 1 spot available — hiring immediately
#DataEngineering #SeniorDataEngineer #Databricks #Snowflake #MLOps #AIEngineering #Contract #C2C #Freelance #Hiring #TechJobs




