

Senior Data Engineer (Lakehouse Specialist) :: W2 Role
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer (Lakehouse Specialist) on a 1-year+ contract, with a pay rate of "TBD". It requires 5+ years of experience; proficiency in Databricks, Snowflake, AWS, GCP, SQL, and Python; and expertise in Lakehouse architecture. Remote or hybrid work available.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 29, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Lake St Louis, MO
Skills detailed: #Data Science #Delta Lake #Data Pipeline #Data Quality #Datasets #Big Data #Data Lake #ML (Machine Learning) #AWS (Amazon Web Services) #Data Analysis #dbt (data build tool) #Python #Data Architecture #Spark (Apache Spark) #Scala #Security #Databricks #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Apache Iceberg #Batch #Data Lakehouse #GCP (Google Cloud Platform) #Snowflake #SQL (Structured Query Language)
Role description
Senior Data Engineer (Lakehouse Specialist)
Client
Data Analytics Internal Practice Team
The team serves St. Louis University, Washington University, Mehrs Goodwill, and a handful of other active clients.
Communication
Excellent communication skills are required.
Work location
Remote or local; local candidates will work a hybrid schedule.
Client Location
Work hours follow the Central Time zone; the client is located in St. Louis, MO (Chesterfield, MO).
Term
1 year+ contract
Work Status
Green Card holders or U.S. Citizens (GC/USC) required
Key Skills
• Databricks, Snowflake, AWS, GCP, SQL, Python
• Lakehouse platform development
• Architecting and maintaining Lakehouse platforms with Delta Lake, Apache Iceberg, or Hudi
• Spark: developing and optimizing batch and streaming ETL pipelines
• Data Lakehouse design patterns and modern data architectures
Job Description
We are seeking a Senior Data Engineer with strong experience building and maintaining Data Lakehouse architectures. In this role, you'll design scalable data pipelines, optimize data models, and ensure high-performance data availability across structured and unstructured sources. You'll work closely with data analysts, data scientists, and business stakeholders to deliver reliable, high-quality datasets that support analytics, machine learning, and real-time decision-making.
Key Responsibilities
• Architect and maintain Lakehouse platforms using tools like Delta Lake, Apache Iceberg, or Hudi
• Develop and optimize batch and streaming ETL pipelines
• Implement data quality, governance, and security best practices
• Collaborate with cross-functional teams to deliver business-critical datasets
• Tune performance and ensure scalability for large-scale workloads
Qualifications
• 5+ years of data engineering experience
• Strong knowledge of cloud data platforms (e.g., Databricks, Snowflake, AWS, GCP)
• Proficiency in SQL, Python, and big data tools (Spark, dbt, etc.)
• Experience with data lakehouse design patterns and modern data architectures
If you are interested, or have any referrals, please share a resume at mukul@brightmindsol.com.