

Python Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Python Developer/Data Engineer on a 12-month contract, 100% onsite in downtown Houston. Requires 5+ years in Python, DBT, and Data Lakehouse technologies. Key skills include data pipeline architecture and data quality assurance.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 8, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Fixed Term
Security clearance: Unknown
Location detailed: Houston, TX
Skills detailed: #Data Pipeline #S3 (Amazon Simple Storage Service) #Data Lake #Snowflake #Data Lakehouse #Version Control #AWS (Amazon Web Services) #Cloud #Data Engineering #Pandas #Dremio #NumPy #Datasets #Data Integration #Visualization #Physical Data Model #Pytest #Programming #Python #Airflow #Storage #Data Quality #Apache Iceberg #SQL (Structured Query Language) #Data Vault #Kubernetes #Delta Lake #Apache Airflow #dbt (data build tool) #Vault
Role description
We're looking for a Python Developer/Data Engineer for a 12-month contract, 100% onsite in downtown Houston.
The Python Developer/Data Engineer will work closely with business domain experts to create an Enterprise Data Lakehouse to support data analytic use cases for the midstream oil and gas operations, engineering, and measurements business units.
Must Haves:
• Software development/software engineering experience
• 5+ years of Python
• Data Build Tool (DBT)
• Knowledge of Data Lakehouse technologies such as Apache Iceberg or Delta Lake
• Experience working with S3 object storage
Nice to Haves:
• Python UI development (e.g., Dash)
• Dremio
• Kubernetes/AWS EKS
• AWS Cloud
Responsibilities include:
• Design and implement reliable data pipelines to integrate disparate data sources into a single Data Lakehouse
• Design and implement data quality pipelines to ensure data correctness and build trusted datasets
• Design and implement a Data Lakehouse solution that accurately reflects business operations
• Assist with data platform performance tuning and physical data model support, including partitioning and compaction
• Provide guidance on data visualization and reporting efforts to ensure solutions are aligned with business objectives
Qualifications include:
• 5+ years as a Data Engineer designing and maintaining data pipeline architectures
• 5+ years of programming experience in Python and ANSI SQL
• 2+ years of development experience with DBT
• Various data modelling methods such as Star Schema, Snowflake, and Data Vault design
• Experience implementing a Data Lakehouse using a Medallion Architecture with Apache Iceberg on S3 object storage
• Various data integration patterns including ELT, Pub/Sub, and Change Data Capture
• Common Python data engineering packages (pandas, NumPy, PyArrow, pytest, scikit-learn, Boto3)
• Excellent communication skills and experience presenting complex concepts to technical and non-technical stakeholders
• Software development practices such as design principles and patterns, testing, refactoring, CI/CD, and version control
• Dremio, Apache Airflow, and Airbyte preferred