Oxenham Group

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract of unspecified length, offering a day rate of $520 USD. Key skills include Python, ETL/ELT, data warehousing (Snowflake, Redshift, BigQuery), and experience with Palantir Foundry is preferred. A Bachelor’s degree and 3+ years in data engineering are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
February 26, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Engineering #Storage #Airflow #Automation #Complex Queries #Docker #Cloud #Data Warehouse #SQLAlchemy #PySpark #Pandas #Python #Data Pipeline #EDW (Enterprise Data Warehouse) #Kubernetes #Metadata #Spark (Apache Spark) #Data Analysis #Data Quality #GCP (Google Cloud Platform) #Snowflake #Data Governance #SQL (Structured Query Language) #Apache Airflow #Data Vault #Spark SQL #Kafka (Apache Kafka) #Libraries #Databricks #Computer Science #Monitoring #Palantir Foundry #Data Science #Version Control #Dimensional Data Models #Azure #AWS (Amazon Web Services) #Scala #Vault #Data Catalog #Redshift #ML (Machine Learning) #ETL (Extract, Transform, Load) #Data Modeling #BigQuery #Data Management
Role description
Position Overview

We are seeking an experienced Data Engineer to design, build, and maintain scalable data pipelines and warehouse infrastructure that power critical business analytics and decision-making. The ideal candidate combines deep Python expertise with hands-on data warehouse experience and brings a passion for transforming raw data into reliable, well-modeled assets. Experience with Palantir Foundry is a strong differentiator for this role.

Key Responsibilities
• Design, develop, and optimize ETL/ELT pipelines to ingest, transform, and load data from diverse sources into cloud-based data warehouse environments
• Build and maintain dimensional data models, schemas, and table structures that support reporting, analytics, and machine learning workloads
• Write clean, production-grade Python code for data transformation, validation, orchestration, and automation tasks
• Develop and manage data workflows within Palantir Foundry, including ontology design, pipeline configuration, and dataset governance (preferred)
• Collaborate with data analysts, data scientists, and business stakeholders to translate requirements into robust data solutions
• Implement data quality frameworks including monitoring, alerting, testing, and lineage tracking
• Optimize query performance and storage efficiency across warehouse platforms (e.g., Snowflake, Redshift, BigQuery, or Databricks)
• Contribute to infrastructure-as-code practices, CI/CD pipelines, and version control standards for data assets

Required Qualifications
• Bachelor’s degree in Computer Science, Data Engineering, Information Systems, or a related field
• 3+ years of professional experience in a Data Engineering or similar data-focused role
• Strong proficiency in Python, including libraries such as Pandas, PySpark, SQLAlchemy, or Airflow
• Solid experience designing and working within enterprise data warehouse environments (Snowflake, Redshift, BigQuery, or equivalent)
• Advanced SQL skills with the ability to write complex queries, optimize performance, and model data effectively
• Familiarity with orchestration tools (Apache Airflow, Prefect, Dagster, or similar)
• Experience with cloud platforms (AWS, Azure, or GCP) and related data services
• Strong understanding of data modeling concepts (star schema, snowflake schema, data vault)

Preferred Qualifications
• Hands-on experience with Palantir Foundry, including pipeline development, ontology management, and Foundry-native transforms
• Experience with real-time or streaming data pipelines (Kafka, Kinesis, Spark Streaming)
• Familiarity with containerization (Docker) and orchestration (Kubernetes)
• Knowledge of data governance, data cataloging, and metadata management tools
• Prior experience in a regulated industry (defense, healthcare, finance) is a plus

What We Offer
• Competitive salary and comprehensive benefits package
• Opportunity to work with cutting-edge data platforms and technologies
• Collaborative, innovation-driven team culture
• Professional development support and career growth pathways
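For candidates gauging fit, a minimal sketch of the kind of transform-and-validate work the responsibilities describe (Python with Pandas): extract raw records, clean and type-cast them, then run a small data-quality gate before loading. The table, column names, and rules here are hypothetical illustrations, not part of the role or any employer system.

```python
# Minimal ETL-style sketch with Pandas: transform a raw extract and
# apply a data-quality gate before loading downstream.
# All column names and cleaning rules are hypothetical.
import pandas as pd

def transform_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Clean a raw orders extract: drop incomplete rows, normalize types."""
    df = raw.dropna(subset=["order_id", "amount"]).copy()
    df["order_id"] = df["order_id"].astype(int)
    df["amount"] = df["amount"].astype(float)
    df["region"] = df["region"].str.strip().str.upper()
    return df

def validate(df: pd.DataFrame) -> None:
    """Fail fast on bad data instead of loading it into the warehouse."""
    assert df["order_id"].is_unique, "duplicate order ids"
    assert (df["amount"] >= 0).all(), "negative amounts"

# Tiny in-memory extract standing in for a real source system.
raw = pd.DataFrame({
    "order_id": [1, 2, None],
    "amount": ["10.5", "20.0", "5.0"],
    "region": [" us-east ", "eu-west", "us-east"],
})
clean = transform_orders(raw)
validate(clean)
```

In a production pipeline the same transform and validation steps would typically run as tasks in an orchestrator such as Airflow, with the load step targeting a warehouse like Snowflake or BigQuery.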