

Oxenham Group
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown", offering a day rate of $600 USD. Candidates must have 7–10 years of experience in data engineering, advanced SQL and Python skills, and hands-on Snowflake experience. U.S. Person status is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date
December 30, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New Jersey, United States
-
🧠 - Skills detailed
#Data Quality #Metadata #Data Analysis #SQL (Structured Query Language) #Data Architecture #Airflow #Data Warehouse #Data Pipeline #Scala #Documentation #DataOps #Data Governance #Batch #Code Reviews #Snowflake #Data Management #Data Ingestion #ETL (Extract, Transform, Load) #Data Engineering #Automation #Data Science #Data Orchestration #Python #Storage #Data Modeling #Cloud #Data Processing #DevOps
Role description
Senior Data Engineer
Position Overview
We are seeking a Senior Data Engineer to support a large-scale, mission-critical data platform for a regulated U.S. government program. This role focuses on designing, building, and maintaining scalable data pipelines and analytics-ready data environments in the cloud. The ideal candidate brings deep experience with SQL, Python, and Snowflake, strong data modeling fundamentals, and a disciplined engineering mindset suitable for high-trust environments.
This position requires U.S. Person status due to the nature of the work.
Key Responsibilities
• Design, develop, and maintain scalable, reliable data pipelines supporting analytics, reporting, and downstream applications
• Build and optimize data ingestion, transformation, and aggregation workflows using SQL and Python
• Implement and manage data models in Snowflake optimized for performance, cost, and analytical use cases
• Partner with data analysts, data scientists, and application teams to translate business and mission requirements into technical solutions
• Ensure data quality, consistency, and integrity through validation frameworks and automated checks
• Optimize query performance, warehouse usage, and storage strategies within Snowflake
• Support batch and near-real-time data processing patterns as required by the platform
• Contribute to data architecture decisions, standards, and best practices
• Develop and maintain technical documentation for data pipelines, schemas, and operational processes
• Participate in code reviews and mentor junior engineers as needed
• Support production operations, troubleshooting, and root-cause analysis for data issues
Required Qualifications
• 7–10 years of experience in data engineering, data warehousing, or backend data platform development
• Advanced SQL expertise, including complex transformations, performance tuning, and query optimization
• Strong Python experience for data processing, orchestration, and automation
• Hands-on experience designing and operating Snowflake data warehouses in production environments
• Solid understanding of data modeling concepts (dimensional, relational, and analytical models)
• Experience building ETL/ELT pipelines and working with structured and semi-structured data
• Familiarity with cloud-based data architectures and modern data engineering practices
• Strong problem-solving skills and the ability to work independently in a fast-moving environment
• Excellent communication skills and comfort working with cross-functional stakeholders
• U.S. Person status required
Nice to Have
• Experience with data orchestration tools (e.g., Airflow or similar)
• Exposure to DevOps or DataOps practices, including CI/CD for data pipelines
• Experience working in regulated, government, or defense-related environments
• Familiarity with data governance, lineage, and metadata management concepts