

Strategic Staffing Solutions
Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on an 18+ month contract in West Des Moines, IA, offering $42-46/hr. Key skills include ETL/ELT pipeline design, advanced PySpark, and Apache Airflow. Cloud platform experience, particularly GCP, is required.
Country
United States
Currency
$ USD
Day rate
368
Date
March 14, 2026
Duration
More than 6 months
Location
Hybrid
Contract
W2 Contractor
Security
Unknown
Location detailed
West Des Moines, IA 50266
Skills detailed
#Metadata #Data Governance #Spark (Apache Spark) #Data Pipeline #Data Processing #Cloud #Scala #ETL (Extract, Transform, Load) #Data Modeling #GCP (Google Cloud Platform) #HDFS (Hadoop Distributed File System) #Database Systems #PySpark #SQL (Structured Query Language) #Datasets #Dataflow #Airflow #Batch #Apache Airflow #Hadoop #Data Quality #Programming #S3 (Amazon Simple Storage Service) #Data Management #Data Storage #BigQuery #Delta Lake #Storage #Data Engineering #Python
Role description
STRATEGIC STAFFING SOLUTIONS HAS AN OPENING!
This is a Contract Opportunity with our company that must be worked on a W2 only. No C2C eligibility for this position. Visa sponsorship is available! The details are below.
"Beware of scams. S3 never asks for money during its onboarding process."
Job Title: Senior Data Engineer
Contract Length: 18+ month contract
Location: West Des Moines, IA 50266
Work Schedule: 3 days onsite / 2 days remote
Pay: $42-46/hr on W2
The Data Engineer will design, develop, and maintain data pipelines and ETL/ELT workflows supporting batch and real-time data processing. This role focuses on building scalable data infrastructure, implementing modern data storage solutions, and ensuring reliable data delivery for reporting and downstream applications.
Key Responsibilities
Design and develop ETL/ELT workflows and data pipelines for batch and real-time processing.
Build and maintain data pipelines for reporting and downstream applications using open source frameworks and cloud technologies.
Implement operational and analytical data stores leveraging Delta Lake and modern database concepts.
Optimize data structures for performance and scalability across large datasets.
Collaborate with architects and engineering teams to ensure alignment with target state architecture.
Apply best practices for data governance, lineage tracking, and metadata management, including integration with Google Dataplex for centralized governance and data quality enforcement.
Develop, schedule, and orchestrate complex workflows using Apache Airflow, including designing and managing Airflow DAGs.
Troubleshoot and resolve issues in data pipelines to ensure high availability and reliability.
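The batch ETL/ELT responsibilities above can be sketched in miniature. This is an illustrative pattern only, using Python's standard library (`sqlite3`) as a stand-in for the Delta Lake / BigQuery targets named in the posting; all table names, column names, and sample rows are hypothetical.

```python
import sqlite3

# Minimal extract -> transform -> load sketch. The "orders" rows and the
# SQLite sink are hypothetical stand-ins for the posting's real sources
# (e.g. cloud object storage) and targets (Delta Lake / BigQuery tables).

def extract():
    # A real pipeline would read from object storage or a source database.
    return [
        {"order_id": 1, "amount": "120.50", "region": "IA"},
        {"order_id": 2, "amount": "80.00", "region": "IA"},
        {"order_id": 3, "amount": "15.25", "region": "MN"},
    ]

def transform(rows):
    # Type coercion plus a simple aggregate -- the kind of shaping that
    # PySpark would perform at scale across distributed partitions.
    totals = {}
    for row in rows:
        totals[row["region"]] = totals.get(row["region"], 0.0) + float(row["amount"])
    return [{"region": r, "total_amount": t} for r, t in sorted(totals.items())]

def load(rows, conn):
    # Idempotent upsert so a re-run of the batch does not duplicate data.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS region_totals "
        "(region TEXT PRIMARY KEY, total_amount REAL)"
    )
    conn.executemany(
        "INSERT OR REPLACE INTO region_totals (region, total_amount) "
        "VALUES (:region, :total_amount)",
        rows,
    )
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    run_pipeline(conn)
    print(conn.execute(
        "SELECT region, total_amount FROM region_totals ORDER BY region"
    ).fetchall())
    # -> [('IA', 200.5), ('MN', 15.25)]
```

The idempotent load step mirrors the "high availability and reliability" bullet: a failed run can be retried without corrupting the target table.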
Required Technical Skills
Strong understanding of data structures, data modeling, and lifecycle management.
Hands-on experience designing and managing ETL/ELT data pipelines.
Advanced PySpark skills for distributed data processing and transformation.
Experience implementing open table formats using Apache Iceberg.
Knowledge of the Hadoop ecosystem, including HDFS and Hive.
Experience with cloud platforms, including GCP (BigQuery, Dataflow), Delta Lake, and Dataplex for governance and metadata management.
Programming and orchestration using Python, Spark, and SQL.
Strong experience with Apache Airflow, including authoring and maintaining DAGs for complex workflows.
Strong understanding of relational and distributed database systems and reporting concepts.
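In practice, the Airflow requirement above means authoring DAG definition files like the fragment below. This is a schematic sketch only: the DAG id, schedule, and task callables are hypothetical, and it assumes an Airflow 2.x installation (it is not runnable on its own).

```python
# Schematic Airflow DAG fragment (hypothetical names and schedule).
# Requires apache-airflow 2.x; shown for illustration only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_orders(**context):
    ...  # pull the day's partition from the source system

def transform_orders(**context):
    ...  # e.g. submit a PySpark job for the heavy transformation

def load_orders(**context):
    ...  # write results to the warehouse / lake table

with DAG(
    dag_id="orders_daily",          # hypothetical pipeline name
    start_date=datetime(2026, 1, 1),
    schedule="@daily",              # `schedule_interval` on Airflow < 2.4
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_orders)
    transform = PythonOperator(task_id="transform", python_callable=transform_orders)
    load = PythonOperator(task_id="load", python_callable=load_orders)

    extract >> transform >> load    # linear dependency chain
```

The `>>` operator is Airflow's dependency syntax; more complex DAGs fan tasks out and back in the same way.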
Pay: $42.00 - $46.00 per hour
Work Location: Hybrid remote in West Des Moines, IA 50266






