

The Ash Group
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position on a 12-month W2 contract in Falls Church, VA, paying $65 per hour. It requires 8+ years of data engineering experience, 5+ years with AWS Glue and Python, and strong SQL skills, particularly with Amazon Redshift.
Country: United States
Currency: $ USD
Day rate: 520
Date: October 23, 2025
Duration: More than 6 months
Location: On-site
Contract: W2 Contractor
Security: Unknown
Location detailed: Falls Church, VA
Skills detailed: ETL (Extract, Transform, Load), Fivetran, Data Quality, AWS Glue, Data Pipeline, PySpark, SQL Queries, Data Warehouse, Airflow, Security, Data Governance, Snowflake, Kafka (Apache Kafka), Data Modeling, Data Engineering, Indexing, Cloud, Data Processing, Python, Database Design, Apache Airflow, Oracle, Redshift, Scala, Spark (Apache Spark), Amazon Redshift, Apache Kafka, Data Science, SQL (Structured Query Language), Datasets, AWS (Amazon Web Services), SQL Server
Role description
W2 Contract Only – No C2C – No 3rd Parties
Summary
The Ash Group is hiring a Programmer Analyst Principal (Data Engineer) for our client, a global leader in advanced systems and support for the defense, aerospace, and security sectors, based in Falls Church, VA.
In this role, you'll be designing, implementing, and optimizing large-scale data systems and ETL pipelines, with a strong focus on using Amazon Redshift and AWS services to ensure data quality and integrity for complex defense programs.
Compensation, Benefits, and Role Info
• Competitive pay rate of $65 per hour.
• Medical, dental, vision, and direct primary care benefits, plus, after six months of employment, a 4% matched 401(k) plan with immediate 100% vesting.
• Type: 12-month contract with potential extension or conversion.
• Location: On-site in Falls Church, VA.
What You'll Be Doing
• Design and implement large-scale ETL data pipelines using AWS Glue and Python/PySpark to ingest, transform, and load data from various sources (see the PySpark sketch after this list).
• Build and maintain robust data warehouses, focusing on Amazon Redshift, including data modeling and governance.
• Write and optimize complex, highly performant SQL queries across large datasets (Redshift, Oracle, SQL Server).
• Collaborate with cross-functional teams (data scientists, analysts) to understand requirements and deliver end-to-end data solutions.
• Troubleshoot, optimize performance, and resolve data-related issues such as pipeline failures and data quality bottlenecks.
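As a rough illustration of the Glue-style pipeline work described above, here is a minimal PySpark sketch. The bucket paths, column names, and table layout are placeholders, and a production AWS Glue job would typically add the GlueContext/Job boilerplate, incremental-load logic, and error handling.

    # Minimal PySpark ETL sketch (illustrative only; paths and columns are placeholders).
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_etl").getOrCreate()

    # Extract: read raw CSV files landed in S3.
    raw = spark.read.option("header", "true").csv("s3://example-raw-bucket/orders/")

    # Transform: deduplicate, enforce types, and drop unusable rows.
    orders = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .filter(F.col("amount").isNotNull())
    )

    # Load: write partitioned Parquet to a curated zone; a Redshift COPY (or the
    # Glue/Redshift connector) would load the warehouse tables from here.
    (orders.write
           .mode("overwrite")
           .partitionBy("order_date")
           .parquet("s3://example-curated-bucket/orders/"))

In an actual Glue job the same transformations are often expressed over DynamicFrames, but plain Spark DataFrames keep the sketch self-contained.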
What Weβre Looking For
β’ 8+ years of hands-on experience in data engineering, focusing on designing and implementing large-scale data systems.
β’ 5+ years of experience in building production-level ETL pipelines using AWS Glue and Python/PySpark.
β’ Deep proficiency in SQL, including query optimization, indexing, and performance tuning across data warehouses like Amazon Redshift.
β’ Strong understanding of database design principles, data modeling (star/snowflake schemas), and data governance.
β’ Experience with data processing/orchestration frameworks such as Apache Airflow, Apache Kafka, or Fivetran.
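For the orchestration experience mentioned above, an Apache Airflow DAG of the sort used here might look like the following sketch. The DAG id, schedule, and task bodies are hypothetical placeholders; a real deployment would likely start the Glue job (for example via boto3) and run a Redshift data-quality query instead of the print statements.

    # Minimal Airflow 2.x DAG sketch (ids, schedule, and task bodies are placeholders).
    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def trigger_glue_job(**_):
        # Placeholder: in practice this would start the Glue ETL job.
        print("starting Glue ETL job")

    def check_row_counts(**_):
        # Placeholder: in practice this would run a data-quality query against Redshift.
        print("validating loaded row counts")

    with DAG(
        dag_id="orders_etl_daily",
        start_date=datetime(2025, 1, 1),
        schedule="@daily",  # 'schedule_interval' on older Airflow 2.x releases
        catchup=False,
    ) as dag:
        etl = PythonOperator(task_id="run_glue_etl", python_callable=trigger_glue_job)
        dq = PythonOperator(task_id="data_quality_check", python_callable=check_row_counts)
        etl >> dq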
If you're a seasoned data engineering professional passionate about building scalable data solutions and driving innovation in cloud-based environments, we want to hear from you. This is an exciting opportunity to work on cutting-edge technologies, collaborate with cross-functional teams, and make a meaningful impact on data-driven decision-making. Apply now to be part of a forward-thinking organization where your expertise will shape the future of our data infrastructure.
#DataEngineer #DataEngineering #AWSEngineer #Redshift #ETL #PySpark #DataPipeline #FallsChurch #VirginiaJobs #Contract






