

Senior AWS Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior AWS Data Engineer on a contract basis (C2C, W2, or 1099) in Reston, VA, or Plano, TX. It requires 10+ years in data engineering and 5+ years with AWS services; AWS certification is preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 15, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Reston, VA
Skills detailed: #DevOps #Automation #Data Catalog #Version Control #Databases #Amazon Redshift #Linux #AWS Glue #GitHub #Strategy #Infrastructure as Code (IaC) #Database Management #Shell Scripting #Monitoring #Data Pipeline #Lambda (AWS Lambda) #Spark (Apache Spark) #Data Integrity #AWS (Amazon Web Services) #Computer Science #Logging #Python #Cloud #PySpark #Compliance #Data Governance #Consulting #Data Strategy #Data Warehouse #Scripting #Scala #Security #Data Processing #Terraform #Data Engineering #S3 (Amazon Simple Storage Service) #Unix #SQL (Structured Query Language) #AWS Lambda #ETL (Extract, Transform, Load) #Data Ingestion #GitLab #Amazon EMR (Amazon Elastic MapReduce) #SQL Queries #Redshift
Role description
About the Company:
Capitis Solutions Inc., founded in 2007, is a Maryland-based Information Technology consulting company specializing in implementing modern applications. Our philosophy of aligning with the client's organizational culture and providing cost-effective, quality solutions is key to our success. Our approach to problem solving is unique, and our clients benefit from the results in weeks rather than months.
Location: Reston, VA (primary) or Plano, TX (alternate)
Work Schedule: 3 days onsite, 2 days remote
Job Type: Contract (C2C, W2, or 1099)
Citizenship Requirements: Candidates must be U.S. citizens, Green Card holders, or H1B holders eligible to work on a C2C, W2, or 1099 basis. We are not sponsoring H1B visas for this position.
About the Role:
We are seeking a highly experienced Senior Data Engineer to join our Information Security Data Compliance Engineering group for a contract position. You will be a key contributor to a data governance project, building robust and scalable data pipelines to manage large volumes of enterprise data.
The ideal candidate will have extensive hands-on experience with Amazon EMR and AWS Glue, as these are the primary services for this role. You will be responsible for designing and implementing data solutions that ingest, transform, and manage data from various sources while ensuring data integrity and security. This position serves one of our large clients in the financial industry.
Responsibilities
β’ Develop Data Pipelines: Design, build, and maintain highly scalable and efficient data pipelines using Amazon EMR and AWS Glue as the core technologies. Secondary services will include AWS Lambda and AWS Step Functions.
β’ Data Processing at Scale: Develop and optimize PySpark programs to run on EMR clusters or as Glue Jobs. You will handle large-scale data ingestion and transformation from sources like APIs, S3, and file systems.
β’ Database Management: Utilize AWS Glue Data Catalog to define and manage data schemas. You will be responsible for extensive data inserts, updates, and management of Glue-based tables.
β’ SQL and Redshift Expertise: Write and optimize complex SQL queries, particularly for large tables within our Amazon Redshift data warehouse.
β’ Infrastructure as Code (IaC): Apply DevOps principles using GitHub for version control and either Terraform or AWS CloudFormation for infrastructure automation and CI/CD pipelines. Familiarity with GitLab is also a plus.
β’ Monitoring and Operations: Implement comprehensive monitoring, logging, and alerting to ensure the reliability and performance of data pipelines. Operate in a Unix/Linux environment for scripting and automation.
β’ Solution Architecture: Actively contribute to high-level solution design and architectural discussions, leveraging your deep expertise in EMR and Glue to shape our data strategy.
Qualifications:
β’ Overall Experience: 10+ years of professional experience in data engineering or a related field.
β’ Technical Expertise:
β’ 5+ years of experience with Python for scripting and automation.
β’ 2+ years of hands-on experience with Amazon EMR and AWS Glue.
β’ 7+ years of expertise in SQL and working with relational databases.
β’ 3+ years of experience with Unix/Linux shell scripting.
β’ 5+ years of professional experience with AWS services.
• Education: A B.S. or M.S. in Computer Science or a related technical field is highly preferred.
• DevOps Experience: Strong understanding of version control with GitHub and experience with IaC tools like Terraform or AWS CloudFormation.
• AWS Certification: An AWS certification (e.g., AWS Certified Data Analytics – Specialty) is highly desirable.