

Luna Data Solutions, Inc.
Cloud Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is a remote, long-term W2 contract for a Cloud Data Engineer; the pay rate is to be determined (TBD). It requires 5–9 years of AWS data engineering experience, proficiency in Python and SQL, and an AWS Associate-level certification.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: May 15, 2026
Duration: Unknown
Location: Remote
Contract: W2 Contractor
Security: Unknown
Location detailed: Georgia, United States
Skills detailed: #Data Catalog #Monitoring #DevSecOps #SageMaker #SQL (Structured Query Language) #Anomaly Detection #Metadata #Data Engineering #Data Access #Data Analysis #GraphQL #GitHub #Data Processing #Data Quality #Data Governance #Athena #IAM (Identity and Access Management) #Strategy #Security #API (Application Programming Interface) #AWS (Amazon Web Services) #Data Science #Cloud #ETL (Extract, Transform, Load) #Datasets #Amazon QuickSight #Redshift #DevOps #S3 (Amazon Simple Storage Service) #Scala #REST (Representational State Transfer) #Data Pipeline #Data Management #Python #BI (Business Intelligence) #Lambda (AWS Lambda) #Migration #ML (Machine Learning) #Documentation
Role description
We have a remote, long-term, W2-only contract opportunity for a Cloud Data Engineer. The Cloud Data Engineer is a hands-on technical practitioner responsible for delivering a high-impact portfolio of data engineering, application development, and cloud modernization workstreams for a public sector customer. This role spans the full spectrum from source control migration and CI/CD modernization to medallion-architecture data pipelines, exploratory data science, and end-user analytics applications. The ideal candidate brings deep AWS data platform expertise, strong software engineering fundamentals, and the ability to independently drive complex deliverables from design through production.
Responsibilities:
DevOps & Source Control Modernization
• Lead the migration from AWS CodeCommit to GitHub, including branch strategy, access controls, and audit trail preservation
• Establish GitHub Actions CI/CD pipelines with integrated security scanning (SAST, secrets detection, dependency vulnerability scanning)
• Define and enforce DevSecOps standards in alignment with HIPAA requirements, including pipeline gating, approval workflows, and environment promotion policies
Data Platform & Cloud PoC Delivery
• Design and execute a proof-of-concept for Amazon SageMaker Unified Studio, evaluating its capabilities for unified data engineering, model development, and analytics workflows
• Design and execute a proof-of-concept for AWS Lake Formation, including fine-grained access controls, data catalog governance, and integration with downstream consumers
• Document PoC findings, architecture recommendations, and go/no-go criteria for production adoption
Data Engineering & Pipeline Development
• Architect and implement a Medallion (Bronze/Silver/Gold) pipeline architecture for Gold-tier data processing, including data quality enforcement, transformation logic, and lineage tracking
• Build and maintain scalable, production-grade ETL/ELT pipelines using AWS-native services (Glue, Athena, Step Functions, Lambda, S3, Redshift)
• Perform Redshift cluster upgrades and implement health enhancements including query performance tuning, WLM optimization, and monitoring instrumentation
Application & API Development
• Lead full-stack development of a public-facing Data API site, including API design (REST/GraphQL), backend data access layers, authentication/authorization, and a front-end documentation portal
• Own end-to-end delivery of the AgencyData Explorer dashboard application built on Amazon QuickSight, including dataset modeling, calculated fields, row-level security, and embedded delivery
Data Science & Analytics
• Conduct manual exploratory data analysis (EDA) on contracts and vouchers datasets, including profiling, anomaly detection, distribution analysis, and documentation of findings
• Produce data quality reports and analytical narratives to support downstream decision-making and stakeholder review
General Delivery & Continuous Improvement
• Drive general cloud health enhancements including cost optimization reviews, tagging governance, IAM policy hardening, and service limit monitoring
• Maintain technical documentation, architecture decision records (ADRs), and knowledge transfer materials throughout the engagement
• Participate in sprint ceremonies, backlog refinement, and customer status reporting
Minimum Requirements
• 5–9 years of hands-on experience in data engineering, cloud application development, or DevOps on AWS
• Demonstrated experience with AWS data services including Redshift, Glue, Athena, S3, Lake Formation, and Step Functions
• Proficiency in Python and SQL for data pipeline development and exploratory analysis
• Experience with GitHub Actions or equivalent CI/CD platforms, including security tooling integration
• Familiarity with Amazon QuickSight or comparable BI/dashboarding platforms
• AWS Associate-level certification or higher in a relevant domain (Developer, Data Analytics, or Solutions Architect)
Preferences:
• Experience with SageMaker Unified Studio or the broader SageMaker ecosystem
• Prior delivery of Medallion/Lakehouse architecture patterns in a production environment
• Experience building and publishing REST or GraphQL APIs for external consumers
• Background in public sector or government data environments
• AWS Specialty certification in Data Analytics or Machine Learning
• Familiarity with data governance frameworks and metadata management practices
What We Offer
• A great opportunity to make a big impact and take ownership of technology initiatives
• Altruistic work
• Competitive compensation and benefits, including health, dental, vision, life and accident insurance, short-term disability insurance, and more!
Luna Data Solutions, Inc. (LDS) provides equal employment opportunities to all employees. All applicants will be considered for employment. LDS prohibits discrimination and harassment of any type regarding age, race, color, religion, sexual orientation, gender identity, sex, national origin, genetics, protected veteran status, and/or disability status. A 7-year background check is required for this role.