

Summit Human Capital
Remote Cloud Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Remote Cloud Data Engineer with a contract length of "unknown" and a pay rate of "unknown." Required skills include Python, ETL pipeline development, AWS services (S3, Glue, Athena, Lambda, RDS), and Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 7, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richmond, VA
-
🧠 - Skills detailed
#Microservices #Storage #ETL (Extract, Transform, Load) #Visualization #Data Governance #Lambda (AWS Lambda) #Data Processing #AWS (Amazon Web Services) #Infrastructure as Code (IaC) #Deployment #RDS (Amazon Relational Database Service) #Data Lake #Data Engineering #Python #Scala #S3 (Amazon Simple Storage Service) #Data Manipulation #Snowflake #Cloud #Terraform #Security #Athena
Role description
Summit Human Capital is seeking a highly motivated Cloud Data Engineer with software engineering and data manipulation expertise. The ideal candidate will meet the following criteria:
Requirements:
• Experience with full-stack software engineering or ability to build front-end data visualizations.
• Strong hands-on experience with Python development.
• Experience developing and maintaining ETL pipelines using AWS services.
• Experience with data warehousing such as Snowflake.
• Proven experience as a Data Engineer or Software Developer using cloud technologies.
• Hands-on experience with Infrastructure as Code (Terraform).
• Experience with AWS data services such as S3, Glue, Athena, Lambda, and/or RDS.
Responsibilities:
• Support data infrastructure using cloud-based solutions.
• Create scalable, secure, and automated data solutions.
• Architect and deliver data-driven applications and products that streamline analytics and large-scale data processing.
• Engineer and support ETL/ELT workflows, data lake environments, and system integrations leveraging AWS tools including Glue, Athena, Lambda, RDS, and S3.
• Apply Terraform to define and manage infrastructure programmatically, enabling automated provisioning and deployment.
• Build user-friendly visualization layers that surface key insights and reinforce data governance practices.
• Create and maintain APIs and microservices designed for secure, controlled access to data.
• Incorporate Snowflake into the organization's data landscape to enhance analytical capabilities and storage performance.
• Implement and monitor cloud security standards, ensuring adherence to RBAC policies and SSO-enabled access.
