

Tenth Revolution Group
AWS Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This is a fully remote AWS Data Engineer contract with a duration of at least 6 months. It requires senior-level experience in data engineering, AWS platforms, data modeling, Python, and SQL. Financial services industry experience is preferred.
Country
United States
Currency
$ USD
Day rate
Unknown
Date
January 17, 2026
Duration
More than 6 months
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#API (Application Programming Interface) #Python #SQL (Structured Query Language) #Version Control #Data Dictionary #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Data Modeling #Snowflake #Data Integrity #Consulting #Monitoring #Data Quality #Code Reviews #DMS (Data Migration Service) #Terraform #ETL (Extract, Transform, Load) #Aurora PostgreSQL #Documentation #DevOps #AWS Glue #Aurora #Data Pipeline #PostgreSQL #SQL Queries #GitHub #Data Engineering
Role description
Location: Fully Remote (US)
Duration: Minimum 6 months
Start: 2/9 onwards
Engagement: Contract
Industry: Financial Services
We are supporting a premium AWS consulting partner that is currently working through a pressing customer engagement within financial services. They are looking to engage a senior-level Data Engineer with strong enterprise data exposure to help stabilize, optimize, and extend a production-grade AWS data platform.
This is a hands-on role focused on data modeling, event-driven pipelines, and enterprise data assets, working closely with engineering and platform teams.
Key Responsibilities:
Data Modeling
• Maintain, extend, and optimize data models in Aurora PostgreSQL and Snowflake
• Modify logical and physical schemas to support new requirements and performance improvements
• Ensure strong data integrity and maintain documentation within the enterprise data dictionary
Enterprise Data & API Development
• Contribute to shared enterprise-wide data assets
• Write and optimize complex SQL queries supporting internal and external APIs
• Partner with application and platform teams to enable reliable downstream consumption
Data Pipeline Engineering
• Maintain and enhance event-driven data pipelines using AWS Glue, Lambda, DMS, and Python
• Refactor ETL logic, tune performance, and deliver new features
• Work with AWS EventBridge workflows
• Implement data quality checks, monitoring, and alerting
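As a rough illustration of the kind of work the pipeline bullets describe, the sketch below shows a Lambda-style handler that applies a basic data quality check to records from an EventBridge-shaped event. All names here (`validate_record`, `handler`, the field names) are hypothetical and not taken from the engagement itself.

```python
# Hypothetical sketch of an event-driven pipeline step with a data quality
# gate. The event shape and field names are illustrative assumptions.
from datetime import datetime, timezone

REQUIRED_FIELDS = {"account_id", "amount", "posted_at"}


def validate_record(record: dict) -> bool:
    """Basic data quality check: required fields present, amount numeric."""
    if not REQUIRED_FIELDS.issubset(record):
        return False
    return isinstance(record["amount"], (int, float))


def handler(event: dict, context=None) -> dict:
    """Lambda-style entry point: split incoming records into valid/rejected."""
    records = event.get("detail", {}).get("records", [])
    valid = [r for r in records if validate_record(r)]
    rejected = len(records) - len(valid)
    # In a real pipeline, valid records would be written downstream
    # (e.g. Aurora PostgreSQL or Snowflake) and rejects routed to a
    # dead-letter queue, with counts emitted as monitoring metrics.
    return {
        "processed_at": datetime.now(timezone.utc).isoformat(),
        "valid_count": len(valid),
        "rejected_count": rejected,
    }
```

In practice the returned counts would feed CloudWatch-style alerting rather than being discarded; the sketch only demonstrates the validate-then-route shape.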
System Integration & DevOps
• Contribute to and maintain CI/CD pipelines using GitHub Actions
• Follow strong version control practices, including code reviews via GitHub
• Collaborate on infrastructure topics, including Terraform and artifact management
• Troubleshoot and resolve production data pipeline issues
Required Experience
• Senior-level Data Engineering background with strong enterprise data exposure
• Deep experience building and supporting AWS-based data platforms
• Strong data modeling skills across transactional and analytical systems
• Hands-on experience with Snowflake and PostgreSQL
• Event-driven architecture experience using AWS native services
• Strong Python and SQL skills
• Comfortable operating in fast-moving, client-facing consulting environments
Nice to Have
• Prior experience in financial services
• Exposure to regulated or highly governed data environments
• Experience supporting API-driven data products






