

AWS Cloud and ETL Developer - Houston, TX
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Cloud and ETL Developer based in Houston, TX, with a contract length of "Unknown" and a pay rate of "Unknown." Key skills include AWS services, SQL, and ETL tools. Experience with big data and data modeling is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
July 31, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Walnut Creek, CA
🧠 - Skills detailed
#AWS CLI (Amazon Web Services Command Line Interface) #Compliance #IAM (Identity and Access Management) #Aurora #Apache Spark #Big Data #Spark (Apache Spark) #AWS (Amazon Web Services) #Data Storage #Databases #Oracle #RDS (Amazon Relational Database Service) #Snowflake #Talend #Airflow #Scala #Amazon RDS (Amazon Relational Database Service) #SageMaker #Data Modeling #Data Quality #Data Pipeline #Cloud #ETL (Extract, Transform, Load) #Archimate #CLI (Command-Line Interface) #SQL Server #Monitoring #Data Mapping #.Net #ERWin #Storage #Sqoop (Apache Sqoop) #Data Processing #Data Warehouse #Data Lake #AWS S3 (Amazon Simple Storage Service) #S3 (Amazon Simple Storage Service) #Amazon EMR (Amazon Elastic MapReduce) #Security #SQL (Structured Query Language)
Role description
Job Title: AWS Cloud and ETL Developer
Location: Houston, TX / Jersey City, NJ
Technical Skills:
• ETL Tools: Talend (nice to have)
• Databases: Snowflake, Oracle, Amazon RDS (Aurora, Postgres), DB2, SQL Server, and Cassandra
• Big Data and Amazon Services: Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, Apache Spark (see the Spark sketch after this list)
• Data Modeling Tools: ArchiMate (not mandated; secondary/preferred), Erwin, Oracle Data Modeler (secondary/preferred)
• Scheduling Tools: Autosys, SFTP, Airflow (preferred; not a blocker, as any resource can learn to use it)
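To ground the Big Data and Amazon Services line above, here is a minimal PySpark ETL sketch of the kind of job this role would run as an Amazon EMR step: read raw files from S3, apply a simple transformation, and write partitioned Parquet back to S3. The bucket names, paths, and columns are illustrative assumptions, not details from this posting.

```python
# Minimal EMR/Spark/S3 ETL sketch. All bucket names, paths, and columns
# below are hypothetical placeholders for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("s3-etl-sketch").getOrCreate()

# Extract: raw CSV files landed in a hypothetical ingest bucket.
raw = spark.read.option("header", True).csv("s3://example-ingest/trades/")

# Transform: normalize types and derive a date column for partitioning.
clean = (
    raw.withColumn("trade_ts", F.to_timestamp("trade_ts"))
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("trade_date", F.to_date("trade_ts"))
)

# Load: write curated, partitioned Parquet to a hypothetical data-lake zone.
clean.write.mode("overwrite").partitionBy("trade_date").parquet(
    "s3://example-curated/trades/"
)
spark.stop()
```

On EMR this would typically be submitted with spark-submit as a cluster step, with the s3:// paths resolved through EMRFS; the same read/transform/write pattern extends to Kafka sources on Amazon MSK.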
Key Responsibilities:
• Designing, building, and automating ETL processes using AWS services such as Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark.
• Developing and maintaining data pipelines to move and transform data from diverse sources into data warehouses or data lakes.
• Ensuring data quality and integrity through validation, cleansing, and monitoring of ETL processes (see the validation sketch after this list).
• Optimizing ETL workflows for performance, scalability, and cost efficiency within the AWS environment.
• Troubleshooting and resolving issues related to data processing and ETL workflows.
• Implementing and maintaining security measures and compliance standards for data pipelines and infrastructure.
• Documenting ETL processes, data mappings, and system architecture.
• Implementing security measures such as IAM roles and access controls (see the IAM sketch after this list).
• Diagnosing and resolving issues related to AWS services, infrastructure, and applications.
• Proficiency in big data tools and AWS services: including Apache Sqoop, AWS S3, Hue, AWS CLI, Amazon EMR, Amazon MSK, Amazon SageMaker, and Apache Spark, as relevant to data storage and processing.
• Strong SQL skills: for querying databases and manipulating data during the transformation process.
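As a sketch of the data-quality responsibility above: a simple validation gate that counts rows, null keys, and duplicates, and fails the run before bad data is published downstream. The path, the trade_id key column, and the zero-tolerance thresholds are assumptions for illustration.

```python
# Data-quality gate sketch; path, key column, and thresholds are
# hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()
df = spark.read.parquet("s3://example-curated/trades/")

total = df.count()
null_keys = df.filter(F.col("trade_id").isNull()).count()
dupes = total - df.dropDuplicates(["trade_id"]).count()

# Raising here fails the EMR step or Airflow task that runs this script,
# stopping the pipeline instead of propagating bad data.
if total == 0 or null_keys > 0 or dupes > 0:
    raise ValueError(
        f"DQ gate failed: rows={total}, null_keys={null_keys}, dupes={dupes}"
    )
spark.stop()
```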
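And for the IAM responsibility: a hedged sketch of scoping a pipeline role to least privilege with boto3, allowing reads from the ingest bucket and writes to the curated bucket only. The role name and bucket ARNs are hypothetical.

```python
# Least-privilege IAM sketch using boto3; role and bucket names are
# hypothetical placeholders.
import json
import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {   # read-only access to the hypothetical ingest bucket
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::example-ingest",
                "arn:aws:s3:::example-ingest/*",
            ],
        },
        {   # write-only access to the hypothetical curated bucket
            "Effect": "Allow",
            "Action": ["s3:PutObject"],
            "Resource": ["arn:aws:s3:::example-curated/*"],
        },
    ],
}

# Attach the policy inline to an existing job role.
iam.put_role_policy(
    RoleName="example-etl-job-role",
    PolicyName="etl-s3-least-privilege",
    PolicyDocument=json.dumps(policy),
)
```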
mohan@synergent.net