

Lead (AWS Data Lake/ETL Migration)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Lead (AWS Data Lake/ETL Migration) in Columbus, OH, offering a W2 contract at $48/hour or C2C at $54/hour. Requires 8-10 years of experience, strong AWS skills, and knowledge of ETL frameworks.
Country
United States
Currency
$ USD
Day rate
$384
Date discovered
July 26, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Columbus, OH
Skills detailed
#PostgreSQL #AWS (Amazon Web Services) #Data Governance #S3 (Amazon Simple Storage Service) #DMS (Data Migration Service) #AWS S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Metadata #DynamoDB #SQS (Simple Queue Service) #AWS DevOps #Migration #Security #Cloud #GCP (Google Cloud Platform) #Python #Spark (Apache Spark) #Scala #Terraform #Data Warehouse #Lambda (AWS Lambda) #PySpark #Azure #Data Lake #DevOps #Data Ingestion
Role description
Hi everyone,
• W2 CONTRACT & C2C ONLY
Job Title: Lead (AWS Data Lake/ETL Migration)
Location: Columbus, OH
Hybrid, 3 days onsite per week
Job Description:
Role: Tech Lead (AWS Data Lake/ETL Migration)
Location: Columbus, OH
Rate: $48/hour on W2; $54/hour on C2C
Experience: 8-10 years
Must have strong knowledge of implementing metadata-driven, reusable data ingestion, data quality (DQ) frameworks, ETL pipeline design, ETL orchestration, etc. Expertise in building high-performing, scalable, enterprise-grade Data Lakes/Data Warehouses, and in solution/technical architecture on AWS for Data Lake/Data Warehouse/Lakehouse platforms.
Required Skillset (Technical)
• Must have implementation knowledge of AWS services such as Step Functions, Lambda, Glue Workflows, and S3, plus PostgreSQL and Terraform
• Must have good knowledge of Data Lake/Data Warehouse/Lakehouse architecture, and strong communication skills for technical discussions with customers
• Must have design experience with metadata-driven, reusable ingestion and ETL frameworks
• Must have hands-on knowledge of Python and PySpark
• Must have hands-on experience with PostgreSQL
• Should also have knowledge of AWS SQS, pub/sub architecture, Kinesis Data Firehose, EKS, etc.
• Should have knowledge of AWS DevOps practices
Good-to-Have
• AWS infrastructure and security
• Knowledge of other AWS services such as DMS, CloudTrail, CloudWatch, DynamoDB, etc.
• Knowledge of other ETL tools
• Experience with implementations on other clouds (e.g., Azure/GCP)
• Knowledge of Data Governance, Data Modelling, etc.
• AWS Certified Solutions Architect - Associate
• Any Professional/Specialty AWS certification
NOTE: Please send updated resumes to vmahesh@galaxyitech.com or call me at 480-407-6915.