

Tech Genius inc
DevOps Engineer on Our W2
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a DevOps Engineer with 10+ years of experience, on a W2 contract for 6 months, based in Atlanta, USA. Key skills include AWS services, SQL, Python, and familiarity with CI/CD. Visa sponsorship available for OPT, H4, L2, H1.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 15, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Athena #IAM (Identity and Access Management) #Datasets #SQS (Simple Queue Service) #Terraform #Cloud #Databases #Security #Data Ingestion #AWS (Amazon Web Services) #DynamoDB #Data Science #Batch #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Data Modeling #PySpark #Python #SNS (Simple Notification Service) #RDS (Amazon Relational Database Service) #SQL (Structured Query Language) #Compliance #DevOps #Data Engineering #S3 (Amazon Simple Storage Service) #Data Quality #Lambda (AWS Lambda) #Schema Design #Redshift #Data Lake
Role description
Data engineer positions
Client: Deluxe Corporation
Visa: only OPT / H4 / L2 / H1
Exp: 10+ years
W2
Location: Atlanta, USA
Interview slots available
Key Responsibilities:
· Design, develop, and maintain batch and streaming ETL/ELT pipelines using AWS services (Glue, Lambda, Step Functions, etc.).
· Implement data ingestion frameworks from diverse sources (APIs, databases, streaming platforms).
· Ensure data quality, governance, and security across all pipelines.
· Build and optimize data lakes and warehouses leveraging Amazon S3, Redshift, Athena, and Lake Formation.
· Collaborate with data scientists, analysts, and business stakeholders to deliver reliable datasets.
· Monitor and troubleshoot data workflows, ensuring high availability and performance.
· Stay updated with emerging AWS technologies and recommend improvements.
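The batch/streaming pipeline duties above follow the standard extract-transform-load pattern; a minimal sketch of one Lambda-style transform step with a basic data-quality gate is shown below. The record schema, field names (`id`, `amount`), and handler shape are illustrative assumptions, not details from this posting:

```python
import json

def handler(event, context=None):
    """Lambda-style transform step: validate and reshape incoming records.

    Records missing required fields are routed to a reject list instead of
    silently dropped, so downstream data-quality checks can account for them.
    """
    clean, rejected = [], []
    for rec in event.get("records", []):
        # Data-quality gate: both fields must be present.
        if rec.get("id") is None or rec.get("amount") is None:
            rejected.append(rec)
            continue
        # Normalize types before loading to the warehouse/lake.
        clean.append({"id": str(rec["id"]), "amount": float(rec["amount"])})
    return {"clean": clean, "rejected": rejected}

result = handler({"records": [{"id": 1, "amount": "9.5"}, {"amount": 2}]})
print(json.dumps(result))
```

In a real Glue or Lambda deployment this logic would read from the event source (Kinesis, SQS, S3) and write to the target store rather than returning in-process lists.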
Required Skills & Qualifications
· Strong experience with AWS cloud services: S3, Glue, Redshift, EMR, Kinesis, Lambda, DynamoDB, RDS, Athena, SNS, SQS.
· Proficiency in SQL, Python, and PySpark.
· Knowledge of data modeling, schema design, and performance tuning.
· Familiarity with CI/CD pipelines and DevOps tooling (Terraform, CloudFormation).
· Experience with security best practices (IAM, encryption, compliance).





