Arbor TekSystems

Data Engineer with AWS

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with AWS expertise, offering a 12-month remote contract on a W2 basis. It requires 5+ years in data engineering and proficiency in Python and SQL; aviation industry experience is preferred.
🌎 - Country
United States
πŸ’± - Currency
$ USD
πŸ’° - Day rate
Unknown
πŸ—“οΈ - Date
February 12, 2026
πŸ•’ - Duration
More than 6 months
🏝️ - Location
Remote
πŸ“„ - Contract
W2 Contractor
πŸ”’ - Security
Unknown
πŸ“ - Location detailed
Dallas, TX
🧠 - Skills detailed
#Redshift #Python #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Jenkins #Observability #Data Integration #Terraform #Cloud #Automation #Data Engineering #Computer Science #Shell Scripting #Scripting #Deployment #Scala #Programming #Automated Testing #AWS Glue #Lambda (AWS Lambda) #Data Quality #Monitoring #Data Pipeline
Role description
Role: AWS Data Engineer
Location: Remote
Job Type: Contract (W2 only)
Duration: 12 months, with possible extension
Aviation industry experience is preferred.

Role Summary: The AWS Data Engineer will build and maintain secure, scalable, and automated data pipelines to support the ingestion, transformation, and curation of data products. This role will focus on implementing CI/CD pipelines, automated testing, and monitoring solutions.

Key Responsibilities:
Data Pipelines: Develop configuration-driven ingestion and transformation pipelines using AWS Glue, Lambda, and Redshift (see the first sketch after this description).
Automation: Implement CI/CD pipelines for automated build, test, and deployment of data products.
Testing: Establish automated test frameworks for schema validation, data quality, and performance testing (see the second sketch below).
Monitoring: Set up monitoring and observability dashboards for product-level metrics (CloudWatch, custom metrics).
Collaboration: Work with data modelers and architects to ensure alignment with business requirements.

Required Skills:
Programming: Proficiency in Python, SQL, and shell scripting.
AWS Expertise: Hands-on experience with AWS services (Glue, Redshift, S3, Lambda, CloudWatch).
Automation: Experience with CI/CD and infrastructure-as-code tools (Terraform, Jenkins, AWS CodePipeline).
Data Integration: Strong knowledge of ETL/ELT processes and frameworks.
Soft Skills: Strong problem-solving, collaboration, and communication skills.

Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
5+ years of experience in data engineering and pipeline development.
AWS Certified Data Analytics or AWS Certified Developer certification is a plus.
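
To make the pipeline responsibility concrete, here is a minimal, illustrative sketch of the kind of configuration-driven ingestion the role describes: a small Python script that reads a pipeline config from S3 and launches the matching Glue job via boto3. The bucket, key, config fields, and job name are hypothetical, not taken from the posting.

```python
"""Illustrative sketch only: bucket, key, and config layout are assumptions."""
import json

import boto3

s3 = boto3.client("s3")
glue = boto3.client("glue")


def start_configured_ingestion(config_bucket: str, config_key: str) -> str:
    """Read a pipeline config from S3 and start the matching Glue job.

    The config carries the job name and its arguments, so a new source
    can be onboarded by adding a config file rather than new pipeline code.
    """
    obj = s3.get_object(Bucket=config_bucket, Key=config_key)
    config = json.loads(obj["Body"].read())

    run = glue.start_job_run(
        JobName=config["job_name"],
        # Glue job arguments must be strings, keyed with a leading "--".
        Arguments={f"--{k}": str(v) for k, v in config.get("arguments", {}).items()},
    )
    return run["JobRunId"]


if __name__ == "__main__":
    # Hypothetical config location for demonstration.
    print(start_configured_ingestion("my-config-bucket", "pipelines/orders.json"))
```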
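
The testing responsibility could look like the following pytest sketch for schema validation and basic data quality on ingested records. The record shape and field names (flight_id, and so on) are invented for illustration, loosely themed on aviation data.

```python
"""Minimal pytest sketch of a schema/data-quality check; schema is assumed."""
import pytest

# Hypothetical expected schema for one ingested record.
EXPECTED_SCHEMA = {"flight_id": str, "departure": str, "delay_minutes": int}


def validate_record(record: dict) -> list[str]:
    """Return a list of violations for one ingested record."""
    errors = []
    for field, field_type in EXPECTED_SCHEMA.items():
        if field not in record:
            errors.append(f"missing field: {field}")
        elif not isinstance(record[field], field_type):
            errors.append(f"bad type for {field}: {type(record[field]).__name__}")
    return errors


@pytest.mark.parametrize(
    "record",
    [
        {"flight_id": "AA100", "departure": "DFW", "delay_minutes": 0},
        {"flight_id": "AA200", "departure": "ORD", "delay_minutes": 15},
    ],
)
def test_records_match_schema(record):
    assert validate_record(record) == []
```

In practice, checks like these would run inside the CI/CD pipeline the posting mentions, gating deployment of a data product on schema and quality tests passing.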