Mastek

Data Engineer - ***Green Card & Citizens W2 Only***

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Data Engineer position for Green Card holders and Citizens, focusing on AWS, Python, and Conversational Analytics. Contract length is unspecified, with competitive pay. Requires 2+ years of AWS experience, strong SQL skills, and knowledge of data pipelines.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 23, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Arizona, United States
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Athena #Data Engineering #Data Lineage #SQL (Structured Query Language) #Deployment #ETL (Extract, Transform, Load) #Data Pipeline #Monitoring #AWS (Amazon Web Services) #Logging #API (Application Programming Interface) #Python #Scala #Terraform #ML (Machine Learning) #DevOps #Security #Cloud #Redshift #Lambda (AWS Lambda) #S3 (Amazon Simple Storage Service) #SageMaker #Automation
Role description
We are looking for a Data Engineer with strong expertise in AWS, Python, and Conversational Analytics, with a focus on secure, scalable, and resilient systems. This role centers on developing knowledge bases and agents that allow business users to interact with data through natural language, while ensuring reliability, scalability, and efficiency across AWS cloud infrastructure.

Key Responsibilities
• Design, build, and optimize data pipelines using AWS.
• Implement security best practices across data and AI pipelines.
• Develop and maintain knowledge bases and agents, integrating structured and unstructured data sources.
• Implement and support machine learning workflows in SageMaker for model training, deployment, and monitoring.
• Leverage AWS Bedrock to build and deploy conversational AI solutions.
• Ensure data lineage and quality control across all conversational AI systems.
• Automate orchestration, monitoring, and logging for data pipelines and conversational agents.
• Build and maintain CI/CD pipelines to enable secure, automated, and reliable deployments.
• Collaborate with analytics, product, and business teams to design solutions that improve decision-making and data democratization.

Required Skills & Qualifications
• 2+ years of hands-on experience with AWS services (S3, Athena, Lambda, Step Functions, Redshift Serverless, SageMaker, Bedrock).
• Strong proficiency in Python for data engineering, automation, and API integration.
• Experience building knowledge bases and conversational agents, or equivalent.
• Strong SQL skills and familiarity with large-scale data environments.
• Solid understanding of ETL design, orchestration, and optimization in cloud platforms.

Preferred Qualifications
• Experience with LLM integration into business workflows.
• Familiarity with lakehouse frameworks (Iceberg, Delta) for scalable analytics.
• Experience with DevOps/Infrastructure-as-Code (Terraform, CloudFormation).
• Background in marketing analytics or customer intelligence use cases.