Realign LLC

Senior AWS Data Engineer-1

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior AWS Data Engineer in Torrance, CA, on a contract basis. It requires 5+ years in data engineering, 3+ years of AWS experience, and strong skills in Python and PySpark. Day 1 onsite required.
🌎 - Country
United States
💱 - Currency
Unknown
💰 - Day rate
Unknown
🗓️ - Date
October 3, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Torrance, CA
🧠 - Skills detailed
#Security #Compliance #RDS (Amazon Relational Database Service) #Monitoring #Datasets #ETL (Extract, Transform, Load) #Data Governance #DevOps #Data Engineering #S3 (Amazon Simple Storage Service) #Java #Airflow #Data Quality #Data Lake #Redshift #AWS Glue #Documentation #Scala #Data Integration #Apache Spark #Programming #BI (Business Intelligence) #Data Pipeline #Data Processing #Python #Athena #Cloud #Lambda (AWS Lambda) #AWS (Amazon Web Services) #PySpark #Hadoop #Spark (Apache Spark)
Role description
Job Type: Contract
Job Category: IT
Job Title: Senior AWS Data Engineer
Location: Torrance, CA (Day 1 Onsite)
Client: Wipro / Honda
Employment Type: Contract

Job Summary:
We are seeking a highly skilled Senior AWS Data Engineer with strong expertise in data engineering, AWS cloud services, and large-scale data integration. The ideal candidate will design, implement, and optimize end-to-end data pipelines, ensuring accuracy, scalability, and performance for enterprise analytics and business intelligence initiatives.

Key Responsibilities:

Data Integration & Pipeline Development
• Design and implement data workflows using AWS Glue, EMR, Lambda, MWAA (Airflow), and Redshift.
• Build scalable ETL/ELT processes leveraging PySpark, Apache Spark, and Python (see the illustrative sketch after this description).
• Ensure accurate extraction, transformation, and loading (ETL) of large datasets.

Data Quality & Integrity
• Implement validation, cleansing, and monitoring mechanisms within pipelines.
• Ensure compliance with governance, regulatory, and industry standards.

Performance Optimization
• Optimize workflows for performance, scalability, and cost-efficiency on AWS.
• Fine-tune queries and enhance Redshift performance by resolving bottlenecks.

Collaboration & Business Support
• Translate business requirements into technical specifications for data pipelines.
• Partner with analysts and business stakeholders to ensure timely data delivery for BI and analytics.

Documentation & Compliance
• Maintain technical documentation, workflow diagrams, and system specifications.
• Ensure adherence to data governance and compliance frameworks.

Required Skills & Qualifications:
• 5+ years in data engineering, ETL, and data warehousing.
• 3+ years of hands-on AWS experience (S3, EMR, Glue, Athena, Redshift, RDS, Spectrum, Airflow).
• 3+ years of programming experience (Python, Java, or Scala) with strong PySpark/Spark skills.
• 2+ years with CI/CD tools and modern DevOps practices.
• Strong knowledge of distributed data processing (Hadoop, Spark) and data lake architectures.
• Excellent problem-solving and optimization skills.
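For context on the ETL/ELT work described above, here is a minimal PySpark sketch of a batch job that reads raw data from S3, applies basic data-quality rules, and writes partitioned Parquet to a curated zone. The bucket paths, dataset, and column names (order_id, order_total, order_ts) are hypothetical illustrations, not details of the client's actual pipeline.

# Minimal PySpark ETL sketch under assumed paths and columns; intended to run
# on EMR or Glue, where the s3:// scheme is available to Spark.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-etl").getOrCreate()

# Extract: read raw JSON events from a landing bucket (path is hypothetical).
raw = spark.read.json("s3://example-landing-bucket/orders/2025/10/03/")

# Transform: basic cleansing and validation before loading downstream.
clean = (
    raw.dropDuplicates(["order_id"])                      # de-duplicate on a key
       .filter(F.col("order_total") >= 0)                 # simple data-quality rule
       .withColumn("order_date", F.to_date("order_ts"))   # derive a partition column
)

# Load: write partitioned Parquet to a curated zone; Redshift Spectrum or Athena
# can query it in place, or a separate COPY step can load it into Redshift.
(
    clean.write.mode("overwrite")
         .partitionBy("order_date")
         .parquet("s3://example-curated-bucket/orders/")
)

spark.stop()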