AWS Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an AWS Data Engineer with a contract length of "unknown," offering a pay rate of "$/hour." Key skills include AWS tools, PySpark, Python, and data integration. Requires 4-6+ years in data engineering and ETL processes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 30, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Torrance, CA
-
🧠 - Skills detailed
#Programming #Redshift #Python #Data Integration #RDS (Amazon Relational Database Service) #"ETL (Extract, Transform, Load)" #Data Quality #BI (Business Intelligence) #Databases #Computer Science #Agile #Cloud #Documentation #Snowflake #Data Pipeline #Apache Spark #Database Design #AWS (Amazon Web Services) #Datasets #Athena #Scala #AWS Glue #Security #Data Lake #Compliance #Schema Design #Data Engineering #Monitoring #Data Mart #Data Analysis #Lambda (AWS Lambda) #Data Warehouse #Spark (Apache Spark) #Data Security #S3 (Amazon Simple Storage Service) #PySpark #Data Processing #Data Governance
Role description
Develop and Maintain Data Integration Solutions:
• Design and implement data integration workflows using AWS Glue, EMR, Lambda, and Redshift.
• Demonstrate proficiency in PySpark, Apache Spark, and Python for processing large datasets.
• Ensure data is accurately and efficiently extracted, transformed, and loaded into target systems.

Ensure Data Quality and Integrity:
• Validate and cleanse data to maintain high data quality.
• Ensure data quality and integrity by implementing monitoring, validation, and error-handling mechanisms within data pipelines.

Optimize Data Integration Processes:
• Enhance the performance, scalability, and cost-efficiency of data integration workflows on AWS cloud infrastructure to meet SLAs.
• Apply solid knowledge of data analysis and data warehousing concepts (star and snowflake schema design, dimensional modeling, and reporting enablement).
• Identify and resolve performance bottlenecks, fine-tune queries, and optimize data processing to enhance Redshift's performance.
• Regularly review and refine integration processes to improve efficiency.

Support Business Intelligence and Analytics:
• Translate business requirements into technical specifications and coded data pipelines.
• Ensure timely availability of integrated data for business intelligence and analytics.
• Collaborate with data analysts and business stakeholders to meet their data requirements.

Maintain Documentation and Compliance:
• Document all data integration processes, workflows, and technical and system specifications.
• Ensure compliance with data governance policies, industry standards, and regulatory requirements.

What will this person be working on?
• The IT Data Integration Engineer (AWS Data Engineer) is tasked with the design, development, and management of data integration processes to ensure seamless data flow and accessibility across the organization.
• This role is pivotal in integrating data from diverse sources, transforming it to meet business requirements, and loading it into target systems such as data warehouses or data lakes.
• The aim is to support the CX business in data-driven decision-making by providing high-quality, consistent, and accessible data.

Position Success Criteria (Desired) - 'WANTS'
• Bachelor's degree in computer science, information technology, or a related field; a master's degree can be advantageous.
• 4-6+ years of experience in data engineering, database design, and ETL processes.
• 5+ years with programming languages such as PySpark and Python.
• 5+ years of experience with AWS tools and technologies (S3, EMR, Glue, Athena, Redshift, Postgres, RDS, Lambda, PySpark).
• 3+ years of experience working with databases, data marts, and data warehouses.
• Proven experience in ETL development, system integration, and CI/CD implementation.
• Experience with complex database objects to move changed data across multiple environments.
• Solid understanding of data security, privacy, and compliance.
• Excellent problem-solving and communication skills.
• Good communication skills to collaborate effectively with multi-functional teams.
• Participate in agile development processes, including sprint planning, stand-ups, and retrospectives.
• Provide technical guidance and mentorship to junior developers.
• Attention to detail and a commitment to data quality.
• Continuous learning mindset to keep up with evolving technologies and best practices in data engineering.
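The "validate and cleanse" duty described above can be sketched as a minimal Python routine. This is an illustrative assumption, not the employer's actual pipeline: the field names (`customer_id`, `amount`, `updated_at`) and rules are hypothetical, and on the job this logic would typically run inside a PySpark or AWS Glue job before loading into Redshift:

```python
# Minimal sketch of a validate-and-cleanse step before loading to a target
# system. Field names and rules are hypothetical examples, not from the posting.

REQUIRED_FIELDS = ("customer_id", "amount", "updated_at")

def validate(record):
    """Return a list of data-quality errors for one record (empty means clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = record.get("amount")
    if amount not in (None, ""):
        try:
            if float(amount) < 0:
                errors.append("negative amount")
        except (TypeError, ValueError):
            errors.append("non-numeric amount")
    return errors

def cleanse(records):
    """Split records into loadable rows and rejects paired with their errors,
    so only validated data reaches the target system (e.g. Redshift)."""
    good, bad = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            bad.append((rec, errs))
        else:
            good.append(rec)
    return good, bad
```

Keeping rejects alongside their error reasons (rather than silently dropping them) supports the monitoring and error-handling requirement: the `bad` list can be written to a quarantine location and surfaced in pipeline alerts.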