

AWS Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Engineer in Torrance, CA, hybrid for over 6 months, with a pay rate of $142k-$152k. Requires 4-6+ years in data engineering, proficiency in AWS tools, PySpark, Python, and strong data quality skills.
Country: United States
Currency: $ USD
Day rate: $690.91
Date discovered: August 31, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Torrance, CA
Skills detailed: #Security #ETL (Extract, Transform, Load) #Data Warehouse #Data Security #Lambda (AWS Lambda) #Snowflake #PySpark #Data Processing #Schema Design #Scala #Data Integration #Computer Science #Agile #AWS Glue #RDS (Amazon Relational Database Service) #Data Quality #Python #Data Pipeline #Data Analysis #Databases #Database Design #Spark (Apache Spark) #Monitoring #Data Mart #Programming #Data Engineering #Data Lake #Compliance #Datasets #Data Governance #Cloud #Redshift #Apache Spark #AWS (Amazon Web Services) #Documentation #BI (Business Intelligence) #Athena #S3 (Amazon Simple Storage Service)
Role description
We are currently accepting resumes for an AWS Data Engineer position in Torrance, CA.
This position is Hybrid. (Onsite - Mandatory 4 days a week)
Salary range: $142k-$152k
Benefits offered: Medical, Vision, Dental, 401 K
3rd Party Suppliers: Not accepting 3rd party suppliers for this role
Travel: Yes, 5% yearly
The selected candidate will perform the following duties:
Daily Tasks Performed
Develop and Maintain Data Integration Solutions:
" Design and implement data integration workflows using AWS Glue/EMR, Lambda, Redshift
" Demonstrate proficiency in Pyspark, Apache Spark and Python for data processing large datasets
" Ensure data is accurately and efficiently extracted, transformed, and loaded into target systems.
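The extract-transform-load pattern these duties describe can be sketched in plain Python. This is a minimal illustration only: a real pipeline would run as a Glue or EMR job against S3 and Redshift, and the record layout and field names here are invented for the example.

```python
# Minimal ETL sketch: extract raw records, normalize them, load into a target.
# In production the "target" would be a Redshift staging table, not a list.

def extract(rows):
    """Simulate reading raw records from a source system."""
    return list(rows)

def transform(records):
    """Normalize field names and types before loading."""
    out = []
    for r in records:
        out.append({
            "customer_id": int(r["id"]),
            "name": r["name"].strip().title(),      # tidy whitespace and casing
            "amount_usd": round(float(r["amount"]), 2),
        })
    return out

def load(records, target):
    """Simulate loading into a target table; returns the row count loaded."""
    target.extend(records)
    return len(records)

raw = [{"id": "1", "name": "  alice smith ", "amount": "20.50"}]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
```

The same three-stage shape carries over directly to a Glue/PySpark job, where each stage becomes a DataFrame operation instead of a loop.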
Ensure Data Quality And Integrity
" Validate and cleanse data to maintain high data quality.
" Ensure data quality and integrity by implementing monitoring, validation, and error handling mechanisms within data pipelines
Optimize Data Integration Processes
" Enhance the performance, optimization of data workflows to meet SLAs, scalability of data integration processes and cost-efficiency on AWS cloud infrastructure.
" Solid knowledge on Data Analysis and Data Warehousing concepts (star/snowflake schema design, dimensional modeling, and reporting enablement).
" Identify and resolve performance bottlenecks, fine-tuning queries, and optimizing data processing to enhance Redshift's performance
" Regularly review and refine integration processes to improve efficiency.
Support Business Intelligence And Analytics
" Translate business requirements to technical specifications and coded data pipelines
" Ensure timely availability of integrated data for business intelligence and analytics.
" Collaborate with data analysts and business stakeholders to meet their data requirements.
Maintain Documentation And Compliance
" Document all data integration processes, workflows, and technical & system specifications.
" Ensure compliance with data governance policies, industry standards, and regulatory requirements.
What will this person be working on
The IT Data Integration Engineer / AWS Data Engineer is tasked with the design, development, and management of data integration processes to ensure seamless data flow and accessibility across the organization. This role is pivotal in integrating data from diverse sources, transforming it to meet business requirements, and loading it into target systems such as data warehouses or data lakes. The aim is to support the CX business in its data-driven decision-making by providing high-quality, consistent, and accessible data.
Position Success Criteria (Desired) - 'WANTS'
Bachelor's degree in computer science, information technology, or a related field. A master's degree can be advantageous.
" 4-6+ years of experience in data engineering, database design, ETL processes,
" 5+ in programming languages such as PySpark, Python
" 5+ years of experience with AWS tools and technologies (S3, EMR, Glue, Athena, RedShift, Postgres, RDS, Lambda, PySpark)
" 3+ years of experience of working with databases/ data marts/data warehouses
" Proven experience in ETL development, system integration, and CI/CD implementation.
" Experience in complex database objects to move the changed data across multiple environments
" Solid understanding of data security, privacy, and compliance.
" Excellent problem-solving and communication skills.
" Display good communication skills to effectively collaborate with multi-functional teams
" Participate in agile development processes including sprint planning stand-ups and retrospectives
" Provide technical guidance and mentorship to junior developers
" Attention to detail and a commitment to data quality.
" Continuous learning mindset to keep up with evolving technologies and best practices in data engineering.
Note: Qualified applicants with arrest or conviction records will be considered for employment in accordance with the Los Angeles County Fair Chance Ordinance for Employers and the California Fair Chance Act.
UNICON International, Inc. is an Equal Opportunity Employer.
If you are interested in working for an organization where honesty, integrity, and quality are among the core principles, click apply today!