

AWS Data Architecture
Featured Role | Apply direct with Data Freelance Hub
This role is for an AWS Data Architecture position in Torrance, CA (Hybrid), on a contract basis. Key skills include AWS expertise, data modeling, ETL development, and proficiency in Python and SQL. Experience with data governance and data engineering is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 26, 2025
Project duration: Unknown
Location type: Hybrid
Contract type: Unknown
Security clearance: Unknown
Location detailed: Torrance, CA
Skills detailed:
#Data Engineering #Conceptual Data Model #Data Pipeline #Athena #Programming #AWS (Amazon Web Services) #Data Modeling #Data Quality #Data Governance #S3 (Amazon Simple Storage Service) #Physical Data Model #AWS Glue #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Architecture #Logical Data Model #Classification #Migration #Cloud #Batch #IICS (Informatica Intelligent Cloud Services) #Python #Normalization #Spark (Apache Spark) #Scala #Lambda (AWS Lambda) #Data Cleansing #Data Migration #PySpark #Storage #Data Lineage #Data Lake #Data Processing #Informatica #Redshift #Snowflake #Databases #Data Ingestion
Role description
Position: AWS Data Architecture
Location: Torrance, CA (Hybrid)
Duration: Contract
Job Description:
AWS Data Architecture
Key skills: The role requires specific skills in developing Conceptual Data Models (CDM), Logical Data Models (LDM), and Physical Data Models (PDM), along with data modeling, integration, data migration, and ETL development on the AWS platform.
Key Responsibilities
- Architect and implement a scalable data hub solution on AWS using best practices for data ingestion, transformation, storage, and access control
- Define data models, data lineage, and data quality standards for the DataHub
- Select appropriate AWS services (S3, Glue, Redshift, Athena, Lambda) based on data volume, access patterns, and performance requirements
- Design and build data pipelines to extract, transform, and load data from various sources (databases, APIs, flat files) into the DataHub using AWS Glue, AWS Batch, or custom ETL processes
- Implement data cleansing and normalization techniques to ensure data quality
- Manage data ingestion schedules and error-handling mechanisms
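To give a flavor of the cleansing and normalization work described above, here is a minimal plain-Python sketch of a record-level transform such as a Glue job or Lambda function might apply. The field names and rules (trim whitespace, lowercase emails, reject rows missing a key) are illustrative assumptions, not requirements from the posting:

```python
def cleanse_record(record):
    """Apply illustrative cleansing and normalization rules to one record.

    Assumed rules (not specified in the posting): trim whitespace on all
    string fields, lowercase email addresses, and reject records that are
    missing the primary key.
    """
    # Trim whitespace on every string-valued field
    cleaned = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    if not cleaned.get("customer_id"):
        return None  # reject rows with no primary key
    if cleaned.get("email"):
        cleaned["email"] = cleaned["email"].lower()  # normalize email case
    return cleaned

raw = [
    {"customer_id": "C001 ", "email": " Alice@Example.COM "},
    {"customer_id": "", "email": "orphan@example.com"},  # dropped: no key
]
clean = [r for r in (cleanse_record(rec) for rec in raw) if r is not None]
```

In a real pipeline the same logic would typically run inside a PySpark `DataFrame` transformation rather than over Python dicts.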
Required Skills and Experience:
- AWS Expertise: Deep understanding of AWS data services, including S3, Glue, Redshift, Athena, Lake Formation, Step Functions, CloudWatch, and EventBridge
- Data Modeling: Proficiency in designing dimensional and snowflake data models for data warehousing and data lakes
- Data Engineering Skills: Experience with ETL/ELT processes, data cleansing, data transformation, and data quality checks. Experience with Informatica IICS and ICDQ is a plus
- Programming Languages: Proficiency in Python, SQL, and potentially PySpark for data processing and manipulation
- Data Governance: Knowledge of data governance best practices, including data classification, access control, and data lineage tracking
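As a rough illustration of the data classification concern in the last requirement, the sketch below tags columns by sensitivity level, the kind of metadata that might later feed Lake Formation tags or access-control policies. The keyword-to-level mapping is invented for the example:

```python
# Illustrative only: a toy column classifier. The keyword -> sensitivity
# mapping below is an assumption, not a rule from the posting.
SENSITIVITY_RULES = {
    "ssn": "restricted",
    "email": "confidential",
    "name": "confidential",
}

def classify_columns(columns):
    """Return a {column: sensitivity} map; unmatched columns are 'public'."""
    result = {}
    for col in columns:
        level = "public"
        for keyword, tag in SENSITIVITY_RULES.items():
            if keyword in col.lower():
                level = tag
                break
        result[col] = level
    return result

tags = classify_columns(["customer_email", "order_total", "SSN"])
```

In practice these tags would be stored alongside the catalog entries (e.g. in the Glue Data Catalog) so that access control and lineage tooling can act on them.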