

Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (Remote – PST Hours) with a contract length of "unknown" and a pay rate of "unknown." Candidates should have 4+ years of experience building and monitoring data pipelines, proficiency in Snowflake, dbt, Python, and AWS, and expertise in data modeling and handling sensitive data.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
July 26, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Engineering #Data Pipeline #Vault #Datasets #AWS (Amazon Web Services) #Compliance #Data Modeling #dbt (data build tool) #Data Quality #Data Vault #ETL (Extract, Transform, Load) #Monitoring #Schema Design #Computer Science #Security #Batch #Python #Scala #Lambda (AWS Lambda) #SNS (Simple Notification Service) #Telematics #Snowflake #Data Ingestion
Role description
Job Title: Senior Data Engineer (Remote – PST Hours)
Location: Remote (Must be authorized to work in the U.S. and able to support Pacific Time hours)
We are seeking a Senior Data Engineer to support a large-scale initiative focused on designing, building, and deploying multiple data pipelines and a robust ELT (Extract, Load, Transform) system. The goal is to ingest, validate, and manage high-volume data from multiple sources related to electric vehicle (EV) charging infrastructure.
The engineered pipelines will enable secure ingestion, transformation, and daily refresh of data used for internal analytics and dashboard reporting. The data supports various state-funded programs and includes sensitive information like geolocation and telematics data for compliance and performance monitoring.
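As a rough illustration of the pipeline pattern described above, the sketch below loads a daily micro-batch of charging-session files from a stage into Snowflake and then applies an incremental MERGE rather than a full rebuild. All table, stage, and column names are hypothetical, and credentials are shown inline only for brevity; none of this comes from the posting itself.

import snowflake.connector  # pip install snowflake-connector-python

# Hypothetical connection details; in practice these would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.us-west-2",
    user="elt_service",
    password="***",
    warehouse="ELT_WH",
    database="EV_PROGRAMS",
    schema="RAW",
)
cur = conn.cursor()

# 1. Micro-batch load: copy the latest session files from an external stage
#    into a raw landing table (names are illustrative).
cur.execute("""
    COPY INTO RAW.CHARGING_SESSIONS
    FROM @RAW.EV_STAGE/sessions/
    FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
""")

# 2. Incremental refresh: merge only new or changed sessions into the
#    analytics table to keep daily compute costs under control.
cur.execute("""
    MERGE INTO ANALYTICS.CHARGING_SESSIONS t
    USING RAW.CHARGING_SESSIONS s
      ON t.SESSION_ID = s.SESSION_ID
    WHEN MATCHED THEN UPDATE SET
      t.KWH_DELIVERED = s.KWH_DELIVERED,
      t.ENDED_AT = s.ENDED_AT
    WHEN NOT MATCHED THEN INSERT (SESSION_ID, STATION_ID, KWH_DELIVERED, STARTED_AT, ENDED_AT)
      VALUES (s.SESSION_ID, s.STATION_ID, s.KWH_DELIVERED, s.STARTED_AT, s.ENDED_AT)
""")
conn.close()

In a production setup this step would typically be orchestrated on a daily schedule and expressed as dbt models rather than raw SQL, in line with the responsibilities below.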
Key Responsibilities
• Design and implement scalable data pipelines for high-volume, real-time data ingestion.
• Develop and maintain ELT processes using dbt, Snowflake, Python, and AWS technologies.
• Ensure high standards of data quality, security (including PII), and cost optimization.
• Support geocoding workflows to enrich address data with latitude and longitude coordinates (a sketch of this enrichment step follows this list).
• Build models using Data Vault 2.0 and Star Schema designs.
• Optimize Snowflake performance and manage incremental refresh strategies.
• Collaborate with cross-functional teams to ensure data usability and accessibility for analytics.
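The geocoding responsibility above could look roughly like the following enrichment step. The geopy Nominatim client is used purely as a stand-in for whatever geocoding service the project actually relies on, and the function and identifier names are hypothetical.

from geopy.geocoders import Nominatim  # pip install geopy; stand-in geocoder

geolocator = Nominatim(user_agent="ev-charging-elt")  # hypothetical application identifier

def enrich_with_coordinates(address: str):
    """Return (latitude, longitude) for a station address, or (None, None) if not found."""
    location = geolocator.geocode(address)
    if location is None:
        return None, None
    # Coordinates count as sensitive geolocation data, so in the real pipeline
    # the enriched output would land only in access-controlled tables.
    return location.latitude, location.longitude

# Example usage before loading a record into the warehouse:
lat, lon = enrich_with_coordinates("123 Main St, Sacramento, CA")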
Required Qualifications
• 4+ years of experience authoring and monitoring data pipelines.
• At least 3 years of hands-on experience with:
  • Snowflake, dbt, Python, and AWS services (SES/SNS, Lambda using Boto3, file systems); a sketch of the Lambda-to-SNS pattern follows this section.
  • Micro-batch and near real-time ingestion techniques.
  • Data modeling (Data Vault 2.0, Star Schema).
  • Geocoding workflows within ELT pipelines.
  • Snowflake incremental refreshes and cost optimization for large datasets.
  • Handling PII and sensitive data in secure data environments.
  • Data quality frameworks and management practices.
• Bachelor's or advanced degree in Computer Science, Engineering, Information Systems, Math, or related field.
• Note: Equivalent experience may be considered in place of a degree. Completion of a Data Engineering bootcamp + 1 year of experience may also be considered.
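For the AWS side of the stack (Lambda with Boto3 and SNS), a bare-bones handler might look like the sketch below: it reacts to a new batch file landing in S3 and publishes a status notification to an SNS topic. The bucket layout, key structure, and topic ARN are illustrative assumptions, not details taken from the posting.

import json
import boto3

sns = boto3.client("sns")

# Hypothetical topic used for pipeline status notifications.
STATUS_TOPIC_ARN = "arn:aws:sns:us-west-2:123456789012:ev-pipeline-status"

def handler(event, context):
    """Lambda entry point for S3 'object created' events on raw batch files."""
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Validation and the load into Snowflake would happen here (omitted).

        # Notify downstream consumers that a new micro-batch has been ingested.
        sns.publish(
            TopicArn=STATUS_TOPIC_ARN,
            Subject="EV charging batch ingested",
            Message=json.dumps({"bucket": bucket, "key": key}),
        )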