Morton

Data Warehouse Developer (8777)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Developer (8777) with a contract length of "unknown," offering a pay rate of "unknown." It requires expertise in AWS Glue, Snowflake, and SQL, along with a Bachelor's degree in a related field and relevant certifications preferred. The position is fully onsite in North Chesterfield, VA.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 7, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Chesterfield County, VA
-
🧠 - Skills detailed
#Datasets #Scala #Compliance #Data Quality #Security #Strategy #Data Ingestion #PySpark #ETL (Extract, Transform, Load) #Computer Science #Data Modeling #Data Strategy #Data Reconciliation #Data Profiling #AWS (Amazon Web Services) #Spark (Apache Spark) #SQL (Structured Query Language) #AWS Glue #Data Transformations #Databases #Data Warehouse #Data Migration #Python #Clustering #Storage #Schema Design #Documentation #Data Pipeline #Automation #Cloud #Snowflake #Data Governance #SnowPipe #Migration #Data Engineering
Role description
Job Description

Morton is seeking an experienced Data Warehouse Developer to support the Data Strategy initiative and help build a modern, enterprise-grade data platform for our client. This role focuses on designing, developing, and optimizing data pipelines and warehouse structures using AWS Glue, Snowflake, and Snowpipe. You will be responsible for transforming raw, disparate data into reliable, scalable, and well-governed datasets that power analytics, reporting, and operational decision-making across the agency. This role is FULLY ONSITE in North Chesterfield, VA.

Responsibilities
• Design, build, and maintain ETL/ELT pipelines using AWS Glue, including Glue Jobs, Workflows, Crawlers, and Catalog integration.
• Develop and optimize Snowflake data models, schemas, and warehouse structures to support enterprise reporting and analytics.
• Implement and manage Snowpipe for continuous data ingestion from structured and semi-structured sources.
• Collaborate with internal and external teams to understand data requirements, source system structures, and integration needs.
• Build scalable, secure, and high-performance data ingestion and transformation processes aligned with best practices.
• Perform data profiling, data quality assessments, and root-cause analysis for data issues across complex datasets.
• Support data migration activities, including mapping, cleansing, validation, and reconciliation.
• Develop and maintain documentation for data pipelines, data models, and operational processes.
• Work closely with business stakeholders, SMEs, and technical teams to translate requirements into robust data engineering solutions.
• Ensure compliance with data governance, security, and privacy standards across all data assets.
• Optimize Snowflake performance through clustering, partitioning, caching, and query tuning.
• Support ad hoc data engineering requests related to new integrations, data transformations, or enhancements.
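The migration duties above include mapping, validation, and reconciliation of datasets. As a rough illustration of what a reconciliation check involves (the table rows and function names here are hypothetical, not part of the client's actual stack), a count-and-checksum comparison between source and target can be sketched in plain Python:

```python
# Illustrative sketch of a post-migration reconciliation check.
# Row data and function names are hypothetical examples only.
import hashlib

def row_fingerprint(row):
    """Hash a row's values into a stable fingerprint for comparison."""
    joined = "|".join(str(v) for v in row)
    return hashlib.sha256(joined.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows):
    """Compare source and target datasets by row count and content hashes."""
    report = {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "count_match": len(source_rows) == len(target_rows),
    }
    src_hashes = {row_fingerprint(r) for r in source_rows}
    tgt_hashes = {row_fingerprint(r) for r in target_rows}
    # Rows present in the source but absent from the target, and vice versa.
    report["missing_in_target"] = len(src_hashes - tgt_hashes)
    report["unexpected_in_target"] = len(tgt_hashes - src_hashes)
    return report

source = [(1, "alice"), (2, "bob"), (3, "carol")]
target = [(1, "alice"), (2, "bob")]
print(reconcile(source, target))
```

In practice the same count-and-hash comparison would run inside a Glue job or as Snowflake queries over the migrated tables rather than in-memory Python, but the logic is the same.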
Qualifications
• Proven experience as a Data Warehouse Developer, Data Engineer, or similar role.
• Hands-on expertise with AWS Glue, including Glue Studio, Glue Jobs (PySpark/Python), Glue Catalog, and Glue Workflows.
• Strong experience with Snowflake, including schema design, Snowflake SQL, performance tuning, and data sharing.
• Practical experience implementing Snowpipe for automated ingestion from cloud storage.
• Strong SQL skills and experience working with large-scale relational and cloud-based databases.
• Proficiency in Python for ETL/ELT development and automation.
• Solid understanding of data modeling, data warehousing concepts, and ETL best practices.
• Experience with data quality, data validation, and data reconciliation processes.
• Ability to analyze complex data structures and troubleshoot data pipeline issues with precision.
• Strong communication skills and ability to collaborate with cross-functional teams.
• Proficiency with Microsoft Office tools, especially Excel, for data validation and documentation.

Education & Experience
• Bachelor's degree in Computer Science, Information Systems, Data Engineering, or a related field.
• Relevant certifications (AWS, Snowflake) are a plus.
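The qualifications above call for experience with data quality and data validation. As a small, hedged sketch of what data profiling means here (column names and records are made-up examples, not client data), per-column null and distinct counts can be computed like this:

```python
# Illustrative data-profiling sketch: per-column null counts and
# distinct-value counts. Records and column names are hypothetical.
def profile(records, columns):
    """Compute null and distinct counts for each column across records."""
    stats = {col: {"nulls": 0, "values": set()} for col in columns}
    for rec in records:
        for col in columns:
            value = rec.get(col)
            if value is None:
                stats[col]["nulls"] += 1
            else:
                stats[col]["values"].add(value)
    return {
        col: {"nulls": s["nulls"], "distinct": len(s["values"])}
        for col, s in stats.items()
    }

rows = [
    {"id": 1, "state": "VA"},
    {"id": 2, "state": None},
    {"id": 3, "state": "VA"},
]
print(profile(rows, ["id", "state"]))
```

On warehouse-scale data the equivalent profile would typically be expressed as aggregate SQL in Snowflake (e.g. `COUNT`, `COUNT(DISTINCT ...)`) rather than looped Python, but the metrics gathered are the same.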