PDS

Sr. Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer with a contract length of "unknown," offering a day rate of $680 USD. Key skills include AWS, Snowflake, SQL, and Python. Requires 5+ years of experience in Data Engineering and familiarity with data governance.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
680
🗓️ - Date
October 29, 2025
🕒 - Duration
Unknown
🏝️ - Location
Unknown
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Pipeline #Data Quality #SnowPipe #SQL (Structured Query Language) #Data Governance #Data Modeling #Data Engineering #Lambda (AWS Lambda) #Data Management #Cloud #Automation #Redshift #Snowflake #CRM (Customer Relationship Management) #Data Integration #API (Application Programming Interface) #Data Ingestion #Python #DevOps #Migration #Data Architecture #AWS Glue #REST (Representational State Transfer) #ETL (Extract, Transform, Load) #Athena #S3 (Amazon Simple Storage Service) #AWS (Amazon Web Services) #Metadata
Role description
Key Responsibilities
• Design, build, and maintain robust data ingestion and transformation pipelines using AWS and Snowflake.
• Support data management migration activities, including mapping, data quality validation, and data integration between legacy and target systems.
• Collaborate with data architects, data stewards, and business stakeholders to ensure consistent data definitions and lineage.
• Integrate data from diverse sources (ERP, CRM, marketing, and external systems) into unified data models.
• Implement and optimize ETL/ELT processes using tools like AWS Glue, Python, and SQL.
• Develop and maintain data models, metadata, and data quality frameworks to support master data initiatives.
• Monitor, troubleshoot, and optimize data pipelines for performance, cost, and reliability.
• Contribute to automation, DevOps, and CI/CD practices within the data engineering environment.
Required Qualifications
• 5+ years of professional experience in Data Engineering or related roles.
• Strong experience with the AWS data ecosystem (e.g., S3, Glue, Lambda, Redshift, Athena, EMR).
• Proven experience working with Snowflake (data modeling, performance tuning, Snowpipe, Streams & Tasks).
• Hands-on experience with data platform integrations preferred.
• Strong SQL and Python skills for data transformation and automation.
• Familiarity with data governance, data quality, and metadata management concepts.
• Experience with API integrations, REST/SOAP services, and cloud-native data architecture patterns.
• Strong understanding of data modeling (conceptual, logical, and physical) and master data domains (Customer, Product, Vendor, etc.).