Smart IT Frame LLC

Data Integration Lead (ETL With Snowflake, AWS)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Integration Lead (ETL with Snowflake, AWS) on a hybrid contract in Arlington, VA. Requires 10-12 years of experience, expertise in ETL, AWS, Snowflake, and advanced SQL, along with strong leadership and data governance skills.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
November 1, 2025
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Hybrid
-
πŸ“„ - Contract
Unknown
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
Arlington, VA
-
🧠 - Skills detailed
#Ab Initio #Data Science #IAM (Identity and Access Management) #AWS (Amazon Web Services) #Scala #ETL (Extract, Transform, Load) #Clustering #Snowflake #Data Lineage #Leadership #Lambda (AWS Lambda) #Automation #Data Management #Deployment #Data Integration #Data Engineering #Data Pipeline #Cloud #Data Quality #Complex Queries #Dimensional Modelling #Metadata #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Documentation #Compliance
Role description
Role: Data Integration Lead (ETL with Snowflake, AWS)
Location: Arlington, VA - Hybrid (4 days/week)
Type: Contract

Summary:
• Experienced Data Engineer with 10–12 years of hands-on expertise in designing and scaling enterprise-grade data solutions. Proven track record in building robust ETL pipelines, architecting cloud-native data platforms, and driving performance across large-scale systems. Adept at translating business needs into technical solutions, mentoring teams, and optimizing data workflows for analytics and operational excellence.

Responsibilities:
• ETL/ELT Development: Architect and maintain high-performance data pipelines using Ab Initio, handling complex transformations and large data volumes.
• Cloud Data Engineering: Build and optimize data platforms on AWS, leveraging services like S3, Lambda, Glue, and IAM for secure, scalable workflows.
• Snowflake Expertise: Design efficient schemas, implement clustering strategies, and tune performance for analytics workloads in Snowflake.
• Advanced SQL: Develop complex queries, stored procedures, and data validation logic to support reporting, analytics, and downstream systems.
• Data Modelling & Governance: Lead efforts in dimensional modelling, metadata management, and data lineage to ensure consistency and compliance.
• Performance & Quality: Conduct tuning across ETL jobs and cloud components; implement data quality frameworks to ensure reliability.
• Cross-Functional Collaboration: Partner with analysts, data scientists, and business stakeholders to deliver scalable, value-driven solutions.
• Mentorship & Leadership: Guide junior engineers, enforce best practices, and contribute to architectural decisions and roadmap planning.
• Innovation & Automation: Evaluate new tools, drive automation initiatives, and continuously improve pipeline efficiency and deployment velocity.
• Leverage industry best practices and methods.
• Define documentation to support the implementation of best practices.
• Strong communication and stakeholder management skills.