

SilverSearch, Inc.
Principal Data Architect & ETL Engineering Lead
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a contract-to-hire Principal Data Architect & ETL Engineering Lead role, remote during EST hours. It requires 10+ years in data architecture; expertise in Snowflake, AWS, and ETL/ELT development; and a Bachelor's degree.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Warehouse #Data Engineering #BI (Business Intelligence) #Automation #Informatica Cloud #Informatica #dbt (data build tool) #Data Management #API (Application Programming Interface) #AWS Lambda #Data Governance #AWS (Amazon Web Services) #SAP #Lambda (AWS Lambda) #Data Integration #Collibra #Snowflake #Alation #Mathematics #Apache Airflow #Data Architecture #Scala #Python #Data Quality #Computer Science #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Observability #ETL (Extract, Transform, Load) #Monitoring #Strategy #Batch #GitHub #Cloud #Leadership #REST (Representational State Transfer) #Documentation #Metadata #Airflow #Data Pipeline
Role description
Position Summary
We’re seeking a highly technical, hands-on Principal Data Architect and ETL/ELT Lead to drive the design, development, and modernization of ETL/ELT processes in support of enterprise analytics and business intelligence. This role is ideal for a seasoned data engineer and architect who enjoys building scalable, cloud-based data platforms and guiding the evolution of data integration strategy.
You’ll play a pivotal role in aligning enterprise data architecture with business objectives—designing robust data pipelines, optimizing data flow, and leading the transition from legacy systems to modern, code-driven data solutions. The ideal candidate is both an architect and builder—comfortable writing code, mentoring engineers, and implementing best-in-class data practices.
Key Responsibilities
• Partner with senior leadership to translate business objectives into scalable data architecture strategies and align technology priorities with analytics and operational goals.
• Architect and implement modern, cloud-native data pipelines (batch and streaming) using Snowflake, AWS, Informatica Cloud, and Python.
• Lead the modernization of ETL/ELT workflows, guiding the transition from Informatica to dbt, Dagster, and Airflow for flexible, version-controlled data transformation and orchestration; a minimal sketch of this pattern follows the list.
• Design and maintain enterprise data models and domain architectures, integrating diverse systems such as Salesforce, NetSuite, and SAP into a unified, governed data platform.
• Develop and maintain API-driven integrations, including REST/SOAP and event-driven architectures, to enable seamless data flow across applications; see the landing-step sketch after the list.
• Establish and enforce data engineering best practices—including testing, CI/CD, monitoring, and performance optimization.
• Collaborate closely with data engineers, analysts, and product teams to translate requirements into scalable, automated technical solutions.
• Mentor and coach data engineers, fostering strong coding, documentation, and operational standards.
• Champion data governance, lineage, and observability, ensuring all data flows are reliable, auditable, and well-documented.
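To make the dbt/Dagster/Airflow direction concrete, here is a minimal, hypothetical Dagster sketch of the kind of version-controlled, code-driven pipeline this role would champion. The asset names, source system, and filtering rule are illustrative assumptions, not part of this posting.

```python
# Hypothetical sketch: a two-step, code-driven pipeline expressed as Dagster
# assets. Asset and field names are illustrative assumptions only.
import dagster as dg


@dg.asset
def raw_salesforce_accounts() -> list[dict]:
    # Placeholder extract step; a real pipeline would call the Salesforce API
    # or read files already landed in S3.
    return [{"id": "001", "name": "Acme", "is_deleted": False}]


@dg.asset
def clean_accounts(raw_salesforce_accounts: list[dict]) -> list[dict]:
    # Transform step: drop soft-deleted rows, the sort of logic a legacy
    # Informatica mapping might encode and dbt/Dagster would replace.
    return [row for row in raw_salesforce_accounts if not row["is_deleted"]]


defs = dg.Definitions(assets=[raw_salesforce_accounts, clean_accounts])

if __name__ == "__main__":
    # Materialize both assets locally as a quick smoke test.
    dg.materialize([raw_salesforce_accounts, clean_accounts])
```

In a real setup the transform would more likely live in dbt models orchestrated by Dagster or Airflow; the point here is only the version-controlled, testable shape of the workflow.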
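For the API-driven integration bullet, a hedged sketch of an event-driven landing step: a small AWS Lambda handler that pulls from a REST endpoint and lands raw JSON in S3. The endpoint, bucket, and key layout are invented for illustration, and the requests package would need to be bundled with the function.

```python
# Hypothetical API-to-S3 landing step (endpoint, bucket, and key prefix are
# illustrative assumptions, not from the posting).
import json
from datetime import datetime, timezone

import boto3
import requests

S3_BUCKET = "example-data-lake"                 # assumed bucket name
API_URL = "https://api.example.com/v1/orders"   # assumed REST endpoint

s3 = boto3.client("s3")


def handler(event, context):
    # Pull one page of records from the upstream REST API.
    resp = requests.get(API_URL, timeout=30)
    resp.raise_for_status()
    records = resp.json()

    # Land the raw payload in S3, partitioned by load timestamp, so the
    # warehouse (e.g. Snowflake via an external stage) can pick it up.
    key = f"raw/orders/{datetime.now(timezone.utc):%Y/%m/%d/%H%M%S}.json"
    s3.put_object(Bucket=S3_BUCKET, Key=key, Body=json.dumps(records))
    return {"records_landed": len(records), "s3_key": key}
```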
Required Qualifications
• 10+ years of hands-on experience in data architecture, data engineering, and ETL/ELT development at enterprise scale.
• Proven expertise with Snowflake, AWS (Lambda, Glue, S3, Kinesis), and Informatica Cloud.
• Strong proficiency in SQL and Python for automation, transformation, and data quality validation; an illustrative check follows this list.
• Experience integrating platforms such as Salesforce, SAP Commerce Cloud, NetSuite, and third-party APIs into centralized data warehouses.
• Familiarity with modern data transformation and orchestration tools such as dbt, Dagster, and Apache Airflow.
• Deep understanding of data governance, semantic modeling, and metadata management (e.g., Collibra, Alation).
• Track record of leading architecture and engineering delivery end-to-end—from concept through production.
• Bachelor’s degree in Computer Science, Engineering, Mathematics, or related discipline.
• Snowflake SnowPro Core or Data Engineer certification preferred.
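As one concrete flavor of the "SQL and Python for data quality validation" requirement, here is a minimal, hypothetical check against Snowflake using the snowflake-connector-python package. The table name, key column, and credential environment variables are assumptions made for illustration.

```python
# Hypothetical data quality check: assert no duplicate business keys in a
# Snowflake table. Connection details and table name are assumptions.
import os

import snowflake.connector

CHECK_SQL = """
    SELECT account_id, COUNT(*) AS n
    FROM analytics.clean_accounts   -- assumed schema.table
    GROUP BY account_id
    HAVING COUNT(*) > 1
"""


def check_unique_keys() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    try:
        dupes = conn.cursor().execute(CHECK_SQL).fetchall()
        if dupes:
            raise ValueError(f"{len(dupes)} duplicate account_id values found")
    finally:
        conn.close()


if __name__ == "__main__":
    check_unique_keys()
```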
Preferred Attributes
• Passion for working with data at scale and enabling teams to deliver trustworthy, analytics-ready data.
• Ability to thrive in a hybrid strategic/technical role that balances hands-on engineering with architectural leadership.
• Experience building CI/CD pipelines, version-controlled transformations, and testable data workflows using dbt and GitHub Actions (or similar tools); a sample unit test appears after this list.
• Strong communicator, capable of explaining complex technical designs to both technical and non-technical stakeholders.
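To illustrate the "testable data workflows" attribute, a hedged sketch of the kind of pytest check a GitHub Actions job might run on every pull request alongside dbt build and dbt test. The helper function and data are invented for illustration.

```python
# Hypothetical unit test for a transformation helper, the kind of check a
# CI workflow would run on each pull request. The function is invented.
import pytest


def dedupe_latest(rows: list[dict], key: str, version: str) -> list[dict]:
    """Keep only the highest-version row per business key."""
    latest: dict = {}
    for row in rows:
        k = row[key]
        if k not in latest or row[version] > latest[k][version]:
            latest[k] = row
    return list(latest.values())


def test_dedupe_keeps_latest_version():
    rows = [
        {"id": 1, "updated_at": "2025-01-01"},
        {"id": 1, "updated_at": "2025-06-01"},
        {"id": 2, "updated_at": "2025-03-01"},
    ]
    out = dedupe_latest(rows, key="id", version="updated_at")
    assert {(r["id"], r["updated_at"]) for r in out} == {
        (1, "2025-06-01"),
        (2, "2025-03-01"),
    }


if __name__ == "__main__":
    pytest.main([__file__])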
Contract-to-hire opportunity
No sponsorship is offered at this time
Remote, working EST hours