

Contract Role: Senior Data Engineer in Los Angeles, CA (Onsite from Day 1)
Featured Role | Apply directly with Data Freelance Hub
This role is a long-term contract for a Senior Data Engineer in Los Angeles, CA, requiring expertise in Snowflake, Azure Data Factory, and data pipeline optimization. Strong SQL skills and relevant certifications are preferred. Onsite work from Day 1 is mandatory.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 27, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Los Angeles, CA
Skills detailed: #Leadership #Databricks #ETL (Extract, Transform, Load) #GIT #Data Management #Version Control #Kafka (Apache Kafka) #Informatica #ADF (Azure Data Factory) #Snowflake #Data Integration #Agile #Semantic Models #Scala #SQL (Structured Query Language) #Data Pipeline #Azure #Azure Data Factory #Data Quality #Data Engineering #ML (Machine Learning) #Data Access #Synapse #Data Modeling #dbt (data build tool) #Data Processing #Project Management
Role description
Senior Data Engineer
Los Angeles, CA (Onsite from Day 1)
Long Term Contract
Mandatory Skills:
• Snowflake
• Experience with Azure Data Factory, Azure Synapse, Informatica, dbt, and Kafka.
• Strong SQL skills for querying and transforming data.
• Leverage Direct Lake and Delta Table integration for high-performance data access and real-time analytics.
• Collaborate with data engineering teams to ensure efficient data pipelines and refresh strategies.
• Build and optimize semantic models and table structures for performance and scalability.
• Understanding of data modeling best practices and performance tuning.
• Knowledge of CI/CD tools and version control (Git, Tabular Editor, TMDL).
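The "strong SQL skills for querying and transforming data" requirement can be illustrated with a common transform: keeping only the latest row per key before loading a dimension table. The sketch below uses SQLite via Python's standard library purely as a runnable stand-in; the same ROW_NUMBER() windowed-dedupe pattern works in Snowflake, whose SQL is largely ANSI-compatible. Table and column names are hypothetical.

```python
import sqlite3

# Hypothetical staging table: keep only the most recent row per customer_id,
# a typical transform before loading a dimension table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE stg_customers (customer_id INT, name TEXT, updated_at TEXT)")
conn.executemany(
    "INSERT INTO stg_customers VALUES (?, ?, ?)",
    [
        (1, "Ada", "2025-09-01"),
        (1, "Ada Lovelace", "2025-09-15"),  # newer version of customer 1
        (2, "Grace", "2025-09-10"),
    ],
)

# Windowed dedupe: rank rows per key by recency, keep rank 1.
latest = conn.execute(
    """
    SELECT customer_id, name FROM (
        SELECT customer_id, name,
               ROW_NUMBER() OVER (
                   PARTITION BY customer_id ORDER BY updated_at DESC
               ) AS rn
        FROM stg_customers
    ) WHERE rn = 1
    ORDER BY customer_id
    """
).fetchall()
print(latest)  # [(1, 'Ada Lovelace'), (2, 'Grace')]
```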
Job Summary
• The Senior Technical Lead will be responsible for leading technical teams and overseeing the implementation of projects related to Snowflake, Azure Data Factory (ADF), and Databricks. The role involves ensuring the successful delivery of data solutions and optimizing data pipelines for efficient performance.
• Lead and manage technical teams, providing guidance and support in the implementation of Snowflake, Azure Data Factory (ADF), and Databricks projects.
• Design, develop, and maintain data pipelines using Snowflake, Azure Data Factory (ADF), and Databricks to support business requirements.
• Work closely with stakeholders to gather requirements, identify opportunities for data analytics, and propose data-driven solutions.
• Monitor and troubleshoot data pipelines, ensuring data quality, reliability, and performance.
• Stay updated with the latest trends and technologies in data management and analytics, incorporating best practices into projects.
• Collaborate with cross-functional teams to integrate data solutions with existing systems and applications.
• Provide technical expertise and mentorship to team members, promoting a culture of learning and innovation.
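The monitoring bullet above (ensuring data quality, reliability, and performance) typically translates into automated checks gating each pipeline run. The sketch below is a minimal data-quality gate of that kind; the thresholds, column names, and sample batch are illustrative assumptions, not from the posting.

```python
# Minimal data-quality gate a pipeline step might run after a load:
# row-count floor plus per-column null-rate ceiling.
def check_batch(rows, required_cols, min_rows=1, max_null_rate=0.1):
    """Return a list of human-readable failures (empty list = pass)."""
    failures = []
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
        return failures
    for col in required_cols:
        nulls = sum(1 for r in rows if r.get(col) is None)
        rate = nulls / len(rows)
        if rate > max_null_rate:
            failures.append(
                f"column {col!r} null rate {rate:.0%} exceeds {max_null_rate:.0%}"
            )
    return failures

# Hypothetical batch: both columns have a 33% null rate, over the 25% limit.
batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 2, "amount": None},
    {"order_id": None, "amount": 5.0},
]
print(check_batch(batch, ["order_id", "amount"], max_null_rate=0.25))
```

A run that returns a non-empty list would fail the pipeline step (or page the on-call engineer) before bad data propagates downstream.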
Skill Requirements
• Proficiency in Snowflake, including data modeling, querying, and performance optimization.
• Strong experience in Azure Data Factory (ADF) for data integration and orchestration.
• Hands-on knowledge of Databricks for data engineering, data processing, and machine learning.
• Ability to design, develop, and optimize complex data pipelines for ETL processes.
• Strong problem-solving skills and the ability to troubleshoot data pipeline issues.
• Excellent communication skills to interact with technical and non-technical stakeholders effectively.
• Strong leadership skills to guide and motivate technical teams towards project delivery and success.
• Experience in agile methodologies and project management practices for efficient project execution.
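One concrete pattern behind "optimize complex data pipelines for ETL processes" is incremental loading: upserting only a change batch into the target instead of full reloads. The sketch below uses SQLite's ON CONFLICT upsert (via Python's standard library) as a runnable stand-in for Snowflake's MERGE statement; the table and data are hypothetical.

```python
import sqlite3

# Incremental load: apply a small change batch to the target table rather
# than truncating and reloading. SQLite's ON CONFLICT clause stands in for
# Snowflake's MERGE here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, price REAL)")
conn.executemany("INSERT INTO dim_product VALUES (?, ?)", [(1, 9.99), (2, 19.99)])

changes = [(2, 17.49), (3, 4.99)]  # one update, one new row
conn.executemany(
    """
    INSERT INTO dim_product (product_id, price) VALUES (?, ?)
    ON CONFLICT(product_id) DO UPDATE SET price = excluded.price
    """,
    changes,
)
print(conn.execute("SELECT * FROM dim_product ORDER BY product_id").fetchall())
# [(1, 9.99), (2, 17.49), (3, 4.99)]
```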
Certifications: Relevant certifications in Snowflake, Azure Data Factory (ADF), and Databricks are a plus.