

Openkyber
Cloud Identity Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Cloud Data Engineer on a contract basis in Long Beach, CA, offering $53-$58/hr. Requires 7+ years in cloud data engineering, expertise in Azure, Snowflake, SQL, Python, and experience in healthcare IT environments.
Country
United States
Currency
$ USD
Day rate
464
Date
March 4, 2026
Duration
Unknown
Location
Remote
Contract
Unknown
Security
Unknown
Location detailed
Alaska
Skills detailed
#GitHub #Programming #Data Warehouse #SQL Server #Vault #ETL (Extract, Transform, Load) #SnowPipe #Azure DevOps #Migration #dbt (data build tool) #Compliance #Cloud #Documentation #Data Layers #ADF (Azure Data Factory) #Snowpark #SSIS (SQL Server Integration Services) #Azure cloud #DevOps #Security #Computer Science #Strategy #Data Security #SQL (Structured Query Language) #API (Application Programming Interface) #SSRS (SQL Server Reporting Services) #Python #Azure Data Factory #Automation #Data Pipeline #Streamlit #Leadership #Data Vault #Data Architecture #Data Governance #Data Modeling #Scala #Snowflake #Data Quality #Data Engineering #Microsoft Power BI #Data Processing #SSAS (SQL Server Analysis Services) #BI (Business Intelligence) #Data Integration #Azure #Agile
Role description
We are looking for a Lead Cloud Data Engineer for our client in Long Beach, CA.
Job Title: Lead Cloud Data Engineer
Job Location: Long Beach, CA
Job Type: Contract
Job Description:
Pay Range: $53/hr - $58/hr
Requirement/Must Have:
7+ years of experience in cloud and enterprise data engineering leadership roles.
Strong expertise in Azure data engineering services, including Azure Data Factory.
Hands-on experience with Snowflake, Snowpipe, and Snowpark.
Experience with dbt or Coalesce.
Strong SQL and Python programming skills.
Experience building and managing CI/CD pipelines using Azure DevOps and GitHub Actions.
Strong understanding of data warehouse architecture and Data Vault 2.0 modeling.
Experience implementing cloud migrations and modern integration frameworks.
Knowledge of Azure cloud security, compliance, and cost optimization.
Experience leading integration and modernization projects.
Strong stakeholder communication and leadership skills.
Experience working in Agile environments, including the SAFe framework.
Experience:
Proven track record delivering enterprise cloud data warehouse modernization projects.
Designing and building ETL pipelines for structured and semi-structured data.
Implementing automation strategies and DevOps best practices.
Working in healthcare IT environments with regulatory compliance considerations.
Managing offshore and onsite delivery teams.
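To illustrate the kind of ETL work described above (pipelines over structured and semi-structured data), here is a minimal Python sketch that flattens nested JSON records into tabular rows. The claim and patient field names are hypothetical examples, not taken from the posting:

```python
import json

def flatten(record: dict, prefix: str = "") -> dict:
    """Flatten a nested JSON record into a single-level dict,
    joining nested keys with dots (e.g. {"a": {"b": 1}} -> {"a.b": 1})."""
    flat = {}
    for key, value in record.items():
        name = f"{prefix}{key}"
        if isinstance(value, dict):
            flat.update(flatten(value, prefix=f"{name}."))
        else:
            flat[name] = value
    return flat

# Hypothetical semi-structured claim record (healthcare-flavored example).
raw = '{"claim_id": "C123", "patient": {"id": "P9", "state": "CA"}, "amount": 250.0}'
row = flatten(json.loads(raw))
```

In a real pipeline this step would typically run inside an orchestrated job (e.g. Azure Data Factory or a Snowpark procedure) before loading rows into staging tables.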
Responsibilities:
Lead migration, maintenance, and modernization of data warehouse solutions to Snowflake on Azure.
Provide architectural direction for enterprise integration strategies and Azure-native applications.
Oversee Azure Data Factory, data fabric, and advanced data processing architectures.
Design, enhance, and optimize ETL pipelines and data repositories.
Implement advanced data pipelines using Coalesce or similar tools.
Guide analytics teams in leveraging Power BI and semantic data layers.
Define technology roadmaps and identify innovation opportunities.
Evaluate new tools, frameworks, and cloud services to enhance data platforms.
Ensure best practices for data integration, API development, and real-time data streaming.
Build and optimize CI/CD pipelines in collaboration with engineering teams.
Implement processes to monitor and ensure data quality, integrity, and availability.
Develop documentation, including process design, system specifications, and test plans.
Ensure compliance with data security and privacy regulations.
Lead integration project delivery and long-term data platform architecture strategy.
Mentor development teams and promote continuous learning.
Collaborate with business stakeholders to translate requirements into scalable solutions.
Perform additional duties as assigned.
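One of the responsibilities above is monitoring data quality, integrity, and availability. A minimal sketch of the idea in Python, with hypothetical field names and thresholds left to the reader:

```python
def check_quality(rows: list[dict], required_fields: list[str]) -> dict:
    """Return simple data quality metrics for a batch of rows:
    total row count plus per-field counts of null or empty values."""
    nulls = {field: 0 for field in required_fields}
    for row in rows:
        for field in required_fields:
            if row.get(field) in (None, ""):
                nulls[field] += 1
    return {"row_count": len(rows), "null_counts": nulls}

# Hypothetical batch with two quality issues.
rows = [
    {"claim_id": "C1", "amount": 100.0},
    {"claim_id": "C2", "amount": None},
    {"claim_id": "", "amount": 50.0},
]
report = check_quality(rows, ["claim_id", "amount"])
```

A production version would usually emit these metrics to a monitoring system and fail the pipeline run when a threshold is breached, rather than just returning a dict.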
Should Have:
SnowPro certification preferred.
Experience with the Microsoft BI stack, including SSRS, SSIS, SSAS, and SQL Server.
Experience building self-service semantic layers for reporting.
Experience creating data applications using Streamlit.
Strong analytical and problem-solving capabilities.
Skills:
Cloud data architecture and enterprise solution design.
ETL and data pipeline engineering.
DevOps and CI/CD implementation.
Data governance and compliance in healthcare.
Data modeling and Data Vault 2.0.
Agile and SAFe methodologies.
Leadership and stakeholder management.
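Since the skills list calls out Data Vault 2.0 modeling: one common DV 2.0 convention is to derive hub hash keys by normalizing and concatenating business keys, then hashing the result. A minimal sketch (exact normalization rules and delimiter vary by team, so treat this as one illustrative convention):

```python
import hashlib

def hub_hash_key(*business_keys: str, delimiter: str = "||") -> str:
    """Compute a Data Vault-style hash key: trim and upper-case each
    business key, join with a delimiter, and MD5-hash the result."""
    normalized = delimiter.join(k.strip().upper() for k in business_keys)
    return hashlib.md5(normalized.encode("utf-8")).hexdigest()

# Whitespace and case differences in source systems map to the same key.
hk = hub_hash_key("  c123 ")
```

The same function works for multi-part keys (e.g. link tables), since normalization happens per component before concatenation.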
Qualification and Education: Bachelor's degree in Computer Science, Information Systems, Engineering, or a related field preferred. Snowflake Architect or SnowPro certification preferred.
For applications and inquiries, contact: hirings@openkyber.com





