

PEAK Technical Staffing USA
Technical Lead - Cloud Data Engineering
Featured Role | Apply directly with Data Freelance Hub
This role is for a Technical Lead - Cloud Data Engineering, fully remote for 9 months at a pay rate of "X". Requires 9 years of experience, strong Azure Cloud, ETL, and data engineering skills, and US Citizenship for Public Trust clearance.
Country
United States
Currency
$ USD
-
Day rate
632
-
Date
March 26, 2026
Duration
Unknown
-
Location
Remote
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Washington, DC 20001
-
Skills detailed
#BI (Business Intelligence) #Documentation #Security #SSIS (SQL Server Integration Services) #SQL Server #SQL (Structured Query Language) #SSRS (SQL Server Reporting Services) #ADLS (Azure Data Lake Storage) #Data Lakehouse #Azure #Azure Synapse Analytics #Azure SQL #Data Migration #Data Engineering #Data Mart #Data Quality #Data Pipeline #Microsoft Azure #Computer Science #Cloud #Apache Spark #Monitoring #Scala #Visualization #MS SQL (Microsoft SQL Server) #Data Warehouse #Data Lake #Azure cloud #Data Management #Deployment #Data Integrity #Microsoft SQL Server #Metadata #Storage #Migration #ADF (Azure Data Factory) #PySpark #Quality Assurance #Agile #SSAS (SQL Server Analysis Services) #Synapse #ETL (Extract, Transform, Load) #EDW (Enterprise Data Warehouse) #JMeter #API (Application Programming Interface) #Azure ADLS (Azure Data Lake Storage) #AI (Artificial Intelligence) #Azure Data Factory #NoSQL #Microsoft SQL #ML (Machine Learning) #Data Security #Python #Scripting #Azure DevOps #DevOps #Spark (Apache Spark)
Role description
#Eng-IT-01
This is a fully remote, teleworking position with potential travel to the Washington D.C. metro area on special occasions.
Seeking a customer experience-focused professional to lead a team of developers in the design and implementation of data engineering solutions for Azure Cloud-based Data Lake, SQL, and NoSQL data stores. In this role, you will work directly with end users to translate business requirements into cloud data engineering solutions to support an enterprise-scale Microsoft Azure-based data analytics and reporting platform. Our ideal candidate is mission-focused, delivery-oriented, and applies critical thinking, including the use of AI concepts, to create innovative functions and solve technical issues.
Who We Are
Fortune 500® technology, engineering, and science solutions and services leader, working to solve the world's toughest challenges in the defense, intelligence, civil, and health markets. The Leidos Civil Group helps the government modernize operations with leading-edge AI/ML-driven data management and analytics solutions. We are trusted partners to both government and highly-regulated commercial customers looking for transformative solutions in mission IT, security, software, engineering, and operations. We work with our customers, including the FAA, DOE, DOJ, NASA, National Science Foundation, Transportation Security Administration, Customs and Border Protection, airports, and electric utilities, to make the world safer, healthier, and more efficient.
Responsibilities
- Work directly with business users to understand data requirements, and develop, deliver, and maintain appropriate data solutions.
- Lead a team of data engineering developers to implement Azure Data Lake-based ETL/ELT/data pipelines, data extracts, and API-based data delivery.
- Ensure continued legacy EDW development and operations utilizing Microsoft SQL Server, SSIS, SSRS, SSAS, PowerShell, and related technologies.
- Drive the migration of legacy EDW platform data workloads to the new Azure Cloud Data Lake platform.
- Define ETL performance testing scope, benchmark workloads against legacy EDW baselines, validate SLA-compliant data loads, and optimize throughput to ensure scalable cloud performance.
- Work with IT Infrastructure, DevOps, Data Engineers, Modelers, and Architects on pipeline optimization, test environment setup, test design, test execution, issue resolution, and performance tuning.
- Support the design and implementation of data models and data pipelines for relational, dimensional, data Lakehouse (Medallion architecture), data warehouse, data mart, SQL, and NoSQL data stores.
- Utilize Microsoft Azure services including Azure Data Lake Storage Gen2, Azure SQL Managed Instance, Azure Data Factory, Synapse Pipelines, Apache Spark Notebooks, Python, SQL, stored procedures, and Azure OpenAI to develop cloud-native data solutions.
- Prepare data required for advanced analytics, visualization, reporting, data extracts, and AI/ML.
- Develop and implement processes, procedures, and checklists to guide the development and operations of the legacy EDW platform and modernized Azure platform.
- Implement data migration, data integrity, data quality, metadata management, performance management, audit data capture, and data security functions.
- Monitor and troubleshoot data-related issues to maintain high availability and performance.
- Implement governance, build, deployment, and monitoring to automate platform operation.
- Actively support Agile DevOps process, including Program Increment planning.
- Engage in continuous learning to increase relevant skills.
- Maintain strict versioning and configuration control to ensure data integrity.
Qualifications
- At least a BS degree in Computer Science or a related field and 9 years of experience.
- 5 years of proven experience leading a team supporting enterprise data warehousing using Microsoft data warehousing and reporting solutions.
- 5 years of experience working directly with clients to understand data needs, develop and deliver solutions (data engineering, reports, dashboards, data extracts), and provide continued support.
- 5 years of strong experience in ETL performance testing for traditional EDW and Azure Cloud Data platforms, using tools such as JMeter, native Azure tools (e.g., Azure App Testing, Azure Monitor Log Analytics), and Spark UI/Ganglia for Lakehouse performance.
- 5 years of experience delivering solutions using Microsoft database, ETL/ELT, and business intelligence tools, including SQL Server (e.g., stored procedures), SSIS, SSRS, SSAS (cubes).
- 5 years of experience with more than one of the following scripting languages: SQL, T-SQL, Python, PySpark, PowerShell.
- 3 years of experience designing and building ETL/data engineering solutions utilizing various cloud services such as Azure Data Lake Services, Azure Synapse Analytics, Azure Data Factory, Integration Runtime.
- Experience with data management and engineering best practices, including system development lifecycle, configuration control, change management, quality assurance, performance management, and documentation support.
- Experience in Agile projects, working with a multi-functional team.
- Must be detail-oriented and able to support multiple projects and tasks.
- Demonstrate continuous learning to increase relevant skills.
- Demonstrated experience in supporting production, testing, integration, and development environments.
- Open mindset, ability to quickly adapt new technologies to solve customer problems.
- US Citizenship and ability to successfully obtain government-issued Public Trust clearance.
Preferred Qualifications
- Experience working with data in law, HR, financial management, inventory, property, and management domains.
- Experience working on Federal government projects with an active Public Trust clearance.
- Experience working with Azure DevOps.
- Microsoft certification in Azure fundamentals, data engineering, AI, or data analytics.
Benefits
PEAK's benefit offerings available for our associates include medical, dental, vision, Flexible Spending Account (FSA), Dependent Care Savings Account (DCA), and a 401(k) plan. PEAK believes that taking care of our team is essential for success, and we are proud to provide benefits that enhance both your well-being and your future. Additionally, our associates may be eligible for Paid Sick Leave as required by Federal, State, or local laws.
Equal Opportunity Employer (EEO)
PEAK Technical Staffing is committed to creating a diverse and inclusive environment and is proud to be an Equal Opportunity Employer. PEAK does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, or veteran status, or any other characteristic protected by applicable law. All employment decisions are made based on qualifications, merit, and business need. We encourage all individuals to apply.
Americans with Disabilities Act (ADA)
The physical and mental requirements described in this job description are representative of those that must be met by an employee to successfully perform the essential functions of the position. Reasonable accommodations may be made to enable qualified individuals with disabilities to perform the essential functions. Must be able to perform the essential physical functions of the position, including sitting, standing, walking, stooping, kneeling, and lifting up to 25 pounds, with or without reasonable accommodation.
Candidate Privacy
To read our Candidate Privacy Information Statement, which explains how we will use your information, please navigate to https://peaktechnical.com/privacy-policy/ and https://peaktechnical.com/ca-residents-privacy-rights/
AI Recruiting Disclosure
We use AI-assisted tools to help review applications and compare your experience to job requirements, but all hiring decisions are made by human recruiters. You may request a human-only process or opt out of automated communication at any time. Required notices and our latest bias audit are available on our website: www.peaktechnical.com/ai-disclosure.



