

Senior Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer on a 6-month contract, paying "X" per hour, fully remote. Requires 5+ years in data engineering, strong Azure and Databricks expertise, and experience in secure environments. Relevant certifications preferred.
Country: United States
Currency: $ USD
Day rate: 520
Date discovered: August 10, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Cybersecurity #Azure Data Factory #BI (Business Intelligence) #Python #Azure ADLS (Azure Data Lake Storage) #Metadata #PySpark #Data Catalog #Snowflake #Synapse #Data Engineering #Data Encryption #Data Lakehouse #Data Pipeline #Data Management #RDBMS (Relational Database Management System) #Data Governance #Delta Lake #Logging #Scala #ADLS (Azure Data Lake Storage) #SQL (Structured Query Language) #Computer Science #Programming #DevSecOps #Data Science #Security #Spark (Apache Spark) #Data Lake #Azure SQL #Distributed Computing #Classification #Microsoft Azure #Databricks #Compliance #Data Quality #Terraform #Azure Databricks #Infrastructure as Code (IaC) #ETL (Extract, Transform, Load) #Azure Synapse Analytics #GCP (Google Cloud Platform) #Strategy #ADF (Azure Data Factory) #Cloud #Data Processing #AWS (Amazon Web Services) #Azure #Storage
Role description
Think of TEKsystems Global Services (TGS) as the growth solution for enterprises today. We unleash growth through technology, strategy, design, execution and operations with a customer-first mindset for bold business leaders. We deliver cloud, data and customer experience solutions. Our partnerships with leading cloud, design and business intelligence platforms fuel our expertise.
We value deep relationships, dedication to serving others and inclusion. We drive positive outcomes for our people and our business, and we stay true to our commitments and act in harmony with our words. We exist to create significant opportunity for people to achieve fulfillment through career success.
Ready to join us?
Here's what the opportunity supported through our TGS Talent Acquisition Team requires:
Position Overview
We are seeking a highly skilled and motivated Senior Data Engineer with 5 or more years of experience in data engineering, including at least 3 years of strong hands-on experience in a cloud environment. The ideal candidate will have hands-on expertise in designing, developing, and deploying secure, scalable, and high-performance data pipelines for a defense-grade analytics platform hosted in Azure and Databricks, along with high proficiency in Microsoft Azure, Databricks, and either Amazon Web Services (AWS) or Google Cloud Platform (GCP), and a solid foundation in cloud-native tools and services and data governance.
This is a 6-month contract assignment supporting our aerospace manufacturing client with the potential to transition into full-time employment with TEKsystems Global Services.
The location of this position is flexible and can operate fully remote within the United States.
Key Responsibilities
• Actively participate in the full data pipeline design, development, and implementation lifecycle.
• Design and implement secure ETL/ELT pipelines in Azure Databricks to process structured and unstructured data at scale.
• Develop and manage Delta Lake tables, Unity Catalog policies, and secure data lakehouse architecture.
• Integrate data from multiple sources, including real-time feeds, logs, APIs, sensors, and legacy systems.
• Use Azure Data Factory, Azure Synapse, Event Hubs, and Databricks Workflows for orchestration and ingestion.
• Implement data validation, schema evolution, lineage tracking, and data quality checks.
• Collaborate with security teams to enforce role-based access controls, data encryption, key management, and audit logging.
• Apply Infrastructure as Code (IaC) practices using Terraform, Bicep, or ARM templates to deploy and configure cloud resources.
• Support data governance and metadata management efforts, contributing to the development of secure data catalogs and access workflows.
• Collaborate with data scientists, DevSecOps engineers, and cybersecurity SMEs to ensure secure data processing and compliance with DoD STIGs, RMF, FedRAMP, and CMMC L2+ standards in all data operations.
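The data-validation and quality-check work described above can be sketched as a minimal rule-based quality gate. This is an illustrative, stdlib-only Python sketch; the record fields, rule names, and thresholds are invented for the example and do not come from the posting or any client system:

```python
# Minimal sketch of a pre-load data-quality gate: apply named row-level
# rules to a batch and report which rules failed and how often.
# All field names and thresholds below are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class QualityReport:
    total: int = 0
    failures: dict = field(default_factory=dict)  # rule name -> failing row count

    @property
    def passed(self) -> bool:
        # The batch passes only if no rule recorded a failure.
        return not self.failures

def run_quality_checks(rows, rules):
    """Apply each named rule (a predicate over one row) to every row."""
    report = QualityReport(total=len(rows))
    for name, predicate in rules.items():
        bad = sum(1 for row in rows if not predicate(row))
        if bad:
            report.failures[name] = bad
    return report

rows = [
    {"sensor_id": "A1", "reading": 21.4},
    {"sensor_id": None, "reading": 19.8},    # fails the not-null rule
    {"sensor_id": "A3", "reading": -999.0},  # fails the range rule
]

rules = {
    "sensor_id_not_null": lambda r: r["sensor_id"] is not None,
    "reading_in_range": lambda r: -50.0 <= r["reading"] <= 60.0,
}

report = run_quality_checks(rows, rules)
print(report.passed, report.failures)
# -> False {'sensor_id_not_null': 1, 'reading_in_range': 1}
```

In a Databricks pipeline the same idea would typically be expressed with built-in expectation mechanisms rather than hand-rolled predicates; the sketch only shows the shape of the check.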
Mandatory Skills & Qualifications
• Must be legally eligible for employment in the U.S.
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field
• 5 or more years of hands-on experience in data engineering, preferably in cloud-based and/or secure or classified environments
• Strong programming skills in Python, SQL, PySpark, and distributed computing
• Expert-level proficiency in:
• Azure Databricks (SQL, PySpark, Delta Lake)
• Azure Data Lake Storage Gen2, Azure Synapse Analytics
• Knowledge of secure data handling, encryption, and role-based access in Azure
• Experience integrating structured, semi-structured, and unstructured data from APIs, RDBMS, and/or streaming sources into Azure SQL DW or Snowflake
• Proven track record of working in secure, governed environments with complex data classification rules
• Familiarity with data versioning, lineage, and reproducibility for analytical models
Preferred Skills & Qualifications
• Experience working in Azure Government, IL5/6, or JWCC environments
• Familiarity with Unity Catalog, Databricks Repos, Databricks Connect, or Delta Live Tables
• Experience handling classified, PII, or controlled unclassified information (CUI)
• Certifications such as:
• DP-203: Azure Data Engineer Associate
• Databricks Data Engineer Associate/Professional
• DP-700: Fabric Data Engineer Associate
• Understanding of DevSecOps, CI/CD, and MLOps workflows in data engineering
• Familiarity with zero trust architecture and cross-domain solutions (CDS) for secure data sharing
• Eligibility requirements apply to some benefits and may depend on your job classification and length of employment. Benefits are subject to change and may be subject to specific elections, plan, or program terms. If eligible, the benefits available for this temporary role may include the following:
• Medical, Dental, and Vision
• Critical Illness, Accident, and Hospital
• 401(k) Retirement Plan – Pre-tax and Roth post-tax contributions available
• Life Insurance (Voluntary Life and AD&D for employee and dependents)
• Short and Long-Term Disability
• Health Spending Account (HSA)
• Transportation Benefits
• Employee Assistance Program
• Time Off/Leave (PTO, Vacation or Sick Leave)
The expected posting close date is August 23, 2025.