
Databricks Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Engineer on a full-time, contract basis for more than 6 months, with a pay range of $106,255.44 - $150,000 per year. Requires 8+ years in software development, 5+ years in cloud data engineering, and an Active Secret clearance.
Country
United States
Currency
$ USD
Day rate
$681.82
Date discovered
September 5, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Yes
Location detailed
Sterling, VA 20166
Skills detailed
#Data Pipeline #Computer Science #Web Services #NoSQL #Monitoring #Cybersecurity #AWS CloudWatch #Spark (Apache Spark) #Strategy #Storage #Apache Spark #Data Engineering #Programming #Cloud #MLflow #S3 (Amazon Simple Storage Service) #Athena #Data Ingestion #Elasticsearch #NiFi (Apache NiFi) #Redshift #AWS (Amazon Web Services) #Data Management #Security #Migration #SharePoint #Scripting #Databricks
Role description
Job Summary: We are supporting a U.S. Government customer on a large, mission-critical development and sustainment program to design, integrate, build, deliver, and operate a network operations environment, including introducing new cyber capabilities to address emerging threats. We are seeking a Databricks Engineer to support the migration of customer applications, services, and platforms to a Medallion Model. This opportunity represents the cornerstone of the future of the organization.
Duties:
Support teams in migrating services, applications, and platforms from legacy back-end systems to Databricks.
Identify the optimal migration path, build the migration plan, and execute it.
Migrate legacy data pipelines from NiFi to Databricks, complete with validation.
Implement the medallion model for each of the data assets being migrated to Databricks (a minimal ingestion sketch follows after this list).
Develop an SOP for integrating data assets into the Databricks platform, focused on efficiency, instrumentation, and performance.
Optimize development, testing, monitoring, and security for data assets being added to the Databricks platform.
Develop and implement a strategy for optimizing the migration and integration of data assets to the Databricks platform.
Develop code in various programming and scripting languages to automate and optimize data ingestion and pipeline orchestration, and to improve data management processes.
Provide ingest transparency, leveraging technologies such as AWS CloudWatch to identify where to measure and gather performance information on automated data pipelines (a metric-publication sketch also follows after this list).
Ensure Data Engineering Team Standard Operating Procedures are appropriately captured and communicated across the team.
Ensure technical correctness, timeliness and quality of delivery for the team.
Demonstrate excellent oral and written communication skills to all levels of management and the customer.
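For orientation only, the following is a minimal, illustrative sketch of the kind of bronze-layer ingest the medallion-model duties describe, using Databricks Auto Loader to land raw files into a Delta table. It is not taken from the program; all paths and table names are hypothetical placeholders.

    # Illustrative bronze-layer ingest with Databricks Auto Loader (PySpark).
    # All paths and table names are hypothetical placeholders, not from the posting.
    from pyspark.sql import functions as F

    landing_path = "s3://example-bucket/landing/events/"          # raw files from the legacy pipeline
    checkpoint_path = "s3://example-bucket/_checkpoints/events/"  # stream checkpoint and inferred schema

    bronze_stream = (
        spark.readStream.format("cloudFiles")                     # Auto Loader source
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation", checkpoint_path)
        .load(landing_path)
        .withColumn("_ingest_ts", F.current_timestamp())          # instrumentation column
    )

    (bronze_stream.writeStream
        .format("delta")
        .option("checkpointLocation", checkpoint_path)
        .trigger(availableNow=True)                               # incremental, batch-style run
        .toTable("example_catalog.bronze.events"))                # bronze table; silver/gold built downstream

Silver and gold refinements would then be derived from this bronze table, with the details depending on each migrated data asset.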
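Likewise, a hedged sketch of publishing pipeline performance metrics to AWS CloudWatch with boto3, as the ingest-transparency duty suggests; the namespace, metric names, and dimension values are illustrative only.

    # Illustrative custom-metric publication to AWS CloudWatch (boto3).
    # Namespace, metric names, and dimensions are hypothetical placeholders.
    import boto3

    cloudwatch = boto3.client("cloudwatch")

    def report_pipeline_run(pipeline_name: str, rows_ingested: int, duration_s: float) -> None:
        """Record throughput and duration for one automated pipeline run."""
        cloudwatch.put_metric_data(
            Namespace="DataPipelines",
            MetricData=[
                {
                    "MetricName": "RowsIngested",
                    "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                    "Value": float(rows_ingested),
                    "Unit": "Count",
                },
                {
                    "MetricName": "RunDurationSeconds",
                    "Dimensions": [{"Name": "Pipeline", "Value": pipeline_name}],
                    "Value": duration_s,
                    "Unit": "Seconds",
                },
            ],
        )

    report_pipeline_run("nifi_to_databricks_events", rows_ingested=120000, duration_s=95.4)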
Skills:
Must have an Active Secret (S) clearance and be able to obtain a TS/SCI clearance
Must be able to obtain DHS Suitability
8+ years of directly relevant software development experience required.
Minimum of 5 years of experience performing data engineering work in a cloud environment.
Experience with relational, NoSQL, and/or file-based storage (e.g., Databricks, Elastic, Postgres, S3, Athena)
Experience working in a CI/CD pipeline factory environment
Working knowledge of Databricks, Cloud Relational Database Services, NiFi, AWS Redshift, and Elasticsearch
Desired Skills:
Databricks workflows
Databricks Unity Catalog
Databricks Autoloader
Databricks Delta Live Tables/Delta Lake
Databricks Workspace/Notebooks
MLflow
Apache Spark
Experience with collaboration tools including MS Teams, MS Outlook, MS SharePoint, and Confluence
Amazon Web Services (AWS) Professional certification or equivalent.
Excellent problem-solving and communication skills.
Familiarity with CISA: Securing the Software Supply Chain
Familiarity with CISA: Cybersecurity Best Practices
Familiarity with CISA: Open-Source Software Security
Familiarity with NIST SP 800-218, Secure Software Development Framework V1.1
Required Education:
Bachelor's degree in Software Engineering, Computer Science, or a related discipline is required. Ten (10) years of experience (for a total of eighteen (18) or more years) may be substituted for a degree.
Job Types: Full-time, Contract
Pay: $106,255.44 - $150,000.00 per year
Benefits:
401(k)
Dental insurance
Health insurance
Paid time off
Vision insurance
Work Location: In person