

Insight Global
Lead Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Lead Data Engineer, offering a contract of unspecified length at a pay rate of $65-$70/hr. Key skills include Python, ETL experience, and hands-on knowledge of Databricks and analytics platforms. A degree in Computer Science is required.
Country
United States
Currency
$ USD
Day rate
560
Date
February 17, 2026
Duration
Unknown
Location
Unknown
Contract
Unknown
Security
Unknown
Location detailed
United States
Skills detailed
#Scripting #ML (Machine Learning) #Java #Databricks #Scala #Computer Science #Teradata #SSIS (SQL Server Integration Services) #Data Pipeline #Data Transformations #Data Engineering #Programming #Hadoop #Snowflake #RDBMS (Relational Database Management System) #Databases #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Data Quality #Data Accuracy #FastAPI #C++ #C# #SQL (Structured Query Language) #Python
Role description
Insight Global is seeking Data Engineers at multiple levels to support both analytical and transactional data platforms. These roles will focus on data transformation, analytics enablement, and collaboration with cross-functional teams to validate and optimize data pipelines. The ideal candidates bring strong programming fundamentals, hands-on ETL experience, and comfort working in modern analytics platforms such as Databricks.
Key Responsibilities
• Design, build, and maintain ETL pipelines to support analytics, reporting, and downstream consumption
• Perform data transformations and analytics workflows, primarily within Databricks
• Collaborate closely with business and technical teams to validate data accuracy, logic, and transformations
• Support AI/ML workflows, including rapid prototyping and iterative development
• Work with structured and transactional data stored in RDBMS and analytics platforms
• Contribute to data quality, performance tuning, and scalable pipeline design
• For senior/lead roles: provide technical guidance, best practices, and design oversight
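As a rough illustration of the ETL and validation work described above (a hypothetical sketch using SQLite and table names invented for this example, not the employer's actual stack), a minimal extract-transform-load step with a built-in data-quality check might look like:

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Extract raw orders, transform them, validate, and load the result.

    Hypothetical sketch: table names and the 90% validity threshold are
    illustrative assumptions, not part of the role description.
    """
    # Extract: read raw transactional rows.
    raw = conn.execute("SELECT id, amount_cents FROM raw_orders").fetchall()

    # Transform: convert cents to dollars, dropping invalid rows.
    cleaned = [(oid, cents / 100.0)
               for oid, cents in raw
               if cents is not None and cents >= 0]

    # Validate: fail fast if too many rows were dropped (data-accuracy check).
    if raw and len(cleaned) / len(raw) < 0.9:
        raise ValueError("more than 10% of rows failed validation")

    # Load: write to the analytics table for downstream consumption.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders_clean (id INTEGER, amount_usd REAL)")
    conn.executemany("INSERT INTO orders_clean VALUES (?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE raw_orders (id INTEGER, amount_cents INTEGER)")
    conn.executemany("INSERT INTO raw_orders VALUES (?, ?)",
                     [(1, 1250), (2, 999), (3, 4500)])
    print(run_etl(conn))  # number of rows loaded
```

In a Databricks setting the same extract-transform-validate-load shape would typically be expressed with Spark DataFrames rather than raw SQL cursors, but the validation-before-load pattern is the same.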
Required Qualifications
Formal education in Computer Science
• Programming & Scripting
• Proficiency in Python
• Experience with enterprise languages such as Java, C++, or C#
• FastAPI experience is a plus
• ETL & Data Engineering
• Strong ETL experience required (mandatory for at least one opening)
• Experience with SSIS is acceptable
• Experience with ETL tools such as Smart QN and Flow Stream
• Strong understanding of data transformation and analytical workflows
• Less focus on raw ingestion; more focus on searching, transforming, and validating data
• Platforms & Tools
• Hands-on experience with Databricks
• Experience with analytics platforms such as Hadoop, Snowflake, or Teradata
• Strong SQL skills and experience working with transactional databases (RDBMS)
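The last qualification concerns transactional RDBMS work. As a hedged sketch of what that means in practice (SQLite standing in for any RDBMS; the accounts schema is invented for this example), the key idea is grouping statements into a transaction that commits only if every step succeeds:

```python
import sqlite3

def transfer(conn: sqlite3.Connection, src: int, dst: int, amount: float) -> None:
    """Move funds between accounts atomically; roll back on insufficient funds.

    Hypothetical example: the schema and business rule are illustrative only.
    """
    try:
        # `with conn:` opens a transaction scope: it commits on normal exit
        # and rolls back if an exception is raised inside the block.
        with conn:
            conn.execute(
                "UPDATE accounts SET balance = balance - ? WHERE id = ?",
                (amount, src))
            conn.execute(
                "UPDATE accounts SET balance = balance + ? WHERE id = ?",
                (amount, dst))
            (balance,) = conn.execute(
                "SELECT balance FROM accounts WHERE id = ?", (src,)).fetchone()
            if balance < 0:
                raise ValueError("insufficient funds")
    except ValueError:
        pass  # transaction was rolled back; both balances are unchanged

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                     [(1, 100.0), (2, 50.0)])
    conn.commit()
    transfer(conn, 1, 2, 30.0)
    print(conn.execute("SELECT balance FROM accounts ORDER BY id").fetchall())
```

The rollback-on-error guarantee is what distinguishes transactional workloads from the append-oriented analytics pipelines described earlier in the posting.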
$65/hr - $70/hr based on experience






