

Jobs via Dice
Sr. Data Engineer : Indianapolis, IN : Contract to Hire
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Indianapolis, IN, offering a 3-month contract-to-hire. Key skills include Databricks, Apache Spark, Delta Lake, and real-time data processing. Requires 7+ years of experience and relevant certifications.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 28, 2025
🕒 - Duration
3 to 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Indianapolis, IN
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Cloud #Data Lake #SQL (Structured Query Language) #DevOps #Deployment #Terraform #Scala #GCP (Google Cloud Platform) #MLflow #Storage #Data Processing #Spark (Apache Spark) #Data Pipeline #Azure #Computer Science #Data Engineering #Python #ML (Machine Learning) #Apache Spark #Data Ingestion #AWS (Amazon Web Services) #Kafka (Apache Kafka) #Delta Lake #Databricks #Apache Kafka
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Script2IT, is seeking the following. Apply via Dice today!
Position: Senior Data Engineer
Location: Indianapolis, IN (must be able to travel to Indianapolis, Indiana to meet with the client when required)
Duration: 3-month contract-to-hire
Looking for candidates who do not require visa sponsorship.
Here are the specifications:
Key Responsibilities:
Experience in designing and implementing scalable data pipelines and lakehouse architectures using Databricks
Experience in defining best practices for data ingestion, transformation, and storage
Experience in architecting Databricks solutions on Azure and AWS
Lead the deployment and configuration of Databricks, including Unity Catalog
Experience in optimizing Spark jobs and cluster performance
Implement Delta Lake for reliable and performant data lakes
Implement role-based access controls and audit mechanisms
Evaluate emerging technologies and tools to enhance the Databricks ecosystem
Continuously improve performance, cost-efficiency, and scalability of data solutions
Skills And Competencies
Lakehouse Architecture Design
Real-Time Media Content Analytics
MLOps Pipeline Orchestration
Delta Lake
Unity Catalog
MLflow
Databricks SQL
Apache Kafka
Terraform
Required Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
7+ years of experience in data engineering or architecture roles.
3+ years of hands-on experience with Databricks and Apache Spark.
Strong proficiency in Python, SQL, and Spark.
Experience with Delta Lake, MLflow, and Unity Catalog.
Familiarity with CI/CD pipelines and DevOps practices in data environments.
Certifications in Databricks, Azure/AWS/Google Cloud Platform.
Experience with real-time data processing (Kafka, Structured Streaming).
Knowledge of machine learning workflows and MLOps.