

GKP Solutions, Inc.
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Plantation, FL, on a 6-month contract-to-hire basis. It requires 5+ years of Data Engineering experience; expertise in Databricks, Terraform, Python, SQL, and dbt; and a Bachelor's degree in a related field.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
545
-
🗓️ - Date
February 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Plantation, FL
-
🧠 - Skills detailed
#Data Processing #Terraform #GIT #Scala #Python #Data Engineering #Computer Science #Infrastructure as Code (IaC) #Documentation #Spark (Apache Spark) #Data Science #Data Modeling #DevOps #PySpark #Databricks #SQL (Structured Query Language) #dbt (data build tool) #Fivetran #Cloud #Data Ingestion #GitHub #Data Lake #Programming #ETL (Extract, Transform, Load) #CRM (Customer Relationship Management) #Delta Lake #Deployment #DevSecOps
Role description
DATA ENGINEER - CONTRACT TO PERM
PREFER GC (Green Card) or USC (U.S. Citizen).
Location: Plantation, FL | Duration: 6-month contract-to-hire
KEY REQUIREMENTS:
• 5+ years of Data Engineering experience with deep expertise in Databricks (Spark/PySpark/Scala, Delta Lake, Unity Catalog, Databricks SQL, platform administration); the first sketch after this list illustrates the PySpark/Delta pattern
• Strong proficiency in Infrastructure as Code using Terraform to provision and manage Databricks workspaces, SQL Endpoints, Unity Catalog objects, and cloud infrastructure
• Hands-on experience with dbt (data build tool) for data modeling, transformation, testing, and documentation in collaboration with Analytics Engineers; the second sketch after this list shows a comparable data-quality check
• DevOps & CI/CD expertise using Git and GitHub Actions to implement robust pipelines for data workflows, dbt projects, and infrastructure deployments
• Proficiency in Python (required) and SQL for developing, optimizing, and troubleshooting complex data processing jobs
• Experience managing data ingestion pipelines from diverse source systems (ERP, CRM, marketing, supply chain) using tools such as Airbyte, Fivetran, or Stitch
• Solid understanding of data warehousing, data lake, and lakehouse architectures, with a focus on scalability and performance optimization
• Bachelor's degree in Computer Science, Data Engineering, or related technical field
• Strong collaboration skills to work with Analytics Engineers, Data Scientists, and business stakeholders
NICE TO HAVE: Scala programming, DevSecOps principles, experience with Databricks Workflows orchestration
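To illustrate the Databricks, Delta Lake, and Python skills listed above, here is a minimal PySpark sketch of the kind of transformation job this role would own. It is a sketch under assumptions, not part of the posting: the table names (raw.orders, analytics.daily_order_totals) and columns (order_ts, order_amount) are hypothetical placeholders.

```python
# Minimal, illustrative PySpark job: read a raw Delta table, aggregate, and
# write a managed Delta table. Table and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_order_totals").getOrCreate()

# Read a source table registered in the metastore / Unity Catalog.
orders = spark.read.table("raw.orders")

# Aggregate to one row per order date.
daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date")
    .agg(
        F.count("*").alias("order_count"),
        F.sum("order_amount").alias("total_amount"),
    )
)

# Overwrite the curated output as a managed Delta table.
(
    daily_totals.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("analytics.daily_order_totals")
)
```

In practice a job like this would typically be scheduled through Databricks Workflows or expressed as a dbt model rather than run as a standalone script; the snippet only shows the PySpark/Delta pattern.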
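The posting also emphasizes dbt-style testing and documentation. Below is a comparable data-quality check written directly in PySpark rather than in dbt itself, again with hypothetical table and column names; it mirrors dbt's not_null and unique tests.

```python
# Illustrative data-quality checks in PySpark, analogous to dbt's "not_null"
# and "unique" tests. Table and column names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("quality_checks").getOrCreate()

df = spark.read.table("analytics.daily_order_totals")

# not_null: the key column must have no null rows.
null_count = df.filter(df["order_date"].isNull()).count()
assert null_count == 0, f"order_date contains {null_count} null rows"

# unique: the key column must not repeat.
total_rows = df.count()
distinct_rows = df.select("order_date").distinct().count()
assert total_rows == distinct_rows, "order_date is not unique"

print("All checks passed")
```

In a real project these checks would more likely live as dbt tests run from a CI pipeline (for example, GitHub Actions); the Python version is shown only because the role lists Python as the required language.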






