TPI Global Solutions

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Builder) in Montgomery, AL, on a contract-to-hire basis. It requires 5–7 years of data engineering experience; proficiency in SQL Server ETL/ELT, cloud integration, and programming in Python, Scala, or Java; preferred certifications include AWS Certified Data Engineer.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 23, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Montgomery, AL
🧠 - Skills detailed
#"ETL (Extract, Transform, Load)" #Docker #Automation #Cloud #Terraform #Python #DevOps #Ansible #SQL (Structured Query Language) #Kafka (Apache Kafka) #Dataflow #Big Data #SQL Server #dbt (data build tool) #ADF (Azure Data Factory) #Programming #Azure Data Factory #Kubernetes #Scala #GCP (Google Cloud Platform) #Observability #Azure #Java #Snowflake #AWS Glue #Data Engineering #Data Modeling #AWS DevOps #Airflow #C++ #Computer Science #AWS (Amazon Web Services)
Role description
Role: Data Engineer (Builder)
Location: Montgomery, AL (Onsite)
Duration: Contract to Hire
Visa: USC or GC Only

Experience Needed
• 5–7 years in data engineering or database development.
• Hands-on experience with SQL Server ETL/ELT pipelines.
• Experience integrating pipelines with cloud services (AWS Glue, Azure Data Factory, GCP Dataflow).
• Familiarity with streaming technologies (Kafka, Kinesis).
• Experience in data modeling and architecture design.
• Proficiency in Python, Scala, or Java for pipeline development.
• Exposure to DevOps automation (Terraform, Ansible) and containerization (Docker, Kubernetes).
• DevOps and automation maturity, demonstrated by certifications (HashiCorp Terraform Associate, AWS DevOps Engineer).
• Preferred: advanced programming depth with applied coursework or certifications (Python Institute PCPP, Scala Professional Certification).
• Preferred: data modeling specialization with advanced coursework or vendor-specific training (Snowflake, AWS Big Data Specialty).

Education
• Bachelor's degree in Computer Science, Software Engineering, or a related technical field.

Certifications (Preferred)
• AWS Certified Data Engineer
• Azure Data Engineer Associate
• Google Professional Data Engineer

Software Use
• SQL Server (ETL/ELT pipelines, stored procedures).
• Orchestration tools (Airflow, dbt).
• Cloud integration services (AWS Glue, Azure Data Factory, GCP Dataflow).
• Observability tools (OpenLineage, Monte Carlo).
• DevOps automation tools (Terraform, Ansible).
• Containerization platforms (Docker, Kubernetes).