Insight Global

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer position in Riverdale, UT, offered on a contract-to-hire basis at $40-47/hr. It requires 2-4 years of experience, advanced Python skills, data pipeline development with Airflow, and familiarity with cloud technologies and databases.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
376
🗓️ - Date
November 14, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Contract to Hire
🔒 - Security
Yes
📍 - Location detailed
Riverdale, UT
🧠 - Skills detailed
#Database Performance #Monitoring #Data Warehouse #Security #Oracle #Data Cleaning #Programming #Cloud #Airflow #SQL (Structured Query Language) #Data Mart #Kafka (Apache Kafka) #Data Science #Kubernetes #Linux #Compliance #Data Engineering #Data Pipeline #Data Integrity #ETL (Extract, Transform, Load) #Python #DevOps #Documentation #API (Application Programming Interface) #Scala
Role description
Job Title: Data Engineer
Location: Riverdale, UT
Employment Type: Contract to Hire
Pay: $40-47/hr

Overview
We are seeking a Data Engineer to design, build, and maintain scalable data pipelines and solutions that support advanced analytics and data science initiatives. This role involves working with cloud technologies, real-time systems, and modern data engineering practices to ensure data integrity, security, and performance.

Responsibilities
• Serve as the primary contact for data pipelines, supporting operational implementation of Data Science and analytical models.
• Engineer solutions using serverless technology for optimal performance and scalability.
• Develop and adapt analytics solution engineering strategies and requirements.
• Perform DevOps tasks and orchestrate data marts and analytical processes.
• Monitor, secure, and track data usage and analytic product performance.
• Collaborate with internal teams to deliver accurate, accessible API services.
• Ensure compliance with security, documentation, governance, and standards.
• Manage a reporting layer to provide insights into model performance and impact.
• Identify potential issues based on trend projections and system monitoring.
• Participate in daily stand-ups, troubleshoot overnight pipeline runs, and integrate new data sources into the data warehouse.
• Perform data cleaning and transformation using SQL and Python.
• Design and implement data models, optimize database performance, and build scalable pipelines using cloud platforms.
• Document data processes and infrastructure changes.
• Respond to ad-hoc data requests and provide insights for decision-making.

Required Skills & Experience
• 2-4 years of experience designing, developing, and implementing Data Science solutions.
• Advanced programming experience in Python.
• Experience building data pipelines with Airflow.
• Advanced database skills (MSSQL, Oracle, Postgres).
• Familiarity with Kubernetes and Linux environments.
• Ability to obtain a U.S. Security Clearance.
• Bachelor's degree in a related field.
• Experience with Kafka (a strong plus).

Nice to Have
• Experience working in the financial industry.
• Master's degree in a related field.