Insight Global

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Charlotte, NC, on a 6-month contract-to-hire, paying $60-$73/hr. Requires 12+ years of experience, 6+ years in Data Engineering, 4+ years with AWS, and strong ETL, SQL, and AI skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date
April 15, 2026
🕒 - Duration
6 months (contract-to-hire)
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#AWS Glue #Ab Initio #Data Engineering #Data Analysis #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Spark (Apache Spark) #AWS (Amazon Web Services) #SQL (Structured Query Language) #Programming #PySpark #Complex Queries #Talend
Role description
Position: Sr. Data Engineer
Location: Charlotte, NC - HYBRID
Length: 6-month contract-to-hire
Rate: $60/hr - $73/hr (W2) (Exact compensation may vary based on several factors, including skills, experience, and education)
Required Skills
• 12+ years of experience
• 6+ years in Data Engineering
• 4+ years with AWS (setting up pipelines, using resources, etc.)
• Background in ETL (Ab Initio, Talend, AWS Glue, etc.)
• PySpark
• Object-oriented programming
• AI experience (understanding of agents and how to use them)
• Advanced SQL experience (multi-joins, complex queries, profiling and summarizing data, etc.)
• Strong data analysis skills and experience working through requirements
• Ability to build and analyze datasets directly in SQL, including understanding relationships, summarizing, and profiling data
• Proven problem-solving experience
Job Description
Insight Global is looking for an AWS Data Engineer to join one of our financial clients and help with an ongoing project. This resource will be responsible for helping unify stored data across legacy systems spanning various lines of business. This resource should have an ETL background with a strong focus on data engineering, specifically with AWS. Candidates will be responsible for working independently: writing code, testing AWS pipelines, working through and managing requirements, performing data analysis, and helping to enhance existing models.