

Insight Global
GenAI Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GenAI Data Engineer on a 6-month contract-to-hire, hybrid in Charlotte, NC, at a pay rate of $50-$55/hr. It requires 7+ years in data engineering, ideally 2+ years in insurance, and proficiency in SQL, Python, and cloud platforms.
Country: United States
Currency: $ USD
Day rate: $440 (≈ $55/hr × 8 hours)
Date: January 6, 2026
Duration: More than 6 months
Location: Hybrid
Contract: Unknown
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Docker #Metadata #Python #Data Management #Data Lineage #Scrum #Kanban #Visualization #Snowflake #Deployment #Scala #Data Pipeline #Cloud #Observability #Redshift #AWS Glue #AWS (Amazon Web Services) #ML (Machine Learning) #Security #Data Analysis #Data Mart #AI (Artificial Intelligence) #Agile #ETL (Extract, Transform, Load) #Data Integration #SQL (Structured Query Language) #Kubernetes #BigQuery #GCP (Google Cloud Platform) #Data Engineering #Informatica #Automation #Dataflow #Monitoring #Data Warehouse #DataOps
Role description
Role: Senior Data Engineer
Duration: 6 Month Contract to Hire
Location: Hybrid in Charlotte, NC
Interview Process: In-person first round, virtual final
Hourly Rate Range: $50/hr-$55/hr
Required Skills & Experience
• 7+ years of experience in data analysis, transformation, and development, ideally with 2+ years in insurance or a related industry.
• 5+ years of strong proficiency in SQL, Python, and ETL tools such as Informatica IDMC for data integration and transformation.
• 3+ years of experience developing and deploying large-scale data and analytics applications on cloud platforms such as AWS, GCP, and Snowflake.
• Experience with small- or medium-scale Generative AI (GenAI) integration in data workflows or enterprise solutions.
• 3+ years of expertise designing and optimizing data models for Data Warehouses, Data Marts, and Data Fabric, including dimensional modeling, semantic layers, metadata management, and integration for scalable, governed, high-performance analytics (a minimal schema sketch follows this list).
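To make the dimensional-modeling expectation concrete, here is a minimal star-schema sketch. It is an illustration only: the claims-domain table names (fact_claim, dim_policy, dim_date) are invented for this example, and SQLite is used purely so the snippet runs anywhere; a warehouse on Snowflake, Redshift, or BigQuery would follow the same fact-and-dimension pattern.

import sqlite3

# Minimal star schema: one fact table joined to conformed dimensions.
# All names below are hypothetical, chosen for an insurance-claims flavor.
DDL = """
CREATE TABLE dim_policy (
    policy_key    INTEGER PRIMARY KEY,
    policy_number TEXT,
    product_line  TEXT
);
CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20260106
    full_date      TEXT,
    fiscal_quarter TEXT
);
CREATE TABLE fact_claim (
    claim_key      INTEGER PRIMARY KEY,
    policy_key     INTEGER REFERENCES dim_policy(policy_key),
    date_key       INTEGER REFERENCES dim_date(date_key),
    paid_amount    REAL,
    reserve_amount REAL
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(DDL)

# Analytics queries then aggregate facts across dimension attributes:
rows = conn.execute(
    """
    SELECT d.fiscal_quarter, SUM(f.paid_amount)
    FROM fact_claim f
    JOIN dim_date d ON d.date_key = f.date_key
    GROUP BY d.fiscal_quarter
    """
).fetchall()
print(rows)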
Nice to Haves
• 3+ years of experience with Agile methodologies, including Scrum and Kanban frameworks.
• Knowledge of data observability (metrics, tracing, logs) and monitoring frameworks (see the sketch after this list).
• Exposure to metadata management (catalogs, glossary) and data lineage tools.
• Familiarity with containerization and orchestration (Docker, Kubernetes).
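As a rough illustration of the observability point above, the sketch below wraps a pipeline step in a context manager that logs duration and outcome. The step name and logger setup are invented for this example; a production stack would export such signals to a metrics backend (e.g. OpenTelemetry or Prometheus) rather than plain logs.

import logging
import time
from contextlib import contextmanager

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("pipeline")

@contextmanager
def observed_step(name: str):
    # Record wall-clock duration and success/failure for one pipeline step.
    start = time.monotonic()
    try:
        yield
        log.info("step=%s status=ok duration_s=%.3f", name, time.monotonic() - start)
    except Exception:
        log.error("step=%s status=failed duration_s=%.3f", name, time.monotonic() - start)
        raise

# Example usage with a hypothetical step name:
with observed_step("load_claims"):
    time.sleep(0.1)  # stand-in for real pipeline work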
Job Description
• Modernize legacy data assets by migrating and re-engineering them into modern cloud solutions (AWS, GCP, Snowflake) for scalability, security, and cost efficiency.
• Design, develop, and optimize ETL/ELT pipelines for structured and unstructured data, ensuring resilience and high performance.
• Build and manage data pipelines leveraging cloud services such as AWS Glue, EMR, and Redshift; GCP BigQuery and Dataflow; and Snowflake.
• Curate and publish Data Products to support analytics, visualization, machine learning, and Generative AI use cases.
• Implement DataOps practices for automated deployments, CI/CD integration, and continuous delivery of data solutions.
• Integrate AI/ML models into data pipelines, enabling real-time scoring, feature engineering, and model retraining workflows.
• Enable Generative AI capabilities by embedding LLMs into data workflows for intelligent automation and advanced insights (sketched after this list).
• Develop and deploy AI Agents for automated decision-making and conversational analytics.
• Lead Proofs of Concept (POCs) and pilot initiatives for emerging technologies beyond GenAI, such as real-time streaming, AI agents, and next-gen data platforms.
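An extract-transform-load flow with a GenAI enrichment step might look like the sketch below. Everything here is hypothetical: the ClaimNote record, the llm_summarize stub (standing in for a governed LLM endpoint), and the print-based sink. A real pipeline would read from object storage or a CDC feed, call a hosted model, and load into Snowflake, Redshift, or BigQuery under an orchestrator such as AWS Glue or Dataflow.

from dataclasses import dataclass
from typing import Optional

@dataclass
class ClaimNote:
    claim_id: str
    raw_text: str
    summary: Optional[str] = None

def extract() -> list[ClaimNote]:
    # Placeholder source; real pipelines would read from S3/GCS or a CDC feed.
    return [ClaimNote("C-1001", "Water damage to kitchen; adjuster visit scheduled.")]

def llm_summarize(text: str) -> str:
    # Hypothetical stub for an LLM call; swap in a hosted model API in practice.
    return text[:60]

def transform(notes: list[ClaimNote]) -> list[ClaimNote]:
    for note in notes:
        note.summary = llm_summarize(note.raw_text)  # GenAI enrichment step
    return notes

def load(notes: list[ClaimNote]) -> None:
    # Placeholder sink; real pipelines would write to a cloud warehouse table.
    for note in notes:
        print(note.claim_id, "->", note.summary)

if __name__ == "__main__":
    load(transform(extract()))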