

GCP/AI Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP/AI Data Engineer with 8+ years in data engineering, 1+ year in Gen AI (insurance preferred), and strong Python skills. Contract length and work location are unspecified. Pay rate is $55-65/hr.
Country: United States
Currency: $ USD
Day rate: 520
Date discovered: July 29, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Data Science #DevOps #Data Pipeline #AI (Artificial Intelligence) #PyTorch #NLP (Natural Language Processing) #ML (Machine Learning) #Python #Deployment #Spark (Apache Spark) #Scala #ETL (Extract, Transform, Load) #Data Engineering #Compliance #Deep Learning #Programming #GCP (Google Cloud Platform) #NoSQL #TensorFlow #Snowflake #SQL (Structured Query Language)
Role description
REQUIRED SKILLS AND EXPERIENCE
8+ years of experience in data engineering, including SQL and NoSQL databases, Snowflake, ELT/ETL tools, CI/CD, big data, GCP, Python/Spark, and data mesh, data lake, and data fabric architectures
1+ year of hands-on experience supporting Gen AI initiatives in a data environment, ideally from an insurance background
Experience implementing RAG (Retrieval-Augmented Generation) pipelines; vector/graph database experience structuring data for large models using AI tools (steps include extraction, chunking, embedding, and grounding strategies)
Strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow
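The RAG pre-processing steps named above (extraction, chunking, embedding, grounding) can be sketched roughly as follows. This is a minimal illustrative outline, not the employer's pipeline: the document structure, chunk sizes, and especially the hashing-based `embed` function are stand-in assumptions; a real pipeline would use a trained encoder (e.g. a PyTorch model) and a vector database for retrieval.

```python
# Sketch of a RAG pre-processing flow: extraction -> chunking -> embedding
# -> grounding. All helper names and parameters here are illustrative.
import hashlib
import math

def extract(raw_docs):
    """Extraction: pull plain text out of source records (assumed dicts)."""
    return [d["text"] for d in raw_docs]

def chunk(text, size=40, overlap=10):
    """Chunking: fixed-size character windows with overlap."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

def embed(chunk_text, dims=8):
    """Embedding: toy stand-in that hashes character trigrams into a
    normalized dense vector. A real pipeline would call a trained encoder."""
    vec = [0.0] * dims
    for i in range(len(chunk_text) - 2):
        h = int(hashlib.md5(chunk_text[i:i + 3].encode()).hexdigest(), 16)
        vec[h % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def ground(query, index):
    """Grounding: retrieve the most similar chunk by dot-product similarity."""
    q = embed(query)
    return max(index, key=lambda item: sum(a * b for a, b in zip(q, item[1])))[0]

# Hypothetical insurance-flavored documents for illustration only.
docs = [{"text": "Policy deductibles apply per claim in property insurance."},
        {"text": "Employee benefits cover dental and vision plans."}]
index = [(c, embed(c)) for t in extract(docs) for c in chunk(t)]
best = ground("property insurance deductible", index)
```

In production, the retrieved chunk would be injected into the model prompt as grounding context; the vector/graph database mentioned in the requirements replaces the in-memory `index` list here.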
NICE TO HAVE SKILLS AND EXPERIENCE
AI certifications
GCP certifications
Experience in the Property & Casualty or Employee Benefits industry
Knowledge of natural language processing (NLP) and computer vision technologies
JOB DESCRIPTION
Responsible for implementing AI data pipelines that bring together structured, semi-structured, and unstructured data to support AI and agentic solutions, including pre-processing with extraction, chunking, embedding, and grounding strategies to get the data ready
Develop AI-driven systems to improve data capabilities, ensuring compliance with industry best practices for insurance-specific data use cases and challenges
Develop data domains and data products for various consumption archetypes, including Reporting, Data Science, AI/ML, and Analytics
Collaborate closely with DevOps and infrastructure teams to ensure seamless deployment, operation, and maintenance of data systems
Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the environment
$55-65/hr, based on experience