GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer on a 6-month contract, paying $55-60/hr. Requires 8+ years in data engineering, GCP experience, and 1+ year in Gen AI, preferably in insurance. Strong Python skills and knowledge of AI frameworks are essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
πŸ—“οΈ - Date discovered
September 4, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Greater Hartford
-
🧠 - Skills detailed
#AI (Artificial Intelligence) #Python #SQL (Structured Query Language) #AWS (Amazon Web Services) #NLP (Natural Language Processing) #Compliance #Data Science #ML (Machine Learning) #Scala #PyTorch #DevOps #Spark (Apache Spark) #GCP (Google Cloud Platform) #Azure #NoSQL #Programming #Snowflake #TensorFlow #Data Engineering #Deployment #Deep Learning #Cloud #ETL (Extract, Transform, Load) #Public Cloud #Data Pipeline
Role description
JOB DESCRIPTION
• Implement data pipelines that bring together structured, semi-structured, and unstructured data to support AI and Agentic solutions, including pre-processing with extraction, chunking, embedding, and grounding strategies to get the data ready
• Develop GCP-driven systems to improve data capabilities, ensuring compliance with industry best practices for insurance-specific data use cases and challenges
• Develop data domains and data products for various consumption archetypes, including Reporting, Data Science, AI/ML, and Analytics
• Collaborate closely with DevOps and infrastructure teams to ensure seamless deployment, operation, and maintenance of data systems
• Partner with architects and stakeholders to influence and implement the vision of the AI and data pipelines while safeguarding the integrity and scalability of the GCP environment
REQUIRED SKILLS AND EXPERIENCE
• 8+ years of data engineering experience across data solutions: SQL and NoSQL, Snowflake, ELT/ETL tools, CI/CD, Big Data, GCP, Python/Spark, Data Mesh, Data Lake, Data Fabric
• 1+ year of hands-on experience supporting Gen AI initiatives in a data environment, ideally with an insurance background
• Experience building in a public cloud environment, ideally GCP, though AWS/Azure is also acceptable
• Strong programming skills in Python and familiarity with deep learning frameworks such as PyTorch or TensorFlow
NICE TO HAVE SKILLS AND EXPERIENCE
• AI certifications
• GCP/Azure/AWS certifications
• Experience in the Property & Casualty or Employee Benefits industry
• Knowledge of natural language processing (NLP) and computer vision technologies
Pay Rate: $55-60/hr • This is a 6-month contract
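For candidates unfamiliar with the pre-processing work the job description mentions (extraction, chunking, embedding, grounding), the chunking step usually means splitting documents into overlapping windows before they are embedded. A minimal Python sketch of that idea; the function name, chunk size, and overlap values here are illustrative assumptions, not taken from the posting:

```python
# Hypothetical chunking helper: splits raw text into overlapping
# character windows so each chunk can later be embedded and grounded.
# chunk_size and overlap are illustrative defaults, not from the posting.

def chunk_text(text: str, chunk_size: int = 200, overlap: int = 50) -> list[str]:
    """Return overlapping character-window chunks of `text`."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap  # how far each window advances
    chunks = []
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
        if start + chunk_size >= len(text):
            break  # final window already covers the end of the text
    return chunks

doc = "Policy data for Property & Casualty claims. " * 20
chunks = chunk_text(doc, chunk_size=120, overlap=30)
print(len(chunks))
```

In a real pipeline each chunk would then be passed to an embedding model and stored with provenance metadata for grounding; in production, token-aware splitters are usually preferred over raw character windows.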