Novia Infotech

GCP Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP Data Engineer in Charlotte, NC (100% onsite) on a contract basis. Requires deep GCP expertise, Teradata and Hadoop experience, SQL proficiency, and Python scripting. Must be authorized to work in the US without sponsorship.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
November 19, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Charlotte, NC
-
🧠 - Skills detailed
#Teradata SQL #Scrum #Documentation #Hadoop #BTEQ #Data Engineering #Jenkins #Cloud #HDFS (Hadoop Distributed File System) #Business Analysis #SQL (Structured Query Language) #Dataflow #Storage #Agile #Migration #Data Architecture #Scripting #GCP (Google Cloud Platform) #Teradata #Consulting #Python #GitHub
Role description
Role: GCP Data Engineer
Location: Charlotte, NC (100% onsite)
Hire Type: Contract
Note: "Must be legally authorized to work in the US without need for employer sponsorship now or at any time in the future."

Background: As tenants transition to Google Cloud Platform (GCP) to comply with data center exit mandates, they encounter challenges that require extensive support. This initiative focuses on creating a structured tenant engagement model, including education on platform capabilities, migration best practices, and hands-on guidance throughout onboarding. Key activities include acting as a concierge service for queries, providing practical demonstrations and reusable artifacts, and maintaining knowledge resources. The model combines helpdesk support with high-touch consulting to deliver a comprehensive and consistent migration experience.

Technical Skills
- Deep expertise in Google Cloud Platform (GCP) services: BigQuery, Dataproc, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Dataplex
- Experience with the Teradata and Hadoop ecosystems (Hive, HDFS, MapReduce)
- Proficiency in SQL (BTEQ, Teradata SQL, and BigQuery SQL)
- Scripting in Python
- CI/CD pipeline setup using GitHub Actions, Jenkins, and Harness
- GCP BigQuery Migration Services
- Building automated data validation and reconciliation frameworks (see the sketch after this list)

Soft Skills
- Effective communication and documentation abilities
- Experience working in Agile/Scrum environments
- Ability to collaborate with cross-functional teams (Data Architects, Cloud Engineers, Business Analysts)
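To make the validation and reconciliation requirement concrete, here is a minimal sketch of the kind of check such a framework automates: comparing row counts between a legacy Teradata table and its migrated BigQuery copy. The table names, connection parameters, and the choice of the `teradatasql` and `google-cloud-bigquery` client libraries are illustrative assumptions, not details from the posting.

```python
# Minimal row-count reconciliation between a Teradata source table and its
# migrated BigQuery copy. All names below are hypothetical placeholders.
from google.cloud import bigquery  # pip install google-cloud-bigquery
import teradatasql                 # pip install teradatasql

SOURCE_TABLE = "prod_db.customers"           # hypothetical Teradata table
TARGET_TABLE = "my-project.sales.customers"  # hypothetical BigQuery table


def teradata_row_count(host: str, user: str, password: str) -> int:
    """Count rows in the legacy Teradata table via the DB-API driver."""
    with teradatasql.connect(host=host, user=user, password=password) as con:
        with con.cursor() as cur:
            cur.execute(f"SELECT COUNT(*) FROM {SOURCE_TABLE}")
            return cur.fetchone()[0]


def bigquery_row_count() -> int:
    """Count rows in the migrated BigQuery table."""
    client = bigquery.Client()  # uses Application Default Credentials
    rows = client.query(f"SELECT COUNT(*) AS n FROM `{TARGET_TABLE}`").result()
    return next(iter(rows)).n


def reconcile(host: str, user: str, password: str) -> None:
    """Compare the two counts and fail loudly on any drift."""
    src = teradata_row_count(host, user, password)
    tgt = bigquery_row_count()
    if src != tgt:
        raise AssertionError(f"Row-count mismatch: Teradata={src}, BigQuery={tgt}")
    print(f"Reconciled: {src} rows in both systems")
```

In a production framework this single check would typically be extended with column-level checksums and scheduled via Cloud Composer, but the structure stays the same: query both systems, compare, and fail loudly on drift.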