

Novia Infotech
GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Charlotte, NC (100% onsite) on a contract basis. Requires deep GCP expertise, Teradata and Hadoop experience, SQL proficiency, and Python scripting. Must be authorized to work in the US without sponsorship.
Country: United States
Currency: $ USD
Day rate: Unknown
Date: November 19, 2025
Duration: Unknown
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Charlotte, NC
Skills detailed: #Teradata SQL #Scrum #Documentation #Hadoop #BTEQ #Data Engineering #Jenkins #Cloud #HDFS (Hadoop Distributed File System) #Business Analysis #SQL (Structured Query Language) #Dataflow #Storage #Agile #Migration #Data Architecture #Scripting #GCP (Google Cloud Platform) #Teradata #Consulting #Python #GitHub
Role description
Role: GCP Data Engineer
Location: Charlotte, NC (100% onsite)
Hire Type: Contract
Note: "Must be legally authorized to work in the US without need for employer sponsorship now or at any time in the future."
Background:
As tenants transition to Google Cloud Platform (GCP) to comply with data center exit mandates, they encounter challenges that require extensive support. This initiative focuses on creating a structured tenant engagement model, including education on platform capabilities, migration best practices, and hands-on guidance throughout onboarding. Key activities include acting as a concierge service for queries, providing practical demonstrations and reusable artifacts, and maintaining knowledge resources. The model combines helpdesk support with high-touch consulting to deliver a comprehensive and consistent migration experience.
Technical Skills
- Deep expertise in Google Cloud Platform (GCP) services: BigQuery, Dataproc, Dataflow, Cloud Storage, Pub/Sub, Cloud Composer, Dataplex
- Experience with Teradata and Hadoop ecosystems (Hive, HDFS, MapReduce)
- Proficiency in SQL (BTEQ, Teradata SQL, and BigQuery SQL)
- Scripting in Python
- CI/CD pipeline setup using GitHub Actions, Jenkins, or Harness
- GCP BigQuery Migration Service
- Building automated data validation and reconciliation frameworks (see the sketch after this list)
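Below is a minimal sketch of the kind of validation check the last bullet refers to: comparing a source-side row count against a migrated BigQuery table using the google-cloud-bigquery client. The project, dataset, table name, and source count are hypothetical placeholders, not details from this posting.

```python
# Minimal row-count reconciliation sketch for a Teradata-to-BigQuery
# migration. The table name and source count below are placeholders.
from google.cloud import bigquery


def bigquery_row_count(client: bigquery.Client, table: str) -> int:
    """Return the row count of a fully qualified BigQuery table."""
    job = client.query(f"SELECT COUNT(*) AS n FROM `{table}`")
    return next(iter(job.result())).n  # blocks until the query finishes


def reconcile(source_count: int, table: str) -> bool:
    """Compare a source-side count (e.g. from a Teradata export) to the target."""
    client = bigquery.Client()  # uses application-default credentials
    target_count = bigquery_row_count(client, table)
    if source_count != target_count:
        print(f"MISMATCH {table}: source={source_count}, target={target_count}")
        return False
    print(f"OK {table}: {target_count} rows")
    return True


if __name__ == "__main__":
    # Hypothetical table and count, for illustration only.
    reconcile(source_count=1_000_000, table="my-project.my_dataset.customers")
```

A real framework would extend this with column-level checksums and null-rate comparisons, and would typically run as a step in the CI/CD pipeline (GitHub Actions, Jenkins, or Harness) after each migration batch.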
Soft Skills
- Effective communication and documentation abilities
- Experience working in Agile/Scrum environments
- Ability to collaborate with cross-functional teams (Data Architects, Cloud Engineers, Business Analysts)