

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer in Nashville, Tennessee, offering a 12+ month contract at an undisclosed pay rate. Key skills required include GCP, ETL, Python, and SQL. Local candidates only; prior experience with data engineering is essential.
Country: United States
Currency: $ USD
Day rate: Not disclosed
Date discovered: June 17, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Nashville, TN
Skills detailed: #Teradata SQL #Jira #Cloud #Storage #BI (Business Intelligence) #Apache Airflow #Dataflow #GitHub #SQL (Structured Query Language) #Scala #Python #Data Lakehouse #BigQuery #GCP (Google Cloud Platform) #Programming #Data Lake #ETL (Extract, Transform, Load) #Data Engineering #Data Analysis #Data Warehouse #Data Integrity #Security #Teradata #Airflow
Role description
Hi,
Cloud Data Engineer
Nashville, Tennessee (local candidates only)
12+ months contract
GCP experience is mandatory
Job Details:
Job Description:
As a Google Data Engineer, you will design, build, and maintain scalable data infrastructure and pipelines that enable data analytics and business intelligence efforts. The role involves collaborating with cross-functional teams to ensure efficient data movement, transformation, and storage from multiple sources, while upholding data integrity, quality, and security. It demands a blend of technical expertise, analytical ability, and business insight to engage with stakeholders, gather requirements, and deliver effective data-driven solutions.
Key Responsibilities:
• Design and implement scalable ETL/ELT pipelines to ingest and transform large volumes of data using tools such as Python, SQL, and cloud-native services (see the sketch after this list).
• Develop and maintain data models and data warehouse solutions using platforms like GCP.
• Collaborate with data analysts, scientists, and business teams to gather requirements and deliver data engineering solutions.
• Monitor and optimize data workflows to ensure performance, reliability, and cost-efficiency.
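To give a concrete sense of the pipeline work described above, here is a minimal, illustrative sketch of an Apache Airflow DAG that loads files from Cloud Storage into BigQuery. It is not part of the posting; the bucket, dataset, and table names are placeholders invented for the example.

```python
# Minimal sketch only: daily load of CSV files from Cloud Storage into BigQuery
# using Airflow's Google provider. All resource names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="gcs_to_bigquery_daily_load",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_events = GCSToBigQueryOperator(
        task_id="load_events",
        bucket="example-landing-bucket",              # placeholder bucket
        source_objects=["events/*.csv"],              # placeholder object prefix
        destination_project_dataset_table="example_project.analytics.events",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_APPEND",             # append each daily batch
        autodetect=True,                              # infer schema from the files
    )
```

In a managed setup such as Cloud Composer, a DAG like this would be deployed to the environment's DAG folder and scheduled automatically; the same pattern extends to Dataflow or Cloud Functions tasks in larger pipelines.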
Key Skills:
• ETL, Data Warehouse, and Data Lakehouse
• Google Cloud Platform: Cloud Storage; Cloud Pub/Sub; Cloud Dataflow; Apache Airflow; Cloud Functions; BigQuery; Cloud Composer
• Programming Skills: Python, BigQuery SQL, and Teradata SQL (see the sketch after this list)
• Tools: Jira, GitHub, Confluence
• Excellent Problem Solving and Communication Skills
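As an illustration of the Python and BigQuery SQL skills listed above, the sketch below runs a parameterized query with the official BigQuery client library. The project, dataset, table, and query are hypothetical examples, not details from the role.

```python
# Minimal sketch only: run a parameterized BigQuery SQL query from Python.
# Project, dataset, and table names are placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")  # placeholder project

query = """
    SELECT event_date, COUNT(*) AS event_count
    FROM `example-project.analytics.events`
    WHERE event_date >= @start_date
    GROUP BY event_date
    ORDER BY event_date
"""

job_config = bigquery.QueryJobConfig(
    query_parameters=[
        bigquery.ScalarQueryParameter("start_date", "DATE", "2025-01-01"),
    ]
)

# Execute the query and print one row per day.
for row in client.query(query, job_config=job_config).result():
    print(row.event_date, row.event_count)
```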
Gaurav Mote | Tekgence Inc
Direct: 469-575-8666, Ext- 145
• gaurav.mote@tekgence.com
6655 Deseo Dr, Suite 104, Irving, TX 75039
• www.tekgence.com
Tekgence is an equal opportunity employer. Applicants must be authorized to work in the U.S. U.S. citizens and Green Card holders are strongly encouraged to apply.