GCP BigQuery Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a GCP BigQuery Data Engineer with an unknown contract length and a listed day rate of $400 USD. Key skills include ETL, Python, SQL, and GCP tools such as BigQuery and Dataflow. It requires 7+ years of GCP experience and expertise in building and optimizing data pipelines.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
400
-
πŸ—“οΈ - Date discovered
July 4, 2025
πŸ•’ - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
πŸ“„ - Contract type
Unknown
-
πŸ”’ - Security clearance
Unknown
-
πŸ“ - Location detailed
Nashville, TN
-
🧠 - Skills detailed
#Dataflow #Clustering #Data Analysis #Teradata SQL #YAML (YAML Ain't Markup Language) #ETL (Extract, Transform, Load) #Programming #Teradata #Python #Data Lake #Data Integrity #SQL (Structured Query Language) #Batch #SQL Queries #Airflow #Apache Airflow #Security #Data Warehouse #Data Engineering #Data Lakehouse #BI (Business Intelligence) #Jira #Cloud #GCP (Google Cloud Platform) #BigQuery #Storage #Scala #GitHub
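
The tags above describe a fairly standard GCP batch-pipeline stack. As a minimal illustration of the kind of work they imply (not part of the original posting), the sketch below batch-loads files from Cloud Storage into a partitioned, clustered BigQuery table using the google-cloud-bigquery Python client; the project, bucket, dataset, and column names are hypothetical.

```python
# Minimal sketch: batch-load CSV files from Cloud Storage into a
# date-partitioned, clustered BigQuery table. Project, bucket, dataset,
# and column names are placeholders, not taken from the posting.
from google.cloud import bigquery

client = bigquery.Client(project="example-project")

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,
    # Partition by event date and cluster on a frequently filtered key
    # so downstream SQL queries scan less data.
    time_partitioning=bigquery.TimePartitioning(field="event_date"),
    clustering_fields=["customer_id"],
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-bucket/events/*.csv",
    "example_dataset.events",
    job_config=job_config,
)
load_job.result()  # block until the batch load completes
print(f"Loaded {load_job.output_rows} rows into example_dataset.events")
```

In practice a job like this would typically be scheduled from an orchestrator such as Apache Airflow rather than run by hand, which is consistent with the Airflow and Dataflow tags in the listing.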
Role description
