

Data Engineer – Google Cloud
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer – Google Cloud in Dearborn, MI, on a long-term contract at $60 per hour. It requires 7+ years of data engineering experience, expertise in ETL and data modeling, and familiarity with GCP and BigQuery.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
480
🗓️ - Date discovered
August 9, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dearborn, MI
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Cloud #Data Modeling #Data Engineering #Scala #Data Pipeline #GCP (Google Cloud Platform) #Qlik #Storage #Agile #Data Warehouse #Data Lakehouse #Data Lake
Role description
Job Title: Data Engineer – Google Cloud
Duration: Long term
Location: Dearborn, MI
Pay Rate: $60 per hour with all benefits
Position Description:
Employees in this job function are responsible for designing, building, and maintaining data solutions, including data infrastructure and pipelines, for collecting, storing, processing, and analyzing large volumes of data efficiently and accurately.
Key Responsibilities:
1. Collaborate with business and technology stakeholders to understand current and future data requirements
2. Design, build, and maintain reliable, efficient, and scalable data infrastructure for data collection, storage, transformation, and analysis
3. Plan, design, build, and maintain scalable data solutions, including data pipelines, data models, and applications, for efficient and reliable data workflows (a brief illustrative sketch follows this list)
4. Design, implement, and maintain existing and future data platforms such as data warehouses, data lakes, and data lakehouses for structured and unstructured data
5. Design and develop analytical tools, algorithms, and programs to support data engineering activities such as writing scripts and automating tasks
6. Ensure optimum performance and identify improvement opportunities
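The pipeline work described above typically takes the form of short, automated load-and-transform jobs. Below is a minimal, illustrative sketch of one such task, batch-loading a CSV file from Cloud Storage into a BigQuery table with the google-cloud-bigquery Python client. It is not part of the listing, and every project, bucket, dataset, and table name in it is a hypothetical placeholder.

```python
# Minimal sketch: batch-load a CSV from Cloud Storage into BigQuery.
# All identifiers (project, bucket, dataset, table) are hypothetical placeholders.
from google.cloud import bigquery


def load_csv_to_bigquery(
    project_id: str = "example-project",                     # hypothetical project
    source_uri: str = "gs://example-bucket/raw/orders.csv",  # hypothetical file
    table_id: str = "example-project.analytics.orders",      # hypothetical table
) -> None:
    client = bigquery.Client(project=project_id)

    # Describe the load: CSV with a header row, schema inferred,
    # overwriting any existing table contents.
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,
        autodetect=True,
        write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
    )

    # Start the load job and block until it finishes (raises on failure).
    load_job = client.load_table_from_uri(source_uri, table_id, job_config=job_config)
    load_job.result()

    table = client.get_table(table_id)
    print(f"Loaded {table.num_rows} rows into {table_id}")


if __name__ == "__main__":
    load_csv_to_bigquery()
```

In practice, a job like this would run on a schedule (for example, via Cloud Composer or Cloud Scheduler) rather than by hand.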
Skills Required:
Data/Analytics, ETL, Data Modeling, Data Warehousing, Analytical skills
Skills Preferred:
GCP, BigQuery
Experience Required:
Engineer 3: 7+ years of data engineering work experience
Experience Preferred:
The ideal candidate will have strong curiosity and analytical skills; be a self-starter able to work in an agile environment and with ambiguity; have strong verbal communication skills; and be highly collaborative while able to work independently. Key skills include defining requirements, data modeling, ETL, and pipeline development. Bonus skills include GCP, BigQuery, Dataform, and Qlik Replicate.