

GCP Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a GCP Data Engineer on a contract basis, requiring in-person work. Key skills include AWS or Azure Data Lake, ETL tools like Talend, and scripting in Shell and VBA. Proven data engineering experience is essential.
Country
United States
Currency
Unknown
Day rate
-
Date discovered
September 20, 2025
Project duration
Unknown
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Phoenix, AZ 85003
Skills detailed
#AWS (Amazon Web Services) #Shell Scripting #VBA (Visual Basic for Applications) #Data Engineering #Talend #Scripting #Scala #Cloud #"ETL (Extract, Transform, Load)" #Automation #Azure #Data Warehouse #GCP (Google Cloud Platform) #Data Lake #Data Pipeline
Role description
Job Summary
We are seeking a skilled Data Engineer to join our team. The ideal candidate will be responsible for designing and implementing scalable data pipelines for our organization and ensuring an optimal data delivery architecture.
Responsibilities
Analyze, monitor, and maintain data infrastructure
Implement and manage cloud-based solutions on AWS or Azure Data Lake
Develop and maintain ETL processes using tools like Talend
Design and build data warehouse solutions
Utilize Shell Scripting and VBA for automation tasks (a brief illustrative sketch follows this list)
Monitor server performance and ensure system availability and reliability
Collaborate with the analytics team to support their data infrastructure needs
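For the Shell Scripting and ETL items above, here is a minimal sketch of the kind of automation wrapper the role describes. The landing directory, staging table name, psql client, and WAREHOUSE_DSN connection string are illustrative assumptions rather than details from this posting; a Talend job or a cloud loader could fill the same role.

```bash
#!/usr/bin/env bash
# Hypothetical nightly load wrapper. Paths, the table name, and the psql
# connection string are illustrative placeholders, not details from this posting.
set -euo pipefail

EXPORT_DIR="/data/exports"            # assumed landing directory for extracts
STAGE_TABLE="staging.daily_orders"    # assumed warehouse staging table
TODAY="$(date +%F)"
SRC_FILE="${EXPORT_DIR}/orders_${TODAY}.csv"

# Fail fast if the expected extract has not arrived or is empty.
if [[ ! -s "${SRC_FILE}" ]]; then
  echo "ERROR: missing or empty extract ${SRC_FILE}" >&2
  exit 1
fi

# Load the extract into the staging table (PostgreSQL client assumed here).
psql "${WAREHOUSE_DSN:?set WAREHOUSE_DSN to the warehouse connection string}" \
  -c "\copy ${STAGE_TABLE} FROM '${SRC_FILE}' WITH (FORMAT csv, HEADER true)"

echo "Loaded ${SRC_FILE} into ${STAGE_TABLE} at $(date)"
```

Run from cron after the nightly extract lands, a wrapper like this fails loudly when the file is missing, so downstream warehouse loads are not silently skipped.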
Experience
Proven experience working with data engineering technologies and tools
Familiarity with cloud platforms such as AWS or Azure Data Lake
Proficiency in ETL tools like Talend
Strong knowledge of data warehouse design and implementation
Experience with scripting languages such as Shell and VBA
Ability to monitor server performance and troubleshoot issues efficiently (a minimal monitoring example follows this list)
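As a companion to the monitoring requirement above, a first-pass server check can be as small as a script scheduled from cron. The disk threshold, mount point, and the Linux-specific /proc/loadavg read below are assumptions made for illustration only.

```bash
#!/usr/bin/env bash
# Hypothetical health check: the threshold and mount point are illustrative
# placeholders, and /proc/loadavg assumes a Linux host.
set -euo pipefail

DISK_LIMIT=85           # assumed alert threshold, percent used
MOUNT="/"               # assumed filesystem to watch

# Percentage of the filesystem currently in use (strip the trailing "%").
used=$(df -P "${MOUNT}" | awk 'NR==2 {gsub("%",""); print $5}')

# 1-minute load average, for a quick sense of CPU pressure.
load=$(awk '{print $1}' /proc/loadavg)

echo "$(date) disk=${used}% load=${load}"

if (( used >= DISK_LIMIT )); then
  echo "WARNING: ${MOUNT} is ${used}% full (limit ${DISK_LIMIT}%)" >&2
  exit 2
fi
```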
Job Type: Contract
Application Question(s):
Are you willing to work on W2?
Work Location: In person