

Senior Data Engineer (GCP)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (GCP) on a fully remote contract, focused on building ETL data pipelines with GCP tools and Python. It requires 2+ years of experience with Google Cloud, strong SQL, and familiarity with data visualization tools.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
June 4, 2025
Project duration
Unknown
Location type
Remote
Contract type
Unknown
Security clearance
Unknown
Location detailed
United States
Skills detailed
#ETL (Extract, Transform, Load) #Scripting #Python #BigQuery #Cloud #GCP (Google Cloud Platform) #SQL (Structured Query Language) #Visualization #API (Application Programming Interface) #Data Pipeline #Google Data Studio #ML (Machine Learning) #Data Engineering #Data Ingestion #Dataflow #Tableau
Role description
Senior Data Engineer (GCP)
Fully Remote
Syrinx Digital Media Partner
U.S. Citizens and those authorized to work in the U.S. are encouraged to apply. We are unable to sponsor at this time.
Accelerate the design and implementation of our next-gen platform as part of a complete system overhaul that will support our growing multi-billion-dollar business.
Relevant Experience:
• Building end-to-end ETL data pipelines using GCP cloud infrastructure tools and Python (from data ingestion to data visualization/dashboarding)
• At least 2 years of experience with Google Cloud Platform (especially BigQuery and Dataflow)
• Experience with SQL and Google Cloud SDK & API scripting
• Setting up ETL data pipelines from different data sources (e.g., Google Cloud SQL, Spanner, Datastore, CSV) to reporting platforms (e.g., Plx, Google Data Studio, Tableau) and machine learning platforms (e.g., BigQuery)
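As a rough illustration of the kind of pipeline the bullets above describe, here is a minimal extract–transform–load sketch in Python. The data, the function names (`extract_rows`, `transform`, `load`), and the cleaning rules are all hypothetical, not from the posting; a real pipeline on GCP would pull from a source like Cloud SQL and load into BigQuery (e.g. via the `google-cloud-bigquery` client or an Apache Beam job on Dataflow).

```python
import csv
import io

# Sample CSV standing in for a real source such as a Cloud SQL
# export or a file in Cloud Storage (hypothetical data).
RAW_CSV = """order_id,region,amount_usd
1001,us-east,250.00
1002,us-west,
1003,us-east,99.50
"""

def extract_rows(csv_text):
    """Extract: parse CSV text into one dict per row."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: drop rows missing an amount and cast types."""
    cleaned = []
    for row in rows:
        if not row["amount_usd"]:
            continue  # skip incomplete records
        cleaned.append({
            "order_id": int(row["order_id"]),
            "region": row["region"],
            "amount_usd": float(row["amount_usd"]),
        })
    return cleaned

def load(rows):
    """Load stub: a real pipeline would stream these rows into a
    BigQuery table here instead of just counting them."""
    return len(rows)

rows = transform(extract_rows(RAW_CSV))
loaded = load(rows)
```

The three-stage split mirrors how such pipelines are usually structured: each stage is independently testable, and only the load stage needs GCP credentials.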
Technical/Functional Skills:
• Experience interpreting customer business needs and translating those needs into requirements
• Experience delivering artifacts such as scripts and Dataflow components
• Experience working with senior members of the business units (within the organization) and product owners
• Experience in the SDLC, with emphasis on specifying, building, and testing mission-critical business applications