

Senior GCP Developer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior GCP Developer, a 6-month remote contract (CST hours) with a pay rate of $65-70/hr. Requires 6+ years of GCP experience, strong skills in BigQuery, Dataflow, and Python, plus familiarity with Alteryx Designer and CI/CD practices.
Country: United States
Currency: $ USD
Day rate: 560
Date discovered: July 4, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Documentation #Metadata #DevOps #Libraries #Data Governance #Automation #Dataflow #GIT #Data Management #Compliance #Spark (Apache Spark) #Data Analysis #ETL (Extract, Transform, Load) #Spark SQL #Java #Python #Looker #SQL (Structured Query Language) #Kafka (Apache Kafka) #Airflow #Apache Airflow #Data Integration #Security #Data Engineering #Agile #Alteryx Designer #Cloud #GCP (Google Cloud Platform) #Google Cloud Storage #Version Control #Apache Beam #Data Pipeline #Alteryx #BigQuery #PySpark #Data Quality #Storage #Scala
Role description
Job Title: Senior GCP Developer
Location: Remote (CST Hours)
Job Type: 6-Month Contract + Extensions
Experience Level: Senior (6+ years)
Pay Rate: $65-70/hr
About the Role:
We are looking for a Senior Developer with deep expertise in Google Cloud Platform (GCP) and a strong background in building scalable, cloud-native applications and data-driven solutions. This role is ideal for someone who thrives in a fast-paced, Agile environment and is passionate about leveraging modern cloud technologies to solve complex problems.
Key Responsibilities:
β’ Design and implement scalable data pipelines using Google Cloud Composer (Apache Airflow) and Google Dataflow (Apache Beam).
β’ Develop and optimize data models and queries in Google BigQuery.
β’ Manage and orchestrate data workflows across Google Cloud Storage, BigQuery, and Dataplex.
β’ Integrate data from various sources using BigQuery Data Transfer Service, Kafka Connect, and custom ingestion pipelines.
β’ Build and maintain dashboards and reports using Google Looker Studio.
β’ Collaborate with data analysts, scientists, and business stakeholders to understand data requirements and deliver high-quality solutions.
β’ Apply best practices in version control using Git and CI/CD pipelines.
β’ Work in an Agile environment, participating in sprint planning, reviews, and retrospectives.
β’ Leverage Alteryx Designer for data preparation and transformation tasks when needed.
β’ Ensure data quality, security, and compliance across all data solutions.
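As a rough illustration of the row-level quality checks mentioned in the last bullet, here is a toy, standard-library-only Python sketch. The field names and validation rules are hypothetical (not from this posting); in a real Dataflow pipeline the same split would typically be expressed with Apache Beam tagged outputs, routing rejects to a dead-letter table:

```python
from datetime import datetime

REQUIRED_FIELDS = ("id", "amount", "event_ts")  # hypothetical schema

def validate_row(row: dict) -> list:
    """Return a list of data-quality errors for one record (empty = clean)."""
    errors = []
    for field in REQUIRED_FIELDS:
        if row.get(field) in (None, ""):
            errors.append(f"missing {field}")
    amount = row.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative amount")
    ts = row.get("event_ts")
    if isinstance(ts, str):
        try:
            datetime.fromisoformat(ts)
        except ValueError:
            errors.append("bad event_ts format")
    return errors

def split_clean_and_rejects(rows):
    """Partition records into (clean, rejects), like Beam side outputs would."""
    clean, rejects = [], []
    for row in rows:
        errs = validate_row(row)
        if errs:
            rejects.append((row, errs))  # keep errors for a dead-letter sink
        else:
            clean.append(row)
    return clean, rejects
```

In production the same idea scales out inside a Beam `ParDo` with multiple output tags, so clean rows flow to BigQuery while rejects land in a quarantine table for review.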
Required Skills & Qualifications:
• 6+ years of experience in data engineering or cloud development roles, with a focus on Google Cloud Platform (GCP).
• Strong hands-on experience with:
   • Google Cloud Platform (GCP) services: BigQuery, Cloud Composer, Dataflow, Dataplex, Cloud Storage, Looker Studio.
   • Apache Airflow, Apache Beam, Kafka Libraries/Kafka Connect.
   • Python, PySpark, SQL, Java.
   • Git for version control.
β’ Experience with data warehousing concepts and best practices.
β’ Familiarity with Alteryx Designer for data transformation.
β’ Strong problem-solving skills and ability to work independently and collaboratively.
β’ Excellent communication and documentation skills.
Preferred Qualifications:
β’ Experience with CI/CD pipelines and DevOps practices.
β’ GCP certifications (e.g., Professional Cloud Developer or Data Engineer).
β’ Experience with SnapLogic for data integration and automation.
β’ Familiarity with data governance and metadata management using Dataplex.
• Experience in the healthcare industry.