

Smart IT Frame LLC
Sr. GCP Lead
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. GCP Lead in Hartford, CT (Hybrid); the contract length and pay rate are unspecified. Candidates should have 10-14 years of experience in Python, PySpark, GCP, and Big Data architecture.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 24, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Hartford, CT
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Big Data #Data Architecture #Python #PySpark #Data Science #Spark (Apache Spark) #GCP (Google Cloud Platform) #Data Mart #Data Pipeline
Role description
Job Title: Sr. GCP Lead
Location: Hartford, CT (Hybrid)
Job Summary:
Sr. Developer/Lead with strong knowledge of Python, PySpark, and GCP.
Experience: 10 to 14 Yrs
Responsibilities:
• Develops large-scale data structures, pipelines, and efficient ETL (extract/transform/load) workflows to organize, collect, and standardize data that generates insights and addresses reporting needs.
• Collaborates with other data teams to transform data and integrate algorithms and models into automated processes.
• Uses knowledge in Big Data architecture and experience designing & optimizing queries to build data pipelines.
• Builds data marts and data models to support Data Science and other internal customers.
• Analyzes current information technology environments to identify and assess critical capabilities and recommend solutions.
• Experiments with available tools and advises on new ones to determine the optimal solution given the requirements dictated by the model/use cases.
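The ETL responsibilities above follow a standard extract/transform/load shape. As a rough illustration only, here is a minimal pure-Python sketch of the three stages; in this role the same pattern would be built with PySpark DataFrames on GCP, and all field names and records below are hypothetical.

```python
# Minimal ETL (extract/transform/load) sketch in plain Python.
# Illustrative only: a production pipeline for this role would use
# PySpark on GCP; the field names and sample records are made up.

def extract(raw_rows):
    """Extract: pull raw records from a source (here, an in-memory list)."""
    return list(raw_rows)

def transform(rows):
    """Transform: standardize field names/types and drop incomplete records."""
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # skip records missing the required field
        cleaned.append({
            "customer_id": str(row["id"]).strip(),
            "amount_usd": round(float(row["amount"]), 2),
        })
    return cleaned

def load(rows, target):
    """Load: write standardized records into the target store (a dict here)."""
    for row in rows:
        target[row["customer_id"]] = row["amount_usd"]
    return target

# Run the three stages end to end.
source = [{"id": " 101 ", "amount": "19.99"}, {"id": "102", "amount": None}]
mart = load(transform(extract(source)), {})
```

The same extract/transform/load split maps directly onto `spark.read`, DataFrame transformations, and `DataFrame.write` in a PySpark pipeline.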





