

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with 7+ years of GCP, SQL, NoSQL, and Python experience. Contract length is unspecified, with a competitive pay rate. Key skills include ETL, data pipelines, and cloud architecture. Google Professional Data Engineer Certification preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: August 29, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: Massachusetts, United States
Skills detailed: #GIT #Data Modeling #ETL (Extract, Transform, Load) #Visualization #Storage #Data Science #Bash #Programming #Agile #AI (Artificial Intelligence) #Microservices #NoSQL #Data Processing #Unix #DevOps #Deployment #Python #Metadata #GCP (Google Cloud Platform) #Data Warehouse #Scripting #ML (Machine Learning) #Scala #Data Pipeline #Schema Design #Cloud #Big Data #SQL (Structured Query Language) #Data Engineering #API (Application Programming Interface)
Role description
Job Description Summary
Our client is looking for a Data Engineer with strong GCP experience. The Data Engineer will design, build, and maintain the data infrastructure that supports the organization's data-related initiatives. Responsibilities involve collaborating with cross-functional teams, including data scientists, analysts, and software engineers, to ensure the efficient and reliable processing, storage, and retrieval of data. The Data Engineer will also develop scalable data pipelines, optimize data workflows, and ensure the quality and integrity of the data.
Required Qualifications:
• 7+ years of experience with SQL and NoSQL
• 7+ years of experience with Python or a comparable scripting language
• 7+ years of experience with data warehouses and infrastructure components
• 7+ years of experience with ETL/ELT and building high-volume data pipelines
• 7+ years of experience with reporting/analytics tools
• 7+ years of experience with query optimization, data structures, transformation, metadata, dependency, and workload management
• 7+ years of experience with Big Data and cloud architecture
• 7+ years of hands-on experience building modern data pipelines within GCP
• 6 years of experience with deployment and scaling of apps in containerized environments
• 7+ years of experience with real-time and streaming technology
• 5+ years of experience soliciting complex requirements and managing relationships with key stakeholders
• 5+ years of experience independently managing deliverables
Preferred Qualifications:
• Experience designing and building data engineering solutions in GCP environments
• Experience with Git, CI/CD pipelines, and other DevOps principles and best practices
• Experience with bash shell scripts, UNIX utilities, and UNIX commands
• ML/AI experience is a plus
• Understanding of software development methodologies, including waterfall and agile
• Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources
• Knowledge of API development
• Experience with complex systems and solving challenging analytical problems
• Strong collaboration and communication skills within and across teams
• Knowledge of data visualization and reporting
• Experience with schema design and dimensional data modeling
• Google Professional Data Engineer Certification
• Knowledge of microservices and SOA
• Experience designing, building, and maintaining data processing systems