Gardner Resources Consulting, LLC

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a 6-month Data Engineer contract position with an undisclosed pay rate. It requires 7+ years of experience with SQL, NoSQL, Python, GCP, ETL/ELT, and data infrastructure. Google Professional Data Engineer Certification is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 2, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Massachusetts, United States
-
🧠 - Skills detailed
#Schema Design #NoSQL #Data Modeling #Unix #Scala #Data Warehouse #DevOps #Python #SQL (Structured Query Language) #Storage #Bash #Data Engineering #Data Pipeline #Data Science #Programming #Data Processing #AI (Artificial Intelligence) #API (Application Programming Interface) #Big Data #Cloud #Visualization #Microservices #Scripting #ML (Machine Learning) #Metadata #Agile #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Deployment #GIT
Role description
Job Description Summary

Our client is looking for a Data Engineer with strong GCP experience. The Data Engineer will design, build, and maintain the data infrastructure that supports the organization's data-related initiatives. Responsibilities include collaborating with cross-functional teams, including data scientists, analysts, and software engineers, to ensure the efficient and reliable processing, storage, and retrieval of data. The Data Engineer will also develop scalable data pipelines, optimize data workflows, and ensure the quality and integrity of the data.

Required Qualifications:
• 7+ years of experience with SQL and NoSQL
• 7+ years of experience with Python or a comparable scripting language
• 7+ years of experience with data warehouses and infrastructure components
• 7+ years of experience with ETL/ELT and building high-volume data pipelines
• 7+ years of experience with reporting/analytics tools
• 7+ years of experience with query optimization, data structures, transformation, metadata, dependency, and workload management
• 7+ years of experience with Big Data and cloud architecture
• 7+ years of hands-on experience building modern data pipelines within GCP
• 6 years of experience deploying and scaling applications in containerized environments
• 7+ years of experience with real-time and streaming technology
• 5+ years of experience soliciting complex requirements and managing relationships with key stakeholders
• 5+ years of experience independently managing deliverables

Preferred Qualifications:
• Experience designing and building data engineering solutions in GCP environments
• Experience with Git, CI/CD pipelines, and other DevOps principles and best practices
• Experience with bash shell scripts, UNIX utilities, and UNIX commands
• ML/AI experience is a plus
• Understanding of software development methodologies, including waterfall and agile
• Ability to leverage multiple tools and programming languages to analyze and manipulate data sets from disparate data sources
• Knowledge of API development
• Experience with complex systems and solving challenging analytical problems
• Strong collaboration and communication skills within and across teams
• Knowledge of data visualization and reporting
• Experience with schema design and dimensional data modeling
• Google Professional Data Engineer Certification
• Knowledge of microservices and SOA
• Experience designing, building, and maintaining data processing systems