

Holistic Partners, Inc.
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineer contract position in Boca Raton, Florida, requiring 5–8+ years of experience in data engineering, strong SQL skills, and expertise in Google Cloud Platform. Key responsibilities include building data pipelines and optimizing data warehousing.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boca Raton, FL
-
🧠 - Skills detailed
#Jira #Deployment #Complex Queries #Data Quality #GCP (Google Cloud Platform) #Scrum #Scala #AI (Artificial Intelligence) #Big Data #Agile #Code Reviews #BigQuery #Leadership #Data Modeling #Python #Observability #Databases #Data Pipeline #Cloud #IAM (Identity and Access Management) #ETL (Extract, Transform, Load) #Data Engineering #Datasets #SQL (Structured Query Language) #Informatica #Data Warehouse #Storage
Role description
Job Opportunity: Data Engineer
Location: Boca Raton, Florida (Hybrid)
Duration: Contract
Key Responsibilities
Top Skills - Must Haves
• SQL
• Data engineering
• Big data
• Data warehousing
• Python
• Google Cloud Platform (GCP)
Additional Skills & Qualifications
• 5–8+ years of hands-on Data Engineering experience with strong fundamentals in data modeling, ETL/ELT, and distributed systems.
• Proven expertise with Google Cloud Platform (GCP), especially BigQuery, IAM, and data services.
• Strong SQL (including complex queries and stored procedures) and performance tuning.
• Informatica experience (pipeline design, orchestration, optimization)—approximately 30% of daily work.
• Experience building scalable, production-grade pipelines and processes with strict performance SLAs (micro-processes under 2 minutes).
• Track record working with hybrid teams (onshore/offshore), acting as a senior individual contributor while providing technical guidance.
• Design & Build Data Pipelines: Develop robust, scalable data pipelines with Informatica (~30% of the work) and native GCP services for the remaining workload.
• Data Warehousing on GCP: Model, optimize, and manage datasets in BigQuery; build performant SQL and stored procedures to support AI decisioning and analytics.
• Operational Data Store (ODS): Implement and maintain an ODS to deliver timely, trusted data to AI agents powering CX experiences (chat and voice).
• Performance Optimization: Build efficient micro-processes (targeting < 2 minutes) and automate tasks like object cloning to accelerate deployments and iterations.
• Quality & Reliability: Ensure data quality, observability, lineage, and resiliency across ingestion, transformation, and serving layers.
• Collaboration & Leadership: Partner with onsite/offshore engineers; act as a senior IC who can provide technical guidance, mentoring, and code reviews.
• Cloud Databases: Leverage GCP storage and database services (e.g., BigQuery, Cloud Spanner, Cloud Bigtable) as appropriate for workload patterns.
• AI Enablement: Structure and deliver the right features and signals so AI agents can make accurate, low-latency decisions to improve customer experiences.
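To make the "micro-processes under 2 minutes" and "data quality" expectations above concrete, here is a minimal Python sketch of a timed, quality-gated transformation step. The function names (`clean_orders`, `run_micro_process`), the field names, and the sample data are hypothetical illustrations, not part of the role description; a real pipeline here would run in Informatica or native GCP services.

```python
import time

# Target from the posting: micro-processes should complete in under 2 minutes.
SLA_SECONDS = 120

def clean_orders(rows):
    """Hypothetical data-quality step: drop rows missing a key, normalize amounts."""
    cleaned = []
    for row in rows:
        if row.get("order_id") is None:
            continue  # quality gate: reject records without a primary key
        cleaned.append({
            "order_id": row["order_id"],
            "amount_usd": round(float(row.get("amount", 0.0)), 2),
        })
    return cleaned

def run_micro_process(rows):
    """Run the step, measure wall-clock time, and report SLA compliance."""
    start = time.monotonic()
    result = clean_orders(rows)
    elapsed = time.monotonic() - start
    return result, elapsed <= SLA_SECONDS
```

In production, the SLA check would typically be emitted as an observability metric rather than a boolean, so pipeline monitoring can alert on breaches.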
Secondary Skills - Nice to Haves
• Agile coaching
• Scrum methodology
• eCommerce
• ERPs
• Jira
• Smartsheet
• Confluence