

Cyberobotix
Data Engineering and Data Analyst (exp in SAP Data)
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Engineering and Data Analyst position focused on SAP data, based in Boston, MA (hybrid), for a contract duration of 6-12 months; the pay rate is unspecified. Key skills include SQL, BigQuery, Snowflake, and BI tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 15, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Qlik #Observability #Monitoring #Snowflake #Data Engineering #SAP #Data Modeling #Clustering #Databases #Tableau #Looker #BI (Business Intelligence) #Data Analysis #Documentation #ETL (Extract, Transform, Load) #Datasets #SQL (Structured Query Language) #SaaS (Software as a Service) #BigQuery
Role description
Role: Data Engineering and Data Analyst (exp in SAP Data)
Location: Boston, MA (Hybrid)
Type: Contract
Duration: 6-12 months
Role summary:
The consultant will work across Google BigQuery and/or Snowflake to develop analytics-ready datasets and power dashboards in Looker, Tableau, and/or Qlik Sense.
Responsibilities
Data engineering
1. Build and maintain pipelines from SaaS tools, operational databases, APIs, and flat files into BigQuery/Snowflake.
1. Design curated datasets and semantic-ready tables/views; improve query performance (partitioning/clustering, pruning, aggregation strategies).
1. Set up data observability: load monitoring, failure alerts, and incident/runbook documentation.
1. Troubleshoot pipeline failures and data discrepancies; conduct root-cause analysis and remediation.
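The incremental-loading and idempotency ideas behind these pipeline tasks can be sketched in plain Python. This is an illustrative sketch only, not the client's actual pipeline; the table shape and column names (`order_id`, `updated_at`) are assumptions.

```python
# Minimal sketch of an idempotent incremental load into a warehouse table,
# keyed on a primary key and driven by a high-watermark timestamp.
# All names (order_id, updated_at) are illustrative assumptions.

def incremental_load(target: dict, source_rows: list, watermark: str) -> str:
    """Upsert source rows newer than `watermark` into `target`
    (a dict keyed by order_id); return the new watermark."""
    new_watermark = watermark
    for row in source_rows:
        if row["updated_at"] <= watermark:
            continue  # already loaded; skipping keeps reruns idempotent
        target[row["order_id"]] = row  # upsert by key, never blind append
        new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

rows = [
    {"order_id": 1, "updated_at": "2026-01-01", "amount": 10},
    {"order_id": 2, "updated_at": "2026-01-02", "amount": 20},
]
table: dict = {}
wm = incremental_load(table, rows, "2026-01-01")   # loads only order 2
wm2 = incremental_load(table, rows, wm)            # rerun is a no-op
```

In a real BigQuery/Snowflake pipeline the upsert-by-key step would typically be a `MERGE` statement, but the rerun-safety property is the same.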
Analytics
1. Gather requirements, define metrics/KPIs, and translate business needs into trusted reporting.
1. Build and optimize dashboards in Looker/Tableau/Qlik Sense with strong usability, performance, and governance.
1. Enable self-service: certified datasets, standardized dimensions/measures, consistent metric definitions.
1. Provide ad-hoc analysis and recommendations; communicate insights clearly to stakeholders.
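One common way to achieve "consistent metric definitions" across dashboards is a single shared registry that every consumer computes from. A minimal sketch, assuming hypothetical metric names and an illustrative `orders` schema:

```python
# Sketch of a shared metric registry so Looker/Tableau/Qlik dashboards
# (or ad-hoc analysis) compute each KPI from one definition, not many.
# Metric names and the `orders` row shape are illustrative assumptions.

METRICS = {
    # metric name -> function over a list of row dicts
    "order_count": lambda rows: len(rows),
    "gross_revenue": lambda rows: sum(r["amount"] for r in rows),
    "avg_order_value": lambda rows: (
        sum(r["amount"] for r in rows) / len(rows) if rows else 0.0
    ),
}

def compute(metric: str, rows: list) -> float:
    """Every consumer resolves metrics through this one entry point."""
    return METRICS[metric](rows)

orders = [{"amount": 10.0}, {"amount": 30.0}]
print(compute("gross_revenue", orders))    # 40.0
print(compute("avg_order_value", orders))  # 20.0
```

In BI terms, the registry plays the role that LookML measures or a certified Tableau data source play: a governed, single source of truth for each KPI.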
Required qualifications
1. Strong SQL and hands-on experience with BigQuery and/or Snowflake.
1. Experience with at least one BI tool: Looker, Tableau, or Qlik Sense (multi-tool experience preferred).
1. Experience working with SAP data.
1. Solid data modeling fundamentals (dimensional modeling, metric grain, conformed dimensions).
1. Experience with pipeline/orchestration concepts (scheduling, retries, idempotency, incremental loading).
1. Strong stakeholder management skills; can drive clarity and deliver iteratively.
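The orchestration concepts listed above (retries in particular) can be illustrated with a short sketch of retry-with-exponential-backoff around a flaky extraction task. The task, delays, and error type are hypothetical:

```python
import time

# Sketch of the retry/backoff behavior an orchestrator applies to a
# flaky extraction step. Delays and the task itself are illustrative.

def run_with_retries(task, max_attempts=3, base_delay=0.01):
    """Call `task` until it succeeds or attempts run out, sleeping
    base_delay * 2**attempt between failures (exponential backoff)."""
    for attempt in range(max_attempts):
        try:
            return task()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # out of attempts: surface the failure
            time.sleep(base_delay * (2 ** attempt))

calls = {"n": 0}
def flaky_extract():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "ok"

result = run_with_retries(flaky_extract)  # succeeds on the 3rd attempt
```

Schedulers such as Airflow or Cloud Composer provide this behavior declaratively; the sketch just makes the mechanism explicit.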






