

Data Engineer - Analytics & ETL
Featured Role | Apply direct with Data Freelance Hub
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 16, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Mountain View, CA
Skills detailed: #Grafana #Security #Splunk #SQL (Structured Query Language) #Migration #Python #ADF (Azure Data Factory) #Logic Apps #Azure Data Factory #Cloud #Scripting #Azure #KQL (Kusto Query Language) #ETL (Extract, Transform, Load) #Data Analysis #Data Governance #Consulting #BI (Business Intelligence) #Microsoft Power BI #Scala #Data Engineering #Compliance #Data Modeling #Bash #Visualization
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Talent Software Services, Inc, is seeking the following. Apply via Dice today!
Data Engineer - Analytics & ETL
Job Summary: Talent Software Services is in search of a Data Engineer - Analytics & ETL for a contract position in Mountain View, CA. The opportunity will be nine months with a strong chance for a long-term extension.
Position Summary: We are seeking a hands-on Data Engineer with deep expertise in data onboarding (streaming and historical), ETL pipeline development, and dashboard visualization. This role is critical to scaling our analytics infrastructure and supporting high-impact projects across Fabric, ADX/Kusto, and Azure ecosystems.
Primary Responsibilities/Accountabilities:
• The operations team, specifically the analytics subgroup, focuses on migrating standalone data estates into Fabric and integrating data from new vendors, supporting analytics and dashboard creation.
• Migration of data to Fabric, onboarding data from a new vendor, rebuilding dashboards and analytic workflows, and integrating telemetry for unified operational views.
Typical Task Breakdown & Operating Rhythm:
• Onboarding and ingesting data from diverse sources.
• Building scalable, automated ETL pipelines using Azure Data Factory, Logic Apps, and Kusto Query Language.
• Designing and maintaining real-time dashboards with Azure Data Explorer and Power BI.
• Implementing data modeling and governance standards.
• Collaborating with cross-functional teams for infrastructure integration and telemetry.
• Data Onboarding & Parsing: Ingest and normalize structured/unstructured data (live and historical) from diverse sources (e.g., Event Hubs, SQL, Vector).
• ETL Pipeline Development: Build scalable, automated pipelines using Azure Data Factory (ADF), Logic Apps, and Kusto Query Language (KQL).
• Dashboard Development: Design and maintain real-time dashboards using Fabric (Power BI), Azure Data Explorer (ADX), and Azure Data Studio.
• Data Modeling & Governance: Implement robust data models and enforce governance standards across analytics workflows.
• Infrastructure Integration: Collaborate with cross-functional teams to integrate telemetry, license usage, and operational metrics into unified views.
• Performance Optimization: Tune queries and pipelines for speed, reliability, and cost-efficiency.
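As a rough illustration of the onboarding-and-normalization work described above, the sketch below maps events from two sources with inconsistent schemas onto one canonical record, the kind of parsing step that typically precedes an ETL pipeline. The field names and sample events are hypothetical, not taken from the posting.

```python
import json
from datetime import datetime, timezone

# Hypothetical raw events as they might arrive from two different sources
# (e.g., a vendor feed and a historical export) with inconsistent schemas.
RAW_EVENTS = [
    '{"ts": "2025-09-16T12:00:00Z", "user": "alice", "licenses_used": "4"}',
    '{"timestamp": 1758024000, "User": "bob", "licensesUsed": 7}',
]

def normalize(raw: str) -> dict:
    """Parse one raw JSON event and map it onto a single canonical schema."""
    rec = json.loads(raw)
    # Accept either an ISO-8601 string or a Unix epoch for the timestamp.
    ts = rec.get("ts") or rec.get("timestamp")
    if isinstance(ts, (int, float)):
        when = datetime.fromtimestamp(ts, tz=timezone.utc)
    else:
        when = datetime.fromisoformat(ts.replace("Z", "+00:00"))
    # Field names and casing differ per source; coerce types explicitly.
    user = rec.get("user") or rec.get("User")
    used = int(rec.get("licenses_used") or rec.get("licensesUsed"))
    return {"timestamp": when.isoformat(), "user": user, "licenses_used": used}

events = [normalize(r) for r in RAW_EVENTS]
```

In practice this shaping would be expressed in the pipeline tooling itself (ADF mappings, KQL `parse`/`extend` operators), but the schema-reconciliation logic is the same.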
Compelling Story & Candidate Value Proposition
• Opportunity to work on a cutting-edge migration to Fabric.
• Involvement in building new analytics infrastructure and dashboards from the ground up.
• Exposure to modern data engineering tools and real-time telemetry integration.
• Flexibility to work remotely from anywhere in the US.
Qualifications:
• Years of Experience Required: Minimum 5 years, ideally 5-7 years for a Level 3 candidate.
• Degrees or Certifications Required: Not strictly required; experience is prioritized. Certifications such as Azure Data Engineer Associate, Data Analyst Associate, and Splunk Core Certified User are nice to have.
• Best vs. Average: Top candidates will have hands-on experience with Fabric, Azure Data Factory, Kusto, and Splunk, and can demonstrate end-to-end project delivery. Average candidates may lack depth in these areas or practical experience.
• Performance Indicators: Within one month, the candidate should onboard data, transform it, and deliver a simple dashboard from the ingested data.
• Proficiency in Fabric, Azure Data Explorer (ADX/Kusto), SQL, KQL, Azure Data Studio, Event Hubs, Logic Apps, Vector, Splunk, and ADF.
• Strong scripting skills in Python or Bash.
• Experience with data visualization tools (ADX, Power BI, Grafana).
• Familiarity with license operations, SLI/SLO tracking, and real-time telemetry.
• Understanding of data governance, security, and compliance in cloud environments.
• Data Onboarding & Ingestion - Minimum 5 years hands-on experience.
• ETL Pipeline Development (Azure Data Factory, Logic Apps, Kusto) - Minimum 5 years hands-on experience.
• Dashboard Development (Azure Data Explorer, Power BI) - Minimum 5 years hands-on experience.
Preferred:
• Microsoft Certified: Azure Data Engineer Associate
• Microsoft Certified: Power BI Data Analyst Associate
• Splunk Core Certified Power User (not required but a plus)
If this job is a match for your background, we would be honored to receive your application!
Providing consulting opportunities to TALENTed people since 1987, we offer a host of opportunities, including contract, contract to hire, and permanent placement. Let's talk!