

Openkyber
Streaming Data
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in the truck industry, based in Renton, WA, on a long-term contract with a competitive pay rate. Required skills include Confluent Kafka, Google Dataflow, Snowflake, and Tableau, with preferred experience in Aftersales service data.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Renton, WA
-
🧠 - Skills detailed
#ETL (Extract, Transform, Load) #Azure #Data Processing #AWS (Amazon Web Services) #Normalization #Apache Beam #Data Modeling #Storage #API (Application Programming Interface) #Snowflake #Python #Data Governance #Migration #Data Integration #Data Engineering #Scala #Data Architecture #Data Pipeline #Java #Tableau #Security #Cloud #Data Quality #GCP (Google Cloud Platform) #Data Cleansing #Programming #Data Ingestion #Kafka (Apache Kafka) #Data Storage #Dataflow #SQL (Structured Query Language)
Role description
Job Description: Data Architect (Truck Industry)
Location: Renton, WA (5 days onsite)
Duration: Long-term contract
About the Role
We are seeking a skilled Data Architect to lead the migration, analysis, and reporting of service and repair data, including external data sourced from dealers in the truck industry. This role plays a critical part in enabling data-driven decision-making for our Aftersales operations by designing and implementing robust data pipelines, data models, and real-time reporting solutions.
Key Responsibilities
Design and architect scalable data ingestion pipelines to process service and repair data from multiple sources, including external dealer data streams.
Ingest real-time data using Confluent Kafka and ensure reliable, high-throughput data flow.
Utilize Google Dataflow to perform data cleansing, normalization, and transformation operations.
Model and optimize data storage in Snowflake to support efficient querying and reporting.
Develop and maintain interactive and insightful dashboards using Tableau to enable business users to monitor Aftersales service performance.
Implement real-time reporting capabilities by pushing transformed data through APIs to dashboards.
Collaborate closely with data engineers, analysts, and business stakeholders to understand data requirements and deliver optimal solutions.
Establish and enforce data governance, quality, and security standards.
Continuously improve data architecture to support scalability, performance, and evolving business needs.
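To give candidates a concrete sense of the cleansing and normalization step in the pipeline above, here is a minimal sketch. All field names (dealer_id, vin, repair_code, and so on) are hypothetical, since the posting does not specify a schema; in a real Dataflow pipeline this logic would sit inside an Apache Beam DoFn rather than a bare function.

```python
from datetime import datetime, timezone

def clean_service_record(raw: dict):
    """Cleanse and normalize one raw dealer service record.

    Field names here are illustrative only; the actual schema would
    come from the dealer data contracts mentioned in the role.
    """
    # Drop records missing any required key (basic data-quality gate).
    required = ("dealer_id", "vin", "repair_code", "serviced_at")
    if any(not raw.get(k) for k in required):
        return None

    return {
        # Normalize identifiers to a canonical, trimmed, upper-case form.
        "dealer_id": str(raw["dealer_id"]).strip().upper(),
        "vin": str(raw["vin"]).strip().upper(),
        "repair_code": str(raw["repair_code"]).strip(),
        # Parse the timestamp and re-emit it as UTC ISO-8601.
        "serviced_at": datetime.fromisoformat(raw["serviced_at"])
            .astimezone(timezone.utc)
            .isoformat(),
        # Coerce the optional cost field to a float, defaulting to 0.0.
        "cost_usd": float(raw.get("cost_usd") or 0.0),
    }
```

Invalid records return None so a downstream step can route them to a dead-letter sink instead of Snowflake.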
Required Qualifications
Proven experience as a Data Architect or similar role in data-intensive environments.
Strong expertise in data ingestion technologies, especially Confluent Kafka.
Hands-on experience with Google Dataflow (Apache Beam) for data processing and transformation.
Deep knowledge of cloud data warehousing concepts, with proficiency in Snowflake.
Experience creating reports and dashboards using Tableau.
Solid understanding of data modeling techniques (star schema, snowflake schema, normalized forms).
Familiarity with API integration for real-time data delivery.
Strong problem-solving skills and attention to data quality and reliability.
Excellent communication skills and ability to translate technical concepts for business stakeholders.
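For context on the data-modeling qualification above: a star schema keeps one central fact table of measures, keyed to surrounding dimension tables that describe each event. A toy sketch with hypothetical table and column names follows, using SQLite only so it is self-contained; in this role the equivalent DDL would target Snowflake.

```python
import sqlite3

# In-memory database; all table and column names are illustrative only.
con = sqlite3.connect(":memory:")
con.executescript("""
    -- Dimension tables describe the "who/what" of each service event.
    CREATE TABLE dim_dealer (dealer_key INTEGER PRIMARY KEY, dealer_name TEXT);
    CREATE TABLE dim_repair (repair_key INTEGER PRIMARY KEY, repair_desc TEXT);
    -- The fact table holds measures plus foreign keys into the dimensions.
    CREATE TABLE fact_service (
        dealer_key INTEGER REFERENCES dim_dealer(dealer_key),
        repair_key INTEGER REFERENCES dim_repair(repair_key),
        cost_usd   REAL
    );
""")
con.execute("INSERT INTO dim_dealer VALUES (1, 'Renton Truck Center')")
con.execute("INSERT INTO dim_repair VALUES (1, 'Brake service')")
con.execute("INSERT INTO fact_service VALUES (1, 1, 450.0), (1, 1, 300.0)")

# A typical reporting query: aggregate the facts, label them via dimensions.
row = con.execute("""
    SELECT d.dealer_name, r.repair_desc, SUM(f.cost_usd)
    FROM fact_service f
    JOIN dim_dealer d USING (dealer_key)
    JOIN dim_repair r USING (repair_key)
    GROUP BY d.dealer_name, r.repair_desc
""").fetchone()
```

The same join-to-dimensions pattern is what makes star schemas fast to query and straightforward to expose to Tableau.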
Preferred Qualifications
Experience in the automotive or truck industry, particularly in Aftersales service data.
Knowledge of additional cloud platforms and data tools (e.g., AWS, Google Cloud Platform, Azure).
Programming skills in Python, SQL, or Java.
Understanding of dealer management systems and external data integration challenges.
For applications and inquiries, contact: hirings@openkyber.com




