Jobs via Dice

Requirement for Visualization Engineer ---- Remote

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Platform & Visualization Engineer, a 15-month remote contract at $XX/hour. Key skills include Python, SQL, AWS, and React. Experience with data ingestion pipelines and visualization is essential; no certifications required, but hands-on experience is prioritized.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 5, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
1099 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Los Altos, CA
-
🧠 - Skills detailed
#S3 (Amazon Simple Storage Service) #Documentation #Storage #Libraries #Databases #Data Pipeline #EC2 #Metadata #AWS S3 (Amazon Simple Storage Service) #Data Engineering #React #AWS (Amazon Web Services) #ML (Machine Learning) #Visualization #Automation #FastAPI #Python #MySQL #Cloud #SQL (Structured Query Language) #Datasets #Lambda (AWS Lambda) #Flask #Plotly
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Kairos, is seeking the following. Apply via Dice today!

Hi, please let me know if you're comfortable with the position detailed below. This position is an urgent hire.

Location: Los Altos, CA ---- Remote
Duration: 15-month contract
Engagement: 1099 only

Data Platform & Visualization Engineer

Python-focused data platform engineer with experience building reliable ingestion pipelines, SQL-backed metrics systems, backend APIs, and web-based dashboards for ML and experimentation workflows in AWS environments.

The company collects a large amount of data on a monthly trip, and potentially from simulations, but currently handles little of it. The monthly trip gathers a large volume of vehicle data: previously this was mostly numeric vehicle readings, but the team is starting to add substantial camera and LIDAR data, so the scale of the data is growing quickly.

They are looking for someone junior to help with the existing ingestor system that ingests the data and uploads it to S3. Bandwidth at the collection site is very limited (100 Mbps) because it is a remote location, and they are trying to upload 500 GB of data per night. They want orchestration logic for sequencing and prioritizing upload steps. Once the data is in S3, they need someone with front-end experience to navigate, filter, and create metadata, which will also require some back-end experience to handle the data.

Current process: it is not very smooth, but they make it work. If they cannot finish uploading the data on site, they have to wait hours, or drive back to the office where there is a fast connection, and upload from there.
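The sequencing-and-prioritization logic described above could be sketched roughly as a priority queue that drains a nightly byte budget. This is an illustration only, not the client's actual system: the `UploadTask` fields, the budget model, and the pluggable `upload` callable (where a real boto3 S3 upload would go) are all assumptions.

```python
import heapq
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass(order=True)
class UploadTask:
    priority: int               # lower number = upload first (assumed convention)
    size_bytes: int             # tie-break within a priority: smaller files first
    path: str = field(compare=False)

def drain_queue(tasks: List[UploadTask],
                upload: Callable[[str], None],
                budget_bytes: int) -> List[str]:
    """Upload tasks in priority order until the nightly byte budget runs out.

    `upload` is a pluggable callable (e.g. a boto3 s3.upload_file wrapper in
    production) so the sequencing logic can be tested without network access.
    """
    heap = list(tasks)
    heapq.heapify(heap)  # orders by (priority, size_bytes)
    sent, used = [], 0
    while heap:
        task = heapq.heappop(heap)
        if used + task.size_bytes > budget_bytes:
            continue  # this file no longer fits tonight; try smaller ones
        upload(task.path)
        used += task.size_bytes
        sent.append(task.path)
    return sent
```

With a 500-unit budget, small high-priority files (telemetry, labels) go out first and an oversized video is deferred to the next window, which mirrors the "prioritize certain steps under limited bandwidth" requirement.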
The team hopes to streamline and optimize this process.

Clarification: there is a queuing system that holds data locally and gradually feeds it out to cloud storage. That system needs to be refined and maintained.

Data Sources
Data comes primarily from actual vehicles. The system is deployed in California.

Data Transfer Process
Currently, data transfer involves manually removing a hard drive, pressing buttons, and uploading. TRI wants more sophisticated Lambda functions to be triggered once the data is present.

Location
The HQ is in Los Altos. Data is collected approximately three hours away in Northern California, near Willows; the industrial system and the internet bottleneck are located there.

Role Summary
The team wants a data engineer to optimize the workflow of getting data from vehicles and simulations into AWS, leveraging AWS tooling for further processing, including metadata and Lambda functions. The role involves pre- and post-processing data and tagging it. They also want a dashboard to manage data, as they currently use Notepad; this is essentially digital asset management. The role is more back-end heavy, with some light front-end development for usability. SQL expectations are high.

Data Schema
The team has some schema defined for each of the datasets they envision. They currently use Google Sheets, but it is "a little bit janky when it comes to sorting and retrieving the results," and uploading is also a little janky.

Soft skills: strong collaboration and communication, an ownership mindset, analytical problem-solving, adaptability, attention to detail, proactive initiative, effective time management, and the ability to multitask efficiently.

Nice to have: experience with ML experiment tracking, data provenance systems, workflow orchestration frameworks, advanced data visualization, LLM-based automation, and cloud-deployed Python services.
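The "Lambda functions triggered once the data is present" idea could look roughly like the handler below, which parses the standard S3 event notification payload and extracts metadata for indexing. The bucket layout (trip-date/sensor/file) and field names are assumptions for illustration; the posting does not specify them, and the real handler would also persist records to a metadata store.

```python
import urllib.parse

def lambda_handler(event, context=None):
    """Hypothetical S3-triggered Lambda: extract basic metadata for each
    newly uploaded object so it can be indexed for a data-management dashboard.

    Assumes an illustrative key layout of <trip-date>/<sensor>/<filename>.
    """
    records = []
    for rec in event.get("Records", []):
        bucket = rec["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded; decode before splitting.
        key = urllib.parse.unquote_plus(rec["s3"]["object"]["key"])
        size = rec["s3"]["object"]["size"]
        trip, sensor, filename = (key.split("/", 2) + ["", ""])[:3]
        records.append({
            "bucket": bucket,
            "key": key,
            "size_bytes": size,
            "trip": trip,
            "sensor": sensor,
            "filename": filename,
        })
        # In a real deployment this is where each record would be written
        # to a metadata store (e.g. DynamoDB or Postgres) for the dashboard.
    return {"indexed": records}
```

Wiring this up would mean configuring an S3 event notification (or EventBridge rule) on the ingest bucket for `ObjectCreated` events, replacing the manual hard-drive-and-buttons step with automatic tagging on arrival.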
Tech stack: Python, SQL, AWS (S3, EC2), FastAPI/Flask, React, relational databases (Postgres/MySQL), data visualization libraries (Plotly/Vega), custom orchestration pipelines, and LLM-based annotation tooling.

Typical Day
Build and maintain Python data pipelines, backend APIs, and React dashboards; monitor and troubleshoot ingestion and metrics workflows; collaborate with researchers; ensure data reliability, traceability, and actionable visual insights.

Training
Training will be a blend of hands-on onboarding, SME mentoring, and documentation review, focused on understanding the existing pipelines, dashboards, AWS infrastructure, metrics systems, and LLM integration, with ongoing learning as the platform evolves.

Certifications
No certifications are required. Relevant AWS, data engineering, or Python certifications are nice-to-have and can make a candidate stand out, but hands-on experience is far more important. Candidates with demonstrated skills in Python data pipelines, SQL-backed metrics systems, backend APIs, React dashboards, and AWS workflows should be prioritized even if they hold no formal certifications; lack of certification should not exclude qualified candidates. Focus on practical experience, problem-solving ability, and project ownership.

Laxman Andoli | Lead TAG | Kairos Technologies Inc
M : | O: Ext 302 | E: