

OSI Engineering
Data Pipeline Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Pipeline Engineer on a 5-month contract in Cupertino, CA, offering $85.00 - $100.00 W2 pay. Key skills include Python, SQL, Airflow, and experience with multimedia data processing and cloud platforms. A Bachelor's degree is required.
Country
United States
Currency
$ USD
-
Day rate
800
-
Date
May 15, 2026
Duration
3 to 6 months
-
Location
Hybrid
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Cupertino, CA
-
Skills detailed
#Monitoring #Airflow #Data Security #Database Design #SQL (Structured Query Language) #TensorFlow #Database Schema #Databases #PyTorch #Snowflake #OpenCV (Open Source Computer Vision Library) #Data Processing #Compliance #Normalization #Security #Trino #Cloud #Scala #Spark (Apache Spark) #Data Pipeline #Data Warehouse #Computer Science #Data Modeling #Python #AI (Artificial Intelligence) #ML (Machine Learning)
Role description
A globally leading consumer device company headquartered in Cupertino, CA is looking for a Data Pipeline Engineer to join their team.
Job Requirements:
• Prototyping and developing agentic AI for data analysis and processing.
• Building efficient and scalable data pipelines with monitoring and reporting for multimedia data.
β’ Developing dashboards to visualize and provide insights.
β’ Identifying entities, data, and their relationships within the application, considering the constraints of the stack (e.g. SQL, vector, and unstructured data).
β’ Creating and maintaining database schemas, choosing appropriate data types, and understanding normalization and denormalization.
• Optimizing and tuning databases and queries for maximum performance and reliability across SQL, vector, and unstructured databases.
β’ Ensuring compliance with data security and privacy regulations.
β’ Developing front-end and back-end components of data-driven applications, creating APIs, and integrating data with various technologies.
Candidate qualifications:
β’ 6 years of software development experience in Data Modeling, Database Design & Optimization, scalable data pipelines for image and video data processing, and full-stack development using Python and SQL.
β’ 3 years of experience developing data warehouse and data analytics solutions using technologies such as Airflow, Trino, Spark, and Snowflake.
β’ 3 years of experience developing data processing pipelines on cloud platforms.
β’ Demonstrated software development skills using Python.
• Proficiency with tools such as FFmpeg, GStreamer, and OpenCV, and ideally AI/ML frameworks such as TensorFlow or PyTorch, for video analysis.
β’ Agentic AI development experience preferred.
• Bachelor's degree, Master's degree, or above in Computer Science, Software Engineering, or a related field.
Type: Contract
Duration: 5 months with extension
Work Location: Cupertino, CA (hybrid)
Pay range: $85.00 - $100.00 W2 (DOE)






