

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in Cupertino, CA, for 9 months (possible extension) at a W2 pay rate. Requires 2-5 years of experience with SQL, Python, Snowflake, and modern data tools. MS/Ph.D. in a relevant field preferred.
Country: United States
Currency: $ USD
Day rate: 544
Date discovered: August 1, 2025
Project duration: More than 6 months
Location type: Hybrid
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Cupertino, CA
Skills detailed: #GIT #Data Pipeline #Version Control #Snowflake #Data Lifecycle #Model Deployment #Deployment #SQL (Structured Query Language) #Unix #Shell Scripting #Tableau #Scripting #Automation #Jupyter #Bash #Data Processing #Data Exploration #Data Modeling #Spark (Apache Spark) #Data Science #dbt (data build tool) #AWS (Amazon Web Services) #Python #Docker #Data Engineering #AWS S3 (Amazon Simple Storage Service) #Airflow #Scala #Big Data #GitHub #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Storage #Computer Science #Data Storage #Kubernetes #Statistics #Documentation
Role description
Job Title: Data Engineer (SW)
Location: Cupertino, CA (Hybrid Schedule, local consultants only)
Duration: 9 months (possible extension)
W2 Only Role
Keywords: Snowflake, Python, Shell, Unix, SQL, Database, Tableau, GitHub, Spark, Bash
Key Qualifications:
• 2–5 years of experience in data engineering, software engineering, or data analytics roles. Proficient in SQL and Python; comfortable with Bash or shell scripting.
• Hands-on experience with modern data tooling:
• Spark for large-scale data processing
• Airflow for workflow orchestration
• Snowflake and DBT for data transformation and modeling
• AWS S3 for data storage and movement
• Docker and Kubernetes for containerization and deployment workflows
• Jupyter Notebooks for collaborative data exploration and documentation
• Familiarity with Git-based CI/CD pipelines and collaborative code development.
• Solid understanding of data warehousing, data modeling, and working with big data ecosystems.
• Foundational knowledge of statistics, including mean, median, standard deviation, and variance.
• Strong problem-solving skills with the ability to break down complex issues into manageable components.
• Committed to good software engineering practices such as testing, documentation, and code quality checks.
• Able to clearly communicate technical concepts to both technical peers and non-technical stakeholders.
• Familiarity with battery systems or electrical engineering is a plus, but not required.
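The foundational statistics named above (mean, median, standard deviation, variance) can be sketched with Python's standard library; the readings below are made-up illustration values, not data from the role:

```python
# Sketch of the basic descriptive statistics the qualifications mention,
# using Python's stdlib `statistics` module. Input values are hypothetical.
import statistics

readings = [3.1, 2.8, 3.5, 3.0, 2.9, 3.3]  # hypothetical measurements

mean = statistics.mean(readings)        # arithmetic average
median = statistics.median(readings)    # middle value of the sorted data
variance = statistics.variance(readings)  # sample variance (n - 1 divisor)
std_dev = statistics.stdev(readings)    # sample standard deviation

print(f"mean={mean:.2f} median={median:.2f} var={variance:.3f} sd={std_dev:.3f}")
```

For population (rather than sample) figures, `statistics.pvariance` and `statistics.pstdev` divide by n instead of n - 1.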
Job Description:
• As a Data Engineer, you will design, build, and maintain scalable ELT pipelines using SQL and Python.
• Work across the full data lifecycle, from ingestion and transformation to model deployment and reporting.
• Collaborate with data scientists, engineers, and product managers to deliver clean, reliable, and well-documented data.
• Implement and manage workflows using Airflow, while ensuring traceability and version control via GitHub.
• Support transformation logic and data modeling using DBT, with data housed primarily in Snowflake.
• Use Jupyter Notebooks and ad-hoc analysis to support business questions and drive actionable insights.
• Build tools to monitor, validate, and test data pipelines, ensuring high availability and quality.
• Contribute to automation efforts, improving the team's efficiency and reducing manual work.
• Provide occasional support for urgent data reporting needs.
• Engage constructively with both technical and non-technical colleagues to ensure data solutions align with business goals.
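The "monitor, validate, and test data pipelines" duty above can be sketched as a small batch-validation check in plain Python; the field names and batch contents here are hypothetical, not part of the posting:

```python
# Minimal sketch of a pipeline data-quality check: flag rows in a batch
# that are missing required fields or carry nulls where values are required.
# Field names ("id", "ts", "value") are illustrative assumptions.

def validate_rows(rows, required_fields, non_null_fields):
    """Return a list of (row_index, problem) tuples for failing rows."""
    problems = []
    for i, row in enumerate(rows):
        for field in required_fields:
            if field not in row:
                problems.append((i, f"missing field: {field}"))
        for field in non_null_fields:
            # Only null-check fields that are present; absence is
            # already reported by the required-fields check above.
            if field in row and row[field] is None:
                problems.append((i, f"null value: {field}"))
    return problems

batch = [
    {"id": 1, "ts": "2025-08-01", "value": 42.0},
    {"id": 2, "ts": None, "value": 3.14},   # null timestamp
    {"id": 3, "value": 7.0},                # "ts" missing entirely
]

issues = validate_rows(
    batch,
    required_fields=["id", "ts", "value"],
    non_null_fields=["ts"],
)
for idx, msg in issues:
    print(f"row {idx}: {msg}")
```

In practice a check like this would run as a task inside the Airflow DAG (or as a dbt test) and fail the pipeline run when `issues` is non-empty.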
Education:
• MS or Ph.D. in Computer Science, Software Engineering, Statistics, Electrical Engineering, Battery Engineering, or related technical field.
• Certifications in Six Sigma (CSSBB) or Quality Engineering (CQE) are a plus but not required.