

Data Engineer II
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer II on a contract of more than 6 months, paying $62-$67/hr. Located in Cupertino, CA, it requires 2-5 years of experience, proficiency in SQL and Python, and familiarity with Snowflake, Airflow, and data modeling tools.
Country
United States
Currency
$ USD
Day rate
536
Date discovered
August 1, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Cupertino, CA
Skills detailed
#Data Pipeline #Version Control #Snowflake #Data Lifecycle #Model Deployment #Deployment #SQL (Structured Query Language) #Unix #Shell Scripting #Tableau #Scripting #Automation #Jupyter #Bash #Data Processing #Data Exploration #Data Modeling #Spark (Apache Spark) #Data Science #dbt (data build tool) #AWS (Amazon Web Services) #Python #Docker #Data Engineering #AWS S3 (Amazon Simple Storage Service) #Airflow #Scala #Big Data #GitHub #S3 (Amazon Simple Storage Service) #ETL (Extract, Transform, Load) #Storage #Computer Science #Data Storage #Kubernetes #Statistics #Documentation
Role description
• Data Engineer (SW)
As a Data Engineer, the candidate will:
• Design, build, and maintain scalable ELT pipelines using SQL and Python.
• Work across the full data lifecycle, from ingestion and transformation to model deployment and reporting.
• Collaborate with data scientists, engineers, and product managers to deliver clean, reliable, and well-documented data.
• Implement and manage workflows using Airflow, while ensuring traceability and version control via GitHub.
• Support transformation logic and data modeling using dbt, with data housed primarily in Snowflake.
• Use Jupyter Notebooks and ad hoc analysis to support business questions and drive actionable insights.
• Build tools to monitor, validate, and test data pipelines, ensuring high availability and quality.
• Contribute to automation efforts, improving the team's efficiency and reducing manual work.
• Provide occasional support for urgent data reporting needs.
• Engage constructively with both technical and non-technical colleagues to ensure data solutions align with business goals.
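The extract-transform-validate flow described in the responsibilities above can be sketched in plain Python. This is an illustrative, self-contained example with made-up data and function names, not the client's codebase; in practice each step would run as an Airflow task reading from S3 and writing to Snowflake via dbt models.

```python
# Minimal ELT-style pipeline sketch (hypothetical data; stand-ins for
# S3 ingestion, a dbt transformation, and a data-quality check).

def extract():
    # Ingest raw records (stand-in for reading from S3 or an API).
    return [
        {"user_id": 1, "amount": "19.99"},
        {"user_id": 2, "amount": "5.00"},
        {"user_id": 2, "amount": "12.50"},
    ]

def transform(rows):
    # Cast types and aggregate spend per user (stand-in for a dbt model).
    totals = {}
    for row in rows:
        totals[row["user_id"]] = totals.get(row["user_id"], 0.0) + float(row["amount"])
    return totals

def validate(totals):
    # Basic data-quality gate before loading: no negative totals.
    if any(v < 0 for v in totals.values()):
        raise ValueError("negative spend detected")
    return totals

def run_pipeline():
    return validate(transform(extract()))

print(run_pipeline())  # {1: 19.99, 2: 17.5}
```

Structuring the pipeline as small pure functions like this is what makes it easy to test and to wrap each step as an orchestrated Airflow task later.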
Key Qualifications:
• 2 to 5 years of experience in data engineering, software engineering, or data analytics roles.
• Proficient in SQL and Python; comfortable with Bash or shell scripting.
• Hands-on experience with modern data tooling:
  • Spark for large-scale data processing
  • Airflow for workflow orchestration
  • Snowflake and dbt for data transformation and modeling
  • AWS S3 for data storage and movement
  • Docker and Kubernetes for containerization and deployment workflows
  • Jupyter Notebooks for collaborative data exploration and documentation
• Familiarity with Git-based CI/CD pipelines and collaborative code development.
• Solid understanding of data warehousing, data modeling, and working with big data ecosystems.
• Foundational knowledge of statistics, including mean, median, standard deviation, and variance.
• Strong problem-solving skills with the ability to break down complex issues into manageable components.
• Committed to good software engineering practices such as testing, documentation, and code quality checks.
• Able to clearly communicate technical concepts to both technical peers and non-technical stakeholders.
• Familiarity with battery systems or electrical engineering is a plus but not required.
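The statistics fundamentals listed above (mean, median, standard deviation, variance) are all available in Python's standard library; the sample values here are illustrative only.

```python
import statistics

# Illustrative sample (not real data from this role).
samples = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = statistics.mean(samples)            # 5.0
median = statistics.median(samples)        # 4.5
stdev = statistics.pstdev(samples)         # population std dev: 2.0
variance = statistics.pvariance(samples)   # population variance: 4.0

print(mean, median, stdev, variance)
```

Note the distinction between the population functions (`pstdev`, `pvariance`) and the sample functions (`stdev`, `variance`), which divide by n-1 instead of n.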
Education:
• MS or Ph.D. in Computer Science, Software Engineering, Statistics, Electrical Engineering, Battery Engineering, or a related technical field.
• Certifications in Six Sigma (CSSBB) or Quality Engineering (CQE) are a plus but not required.
Note:
• Location: Cupertino, CA
• Onsite/Offsite: Onsite (Hybrid Schedule)
• Keywords: Snowflake, Python, Shell, Unix, SQL, Database, Tableau, GitHub, Spark, Bash
Pay Range: $62/hr on W2 - $67/hr on W2
The specific compensation for this position will be determined by a number of factors, including the scope, complexity, and location of the role, as well as the cost of labor in the market; the skills, education, training, credentials, and experience of the candidate; and other conditions of employment. The client's full-time consultants have access to benefits including medical, dental, and vision coverage and 401K contributions, as well as PTO, sick leave, and any other benefits mandated by the applicable states or localities where you reside or work.