

KellyMitchell Group
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Glendale, California, offering a contract of unspecified length at a pay rate between $51.00 and $73.00 per hour. Key requirements include 5+ years of data engineering experience, proficiency in Python, Java, or Scala, and expertise in AWS, Spark, and Airflow.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date
December 6, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glendale, CA
-
🧠 - Skills detailed
#Kubernetes #Python #GraphQL #Data Pipeline #SQL (Structured Query Language) #Data Science #Airflow #Cloud #Databricks #Scrum #Programming #Datasets #Delta Lake #Scala #Agile #Data Governance #Data Modeling #AWS (Amazon Web Services) #Spark (Apache Spark) #Documentation #API (Application Programming Interface) #Data Quality #Data Engineering #Java
Role description
Job Summary:
Our client is seeking a Senior Data Engineer to join their team! This position is located in Glendale, California.
Duties:
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
• Build tools and services to support data discovery, lineage, governance, and privacy
• Collaborate with other software and data engineers and cross-functional teams
• Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS (an illustrative Airflow sketch follows this list)
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting internal and external standards and best practices for pipeline configurations, naming conventions, and more
• Ensure high operational efficiency and quality of Core Data platform datasets to meet SLAs and ensure reliability and accuracy for stakeholders in Engineering, Data Science, Operations, and Analytics
• Participate in Agile and Scrum ceremonies to collaborate and refine team processes
• Engage with customers to build relationships, understand needs, and prioritize both innovative solutions and incremental platform improvements
• Maintain detailed documentation of work and changes to support data quality and data governance requirements
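To give a concrete sense of the orchestration work described above, here is a minimal Airflow sketch of a daily extract-transform-load pipeline. The DAG id, task names, and placeholder functions are hypothetical and the example assumes a recent Airflow 2.x installation; it is not taken from the client's actual Core Data platform.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_events():
        # Placeholder: pull raw events from an upstream source (API, bucket, etc.).
        print("extracting events")


    def transform_events():
        # Placeholder: in practice this step might submit a Spark or Databricks job.
        print("transforming events")


    def load_events():
        # Placeholder: write curated output, for example to a Delta table.
        print("loading events")


    with DAG(
        dag_id="core_data_example_pipeline",  # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        extract = PythonOperator(task_id="extract", python_callable=extract_events)
        transform = PythonOperator(task_id="transform", python_callable=transform_events)
        load = PythonOperator(task_id="load", python_callable=load_events)

        extract >> transform >> load

In a production pipeline on the stack named above, the PythonOperator placeholders would typically be swapped for operators that launch Spark or Databricks jobs.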
Desired Skills/Experience:
• 5+ years of data engineering experience developing large data pipelines
• Proficiency in at least one major programming language such as Python, Java, or Scala
• Strong SQL skills and the ability to create queries to analyze complex datasets
• Hands-on production experience with distributed processing systems such as Spark
• Experience interacting with and ingesting data efficiently from API data sources
• Experience coding with the Spark DataFrame API to create data engineering workflows in Databricks (a brief PySpark sketch follows this list)
• Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
• Experience developing APIs with GraphQL (a minimal GraphQL sketch follows this list)
• Deep understanding of AWS or other cloud providers, as well as infrastructure-as-code
• Familiarity with data modeling techniques and data warehousing best practices
• Strong algorithmic problem-solving skills
• Excellent written and verbal communication skills
• Advanced understanding of OLTP versus OLAP environments
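As a rough illustration of the Spark DataFrame work mentioned above, the sketch below aggregates a small in-memory dataset with PySpark. The column names, values, and the table name in the final comment are made up for the example; in a Databricks notebook the `spark` session is already provided.

    from pyspark.sql import SparkSession, functions as F

    # Needed only when running outside Databricks, where `spark` already exists.
    spark = SparkSession.builder.appName("dataframe_sketch").getOrCreate()

    events = spark.createDataFrame(
        [
            ("2025-12-01", "play", 3),
            ("2025-12-01", "pause", 1),
            ("2025-12-02", "play", 5),
        ],
        ["event_date", "event_type", "event_count"],
    )

    daily_totals = (
        events.groupBy("event_date", "event_type")
        .agg(F.sum("event_count").alias("total_events"))
        .orderBy("event_date", "event_type")
    )

    daily_totals.show()

    # On Databricks with Delta Lake available, the result could be saved as a table:
    # daily_totals.write.format("delta").mode("overwrite").saveAsTable("core_data.daily_event_totals")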
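For the GraphQL requirement, a minimal Python sketch using the graphene library is shown below. The Pipeline type, field names, and hard-coded resolver are illustrative assumptions only; a real service would resolve against the platform's metadata store.

    import graphene


    class Pipeline(graphene.ObjectType):
        name = graphene.String()
        status = graphene.String()


    class Query(graphene.ObjectType):
        pipeline = graphene.Field(Pipeline, name=graphene.String(required=True))

        def resolve_pipeline(self, info, name):
            # Placeholder resolver; real data would come from a metadata service.
            return Pipeline(name=name, status="healthy")


    schema = graphene.Schema(query=Query)

    result = schema.execute('{ pipeline(name: "daily_events") { name status } }')
    print(result.data)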
Benefits:
• Medical, Dental, & Vision Insurance Plans
• Employee-Owned Profit Sharing (ESOP)
• 401K offered
The approximate pay range for this position is between $51.00 and $73.00 per hour. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.