

KellyMitchell Group
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Senior Data Engineer position in Glendale, California, with a contract of unspecified duration, offering a pay rate of $51.00 to $73.00 per hour. Key requirements include 5+ years of data engineering experience and proficiency in Python, SQL, Spark, and AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
584
-
🗓️ - Date
January 5, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glendale, CA
-
🧠 - Skills detailed
#Delta Lake #Spark (Apache Spark) #Agile #Airflow #Databricks #Scrum #Data Modeling #Data Quality #SQL (Structured Query Language) #Data Science #Datasets #Data Governance #Scala #Documentation #AWS (Amazon Web Services) #Programming #Java #Cloud #Python #Kubernetes #Data Pipeline #API (Application Programming Interface) #GraphQL #Data Engineering
Role description
Job Summary
Our client is seeking a Senior Data Engineer to join their team! This position is located in Glendale, California.
Duties
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
• Build tools and services to support data discovery, lineage, governance, and privacy
• Collaborate with other software and data engineers and cross-functional teams
• Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS (an illustrative pipeline sketch follows this list)
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting internal and external standards and best practices for pipeline configurations, naming conventions, and more
• Ensure high operational efficiency and quality of Core Data platform datasets to meet SLAs and ensure reliability and accuracy for stakeholders in Engineering, Data Science, Operations, and Analytics
• Participate in agile and scrum ceremonies to collaborate and refine team processes
• Engage with customers to build relationships, understand needs, and prioritize both innovative solutions and incremental platform improvements
• Maintain detailed documentation of work and changes to support data quality and data governance requirements
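The duties above center on building and maintaining orchestrated pipelines on Airflow, Spark, and Databricks. As a rough illustration only (the actual platform's DAG ids, task names, and schedules are not described in this posting, so everything below is a hypothetical sketch), a minimal daily Airflow pipeline might look like this:

```python
# Illustrative sketch only -- the DAG id, task names, and schedule are
# hypothetical and not taken from this posting.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step; a real pipeline would pull from an API or data lake.
    return {"rows": 0}


def transform(**context):
    # Placeholder transform step; a real pipeline would trigger a Spark/Databricks job.
    pass


with DAG(
    dag_id="core_data_daily_example",
    start_date=datetime(2026, 1, 1),
    schedule_interval="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Run transform only after extract succeeds.
    extract_task >> transform_task
```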
Desired Skills/Experience:
• 5+ years of data engineering experience developing large data pipelines
• Proficiency in at least one major programming language such as Python, Java, or Scala
• Strong SQL skills and the ability to create queries to analyze complex datasets
• Hands-on production experience with distributed processing systems such as Spark
• Experience interacting with and ingesting data efficiently from API data sources
• Experience coding with the Spark DataFrame API to create data engineering workflows in Databricks (a sketch of this kind of workflow follows this list)
• Hands-on production experience with data pipeline orchestration systems such as Airflow for creating and maintaining data pipelines
• Experience developing APIs with GraphQL
• Deep understanding of AWS or other cloud providers, as well as infrastructure-as-code
• Familiarity with data modeling techniques and data warehousing best practices
• Strong algorithmic problem-solving skills
• Excellent written and verbal communication skills
• Advanced understanding of OLTP versus OLAP environments
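For the Spark DataFrame requirement above, the kind of workflow being referenced can be sketched as follows; the paths, column names, and aggregation are hypothetical examples, not details of the client's Core Data platform:

```python
# Illustrative sketch only -- source/target paths and column names are
# hypothetical and not taken from this posting.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_dataframe_workflow").getOrCreate()

# Read raw events, drop records without a timestamp, and derive a date column.
events = (
    spark.read.json("s3://example-bucket/raw/events/")
    .filter(F.col("event_ts").isNotNull())
    .withColumn("event_date", F.to_date("event_ts"))
)

# Aggregate to daily counts per event type.
daily_counts = events.groupBy("event_date", "event_type").agg(
    F.count("*").alias("event_count")
)

# Write the result as a partitioned Delta table.
(
    daily_counts.write.format("delta")
    .mode("overwrite")
    .partitionBy("event_date")
    .save("s3://example-bucket/curated/daily_event_counts/")
)
```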
Benefits:
• Medical, Dental, & Vision Insurance Plans
• Employee-Owned Profit Sharing (ESOP)
• 401K offered
The approximate pay range for this position is between $51.00 and $73.00 per hour. Please note that the pay range provided is a good faith estimate. Final compensation may vary based on factors including but not limited to background, knowledge, skills, and location. We comply with local wage minimums.