

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This is a contract Senior Data Engineer role based in Glendale, California, of unspecified duration, offering a pay rate of $59.50 - $85.00 per hour. It requires 5+ years of data engineering experience, strong SQL skills, and proficiency in AWS and distributed processing systems.
Country
United States
Currency
$ USD
Day rate
680
Date discovered
July 14, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Glendale, CA
Skills detailed
#Data Engineering #Scrum #AWS (Amazon Web Services) #Data Quality #Kubernetes #Programming #Data Modeling #Data Governance #Datasets #Data Pipeline #Snowflake #Documentation #Delta Lake #Spark (Apache Spark) #Airflow #Databricks #SQL (Structured Query Language) #Agile #GraphQL #Cloud #Infrastructure as Code (IaC)
Role description
Job Summary
Our client is seeking a Senior Data Engineer to join their team! This position is located in Glendale, California.
Duties
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
• Build tools and services to support data discovery, lineage, governance, and privacy
• Collaborate with other software/data engineers and cross-functional teams
• Work with a tech stack that includes Airflow, Spark, Databricks, Delta Lake, Snowflake, Kubernetes, and AWS (a brief orchestration sketch follows this list)
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting both internal and external standards and best practices for pipeline configurations, naming conventions, and more
• Maintain high operational efficiency and quality of Core Data platform datasets so that solutions meet SLAs and deliver reliability and accuracy for all stakeholders
• Actively participate in and advocate for agile/scrum ceremonies to improve collaboration and team processes
• Engage with and understand customers, forming relationships to prioritize both innovative new offerings and incremental platform improvements
• Maintain detailed documentation of work and changes to support data quality and data governance requirements
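As a loose illustration of the pipeline work referenced above (not the client's code; the DAG id, schedule, bucket paths, and column names are hypothetical placeholders), a minimal Airflow DAG that runs a small PySpark aggregation might look like this:

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def aggregate_daily_events():
    # Hypothetical transformation: count raw events per day and write a curated dataset.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("core_data_daily_events").getOrCreate()
    events = spark.read.parquet("s3://example-bucket/raw/events/")  # placeholder path
    daily = (
        events
        .groupBy(F.to_date("event_ts").alias("event_date"))  # placeholder column
        .agg(F.count("*").alias("event_count"))
    )
    daily.write.mode("overwrite").parquet("s3://example-bucket/curated/daily_events/")
    spark.stop()


with DAG(
    dag_id="core_data_daily_events",  # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="aggregate_daily_events", python_callable=aggregate_daily_events)

In practice the Spark step would more likely run on Databricks or a Kubernetes cluster through a dedicated operator rather than inline in the Airflow worker, but the overall shape of the DAG is the same.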
Desired Skills/Experience
• 5+ years of data engineering experience developing large data pipelines
• Proficiency in at least one major programming language
• Strong SQL skills and the ability to write queries that analyze complex datasets
• Hands-on production experience with distributed processing systems such as Spark
• Hands-on production experience creating and maintaining data pipelines with orchestration systems such as Airflow
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology
• Experience developing APIs with GraphQL
• Deep understanding of AWS or other cloud providers, as well as infrastructure as code
• Familiarity with data modeling techniques and data warehousing standard methodologies and practices
• Advanced understanding of OLTP vs. OLAP environments
Benefits
• Medical, Dental, & Vision Insurance Plans
• 401(k) offered
$59.50 - $85.00 (est. hourly)