

Sr Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr Data Engineer in Glendale, CA, with an unknown contract length and a day rate of $680. Candidates must have 5+ years of experience in data engineering, proficiency in Python and SQL, and expertise with Spark, Airflow, and Snowflake.
Country: United States
Currency: $ USD
Day rate: $680
Date discovered: July 28, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Glendale, CA
Skills detailed: #Data Pipeline #API (Application Programming Interface) #Data Engineering #Apache Spark #Cloud #Datasets #Airflow #Databricks #Programming #Stories #GraphQL #SQL (Structured Query Language) #Spark (Apache Spark) #Databases #Snowflake #Python #Delta Lake #AWS (Amazon Web Services) #Scala #Data Modeling #Java #Data Processing
Role description
Experience: 5 - 20 Years
Location: USA - Glendale
Must-Have
• 5+ years of data engineering experience, specifically developing large-scale data pipelines
• Experience with Spark, Airflow, Databricks or Snowflake, SQL, and Python
Location: Glendale, CA (onsite 4 days a week)
The Company
Headquartered in Los Angeles, this leader in the Entertainment & Media space is focused on delivering world-class stories and experiences to its global audience. To offer the best entertainment experiences, their technology teams focus on continued innovation and utilization of cutting-edge technology.
Platform / Stack
You will work with technologies that include Python, AWS, Spark, Snowflake, Databricks, and Airflow.
What You'll Do As a Sr Data Engineer
• Contribute to maintaining, updating, and expanding existing Core Data platform data pipelines
• Build and maintain APIs to expose data to downstream applications
• Develop real-time streaming data pipelines
• Utilize a tech stack including Airflow, Spark, Databricks, Delta Lake, and Snowflake
• Collaborate with product managers, architects, and other engineers to drive the success of the Core Data platform
• Contribute to developing and documenting internal and external standards and best practices for pipeline configurations, naming conventions, and more
• Ensure high operational efficiency and quality of the Core Data platform datasets, meeting SLAs and the project's reliability and accuracy expectations
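The pipeline work described above centers on dependency-aware orchestration. As a minimal sketch (not the company's actual DAGs), here is how task ordering of the kind Airflow enforces can be expressed, using Python's standard-library graphlib as a stand-in; the task names are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical Core Data pipeline tasks mapped to their upstream
# dependencies, mirroring how an Airflow DAG wires extract -> transform
# -> load steps.
dag = {
    "extract_events": set(),
    "extract_users": set(),
    "transform_sessions": {"extract_events", "extract_users"},
    "load_snowflake": {"transform_sessions"},
}

# static_order() yields tasks in an order that respects every dependency,
# the same guarantee an orchestrator like Airflow provides at runtime.
order = list(TopologicalSorter(dag).static_order())
print(order)
```

In a real Airflow deployment the scheduler resolves this ordering and additionally handles retries, backfills, and SLAs, which is why the role calls for hands-on orchestration experience rather than ad hoc scripting.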
Qualifications
You could be a great fit if you have:
• 5+ years of data engineering experience developing large-scale data pipelines
• Proficiency in at least one major programming language (e.g., Python, Java, Scala)
• Hands-on production experience with distributed processing systems such as Spark
• Hands-on production experience with data pipeline orchestration systems such as Airflow
• Experience with at least one major Massively Parallel Processing (MPP) or cloud database technology (Snowflake, Databricks, BigQuery)
• Experience developing APIs with GraphQL
• Advanced understanding of OLTP vs. OLAP environments
• Strong background in distributed data processing, software engineering of data services, or data modeling
Skills: GraphQL, Spark (Apache Spark), AWS, data modeling, Delta Lake, data pipelines, databases, SQL, Snowflake, data engineering, API development, OLTP, OLAP, Java, Airflow, Scala, Core Data, Python, Databricks, large-scale data pipelines