

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in Glendale, CA, on a 12-month W2 contract. Requires 5+ years in data engineering, proficiency in Python, SQL, Spark, and Airflow, and experience with Databricks or Snowflake. US work authorization required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 31, 2025
Project duration: More than 6 months
Location type: On-site
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Glendale, CA
Skills detailed: #GraphQL #Apache Spark #Delta Lake #Python #Spark (Apache Spark) #AWS (Amazon Web Services) #Databases #Snowflake #Airflow #Data Modeling #Data Quality #Datasets #Data Pipeline #Cloud #Databricks #Data Engineering #Data Processing #Apache Airflow #SQL (Structured Query Language)
Role description
Job Title: Senior Data Engineer
Location: Glendale, CA (Onsite 4 days a week)
Employment Type: W2 Contract (No C2C)
Work Authorization: US Citizens, Green Card holders, H4-EAD, and TN Visa holders only
Contract Duration: 12 Months
About the Role:
Our client is seeking a Senior Data Engineer to join a cutting-edge technology team focused on building and scaling modern data infrastructure. This role involves maintaining and expanding large-scale data pipelines, developing real-time streaming solutions, and collaborating with cross-functional teams to ensure high data quality, operational efficiency, and system reliability.
Technology Stack:
• Languages/Tools: Python, SQL, GraphQL
• Frameworks/Platforms: Apache Airflow, Apache Spark, Databricks, Delta Lake, Snowflake
• Cloud Services: AWS
Key Responsibilities:
• Maintain, improve, and scale existing data pipelines within the core data platform (a minimal orchestration sketch follows this list)
• Build APIs to expose data to various internal applications and teams
• Develop and manage real-time data streaming pipelines
• Collaborate with product managers, architects, and engineers to align on priorities and implementation
• Ensure high operational efficiency, reliability, and SLA adherence of core datasets
• Establish and document internal standards for pipeline naming conventions, configurations, and best practices
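For orientation only, and not taken from the client's codebase, the sketch below shows what a minimal Airflow-orchestrated daily batch pipeline of the kind described above can look like. The DAG id, task names, and schedule are hypothetical.

# Minimal, illustrative Airflow DAG; dag_id, task names, and schedule are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(ds: str, **_) -> None:
    # Placeholder: pull one day's raw events from the source system.
    print(f"extracting raw events for {ds}")


def load_partition(ds: str, **_) -> None:
    # Placeholder: write the cleaned daily partition to the warehouse
    # (e.g. a Delta Lake table or Snowflake, per the stack above).
    print(f"loading partition {ds}")


with DAG(
    dag_id="core_events_daily",    # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",             # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_partition", python_callable=load_partition)

    extract >> load                # simple linear dependency

In practice such a DAG would be one of many maintained within the core data platform, with naming conventions and configuration standards documented as noted in the responsibilities.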
Must-Have Qualifications:
• 5+ years of hands-on experience in data engineering and building large-scale data pipelines
• Proficient in Python and SQL
• Production-level experience with Spark and Airflow (see the illustrative Spark job after this list)
• Experience with Databricks or Snowflake (or other cloud-based MPP databases)
• Familiarity with developing APIs using GraphQL
• Solid understanding of OLTP vs. OLAP systems
• Strong background in distributed data processing, data modeling, or engineering data services
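As a rough, hypothetical illustration of the Spark and Delta Lake experience listed above (not client code), a batch transformation job of this kind might look like the following; the storage paths and column names are invented for the example.

# Illustrative PySpark batch transformation; paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_daily_rollup").getOrCreate()

# Read raw events from a placeholder location, aggregate by day, write a Delta table.
events = spark.read.parquet("s3://example-bucket/raw/events/")  # placeholder path

daily_rollup = (
    events
    .withColumn("event_date", F.to_date("event_ts"))
    .groupBy("event_date", "event_type")
    .agg(F.count("*").alias("event_count"))
)

(
    daily_rollup.write
    .format("delta")   # requires Delta Lake, e.g. on Databricks
    .mode("overwrite")
    .save("s3://example-bucket/curated/daily_rollup/")  # placeholder path
)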
Nice-to-Haves:
• Experience in Media & Entertainment or content-driven organizations
• Familiarity with large-scale content distribution or post-production systems
• Background with digital asset workflows or large-volume data management
Soft Skills:
• Excellent verbal and written communication skills
• Proactive, self-motivated, and comfortable working in fast-paced, evolving environments
• Strong collaboration and problem-solving mindset
Additional Information:
• Background Check: Required
• Benefits (W2 Only): Medical and dental insurance (multiple plan options for you and your family), 401(k) plan, and overtime pay eligibility