Optomi

Data Engineer - SQL

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer - SQL in Glendale, CA, lasting 12 months at a day rate of $680. Key skills include SQL, data engineering, ETL, and machine learning. Candidates must be USC or GC holders, or EAD candidates able to work on W2.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
October 14, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Glendale, CA
-
🧠 - Skills detailed
#Data Strategy #Data Engineering #SQL (Structured Query Language) #Data Ingestion #Data Pipeline #Deployment #Automation #Scala #ML (Machine Learning) #Strategy #ETL (Extract, Transform, Load) #Data Science
Role description
Data Engineer - SQL
Location: Glendale, CA - 4 days onsite
Duration: 12 months
Visa: Must be a USC or GC holder, or EAD able to work on W2. No C2C or 1099.

Optomi, in partnership with a leader in the family entertainment industry, is seeking a Data Engineer to play a pivotal role in transforming data into actionable insights. Collaborate with our dynamic team of technologists to develop cutting-edge data solutions that drive innovation and fuel business growth. Your responsibilities will include managing complex data structures and delivering scalable, efficient data solutions. Your expertise in data engineering will be crucial in optimizing our data-driven decision-making processes. If you're passionate about leveraging data to make a tangible impact, we welcome you to join us in shaping the future of our organization.

• Architect and design data products using foundational data sets.
• Develop and maintain code for data products.
• Consult with business stakeholders on data strategy and current data assets.
• Provide specifications for data ingestion and transformation.
• Document and instruct others on using data products for automation and decision-making.
• Build data pipelines to automate the creation and deployment of knowledge from models.
• Monitor and improve statistical and machine learning models in data products.
• Work with data scientists to implement methodologies for marketing problem-solving.
• Coordinate with other science and technology teams.