Optomi

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer, a 6-month contract position, paying a day rate of $696 USD. Candidates must have 5+ years of data engineering experience, strong graph database knowledge, and proficiency in Python and ETL workflows. Remote work is available from the specified locations.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
696
-
🗓️ - Date
February 7, 2026
-
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Seattle, WA
-
🧠 - Skills detailed
#MySQL #Airflow #Graph Databases #Databases #dbt (data build tool) #"ETL (Extract, Transform, Load)" #Schema Design #Storage #Collibra #Automation #Python #Cloud #RDBMS (Relational Database Management System) #Agile #Datasets #Scrum #Scala #Data Engineering #PostgreSQL #Data Science #Data Pipeline #Data Quality #Snowflake #Neo4J #Data Warehouse #Documentation #Programming
Role description
This role is open to candidates located in Seattle, Glendale, Santa Monica, or San Francisco!
Job Description
As a Senior Data Engineer, you will play a pivotal role in transforming data into actionable insights. You'll collaborate with a cross-functional team of technologists to design and deliver modern data solutions that drive innovation and support business growth. This role involves managing complex data structures and building scalable, efficient data products and pipelines. Your data engineering expertise will be critical in improving data-driven decision-making across the organization.
Key Responsibilities
• Design, build, and maintain data platform pipelines supporting structured, graph, and unstructured datasets
• Architect and implement graph database models, including schema design and scalable solution development
• Apply strong data engineering principles across cloud services and modern data platforms, including storage, compute, messaging/event-driven services, and table formats (e.g., object storage, serverless compute, queues/topics, Iceberg)
• Build and support data transformation, orchestration, and automation workflows (e.g., dbt, Airflow); a minimal orchestration sketch follows this section
• Implement and monitor data quality and governance practices using relevant tooling (e.g., Great Expectations, Soda, Collibra, or similar)
• Participate in and advocate for Agile/Scrum ceremonies to improve team collaboration and delivery
• Partner with product managers, architects, and engineers to deliver core data platform capabilities
• Create and maintain documentation, standards, and best practices (pipeline configuration, naming conventions, etc.)
• Ensure operational excellence of platform datasets to meet SLAs and reliability expectations for internal stakeholders (engineering, data science, operations, analytics)
• Engage with internal stakeholders to understand needs, prioritize enhancements, and support adoption of platform capabilities
• Maintain detailed documentation of changes to support governance, auditability, and data quality requirements
Qualifications
• 5+ years of data engineering experience developing and supporting production data pipelines
• Strong understanding of graph database concepts, their advantages over traditional RDBMS approaches, and relevant use cases
• Proficiency in at least one major programming language (e.g., Python)
• Experience building ETL/ELT workflows for graph-oriented datasets (extracting, transforming, and loading graph data; see the loading sketch below)
• Hands-on experience with workflow orchestration tools (e.g., Airflow) in production environments
• Experience integrating graph databases with cloud data warehouses (e.g., Neo4j with Snowflake or equivalent)
• Strong algorithmic problem-solving skills and attention to detail
• Comfortable working in a fast-paced, collaborative environment
• Excellent written and verbal communication skills
• Demonstrated ability to learn new technologies and tools quickly
• Familiarity with Scrum and Agile methodologies
• Graph database experience: Neo4j (Cypher)
• Orchestration experience: Airflow, Prefect, or another pipeline tool
• Relational database experience: PostgreSQL, MySQL, MSSQL
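For a sense of the orchestration work the responsibilities describe, here is a minimal sketch of a daily extract-transform-load DAG using Airflow's TaskFlow API (Airflow 2.4+). The DAG name, schedule, and edge records are illustrative assumptions, not details from this posting.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2026, 1, 1), catchup=False)
def graph_ingest_pipeline():
    @task
    def extract() -> list[dict]:
        # Placeholder extract: in practice this would query a warehouse or API
        return [{"src": "user_1", "dst": "user_2"}, {"src": "user_2", "dst": "user_2"}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Drop self-loops so the downstream graph load stays clean
        return [r for r in rows if r["src"] != r["dst"]]

    @task
    def load(edges: list[dict]) -> None:
        # Hand off to a graph loader (see the Neo4j sketch below)
        print(f"Loading {len(edges)} edges into the graph store")

    load(transform(extract()))


graph_ingest_pipeline()
```

Each task's return value is passed between steps via Airflow's XCom mechanism, so the three stages can be scheduled, retried, and monitored independently.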
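And since the role pairs Python with Neo4j, a hedged sketch of the graph-loading step might look like the following, using the official neo4j Python driver. The URI, credentials, node label, and relationship type are hypothetical stand-ins.

```python
from neo4j import GraphDatabase

URI = "bolt://localhost:7687"   # hypothetical endpoint
AUTH = ("neo4j", "password")    # hypothetical credentials


def load_edges(edges: list[dict]) -> None:
    """Upsert follow edges into the graph, one MERGE statement per record."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            for edge in edges:
                session.run(
                    "MERGE (a:User {id: $src}) "
                    "MERGE (b:User {id: $dst}) "
                    "MERGE (a)-[:FOLLOWS]->(b)",
                    src=edge["src"],
                    dst=edge["dst"],
                )


if __name__ == "__main__":
    load_edges([{"src": "user_1", "dst": "user_2"}])
```

Using MERGE rather than CREATE keeps the load idempotent: if an orchestrator retries a failed run, nodes and relationships are matched instead of duplicated.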