Jobs via Dice

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 12 to 15+ years of experience on a long-term contract; the pay rate is not disclosed. It requires expertise in SQL, Python, Spark, Snowflake, and cloud platforms, with hybrid work out of New York City, New Jersey, Santa Monica, San Francisco, or Seattle.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
December 10, 2025
🕒 - Duration
Unknown
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
California City, CA
🧠 - Skills detailed
#GitHub #Agile #SQL Queries #Python #Datadog #Dimensional Modelling #Spark (Apache Spark) #Kafka (Apache Kafka) #Data Modeling #Jenkins #Data Integration #Data Engineering #Cloud #Docker #Airflow #Databricks #SQL (Structured Query Language) #Snowflake #AWS (Amazon Web Services) #Scrum #Monitoring #Azure #GCP (Google Cloud Platform)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Triunity Software, is seeking the following. Apply via Dice today!

Job Title: Data Engineer (Contract Role)
Location: New York City / New Jersey / Santa Monica / San Francisco / Seattle (Hybrid)
Duration: Long term
Experience: 12 to 15+ years

Job Description:
We are seeking an experienced Data Engineer to support data integration, pipeline development, and analytics initiatives. The ideal candidate will have strong hands-on experience with SQL, Python, and modern data engineering platforms.

Must Have Skills:
• Strong expertise in complex/advanced SQL queries
• Proficiency with Python
• Experience with Spark
• Experience with Snowflake
• Hands-on exposure to Databricks
• Experience with Airflow or Prefect
• Experience with at least one cloud platform: AWS, Azure, or Google Cloud Platform
• Strong analytical and problem-solving skills
• Strong communication skills and the ability to interact with stakeholders
• Familiarity with Agile Scrum principles and ceremonies

Nice to Have:
• Data modeling (e.g., dimensional modeling)
• Experience working with Kafka
• Familiarity with CI/CD tools (Jenkins, GitHub Actions, etc.) and Docker
• Exposure to monitoring tools such as Datadog
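For a sense of how these skills fit together in practice, here is a minimal sketch (not from the posting) of the kind of pipeline the role describes: an Airflow DAG that submits a Spark transform and then loads the result into Snowflake. The DAG name, connection IDs, script path, and Snowflake objects below are hypothetical placeholders.

```python
# Illustrative only: a minimal Airflow DAG touching the stack named above
# (Python, Spark, Snowflake, Airflow). All names here are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_events_pipeline",  # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a PySpark job that transforms raw event files (hypothetical path).
    transform_events = SparkSubmitOperator(
        task_id="transform_events",
        application="/opt/jobs/transform_events.py",
        conn_id="spark_default",
    )

    # Load the transformed output into Snowflake (hypothetical stage and table).
    load_to_snowflake = SnowflakeOperator(
        task_id="load_to_snowflake",
        snowflake_conn_id="snowflake_default",
        sql="COPY INTO analytics.events FROM @staged_events FILE_FORMAT = (TYPE = PARQUET);",
    )

    # Run the Spark transform before the Snowflake load.
    transform_events >> load_to_snowflake
```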