

Signify Technology
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is a 12-month Data Engineer contract position paying $60-80 per hour, located in the United States. It requires 3+ years of experience, expertise in SQL, dbt, and Python, and familiarity with Snowflake or similar cloud data warehouses.
Country: United States
Currency: $ USD
Day rate: 640
Date: February 3, 2026
Duration: More than 6 months
Location: Unknown
Contract: Unknown
Security: Unknown
Location detailed: Utica-Rome Area
Skills detailed: #Data Ingestion #Documentation #Version Control #Quality Assurance #Data Warehouse #Data Architecture #Snowflake #ETL (Extract, Transform, Load) #BigQuery #Server Administration #Clustering #dbt (data build tool) #GCP (Google Cloud Platform) #SQL Queries #Data Pipeline #Scala #Azure #Data Engineering #Complex Queries #Data Modeling #Datasets #GIT #SQL (Structured Query Language) #Automation #Debugging #Data Quality #Redshift #Python #Data Documentation #SQL Server #AWS (Amazon Web Services) #Cloud #Data Manipulation #Scripting
Role description
Job title: Data Engineer
Job type: Contract
Contract Length: 12 months
Rate: $60-80 per hour
Role Location: United States
The Company
We are a data-driven healthcare organization modernizing our analytics platform to power value-based care. Our team is transitioning from SQL Server to Snowflake as our cloud data warehouse and implementing dbt, Python, and OpenFlow to build reliable, automated data pipelines.
You'll join a collaborative, high-skill data team that values maintainability, clarity, and thoughtful design, working alongside engineers who specialize in automation, SQL Server administration, and data architecture.
Role And Responsibilities
We're seeking an experienced Data Engineer (3+ years) with strong SQL, dbt, and Python skills to design, build, and maintain our data transformation and analytics pipelines in Snowflake.
The ideal candidate has hands-on experience with modern ELT tools, data modeling, and cloud-based platforms - and brings a mindset of automation, testing, and documentation to every project.
You'll collaborate closely with our data architects and integration engineers to ensure data from OpenFlow pipelines is transformed into trusted, analytics-ready models for reporting and advanced analytics.
• Advanced SQL Development: Write and optimize complex SQL queries and dbt models for data transformation and analysis within Snowflake.
• dbt Model Development: Build, test, and maintain dbt models that convert raw data into actionable insights (a rough sketch follows this list).
• ETL/ELT Pipeline Management: Design and manage efficient pipelines using dbt, OpenFlow, and Python to process and deliver data across systems.
• SQL Performance Tuning: Optimize query performance, clustering, and cost efficiency in Snowflake.
• Data Quality Assurance: Ensure that transformed data meets accuracy and consistency standards through dbt testing and validation frameworks.
• Collaboration: Work closely with data engineers, analysts, and architecture leads to translate data requirements into scalable transformations.
• Data Documentation: Maintain clear documentation for dbt models, data flows, and dependencies for ongoing visibility and reuse.
• Version Control: Manage dbt and Python projects in Git, following clean, modular, and testable development practices.
• Automation Support: Partner with automation engineers to enhance data ingestion and transformation workflows through OpenFlow.
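As a rough illustration of the dbt and Snowflake work described above (a minimal sketch only; the model, source, and column names are hypothetical and not taken from this team's codebase), an incremental dbt model with a Snowflake clustering key might look like this:

    -- models/marts/fct_patient_encounters.sql  (hypothetical model name)
    -- Incremental dbt model for Snowflake: turns raw encounter records landed
    -- by the ingestion pipeline into an analytics-ready fact table.
    {{
        config(
            materialized = 'incremental',
            unique_key   = 'encounter_id',
            cluster_by   = ['encounter_date']  -- clustering key for partition pruning and cost control
        )
    }}

    with source_data as (

        -- 'raw_ehr.encounters' is a placeholder source, declared in a sources .yml file
        select * from {{ source('raw_ehr', 'encounters') }}

    ),

    cleaned as (

        select
            encounter_id,
            patient_id,
            cast(encounter_date as date)  as encounter_date,
            lower(trim(encounter_type))   as encounter_type
        from source_data
        where encounter_id is not null

    )

    select * from cleaned

    {% if is_incremental() %}
      -- on incremental runs, only process rows newer than what is already loaded
      where encounter_date >= (select max(encounter_date) from {{ this }})
    {% endif %}

Column-level checks (unique, not_null, relationships, accepted_values) would normally be declared alongside the model in a schema .yml file, which is where the dbt testing and validation work mentioned above typically lives.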
Job Requirements
• Proven experience as a dbt Developer or in a similar Data Engineer role.
• Expert-level SQL skills: capable of writing, tuning, and debugging complex queries across large datasets (an illustrative query follows this list).
• Strong experience with Snowflake or comparable data warehouse technologies (BigQuery, Redshift, etc.).
• Proficiency in Python for scripting, automation, or data manipulation.
• Solid understanding of data warehousing concepts, modeling, and ELT workflows.
• Familiarity with Git or other version control systems.
• Experience working with cloud-based platforms such as AWS, GCP, or Azure.
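For a sense of the "expert-level SQL" expectation, the kind of query involved might resemble the following (purely illustrative; the table and column names are hypothetical):

    -- Latest recorded risk score per patient across a large dataset, using a
    -- window function and Snowflake's QUALIFY clause to filter on the window result.
    select
        patient_id,
        risk_score,
        recorded_at
    from analytics.patient_risk_scores   -- hypothetical table
    qualify row_number() over (
        partition by patient_id
        order by recorded_at desc
    ) = 1;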
Accessibility Statement
Read and apply for this role in the way that works for you by using our Recite Me assistive technology tool. Click the circle at the bottom right side of the screen and select your preferences.
We make an active choice to be inclusive towards everyone every day. Please let us know if you require any accessibility adjustments through the application or interview process.
Our Commitment To Diversity, Equity, And Inclusion
Signify's mission is to empower every person, regardless of their background or circumstances, with an equitable chance to achieve the careers they deserve. Building a diverse future, one placement at a time. Check out our DE&I page here





