

Jobs via Dice
Data Engineer (PostgreSQL)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer (PostgreSQL) with a 12+ month contract, remote in Cincinnati, OH. Key skills include SQL, data pipeline development, and cloud experience (AWS/Azure). A bachelor's degree and relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 22, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Cincinnati, OH
-
🧠 - Skills detailed
#Public Cloud #Snowflake #Data Lake #Leadership #Data Security #Redshift #Scripting #Data Analysis #Data Integrity #Data Processing #Liquibase #Data Warehouse #DBA (Database Administrator) #Schema Design #Azure #Talend #Oracle #Compliance #Data Architecture #Scala #MariaDB #PostgreSQL #Programming #Normalization #Data Engineering #Security #Agile #Deployment #DevOps #Bash #Data Pipeline #AWS (Amazon Web Services) #GDPR (General Data Protection Regulation) #Data Quality #MongoDB #Database Infrastructure #Python #Base #Data Science #Storage #Cloud #ETL (Extract, Transform, Load) #AWS Glue #MySQL #Terraform #Triggers #Databases #BigQuery
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Orpine.com, is seeking the following. Apply via Dice today!
Job Title: Data Engineer
Location: Cincinnati, OH (Remote)
Duration: 12+ Months Contract
Job Summary:
Responsible for working closely with the broader Shared Technology Services-Digital Workplace (STS-DW) organization to drive the development and deployment of key productivity, collaboration, and engagement tools across the organization in a seamless, modern way.
Reporting to the Director of Infrastructure & Database Engineering, you will design, build, deploy, and maintain a suite of databases, data warehouses, data pipelines, data transfer solutions, and their infrastructure to efficiently collect, store, and process large volumes of data. This enables our data analytics and product portfolio of core services in Collaboration, Document Management, Email, Productivity Software, Computers, Mobility, Peripheral Devices and Remote Support. You will be part of a growing, focused team to continually develop this internal product portfolio, delivering increased value to our 70,000+ user base across the globe, impacting every employee across the company.
Essential Responsibilities:
• Design and implement robust data architectures, including databases, data lakes, data warehouses, and cloud-based solutions
• Build and deploy scalable infrastructure for databases and data warehouses using Infrastructure as Code (e.g., Terraform) across multiple cloud environments such as AWS and Azure
• Work closely with data scientists, analysts, and business/product teams to understand data requirements and define data models, normalization, & schema design
• Develop and maintain scalable data pipelines to ingest, integrate, transform, and load data from diverse sources
• Create and maintain database objects such as tables, views (including materialized views), indexes, and triggers, as well as database packages containing procedures, functions, and variables, to enable business and analytic processes
• Manage and optimize relational/non-relational databases (e.g., PostgreSQL, MySQL/MariaDB, Oracle, MongoDB) and data warehouses (e.g. AWS Redshift)
• Maintain data quality, ensure data integrity and compliance with relevant regulations and standards, such as GDPR or HIPAA
• Implement data security measures to protect sensitive information
• Monitor the health of databases and data warehouses, while identifying and resolving bottlenecks in data processing and storage systems to ensure optimal performance
• Document data engineering processes, workflows, and system configurations
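To illustrate the pipeline work described in the responsibilities above, here is a minimal, hypothetical ingest-transform-load sketch in Python. It uses the standard-library sqlite3 module as a lightweight stand-in for PostgreSQL, and the table, column names, and cleansing rules are invented for the example; in the actual role the connection would come from a PostgreSQL driver such as psycopg.

```python
import sqlite3

def run_minimal_etl(raw_rows):
    """Minimal ETL sketch: ingest raw rows, clean them, load into a table.

    raw_rows is an iterable of (hostname, os_name) tuples, e.g. from a
    CSV export or API. The schema is invented for this example.
    """
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    # Target table for the "load" step.
    cur.execute("CREATE TABLE devices (hostname TEXT PRIMARY KEY, os TEXT)")

    # Transform: normalize hostname casing and drop rows missing a hostname.
    cleaned = [
        (host.strip().lower(), os_name.strip())
        for host, os_name in raw_rows
        if host and host.strip()
    ]

    # Load: idempotent upsert so the pipeline can be re-run safely.
    cur.executemany(
        "INSERT INTO devices VALUES (?, ?) "
        "ON CONFLICT(hostname) DO UPDATE SET os = excluded.os",
        cleaned,
    )
    conn.commit()
    return cur.execute(
        "SELECT hostname, os FROM devices ORDER BY hostname"
    ).fetchall()
```

The `ON CONFLICT ... DO UPDATE` upsert is the same pattern PostgreSQL supports, which keeps the load step safe to re-run, a common requirement for scheduled pipelines.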
Basic Qualifications:
• Bachelor's degree from an accredited university or college with a minimum of 2 years of professional experience, OR an Associate's degree with a minimum of 5 years of professional experience, OR a High School Diploma with a minimum of 7 years of professional experience
• A minimum of 1 year of experience managing databases/data warehouses, including administration and optimization is required
• A minimum of 2 years of experience working in data engineering and proficiency in SQL is required
• A minimum of 2 years of experience managing Database Infrastructure for Digital Workplace
• A minimum of 2 years of experience developing data pipelines and ETL jobs using tools such as Talend, Rundeck, or AWS Glue
• Note: Military experience is equivalent to professional experience
• Eligibility Requirement for US Candidates:
• Must be a US person (or Permanent Resident)
Desired Characteristics:
Technical Expertise:
• Primary role in recent positions must be as a Data Engineer, Data Analyst, or Database Administrator
• Working experience in public cloud is strongly preferred (AWS, Azure)
• Experience with at least one programming language (e.g. Python) and some scripting experience (e.g., Bash/Shell) is a strong plus
• Experience with data warehouse platforms like Redshift, Snowflake or BigQuery is desired
• Familiarity with DevOps practices and CI/CD pipelines for data engineering (using Flyway, Liquibase) is preferred
• Knowledge of ETL tools and frameworks is strongly desired
• Ability to analyze complex data sets and identify patterns or trends
• Problem-solving skills to address data-related challenges
• Ability to assess and improve data quality through validation and cleansing processes
• A technical mindset focused on automating processes to reduce manual toil
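The data-quality validation and cleansing skills listed above can be pictured with a small Python sketch. The record fields and validation rules here are hypothetical, chosen only to show the split between cleansed-and-accepted and rejected records that such processes typically produce.

```python
def validate_records(records):
    """Split records into (valid, rejected) lists, with simple cleansing.

    Rules (invented for this example): 'email' must contain '@', and
    'name' is stripped of surrounding whitespace before acceptance.
    """
    valid, rejected = [], []
    for rec in records:
        # Cleanse: trim the name field without mutating the input record.
        cleaned = dict(rec, name=rec.get("name", "").strip())
        # Validate: a crude email check stands in for real rules.
        if "@" in cleaned.get("email", ""):
            valid.append(cleaned)
        else:
            rejected.append(cleaned)
    return valid, rejected
```

Keeping rejected records rather than silently dropping them is a common design choice, since it lets the pipeline report data-quality metrics back to source-system owners.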
Business and Leadership Expertise:
• Comfortable working with different Agile software development methodologies
• Demonstrated strong leadership and analytical skills
• Strong communication and collaboration skills to work effectively with cross-functional teams
• Proactively identifies and removes project obstacles or barriers on behalf of the team
• Ability to determine the most effective way to integrate disparate systems to optimize operational processes
• Ability to evaluate technology trends to drive features and roadmaps
• Attention to detail and a commitment to delivering high-quality solutions