

Deloitte
Database Engineer Contractor (Remote)
Featured Role | Apply directly with Data Freelance Hub
This role is for a Database Engineer Contractor (Remote) with an unspecified contract duration and a pay rate of $60 - $65 per hour. Key skills required include PostgreSQL, MongoDB, Kafka, and experience with data migration from IBM DB2 to cloud databases.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
March 27, 2026
-
Duration
Unknown
-
Location
Remote
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
Massachusetts, United States
-
Skills detailed
#Data Pipeline #REST (Representational State Transfer) #Replication #Monitoring #Data Quality #Migration #PostgreSQL #Storage #Logging #Data Lifecycle #Database Security #Kafka (Apache Kafka) #Indexing #AWS (Amazon Web Services) #Cloud #Databases #Schema Design #Data Reconciliation #Data Integration #Data Storage #IAM (Identity and Access Management) #Security #MongoDB #RDBMS (Relational Database Management System) #DevOps #Data Engineering #Automated Testing #Database Systems #ETL (Extract, Transform, Load)
Role description
Position Summary
Contractor
Remote
Work you'll do
We are seeking a Database Engineer Contractor to design, develop, and optimize database systems that support data storage, retrieval, and analysis. This role will be central to a migration program moving an on-prem IBM DB2 database to AWS PostgreSQL (raw/landing layer) and then to AWS-hosted MongoDB (transaction cache/consumption layer) using Precisely and Kafka pipelines. You will also help model and architect the target MongoDB solution and ensure secure, performant, reliable data flows for downstream applications.
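As an illustration of the "raw layer to friendly format" step described above, the sketch below maps a raw landing-layer row (DB2-style column names) into a versioned canonical document suitable for a MongoDB consumption layer. All field names here are hypothetical, not taken from the actual program:

```python
from datetime import datetime, timezone

def to_canonical(raw_row: dict) -> dict:
    """Map a raw landing-layer row (hypothetical DB2-style column
    names) to a 'friendly' canonical document for the MongoDB
    consumption layer. schemaVersion supports backward compatibility
    as the document model evolves."""
    return {
        "_id": raw_row["ACCT_ID"],
        "schemaVersion": 1,
        "accountName": raw_row["ACCT_NM"].strip(),
        "balance": float(raw_row["CUR_BAL_AMT"]),
        "updatedAt": datetime.now(timezone.utc).isoformat(),
    }

# Example: a raw row with padded strings and numeric text, as a
# legacy RDBMS extract might deliver it.
doc = to_canonical(
    {"ACCT_ID": "A-100", "ACCT_NM": " Acme Corp ", "CUR_BAL_AMT": "125.50"}
)
```

In a real pipeline this function would run inside the Kafka consumer that moves records from PostgreSQL to MongoDB; the point is that transformation rules live in one versioned place rather than in each downstream app.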
Key Responsibilities
• Design and construct large-scale relational databases and schemas in PostgreSQL to meet project requirements (raw/landing layer).
• Support migration from on-prem DB2 to AWS PostgreSQL, including source-to-target mapping, data reconciliation, and cutover support.
• Develop and maintain data integration processes leveraging Precisely and Kafka (e.g., CDC/streaming ingestion) into PostgreSQL and onward to MongoDB.
• Design MongoDB collections, document models, indexing, partitioning/sharding approach (if needed), and read/write patterns to support a transaction cache for downstream apps.
• Apply transformation rules to raw data to produce "friendly" canonical formats for MongoDB consumption (including versioning and backward compatibility where needed).
• Tune PostgreSQL and MongoDB performance (query optimization, indexing strategies, connection pooling, caching, vacuum/analyze, and capacity planning).
• Implement regular validation checks, reconciliations, and data quality controls (completeness, accuracy, latency, duplicates, referential/cross-entity consistency).
• Implement and enforce database security measures: access controls, encryption in transit/at rest, secrets management, audit logging, and least-privilege patterns.
• Establish monitoring/alerting for pipelines and databases (throughput, lag, error rates, replication health, disk/CPU/memory), and support incident triage/root-cause analysis.
• Produce architecture diagrams, data models, runbooks, and operational SOPs; partner with app teams to ensure consumption patterns are efficient and reliable.
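The validation and reconciliation responsibilities above can be sketched as a simple source-versus-target comparison: check for missing keys, extra keys, and rows whose content differs. This is a minimal, tool-agnostic sketch (the real program would run such checks against DB2/PostgreSQL/MongoDB extracts, not in-memory dicts):

```python
import hashlib

def row_digest(row: dict) -> str:
    """Order-independent digest of a row's key/value pairs."""
    canon = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canon.encode()).hexdigest()

def reconcile(source: dict, target: dict) -> dict:
    """Compare source and target datasets keyed by primary key.
    Reports keys missing from the target, extra keys in the target,
    and keys present in both whose content differs."""
    missing = sorted(source.keys() - target.keys())
    extra = sorted(target.keys() - source.keys())
    mismatched = sorted(
        k for k in source.keys() & target.keys()
        if row_digest(source[k]) != row_digest(target[k])
    )
    return {"missing": missing, "extra": extra, "mismatched": mismatched}

# Hypothetical example: target has one drifted row and one extra row.
src = {"1": {"name": "a", "amt": 10}, "2": {"name": "b", "amt": 20}}
tgt = {"1": {"name": "a", "amt": 10}, "2": {"name": "b", "amt": 99},
       "3": {"name": "c", "amt": 5}}
report = reconcile(src, tgt)
```

Running such checks on a schedule (counts first, digests for sampled or full sets second) is a common way to catch drift between cutover and steady state.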
Required Qualifications
• 5+ years (or equivalent) hands-on experience as a Database Engineer / Data Engineer supporting production systems.
• Strong experience with PostgreSQL (schema design, indexing, query tuning, performance troubleshooting).
• Strong experience with MongoDB (document modeling, aggregation framework, indexing, TTL patterns, performance tuning; sharding knowledge a plus).
• Experience migrating from legacy/on-prem RDBMS (preferably IBM DB2) to cloud databases, including data validation and reconciliation.
• Experience with Kafka-based data pipelines and streaming/CDC concepts (topics, partitions, consumers, delivery semantics, ordering, replay).
• Ability to design robust transformation logic and data contracts for downstream consumption.
• Strong understanding of database security fundamentals and operational monitoring.
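One of the Kafka concepts listed above, delivery semantics, comes down to making consumers idempotent: under at-least-once delivery (or a full topic replay) the same event may arrive more than once, so applying it twice must not change the result. A minimal sketch, with no real Kafka client involved:

```python
def apply_events(events, state=None, seen=None):
    """Apply (key, offset, value) events idempotently. A (key, offset)
    pair already processed is skipped, so duplicate deliveries and
    full replays converge on the same final state."""
    state = {} if state is None else state
    seen = set() if seen is None else seen
    for key, offset, value in events:
        if (key, offset) in seen:
            continue  # duplicate delivery or replay: skip
        seen.add((key, offset))
        state[key] = value  # last-write-wins per key, in offset order
    return state

# Hypothetical stream: offset 1 for key "a" is delivered twice.
events = [("a", 0, 1), ("a", 1, 2), ("a", 1, 2), ("b", 0, 7)]
final = apply_events(events)
```

In practice the "seen" set would be replaced by durable offset tracking or upserts keyed on a natural ID; the principle is the same.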
Preferred Skills
• Hands-on experience with Precisely tooling (Connect/Replicate or similar) for CDC/integration.
• AWS experience (networking basics, IAM, security groups, CloudWatch monitoring; managed database services experience is a plus).
• Experience designing "raw → curated/consumer" data patterns and data lifecycle management (retention, TTL, archival).
• Familiarity with DevOps practices (CI/CD for database changes, Infrastructure-as-Code exposure, automated testing for data).
The expected pay range for this contract assignment is $60 - $65 per hour. The exact pay rate will vary based on skills, experience, and location and will be determined by the third party whose employees provide services to Deloitte.
Candidates interested in applying for this opportunity must be geographically based in the United States and must be legally authorized to work in the United States without the need for employer sponsorship.
We do not accept agency resumes and are not responsible for any fees related to unsolicited resumes.
Deloitte is not the employer for this role.
This work is contracted through a third party whose employees provide services to Deloitte.
#Remote
Expected Work Schedule
Approximate hours per week
About Deloitte
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It makes Deloitte one of the most rewarding places to work.
As used in this posting, "Deloitte" means , a subsidiary of Deloitte LLP. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries.
Requisition code: 328058





