Meduvi

Cloud Database Administrator (DBA), Oracle RDS

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Database Administrator (DBA) on a contract of at least 12 months, with a hybrid work location and a pay rate of $640/day (USD). Key skills include Oracle RDS, AWS services, advanced SQL, and experience with data pipelines and migration.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
640
🗓️ - Date
November 19, 2025
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Everett, MA 02149
🧠 - Skills detailed
#Data Warehouse #Scala #DMS (Data Migration Service) #Kafka (Apache Kafka) #DBA (Database Administrator) #Aurora #Databases #Bash #SSIS (SQL Server Integration Services) #Slowly Changing Dimensions #JSON (JavaScript Object Notation) #Data Engineering #Airflow #Oracle #Snowflake #Cloud #XML (eXtensible Markup Language) #Apache Airflow #S3 (Amazon Simple Storage Service) #Unit Testing #Jira #Data Migration #DynamoDB #ETL (Extract, Transform, Load) #Storage #SQL (Structured Query Language) #SQL Server #Deployment #Migration #Normalization #Replication #Code Reviews #Data Pipeline #RDS (Amazon Relational Database Service) #AWS (Amazon Web Services) #Scripting #Data Mart #Python #GitHub
Role description
37.5 hours/week, Monday to Friday
Contract duration: at least 12 months
Reporting mode: Hybrid - 1 day onsite per month

Assist in maintaining, optimizing, modernizing, and troubleshooting our data warehouse, data mart, and data portfolio. Under the direction of the Chief Applications Officer and the Data Engineering and Analytics Team Leads, the DBA/ETL/ELT Engineer will manage databases and data services hosted on cloud platforms, ensuring they are secure, performant, highly available, and compliant with governance policies. The engineer will work hands-on with a team of cloud engineers, ETL developers, technical leads, DBAs, project managers, and analysts to design and implement the portfolio data, data pipelines, and transformations as a more streamlined, scalable, and cost-effective set of solutions.

Regular responsibilities include:
- Create and manage cloud-native databases and services (e.g., RDS Oracle, Aurora, Postgres, Snowflake).
- Track and tune query execution, compute scaling, and storage performance.
- Define policies for snapshots, point-in-time recovery (PITR), and cross-region replication.
- Implement encryption, access policies, masking, and auditing to meet PII standards.
- Manage schema migrations, data pipelines, and versioned deployments.
- Perform hands-on discovery, solution design, re-platforming, and troubleshooting to migrate the legacy SSIS ETL code to a SQL-based solution with Apache Airflow for scheduling and dependency management. Tasks may include re-engineering the overall solution approach, constructing code packages, fixing bugs, unit testing code, and using the GitHub code repository.
- Develop and guide implementation of an Apache Airflow scheduling and dependency framework (see the illustrative sketch at the end of this posting).
- Tune and optimize the solution implementation; benchmark the new solution against the on-prem solution to ensure it performs comparably or better.
- Use Jira to review and work through assigned tasks.
- Use GitHub to check in and manage code, code reviews, and pull requests.

Required:
- Experience working with Oracle RDS.
- Experience working with one or more AWS cloud services, such as S3 storage, Managed Airflow (MWAA), or Data Migration Service (DMS), in support of building data pipelines.
- Experience working with a variety of backend data sources (e.g., SQL Server, Oracle, Postgres, DynamoDB, Snowflake).
- Advanced SQL coding skills and the ability to translate Oracle PL/SQL and stored procedure code to alternative SQL platforms such as Snowflake.
- Familiarity with data warehouse and data mart concepts such as normalization, facts, dimensions, and slowly changing dimensions.
- Familiarity with Change Data Capture (CDC) concepts and implementations; knowledge of Kafka or similar replication tools is a plus.
- Understanding of common file formats such as JSON, XML, and CSV.
- Basic experience using scripting tools to automate tasks (e.g., Python, Windows PowerShell, Bash).
- Ability to write unit test scripts and validate migrated ELT/ETL code (see the validation sketch at the end of this posting).

Compensation Statement: Please see the pay rate within this job posting.

Employee Benefits Statement: Meduvi offers comprehensive medical health insurance (HMO/PPO), dental (PPO), 401(k), and weekly payroll with direct deposit.

EEO Statement: We welcome all applicants; qualified individuals will receive consideration for employment without regard to race, color, religion, national origin, sex, sexual orientation, gender identity, protected veteran status, or disability.
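To illustrate the Airflow scheduling and dependency work described above, here is a minimal sketch of a three-step DAG, showing how SSIS precedence constraints map onto Airflow task dependencies. This is not the employer's actual framework; the DAG id, schedule, and scripts are all hypothetical, and it assumes Airflow 2.4+.

```python
# A minimal sketch, not the employer's actual framework: a three-step
# Airflow DAG in which each task runs only after its upstream succeeds,
# the Airflow equivalent of an SSIS precedence constraint.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="warehouse_nightly_load",        # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",                   # nightly at 02:00 (Airflow 2.4+)
    catchup=False,
) as dag:
    extract = BashOperator(
        task_id="extract_sources",
        bash_command="python extract.py",   # placeholder extract step
    )
    transform = BashOperator(
        task_id="transform_staging",
        bash_command="python transform.py", # placeholder transform step
    )
    load = BashOperator(
        task_id="load_data_mart",
        bash_command="python load.py",      # placeholder load step
    )

    # Dependency chain: extract, then transform, then load.
    extract >> transform >> load
```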
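And for the unit-test requirement, here is a sketch of the kind of validation script a candidate might write to check migrated ELT/ETL code: a simple row-count comparison between a legacy Oracle source and a Snowflake target. It assumes the python-oracledb and snowflake-connector-python packages; all connection details and the table name are hypothetical placeholders.

```python
# A minimal validation sketch, assuming python-oracledb and
# snowflake-connector-python. Credentials and table names are placeholders.
import oracledb
import snowflake.connector

def count_rows(cursor, table: str) -> int:
    # Table names should come from a trusted migration manifest,
    # not from untrusted input.
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

def validate_table(table: str) -> bool:
    ora = oracledb.connect(user="...", password="...", dsn="...")  # placeholders
    sf = snowflake.connector.connect(user="...", password="...", account="...")
    try:
        src = count_rows(ora.cursor(), table)
        tgt = count_rows(sf.cursor(), table)
        print(f"{table}: oracle={src} snowflake={tgt}")
        return src == tgt
    finally:
        ora.close()
        sf.close()

if __name__ == "__main__":
    # Hypothetical table name for illustration.
    assert validate_table("SALES_FACT"), "row counts diverged after migration"
```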