

TalentBurst, an Inc 5000 company
Cloud Database Administrator/ETL Engineer - Mostly Remote
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Database Administrator/ETL Engineer on a multi-year renewable contract, mostly remote, with an unspecified pay rate. Key skills include Oracle RDS, AWS services, advanced SQL, and familiarity with data warehousing concepts.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 19, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Everett, MA
-
🧠 - Skills detailed
#Data Management #Data Warehouse #Scala #DMS (Data Migration Service) #Kafka (Apache Kafka) #DBA (Database Administrator) #Aurora #Databases #Bash #SSIS (SQL Server Integration Services) #Slowly Changing Dimensions #JSON (JavaScript Object Notation) #Data Engineering #Airflow #Oracle #Snowflake #Cloud #XML (eXtensible Markup Language) #Apache Airflow #S3 (Amazon Simple Storage Service) #SnowPipe #Unit Testing #Jira #Data Migration #DynamoDB #ETL (Extract, Transform, Load) #Storage #SQL (Structured Query Language) #SQL Server #Deployment #Migration #Normalization #Replication #Code Reviews #Data Pipeline #RDS (Amazon Relational Database Service) #AWS (Amazon Web Services) #Scripting #Data Mart #Python #GitHub
Role description
Position: Cloud Database Admin/ETL Engineer
Location: Everett, MA (Mostly Remote)
Duration: Multi Year Renewable Contract
Hours: 37.5 Hours per week
Overview
The client is seeking a Cloud Database Administrator (DBA)/ETL Engineer to help maintain, optimize, modernize, and troubleshoot its data warehouse, data marts, and data portfolio. Under the direction of the Chief Applications Officer and the Data Engineering and Analytics team leads, the DBA/ETL/ELT Engineer will manage databases and data services hosted on cloud platforms, ensuring they are secure, performant, highly available, and compliant with governance policies.
The Engineer will work hands-on with a team of cloud engineers, ETL developers, technical leads, DBAs, project managers, and analysts to design and implement the Education portfolio's data, data pipelines, and transformations as a more streamlined, scalable, and cost-effective set of solutions.
Regular Responsibilities Include
• Create and manage cloud-native databases and services (e.g., RDS Oracle, Aurora, Postgres, Snowflake).
• Track and tune query execution, compute scaling, and storage performance.
• Define policies for snapshots, PITR (point-in-time recovery), and cross-region replication.
• Implement encryption, access policies, masking, and auditing to meet FERPA/PII standards.
• Manage schema migrations, data pipelines, and versioned deployments.
• Hands-on discovery, solution design, re-platforming, and troubleshooting to migrate the EOE legacy SSIS ETL code to a SQL-based solution with Apache Airflow for scheduling and dependency management (a minimal DAG sketch follows this list).
• Tasks may include re-engineering the overall solution approach, constructing code packages, fixing bugs, unit testing code, and using a GitHub code repository.
• Develop and guide the implementation of the Apache Airflow scheduling and dependency framework.
• Tune and optimize the solution implementation, benchmarking the new solution against the on-prem solution to ensure it performs comparably or better.
• Use Jira to review and work through assigned tasks.
• Use GitHub to check in and manage code, conduct code reviews, and open pull requests.
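By way of illustration for the SSIS-to-Airflow migration described above, here is a minimal sketch of how one legacy package's steps might map to an Airflow DAG. The DAG name, schedule, connection ID, and SQL file paths are hypothetical, not details from this posting:

# Hypothetical DAG sketching how a legacy SSIS package's steps might map to
# Airflow tasks with explicit dependencies; all names are illustrative.
from datetime import datetime

from airflow import DAG
from airflow.providers.common.sql.operators.sql import SQLExecuteQueryOperator

with DAG(
    dag_id="eoe_nightly_load",          # hypothetical DAG name
    start_date=datetime(2025, 1, 1),
    schedule="0 2 * * *",               # nightly at 2 AM, like a SQL Agent job
    catchup=False,
) as dag:
    stage = SQLExecuteQueryOperator(
        task_id="stage_source_extracts",
        conn_id="oracle_rds",           # assumed Airflow connection
        sql="sql/stage_extracts.sql",   # SQL replaces the SSIS data-flow task
    )
    load_dims = SQLExecuteQueryOperator(
        task_id="load_dimensions",
        conn_id="oracle_rds",
        sql="sql/load_dimensions.sql",
    )
    load_facts = SQLExecuteQueryOperator(
        task_id="load_facts",
        conn_id="oracle_rds",
        sql="sql/load_facts.sql",
    )

    # Dependency management that SSIS expressed via precedence constraints:
    # Airflow will not start a downstream task until its upstream succeeds.
    stage >> load_dims >> load_facts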
Required
• Experience working with Oracle RDS.
• Experience working with one or more AWS services, such as S3 storage, Managed Workflows for Apache Airflow (MWAA), and Data Migration Service (DMS), in support of building data pipelines.
• Experience working with a variety of backend data sources (e.g., SQL Server, Oracle, Postgres, DynamoDB, Snowflake).
• Advanced SQL coding skills and the ability to translate Oracle PL/SQL and stored procedure code to alternative SQL platforms such as Snowflake.
• Familiarity with data warehouse and data mart concepts such as normalization, facts, dimensions, slowly changing dimensions.
• Familiarity with Change Data Capture (CDC) concepts and implementations. Knowledge of Kafka or similar replication tools is a plus.
• Understanding of common file formats such as JSON, XML, CSV.
• Basic experience using scripting tools to automate tasks (e.g., Python, Windows PowerShell, Bash).
• Ability to write unit test scripts and validate migrated ELT/ETL code (see the parity-test sketch after this list).
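To make the last requirement concrete, here is a minimal sketch of a unit-test-style parity check for migrated ETL output, assuming Snowflake as the target platform. The connection details, table names, and the credit_hours column are placeholders:

# Hypothetical parity test for migrated ETL output: compares row counts and
# an aggregate total between the legacy table and its migrated counterpart.
import unittest

import snowflake.connector  # assumed target platform, per the posting


def fetch_one(conn, sql):
    with conn.cursor() as cur:
        cur.execute(sql)
        return cur.fetchone()[0]


class TestMigratedLoad(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Placeholder credentials; real values would come from a secrets store.
        cls.conn = snowflake.connector.connect(
            account="my_account", user="etl_test", password="***",
            warehouse="TEST_WH", database="EDU_DW",
        )

    def test_row_counts_match(self):
        legacy = fetch_one(self.conn, "SELECT COUNT(*) FROM legacy.fact_enrollment")
        migrated = fetch_one(self.conn, "SELECT COUNT(*) FROM new.fact_enrollment")
        self.assertEqual(legacy, migrated)

    def test_measure_totals_match(self):
        legacy = fetch_one(self.conn, "SELECT SUM(credit_hours) FROM legacy.fact_enrollment")
        migrated = fetch_one(self.conn, "SELECT SUM(credit_hours) FROM new.fact_enrollment")
        self.assertEqual(legacy, migrated)


if __name__ == "__main__":
    unittest.main()

Row counts catch dropped or duplicated rows; aggregate totals over key measures catch silent value-level drift in the migrated code.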
Preferred
• Experience configuring, managing, and troubleshooting Airflow, including knowledge of Airflow DAGs and concepts for managing dependency graphs and complex multi-step workflows.
• Knowledge of Snowflake data warehouse features such as Snowpipe streaming, cloning, Time Travel, and role-based access control (a brief sketch follows this list).
• Prior experience working at other large organizations, preferably state or federal government.
• Business domain knowledge in the Education and Student data management area.
• Experience working with software development tools such as GitHub, Jira.
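As a pointer to the Snowflake features named above, here is a brief sketch of zero-copy cloning and Time Travel issued through the Python connector. All object names and credentials are illustrative:

# Hypothetical demo of two Snowflake features named in the posting:
# zero-copy cloning and Time Travel. Object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="dba_user", password="***",  # placeholders
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-efficient copy of the warehouse
# database, useful for testing migrated code against production-shaped data.
cur.execute("CREATE DATABASE edu_dw_test CLONE edu_dw")

# Time Travel: query the table as it existed one hour ago (3600 seconds),
# e.g. to compare state before and after a suspect load.
cur.execute(
    "SELECT COUNT(*) FROM edu_dw.marts.fact_enrollment AT(OFFSET => -3600)"
)
print(cur.fetchone()[0])

conn.close()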
Job #: 25-46303






