

BranCore Technologies, LLC
SQL Database Administrator – Contract Position
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a SQL Database Administrator with a contract until July 31, 2026, located in Richmond, VA. Requires 5+ years in database administration, AWS migration experience, Python scripting, and building AWS data pipelines. On-site 2-3 days a week.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 13, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Richmond, VA
-
🧠 - Skills detailed
#RDS (Amazon Relational Database Service) #API (Application Programming Interface) #SQL Server #AWS (Amazon Web Services) #Data Catalog #Data Processing #Data Management #AWS DMS (AWS Database Migration Service) #Migration #SQL (Structured Query Language) #AWS SCT (AWS Schema Conversion Tool) #ETL (Extract, Transform, Load) #Database Performance #GitHub #Data Modeling #Data Engineering #DynamoDB #AWS Glue #AWS Migration #S3 (Amazon Simple Storage Service) #Snowflake #PostgreSQL #Lambda (AWS Lambda) #Data Lake #Data Quality #Physical Data Model #Python #Data Migration #Database Administration #Datasets #Cloud #Monitoring #Amazon RDS (Amazon Relational Database Service) #Database Migration #Version Control #Databases #Schema Design #Data Pipeline #DBA (Database Administrator) #DMS (Data Migration Service)
Role description
Richmond, VA
Posted on March 12, 2026
Contract Length: Until July 31, 2026, with possibility of extension if needed
• Local Richmond, VA candidates required.
• On-site 2-3 days a week, and additionally as requested by the manager. Parking is not provided for contractors.
Role: Database Administrator / Data Engineer – AWS Migration
ABOUT THE ROLE
We are seeking a Database Administrator / Data Engineer with strong experience in migrating on-premises databases to AWS. The candidate will support data platform modernization by assisting with database migration, Python-based data processing, and building AWS data pipelines supporting applications using Amazon RDS and DynamoDB.
Responsible for supporting SQL Server and cloud BAU operations. Supports production stability and performance monitoring by building operational metrics and dashboards for database performance. Develops data quality and performance dashboards to monitor system efficiency and ensure reliable data for reporting and analytics.
The role requires hands-on experience in database migration, data modeling, and AWS data services.
Must Have Skills (Screening Required)
1. On-Prem SQL Server to AWS and Snowflake Data Migration
Experience migrating/archiving databases from on-premises environments to AWS
Experience with AWS Database Migration Service (DMS) or similar tools preferred
2. AWS Database Technologies
Strong experience with:
Amazon RDS
DynamoDB
3. Python for Data Processing
Experience developing Python scripts for data creation, transformation, and data loads
4. Data Pipeline Development
Experience building AWS-based data pipelines
Familiarity with services such as:
S3, Glue, Lambda
5. Data Modeling
Experience designing logical and physical data models
Ability to support application teams with schema design
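As an illustration of the Python data-processing and bulk-load work described above, here is a minimal local sketch: generating synthetic records, applying a transformation, and chunking the stream into batches sized for a bulk load. The record schema and function names are hypothetical; the 25-item batch size matches DynamoDB's BatchWriteItem limit, and an actual load would pass each batch to boto3's `batch_write_item` rather than collecting them in memory.

```python
import itertools
from typing import Iterable, Iterator

# DynamoDB's BatchWriteItem accepts at most 25 items per request,
# so bulk loads are typically chunked before writing.
BATCH_SIZE = 25

def make_items(n: int) -> Iterator[dict]:
    """Generate synthetic customer records (hypothetical schema)."""
    for i in range(n):
        yield {"pk": f"CUST#{i}", "name": f"customer-{i}", "active": i % 2 == 0}

def transform(item: dict) -> dict:
    """Example transformation step: normalize names to upper case."""
    out = dict(item)
    out["name"] = out["name"].upper()
    return out

def chunked(items: Iterable[dict], size: int = BATCH_SIZE) -> Iterator[list]:
    """Split an item stream into batches suitable for a bulk writer."""
    it = iter(items)
    while batch := list(itertools.islice(it, size)):
        yield batch

batches = list(chunked(map(transform, make_items(60))))
# 60 items split into batches of 25, 25, and 10
```

The generator-based pipeline keeps memory flat for large datasets, since records are produced, transformed, and batched lazily.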
Preferred Skills
Experience with AWS Schema Conversion Tool (SCT) and AWS Database Migration Service (DMS)
Experience working with large datasets and ETL pipelines
Knowledge of data lake architecture
Experience with PostgreSQL, SQL Server, DynamoDB
Experience supporting application data environments
Experience Required
5+ years in Database Administration, Data Engineering, or similar role
3+ years of AWS and Snowflake experience
Responsibilities
Support migration of on-prem databases to AWS
Develop Python scripts for data generation and bulk loads
Build and maintain AWS-based data pipelines
Design and support data models
Support application teams using RDS and DynamoDB
Assist with data validation, migration testing, and performance tuning
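The data validation and migration testing responsibility above can be sketched with a small, self-contained check: compare row counts and an order-insensitive checksum between source and target result sets. This is illustrative only; in practice the rows would come from the source SQL Server and the target RDS or Snowflake tables via their respective drivers, and the helper names are assumptions.

```python
import hashlib

def table_checksum(rows) -> int:
    """Order-insensitive checksum: hash each row, then XOR the digests,
    so row order in the result set does not affect the value."""
    acc = 0
    for row in rows:
        digest = hashlib.sha256(repr(row).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
    return acc

def validate_migration(source_rows, target_rows) -> dict:
    """Compare row counts and checksums between source and target."""
    return {
        "count_match": len(source_rows) == len(target_rows),
        "checksum_match": table_checksum(source_rows) == table_checksum(target_rows),
    }

src = [(1, "alice"), (2, "bob")]
tgt = [(2, "bob"), (1, "alice")]  # same rows, different order
result = validate_migration(src, tgt)
# result: {"count_match": True, "checksum_match": True}
```

XOR-ing per-row digests makes the comparison independent of row order, which is useful when source and target queries return rows in different physical orders.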
Required/Desired Skills
On-Prem SQL Server to AWS and Snowflake data migration experience
Required
5
Years
Experience migrating/archiving databases from on-premises environments to AWS
Required
5
Years
Experience with Schema Conversion Tool (SCT) and AWS Database Migration Service (DMS) or similar tools preferred
Required
5
Years
Experience developing Python scripts for data creation, transformation, and data loads
Required
5
Years
Experience building AWS-based data pipelines using S3, Glue, Lambda
Required
5
Years
Experience supporting application data environments; experience with PostgreSQL, SQL Server, DynamoDB
Required
5
Years
Hands-on experience using AWS database services including S3, RDS, PostgreSQL, DynamoDB, and Snowflake
Required
5
Years
Core experience configuring data management practices on AWS using native services such as AWS Glue, Amazon DataZone, AWS Glue Data Catalog, and API Gateway
Required
5
Years
Familiarity with version control using tools like GitHub and AWS CodePipeline
Required
5
Years
5+ years in Database Administration, Data Engineering, or a similar role (AWS, Snowflake, SQL Server)
Required
5
Years






