

Mid-Level Database Administrator – Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a 6-month contract role for a Mid-Level Database Administrator – Data Engineer, offered at a competitive pay rate. Key skills include PostgreSQL, MySQL, AWS, Azure, and data pipeline development. The role requires 5+ years of relevant experience and strong automation scripting skills.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
July 30, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Austin, Texas Metropolitan Area
🧠 - Skills detailed
#Batch #AWS RDS (Amazon Relational Database Service) #Cloud #PostgreSQL #Security #Aurora #Terraform #Database Maintenance #S3 (Amazon Simple Storage Service) #Databases #Prometheus #Scripting #dbt (data build tool) #Observability #Azure #Bash #Anomaly Detection #Lambda (AWS Lambda) #Datadog #Datasets #Automation #Database Administration #Replication #DBA (Database Administrator) #Data Integration #Data Engineering #Data Quality #Data Lake #Python #Scala #RDS (Amazon Relational Database Service) #Airflow #Data Architecture #Data Pipeline #Indexing #MySQL #Disaster Recovery #Grafana #AWS (Amazon Web Services) #Monitoring #Compliance #GitLab #AI (Artificial Intelligence) #Kafka (Apache Kafka)
Role description
Verified Job On Employer Career Site
Job Summary:
Simplex is a late-stage health-tech company focused on integrating AI into its data workflows. The company is seeking a Contract DBA / Data Engineer to own and optimize its database platforms while building scalable data pipelines and analytical infrastructure across AWS and Azure.
Responsibilities:
• Lead operational management of PostgreSQL and MySQL databases hosted in AWS (RDS, Aurora) and Azure.
• Ensure database availability, performance, backups, replication, and disaster recovery.
• Continuously optimize database queries and indexing to support product performance at scale.
• Implement AI-augmented tooling and automation for monitoring, tuning, and routine database maintenance.
• Design, build, and maintain scalable data pipelines that support real-time and batch processing (a minimal pipeline sketch follows this list).
• Develop data integration workflows across internal and external data sources, including APIs and partner systems.
• Support data lake and warehouse infrastructure, optimizing for performance and cost-efficiency.
• Define and enforce best practices for data quality, lineage, and schema governance.
• Enable analytics, AI, and product teams by delivering trusted, well-structured datasets and models.
• Identify opportunities to incorporate AI into data workflows (e.g., anomaly detection, data validation, pipeline optimization).
• Champion adoption of AI-based tools and automation across database and data engineering operations.
• Experiment with and integrate modern AI agents, copilots, and models to accelerate data workflows.
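As a rough illustration of the pipeline responsibilities above, the sketch below shows a minimal Airflow DAG for a daily batch load. It is not taken from the posting: the DAG id, task names, and extract/load bodies are hypothetical placeholders, and it assumes Airflow 2.4+ with the standard PythonOperator.

```python
# Minimal sketch only: dag_id, task names, and the extract/load logic are
# hypothetical placeholders, not details from this posting. Assumes Airflow 2.4+.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_partner_orders(ds: str, **_) -> None:
    # Placeholder: pull one day's worth of records from a partner API.
    print(f"extracting partner orders for {ds}")


def load_to_warehouse(ds: str, **_) -> None:
    # Placeholder: load the extracted batch into a warehouse table.
    print(f"loading {ds} batch into the warehouse")


with DAG(
    dag_id="partner_orders_daily",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=10)},
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_partner_orders)
    load = PythonOperator(task_id="load", python_callable=load_to_warehouse)
    extract >> load
```

In practice the same structure would extend to the real-time and partner-API integrations listed above, with dbt models or custom operators replacing the placeholder callables.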
Qualifications:
Required:
• 5+ years of experience in Database Administration, Data Engineering, or a combined role.
• Deep experience administering PostgreSQL and MySQL in production, cloud-native environments.
• Strong experience building and scaling data pipelines with tools such as Airflow, dbt, or similar frameworks.
• Hands-on skills with AWS (including RDS, Aurora, S3, Glue, Lambda) and familiarity with Azure data services.
• Proficient in Terraform and modern CI/CD tools (e.g., GitLab CI/CD).
• Strong scripting ability (e.g., Python, Bash) for automation and workflow orchestration (see the example after this list).
• Committed to working in an AI-first environment and excited by rapid learning and tool integration.
• Detail-oriented problem solver who values performance, scalability, and reliability.
• Collaborative, communicative, and comfortable in a fast-paced, high-growth environment.
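As a hedged example of the scripting expectation above, and not code from the posting, here is a small Python/boto3 sketch that flags RDS instances whose automated backups are disabled. The region default and the specific check are assumptions chosen for illustration.

```python
# Illustrative automation sketch only: flags RDS instances with automated
# backups disabled (BackupRetentionPeriod == 0). Region default is an assumption.
import boto3


def instances_without_backups(region: str = "us-east-1") -> list[str]:
    """Return identifiers of RDS instances that have no automated backups."""
    rds = boto3.client("rds", region_name=region)
    flagged = []
    for page in rds.get_paginator("describe_db_instances").paginate():
        for db in page["DBInstances"]:
            if db.get("BackupRetentionPeriod", 0) == 0:
                flagged.append(db["DBInstanceIdentifier"])
    return flagged


if __name__ == "__main__":
    for name in instances_without_backups():
        print(f"automated backups disabled: {name}")
```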
Preferred:
• AWS or Azure certifications.
• Experience with stream processing (e.g., Kafka, Kinesis) and real-time data architecture.
• Familiarity with observability stacks like Prometheus, Grafana, or Datadog.
• Exposure to security, compliance, and auditing frameworks (HIPAA, SOC2, etc.).
• Passion for modern AI tooling and its impact on data infrastructure and engineering workflows.
Company:
The company has 10 years of experience in third-party agency and corporate recruiting, specializing in IT/Engineering placements at high-tech companies. Founded in 2012, it is headquartered in Austin, TX, US, with a team of 0-1 employees, and is currently Early Stage.