MVK Technology Inc.

Senior Data Engineer (PySpark + Databricks + Oracle/PostgreSQL Migration)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a contract of unspecified duration, offering $60.00 - $70.00 per hour. It requires 9+ years of experience; expertise in PySpark, Databricks, and Oracle-to-PostgreSQL migration; and strong SQL skills. On-site location.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
December 7, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Alpharetta, GA 30005
🧠 - Skills detailed
#Oracle #Lambda (AWS Lambda) #Database Design #Databases #PostgreSQL #Data Engineering #Database Performance #Data Storage #Spark (Apache Spark) #Database Migration #Data Mapping #Database Architecture #Databricks #Airflow #Python #Triggers #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #ADF (Azure Data Factory) #PySpark #Indexing #Data Integrity #Automation #Data Pipeline #Cloud #Storage #Data Processing #Migration #SQL (Structured Query Language) #RDBMS (Relational Database Management System) #RDS (Amazon Relational Database Service) #S3 (Amazon Simple Storage Service) #Data Analysis
Role description
Note: The candidate must be on our own W2 payroll.

Interview Levels: 2 rounds
1st Round: Webex video
2nd Round: Coding + face-to-face (F2F)

Experience Level: Minimum 9+ years, with expert-level proficiency in PySpark and Databricks.

Primary Skill Focus: PySpark, Databricks (hands-on coding test expected; see the first sketch at the end of this description).

Skills Required:
1. RDBMS Expertise: Strong hands-on experience with Oracle and PostgreSQL databases. Deep understanding of database architecture, design, indexing, and performance tuning. Expert-level SQL skills: queries, stored procedures, functions, triggers, and views.
2. Database Migration: Proven experience in end-to-end database migration projects (preferably Oracle → PostgreSQL). Strong ability to perform data mapping, transformation, validation, and reconciliation. Experience using migration tools, scripts, and automation frameworks.
3. Data Engineering & Analysis: Advanced proficiency in Databricks for large-scale data processing. Expert in PySpark and Python for data transformation and analytics. Ability to build, enhance, and optimize complex ETL/ELT data pipelines.
4. Job Scheduling & Automation: Experience creating and maintaining Databricks jobs for scheduled reporting. Familiarity with workflow orchestration tools (Airflow, ADF, Step Functions, etc.); see the second sketch at the end of this description.
5. Performance Optimization: Strong background in performance tuning for Oracle and PostgreSQL. Experience with index strategies, query optimization, execution plan analysis, and caching.
6. Cloud Platforms: Good understanding of AWS Cloud, including RDS, S3, EMR, Lambda, Glue, or similar services.

Responsibilities:
1. Database Design & Development: Design and develop robust database solutions that meet data storage and retrieval requirements. Create scripts and procedures to automate routine database tasks.
2. Migration & Implementation: Lead the migration from Oracle to PostgreSQL, ensuring data integrity and minimal downtime. Develop comprehensive migration plans and execute them proficiently.
3. Support & Maintenance: Monitor database performance and implement necessary improvements.
4. Reporting & Analytics: Develop and maintain Databricks jobs for generating business reports and analytics. Provide insights from data analysis to support decision-making.

Job Type: Contract
Pay: $60.00 - $70.00 per hour
Expected hours: 40 per week
Work Location: In person
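
For candidates gauging the hands-on coding test, here is a minimal sketch of the kind of PySpark work the posting describes: an extract-transform-load step from Oracle to PostgreSQL with a basic row-count reconciliation. The connection URLs, table names, credentials, and the column mapping are illustrative assumptions, not details from the posting, and both JDBC drivers would need to be on the cluster's classpath.

```python
# Minimal sketch: one Oracle -> PostgreSQL migration step with reconciliation.
# All identifiers below (hosts, tables, users) are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("oracle-to-postgres-migration").getOrCreate()

ORACLE_URL = "jdbc:oracle:thin:@//oracle-host:1521/ORCLPDB1"     # assumed source
POSTGRES_URL = "jdbc:postgresql://postgres-host:5432/warehouse"  # assumed target

# Extract: read the source table from Oracle over JDBC.
source = (
    spark.read.format("jdbc")
    .option("url", ORACLE_URL)
    .option("dbtable", "SALES.ORDERS")   # hypothetical table
    .option("user", "migration_user")
    .option("password", "***")           # use a secret scope in practice
    .option("fetchsize", 10000)
    .load()
)

# Transform: example data-mapping step (Oracle NUMBER flag -> PostgreSQL boolean).
mapped = source.withColumn("is_active", F.col("ACTIVE_FLAG") == 1).drop("ACTIVE_FLAG")

# Load: append the mapped rows into PostgreSQL.
(
    mapped.write.format("jdbc")
    .option("url", POSTGRES_URL)
    .option("dbtable", "public.orders")  # hypothetical table
    .option("user", "migration_user")
    .option("password", "***")
    .mode("append")
    .save()
)

# Reconcile: the simplest validation gate -- source and target row counts match.
target_count = (
    spark.read.format("jdbc")
    .option("url", POSTGRES_URL)
    .option("dbtable", "public.orders")
    .option("user", "migration_user")
    .option("password", "***")
    .load()
    .count()
)
assert source.count() == target_count, "row-count reconciliation failed"
```

Real migrations would add column-level checksums and per-partition reads for large tables; this shows only the shape of the extract, map, load, and reconcile steps.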
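Likewise, a hedged sketch of one orchestration pattern for the scheduled-reporting requirement, using Airflow's Databricks provider (Airflow is one of the tools the posting names). The DAG id, cron schedule, cluster spec, and notebook path are assumptions:

```python
# Hypothetical Airflow DAG that runs a Databricks notebook for a daily report.
from datetime import datetime

from airflow import DAG
from airflow.providers.databricks.operators.databricks import (
    DatabricksSubmitRunOperator,
)

with DAG(
    dag_id="daily_sales_report",
    start_date=datetime(2025, 1, 1),
    schedule="0 6 * * *",  # 06:00 daily; schedule_interval on older Airflow
    catchup=False,
) as dag:
    run_report = DatabricksSubmitRunOperator(
        task_id="run_databricks_report",
        databricks_conn_id="databricks_default",
        new_cluster={
            "spark_version": "14.3.x-scala2.12",
            "node_type_id": "i3.xlarge",
            "num_workers": 2,
        },
        notebook_task={"notebook_path": "/Reports/daily_sales"},
    )
```

The same report could instead be scheduled as a native Databricks Job with a cron trigger; the Airflow route fits teams that already orchestrate their pipelines there.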