

Techgene Solutions
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are unspecified. Located in Alpharetta, GA, it requires 9+ years of experience, expertise in PySpark, Databricks, and Oracle-to-PostgreSQL migration, and strong SQL skills.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 5, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#PySpark #RDBMS (Relational Database Management System) #Migration #Oracle #Data Mapping #Spark (Apache Spark) #Cloud #Database Migration #SQL (Structured Query Language) #Indexing #Lambda (AWS Lambda) #RDS (Amazon Relational Database Service) #AWS (Amazon Web Services) #Airflow #PostgreSQL #S3 (Amazon Simple Storage Service) #Databricks #Databases #Data Pipeline #Data Processing #Python #ETL (Extract, Transform, Load) #Database Architecture #ADF (Azure Data Factory) #Triggers #Data Engineering #Automation
Role description
Position: Senior Data Engineer (PySpark + Databricks + Oracle/PostgreSQL Migration)
Location: 5800 Windward Parkway, Alpharetta, GA 3000 (local candidates only); 3 days onsite
Job type: Contract
Experience Level: 9+ years
• Expert-level proficiency in PySpark and Databricks.
• Primary Skill Focus: PySpark, Databricks (Hands-on coding test expected)
Skills Required:
1. RDBMS Expertise
• Strong hands-on experience with Oracle and PostgreSQL databases.
• Deep understanding of database architecture, design, indexing, and performance tuning.
• Expert-level SQL skills: queries, stored procedures, functions, triggers, and views.
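As a rough illustration of the SQL constructs listed above (views, triggers, indexed queries), here is a self-contained sketch. It uses SQLite from the Python standard library rather than Oracle or PostgreSQL, so syntax details differ, but the concepts carry over:

```python
import sqlite3

# Illustrative only: SQLite (stdlib) stands in for Oracle/PostgreSQL.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, amount REAL, status TEXT)")
cur.execute("CREATE INDEX idx_orders_status ON orders (status)")

# A view aggregating order totals by status.
cur.execute("""
    CREATE VIEW order_totals AS
    SELECT status, SUM(amount) AS total FROM orders GROUP BY status
""")

# A trigger that normalizes status to lowercase on insert.
cur.execute("""
    CREATE TRIGGER normalize_status AFTER INSERT ON orders
    BEGIN
        UPDATE orders SET status = LOWER(NEW.status) WHERE id = NEW.id;
    END
""")

cur.executemany("INSERT INTO orders (amount, status) VALUES (?, ?)",
                [(10.0, "OPEN"), (25.5, "open"), (7.25, "CLOSED")])
conn.commit()

totals = dict(cur.execute("SELECT status, total FROM order_totals"))
print(sorted(totals.items()))  # [('closed', 7.25), ('open', 35.5)]
```

In Oracle or PostgreSQL the same pattern would use PL/SQL or PL/pgSQL trigger functions instead of SQLite's inline trigger body.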
2. Database Migration
• Proven experience in end-to-end database migration projects (preferred: Oracle → PostgreSQL).
• Strong ability to perform data mapping, transformation, validation, and reconciliation.
• Experience using migration tools, scripts, and automation frameworks.
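The validation and reconciliation step above can be sketched with a simple fingerprinting approach. This is a hypothetical, simplified stand-in (plain lists of dicts instead of Oracle/PostgreSQL query results; the function names are illustrative, not from any migration tool):

```python
import hashlib

def row_fingerprint(row, key_order):
    """Stable hash of a row's values in a fixed column order."""
    canon = "|".join(repr(row[k]) for k in key_order)
    return hashlib.sha256(canon.encode()).hexdigest()

def reconcile(source_rows, target_rows, pk, columns):
    """Compare source and target extracts by primary key and row checksum.

    Returns (missing_in_target, extra_in_target, mismatched_pks)."""
    src = {r[pk]: row_fingerprint(r, columns) for r in source_rows}
    tgt = {r[pk]: row_fingerprint(r, columns) for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    extra = sorted(set(tgt) - set(src))
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, extra, mismatched

# Source (e.g. Oracle) vs. migrated target (e.g. PostgreSQL) extracts.
source = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}]
target = [{"id": 1, "name": "a"}, {"id": 2, "name": "B"}]

print(reconcile(source, target, "id", ["id", "name"]))
# ([3], [], [2])  -> id 3 missing from target, id 2 differs
```

In practice the same idea is usually pushed into SQL (row counts plus per-column checksums) so that full tables never leave the database.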
3. Data Engineering & Analysis
• Advanced proficiency in Databricks for large-scale data processing.
• Expert in PySpark and Python for data transformation and analytics.
• Ability to build, enhance, and optimize complex ETL/ELT data pipelines.
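A minimal sketch of the extract-transform-load shape such pipelines take, in plain Python for readability; in this role the equivalent logic would typically be PySpark DataFrame operations running as a Databricks job (all names here are illustrative):

```python
def extract(records):
    """Extract step: yield raw records (stand-in for a JDBC or Delta read)."""
    yield from records

def transform(rows):
    """Transform step: drop invalid rows and derive a new column."""
    for row in rows:
        if row.get("amount") is None:
            continue  # validation: skip rows with missing amounts
        yield {**row, "amount_usd": round(row["amount"] * row.get("fx", 1.0), 2)}

def load(rows):
    """Load step: collect into a list (stand-in for a warehouse write)."""
    return list(rows)

raw = [{"id": 1, "amount": 10.0, "fx": 1.1},
       {"id": 2, "amount": None},
       {"id": 3, "amount": 5.0}]

result = load(transform(extract(raw)))
print(result)
# [{'id': 1, 'amount': 10.0, 'fx': 1.1, 'amount_usd': 11.0},
#  {'id': 3, 'amount': 5.0, 'amount_usd': 5.0}]
```

The generator chaining mirrors how Spark builds a lazy plan of transformations that only executes at the final action (here, `load`).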
4. Job Scheduling & Automation
• Experience creating and maintaining Databricks jobs for scheduled reporting.
• Familiarity with workflow orchestration tools (Airflow, ADF, Step Functions, etc.).
5. Performance Optimization
• Strong background in performance tuning for Oracle and PostgreSQL.
• Experience with index strategies, query optimization, execution plan analysis, and caching.
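The execution-plan analysis workflow above can be demonstrated end to end with SQLite from the standard library; in PostgreSQL the same check would use `EXPLAIN (ANALYZE)`, and in Oracle `EXPLAIN PLAN`. A sketch showing a plan flipping from a full scan to an index search:

```python
import sqlite3

# Illustrative only: SQLite stands in for Oracle/PostgreSQL plan analysis.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
cur.executemany("INSERT INTO events (user_id, ts) VALUES (?, ?)",
                [(i % 100, f"2025-01-{i % 28 + 1:02d}") for i in range(1000)])

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"

# Without an index, the planner must scan the whole table.
plan_before = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_before[0][3])  # e.g. 'SCAN events'

# After adding an index, the planner switches to an index search.
cur.execute("CREATE INDEX idx_events_user ON events (user_id)")
plan_after = cur.execute("EXPLAIN QUERY PLAN " + query).fetchall()
print(plan_after[0][3])  # e.g. 'SEARCH events USING COVERING INDEX idx_events_user ...'
```

Exact plan wording varies by SQLite version, but the scan-to-search transition is the signal an index strategy is meant to produce.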
6. Cloud Platforms
• Good understanding of AWS Cloud, including RDS, S3, EMR, Lambda, Glue, or similar services.






