Diligent Tec, Inc

Senior Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer in Alpharetta, GA, on a W2 contract, requiring 10+ years of experience. Key skills include PySpark, Databricks, Oracle-to-PostgreSQL migration, RDBMS expertise, and cloud platforms such as AWS.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
December 4, 2025
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#Python #Spark (Apache Spark) #Database Migration #SQL (Structured Query Language) #Data Processing #Oracle #Data Pipeline #Indexing #PostgreSQL #ADF (Azure Data Factory) #Airflow #Database Architecture #Triggers #ETL (Extract, Transform, Load) #S3 (Amazon Simple Storage Service) #PySpark #RDS (Amazon Relational Database Service) #Automation #RDBMS (Relational Database Management System) #Migration #AWS (Amazon Web Services) #Lambda (AWS Lambda) #Databases #Cloud #Data Mapping #Data Engineering #Databricks
Role description
Position: Senior Data Engineer (PySpark + Databricks + Oracle/PostgreSQL Migration)
Location: Alpharetta, GA
Job type: Contract (W2 only)
Interview Levels: 2 rounds (1st round: Webex video; 2nd round: coding + face-to-face)
Experience Level: Minimum 10+ years
Primary Skill Focus: PySpark, Databricks (hands-on coding test expected); expert-level proficiency required

Skills Required:
1. RDBMS Expertise
Strong hands-on experience with Oracle and PostgreSQL databases. Deep understanding of database architecture, design, indexing, and performance tuning. Expert-level SQL skills: queries, stored procedures, functions, triggers, and views.
2. Database Migration
Proven experience in end-to-end database migration projects (preferred: Oracle → PostgreSQL). Strong ability to perform data mapping, transformation, validation, and reconciliation. Experience using migration tools, scripts, and automation frameworks.
3. Data Engineering & Analysis
Advanced proficiency in Databricks for large-scale data processing. Expert in PySpark and Python for data transformation and analytics. Ability to build, enhance, and optimize complex ETL/ELT data pipelines.
4. Job Scheduling & Automation
Experience creating and maintaining Databricks jobs for scheduled reporting. Familiarity with workflow orchestration tools (Airflow, ADF, Step Functions, etc.).
5. Performance Optimization
Strong background in performance tuning for Oracle and PostgreSQL. Experience with index strategies, query optimization, execution plan analysis, and caching.
6. Cloud Platforms
Good understanding of AWS Cloud, including RDS, S3, EMR, Lambda, Glue, or similar services.
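To give candidates a concrete sense of the "data mapping, transformation, validation, and reconciliation" work the migration section describes, here is a minimal sketch of a post-migration reconciliation check. It is a plain-Python, in-memory illustration only: the sample rows, table shapes, and checksum scheme are hypothetical, and a real project would run the equivalent logic in PySpark against live Oracle and PostgreSQL connections.

```python
import hashlib

def row_fingerprint(row: dict) -> str:
    """Order-insensitive fingerprint of one row, keyed by column name."""
    canonical = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def reconcile(source_rows, target_rows, key="id"):
    """Compare migrated rows against the source: report count drift,
    missing or unexpected keys, and rows whose contents changed in flight."""
    src = {r[key]: row_fingerprint(r) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r) for r in target_rows}
    return {
        "count_match": len(src) == len(tgt),
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "unexpected_in_target": sorted(tgt.keys() - src.keys()),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Hypothetical sample data standing in for Oracle (source) and PostgreSQL (target).
oracle_rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}, {"id": 3, "name": "c"}]
postgres_rows = [{"id": 1, "name": "a"}, {"id": 2, "name": "B"}]

report = reconcile(oracle_rows, postgres_rows)
print(report)
# Flags row 3 as missing in the target and row 2 as modified.
```

In a Spark setting the same idea is typically expressed with hash columns and an anti-join between source and target DataFrames, so the comparison scales beyond memory.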