Skylar IT Consulting LLC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "X months" and a pay rate of "$X per hour". Key skills include PySpark, Databricks, Informatica, Oracle, and experience with DLT and Expectation frameworks. Location: "Remote".
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Spark (Apache Spark) #Databricks #Unix #Airflow #Data Pipeline #Data Engineering #Data Architecture #Migration #Informatica #Oracle #Scala #Data Quality #PySpark #ETL (Extract, Transform, Load) #Cloud #Data Processing
Role description
Design, develop, and optimize data pipelines and transformations using PySpark and Databricks. Collaborate with data architects and analysts to define and implement data processing needs. Write production-grade PySpark/Scala code for ETL and data transformation. Integrate structured and unstructured data across cloud platforms. Manage and configure the Databricks environment, including clusters and notebooks. Troubleshoot and resolve issues related to Databricks performance and functionality.
Requirements:
- Experience with the DLT (Delta Live Tables) framework is critical.
- Experience with the Expectations framework is essential (implement data quality checks and monitor processes).
- Informatica and Oracle skills are mandatory, as the program is a migration from Informatica to PySpark in Databricks.
- Good experience with Airflow DAG creation and UNIX.
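Since experience with data quality Expectations is called out as essential, here is a minimal stand-alone sketch of the idea in plain Python (no Databricks runtime assumed). In Delta Live Tables these rules would be declared with decorators such as `@dlt.expect_or_drop`; this sketch only models the same pass/drop behavior, and all column names and rules are illustrative, not from the posting.

```python
# Stand-alone sketch of expectation-style data quality checks.
# In DLT you would declare rules declaratively, e.g.:
#   @dlt.expect_or_drop("valid_id", "customer_id IS NOT NULL")
# Here the same "expect or drop" idea is modeled in plain Python.

def apply_expectations(rows, expectations):
    """Split rows into (passed, failed) according to named predicate checks.

    Returns the rows that satisfy every expectation, plus a list of
    (row, violated_rule_names) pairs for rows that were dropped."""
    passed, failed = [], []
    for row in rows:
        violations = [name for name, check in expectations.items() if not check(row)]
        if violations:
            failed.append((row, violations))
        else:
            passed.append(row)
    return passed, failed

# Illustrative rules: a non-null key and a positive amount.
expectations = {
    "valid_id": lambda r: r.get("customer_id") is not None,
    "positive_amount": lambda r: (r.get("amount") or 0) > 0,
}

rows = [
    {"customer_id": 1, "amount": 100.0},
    {"customer_id": None, "amount": 50.0},   # fails valid_id
    {"customer_id": 2, "amount": -5.0},      # fails positive_amount
]

good, bad = apply_expectations(rows, expectations)
print(len(good), len(bad))  # → 1 2
```

In a real migration the dropped rows and their violated rule names would typically be routed to a quarantine table and surfaced in monitoring, which is what the "implement data quality checks and monitor processes" requirement refers to.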