

Mega Cloud Lab
Data Engineer (W2 Candidates Only)
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience in Databricks, Azure Data Factory, and PySpark. It is a remote position based in Minneapolis, MN, focusing on data modeling, cloud technologies, and data virtualization schemes.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
360
-
🗓️ - Date
April 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#ADF (Azure Data Factory) #Azure Data Factory #Delta Lake #Python #Data Pipeline #AI (Artificial Intelligence) #Snowflake #Data Engineering #Spark (Apache Spark) #GitHub #Virtualization #Cloud #Databricks #Data Modeling #SQL (Structured Query Language) #PySpark #Azure
Role description
Role - Data Engineer (W2 Candidates Only)
Location - Minneapolis, MN (Remote)
Job Details:
6+ years of experience with Databricks, Azure Data Factory, Databricks Workflows, PySpark, Python, and Databricks SQL
Config-driven data pipelines
Databricks Genie conversational tool for AI-driven insights, using no-code/low-code options and code completion with GitHub Copilot
Data Modeling: Star Schema and Snowflake Schema
Open table formats: Delta Lake and Iceberg
Cloud technologies: Azure preferred
Data virtualization schemes
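For candidates unfamiliar with the "config-driven data pipelines" requirement above: it refers to pipelines whose steps are declared in configuration rather than hard-coded, so new sources or transforms need a config change, not a code change. A minimal pure-Python sketch of the pattern (the step names and config shape here are illustrative assumptions, not Databricks or ADF APIs):

```python
# Hypothetical config: each pipeline step is data, not code.
CONFIG = {
    "steps": [
        {"op": "filter", "column": "amount", "min": 0},
        {"op": "rename", "from": "amount", "to": "amount_usd"},
    ]
}

def run_pipeline(rows, config):
    """Apply each configured step, in order, to a list of dict records."""
    for step in config["steps"]:
        if step["op"] == "filter":
            # Keep rows whose column value meets the configured minimum.
            rows = [r for r in rows if r[step["column"]] >= step["min"]]
        elif step["op"] == "rename":
            # Rename one key in every record.
            rows = [
                {(step["to"] if k == step["from"] else k): v for k, v in r.items()}
                for r in rows
            ]
    return rows

data = [{"amount": 10}, {"amount": -5}]
print(run_pipeline(data, CONFIG))  # [{'amount_usd': 10}]
```

In a Databricks setting the same idea typically maps config entries to PySpark DataFrame operations instead of list comprehensions.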
Skills: Databricks, GitHub, Azure






