Tekskills Inc.

Azure Data Engineer- Only W2

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Azure Data Engineer in Alpharetta, GA, for 12+ months on a W2 contract. Key skills include Azure, Databricks, Python, PySpark, and Scala. Experience with enterprise-level data ingestion and transformation is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 4, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Alpharetta, GA
-
🧠 - Skills detailed
#Storage #Python #Spark (Apache Spark) #Data Warehouse #SQL (Structured Query Language) #Java #Scala #Data Pipeline #Apache Spark #Spark SQL #Azure #Data Architecture #Azure Databricks #Hadoop #ETL (Extract, Transform, Load) #PySpark #Unix #Data Access #Databases #Data Ingestion #Data Engineering #Databricks
Role description
Job Title: Azure Data Engineer
Location: Alpharetta, GA, 30005 (hybrid)
Duration: 12+ Months
Must Have Skills:
• Azure
• Databricks
• Python
• PySpark
• Scala
Nice to Have Skills:
• Unix
• Hadoop
Detailed Job Description:
• Proficiency in technologies such as Azure, Databricks, PySpark, Python, Java, or Scala
• Design, build, and maintain scalable data pipelines that extract, transform, and load data from various sources into storage systems
• Develop and manage data architecture, including databases and data warehouses, to ensure data is organized, accessible, and efficient
• Work with business teams to understand their needs and deliver the data solutions they require
• Develop and maintain tools for data access and automate manual processes to improve efficiency
• Implement large-scale ETL/ELT pipelines using tools like Apache Spark, PySpark, and Spark SQL (see the sketch below)
• Experience with data ingestion and transformation at an enterprise level
• Strong analytical and problem-solving skills to troubleshoot data-related issues
• Excellent communication skills to work effectively with cross-functional teams
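For context, the pipeline work described above typically looks something like the minimal PySpark sketch below. The file paths, column names, and table name are hypothetical placeholders; the sketch only illustrates the extract-transform-load pattern the role calls for, with Delta chosen as the write format because it is the default table format on Azure Databricks.

```python
# Minimal PySpark ETL sketch (paths, columns, and table names are illustrative).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw CSV files from a hypothetical landing zone.
raw = spark.read.option("header", True).csv("/mnt/landing/orders/*.csv")

# Transform: cast types, drop rows missing a key, and stamp the load date.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
       .withColumn("load_date", F.current_date())
)

# Load: append to a curated table partitioned by load date
# (Delta on Databricks; swap the format for plain Parquet elsewhere).
(orders.write
       .format("delta")
       .mode("append")
       .partitionBy("load_date")
       .saveAsTable("curated.orders"))
```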