CDI Solutions

Senior Azure Data Engineer (W2 Only & Local Remote)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Azure Data Engineer with 10+ years of experience, primarily remote (local to NC), offering competitive W2 pay. Key skills include Azure SQL Server, Azure Data Lakes, and Data Factory. A Bachelor's degree in a related field is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
Unknown
-
πŸ—“οΈ - Date
January 14, 2026
πŸ•’ - Duration
Unknown
-
🏝️ - Location
Remote
-
πŸ“„ - Contract
W2 Contractor
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
North Carolina, United States
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #ADF (Azure Data Factory) #Scala #Databricks #Azure Data Factory #Redshift #BigQuery #Database Design #Programming #Microsoft Power BI #SQL Server #Data Management #BI (Business Intelligence) #Azure cloud #Computer Science #Consulting #DataOps #Data Engineering #Data Modeling #Data Science #Cloud #Data Lake #SQL (Structured Query Language) #Apache Kafka #Kafka (Apache Kafka) #Python #Data Governance #Tableau #Data Lineage #Azure #ETL (Extract, Transform, Load) #Looker #NoSQL #Microsoft Azure #Azure SQL #Snowflake #AWS Kinesis #Synapse #Java #Metadata
Role description
Hello, one of my clients is hiring for a Senior Azure Data Engineer, local to NC.

Role: Senior Azure Data Engineer - local to NC - 10+ years of experience only
Location: Remote - local to NC (anywhere in NC is okay)
Pay: on my client's W2 directly
• Mostly remote, with 1 day onsite per month
• Top Skills:
1. Extensive Azure SQL Server experience: optimizing tables, creating indexes, using analytics to set access, script creation
2. Azure Data Lakes: converting data from one source to another
3. Azure Data Factory: moving data between sources using ETL
4. Data Modeling
5. Experience working in a greenfield environment

Project: working on multiple projects covering SQL Server schema optimization, end-to-end data modeling and analytics for a greenfield environment, ETL, and extracting data from documents for reports.

Qualifications

Required
• Bachelor's degree in Computer Science, Engineering, Data Science, or a related quantitative field, or equivalent practical experience.
• 5+ years of progressive experience in data engineering, with a strong portfolio demonstrating expertise in building and managing large-scale data solutions.
• Proficiency in at least one major programming language used for data engineering (e.g., Python, Scala, Java).
• Extensive experience with modern data warehousing and/or data lake technologies (e.g., Snowflake, Databricks, Google BigQuery, Azure Synapse/Data Lake, AWS Redshift).
• Demonstrated experience designing and implementing ETL/ELT pipelines using various tools and frameworks.
• Strong understanding of data modeling principles, database design (relational and NoSQL), and SQL optimization.
• Deep experience with Microsoft Azure cloud services, including data-related services.
• Excellent problem-solving, analytical, and critical thinking skills.
• Strong written and verbal communication skills, with the ability to articulate technical concepts to diverse audiences.
• Proven ability to be self-driven, adaptable, and manage multiple priorities in a fast-paced environment.
• Prior experience in a consulting or client-facing role.

Preferred
• Familiarity with streaming data technologies (e.g., Apache Kafka, AWS Kinesis, Google Pub/Sub).
• Experience implementing DataOps principles and CI/CD pipelines for data solutions.
• Knowledge of data governance, metadata management, and data lineage tools.
• Experience with business intelligence tools (e.g., Tableau, Power BI, Looker).