

W2 Contract || Title: Azure Data Engineer, 100% Remote || USC & GC Only on W2 || 12+ Years Exp.
This role is for an Azure Data Engineer with 12+ years of experience, offering a 6-12+ month W2 contract, 100% remote. Key skills include Azure, ADF, Databricks, and T-SQL. USC/GC candidates only; strong communication is essential.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 21, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Kansas City, MO
Skills detailed: #Indexing #Database Design #DBA (Database Administrator) #Databases #Data Engineering #ADF (Azure Data Factory) #Azure #Databricks #SQL (Structured Query Language) #Microsoft SQL #Data Warehouse #ETL (Extract, Transform, Load) #SQL Queries #SQL Server #Database Systems #Data Pipeline #Visualization #Datasets #MS SQL (Microsoft SQL Server) #Microsoft SQL Server #System Testing
Role description
Note: Candidates must work on W2; USC and GC visa holders only.
Title: Azure Data Engineer (12+ Years Experience)
Location: Kansas City, MO (100% Remote)
Duration: 6-12+ Months (W2 Contract)
Interview: Video
Visa: USC/GC (must work on our W2; strong communication required)
Submission Notes
Make sure candidates communicate exceptionally well when speaking with my business partner over the video call, because he will be recording the call and sharing it with the hiring manager.
Do not send profiles of candidates who are not comfortable doing a 15-20 minute video call with my business partner.
Communication must be flawless
Resume should not be more than 6 pages.
Must have a valid LinkedIn profile with a profile picture and a good number of connections, created before 2020.
Job Description
Must have strong Azure, ADF, and Databricks experience.
The purpose of this position is to perform data development functions, including designing new or enhancing existing enterprise database systems; maintaining and/or developing critical data processes; unit and system testing; and support and help desk tasks. It also requires defining and adopting best practices for each data development function as well as for visualization and ETL processes. The position is also responsible for architecting ETL functions between a multitude of relational databases and external data files.
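For illustration only, here is a minimal T-SQL sketch of the kind of ETL step described above: loading an external data file into a staging table and merging it into a target table. The table names, file path, and columns are hypothetical and are not from the posting.

    -- Hypothetical example: load an external CSV into a staging table,
    -- then merge it into the target table.
    BULK INSERT dbo.StageCustomer            -- staging table (hypothetical)
    FROM 'C:\data\customers.csv'             -- external data file (hypothetical path)
    WITH (FIRSTROW = 2, FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

    MERGE dbo.DimCustomer AS tgt             -- target table (hypothetical)
    USING dbo.StageCustomer AS src
          ON tgt.CustomerID = src.CustomerID
    WHEN MATCHED THEN
        UPDATE SET tgt.CustomerName = src.CustomerName,
                   tgt.Email        = src.Email
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerID, CustomerName, Email)
        VALUES (src.CustomerID, src.CustomerName, src.Email);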
Essential Duties And Responsibilities
• Work with a highly dynamic team focused on Digital Transformation.
• Understand the domain and business processes to implement successful data pipelines.
• Provide work status and coordinate with Data Engineers.
• Manage customer deliverables and regularly report status via weekly/monthly reviews.
• Design, develop, and maintain ETL processes as well as stored procedures, functions, and views.
• Program in T-SQL with relational databases, including currently supported versions of Microsoft SQL Server.
• Write high-performance SQL queries using joins, CROSS APPLY, aggregate queries, MERGE, and PIVOT (see the query sketch after this list).
• Design normalized database tables with proper indexing and constraints.
• Perform SQL query tuning and performance optimization on complex and inefficient queries.
• Provide guidance on the appropriate use of table variables, temporary tables, and CTEs when dealing with large datasets.
• Collaborate with DBAs on database design and performance enhancements.
• Lead all phases of the software development life cycle in a team environment.
• Debug existing code and troubleshoot issues.
• Design and provide a framework for maintaining the existing data warehouse for reporting and data analytics.
• Follow best practices to design, develop, test, and document ETL processes.
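As a hedged illustration of the query and tuning work listed above, the sketch below uses a CTE with an aggregate join and a supporting nonclustered index. All table, column, and index names are hypothetical and are not taken from the posting.

    -- Hypothetical example: summarize order totals per customer with a CTE,
    -- then join back to the customer table for reporting.
    ;WITH OrderTotals AS (
        SELECT o.CustomerID,
               SUM(o.OrderAmount) AS TotalAmount,
               COUNT(*)           AS OrderCount
        FROM dbo.SalesOrder AS o                 -- hypothetical table
        WHERE o.OrderDate >= DATEADD(YEAR, -1, GETDATE())
        GROUP BY o.CustomerID
    )
    SELECT c.CustomerID,
           c.CustomerName,
           t.TotalAmount,
           t.OrderCount
    FROM dbo.DimCustomer AS c                    -- hypothetical table
    JOIN OrderTotals     AS t
         ON t.CustomerID = c.CustomerID
    ORDER BY t.TotalAmount DESC;

    -- A covering index that supports the date filter and the aggregation.
    CREATE NONCLUSTERED INDEX IX_SalesOrder_OrderDate_CustomerID
        ON dbo.SalesOrder (OrderDate, CustomerID)
        INCLUDE (OrderAmount);

For very large datasets, a temporary table (which gets its own statistics) can sometimes outperform a CTE or table variable; choosing between them is the kind of judgment the guidance bullet above refers to.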