

Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 12-month remote contract, offering competitive pay. Key skills include Azure stack, SQL Server, Snowflake, and programming in Python, SQL, C#, and Java. A Bachelor’s in Computer Science or related field is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 6, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Informatica #Scala #SQL Server #SQL (Structured Query Language) #Databases #Azure #C# #Security #Snowflake #Cloud #ETL (Extract, Transform, Load) #MDM (Master Data Management) #Computer Science #Data Pipeline #AI (Artificial Intelligence) #Data Engineering #Programming #Python #Databricks #Synapse #Java #Informatica Cloud #Data Quality
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, Unique System Skills LLC, is seeking the following. Apply via Dice today!
We have an opening for a remote Data Engineer. Kindly apply with your updated resume if you are interested. Please find the job details below:
Position: Data Engineer
Location: Remote
Duration: 12-month contract position
Description:
• Design, build, and manage scalable data pipelines and infrastructure
• Integrate data from various sources including APIs, databases, and external systems
• Transform raw data through enrichment, aggregation, and filtering techniques (a minimal illustrative sketch follows this listing)
• Ensure data quality, security, governance, and consistency across pipelines
• Collaborate with architects, analysts, AI engineers, and other stakeholders
• Monitor performance and optimize systems for speed and efficiency
• Lead innovation efforts and take ownership of delivering high-value data assets
• Translate complex data requirements into actionable engineering tasks
Required skills and qualifications:
• Azure stack (Data Factory, Databricks, Synapse, CosmosDB, HDInsight)
• SQL Server (IaaS/PaaS)
• Snowflake, Informatica cloud tools (CD CIH, DIH, MDM, DQ)
• Programming: Python, SQL, C#, Java
• Strong communication skills with both technical and non-technical audiences
• Bachelor's in Computer Science, Information Systems, or similar
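To give a concrete flavor of the enrich/filter/aggregate work described in the responsibilities above, here is a minimal Python/pandas sketch. It is not part of the posting: the dataset, column names, and FX-rate lookup are hypothetical assumptions chosen only to make the example self-contained and runnable.

```python
# Illustrative only: a small enrich -> filter -> aggregate step of the kind the
# role describes. All names (orders, amount, currency, fx_rates) are hypothetical.
import pandas as pd


def transform(raw: pd.DataFrame, fx_rates: dict[str, float]) -> pd.DataFrame:
    """Enrich raw order records, drop invalid rows, and aggregate daily revenue."""
    df = raw.copy()
    # Enrichment: convert each order amount to USD via a rate lookup.
    df["amount_usd"] = df["amount"] * df["currency"].map(fx_rates)
    # Filtering: keep only rows that converted successfully and have positive value.
    df = df[df["amount_usd"].notna() & (df["amount_usd"] > 0)]
    # Aggregation: daily revenue per customer.
    return (
        df.groupby(["order_date", "customer_id"], as_index=False)["amount_usd"]
        .sum()
        .rename(columns={"amount_usd": "daily_revenue_usd"})
    )


if __name__ == "__main__":
    raw = pd.DataFrame(
        {
            "order_date": ["2025-08-01", "2025-08-01", "2025-08-02"],
            "customer_id": ["c1", "c1", "c2"],
            "amount": [100.0, -5.0, 80.0],
            "currency": ["USD", "USD", "EUR"],
        }
    )
    print(transform(raw, {"USD": 1.0, "EUR": 1.09}))
```

In practice, a step like this would typically run inside the Azure services listed above (for example Data Factory or Databricks) against sources such as SQL Server or Snowflake, rather than as a standalone script.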