

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in New Jersey (on-site); the contract length and pay rate are unknown. Key skills required include Azure Data Factory, Databricks, Python, PySpark, and data modeling. Experience with Snowflake is essential.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
July 16, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
New Jersey, United States
Skills detailed
#R #Data Processing #ADF (Azure Data Factory) #Physical Data Model #Azure #ETL (Extract, Transform, Load) #Databricks #Data Warehouse #Data Pipeline #DevOps #Scala #Azure DevOps #Cloud #Spark (Apache Spark) #PySpark #Azure Data Factory #Automation #Monitoring #Snowflake #Data Modeling #ERWin #Deployment #Data Engineering #Data Quality #Python
Role description
Role: Data Engineer
Location: NJ (Onsite)
Responsibilities:
• Design and implement scalable data pipelines using Azure Data Factory and Databricks, integrating with external APIs and cloud services.
• Leverage Python and PySpark for distributed data processing, transformation, and automation across cloud platforms (a minimal sketch follows this list).
• Implement and manage Unity Catalog for secure, governed access to data assets within Databricks (see the grants example below).
• Build and optimize cloud-based data warehouses using Snowflake and Databricks, ensuring performance and cost efficiency (see the Snowflake load example below).
• Ensure data quality and integrity through validation, cleansing, and transformation operations.
• Collaborate with DevOps teams to develop CI/CD pipelines using Azure DevOps for automated deployment and monitoring.
• Translate business requirements into conceptual, logical, and physical data models to support enterprise data platforms.
• Apply data modeling and dimensional modeling techniques using tools like Erwin, ensuring scalability and performance.
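To make the PySpark responsibility concrete, here is a minimal, hypothetical sketch of a Databricks-style pipeline step: read raw records, cleanse and transform them, run a simple data-quality gate, and persist a curated table. All paths, column names, and table names are illustrative assumptions, not details from this posting.

```python
from pyspark.sql import SparkSession, functions as F

# Hypothetical example; paths, columns, and table names are assumptions.
spark = SparkSession.builder.appName("orders-curation").getOrCreate()

# Ingest raw data (on Databricks this might land via an ADF-triggered job).
raw = spark.read.json("/mnt/raw/orders/")

# Cleanse and transform: deduplicate, normalize types, derive a date column.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("order_date", F.to_date("order_ts"))
)

# Simple data-quality gate: fail fast if required fields are missing.
bad_rows = curated.filter(F.col("order_id").isNull() | F.col("amount").isNull()).count()
if bad_rows > 0:
    raise ValueError(f"Data quality check failed: {bad_rows} rows missing order_id/amount")

# Persist the curated table (Delta is the usual Databricks default).
curated.write.format("delta").mode("overwrite").saveAsTable("analytics.sales.orders_curated")
```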
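For the Unity Catalog responsibility, governed access typically comes down to grants on catalog objects. A short sketch under assumed names (the catalog `analytics`, schema `sales`, and groups `data_engineers`/`data_analysts` are hypothetical, and Unity Catalog must be enabled on the workspace):

```python
# Hypothetical Unity Catalog grants, continuing from the sketch above
# (`spark` is predefined in Databricks notebooks and jobs).
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data_engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA analytics.sales TO `data_engineers`")
spark.sql("GRANT SELECT ON TABLE analytics.sales.orders_curated TO `data_analysts`")
```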
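On the Snowflake side, loading a curated DataFrame from Databricks usually goes through the Snowflake Spark connector. Again a hedged sketch: the connector must be available on the cluster, and every option value below is a placeholder rather than a real endpoint or credential.

```python
# Hypothetical Snowflake load via the Snowflake Spark connector ("snowflake" format).
# All option values are placeholders, not real credentials or endpoints.
sf_options = {
    "sfURL": "myaccount.snowflakecomputing.com",
    "sfUser": "etl_user",
    "sfPassword": "********",  # in practice, read from a secret scope, not a literal
    "sfDatabase": "ANALYTICS",
    "sfSchema": "PUBLIC",
    "sfWarehouse": "ETL_WH",
}

(
    curated.write.format("snowflake")   # `curated` is the DataFrame from the first sketch
           .options(**sf_options)
           .option("dbtable", "ORDERS_CURATED")
           .mode("overwrite")
           .save()
)
```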
Regards,
Praveen Kumar
Talent Acquisition Group – Strategic Recruitment Manager
praveen.r@themesoft.com | Themesoft Inc