

Techgene Solutions
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are unspecified. Key skills include core Java, Apache Spark, Azure Databricks, ETL processes, and SQL and NoSQL databases. The role is fully remote.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 14, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Version Control #Apache Spark #GitHub #Visualization #Azure Cosmos DB #Data Pipeline #"ETL (Extract, Transform, Load)" #Azure Data Factory #JUnit #Microsoft Azure #NoSQL #Hadoop #Azure Databricks #Programming #Databricks #Data Framework #SQL (Structured Query Language) #ADF (Azure Data Factory) #Databases #Maven #MySQL #Delta Lake #AWS (Amazon Web Services) #Unit Testing #Java #Big Data #Spark (Apache Spark) #Data Engineering #PySpark #Azure #Cloud #Database Management
Role description
Title: Data/Java Developer
Location: Remote
Job Description:
• High proficiency in core Java, including OOP, Collections, Multithreading, Data Structures, and Exception Handling
• Proficiency with the JUnit testing framework, the Maven build tool, GitHub version control, and the IntelliJ IDE
• Proficient in Big Data Frameworks such as Apache Spark (preferred) or Hadoop
• Knowledgeable in Azure Databricks, Delta Lake, Spark Core, Azure Data Factory (ADF) and Unity Catalog
• Knowledgeable in analyzing data, finding data patterns, data visualization
• Ability to ingest and transform data using PySpark in Azure Databricks
• Understanding of key Data Warehousing and ETL/ELT processes, such as data pipelines and database management
• Ability to design, build, and maintain robust Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) pipelines
• Proficient in SQL and NoSQL databases; MySQL and Azure Cosmos DB are preferred
• Knowledgeable in object-oriented programming (OOP) concepts, design patterns, and general software architecture
• Knowledgeable in cloud platforms such as Microsoft Azure (preferred) or AWS
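The ETL/ELT pipeline bullets above can be sketched as a minimal, framework-free pipeline in plain Python. In the actual role this logic would run as PySpark in Azure Databricks; the record fields, cleaning rules, and in-memory sink below are illustrative assumptions, not details from the posting:

```python
# Minimal Extract -> Transform -> Load sketch. The source records, field
# names, and validation rules are illustrative assumptions only.

def extract():
    """Extract: pull raw records (an in-memory stand-in for a source system)."""
    return [
        {"id": "1", "amount": " 10.50 ", "region": "us-east"},
        {"id": "2", "amount": "7.25", "region": "US-EAST"},
        {"id": "3", "amount": "bad", "region": "eu-west"},  # malformed row
    ]

def transform(rows):
    """Transform: normalize types and casing, dropping rows that fail validation."""
    cleaned = []
    for row in rows:
        try:
            cleaned.append({
                "id": int(row["id"]),
                "amount": float(row["amount"].strip()),
                "region": row["region"].lower(),
            })
        except ValueError:
            continue  # skip records that cannot be parsed
    return cleaned

def load(rows, sink):
    """Load: append validated rows to the target store (a list stand-in here)."""
    sink.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded)  # 2 of the 3 raw rows survive validation
```

The same extract/transform/load shape carries over to PySpark, where `extract` becomes a `spark.read` call, `transform` a chain of DataFrame operations, and `load` a write to Delta Lake.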
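For the SQL proficiency bullet, here is a small self-contained example using Python's built-in sqlite3 module as a stand-in for MySQL; the `orders` table and its columns are made up for illustration:

```python
import sqlite3

# In-memory SQLite database standing in for MySQL; the orders table
# and its columns are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("us-east", 10.5), ("us-east", 7.25), ("eu-west", 3.0)],
)

# Aggregate revenue per region -- a typical data-engineering query.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('eu-west', 3.0), ('us-east', 17.75)]
conn.close()
```

The same `GROUP BY` aggregation translates directly to MySQL or to Spark SQL on Databricks.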
