

Data Engineer (Big Data + ETL)
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer (Big Data + ETL) on a long-term contract in Research Triangle Park, NC or San Jose, CA. Key skills include Java, Scala, SQL, and experience with Hadoop, Spark, and cloud platforms.
Country: United States
Currency: Unknown
Day rate: -
Date discovered: July 22, 2025
Project duration: Unknown
Location type: On-site
Contract type: W2 Contractor
Security clearance: Unknown
Location detailed: Research Triangle Park, NC
Skills detailed: #Spark (Apache Spark) #ADF (Azure Data Factory) #Python #Databricks #Spring Boot #Azure #DevOps #AWS (Amazon Web Services) #Database Systems #SSIS (SQL Server Integration Services) #Big Data #Java #Data Engineering #Hadoop #Kafka (Apache Kafka) #Scala #Programming #Azure Data Factory #GCP (Google Cloud Platform) #ETL (Extract, Transform, Load) #Microservices #SQL (Structured Query Language) #Airflow #Cloud
Role description
Job Type: Contract
Job Category: IT
Job Description
Role: Data Engineer (Big Data + ETL)
Location: Research Triangle Park, NC / San Jose, CA - Onsite
Long-Term Contract - W2 / C2C
Requirements:
- Proven experience in data engineering, software development, or related roles.
- Proficiency in programming languages and frameworks commonly used in data engineering: Java, Scala, Spring Boot, microservices.
- Strong knowledge of database systems, data modelling techniques, and SQL.
- Experience with big data technologies and frameworks (e.g., Hadoop, Spark, Kafka).
- Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud Platform).
- Excellent problem-solving skills and attention to detail.
- Effective communication and collaboration skills in a team-oriented environment.
- Ability to adapt to evolving technologies and business requirements.
- Proficiency with ETL tools commonly used in data engineering (e.g., SSIS, Databricks, Azure Data Factory).
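For candidates unfamiliar with the term, the ETL pattern referenced throughout this posting can be illustrated with a minimal, self-contained sketch. This example is not part of the role description: all names, the sample data, and the SQLite target are hypothetical stand-ins for the real tools listed above (Spark, SSIS, Databricks, Azure Data Factory).

```python
import csv
import io
import sqlite3

# Hypothetical raw input; in a real pipeline this would come from a
# source system (files, Kafka topic, upstream database, etc.).
RAW_CSV = """id,amount,currency
1,10.50,usd
2,3.25,USD
3,7.00,eur
"""

def extract(text):
    """Extract: parse the raw CSV into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: cast types and normalize currency codes."""
    return [
        (int(r["id"]), float(r["amount"]), r["currency"].upper())
        for r in rows
    ]

def load(records, conn):
    """Load: write cleaned records into a target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS payments "
        "(id INTEGER, amount REAL, currency TEXT)"
    )
    conn.executemany("INSERT INTO payments VALUES (?, ?, ?)", records)
    return conn.execute("SELECT COUNT(*) FROM payments").fetchone()[0]

conn = sqlite3.connect(":memory:")
count = load(transform(extract(RAW_CSV)), conn)
print(count)  # 3 rows loaded
```

The same extract/transform/load structure scales up when the functions are swapped for Spark jobs, Kafka consumers, or Azure Data Factory activities; the orchestration of those stages is what tools like Airflow (also listed above) manage.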
Hashtags: #DataEngineer #BigData #ETL #Hadoop #Spark #Kafka #DataPipelines #SQL #Airflow #CloudDataEngineering #HiringNow #DataJobs #DataEngineeringCareers #PythonForData #AWS #GCP #Azure #TechJobs #AnalyticsEngineering #JoinOurTeam