

Backend Data Engineer - 3 Month Assignment
Featured Role | Apply directly with Data Freelance Hub
This role is for a Backend Data Engineer on a 3-month remote contract, with annualized pay of $86,311.33 to $103,944.82. Key skills include proficiency in Python, Java, or Scala, plus experience with SQL, NoSQL, and big data technologies such as Spark and Kafka.
Country: United States
Currency: $ USD
Day rate: $472.47
Date discovered: July 29, 2025
Project duration: 3 to 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed: #Data Science #DevOps #Data Pipeline #Data Quality #Datasets #Automation #Java #Kafka (Apache Kafka) #Big Data #AWS (Amazon Web Services) #Azure #Python #Data Processing #Storage #Hadoop #Spark (Apache Spark) #Scala #Databases #Monitoring #ETL (Extract, Transform, Load) #Data Engineering #Cloud #Data Accuracy #Programming #NoSQL #Data Storage #SQL (Structured Query Language)
Role description
Core Responsibilities:
Building and maintaining APIs:
Backend Data Engineers create the interfaces that allow front-end applications to interact with data and services.
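By way of illustration only, a minimal sketch in Python of what such an interface can look like, using Flask; the route, sample data, and port are assumptions for the example, not details of this role.

from flask import Flask, jsonify

app = Flask(__name__)

# Hypothetical in-memory store standing in for a real database layer.
METRICS = {42: {"events": 1280, "last_seen": "2025-07-28"}}

@app.route("/metrics/<int:user_id>")
def get_metrics(user_id):
    # Return the metrics record for one user, or a 404 if none exists.
    record = METRICS.get(user_id)
    if record is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(record)

if __name__ == "__main__":
    app.run(port=8000)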
Designing and managing databases:
They work with both relational (SQL) and NoSQL databases, optimizing for performance and scalability.
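A small, purely illustrative sketch of the query-optimization side of that work, using Python's built-in sqlite3 module; the events table, its columns, and the index are hypothetical.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, event_type TEXT, created_at TEXT)")
conn.execute("INSERT INTO events VALUES (42, 'login', '2025-07-28')")

# An index on the columns the hot query filters on avoids a full table scan.
conn.execute("CREATE INDEX idx_events_user_type ON events (user_id, event_type)")

row = conn.execute(
    "SELECT COUNT(*) FROM events WHERE user_id = ? AND event_type = ?",
    (42, "login"),
).fetchone()
print(row[0])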
Developing data pipelines:
They design and implement systems for ingesting, transforming, and storing large volumes of data.
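A toy ingest-transform-load pipeline in plain Python, shown only to illustrate the pattern; the file names and record shape are invented for the example.

import csv
import json

# Write a tiny sample source file so the sketch is self-contained.
with open("raw_orders.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([["user_id", "amount"], ["1", "9.50"], ["", "3.00"]])

def ingest(path):
    # Stream raw rows from the CSV source instead of loading it all at once.
    with open(path, newline="") as f:
        yield from csv.DictReader(f)

def transform(rows):
    # Normalize types and drop rows missing the required user_id field.
    for row in rows:
        if row.get("user_id"):
            yield {"user_id": int(row["user_id"]), "amount": float(row["amount"])}

def load(records, path):
    # Persist the cleaned records as newline-delimited JSON.
    with open(path, "w") as f:
        for record in records:
            f.write(json.dumps(record) + "\n")

load(transform(ingest("raw_orders.csv")), "clean_orders.jsonl")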
Ensuring data quality and reliability:
They focus on data validation, cleaning, and monitoring to maintain data accuracy and consistency.
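A minimal sketch of that validation idea: a check function that flags bad records before they reach storage. The rules shown are examples only, not requirements of this role.

def validate(record):
    # Collect every rule violation instead of failing on the first one,
    # so monitoring can report which checks fail most often.
    errors = []
    if record.get("user_id") is None:
        errors.append("missing user_id")
    if not isinstance(record.get("amount"), (int, float)):
        errors.append("amount is not numeric")
    elif record["amount"] < 0:
        errors.append("amount is negative")
    return errors

batch = [{"user_id": 1, "amount": 9.5}, {"user_id": None, "amount": -3}]
for record in batch:
    problems = validate(record)
    if problems:
        print(f"rejected {record}: {problems}")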
Working with big data technologies:
They often utilize tools like Spark, Hadoop, and Kafka for processing and analyzing large datasets.
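For the Spark part specifically, a minimal PySpark aggregation sketch, assuming pyspark is installed; the event data and column names are made up for illustration.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("event-counts").getOrCreate()

# A tiny in-memory dataset standing in for a large event table.
events = spark.createDataFrame(
    [("login", "2025-07-28"), ("login", "2025-07-28"), ("purchase", "2025-07-29")],
    ["event_type", "created_at"],
)

# Count events per type per day, a typical large-scale aggregation.
daily_counts = (
    events
    .withColumn("day", F.to_date("created_at"))
    .groupBy("day", "event_type")
    .count()
)

daily_counts.show()
spark.stop()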
Collaborating with other teams:
They work closely with front-end developers, data scientists, and other engineers to deliver data-powered solutions.
Key Skills:
Strong programming skills: Proficiency in languages like Python, Java, or Scala is essential.
Database expertise: Deep understanding of SQL and NoSQL databases, including query optimization.
Experience with data processing frameworks: Familiarity with tools like Spark, Hadoop, and Kafka for big data processing.
Cloud computing knowledge: Experience with cloud platforms like AWS, Azure, or Google Cloud for data storage and processing.
Understanding of DevOps principles: Familiarity with CI/CD pipelines and automation for data pipelines (an illustrative example of such automation follows this list).
Problem-solving and analytical skills: The ability to troubleshoot complex data issues and design scalable solutions.
Communication and collaboration skills: The ability to work effectively in a team and communicate technical concepts to both technical and non-technical audiences.
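As a purely illustrative example of automation for data pipelines, here is a pytest-style unit test of the kind a CI pipeline could run on every change; the transform function it exercises is the hypothetical one from the pipeline sketch above.

# test_transform.py - run by the CI pipeline on every commit, e.g. via `pytest`.

def transform(rows):
    # Same toy transform as in the pipeline sketch above.
    for row in rows:
        if row.get("user_id"):
            yield {"user_id": int(row["user_id"]), "amount": float(row["amount"])}

def test_transform_drops_rows_without_user_id():
    raw = [{"user_id": "1", "amount": "9.50"}, {"user_id": "", "amount": "3.00"}]
    assert list(transform(raw)) == [{"user_id": 1, "amount": 9.5}]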
Job Types: Full-time, Contract
Pay: $86,311.33 - $103,944.82 per year
Work Location: Remote