

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This is a Data Engineer position on a 6-month contract; the pay rate is not stated. Remote work is available. The role requires 10+ years of experience with Spark, Spring Batch, Java, and cloud solutions such as AWS and Snowflake.
- Country: United States
- Currency: Unknown
- Day rate: -
- Date discovered: July 10, 2025
- Project duration: More than 6 months
- Location type: Unknown
- Contract type: Fixed Term
- Security clearance: Unknown
- Location detailed: Durham, NC 27709
- Skills detailed: #Data Engineering #Data Aggregation #NoSQL #REST (Representational State Transfer) #Datasets #ETL (Extract, Transform, Load) #Public Cloud #Batch #Scala #Code Reviews #Spark (Apache Spark) #Programming #Data Lake #Snowflake #Java #Big Data #Cloud #API (Application Programming Interface) #SQL (Structured Query Language) #AWS (Amazon Web Services) #Computer Science #Data Management #Storage #Apache Spark #Data Storage
Role description
6-Month Contract

The Role
We are seeking a highly motivated Full Stack Engineer to join the Data Aggregation team at Fidelity Brokerage. Data Aggregation is a growing area, and we are looking for a skilled engineer to drive the design and development of industry-leading, external-facing API solutions. These comprehensive API and data solutions will bring together retail, clearing, and custody capabilities so that external fintech partners can offer financial goal planning, investment advice, and financial projection features, allowing us to better serve our clients and partner with them more efficiently to accomplish their financial objectives.
The Expertise You Have
- Bachelor's degree in Computer Science, Information Systems, or a related field
- Proven track record in data engineering
- 10+ years' experience developing Spark or Spring Batch services for data movement
- Experience scheduling, monitoring, and debugging ETL Spring Batch and Spark batch jobs
- Hands-on experience with Java clients for consuming REST and SOAP APIs, and with Scala Spark batch applications
- Experience developing, testing, deploying, and maintaining ETL batch jobs using Spring Batch and Apache Spark/EMR
- Experience using the Apache Spark cluster-computing framework to process big data and write to NoSQL databases, e.g. Cassandra or Yugabyte
- Experience with cloud-based data warehousing and data lake solutions such as Snowflake
- Experience with data storage and data management for large datasets in formats such as Parquet and HDF5
- Proven experience building and deploying software solutions using public cloud provider services such as AWS
The Skills You Bring
- Champions innovative technology solutions to resolve sophisticated business problems
- Works across groups to find opportunities for organization-wide technology initiatives
- Brings external information, ideas, and expertise back to the team
- Good understanding of the software development process, including analysis, design, coding, system and user testing, problem resolution, and planning
- Identifies creative ways to drive desired outcomes and promotes a culture of innovation by example
- Collaborates with peers daily through code reviews, pair programming, and interactive discussions
- Comfortable working across multiple squads and adaptable to change
- Embraces a customer-first mentality and enjoys developing user-friendly, internet-facing web applications that solve real-life problems
- Passionate about outstanding software engineering practices, always looking to improve engineering skills and industry knowledge
- Brings a data-driven, collaborative approach to decision making, both in day-to-day work and in strategic trade-offs