Big Data Engineer (Hadoop, AWS)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Big Data Engineer (Hadoop, AWS) in Reston, VA, with a 12+ month contract. Requires 10+ years of experience, 7+ years in healthcare (preferably BCBS), and AWS Certified Big Data - Specialty certification. Local candidates only.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 24, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Reston, VA
🧠 - Skills detailed
#Data Access #Batch #Big Data #Data Extraction #Data Modeling #Data Warehouse #Scala #Data Lake #Java #Data Integration #Cloudera #Hadoop #AWS (Amazon Web Services) #Ab Initio #Cloud #NoSQL #Data Management #Database Design #Data Engineering #Automation #ETL (Extract, Transform, Load)
Role description

   • Title: Big Data Engineer (Hadoop, AWS)

   • Location: Reston, VA, Onsite-Hybrid.

   • Local candidates only: Washington DC, Maryland, Virginia, or West Virginia.

   • Job Type: Contract, W2 position.

   • VISA Type: Only USC, GC, H4 EAD, L2. NO H1Bs.

   • Contract Length: 12+ months

   • Hours Per Week: 40

Job Description:

Seeking a Lead Big Data Engineer with experience in the Hadoop and AWS ecosystem within the healthcare industry, preferably BCBS. The Lead Data Engineer is responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premise infrastructure targeting big data and platform data management (relational and NoSQL, distributed and converged), with an emphasis on reliability, automation, and performance. This role will focus on leading the development of solutions and helping transform the company's platforms to deliver data-driven, meaningful insights and value to the company.

ESSENTIAL FUNCTIONS:

  1. 20% Leads the team to design, configure, implement, monitor, and manage all aspects of the Data Integration Framework. Defines and develops Data Integration best practices for a data management environment with optimal performance and reliability.

  2. 20% Develops and maintains infrastructure systems (e.g., data warehouses, data lakes), including data access APIs. Prepares and manipulates data using Hadoop or an equivalent MapReduce platform.

  3. 15% Provides detailed guidance and performs work related to modeling Data Warehouse solutions in the cloud or on-premise. Understands dimensional modeling, de-normalized data structures, OLAP, and data warehousing concepts.

  4. 15% Oversees the delivery of engineering data initiatives and projects. Supports long-term data initiatives as well as ad-hoc analysis and ELT/ETL activities. Creates data collection frameworks for structured and unstructured data. Applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.

  5. 15% Enforces the implementation of best practices for data auditing, scalability, reliability, and application performance. Develops and applies data extraction, transformation, and loading techniques to connect large data sets from a variety of sources.

  6. 10% Interprets data, analyzes results using statistical techniques, and provides ongoing reports. Executes quantitative analyses that translate data into actionable insights. Provides analytical and data-driven decision-making support for key projects. Designs, manages, and conducts quality control procedures for data sets drawn from multiple systems.

  7. 5% Improves data delivery engineering job knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking state-of-the-art practices, and participating in professional societies.

Required Skills:

  1. 10+ years of experience leading database design and ETL development. Experience leading data engineering and cross-functional teams to implement scalable, fine-tuned ETL/ELT solutions (batch and streaming) for optimal performance. Experience developing and updating ETL/ELT scripts.

  2. Hands-on experience with Ab Initio ETL development in the Hadoop and AWS ecosystem, relational database layout, development, and data modeling.

  3. 7+ years of hands-on experience as a Big Data Engineer in the Hadoop and AWS ecosystem in the healthcare industry, preferably BCBS.

  4. Hands-on experience developing applications for batch data loads and data streaming using Cloudera/Hadoop and/or AWS technologies.

  5. Healthcare industry experience required (preferably BCBS).

Licenses/Certifications:

  1. AWS Certified Big Data - Specialty (Must Have)

  2. Cloudera Certified Developer for Apache Hadoop (CCDH) (Must Have)

  3. OCP Java SE 6 Programmer Certification (Good to have)

  4. Must have lead experience, with 10+ years of overall experience.

  5. 7+ years of hands-on experience as a Big Data Engineer in the Hadoop and AWS ecosystem in the healthcare industry, preferably BCBS.

  6. Hands-on experience developing applications for batch data loads and data streaming using Cloudera/Hadoop and/or AWS technologies in the healthcare industry.