Sr. Data / Big Data Engineer - Cloudera

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is a Sr. Data/Big Data Engineer - Cloudera for a contract position (length unspecified) in Reston, VA (mostly remote, 1 day onsite/month). Requires a Master’s degree, 10+ years of experience, AWS and Cloudera certifications, and expertise in ETL/ELT solutions.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 23, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Remote (hybrid: 1 day onsite per month)
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Reston, VA
🧠 - Skills detailed
#Data Engineering #Data Modeling #ETL (Extract, Transform, Load) #Java #Data Integration #NoSQL #Ab Initio #Databases #Python #Data Access #Database Design #Batch #Big Data #Data Lake #Scala #Programming #Hadoop #Data Extraction #Data Warehouse #Leadership #Data Management #Cloudera #SQL (Structured Query Language) #AWS (Amazon Web Services) #Computer Science #Automation #Cloud
Role description

Job Title: Sr. Data Engineer – Data Intelligence

Job Location: Reston, VA (mostly remote; onsite 1 day per month)

Job Type: Contract

Job Summary:

The Lead Data Engineer is responsible for orchestrating, deploying, maintaining, and scaling cloud or on-premises infrastructure for big data and platform data management, including relational and NoSQL databases, distributed systems, and converged platforms, with a strong focus on reliability, automation, and performance. This role leads the development of data solutions, enabling the organization to derive meaningful insights and business value.

Essential Functions:

   • 20% Lead the design, implementation, and management of the Data Integration Framework, ensuring best practices for performance and reliability in data management.

   • 20% Develop and maintain data infrastructure (e.g., data warehouses, data lakes) and data access APIs, leveraging Hadoop or equivalent MapReduce platforms.

   • 15% Provide technical leadership in data modeling for cloud or on-premises Data Warehousing solutions, including dimensional modeling, denormalized data structures, OLAP, and Data Warehousing concepts.

   • 15% Oversee the delivery of data engineering initiatives, supporting long-term data projects, ad hoc analysis, and ETL/ELT activities. Develop frameworks for structured and unstructured data and implement data extraction, transformation, and loading (ETL/ELT) techniques.

   • 15% Enforce best practices for data auditing, scalability, reliability, and performance optimization.

   • 10% Analyze data using statistical techniques, generate insights, and provide decision-making support for key projects. Ensure quality control and integrity of data sets from multiple sources.

   • 5% Continuously enhance technical knowledge through professional development, including educational workshops, networking, and industry benchmarking.

Qualifications:

Education & Experience:

   • Required:

   • Master’s degree in Information Technology, Computer Science, or Business Analytics.

   • 10+ years of experience in database design and ETL development.

   • Proven leadership in data engineering and cross-functional teams, implementing scalable ETL/ELT solutions (batch and streaming) for optimal performance.

   • Hands-on experience with Ab Initio ETL development in Hadoop and AWS ecosystems, relational databases, and data modeling.

   • In lieu of a Master’s degree: An additional 6 years of relevant work experience.

Preferred Skills & Expertise:

   • Expert proficiency in at least one data query or programming language/technology (SQL, NoSQL, Python).

   • Strong knowledge of structured and unstructured database technologies.

   • Ability to quickly learn new technologies and adapt to changing environments.

   • Strong customer service orientation and stakeholder management skills.

   • Proven ability to lead technical teams and drive Big Data and AWS initiatives.

   • Excellent organizational skills with the ability to handle multiple priorities.

   • Outstanding verbal and written communication skills.

Work Environment & Expectations:

   • Must work effectively in a fast-paced environment with changing priorities and deadlines.

   • Strong customer service and communication skills, including handling demanding or challenging stakeholders.

   • Must disclose any debarment or exclusion from working on Federal healthcare programs.

Licenses/Certifications:

   • Required:

   • AWS Certified Big Data – Specialty

   • Cloudera Certified Developer for Apache Hadoop (CCDH)

   • Preferred:

   • OCP Java SE 6 Programmer Certification