

Take2 Consulting, LLC
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown." Requires a Bachelor's degree in a STEM field, 10+ years of experience, Public Trust clearance, and expertise in data warehouse architecture, ETL processes, and AWS.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
October 8, 2025
🕒 - Duration
Unknown
🏝️ - Location
Remote
📄 - Contract
Unknown
🔒 - Security
Yes
📍 - Location detailed
Virginia, United States
🧠 - Skills detailed
#AWS (Amazon Web Services) #Data Management #MDM (Master Data Management) #Snowflake #Mathematics #Security #Data Warehouse #Slowly Changing Dimensions #Amazon Redshift #Data Modeling #Amazon RDS (Amazon Relational Database Service) #Data Engineering #ETL (Extract, Transform, Load) #Data Cleansing #Visualization #Data Pipeline #Data Lake #Data Analysis #Computer Science #Tableau #Metadata #Agile #Databricks #Oracle
Role description
Overview
This position involves developing and maintaining data infrastructure solutions for large-scale data warehouses, data lakes, and reporting systems. The role requires expertise in designing, building, and optimizing data pipelines and architectures that support reporting, analytics, and decision-making processes within an organization.
Education & Certification Requirements
Candidates should possess at least a Bachelor's degree in a STEM field such as Computer Science, Engineering, or Mathematics. A minimum of 10 years of relevant experience is preferred; additional experience may be substituted for the degree requirement.
Clearance Requirements
Public Trust clearance required.
Onsite Requirements
This role is a remote opportunity.
Responsibilities
• Design, develop, and implement data warehouse and data lake architectures to support reporting and analytics.
• Create and optimize data pipelines for structured, semi-structured, and unstructured data.
• Develop dimensional models and aggregate facts for reporting purposes.
• Manage slowly changing dimensions and master data management processes (see the sketch after this list).
• Design ETL processes, including data cleansing and transformation.
• Develop report architecture and metadata models for reporting tools.
• Utilize ETL and data analytics tools to support reporting and data analysis.
• Collaborate within an Agile environment to deliver data solutions.
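To make the dimension-management and cleansing bullets concrete, here is a minimal sketch of a Type 2 slowly changing dimension update, assuming pandas DataFrames stand in for warehouse tables. All table and column names (customer_id, address, valid_from, valid_to, is_current) are illustrative placeholders, not details from the posting.
```python
# Minimal Type 2 SCD sketch; column names are illustrative placeholders.
import pandas as pd

def apply_scd2(dim, incoming, key, tracked, load_date):
    """Expire changed current rows and append new versions (Type 2 SCD).
    Handles changed keys only; brand-new keys are omitted for brevity."""
    current = dim[dim["is_current"]]
    merged = current.merge(incoming, on=key, suffixes=("", "_new"))

    # Flag rows whose tracked attributes differ from the incoming feed.
    changed = pd.Series(False, index=merged.index)
    for col in tracked:
        changed |= merged[col] != merged[f"{col}_new"]
    changed_keys = merged.loc[changed, key]

    # Close out the old versions of changed rows...
    expire = dim[key].isin(changed_keys) & dim["is_current"]
    dim.loc[expire, "valid_to"] = load_date
    dim.loc[expire, "is_current"] = False

    # ...and append the incoming versions as the new current rows.
    new_rows = incoming[incoming[key].isin(changed_keys)].copy()
    new_rows["valid_from"] = load_date
    new_rows["valid_to"] = None
    new_rows["is_current"] = True
    return pd.concat([dim, new_rows], ignore_index=True)

# Usage: a tiny customer dimension with one changed address.
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "address": ["12 Oak St", "9 Elm Ave"],
    "valid_from": ["2024-01-01", "2024-01-01"],
    "valid_to": [None, None],
    "is_current": [True, True],
})
incoming = pd.DataFrame({"customer_id": [1, 2],
                         "address": [" 44 Pine Rd ", "9 Elm Ave"]})
incoming["address"] = incoming["address"].str.strip()  # a token cleansing step
dim = apply_scd2(dim, incoming, "customer_id", ["address"], "2025-10-08")
```
In a production pipeline the same expire-and-append logic would typically run as a MERGE statement in the warehouse rather than in pandas, but the versioning idea is identical.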
Qualifications
• 3+ years of experience with Data Warehouse / Data Lake architecture, design, and development.
• Proven experience in designing ETL architectures and implementing data cleansing.
• Strong knowledge of dimensional modeling and aggregation techniques (illustrated in the sketch after this list).
• Extensive experience with ETL tools and data analytics software.
• Hands-on experience with AWS, Oracle, and Amazon RDS.
• Proficiency in report development and metadata modeling.
• Ability to introduce hardware or software solutions into existing environments with minimal disruption.
• Knowledge of Agile methodologies or the software development lifecycle (SDLC).
• Ability to obtain a security clearance.
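As a companion to the dimensional-modeling and aggregation qualification above, here is a hedged sketch of rolling a detail-grain fact table up into an aggregate fact, again using pandas; the fact_sales schema is a made-up example, not taken from the posting.
```python
# Rolling a detail-grain fact up to a daily aggregate fact.
# Table and column names are hypothetical.
import pandas as pd

fact_sales = pd.DataFrame({
    "date_key":    [20251001, 20251001, 20251002],
    "product_key": [10, 11, 10],
    "quantity":    [2, 1, 5],
    "revenue":     [19.98, 5.49, 49.95],
})

# One row per (date, product): the grain of the aggregate fact.
agg_daily_sales = (
    fact_sales
    .groupby(["date_key", "product_key"], as_index=False)
    .agg(total_quantity=("quantity", "sum"),
         total_revenue=("revenue", "sum"))
)
```
In a warehouse this rollup would usually be materialized as its own table so reporting tools hit the pre-aggregated grain instead of scanning detail rows.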
Desired Skills
• Experience with Snowflake and/or Databricks.
• Familiarity with Amazon Redshift (see the load sketch below).
• Proficiency in Tableau for data visualization.
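Tying the AWS and Redshift items together: the standard bulk-load pattern on Redshift is a COPY from S3. The sketch below assumes psycopg2 for connectivity; the host, credentials, IAM role ARN, bucket path, and target table are all placeholders, not values from the posting.
```python
# Hedged sketch: bulk-loading S3 data into Amazon Redshift with COPY.
# All identifiers below (host, credentials, role ARN, paths) are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="not-a-real-password",
)
with conn, conn.cursor() as cur:
    cur.execute("""
        COPY reporting.fact_sales                       -- hypothetical table
        FROM 's3://example-bucket/sales/dt=2025-10-08/' -- hypothetical path
        IAM_ROLE 'arn:aws:iam::123456789012:role/RedshiftCopyRole'
        FORMAT AS PARQUET;
    """)
conn.close()
```
COPY parallelizes the load across cluster slices, which is why it is preferred over row-by-row INSERTs for warehouse-scale feeds.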