Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 9, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Indiana, United States
-
🧠 - Skills detailed
#Data Warehouse #Scala #SQL (Structured Query Language) #Data Modeling #Python #Data Manipulation #Data Lake #Big Data #dbt (data build tool) #ETL (Extract, Transform, Load) #GCP (Google Cloud Platform) #Security #Data Engineering #Cloud #SQL Queries #Data Quality #Data Pipeline
Role description
Role: Sr. Data Engineer
Location: Indiana, United States
Duration: Long-Term Contract

About the Role
We are seeking a highly skilled Senior Data Engineer with extensive experience in Big Data technologies to join our team. The ideal candidate will have a proven track record of designing, developing, and maintaining large-scale data solutions (100+ TB) and driving modern data engineering practices.

Key Responsibilities
• Design, build, and maintain scalable and efficient data pipelines and ETL processes.
• Develop and optimize SQL queries for large-scale data manipulation.
• Build and maintain data models, data dictionaries, and ERDs for enterprise data systems.
• Collaborate with cross-functional teams to implement data solutions on Google Cloud Platform (GCP).
• Leverage dbt (Data Build Tool) to transform and model data.
• Ensure data quality, governance, and security best practices across the data ecosystem.
• Stay up to date with modern data technologies and recommend improvements to existing data infrastructure.

Required Qualifications
• 12+ years of hands-on experience as a Data Engineer in large-scale data environments.
• Strong proficiency in SQL and Python.
• Expertise in designing and maintaining ERDs (Entity Relationship Diagrams).
• Hands-on experience with Google Cloud Platform (GCP) data services.
• Proven experience with dbt (Data Build Tool) for data modeling and transformation.
• Solid understanding of modern data technologies and architectures (Data Lakes, Data Warehouses, ELT/ETL frameworks, streaming pipelines, etc.).
• Excellent problem-solving, communication, and collaboration skills.