LogicsT Technologies

Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer; the contract length and pay rate are unspecified. Key skills include AWS, Snowflake, DBT, SQL, and Python, and candidates should have 8+ years of data engineering experience, including 3+ years in a cloud-native environment.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 4, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Python #Version Control #Agile #Data Science #SQL (Structured Query Language) #Scala #dbt (data build tool) #Data Pipeline #Data Vault #Compliance #Airflow #Data Architecture #Redshift #Monitoring #Leadership #Deployment #ETL (Extract, Transform, Load) #Vault #GIT #Snowflake #Documentation #S3 (Amazon Simple Storage Service) #Data Governance #Batch #Data Analysis #Automation #Security #IAM (Identity and Access Management) #Migration #Lambda (AWS Lambda) #Cloud #BI (Business Intelligence) #AWS (Amazon Web Services) #Data Engineering #ML (Machine Learning)
Role description
Job Description

We are seeking a highly skilled Architect / Senior Data Engineer to design, build, and optimise our modern data ecosystem. The ideal candidate will have deep experience with AWS cloud services, Snowflake, and DBT, along with a strong understanding of scalable data architecture, ETL/ELT development, and data modelling best practices.

Responsibilities
• Architect, design, and implement scalable, reliable, and secure data solutions using AWS, Snowflake, and DBT.
• Develop end-to-end data pipelines (batch and streaming) to support analytics, machine learning, and business intelligence needs (see the load sketch after this section).
• Lead the modernisation and migration of legacy data systems to cloud-native architectures.
• Define and enforce data engineering best practices, including coding standards, CI/CD, testing, and monitoring.
• Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into technical solutions.
• Optimise Snowflake performance through query tuning, warehouse sizing, and cost management.
• Establish and maintain data governance, security, and compliance standards across the data platform.
• Mentor and guide junior data engineers, providing technical leadership and direction.

Required Skills & Qualifications
• 8+ years of experience in Data Engineering, including at least 3 years in a cloud-native data environment.
• Hands-on expertise in AWS services such as S3, Glue, Lambda, Step Functions, Redshift, and IAM.
• Strong experience with Snowflake – data modelling, warehouse design, performance optimisation, and cost governance.
• Proven experience with DBT (data build tool) – model development, documentation, and deployment automation.
• Proficiency in SQL, Python, and ETL/ELT pipeline development.
• Experience with CI/CD pipelines, version control (Git), and workflow orchestration tools such as Airflow, Dagster, or Prefect (see the DAG sketch after this section).
• Familiarity with data governance and security best practices, including role-based access control and data masking.
• Strong understanding of data modelling techniques (Kimball, Data Vault, etc.) and data architecture principles.

Preferred Qualifications
• AWS Certification (e.g., AWS Certified Data Analytics – Specialty or Solutions Architect).
• Strong communication and collaboration skills, with a track record of working in agile environments.
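To ground the pipeline bullet above, here is a minimal sketch of the kind of batch load step this role covers: copying staged S3 files into Snowflake with the snowflake-connector-python library. The warehouse, database, schema, stage, and table names are hypothetical placeholders, not details from this posting.

import os

import snowflake.connector


def load_orders() -> None:
    # Read credentials from the environment rather than hard-coding them,
    # in line with the security practices the posting calls for.
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",   # hypothetical warehouse
        database="ANALYTICS",  # hypothetical database
        schema="RAW",          # hypothetical schema
    )
    try:
        # COPY INTO pulls Parquet files from an (assumed) external S3 stage
        # and maps them onto the target table's columns by name.
        conn.cursor().execute(
            """
            COPY INTO raw.orders
            FROM @raw.s3_orders_stage
            FILE_FORMAT = (TYPE = 'PARQUET')
            MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
            """
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load_orders()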
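Likewise, the orchestration requirement (Airflow plus DBT) might look like the following minimal DAG sketch, assuming Airflow 2.4+ and a dbt project at a hypothetical path; the DAG id, paths, and schedule are illustrative only.

from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_dbt_build",        # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",               # 'schedule' is the Airflow 2.4+ keyword
    catchup=False,
) as dag:
    # Build the dbt models first, then run dbt's tests so that bad data
    # fails the DAG instead of reaching BI dashboards downstream.
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt --profiles-dir /opt/dbt",
    )
    dbt_run >> dbt_test

Keeping the tests as a separate downstream task keeps the failure signal clean: a red dbt_test task points at data quality rather than model compilation.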