Programmers.io

Senior Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown," offering a pay rate of "$X per hour." Key skills include complex SQL, ETL, Python, Snowflake, and Apache Kafka. Industry experience in data engineering is required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 19, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#NumPy #Pandas #Apache Airflow #Tableau #Automation #Apache Kafka #Data Engineering #Microservices #Kafka (Apache Kafka) #S3 (Amazon Simple Storage Service) #Clustering #SnowPipe #Python #Libraries #Data Pipeline #Tableau Server #Airflow #Scripting #Programming #Slowly Changing Dimensions #SnowSQL #ETL (Extract, Transform, Load) #Snowflake #SQL (Structured Query Language) #SQL Queries
Role description
• Hands-on experience writing complex SQL queries using joins, self joins, views, materialized views, cursors, and recursive queries; use of GROUP BY and PARTITION BY functions; SQL performance tuning
• Hands-on experience with ETL and dimensional data modelling, including Slowly Changing Dimensions (Type 1, 2, 3)
  o Good understanding of concepts like schema types and table types (fact, dimension, etc.)
• Proficiency in Python scripting and programming using Pandas, PyParsing, and Airflow
  o Pandas, Tableau Server modules, NumPy, Datetime, Apache Airflow-related modules, APIs
  o Setting up Python scripts on DataLab, scheduling processes, connecting with DataLake, S3, etc.
  o Data pipeline automation
  o Strong Python programming skills
  o Apache Kafka with Python (using client libraries such as Confluent's librdkafka-based client or kafka-python) to produce and consume messages from Kafka topics
  o Experience building streaming applications, data pipelines, microservices, etc.
• Solid understanding of Snowflake architecture; experience designing and building solutions
  o Architecture, design aspects, performance tuning, Time Travel, warehouse concepts, scaling, clustering, micro-partitioning
  o Experience with SnowSQL and Snowpipe
• Good to have: experience with Snowflake performance optimization techniques
• Experience with Vertica, SingleStore
• Lead experience interacting with the business; ability to independently develop and lead data projects, collaborate with offshore teams, and own overall project delivery
• Actively participate in discussions with the business to understand requirements, perform thorough impact analysis, and provide suitable solutions
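The SQL requirements above (GROUP BY, PARTITION BY window functions, recursive queries) can be sketched with a small self-contained example. The table and column names here are illustrative, not from the posting, and SQLite is used only so the snippet runs anywhere; the same constructs exist in Snowflake SQL.

```python
import sqlite3

# In-memory database so the example is self-contained (names are illustrative).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('east', 'ann', 100), ('east', 'bob', 300),
  ('west', 'cat', 200), ('west', 'dan', 200);
""")

# PARTITION BY: rank reps by amount within each region.
rows = conn.execute("""
    SELECT region, rep, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM sales
    ORDER BY region, rnk, rep
""").fetchall()
for r in rows:
    print(r)

# Recursive CTE: generate the series 1..5 -- the same pattern used for
# date spines or hierarchy walks in warehouse SQL.
series = conn.execute("""
    WITH RECURSIVE n(x) AS (
        SELECT 1
        UNION ALL
        SELECT x + 1 FROM n WHERE x < 5
    )
    SELECT x FROM n
""").fetchall()
print([x for (x,) in series])
```

Note that the two tied reps in `west` both receive rank 1, which is the distinction between `RANK()` and `ROW_NUMBER()` that SQL-heavy interviews often probe.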
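The Slowly Changing Dimension requirement (Type 2 above) can likewise be sketched. All names here (`dim_customer`, `valid_from`, `is_current`) are hypothetical; the point is the pattern: a Type 2 change expires the current row and inserts a new versioned row instead of overwriting in place.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical dimension table with Type 2 versioning columns.
conn.executescript("""
CREATE TABLE dim_customer (
    customer_id INTEGER,
    city        TEXT,
    valid_from  TEXT,
    valid_to    TEXT,
    is_current  INTEGER
);
INSERT INTO dim_customer VALUES (42, 'Austin', '2024-01-01', NULL, 1);
""")

def apply_scd2_change(conn, customer_id, new_city, change_date):
    """Type 2 update: close out the current row, then insert a new version."""
    conn.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1",
        (change_date, customer_id),
    )
    conn.execute(
        "INSERT INTO dim_customer VALUES (?, ?, ?, NULL, 1)",
        (customer_id, new_city, change_date),
    )

apply_scd2_change(conn, 42, 'Dallas', '2024-06-01')

history = conn.execute(
    "SELECT city, valid_from, valid_to, is_current "
    "FROM dim_customer WHERE customer_id = 42 ORDER BY valid_from"
).fetchall()
for row in history:
    print(row)
```

Both rows survive, so fact tables joined on the validity window see the city that was correct at transaction time; a Type 1 change, by contrast, would simply have overwritten `city`.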