Programmers.io

Data Engineer with AI

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with AI in Austin, TX (5x/week onsite) on a contract basis. Key skills include SQL, ETL, Python, Snowflake optimization, and AI integration. Experience with Gen AI and LLM frameworks is beneficial.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
๐Ÿ—“๏ธ - Date
February 25, 2026
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
Austin, TX
-
🧠 - Skills detailed
#Complex Queries #Clustering #Programming #Tableau #Slowly Changing Dimensions #SnowPipe #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #Apache Airflow #Scripting #Snowflake #Data Pipeline #Pandas #SnowSQL #ETL (Extract, Transform, Load) #Automation #NumPy #Langchain #Data Management #Airflow #Data Engineering #Python #Tableau Server #AI (Artificial Intelligence) #Databases
Role description
Data Engineer
Austin, TX (5x/week onsite)
Contract

MUST HAVE:
• Hands-on experience writing complex SQL queries: joins, self joins, views, materialized views, cursors, recursive queries, GROUP BY and PARTITION BY functions, and SQL performance tuning
• Hands-on experience with ETL and dimensional data modelling, including Slowly Changing Dimensions (SCD Types 1, 2, and 3)
  o Good understanding of schema types and table types (fact vs. dimension): how to design a dimension vs. a fact table, design considerations involved, etc.
• Proficiency in Python scripting/programming using Pandas, PyParsing, and Airflow
  o Pandas, Tableau Server modules, NumPy, Datetime, Apache Airflow modules, APIs
  o Setting up Python scripts on DataLab, scheduling processes, and connecting with a data lake (S3, etc.)
  o Data pipeline automation
• Good understanding of Snowflake architecture, with experience designing and building solutions
  o Architecture, design aspects, performance tuning, Time Travel, and warehouse concepts: scaling, clustering, micro-partitioning
  o Experience with SnowSQL and Snowpipe
• Must have: experience with Snowflake performance optimization techniques
• Own project delivery, collaborating with offshore teams
• Actively participate in discussions with the business to understand requirements and provide suitable solutions
• Experience with AI (very beneficial) and advanced AI integration:
  o Good experience with Gen AI and LLM integration, including:
    § A good understanding of RAG (Retrieval-Augmented Generation)
    § Prompt and context engineering: structuring, querying, and managing the data context fed to LLMs
    § Vector data management: handling and storing data (including unstructured data) in vector databases, with indices for semantic search and RAG
    § Experience with LLM orchestration frameworks such as LangChain and LlamaIndex
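To illustrate the kind of "complex query" skills listed above, here is a minimal, self-contained sketch using Python's built-in sqlite3 (the table and data are hypothetical, invented purely for demonstration; the role itself targets Snowflake, but the window-function syntax shown — RANK() and SUM() with PARTITION BY — is standard SQL):

```python
import sqlite3

# Hypothetical sales table, for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sales (region TEXT, rep TEXT, amount INTEGER);
INSERT INTO sales VALUES
  ('East', 'Ana', 100), ('East', 'Bo', 300),
  ('West', 'Cy', 200), ('West', 'Di', 50);
""")

# Rank each rep within their region (PARTITION BY) and compare each
# row against the regional total via a window aggregate.
rows = conn.execute("""
SELECT region, rep, amount,
       RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
       SUM(amount)  OVER (PARTITION BY region) AS region_total
FROM sales
ORDER BY region, rnk
""").fetchall()

for r in rows:
    print(r)
```

The same pattern (a window aggregate alongside per-row values) is a common interview topic for the "GROUP BY, PARTITION BY" requirement, because it avoids a self-join against a pre-aggregated subquery.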
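The Slowly Changing Dimension requirement can likewise be sketched in a few lines. This is a toy Type 2 implementation in plain Python (the customer dimension and its field names are hypothetical): when a tracked attribute changes, the current row is expired and a new versioned row is opened, preserving history.

```python
from datetime import date

# Hypothetical customer dimension; field names are illustrative only.
dim = [
    {"customer_id": 1, "city": "Austin", "valid_from": date(2024, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_city, as_of):
    """SCD Type 2: expire the current row, then insert a new version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # attribute unchanged, nothing to do
            row["valid_to"] = as_of       # close out the old version
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": as_of, "valid_to": None, "is_current": True})

# The customer moves: history is kept, and one row stays current.
apply_scd2(dim, 1, "Dallas", date(2025, 6, 1))
```

In Snowflake this logic would typically be a MERGE statement over a staging table rather than row-by-row Python, but the versioning semantics (valid_from/valid_to plus a current flag) are the same.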
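Finally, the vector-search step behind the RAG requirement can be shown as a toy sketch. The 3-dimensional "embeddings" below are hard-coded for illustration; a real pipeline would obtain them from an embedding model and store them in a vector database with an index, with an orchestration framework such as LangChain or LlamaIndex wiring the retrieved documents into the LLM prompt.

```python
import math

# Toy document "embeddings" (hypothetical values, illustration only).
docs = {
    "snowflake scaling":   [0.9, 0.1, 0.0],
    "airflow scheduling":  [0.1, 0.9, 0.0],
    "pandas dataframes":   [0.0, 0.2, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query_vec, k=2):
    """Return the k document keys most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]),
                    reverse=True)
    return ranked[:k]

# A query whose embedding sits closest to the first document.
result = top_k([0.8, 0.2, 0.1])
print(result)
```

Retrieval like this is the "R" in RAG: the top-ranked documents are concatenated into the context window, which is where the prompt/context-engineering skills listed above come in.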