TechLink Resources, Inc

Sr. Data Engineer (Snowpark, Python, SQL, Azure)- No H1B or C2C

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer in Orlando, FL, with a contract length of unspecified duration. The listed day rate is $720. Key skills include Snowflake, Python, SQL, and Azure Data Factory. Requires 3–5+ years of data engineering experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown (C2C explicitly excluded in the description)
-
🔒 - Security
Unknown
-
📍 - Location detailed
Orlando, FL
-
🧠 - Skills detailed
#Azure Data Factory #Snowflake #ETL (Extract, Transform, Load) #Python #Cloud #AWS (Amazon Web Services) #Scrum #Data Migration #Data Integration #Data Engineering #Leadership #Data Transformations #Azure #Storage #Migration #Snowpark #Data Governance #SQL (Structured Query Language) #Debugging #Data Pipeline #Security #Agile #Scala #ADF (Azure Data Factory) #Azure cloud #Data Quality #AI (Artificial Intelligence) #Data Processing
Role description
PLEASE READ THE JOB DESCRIPTION CAREFULLY. THIS IS AN ON-SITE ORLANDO, FL POSITION. MUST BE ABLE TO INTERVIEW IN PERSON. NO H-1B AND NO C2C.

Senior Data Engineer (Snowflake / Python / AI-Assisted Development)

About the Role
We’re looking for a Senior Data Engineer to help build and modernize enterprise data pipelines in a cloud-based environment. This role is highly hands-on and focused on execution and delivery, working closely with architects and team leads to implement scalable data solutions. You’ll play a key role in migrating existing data pipelines from Azure Data Factory into Snowflake (Snowpark) while leveraging modern AI-assisted development tools (such as Cursor and Copilot) to improve speed, quality, and efficiency.

What You’ll Be Doing
• Build, refactor, and maintain scalable data pipelines for enterprise data systems
• Migrate existing ADF pipelines into Snowflake Snowpark solutions
• Design and implement data transformations for reporting, analytics, and KPIs
• Work heavily with Snowflake (SQL + Snowpark) for data processing and storage
• Develop and optimize Python-based data workflows
• Integrate data from multiple sources and ensure high data quality
• Collaborate with engineering leadership to execute on defined requirements
• Leverage AI tools (Cursor, Copilot, etc.) for coding, debugging, and optimization
• Ensure solutions meet performance, security, and data governance standards

What We’re Looking For
• Strong hands-on coder (not just design/architecture)
• Deep experience with Python + SQL for data engineering
• Proven experience working with Snowflake in production
• Experience with Azure Data Factory and translating pipelines into code
• Someone who has done cloud data migrations (Azure → Snowflake preferred)
• Comfortable working in Agile/Scrum environments
• Familiarity with using AI coding tools to move faster and write better code

Basic Qualifications
• 3–5+ years in Data Engineering / Data Integration
• Advanced experience with Python and SQL (complex transformations + tuning)
• Strong experience with Snowflake
• Experience with Azure Data Factory (ADF)
• Experience working in Agile teams
• Experience using AI-assisted development tools (Cursor, Copilot, etc.)

Nice to Have
• Experience with Snowflake Snowpark, Tasks, and Streams
• Exposure to AWS or Azure cloud environments
• Experience with CI/CD pipelines for data engineering
• Background in large-scale data migration projects

Why This Role Stands Out
• High-impact work modernizing enterprise data systems
• Exposure to cutting-edge AI-assisted development practices
• Strong engineering culture focused on delivery and quality
• Opportunity to work on large-scale cloud data transformations
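For candidates unfamiliar with the kind of "data transformations for reporting, analytics, and KPIs" the role describes, here is a minimal plain-Python sketch of the filter/group/aggregate shape such work takes. This is purely illustrative and not from the posting: the table name, column names, and sample rows are invented, and the docstring notes only a rough Snowpark DataFrame equivalent.

```python
from collections import defaultdict

# Hypothetical sample rows standing in for a Snowflake table (e.g. RAW.ORDERS).
ORDERS = [
    {"region": "EAST", "status": "COMPLETE", "amount": 120.0},
    {"region": "EAST", "status": "CANCELLED", "amount": 50.0},
    {"region": "WEST", "status": "COMPLETE", "amount": 80.0},
    {"region": "WEST", "status": "COMPLETE", "amount": 40.0},
]

def region_totals(rows):
    """Filter to completed orders, then total amounts per region.

    A rough Snowpark DataFrame analogue of the same transformation:
        df.filter(col("STATUS") == "COMPLETE")
          .group_by("REGION")
          .agg(sum_("AMOUNT").alias("TOTAL_AMOUNT"))
    """
    totals = defaultdict(float)
    for row in rows:
        if row["status"] == "COMPLETE":           # filter step
            totals[row["region"]] += row["amount"]  # group + aggregate step
    return dict(totals)

print(region_totals(ORDERS))  # {'EAST': 120.0, 'WEST': 120.0}
```

In a real migration, the same logic would be expressed against Snowpark DataFrames so the filter and aggregation push down into Snowflake rather than running in Python.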