Drillo.AI

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in New Providence, NJ, on a contract basis, offering competitive pay. Requires around 8 years of Snowflake experience, strong Python and SQL skills, and expertise in ETL processes and data modelling.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
New Providence, NJ
-
🧠 - Skills detailed
#Snowflake #Data Architecture #Data Storage #DevOps #Azure #Data Integrity #Lambda (AWS Lambda) #Python #Schema Design #Security #SQL Queries #AWS (Amazon Web Services) #Data Pipeline #Documentation #Scripting #SQL (Structured Query Language) #Cloud #AWS Lambda #Storage #Data Security #Data Engineering #GitLab #ETL (Extract, Transform, Load)
Role description
Job Title: Senior Data Engineer (Python & Snowflake, SQL)
Location: New Providence, NJ
Employment Type: Contract

Sr. Data Engineer (Python, Snowflake, SQL)
• The developer should have strong Python, Snowflake, and SQL coding skills.
• The developer should be able to articulate a few real-world experience scenarios and have a good aptitude for demonstrating solutions to real-life problems in Snowflake and Python.
• The developer should be able to write Python code for intermediate-level problems given during the L1 assessment.
• Leadership qualities: able to guide a team and own end-to-end support of the project.

Around 8 years' experience as a Snowflake Developer on design and development of data solutions within the Snowflake Data Cloud, leveraging its cloud-based data warehousing capabilities. Responsible for designing and implementing data pipelines, data models, and ETL processes, ensuring efficient and effective data storage, processing, and analysis. Able to write complex SQL queries and Python stored-procedure code in Snowflake.

Job Description Summary:
Data Modelling and Schema Design:
• Create and maintain well-structured data models and schemas within Snowflake, ensuring data integrity and efficient query performance.
ETL/ELT Development:
• Design and implement ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) processes to load data into Snowflake from various sources.
Data Pipeline Management:
• Build and optimize data pipelines to ingest data into Snowflake, ensuring accurate and timely data flow.
SQL Optimization:
• Write and optimize SQL queries to enhance performance and efficiency within Snowflake.
Performance Tuning:
• Identify and address performance bottlenecks within Snowflake, optimizing query execution and resource allocation.
Security and Governance:
• Implement data security and governance best practices within Snowflake environments, including access control and encryption.
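To illustrate the kind of work the role describes (Python stored-procedure code driving ETL loads in Snowflake), here is a minimal sketch of a stored-procedure handler that upserts staged rows into a target table via MERGE. All table and column names (`orders`, `orders_stg`, `order_id`, etc.) are hypothetical, not taken from the posting; the `session` argument stands in for the `snowflake.snowpark.Session` that Snowflake supplies when the procedure is called.

```python
def build_merge_sql(target: str, staging: str, key_cols: list[str],
                    update_cols: list[str]) -> str:
    """Build a MERGE statement for an incremental (upsert) load."""
    on_clause = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    all_cols = key_cols + update_cols
    insert_cols = ", ".join(all_cols)
    insert_vals = ", ".join(f"s.{c}" for c in all_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on_clause} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({insert_cols}) "
        f"VALUES ({insert_vals})"
    )


def run(session, target: str, staging: str) -> str:
    # Stored-procedure entry point: build the MERGE and execute it
    # through the Snowpark session provided by Snowflake at call time.
    sql = build_merge_sql(target, staging,
                          key_cols=["order_id"],
                          update_cols=["status", "amount"])
    session.sql(sql).collect()
    return f"merged {staging} into {target}"
```

Separating SQL construction from execution keeps the MERGE logic testable without a live Snowflake connection.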
Documentation and Maintenance:
• Maintain documentation for data models, data pipelines, and other Snowflake solutions.
Troubleshooting and Support:
• Troubleshoot and resolve issues within Snowflake, providing technical support to users.
Collaboration:
• Collaborate with data architects, data engineers, and business users to understand requirements and deliver solutions.
Other Skills:
• Experience with data warehousing concepts and data modelling.
• Hands-on experience in creating stored procedures, functions, tables, and cursors.
• Experience in database testing, data comparison, and data transformation scripting.
• Capable of troubleshooting common database issues.
• Hands-on experience with GitLab, with an understanding of CI/CD pipelines and DevOps tools.
• Knowledge of AWS Lambda and Azure Functions.
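Since the posting asks for knowledge of AWS Lambda alongside Snowflake ingestion, a common pattern is a Lambda handler that reacts to S3 object-created events and hands the new file keys to a downstream load step. The sketch below is a generic, hypothetical example of that pattern (the event shape follows AWS's standard S3 notification format; the bucket contents and any downstream load are assumptions, not details from the posting).

```python
import json


def lambda_handler(event, context):
    """Collect S3 object keys from an S3 event notification and return
    them as the file list a downstream Snowflake load step could consume."""
    keys = [rec["s3"]["object"]["key"] for rec in event.get("Records", [])]
    # In a real pipeline, this is where you would trigger the load
    # (e.g. queue the keys for a COPY INTO or Snowpipe-style ingest).
    return {"statusCode": 200, "body": json.dumps({"files": keys})}
```

Because the handler is a plain function, it can be unit-tested locally by passing a synthetic event dict.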