SQL Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for an Advanced SQL Engineer with 10-12 years of experience in SQL development, Python, and Snowflake. The contract is remote, lasting over 6 months, offering competitive pay. Key skills include ETL/ELT pipeline design and cloud platform expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
344
-
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#GCP (Google Cloud Platform) #Airflow #Automation #Looker #Data Migration #AWS (Amazon Web Services) #Data Modeling #Microsoft Power BI #Data Processing #Version Control #ETL (Extract, Transform, Load) #SQL Queries #Leadership #ML (Machine Learning) #Scala #Databases #Azure #GIT #SnowPipe #Debugging #Datasets #Batch #Scripting #SQL (Structured Query Language) #Data Engineering #BI (Business Intelligence) #Security #dbt (data build tool) #Data Warehouse #DevOps #Snowflake #Data Quality #Tableau #Migration #Cloud #Python
Role description
Job Title: Advanced SQL Engineer – Python & Snowflake
Location: Remote
Experience: 10–12 Years
Employment Type: Full-Time

About the Role:
We are seeking a highly experienced Advanced SQL Engineer with strong expertise in Python and Snowflake to design, develop, and optimize scalable data solutions. The ideal candidate will have deep experience in handling large datasets, building efficient ETL pipelines, and driving advanced analytics initiatives to support business decision-making.

Key Responsibilities:
• Design, develop, and maintain complex SQL queries, stored procedures, and performance-optimized data models.
• Build, enhance, and optimize ETL/ELT pipelines leveraging Python and Snowflake.
• Architect scalable data warehouse solutions on Snowflake to support reporting and analytics.
• Collaborate with cross-functional teams, including data engineers, analysts, and business stakeholders, to define data requirements and deliver actionable insights.
• Implement data quality, integrity, and governance best practices.
• Work on data migration, integration, and transformation projects involving diverse data sources.
• Ensure high performance of SQL queries and pipelines for large-scale, real-time, and batch data processing.
• Mentor junior engineers and provide technical guidance to the team.

Required Skills & Qualifications:
• 10–12 years of experience in advanced SQL development, optimization, and data engineering.
• Strong hands-on experience with Snowflake Data Warehouse (architecture, optimization, security, and performance tuning).
• Expertise in Python for data engineering, scripting, and automation.
• Proven experience in designing and optimizing ETL/ELT pipelines.
• Strong understanding of data warehousing concepts, data modeling (star/snowflake schemas), and relational databases.
• Hands-on experience with cloud platforms (AWS, Azure, or GCP).
• Experience with version control (Git), CI/CD pipelines, and modern DevOps practices for data engineering.
• Strong problem-solving and debugging skills.
• Excellent communication and leadership skills, with the ability to mentor junior team members.

Preferred Qualifications:
• Experience with Snowpipe, Streams, and Tasks in Snowflake.
• Familiarity with orchestration tools (Airflow, dbt, or similar).
• Exposure to machine learning workflows using Python is a plus.
• Knowledge of BI/reporting tools such as Tableau, Power BI, or Looker.