Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a contract of unknown length, offering a pay rate of "$XX/hour." Key skills include Snowflake expertise, SQL, and ETL tools; the role requires 3+ years of data engineering experience and familiarity with cloud platforms.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 24, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Boston, MA
🧠 - Skills detailed
#Airflow #ETL (Extract, Transform, Load) #Monitoring #Clustering #DevOps #Cloud #Scripting #Data Pipeline #Apache Airflow #Python #Compliance #Snowflake #Talend #SQL (Structured Query Language) #Data Engineering #dbt (data build tool) #Azure #Data Architecture #Data Quality #Data Governance #Data Modeling #Security #Scala #AWS (Amazon Web Services) #Documentation #GCP (Google Cloud Platform)
Role description
Key Responsibilities:
• Design and implement scalable data pipelines using Snowflake and other cloud-based data platform technologies.
• Develop and maintain ETL/ELT processes to ingest data from various sources.
• Optimize Snowflake performance through clustering, partitioning, query tuning, and materialized views.
• Collaborate with cross-functional teams to understand data requirements and deliver solutions.
• Ensure data quality, integrity, and security across all data platforms.
• Automate data workflows and implement monitoring and alerting systems.
• Maintain documentation for data architecture, processes, and best practices.
• Stay current with Snowflake features and industry trends to continuously improve data infrastructure.
Required Qualifications:
• 3+ years of experience in data engineering or related roles.
• Hands-on experience with Snowflake, including data modeling, performance tuning, and security.
• Proficiency in SQL and scripting languages (e.g., Python).
• Experience with ETL tools (e.g., dbt, Apache Airflow, Talend).
• Familiarity with cloud platforms (AWS, Azure, or GCP).
• Strong understanding of data warehousing concepts and best practices.
Preferred Qualifications:
• Snowflake certification(s).
• Experience with CI/CD pipelines and DevOps practices.
• Knowledge of data governance and compliance standards.
• Excellent problem-solving and communication skills.