Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 5+ years of experience in Snowflake, SQL Server, AWS, and Matillion ETL. Contract length and pay rate are unknown. Strong SQL and Python skills are required, along with data modeling experience.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 13, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#Monitoring #Storage #Data Pipeline #Cloud #Snowflake #Data Integration #"ETL (Extract, Transform, Load)" #Matillion #Python #SQL Server #Data Engineering #Computer Science #AWS (Amazon Web Services) #Requirements Gathering #Data Quality #Scala #SQL (Structured Query Language) #Data Modeling #SQL Queries #Data Architecture
Role description
Position Overview:
We are looking for an experienced Data Engineer with a strong foundation in data engineering best practices and experience with modern data warehousing and ETL platforms. The ideal candidate will have at least 5 years of hands-on experience, preferably with Snowflake, SQL Server, AWS, and Matillion ETL, to build, optimize, and maintain data pipelines and data integration solutions. This role involves working closely with cross-functional teams to support data-driven initiatives across the organization.

Key Responsibilities:
• Data Pipeline Development and Maintenance:
  o Design, develop, and maintain scalable, high-performance ETL pipelines using Matillion ETL to ingest, transform, and load data into Snowflake.
  o Integrate and manage data from various sources, ensuring data quality, reliability, and performance.
  o Optimize SQL queries, scripts, and stored procedures in Snowflake and SQL Server to improve efficiency and reduce processing time.
• Data Warehousing and Modeling:
  o Develop and implement data models within Snowflake to support business intelligence and analytics requirements.
  o Collaborate with data architects and analysts to design and enhance data warehouse schemas and ensure alignment with reporting and analytics needs.
• Collaboration and Requirements Gathering:
  o Work with business stakeholders to understand data requirements and translate them into technical specifications.
  o Support data architects and analysts by providing guidance on data ingestion, storage, and retrieval methods.
• Platform Monitoring and Optimization:
  o Monitor data pipeline and ETL performance, identifying bottlenecks and making necessary adjustments to ensure optimal system performance.
  o Perform troubleshooting and root cause analysis to resolve data issues and prevent future occurrences.

Required Qualifications:
• Experience: At least 5 years of experience in data engineering, with proficiency in Snowflake, SQL Server, AWS, and ETL tools (preferably Matillion).
• Technical Skills: Strong SQL and Python skills, with experience in query optimization and performance tuning.
• Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
• Soft Skills: Strong analytical, problem-solving, and communication skills.
• Data Engineering Skills: Experience in building, maintaining, and optimizing data pipelines and ETL processes.
• Data Modeling: Knowledge of data warehousing concepts and hands-on experience in data modeling, particularly within Snowflake.

Preferred Qualifications:
• Additional Tools and Platforms: Experience with additional ETL tools, data transformation methods, or cloud platforms is a plus.
• Healthcare and PBM: PBM-specific knowledge and experience with transforming various healthcare data is a plus, but not required.
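For illustration only, the sketch below shows the kind of Snowflake/Python pipeline step the responsibilities above describe: a small load-and-verify task written against the snowflake-connector-python library. The connection parameters, stage, and table names (raw_stage, analytics.orders) are hypothetical placeholders rather than details from this posting, and in this role such a step would more typically be built as a Matillion job than as hand-written code.

import snowflake.connector

def load_orders(conn_params: dict) -> int:
    """Copy staged files into a target table and return the resulting row count."""
    conn = snowflake.connector.connect(**conn_params)
    try:
        cur = conn.cursor()
        # Load raw CSV files from a named stage into the target table
        # (stage and table names are placeholders).
        cur.execute(
            """
            COPY INTO analytics.orders
            FROM @raw_stage/orders/
            FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            """
        )
        # Basic data-quality check: confirm that rows actually landed.
        cur.execute("SELECT COUNT(*) FROM analytics.orders")
        (row_count,) = cur.fetchone()
        return row_count
    finally:
        conn.close()

if __name__ == "__main__":
    params = {
        "account": "example_account",  # placeholder
        "user": "etl_user",            # placeholder
        "password": "***",             # placeholder
        "warehouse": "ETL_WH",
        "database": "ANALYTICS_DB",
        "schema": "ANALYTICS",
    }
    print(f"Loaded {load_orders(params)} rows")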