Tech One IT

Sr. Snowflake Data Engineer with Power BI

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Snowflake Data Engineer with Power BI, offering a contract length of "X months" and a pay rate of "$X/hr". Key requirements include Snowflake, Power BI, and Python skills, plus financial-services experience, particularly in market risk.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#Data Lake #Data Engineering #Data Quality #Debugging #Microsoft Power BI #MS SQL (Microsoft SQL Server) #Version Control #Monitoring #dbt (data build tool) #Data Processing #Cloud #Data Modeling #DAX #Data Pipeline #Automated Testing #Azure ADLS (Azure Data Lake Storage) #Data Analysis #Deployment #Snowflake #Streamlit #Scripting #Azure Data Factory #SSIS (SQL Server Integration Services) #SQL (Structured Query Language) #ADF (Azure Data Factory) #Data Accuracy #ADLS (Azure Data Lake Storage) #Python #SQL Server #Automation #Azure #Storage #SSRS (SQL Server Reporting Services) #Visualization #Data Extraction #BI (Business Intelligence) #SSAS (SQL Server Analysis Services) #ETL (Extract, Transform, Load) #Containers
Role description
Key Responsibilities:
• Design, develop, and implement data solutions using Snowflake and Power BI for the market risk platform, which covers VaR (Value-at-Risk), sensitivities, and stress-testing data.
• Enhance and maintain existing Power BI models and cubes to ensure data accuracy, performance, and user accessibility (via Excel and the Power BI Service).
• Contribute to migrating the legacy credit risk platform (SQL Server, SSIS, SSAS, and SSRS) to the new Snowflake/Power BI architecture.
• Develop and optimize ELT/ETL pipelines to ingest, transform, and load data into Snowflake, ensuring data quality and performance.
• Collaborate with risk managers and other stakeholders to gather requirements, estimate project scope, and translate business needs into technical specifications.
• Use dbt (data build tool) for data modeling and transformation within Snowflake, ensuring data quality, lineage, and version control.
• Develop custom solutions in Python for data analysis, transformation, and integration tasks.
• Implement and maintain CI/CD pipelines for automated testing, deployment, and monitoring of data solutions.
• Troubleshoot and resolve data-related issues, performance bottlenecks, and system errors within the Snowflake and Power BI environment.
• Stay informed about industry trends and best practices in data warehousing, BI, and financial risk management.
• Communicate effectively with non-technical stakeholders and manage expectations regarding project deliverables and timelines.
• Plan, manage, and mentor junior developers, fostering their growth within the team.
Functional / Technical Competencies:
Essential:
• Experience as a BI Developer or Data Engineer with a focus on Snowflake and Power BI.
• Expertise in Snowflake, including data modeling (star/snowflake schemas), query optimization, and Snowflake-specific features (e.g., virtual warehouses, time travel).
• Experience designing and developing data pipelines using ADF (Azure Data Factory).
• Practical experience with dbt (data build tool) for data transformation and modeling in Snowflake.
• Proficiency in Power BI, including dashboard and report creation, data visualization best practices, and DAX (Data Analysis Expressions).
• Excellent SQL skills for data extraction, manipulation, and optimization.
• Hands-on experience with Python for data processing, scripting, and automation.
• Experience in the financial services industry, particularly with market risk concepts (VaR, sensitivities, stress testing).
• Proficiency in writing high-quality technical scope documents and test plans.
Preferred:
• Experience with Control-M or other job-scheduling tools for orchestrating data pipelines.
• Experience with Streamlit for building interactive web applications for data visualization.
• Experience migrating legacy data platforms (SQL Server, SSIS, SSAS, SSRS) to cloud-based solutions such as Snowflake/Power BI.
• Experience reverse-engineering Excel EUCs and implementing robust ELT processes to replace them.
• Experience with Azure storage containers (e.g., Blob Storage) and Azure Data Lake Storage (Gen2).
Personal Requirements:
• Strong analytical and problem-solving skills, with meticulous attention to detail and a commitment to accuracy.
• Exceptional communication and interpersonal skills, enabling effective collaboration within a large, international team and with diverse stakeholders.
• A proactive, results-driven approach: taking full responsibility for development tasks, testing, and debugging, and consistently delivering high-quality results in a fast-paced environment.
• Excellent organizational skills and a disciplined, structured approach to working independently and under tight deadlines.
• The ability to manage expectations and thrive in a dynamic environment, adapting quickly to change and prioritizing workload effectively.
• Strong decision-making capabilities, consistently demonstrating sound judgment.
• A fast learner, able to quickly grasp and apply new information in a complex domain.
• Strong spoken and written English for clear, effective communication.
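For candidates less familiar with the market-risk domain the role centers on, a minimal sketch of one-day historical VaR (Value-at-Risk): the loss threshold that historical daily P&L exceeded on only (1 − confidence) of observed days. This is a generic, illustrative calculation with made-up numbers, not the platform's actual methodology; the function name and data are hypothetical.

```python
def historical_var(pnl, confidence=0.99):
    """One-day historical VaR at the given confidence level.

    Converts a daily P&L series (gains positive, losses negative)
    into losses, then takes a crude empirical quantile: the loss
    exceeded on roughly (1 - confidence) of historical days.
    """
    losses = sorted(-p for p in pnl)  # losses as positive numbers, ascending
    idx = int(confidence * len(losses))  # position of the quantile
    idx = min(idx, len(losses) - 1)  # guard against running off the end
    return losses[idx]

# Hypothetical ten-day P&L history (illustrative values only).
pnl = [120.0, -80.0, 45.0, -200.0, 15.0, -60.0, 90.0, -150.0, 30.0, -10.0]
var_95 = historical_var(pnl, confidence=0.95)  # worst loss in this tiny sample
```

In production such a calculation would typically run over full-revaluation or sensitivity-based P&L vectors stored in Snowflake, with the quantile computed per portfolio and date rather than over a flat list.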