Business Intelligence Developer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Business Intelligence Developer in SeaTac, WA, offering $60-70/hr for a 6-month contract with potential extension. Requires 5+ years in analytics, hands-on experience with Databricks/Tableau, strong ETL and SQL skills, and excellent communication. Aviation experience is a plus.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date discovered
May 15, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Seattle, WA
🧠 - Skills detailed
#BI (Business Intelligence) #Data Modeling #Data Engineering #Alteryx #Scala #Data Pipeline #Databricks #Data Processing #Data Storage #Cloud #Tableau #SQL (Structured Query Language) #Monitoring #Dataiku #AI (Artificial Intelligence) #ETL (Extract, Transform, Load) #Data Warehouse #Databases #Data Management #Data Science #Storage
Role description
Title: BI Developer
Location: SeaTac, WA (hybrid, 3 days/week onsite)
Pay Rate: $60-70/hr
Duration: 6-month contract, potential extension or conversion

Requirements:
• 5+ years working in an analytics/data environment
• Hands-on experience with Databricks and/or Tableau/Tableau Prep
• Hands-on experience migrating data
• Strong ETL experience
• Strong SQL
• Exceptional written and verbal communication
• Alteryx

Plus:
• Dataiku
• Aviation experience or background

Job Description:
• Design and build robust, scalable data pipelines following best engineering practices for software development, data modeling, data management, and resource optimization in hybrid environments (on-prem and cloud; databases, data warehouse, and lakehouse).
• Build and maintain robust, scalable data pipelines to ensure the availability, quality, and accessibility of data for analytics and AI applications.
• Write clean, efficient, and maintainable code for data processing, ensuring solutions are scalable and optimized for performance.
• Develop and optimize data models that meet analytics and AI requirements, ensuring they are maintainable and aligned with business needs.
• Implement efficient and cost-effective data storage and retrieval techniques to enhance system performance.
• Collaborate with data scientists, analysts, and engineering teams to design and integrate data pipelines into production environments seamlessly.
• Communicate effectively with stakeholders to understand product requirements and ensure alignment with organizational goals.
• Monitor data pipeline performance, identifying and addressing issues to maintain reliability and efficiency.
• Perform proactive cost monitoring and management of data pipelines to ensure resources are used effectively and cost-efficiently.
• Document ETL processes, workflows, and methodologies to ensure knowledge sharing and process continuity within the team.
• Stay current on emerging tools, technologies, and methodologies in ETL and data engineering, incorporating innovations into workflows.
• Participate in team training and professional development opportunities to enhance technical skills and problem-solving capabilities.

Benefits packages for this role start on the first day of employment and include medical, dental, and vision insurance; HSA, FSA, and DCFSA account options; and 401(k) retirement account access with employer matching. Employees in this role are also entitled to paid sick leave and/or other paid time off as provided by applicable law.
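As a rough illustration of the day-to-day pipeline work described above, the minimal PySpark sketch below shows a simple batch ETL step of the kind typically run on Databricks: read raw files, clean and enrich them, and write a partitioned table for downstream Tableau/analytics use. The paths, column names, and aviation-flavored dataset are hypothetical placeholders, not part of this posting; the actual stack and conventions will depend on the team.

```python
# Minimal batch ETL sketch (hypothetical paths/columns; assumes PySpark is available,
# e.g. inside a Databricks notebook or a local pyspark installation).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("flight_delay_etl").getOrCreate()

# Extract: read raw CSV files landed by an upstream process (path is a placeholder).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("/mnt/raw/flights/*.csv")
)

# Transform: basic cleanup plus a couple of derived columns.
clean = (
    raw.dropDuplicates(["flight_id"])
       .withColumn("dep_date", F.to_date("scheduled_departure"))
       .withColumn("is_delayed", (F.col("departure_delay_min") > 15).cast("boolean"))
       .filter(F.col("dep_date").isNotNull())
)

# Load: write a partitioned table for downstream analytics/BI consumption.
(
    clean.write
         .mode("overwrite")
         .partitionBy("dep_date")
         .parquet("/mnt/curated/flight_delays")
)

spark.stop()
```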