L2R Consulting

Business Intelligence Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Business Intelligence Engineer contractor position for 6 months, offering a pay rate of "X" per hour. Key skills include Power BI, DAX, SQL, and dimensional modeling. Requires 3+ years of Power BI experience and strong documentation abilities.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
400
-
πŸ—“οΈ - Date
February 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Storytelling #Data Lineage #Datasets #Data Engineering #Visualization #Semantic Models #Documentation #Tableau #Security #Snowflake #DAX #SQL (Structured Query Language) #AI (Artificial Intelligence) #Version Control #BI (Business Intelligence) #Data Quality #Anomaly Detection #Data Lake #Requirements Gathering #Microsoft Power BI #Data Storytelling #Data Governance
Role description
Job Title: Business Intelligence Engineer (Contractor)

Roles and Responsibilities
• Partner with business stakeholders to translate questions and objectives into well-scoped dashboard/report requirements and success metrics.
• Design, build, and maintain enterprise-grade Power BI reports and dashboards, emphasizing clarity, performance, and consistent UX patterns.
• Develop and enhance Fabric semantic models (datasets) to support both standardized and custom analytic needs; apply dimensional modeling (star/snowflake) best practices.
• Write robust DAX and Power Query (M) to implement business logic, KPI calculations, row-level security, and efficient data shaping.
• Contribute to standards for visualization, semantic modeling, and release/change management across BI assets.
• Produce user-friendly documentation that explains complex logic, data lineage, and KPI definitions for non-technical audiences.
• Provide production support for business-critical reports, including incident triage, fixes, and stakeholder communications.
• Analyze data quality issues and anomalies; document findings, identify root causes, and recommend pragmatic remediation paths.
• Collaborate with data engineering to source/shape data from the Fabric Data Lake and align on Lakehouse/Warehouse usage patterns.
• Optimize model size, refresh strategies, and query performance (e.g., aggregations, composite models, incremental refresh).
• Participate in design reviews and knowledge-sharing sessions with analysts, engineers, and architects.
• Be available to collaborate during 8am–5pm Eastern Time, Monday–Friday.
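The dimensional-modeling and SQL work the responsibilities above describe can be sketched with a toy star schema. All table and column names here are hypothetical, and SQLite stands in for whatever warehouse the role actually targets; this is an illustration of the pattern, not the employer's schema.

```python
# Toy star schema: one fact table joined to a conformed dimension,
# queried with a CTE and a window function (running total per region).
# All names are illustrative; SQLite stands in for the real warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_store (store_id INTEGER PRIMARY KEY, region TEXT);
CREATE TABLE fact_sales (store_id INTEGER, month TEXT, amount REAL);
INSERT INTO dim_store VALUES (1, 'East'), (2, 'West');
INSERT INTO fact_sales VALUES
  (1, '2026-01', 100), (1, '2026-02', 150),
  (2, '2026-01', 200), (2, '2026-02', 120);
""")

query = """
WITH monthly AS (              -- CTE: aggregate the fact at region/month grain
    SELECT d.region, f.month, SUM(f.amount) AS total
    FROM fact_sales AS f
    JOIN dim_store AS d USING (store_id)
    GROUP BY d.region, f.month
)
SELECT region, month, total,
       SUM(total) OVER (       -- window function: running total per region
           PARTITION BY region ORDER BY month
       ) AS running_total
FROM monthly
ORDER BY region, month;
"""
rows = conn.execute(query).fetchall()
for row in rows:
    print(row)
```

The same shape (narrow facts, descriptive dimensions, measures layered on top) carries over to DAX measures in a Fabric semantic model.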
Core Skills Requirements
• Power BI (Reports, DAX, Power Query M), Fabric semantic models
• Dimensional modeling (facts, dimensions, conformed dimensions, SCD concepts)
• SQL (complex joins, window functions, CTEs, performance awareness)
• Working knowledge of Microsoft Fabric components, including Data Lake, Lakehouse, and Warehouse
• Requirements gathering, stakeholder communication, and data storytelling
• Documentation and change/release hygiene in a team environment

Technical Skills Requirements
• 3+ years hands-on experience building production Power BI reports and dashboards for business stakeholders.
• 1+ years designing dimensional models and semantic layers to support self-service and governed BI (star schemas, role-playing dimensions, KPI standardization).
• Advanced proficiency in DAX (calculated columns/measures, time intelligence, virtual tables) and Power Query M (data shaping, parameterization, functions).
• Proven experience implementing performance optimization in Power BI (model design, measure patterns, aggregations, incremental refresh, Direct Lake/DirectQuery vs. Import trade-offs).
• Demonstrated ability to translate ambiguous business questions into analytical designs, mockups, and measurable outcomes.
• Proficient SQL for model sourcing, validation, and troubleshooting; familiarity with query plans and optimization basics.
• Practical understanding of medallion architecture, Fabric Lakehouse, and Warehouse, including how each supports semantic modeling and refresh strategies.
• Experience operationalizing BI assets: version control practices, release management, issue tracking, and production support.
• Strong written communication skills to create user-friendly documentation (KPI definitions, data caveats, usage guides).

Nice to Have
• Exposure to Purview for data governance/cataloging.
• Experience working with Power BI AI agents or other equivalent ABI infrastructure.
• Experience with RLS design at scale and governance for enterprise datasets.
• Familiarity with UX standards for enterprise analytics and accessibility guidelines.
• Background in anomaly detection workflows and root-cause analysis techniques.
• Tableau or other BI tools.

Engagement Details
• Contractor role with responsibility for delivering high-quality, production-ready BI assets in a collaborative environment.
• Expected to follow architectural guidance and data governance principles during all phases of design, development, testing, and support.
• Must work effectively with onshore/offshore teammates and business partners to ensure deliverables are met on time and with quality.
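The anomaly-detection workflow mentioned among the nice-to-haves can be sketched minimally as a z-score check over a metric series. The function name, sample data, and threshold below are illustrative assumptions, not anything specified by the posting; real pipelines would typically use more robust statistics and seasonality-aware baselines.

```python
# Minimal anomaly-check sketch: flag values whose z-score exceeds a threshold.
# Data, threshold, and function name are illustrative assumptions only.
from statistics import mean, stdev

def flag_anomalies(values, threshold=2.0):
    """Return indices of values more than `threshold` std devs from the mean."""
    mu = mean(values)
    sigma = stdev(values)
    if sigma == 0:
        return []  # constant series: nothing to flag
    return [i for i, v in enumerate(values) if abs(v - mu) / sigma > threshold]

daily_totals = [102, 98, 101, 99, 100, 500, 97]  # 500 is the planted outlier
print(flag_anomalies(daily_totals))  # -> [5]
```

Flagged indices would then feed the documentation/root-cause steps the responsibilities list describes (document findings, identify root causes, recommend remediation).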