Business Intelligence Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Business Intelligence Architect on a 6-month contract, offering a pay rate of "$XX per hour." Candidates must have 5+ years of experience in data architecture and BI, strong data modeling skills, and U.S. citizenship or permanent residency. Hybrid work is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 27, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Royal Oak, MI
🧠 - Skills detailed
#Databases #Big Data #Data Management #AWS Glue #Cloud #Informatica #Fivetran #Hadoop #Spark (Apache Spark) #Tableau #Security #Visualization #Databricks #Physical Data Model #Business Objects #Leadership #ML (Machine Learning) #Storage #ADF (Azure Data Factory) #Data Architecture #SQL Server #Azure Data Factory #MongoDB #PyTorch #R #Compliance #Snowflake #Kafka (Apache Kafka) #BO (Business Objects) #AWS (Amazon Web Services) #Keras #Microsoft Power BI #S3 (Amazon Simple Storage Service) #SQL (Structured Query Language) #PostgreSQL #Looker #RDS (Amazon Relational Database Service) #Synapse #BI (Business Intelligence) #Strategy #Data Modeling #BigQuery #Documentation #Data Integration #Oracle #Azure #Dataflow #Python #Scala #ETL (Extract, Transform, Load) #Metadata #Talend #TensorFlow #Data Lake #Azure ADLS (Azure Data Lake Storage) #ADLS (Azure Data Lake Storage) #Data Pipeline #Data Warehouse #Data Catalog #Programming #SaaS (Software as a Service) #dbt (data build tool) #Data Governance #XML (eXtensible Markup Language) #Redshift
Role description
"We do not work with third-party agencies and are unable to sponsor visas, OPT, CPT, or other work authorizations. Candidates must be U.S. citizens or permanent residents." 😊 Position Overview The Data/Business Intelligence (BI) Architect is a hybrid role that combines data architecture, engineering, and business strategy. This role bridges the gap between technology and business, ensuring the data is accessible, reliable, secure, and optimized for decision-making. The Architect will design and maintain enterprise-level data solutions that support descriptive, diagnostic, predictive, and prescriptive analytics. We are seeking a senior-level resource with at least 5+ years of experience in data architecture and BI, with the ability to define technical strategies, mentor teams, and deliver scalable solutions aligned to business objectives. Key Responsibilities Stakeholder Collaboration β€’ Partner with business and IT stakeholders to translate requirements into technical specifications β€’ Identify and integrate diverse data sources into enterprise architecture. Data Architecture & Modeling β€’ Architect scalable, secure, and efficient data platforms (data warehouses, lakes, marts) β€’ Design conceptual, logical, and physical data models to meet analytics needs. Tools & Platform Selection β€’ Evaluate, recommend, and implement tools aligned with business and technical requirements. β€’ Support BI and visualization platforms that enable data-driven decision-making. ETL / ELT Development β€’ Design, develop, and maintain data pipelines, integrations, and ETL/ELT processes β€’ Ensure efficient movement and transformation of data across systems. Data Catalog & Metadata Management β€’ Create and maintain an enterprise data catalog with automated metadata integration. β€’ Establish data dictionaries, tagging standards, and documentation practices Data Governance & Discovery β€’ Enforce governance policies for quality, security, and compliance. β€’ Enable self-service analytics through curated and organized data assets. Performance Optimization β€’ Monitor, troubleshoot, and optimize BI systems and pipelines for cost, speed, and reliability. Technical Leadership β€’ Provide guidance, establish best practices, and mentor teams across the organization. Qualifications β€’ 5+ years of experience in data architecture, BI solutions, and analytics platforms. β€’ Proven experience designing solutions for enterprise data platforms (data warehouses, lakes, marts). β€’ Strong data modeling skills (conceptual, logical, physical). β€’ Hands-on experience with ETL/ELT tools and pipeline development. β€’ Knowledge of data governance, metadata, and catalog management. β€’ Strong analytical, problem-solving, and leadership skills. β€’ Excellent communication and stakeholder engagement abilities. β€’ Ability to be on-site at least two days per week. 
Technical Environment (Preferred, Not Required to Know All)
Data Platforms: Data warehousing & lakes, dimensional modeling, cloud services (AWS Redshift, S3, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica)
Databases: SQL Server, Oracle, PostgreSQL, MongoDB
BI Tools: Power BI, Tableau, Business Objects, Crystal Reports, Looker
ETL/ELT: AWS Glue, Azure Data Factory, Google Cloud Dataflow, Fivetran, Talend, dbt (see the pipeline sketch below)
Big Data: Hadoop, Spark, Kafka
Programming / APIs: SQL, Python, R, XML; ML/DL with TensorFlow, PyTorch, Scikit-learn, Keras
Modeling Tools: MS Visio, ER/Studio, PowerDesigner
Source Systems: On-premises, Cloud, and SaaS integrations
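As a small illustration of the ETL/ELT work these tools support, the following sketch walks a file export through extract, transform, and load steps using only Python's standard library. The file layout, column names, and staging table are hypothetical stand-ins for a managed pipeline built in a tool such as AWS Glue, Azure Data Factory, or dbt.

```python
# Illustrative ETL/ELT pipeline sketch (hypothetical data; stdlib-only stand-in for a managed tool).
import csv
import io
import sqlite3

RAW_CSV = """order_id,customer_id,amount_usd,order_date
1001,CUST-001,199.99,2025-08-27
1002,CUST-002,,2025-08-27
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse raw CSV rows from a source-system export."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: drop rows with missing amounts and cast types."""
    cleaned = []
    for row in rows:
        if not row["amount_usd"]:
            continue  # a real pipeline would route rejects to a quarantine table
        cleaned.append((int(row["order_id"]), row["customer_id"],
                        float(row["amount_usd"]), row["order_date"]))
    return cleaned

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: land the cleaned rows in a warehouse-style staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS stg_orders "
                 "(order_id INTEGER, customer_id TEXT, amount_usd REAL, order_date TEXT)")
    conn.executemany("INSERT INTO stg_orders VALUES (?, ?, ?, ?)", rows)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    load(transform(extract(RAW_CSV)), conn)
    print(conn.execute("SELECT COUNT(*), SUM(amount_usd) FROM stg_orders").fetchone())
```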