6013 Data / BI Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a 6013 Data / BI Architect on a contract basis for 40 hours per week, offering $55.00 - $60.00 per hour. Key skills include data architecture, ETL/ELT processes, and experience with cloud services like AWS and Azure.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
480
-
πŸ—“οΈ - Date discovered
August 28, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Pontiac, MI 48342
-
🧠 - Skills detailed
#Microsoft Power BI #Redshift #ADLS (Azure Data Lake Storage) #Scala #Apache Spark #PyTorch #BO (Business Objects) #Compliance #Data Warehouse #SaaS (Software as a Service) #PostgreSQL #MongoDB #R #Data Governance #Azure Data Factory #Data Quality #Dataflow #Visualization #ADF (Azure Data Factory) #Data Architecture #Azure #Tableau #Data Lake #ML (Machine Learning) #Informatica #SQL (Structured Query Language) #Data Strategy #Keras #API (Application Programming Interface) #Data Catalog #Data Mart #Synapse #Talend #Azure ADLS (Azure Data Lake Storage) #Physical Data Model #Data Ingestion #ETL (Extract, Transform, Load) #Spark (Apache Spark) #TensorFlow #Strategy #Storage #Leadership #Oracle #S3 (Amazon Simple Storage Service) #dbt (data build tool) #Looker #Databricks #Programming #Data Modeling #Kafka (Apache Kafka) #Big Data #Cloud #Snowflake #RDS (Amazon Relational Database Service) #AWS (Amazon Web Services) #Fivetran #Business Objects #Metadata #SQL Server #Hadoop #XML (eXtensible Markup Language) #BI (Business Intelligence) #BigQuery #Databases #Data Pipeline #AWS Glue #Security #Python
Role description
Job Description

The Data Business Intelligence (BI) Architect role is a hybrid of data architecture, engineering, and business strategy, bridging the gap between technical data solutions and business objectives. The architect designs, develops, and maintains the overall data strategy, ensuring that the County data in scope is accessible, reliable, and secure for analysis and decision-making. The right candidate has experience architecting data solutions that support descriptive, diagnostic, predictive, and prescriptive analytics.

KEY RESPONSIBILITIES:
- Stakeholder Collaboration: Work closely with business and IT stakeholders to gather requirements and translate business needs into technical specifications, including identification of data sources.
- Data Architecture Design and Data Modeling: Architect and implement scalable, secure, and efficient data solutions, including data warehouses, data lakes, and/or data marts. Design conceptual, logical, and physical data models.
- Tool and Platform Selection: Evaluate, recommend, and implement tools aligned with the recommended architecture, including visualization tools aligned with business needs.
- ETL/ELT Pipeline Management: Design, develop, and test data pipelines, integrations to source systems, and ETL/ELT processes that move data from various sources into the data warehouse.
- Data Catalog and Metadata Management: Design, create, and maintain an enterprise-wide data catalog, automating metadata ingestion, establishing data dictionaries, and ensuring that all data assets are properly documented and tagged.
- Data Governance and Discovery: Enforce data governance policies through the data catalog, ensuring data quality, security, and compliance. Enable self-service data discovery by curating and organizing data assets in an intuitive way.
- Performance Optimization: Monitor and optimize BI systems and data pipelines to ensure high performance, reliability, and cost-effectiveness.
- Technical Leadership: Provide technical guidance and mentorship across the organization, establishing best practices for data management and BI development.
Environment: The role includes defining the data platform technology stack. Example technologies are listed below; experience in all of them is not required.
- Data Platforms: Data warehouse and data lake concepts, including dimensional modeling, and cloud services (S3, AWS Redshift, RDS, Azure Data Lake Storage, Synapse Analytics, BigQuery, Databricks, Snowflake, Informatica)
- Databases: SQL and relational/non-relational databases (SQL Server, Oracle, PostgreSQL, MongoDB)
- BI Tools: Power BI, Business Objects, Tableau, Crystal, Looker
- ETL/ELT: Cloud-native services (AWS Glue, Azure Data Factory, Google Cloud Dataflow) and in-warehouse transform tools (Fivetran, Talend, dbt)
- Big Data Technologies: Hadoop, Spark, Kafka
- Programming/API: Python, Keras, Scikit-learn, R, XML
- ML/DL/Analytic Engines: TensorFlow, PyTorch, Trillium, Apache Spark
- Modeling Tools: MS Visio, ER/Studio, PowerDesigner
- Source systems include on-prem, cloud, and SaaS

Job Type: Contract
Pay: $55.00 - $60.00 per hour
Expected hours: 40 per week
Work Location: In person