

Acro Service Corp
Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect on a 4-month contract, onsite in Las Vegas or Carson City, NV. Key skills include Azure Synapse, Snowflake, ETL/ELT design, and advanced SQL. Requires 5+ years in Data Architecture plus hands-on experience with data governance tools.
Country: United States
Currency: $ USD
Day rate: 592
Date: January 20, 2026
Duration: 3 to 6 months
Location: On-site
Contract: Unknown
Security: Unknown
Location detailed: Nevada, United States
Skills detailed: #Data Governance #Microsoft Power BI #Data Lake #Data Modeling #Data Engineering #Data Processing #MDM (Master Data Management) #Data Lakehouse #Scala #Data Transformations #Data Integrity #Data Privacy #DevOps #BO (Business Objects) #Cloud #Collibra #Informatica #Metadata #Alation #Business Objects #Scripting #SQL (Structured Query Language) #Security #Data Design #Data Lineage #Data Management #Data Mapping #Documentation #ETL (Extract, Transform, Load) #Classification #BI (Business Intelligence) #Informatica Cloud #Azure #Data Analysis #Data Lifecycle #Scrum #Agile #Data Pipeline #Databricks #DAX #AI (Artificial Intelligence) #Python #Data Architecture #Data Catalog #ML (Machine Learning) #Synapse #Delta Lake #Snowflake
Role description
Job Title: Systems Analyst / Technical Data Analyst
Duration: 4 Months
Location: Las Vegas or Carson City, NV (Onsite)
JOB DESCRIPTION
The Systems Analyst serves as the primary technical resource between the Business, Business Process Analysts, the Data Management Office (DMO), and the execution teams. This role is responsible for translating high-level data strategies into tactical technical requirements.
The Systems Analyst will oversee the technical translation of data mapping, multi-stage data flows, and complex ETL/ELT logic. The ideal candidate possesses deep expertise in the Medallion Architecture (or similar) to ensure the delivery of scalable, governed, and high-fidelity data assets that serve as the foundation for the organization's data modernization projects.
Key Responsibilities
1. Data Engineering Integration & Pipeline Oversight
• Medallion Architecture Implementation: Drive the tactical execution of the Medallion data design (Bronze/Silver/Gold) to modernize the organization's data lakehouse environment (a minimal sketch follows this list).
• ETL/ELT Logic Design: Architect and manage complex Extract, Transform, Load (ETL) specifications and transformation logic to automate multi-directional data flows.
• Persistent Staging & Data Lifecycle: Oversee the technical requirements for staging and landing zones to ensure high availability and historical data integrity for downstream consumption.
• Technical Scalability & Performance Tuning: Optimize data system performance by identifying bottlenecks in data processing and ensuring the technical framework supports high-volume, diverse data sets while adhering to security protocols.
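To ground the responsibilities above, here is a minimal sketch of a Bronze/Silver/Gold promotion, assuming a Spark runtime with Delta Lake configured (per the Databricks/Delta Lake skills listed below). The storage paths, column names, and cleansing rules are hypothetical placeholders; in practice they would come from the FSD/TDD.

```python
# Minimal Medallion (Bronze/Silver/Gold) promotion sketch.
# Assumes a Spark runtime with Delta Lake configured; all paths,
# table names, and columns below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: land raw source data as-is, preserving history for auditability.
raw = spark.read.json("/landing/orders/2026-01-20/")
raw.write.format("delta").mode("append").save("/lakehouse/bronze/orders")

# Silver: apply type conversions and basic quality rules (dedupe, non-null keys).
bronze = spark.read.format("delta").load("/lakehouse/bronze/orders")
silver = (
    bronze
    .withColumn("order_amount", F.col("order_amount").cast("decimal(18,2)"))
    .dropDuplicates(["order_id"])
    .filter(F.col("order_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").save("/lakehouse/silver/orders")

# Gold: aggregate into consumption-ready, governed data sets for BI/AI.
gold = silver.groupBy("region").agg(F.sum("order_amount").alias("total_amount"))
gold.write.format("delta").mode("overwrite").save("/lakehouse/gold/orders_by_region")
```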
2. Technical Data Governance & Mapping
• This role focuses on executing source-to-target mappings and defining the logic needed for data transformations, including type conversions and business rules. You will set up automated quality checks to ensure all ingested data is accurate and auditable, integrate data privacy and regulatory standards into the system design, and use Master Data Management (MDM) and data cataloging tools to document technical metadata and maintain clear data lineage end to end (a minimal mapping sketch follows below).
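As an illustration of what a source-to-target mapping with type conversions, business rules, and automated quality checks can look like, here is a minimal pandas sketch. The mapping spec, column names, and checks are invented for this example and are not the actual project artifacts.

```python
# Source-to-target mapping sketch with simple automated quality checks.
# The mapping spec, columns, and rules below are hypothetical placeholders.
import pandas as pd

# Mapping spec: source column -> (target column, target dtype, business rule).
MAPPING = {
    "CUST_NM": ("customer_name", "string", lambda s: s.str.strip().str.title()),
    "ORD_AMT": ("order_amount", "float64", lambda s: s.round(2)),
    "ORD_DT":  ("order_date", "datetime64[ns]", None),
}

def apply_mapping(source: pd.DataFrame) -> pd.DataFrame:
    """Apply type conversions and business rules per the mapping spec."""
    target = pd.DataFrame()
    for src_col, (tgt_col, dtype, rule) in MAPPING.items():
        col = source[src_col]
        if dtype.startswith("datetime"):
            col = pd.to_datetime(col, errors="coerce")  # type conversion
        else:
            col = col.astype(dtype)
        if rule is not None:
            col = rule(col)  # business rule
        target[tgt_col] = col
    return target

def quality_checks(source: pd.DataFrame, target: pd.DataFrame) -> None:
    """Automated checks: row-count reconciliation and required-column nulls."""
    assert len(source) == len(target), "row-count reconciliation failed"
    assert target["order_amount"].notna().all(), "nulls in a required column"

src = pd.DataFrame({"CUST_NM": [" ada lovelace "], "ORD_AMT": [12.345],
                    "ORD_DT": ["2026-01-20"]})
tgt = apply_mapping(src)
quality_checks(src, tgt)
```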
3. Reporting & Advanced Analytics Enablement
• Technical Requirements Engineering: Bridge the gap between business ambiguity and technical execution by converting stakeholder needs into detailed Functional Specification Documents (FSD) and Technical Design Documents (TDD).
• BI Stack Optimization: Manage and optimize the semantic layers of reporting tools (Power BI and Business Objects) to ensure performant data modeling and self-service scalability.
• AI Analytics Readiness: Architect the data environment for AI/ML consumption by ensuring "clean-room" data availability, feature set readiness, and robust dashboarding.
• SQL Querying: Execute SQL scripting to validate data sets and perform root-cause analysis on data discrepancies (see the reconciliation sketch after this list).
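For flavor, here is a hedged example of the kind of reconciliation query used to localize discrepancies between a staging layer and a gold table. sqlite3 stands in for the actual warehouse (Synapse/Snowflake), and every table and column name is invented.

```python
# Reconciliation sketch: find load dates where staging and gold row counts
# diverge, to localize a discrepancy. sqlite3 stands in for the warehouse;
# all table and column names are hypothetical.
import sqlite3

RECON_SQL = """
SELECT s.load_date,
       COUNT(*) AS src_rows,
       (SELECT COUNT(*) FROM gold_orders g
         WHERE g.load_date = s.load_date) AS tgt_rows
FROM staging_orders s
GROUP BY s.load_date
HAVING COUNT(*) <> (SELECT COUNT(*) FROM gold_orders g
                     WHERE g.load_date = s.load_date);
"""

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE staging_orders (order_id INTEGER, load_date TEXT);
CREATE TABLE gold_orders    (order_id INTEGER, load_date TEXT);
INSERT INTO staging_orders VALUES (1, '2026-01-20'), (2, '2026-01-20');
INSERT INTO gold_orders    VALUES (1, '2026-01-20');  -- one row was dropped
""")

for load_date, src_rows, tgt_rows in conn.execute(RECON_SQL):
    print(f"{load_date}: source={src_rows} target={tgt_rows}  <-- investigate")
```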
4. Technical Oversight
• Data Flow & Pipeline Optimization: Analyze existing data pipelines to identify opportunities for latency reduction and systemic throughput efficiency (a toy timing sketch follows this list).
• Technical Liaison: Act as the technical point of contact for data projects, synchronizing efforts between Business Process Analysts, Data Engineers, and DevOps teams.
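As a toy illustration of bottleneck analysis, the sketch below times each pipeline stage to show where latency concentrates; the stage functions are hypothetical stand-ins for real extract/transform/load steps.

```python
# Toy bottleneck analysis: time each pipeline stage to see where latency
# concentrates. The stage functions are hypothetical stand-ins.
import time

def extract():   time.sleep(0.05)  # e.g., pull from a source system
def transform(): time.sleep(0.20)  # e.g., heavy joins and type conversions
def load():      time.sleep(0.02)  # e.g., write to the gold layer

timings = {}
for name, stage in [("extract", extract), ("transform", transform), ("load", load)]:
    start = time.perf_counter()
    stage()
    timings[name] = time.perf_counter() - start

print({name: f"{secs:.3f}s" for name, secs in timings.items()})
print("bottleneck candidate:", max(timings, key=timings.get))
```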
Required Technical Skills
Data Modeling: Modern Data Architectures, Azure Synapse, Snowflake, Databricks Lakehouse, and Delta Lake technologies.
Governance Tools: Hands-on experience with a data governance tool such as Microsoft Purview, Informatica Cloud Data Governance, Alation, or Collibra.
Security & Classification: Experience with Data Classification tools and implementation.
Languages: Advanced SQL, Python (preferred) for data analysis, and DAX/M-Code (for Power BI).
Technical Methodologies: Data Governance Frameworks, Data Lineage Documentation, and Agile Scrum.
• A minimum of 5 years of proficiency in Data Architecture and platforms like Azure Synapse, Snowflake, or Databricks.
• 7 years of SQL experience for data validation and root-cause analysis.
• A minimum of 5 years of experience with ETL/ELT logic design and managing multi-stage data flows.
• 2 years of expertise in data governance frameworks and tools like Purview, Alation, or Collibra.
• 7 years of technical documentation (FSD/TDD) creation and 5 years of Power BI-related data research and mapping experience.
The Company is an Equal Opportunity Employer and is committed to creating an equitable and inclusive environment for all.