

ARK Solutions, Inc.
Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect in Tallahassee, FL, on a contract basis. It requires 7+ years in data architecture, expertise in ETL processes, and familiarity with Azure, Snowflake, and Informatica. A Bachelor's/Master's in Computer Science is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Tallahassee, FL
-
🧠 - Skills detailed
#Data Architecture #Azure #Batch #Visualization #SQL (Structured Query Language) #Datasets #Informatica #Spatial Data #ADF (Azure Data Factory) #Data Security #Leadership #Cloud #Scripting #Compliance #Snowflake #Data Modeling #Azure Data Factory #API (Application Programming Interface) #SQL Server #Data Management #Database Querying #Cybersecurity #Computer Science #Physical Data Model #Database Management #Microsoft Power BI #SSIS (SQL Server Integration Services) #Data Extraction #Microsoft SQL Server #Programming #BI (Business Intelligence) #Metadata #Strategy #PostgreSQL #Data Pipeline #Data Manipulation #MS SQL (Microsoft SQL Server) #Data Integration #Microsoft SQL #Scala #Schema Design #Project Management #Data Quality #Automation #Data Governance #Databases #Security #Tableau #Oracle #Data Analysis #ETL (Extract, Transform, Load)
Role description
Title: Data Architect
Location: Tallahassee, FL 32399 (Onsite)
Hire Type: Contract
Scope of Services
Modernize core enterprise data systems. Data initiatives supporting the strategic priorities of safety, mobility, cybersecurity, and operational efficiency must be documented and modernized to implement a framework that enables scalable data integration and analytics.
The successful candidate will design and implement a data integration framework to support enterprise data. The enterprise data assets consist of interfaces, enterprise applications, databases, automated processes, and reporting programs within the Department. The candidate will guide the transition and integration of the enterprise data to unify various asset domains. The integration framework is anchored by three complementary components:
Join location data with engineering datasets.
Integration backbone: orchestrates extract, transform, and load (ETL) pipelines, application programming interfaces (APIs), and service endpoints. Provides interoperable data flows across statewide enterprise systems and enforces lineage, stewardship, and “collect once, use many” principles.
Blueprint to standardize data schemas.
Together, these components transform the Department’s enterprise data ecosystem into a connected, governed, and analytics-ready environment. Each data asset shares a common foundation of enterprise technology and governance. This involves reviewing as-is business processes, remediation strategies, reengineering, design, and integration. Ensuring compatibility with cloud architecture is a key objective, aligning with the state’s cloud-first policy. The role focuses on analyzing and remediating data so that it is secure, scalable, and built upon the platforms the Department has already invested in, such as Azure, Snowflake, Informatica, and PostgreSQL. Once the data is integrated and provisioned within the agency, the candidate will verify data quality.
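As a concrete illustration of the first component, the sketch below joins a location dataset with an engineering dataset on a shared identifier. This is a minimal, hypothetical example: the column names (route_id, milepost, pavement_type, lane_count) and the use of pandas are assumptions for illustration, not tools or schemas named in this posting.

```python
import pandas as pd

# Hypothetical location data keyed on a consistent roadway identifier.
locations = pd.DataFrame({
    "route_id": ["SR-26", "SR-26", "US-90"],
    "milepost": [1.2, 4.8, 10.5],
    "lat": [30.44, 30.45, 30.43],
    "lon": [-84.28, -84.30, -84.35],
})

# Hypothetical engineering dataset describing the same routes.
engineering = pd.DataFrame({
    "route_id": ["SR-26", "US-90"],
    "pavement_type": ["asphalt", "concrete"],
    "lane_count": [4, 2],
})

# "Collect once, use many": one governed join on the shared identifier,
# reusable by downstream reporting and analytics consumers.
unified = locations.merge(engineering, on="route_id", how="left")
print(unified)
```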
Current State Assessment:
Review existing systems to identify data silos, schema inconsistencies, and governance gaps (a minimal inventory sketch follows this list).
Engage stakeholders to gather and document their integration needs.
Analyze and verify existing data models to assess impacts related to implementation of a modern data model.
Evaluate the existing data models for dependencies and assess the scope of necessary changes.
Conduct impact analysis on reporting and application usage of data models to ensure business outcomes are not disrupted.
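One way to start the current-state assessment is to inventory column definitions across schemas and diff them for naming and type drift. The sketch below does this against PostgreSQL (one of the platforms the posting names) via information_schema; the DSN is a placeholder and psycopg2 is an assumed client library, not one specified here.

```python
import psycopg2

INVENTORY_SQL = """
    SELECT table_schema, table_name, column_name, data_type
    FROM information_schema.columns
    WHERE table_schema NOT IN ('pg_catalog', 'information_schema')
    ORDER BY table_schema, table_name, ordinal_position;
"""

def inventory_columns(dsn: str) -> list[tuple]:
    """Return every (schema, table, column, type) row for later diffing."""
    with psycopg2.connect(dsn) as conn:
        with conn.cursor() as cur:
            cur.execute(INVENTORY_SQL)
            return cur.fetchall()

# Grouping the output by column_name surfaces the same logical field
# declared with different types or names in different silos.
```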
Framework Design and Architecture:
Define the enterprise data architecture strategy, including canonical models, application programming interfaces (APIs), and ontologies.
Map integration points across Azure, Snowflake, Informatica, and PostgreSQL.
Develop conceptual, logical, and physical data models for key domains (e.g., financial, roadway characteristics, and work program administration).
Provide a comprehensive re-design of data models for various enterprise domains.
Ensure that the redesigned canonical data models are scalable, robust, and aligned with best practices in data management, supporting both current and future business needs; a minimal sketch of such a model follows this list.
Ensure spatial data integration with GIS systems.
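A canonical model can be prototyped before any physical DDL is written. The sketch below uses Python dataclasses to outline two entities for a hypothetical roadway-characteristics domain; every entity and field name here is an illustrative assumption, not a model defined by the Department.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class RoadwaySegment:
    """Canonical entity keyed on one consistent enterprise identifier."""
    segment_id: str          # shared key used across all source systems
    route_id: str
    begin_milepost: float
    end_milepost: float

@dataclass(frozen=True)
class PavementCondition:
    """Engineering fact tied to the canonical segment, not to a silo."""
    segment_id: str          # references RoadwaySegment.segment_id
    survey_date: date
    condition_rating: float  # hypothetical 0-100 condition index
```

Keeping facts keyed to the canonical segment, rather than to source-system keys, is what lets disparate schemas be coordinated without disrupting existing reporting.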
Data Governance and Quality:
Implement the established governance structure, metadata standards, and stewardship roles.
Design and implement data quality measures to ensure the timeliness, accuracy, and completeness of production data on a scheduled basis, including nightly, hourly, or less frequent intervals as needed (see the quality-check sketch after this list).
Collaborate with development teams to automate and streamline ETL processes to support efficient data flow and integration.
Create metadata standards, lineage tracking, and stewardship roles that complement existing roles defined in Informatica and Snowflake.
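The sketch below shows two of the simplest quality measures named above, completeness and timeliness, as plain Python functions. Thresholds, field names, and the 26-hour window are assumptions; in practice these checks would run under the Department's orchestrator (Informatica or Azure Data Factory) on the schedules described above.

```python
from datetime import datetime, timedelta, timezone

def completeness(rows: list[dict], required: list[str]) -> float:
    """Fraction of rows with every required field populated."""
    if not rows:
        return 0.0
    ok = sum(all(r.get(f) not in (None, "") for f in required) for r in rows)
    return ok / len(rows)

def timeliness(last_loaded: datetime, max_age: timedelta) -> bool:
    """True when the most recent load (timezone-aware) is fresh enough."""
    return datetime.now(timezone.utc) - last_loaded <= max_age

# Example thresholds for a nightly batch: require 99% completeness and
# data loaded within the last 26 hours.
```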
Data Integration and Interoperability:
Build ETL/ELT pipelines and real-time streaming solutions with a unified structure for ingestion and exchange of enterprise datasets.
Ensure consistent identifiers across enterprise applications.
Enable data exchange across internal systems and with external systems.
Implement orchestration using Informatica, Azure Data Factory, or Snowflake.
Design RESTful endpoints for spatial and engineering data.
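A RESTful spatial endpoint can be as small as the Flask sketch below, which returns a GeoJSON Feature for a segment identifier. The route, the in-memory lookup, and the segment IDs are hypothetical placeholders; a production endpoint would query the governed store and enforce role-based access per the security standards noted later in this posting.

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for a spatial query against the enterprise store.
SEGMENTS = {
    "SR-26-001": {"type": "Point", "coordinates": [-84.28, 30.44]},
}

@app.route("/api/v1/segments/<segment_id>", methods=["GET"])
def get_segment(segment_id: str):
    geom = SEGMENTS.get(segment_id)
    if geom is None:
        abort(404)  # unknown identifier
    return jsonify({
        "type": "Feature",
        "geometry": geom,
        "properties": {"segment_id": segment_id},
    })

if __name__ == "__main__":
    app.run()
```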
Education
Bachelor’s or Master’s Degree in Computer Science, Information Systems, or a related field. Work experience may substitute for the degree on a year-for-year basis.
Experience
A minimum of 7 years of experience with large and complex database management systems.
A minimum of 7 years of experience as a data architect or senior data analyst, with extensive experience in designing, implementing, and maintaining data models and ETL processes.
10+ years of broad experience working with database management systems (such as Oracle, SQL Server, or DB2) and data integration tools.
Strong background in financial data systems, including data modeling, data governance, and data quality management.
Proven experience with ETL tools, data warehousing, and automation of data processes to ensure accuracy, timeliness, and completeness of data.
Experience in data visualization and reporting best practices, including familiarity with tools such as Power BI, Tableau, or similar platforms.
Familiarity with cloud-based data integration solutions, such as Azure or Snowflake.
Preferred Qualifications:
Experience with DB2 systems.
Experience with Informatica or other modern data integration platforms.
Strong analytical skills and the ability to translate business requirements into technical specifications.
Excellent communication and collaboration skills, capable of working effectively with technical and non-technical stakeholders.
Primary Job Duties/Tasks
Analyze and verify existing data models and interfaces to assess impacts and identify potential disruptions due to the implementation of a modernized data model.
Design and implement comprehensive data models to ensure compatibility with business processes and reporting needs.
Review and enhance ETL processes to improve data quality, timeliness, and accuracy, and implement robust data quality measures to maintain high standards of production data.
Collaborate with development teams to deploy revised data models, verifying successful implementation and resolving any integration issues.
Provide expert guidance on data visualization and reporting, establish best practices to optimize business insights, and monitor dependencies with interfacing systems, proposing remediation where necessary.
Act as a liaison between technical teams, business stakeholders, OOT, and OIT to ensure clear communication regarding data architecture changes and project timelines.
Mentor and provide technical leadership to team members, guiding them on best practices in data integration, architecture, and quality management.
Ensure continuous improvement in data processes by defining roadmaps, standardizing methodologies, and collaborating on strategic initiatives.
Evaluate and align current data systems with the state’s cloud-first initiative, ensuring data models are scalable and compatible with cloud architecture.
Knowledge:
Familiarity with the financial management Work Program Application (WPA), RCI LRS, RCI/TCI, and Pavement Systems.
In-depth understanding of data modeling principles, including relational and dimensional modeling techniques, canonical models, schema design, and integration patterns.
Familiarity with the Florida PALM system or similar state or federal financial management systems.
Proficiency in database management systems (DBMS) such as Microsoft SQL Server and DB2.
Comprehensive understanding of ETL processes and data integration tools like Informatica, Snowflake, or Microsoft SSIS.
Proficiency with Azure, Snowflake, Informatica, PostgreSQL, and cloud-first strategies.
Understanding of data pipelines, batch vs. real-time integration, and API-driven architectures.
Knowledge of spatial data concepts and event segmentation.
Familiarity with data quality management practices, data governance frameworks, and best practices.
Knowledge of role-based access, encryption, and FDOT security standards.
Knowledge of data visualization tools (e.g., Power BI) and reporting best practices.
Understanding of data security, privacy regulations, and compliance requirements.
Skills:
Proficiency in SQL and scripting languages for database querying and data manipulation.
Strong analytical and problem-solving skills, with the ability to conduct complex systems analysis and identify areas for improvement.
Ability to translate business requirements into technical solutions that scale.
Ability to create architecture diagrams, data dictionaries, and governance playbooks for stakeholders.
Ability to design scalable and robust canonical data models and coordinate schemas across disparate systems to support business processes and reporting needs.
Ability to work with ETL tools to streamline and automate data extraction, transformation, and loading processes.
Ability to build efficient ingestion and transformation workflows using Informatica or Azure Data Factory.
Skilled in developing RESTful endpoints for spatial and engineering data exchange.
Skilled in verbal and written communication, with the ability to explain technical concepts to non-technical stakeholders.
Competency in project management, including the ability to plan, prioritize, and manage multiple tasks effectively.
Strong collaboration and interpersonal skills to work effectively with cross-functional teams and stakeholders to align technical and business goals.
Skilled in conducting data impact assessments and presenting findings to management and development teams.