

Tryfacta, Inc.
Data Architect
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect with a 5-month contract in Tallahassee, FL, offering competitive pay. Key skills include 7+ years in database management, data architecture, ETL processes, and financial data systems. Familiarity with Azure, Snowflake, and Informatica is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 13, 2026
🕒 - Duration
3 to 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Tallahassee, FL
-
🧠 - Skills detailed
#Data Architecture #Azure #Visualization #SQL (Structured Query Language) #Datasets #Informatica #Leadership #Cloud #Snowflake #SQL Server #Data Modeling #API (Application Programming Interface) #Computer Science #Database Management #Microsoft Power BI #Programming #BI (Business Intelligence) #PostgreSQL #Data Integration #Scala #Data Quality #Automation #Data Governance #Databases #Tableau #Oracle #Data Analysis #ETL (Extract, Transform, Load)
Role description
Title/Role: Data Architect C - Advanced
Address: Burns Building, 605 Suwannee Street, Tallahassee, FL 32399
Duration: 5 months (possible extension based on performance, need, and budget availability)
Job Summary:
The successful candidate will design and implement a data integration framework to support enterprise data. The enterprise data assets consist of interfaces, enterprise applications, databases, automated processes, and reporting programs within the Department. The candidate will guide the transition and integration of the enterprise data to unify various asset domains. The integration framework is anchored on three complementary components:
• Joining location data with engineering datasets.
• An integration backbone that orchestrates extract, transform, and load (ETL) pipelines, application programming interfaces (APIs), and service endpoints. It provides interoperable data flows across statewide enterprise systems and enforces lineage, stewardship, and "collect once, use many" principles.
• A blueprint that standardizes data schemas.
Together, these components transform the Department's enterprise data ecosystem into a connected, governed, and analytics-ready environment. Each data asset shares a common foundation of enterprise technology and governance. This involves reviewing as-is business processes, remediation strategies, reengineering, design, and integration. Ensuring compatibility with cloud architecture is a key objective, aligning with the state's cloud-first policy. The role focuses on the analysis and remediation of data that is secure, scalable, and built upon the platforms the Department has already invested in, such as Azure, Snowflake, Informatica, and PostgreSQL. Once the data is integrated and provisioned within the agency, the candidate will verify data quality.
Required Experience:
• A minimum of 7 years of experience with large and complex database management systems.
• A minimum of 7 years of experience as a data architect or senior data analyst, with extensive experience in designing, implementing, and maintaining data models and ETL processes.
• 10+ years of broad experience working with database management systems (such as Oracle, SQL Server, or DB2) and data integration tools.
• Strong background in financial data systems, including data modeling, data governance, and data quality management.
• Proven experience with ETL tools, data warehousing, and automation of data processes to ensure accuracy, timeliness, and completeness of data.
• Experience in data visualization and reporting best practices, with familiarity with tools like Power BI, Tableau, or similar platforms.
• Familiarity with cloud-based data integration solutions, such as Azure or Snowflake.
Preferred Qualifications:
• Experience with DB2 systems.
• Experience with Informatica or other modern data integration platforms.
• Strong analytical skills and the ability to translate business requirements into technical specifications.
• Excellent communication and collaboration skills, capable of working effectively with technical and non-technical stakeholders.
Primary Job Duties/ Tasks:
The submitted candidate must be able to perform the following duties and tasks:
• Analyze and verify existing data models and interfaces to assess impacts and identify potential disruptions due to the implementation of a modernized data model.
• Design and implement comprehensive data models to ensure compatibility with business processes and reporting needs.
• Review and enhance ETL processes to improve data quality, timeliness, and accuracy, and implement robust data quality measures to maintain high standards of production data.
• Collaborate with development teams to deploy revised data models, verifying successful implementation and resolving any integration issues.
• Provide expert guidance on data visualization and reporting, establishing best practices to optimize business insights, and monitor dependencies with interfacing systems to propose remediation where necessary.
• Act as a liaison between technical teams, business stakeholders, OOT, and OIT to ensure clear communication regarding data architecture changes and project timelines.
• Mentor and provide technical leadership to team members, guiding them on best practices in data integration, architecture, and quality management.
• Ensure continuous improvement in data processes by defining roadmaps, standardizing methodologies, and collaborating on strategic initiatives.
• Evaluate and align current data systems with the state’s cloud-first initiative, ensuring data models are scalable and compatible with cloud architecture.
Education:
Bachelor's or Master's degree in Computer Science, Information Systems, or another related field. Work experience can substitute for the degree on a year-for-year basis.




