COGENT Infotech

Senior Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Architect, hybrid in Tallahassee, FL, with a contract length of "X months" at a pay rate of "$X/hour." Requires 7+ years in data architecture, strong financial data systems experience, and proficiency in ETL processes and data visualization tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
February 12, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Tallahassee, FL
-
🧠 - Skills detailed
#Azure #Microsoft SQL #Tableau #Documentation #Compliance #Data Analysis #MS SQL (Microsoft SQL Server) #Informatica Cloud #SQL (Structured Query Language) #Data Governance #SSIS (SQL Server Integration Services) #ETL (Extract, Transform, Load) #Informatica #Security #Microsoft Power BI #Data Integration #Cloud #SQL Server #Microsoft SQL Server #Automation #Data Architecture #Snowflake #Database Querying #Project Management #Data Modeling #Oracle #Computer Science #Talend #Data Manipulation #Leadership #BI (Business Intelligence) #Scripting #Data Security #Deployment #Scala #Data Extraction #Visualization #Data Quality #Data Management #Database Management
Role description
Data Architect Hybrid Role - Final onsite interview required - Tallahassee, FL

Scope:

Data Analysis and Impact Assessment:
• Analyze and verify the existing Work Program Datamart and General Ledger data models to assess impacts related to changes in Chart of Account data elements due to the Project.
• Evaluate the existing data models for dependencies on current Chart of Account structures and assess the scope of necessary changes.
• Conduct impact analysis on reporting and application usage of the data models to ensure business outcomes are not disrupted by the transition.

Data Model Design and Enhancement:
• Provide a comprehensive design of data models, incorporating the new Chart of Account data elements introduced by the system.
• Ensure that the redesigned data models are scalable, robust, and aligned with best practices in data management, supporting both current and future business needs.

Data Quality and ETL Process Improvement:
• Analyze existing ETL (Extract, Transform, Load) jobs to identify inefficiencies and potential areas for enhancement.
• Design and implement robust data quality measures to ensure the timeliness, accuracy, and completeness of production data on a scheduled basis, including nightly, hourly, or less frequent intervals as needed.
• Collaborate with development teams to automate and streamline ETL processes to support efficient data flow and integration.

Data Visualization and Reporting Best Practices:
• Provide expertise in data visualization and reporting, offering best practices to support the business's ability to analyze and interpret financial data effectively.
• Develop strategies to enhance existing reporting frameworks, ensuring that they are compatible with the new data models and data structures.

Collaboration and Deployment:
• Work closely with existing development teams to deploy new and revised data models, verifying that the implementation aligns with functional requirements and business goals.
• Act as a liaison between technical and non-technical stakeholders, ensuring clear communication regarding data model changes, system impacts, and timelines.

Deliverables:
• Comprehensive assessment report on the current Work Program Datamart and General Ledger data models.
• Enhanced data model designs integrating Chart of Account updates.
• Thorough documentation of data quality measures and ETL process enhancements.
• Guidelines on best practices for data visualization and reporting.
• Detailed implementation plan for the deployment and verification of new data models.
• Consistent updates and presentations to stakeholders on project progress and key insights.

Education:
Bachelor's or Master's degree in Computer Science or Information Systems.

Experience:
• A minimum of 7 years of experience with large and complex database management systems.
• A minimum of 7 years of experience as a data architect or senior data analyst, with extensive experience in designing, implementing, and maintaining data models and ETL processes.
• 10+ years of broad experience working with database management systems (such as Oracle, SQL Server, or DB2) and data integration tools.
• Strong background in financial data systems, including data modeling, data governance, and data quality management.
• Proven experience with ETL tools, data warehousing, and automation of data processes to ensure accuracy, timeliness, and completeness of data.
• Experience with data visualization and reporting best practices, including familiarity with tools such as Power BI, Tableau, or similar platforms.
• Familiarity with cloud-based data integration solutions, such as Azure or Snowflake.

Preferred Qualifications:
• Experience with Informatica, Cloud Data Governance and Catalog, Cloud Data Integration, or other modern data integration platforms.
• Strong analytical skills and the ability to translate business requirements into technical specifications.
• Excellent communication and collaboration skills, capable of working effectively with technical and non-technical stakeholders.

Primary Job Duties/Tasks:
The submitted candidate must be able to perform the following duties and/or tasks:
• Analyze and verify existing data models and interfaces in the Work Program Datamart and General Ledger to assess impacts and identify potential disruptions.
• Design and implement comprehensive data models, integrating new data elements to ensure compatibility with business processes and reporting needs.
• Review and enhance ETL processes to improve data quality, timeliness, and accuracy, and implement robust data quality measures to maintain high standards of production data.
• Collaborate with development teams to deploy revised data models, verifying successful implementation and resolving any integration issues.
• Provide expert guidance on data visualization and reporting, establishing best practices to optimize business insights, and monitor dependencies with interfacing systems to propose remediation where necessary.
• Act as a liaison between technical teams, business stakeholders, and OIT to ensure clear communication regarding data architecture changes and project timelines.
• Mentor and provide technical leadership to team members, guiding them on best practices in data integration, architecture, and quality management.
• Ensure continuous improvement in data processes by defining roadmaps, standardizing methodologies, and collaborating on strategic initiatives.
• Evaluate and align current data systems with the state's cloud-first initiative, ensuring data models are scalable and compatible with cloud architecture.

Knowledge:
• Strong knowledge of financial management business processes, including accounting, ledger management, and budget reporting.
• In-depth understanding of data modeling principles, including relational and dimensional modeling techniques.
• Familiarity with the system or similar state or federal financial management systems.
• Proficiency in database management systems (DBMS) such as Microsoft SQL Server and DB2.
• Comprehensive understanding of ETL processes and data integration tools such as Informatica, Talend, or Microsoft SSIS.
• Knowledge of cloud-based data integration platforms (e.g., Azure, Snowflake) and cloud-first strategies.
• Familiarity with data quality management practices, data governance frameworks, and best practices.
• Knowledge of data visualization tools (e.g., Power BI) and reporting best practices.
• Understanding of data security, privacy regulations, and compliance requirements.

Skills:
• Proficiency in SQL and scripting languages for database querying and data manipulation.
• Strong analytical and problem-solving skills, with the ability to conduct complex systems analysis and identify areas for improvement.
• Skill in designing scalable and robust data models that support business processes and reporting needs.
• Ability to work with ETL tools to streamline and automate data extraction, transformation, and loading processes.
• Skilled in verbal and written communication, with the ability to explain technical concepts to non-technical stakeholders.
• Competency in project management, including the ability to plan, prioritize, and manage multiple tasks effectively.
• Strong collaboration and interpersonal skills to work effectively with cross-functional teams and stakeholders.
• Skilled in conducting data impact assessments and presenting findings to management and development teams.

Abilities:
• Ability to lead and mentor technical teams, providing guidance on best practices in data architecture, integration, and quality management.
• Ability to conduct detailed analysis of existing systems and identify opportunities for optimization and modernization.
• Capability to develop comprehensive data models and ETL processes that align with business requirements and organizational goals.
• Ability to translate user requirements into technical specifications and scalable data solutions.
• Ability to integrate research, trends, and industry best practices into continuous improvement efforts.
• Ability to communicate effectively with stakeholders at all levels, ensuring alignment between technical solutions and business needs.
• Ability to adapt to changing requirements and priorities, maintaining a focus on quality and project deadlines.
• Strong organizational skills and attention to detail, with the ability to document data architecture, processes, and standards.
• Ability to ensure data solutions comply with security, privacy, and compliance standards.
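To give candidates a concrete sense of the scheduled data-quality measures the posting describes (timeliness, accuracy, and completeness checks on production data after an ETL load), here is a minimal sketch. The table name `gl_staging` and its columns are invented for illustration and are not part of the actual system described in the posting.

```python
import sqlite3
from datetime import datetime, timedelta

def run_quality_checks(conn, expected_min_rows=1, max_age_hours=24):
    """Run completeness, accuracy, and timeliness checks against a
    hypothetical gl_staging table; return a dict of check -> pass/fail."""
    cur = conn.cursor()

    # Completeness: the scheduled load should produce at least the expected rows.
    row_count = cur.execute("SELECT COUNT(*) FROM gl_staging").fetchone()[0]

    # Accuracy: no row may be missing its chart-of-accounts code.
    null_accounts = cur.execute(
        "SELECT COUNT(*) FROM gl_staging WHERE account_code IS NULL"
    ).fetchone()[0]

    # Timeliness: the newest load timestamp must fall inside the schedule window.
    latest = cur.execute("SELECT MAX(loaded_at) FROM gl_staging").fetchone()[0]
    fresh = False
    if latest is not None:
        latest_dt = datetime.fromisoformat(latest)
        fresh = datetime.now() - latest_dt <= timedelta(hours=max_age_hours)

    return {
        "completeness": row_count >= expected_min_rows,
        "accuracy": null_accounts == 0,
        "timeliness": fresh,
    }

if __name__ == "__main__":
    # Simulate a freshly loaded staging table with an in-memory database.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE gl_staging (account_code TEXT, amount REAL, loaded_at TEXT)"
    )
    now = datetime.now().isoformat()
    conn.executemany(
        "INSERT INTO gl_staging VALUES (?, ?, ?)",
        [("1000", 250.0, now), ("2000", -250.0, now)],
    )
    print(run_quality_checks(conn))
```

In practice a gate like this would run after each nightly or hourly ETL cycle, blocking publication of the datamart tables when any check fails; ETL platforms such as Informatica or SSIS express the same idea through their own validation tasks.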