

Data Architect
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Architect on a 24+ month contract in Tallahassee, FL; the pay rate is not disclosed. Key skills include data modeling, Informatica tools, and cloud data lakehouse architecture. A Bachelor's degree and relevant certifications are required.
Country
United States
Currency
$ USD
Day rate
Not disclosed
Date discovered
August 2, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
Tallahassee, FL
Skills detailed
#Classification #BI (Business Intelligence) #Data Profiling #Data Lakehouse #Observability #Consulting #Data Catalog #Data Extraction #SQL (Structured Query Language) #Data Modeling #Tableau #S3 (Amazon Simple Storage Service) #Data Lake #Data Warehouse #Delta Lake #Data Ingestion #MDM (Master Data Management) #UAT (User Acceptance Testing) #Regression #DataOps #Data Enrichment #SaaS (Software as a Service) #Storage #Computer Science #Data Dictionary #ERWin #Lambda (AWS Lambda) #Data Architecture #Qlik #Data Pipeline #Automation #Physical Data Model #Informatica #Databases #Python #Replication #Data Governance #ETL (Extract, Transform, Load) #AWS S3 (Amazon Simple Storage Service) #Monitoring #Cloud #Metadata #AWS Lambda #Security #AWS (Amazon Web Services) #Data Science #Dataiku #ML (Machine Learning) #Data Quality #Data Privacy #Data Security #Informatica Cloud #Programming #Databricks #Data Storage #Data Access #NoSQL #API (Application Programming Interface) #Data Integration #Snowflake #Compliance #Data Management #Batch #Quality Assurance
Role description
Our government client is looking for an experienced Data Architect for an on-site, renewable 24+ month contract role in Tallahassee, FL.
Role: Data Architect
Job Responsibilities
This scope of work is for a Data and Analytics Architect/Engineer who will be responsible for collaborating with the Department and its System Implementer (SI) to implement and maintain the technical aspects of the Enterprise Data and Analytics Platform (EDAP) component within the data modernization (DM) program. The contractor will provide data modeling and design services, as well as hands-on engineering development, during the implementation of various EDAP use cases and will provide support services once in production. These services will be provided to the Department.
Contractor Responsibilities
SERVICE TASKS: As directed by the Department, the contractor's staff will perform the following tasks in the time and manner specified:
• Monthly and as directed by the Department, provide technical, data, and process subject matter expertise and championship to the Department related to all aspects of the Enterprise Data and Analytics Platform (EDAP) project.
• Monthly and as directed by the Department, reverse engineer, study, and utilize Informatica Cloud Data Profiling (CDP) to profile prioritized source system data models, dictionaries, and data flows to understand and document the current state and identify data problems that will have to be remediated within the EDAP.
• Monthly and as directed by the Department, work with business subject matter experts (SMEs) and the Enterprise Data and Analytics Architect to document the current state Entity Relationship Diagram (ERD) in Erwin, as well as current state logical and physical data models for in-scope Department subject areas.
• Monthly and as directed by the Department, perform needs analysis, analyze business processes, and develop process maps in collaboration with project team members.
• Monthly and as directed by the Department, collaborate with the Data Governance team to develop, maintain, and align data standards and the business glossary/data dictionary with the data models, as well as develop error handling and data quality rules to be used in the data pipelines, adhering to the main data quality dimensions of accuracy, completeness, timeliness, consistency, validity, and uniqueness.
• Monthly and as directed by the Department, collaborate with business SMEs, the SI, architects, and technical data stewards to create and maintain target state data models (conceptual, logical, physical) for both the data warehouse and data lakehouse. This includes physical and logical data structures, covering normalized (third normal form, 3NF) models, dimensional models (star and snowflake schemas), and delta lake/iceberg schemas (see the schema sketch following this task list).
• Monthly and as directed by the Department, use Informatica Cloud Metadata Command Center (MCC) and Cloud Data Governance and Catalog (CDGC) to design, develop, and support metadata (both active and passive) tagging/collection and maintenance processes and pipelines, ensuring metadata is kept in sync across the various EDAP components.
• Monthly and as directed by the Department, collaborate with Department OIT and technical SMEs to provision source data connectors for purposes of data profiling, lineage, quality scoring, and integration. Manage the Informatica Secure Agent server clusters to optimize performance and cost.
• Monthly and as directed by the Department, collaborate with the SI, the EDAP project team, and architects to design (source-to-target mapping), develop, and support data pipeline components, including ingestion, data lake layers, data warehouse structures, and data lakehouse tables, ensuring high data quality, security, pipeline orchestration, performance, and cost optimization.
• Monthly and as directed by the Department, collaborate with the SI, the EDAP project team, and architects to design, develop, and support Informatica IDMC Cloud Customer 360 SaaS MDM and RDM, including source system data ingestion, domain models, data enrichment and quality, match/merge logic, survivorship rules, golden record, and data extraction for the data hubs and data lakehouse/warehouse environments.
• Monthly and as directed by the Department, collaborate with the SI and the EDAP project team in designing, developing, and supporting a performance-optimized, high-quality data hub environment for system-to-system data exchanges, replacing "spaghetti" integrations, that accommodates batch and real-time updates and is accessed via Application Programming Interfaces (APIs).
• Monthly and as directed by the Department, collaborate with the SI, OIT staff, and the Department analyst community to implement and support ESRI, ABI (Qlik, Tableau), and DSML (Dataiku) tools, including connections to EDAP data structures and metadata, as well as fine-grained user access controls (row- and column-level) utilizing classifications in the metadata.
• Monthly and as directed by the Department, collaborate with Department OIT staff to optimize the performance and cost of Snowflake virtual warehouse clusters and the Databricks classic compute plane.
• Monthly and as directed by the Department, collaborate with the SI and OIT staff to ensure the EDAP design incorporates robust security and privacy compliance throughout the cloud ecosystem, including AWS S3, Databricks, Snowflake, Informatica, Qlik, Tableau, and ESRI.
• Monthly and as directed by the Department, ensure the data pipeline, data storage, and data access implementations address data privacy and regulatory compliance for HIPAA and other regulated sensitive data types, and include sensitive data identification, classification, and tagging or metadata for both structured and unstructured data (e.g., files).
• Monthly and as directed by the Department, ensure EDAP components address secure system and user (developer and analyst) access to tools and data, as well as monitoring/alerting of access to the various components of EDAP. User authorization includes fine-grained user access controls such as RBAC, PBAC, and ABAC at the row and column level (see the access policy in the sketch following this task list).
• Monthly, follow the EDAP testing plan to perform various data pipeline testing functions, including unit, integration, and regression testing, and support user acceptance testing.
• DATA SECURITY AND CONFIDENTIALITY TASK: As directed by the Department, the Contractor, its employees, subcontractors, and agents must always comply with all Department data security procedures and policies in the performance of this scope of work, as specified in the Data Security and Confidentiality document attached to the purchase order.
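For illustration only, the following is a minimal Snowflake SQL sketch of the kind of dimensional structure and row- and column-level controls referenced in the tasks above: one dimension table, one fact table, a row access policy driven by an entitlement table, and a masking policy on a sensitive column. All object, role, and column names (client_dim, service_fact, county_entitlements, county_access_policy, ssn_mask, EDAP_ADMIN, PRIVACY_OFFICER) are hypothetical placeholders, not Department or EDAP objects; the actual models, classifications, and entitlement sources would be defined with the SI and the Data Governance team.

-- Illustrative only: hypothetical subject area showing a star-schema slice
-- with row- and column-level controls. All names are placeholders.

-- Conformed dimension table
CREATE TABLE IF NOT EXISTS client_dim (
    client_sk    NUMBER       AUTOINCREMENT PRIMARY KEY,
    client_id    VARCHAR(36)  NOT NULL,   -- natural key from the source system
    client_name  VARCHAR(200),
    county_code  VARCHAR(5),              -- used for row-level security below
    ssn          VARCHAR(11)              -- sensitive column; masked below
);

-- Fact table keyed to the dimension
CREATE TABLE IF NOT EXISTS service_fact (
    service_sk   NUMBER       AUTOINCREMENT PRIMARY KEY,
    client_sk    NUMBER       NOT NULL REFERENCES client_dim (client_sk),
    service_date DATE         NOT NULL,
    service_code VARCHAR(10),
    amount_paid  NUMBER(12,2)
);

-- Row access policy: analysts see only rows for counties they are entitled to.
CREATE OR REPLACE ROW ACCESS POLICY county_access_policy
  AS (county_code VARCHAR) RETURNS BOOLEAN ->
    CURRENT_ROLE() = 'EDAP_ADMIN'
    OR EXISTS (
        SELECT 1
        FROM county_entitlements e        -- hypothetical role-to-county mapping table
        WHERE e.role_name   = CURRENT_ROLE()
          AND e.county_code = county_code
    );

ALTER TABLE client_dim ADD ROW ACCESS POLICY county_access_policy ON (county_code);

-- Masking policy: column-level control, driven by role and metadata classification.
CREATE OR REPLACE MASKING POLICY ssn_mask AS (val VARCHAR) RETURNS VARCHAR ->
    CASE WHEN CURRENT_ROLE() IN ('EDAP_ADMIN', 'PRIVACY_OFFICER') THEN val
         ELSE 'XXX-XX-' || RIGHT(val, 4)
    END;

ALTER TABLE client_dim MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

In practice the entitlement table and role names would come from the Department's governance and security model; the sketch only shows how row access and masking policies attach to modeled tables.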
Contractor Qualifications And Experience
Contractor staff assigned to this agreement must possess the following minimum qualifications and experience:
• Minimum of a Bachelor's (4-year) degree in Computer Science, Analytics/Data Science, Information Systems, Business Administration, Public Health Informatics, or another related field.
• Current data and/or analytics certification such as Certified Data Management Professional (CDMP). Eighteen or more hours of participation in webinars or conferences related to data and analytics over the last 3 years may be substituted for the certification.
• Five or more years of experience collaborating with various lines of business and the analyst/data science community; must demonstrate an understanding of general business operations related to healthcare and administration, the ability to translate use case requirements into data and analytics (D&A) solutions, and the ability to articulate D&A solutions, tool usage, and data models to a non-technical audience.
• Five or more years of experience modeling, engineering, implementing, and supporting data warehouses (both normalized and dimensional), including two or more years of experience with Snowflake Data Warehouse.
• Three or more years of experience designing, engineering, implementing, and supporting cloud-based data lakes, including lake layers and bucket structures in AWS S3 and related Apache Foundation tools such as Parquet.
• Two or more years of experience modeling, engineering, implementing, and supporting cloud data lakehouse structures (Delta Lake, Hudi, and Iceberg) utilizing Databricks (see the lakehouse sketch following this list).
• Seven or more years of experience in data modeling (including Entity Relationship, Logical, Conceptual, and Physical models) for analytical purposes, and data profiling/reverse engineering for both schema-on-read and schema-on-write environments. The experience must include advanced proficiency with Erwin Data Modeler.
• Five or more years of experience with data pipeline/integration tools, including source-to-target mapping, coding, data quality and enrichment transforms, observability, orchestration, performance optimization, and testing, via various methods (e.g., ETL/ELT/batch, CDC, streaming) using Informatica tools.
• This experience must include two or more years of experience using the Informatica IDMC suite of tools, including Cloud Data Integration (CDI), Cloud Data Quality (CDQ), Cloud Data Profiling (CDP), and Cloud Data Ingestion and Replication (CDIR).
• Five or more years of experience with SQL programming and three or more years of experience with Python or a similar object-oriented high-level programming language.
• Experience with AWS Lambda functions is helpful.
• Five or more years of experience engineering relational (both row and columnar store) and NoSQL (e.g., Document, Graph, Vector, Key-Value) databases.
• Three or more years of experience working within the AWS cloud infrastructure.
• Two or more years of experience utilizing DevOps or DataOps processes.
• Five or more years of data and analytics testing/quality assurance and acceptance experience, including best practices, tools, and automation.
• Two or more years of experience working with metadata and a data catalog that support data lakehouse and data warehouse operations, as well as supporting machine learning.
• Two or more years of experience implementing and supporting Master Data Management (MDM) and Reference Data Management (RDM) in Informatica IDMC (Cloud) Customer and Reference 360 SaaS.
• Four or more years of experience supporting cloud-based Analytics & Business Intelligence (ABI) tools (Qlik and Tableau), including secure data warehouse connections, performance, and user access controls.
• Two or more years of experience supporting cloud-based Data Science and Machine Learning (DSML) platforms (Dataiku), including statistical model life-cycle management, endpoints, and machine learning.
• Must possess excellent communication skills, including verbal, written, and diagramming.
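As a companion to the Databricks lakehouse qualification above, the following is a minimal, illustrative Databricks SQL sketch of a Delta Lake table in a curated lakehouse layer, loaded incrementally with a MERGE (upsert) from a staging source. The schema, table, and view names (silver.client, stg_client_changes) are hypothetical examples chosen for this sketch, not EDAP objects.

-- Illustrative only: hypothetical curated-layer Delta table and incremental upsert.
CREATE TABLE IF NOT EXISTS silver.client (
    client_id    STRING NOT NULL,
    client_name  STRING,
    county_code  STRING,
    updated_at   TIMESTAMP
)
USING DELTA
PARTITIONED BY (county_code);

-- Incremental load: apply inserts/updates captured from the source (e.g., a CDC batch).
MERGE INTO silver.client AS tgt
USING stg_client_changes AS src          -- hypothetical staging view of changed rows
   ON tgt.client_id = src.client_id
WHEN MATCHED AND src.updated_at > tgt.updated_at THEN
  UPDATE SET client_name = src.client_name,
             county_code = src.county_code,
             updated_at  = src.updated_at
WHEN NOT MATCHED THEN
  INSERT (client_id, client_name, county_code, updated_at)
  VALUES (src.client_id, src.client_name, src.county_code, src.updated_at);

The same pattern generalizes to the warehouse and hub loads described in the responsibilities; partitioning, layer names, and merge conditions would follow the Department's approved models.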
About Vector
Vector Consulting, Inc. (headquartered in Atlanta) is an IT Talent Acquisition Solutions firm committed to delivering results. Since our founding in 1990, we have been partnering with our customers, understanding their business, and developing solutions with a commitment to quality, reliability, and value. Our continuing growth has been, and continues to be, built around successful relationships that are based on our organization's operating philosophy and commitment to:
People, Partnerships, Purpose and Performance - THE VECTOR WAY
www.vectorconsulting.com
"Celebrating 30 years of service."