USEReady

Business Analyst - Data Integration

⭐ - Featured Role | Apply direct with Data Freelance Hub
This is a Business Analyst – Data Integration role on a 12+ month onsite contract in Alpharetta, GA, offering a competitive pay rate. It requires 5–8 years of data integration experience, proficiency in SQL, and experience with API projects in FinTech or financial services.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
May 12, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Alpharetta, GA
🧠 - Skills detailed
#Monitoring #Data Integration #Collibra #Data Modeling #UAT (User Acceptance Testing) #Documentation #JSON (JavaScript Object Notation) #Data Governance #Business Analysis #Jira #Metadata #Data Pipeline #Snowflake #Redshift #Airflow #Agile #Cloud #GCP (Google Cloud Platform) #PostgreSQL #Grafana #REST (Representational State Transfer) #Normalization #Data Engineering #Azure #Alation #Scrum #Physical Data Model #Datadog #AWS (Amazon Web Services) #Data Quality #ETL (Extract, Transform, Load) #Scala #Computer Science #dbt (data build tool) #Fivetran #API (Application Programming Interface) #Data Catalog #Compliance #SQL (Structured Query Language) #Swagger
Role description
Business Analyst – Data Integration
Alpharetta, GA | 12+ months Contract | Onsite

Role Overview
We are seeking an experienced Business Analyst – Data Integration to join our growing Technology & Data team. In this role, you will serve as the critical bridge between business stakeholders and engineering teams, driving the design, documentation, and delivery of enterprise-grade data integration initiatives. You will own the end-to-end lifecycle of API-based and ETL integration projects, from identifying data elements and metadata to creating source-to-target mapping documents and data dictionaries, ensuring that private markets data flows accurately, reliably, and performantly across our platform.

Key Responsibilities

Data Integration & API Projects
• Lead business analysis for data integration (API and ETL) projects, gathering and translating complex business requirements into actionable technical specifications.
• Identify, catalog, and validate data elements and metadata across source and target systems to create comprehensive source-to-target mapping documents (a simple illustration of this kind of validation check appears after the responsibilities list).
• Build and maintain data dictionaries that define data fields, formats, lineage, and business rules across the private markets data ecosystem.
• Collaborate with engineering and data engineering teams to design integration workflows that connect investor, shareholder, company, and transaction data domains.
• Partner with third-party data vendors and internal product teams to define API contracts, payload structures, and data exchange standards.

Enterprise Data Modeling
• Contribute to the design and governance of the enterprise data model, ensuring consistency across investor records, cap table data, fund structures, and private placement transactions.
• Develop logical and physical data models in alignment with business requirements and downstream reporting needs.
• Review and validate data model changes with architects and senior engineers, ensuring referential integrity and scalability.
• Maintain model documentation and facilitate data model reviews across cross-functional stakeholders.

Troubleshooting, Performance Tuning & Monitoring
• Diagnose and resolve data pipeline failures, data quality issues, and integration anomalies across staging, UAT, and production environments.
• Use database tools (e.g., SQL, Snowflake, dbt, Datadog, Grafana, or similar) to monitor pipeline health, job execution times, and data freshness SLAs.
• Conduct performance tuning analysis on slow-running queries, inefficient joins, and poorly indexed tables, producing optimization recommendations backed by profiling data.
• Set up and refine alerting thresholds for data pipelines to enable proactive issue detection and minimize downstream business impact.
• Produce root cause analysis (RCA) reports following incidents and drive post-mortem remediation actions.

Stakeholder Collaboration & Documentation
• Work cross-functionally with Deal Operations, Research, Product, Legal, and Compliance teams to identify use cases, data requirements, and functional specifications.
• Facilitate requirements workshops, data discovery sessions, and sign-off reviews with business stakeholders at all levels.
• Author and maintain integration runbooks, BRDs (Business Requirements Documents), functional specifications, and test plans.
• Support UAT by defining acceptance criteria, coordinating test data preparation, and tracking defect resolution.
• Communicate integration progress, risks, and blockers clearly to engineering leads and business sponsors.
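For context, the source-to-target validation and profiling work described above often reduces to simple reconciliation checks between a source extract and the loaded target: row counts, null rates, and dropped keys. The sketch below is purely illustrative and makes assumptions beyond this posting; it uses Python's built-in sqlite3 as a stand-in for Snowflake or PostgreSQL, and the table and column names (src_investors, tgt_investors, country) are hypothetical, not part of this role's actual environment.

import sqlite3

# Illustrative source-to-target reconciliation check.
# sqlite3 stands in for Snowflake/PostgreSQL; all names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Stand-in source and target tables with a small sample of records.
cur.execute("CREATE TABLE src_investors (investor_id TEXT, country TEXT)")
cur.execute("CREATE TABLE tgt_investors (investor_id TEXT, country TEXT)")
cur.executemany("INSERT INTO src_investors VALUES (?, ?)",
                [("INV-001", "US"), ("INV-002", "GB"), ("INV-003", None)])
cur.executemany("INSERT INTO tgt_investors VALUES (?, ?)",
                [("INV-001", "US"), ("INV-002", "GB")])

def profile(table):
    # Row count plus null count for one mapped attribute (country).
    rows, nulls = cur.execute(
        f"SELECT COUNT(*), SUM(CASE WHEN country IS NULL THEN 1 ELSE 0 END) "
        f"FROM {table}").fetchone()
    return rows, nulls or 0

src_rows, src_nulls = profile("src_investors")
tgt_rows, tgt_nulls = profile("tgt_investors")

print(f"rows: source={src_rows}, target={tgt_rows}, dropped={src_rows - tgt_rows}")
print(f"null country: source={src_nulls}, target={tgt_nulls}")

In practice, checks like these would typically be written directly in SQL against the source and target platforms, or codified as automated tests in a tool such as dbt or Great Expectations.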
Required Qualifications
• 5–8 years of experience as a Business Analyst with a strong focus on data integration, ETL, or API-based projects in a FinTech, financial services, or enterprise data environment.
• Hands-on experience creating source-to-target mapping documents and data dictionaries for complex, multi-source integration projects.
• Solid understanding of enterprise data modeling concepts (logical/physical models, ERDs, normalization, dimensional modeling).
• Proficiency in SQL for data querying, profiling, and validation across relational and cloud-based data platforms (e.g., Snowflake, PostgreSQL, Redshift).
• Experience with API integration projects, including REST/JSON payload analysis, Swagger/OpenAPI documentation, and endpoint mapping.
• Demonstrated experience in performance tuning and troubleshooting data pipelines or integration jobs.
• Familiarity with data monitoring tools (Datadog, Monte Carlo, Great Expectations, or equivalent) and alerting frameworks.
• Experience working in Agile/Scrum environments with tools such as Jira and Confluence.
• Strong documentation skills with the ability to produce clear, audience-appropriate technical and business-facing artifacts.
• Bachelor's degree in Computer Science, Information Systems, Finance, or a related quantitative field.

Preferred Qualifications
• Experience in FinTech, private equity, capital markets, or investment platforms (familiarity with cap tables, private placements, or fund structures is a strong plus).
• Exposure to dbt, Airflow, Fivetran, or other modern data stack tooling.
• Knowledge of data governance frameworks and data catalog tools (e.g., Alation, Collibra, Atlan).
• Experience with cloud platforms such as AWS, GCP, or Azure in a data engineering context.
• CBAP, PMI-PBA, or equivalent business analysis certification.
• Familiarity with SDLC governance and change management practices in a regulated financial environment.

Thanks and regards,
Asha Krishna
Associate Director - Talent Partner - US & Canada
https://www.useready.com/
Email: ashak@useready.com