

Axiom Global Technologies
Business Data Analyst
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Business Data Analyst; the contract length, pay rate, and work location are listed as "unknown". Key skills include advanced SQL, SSIS, and strong ETL concepts. Experience in insurance/finance is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 2, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Jersey City, NJ
-
🧠 - Skills detailed
#SSIS (SQL Server Integration Services) #Azure Data Factory #Data Pipeline #Data Quality #Business Analysis #Documentation #Snowflake #Azure #Databricks #ADF (Azure Data Factory) #ETL (Extract, Transform, Load) #Data Lineage #SQL (Structured Query Language) #Data Analysis
Role description
We are looking for a skilled Technical Data / Business Analyst with strong expertise in SQL and SSIS to support the modernization of legacy ETL pipelines.
Key Responsibilities:
• Analyze and reverse engineer SSIS packages, SQL procedures & data flows
• Create Source-to-Target Mapping (STTM) and functional specs
• Translate legacy ETL logic into modern data pipelines (ADF / Databricks)
• Document data lineage, transformations, and dependencies
• Work with stakeholders to validate business rules and define test criteria
• Support reconciliation, testing, and data validation
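To give a flavor of the reconciliation and validation work in the last bullet: a minimal Python sketch of comparing a source extract against a target load by row counts and row fingerprints. The `reconcile` and `row_fingerprint` helpers are hypothetical illustrations, not part of any named toolset, and assume rows arrive as plain dicts.

```python
import hashlib

def row_fingerprint(row, columns):
    """Build a stable fingerprint for one row over the given columns
    (hypothetical helper for illustration)."""
    joined = "|".join(str(row[c]) for c in columns)
    return hashlib.md5(joined.encode()).hexdigest()

def reconcile(source_rows, target_rows, columns):
    """Compare source vs. target datasets: row counts plus set
    differences of per-row fingerprints (hypothetical helper)."""
    src = {row_fingerprint(r, columns) for r in source_rows}
    tgt = {row_fingerprint(r, columns) for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": len(src - tgt),
        "extra_in_target": len(tgt - src),
    }
```

In practice the same idea is usually pushed down into SQL (counts and checksums per table), but the shape of the check is the same: any nonzero `missing_in_target` or `extra_in_target` flags rows to investigate.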
Required Skills:
• Advanced SQL (joins, CTEs, performance tuning)
• Hands-on SSIS experience
• Strong ETL concepts (SCD, incremental loads, data quality, error handling)
• Excellent documentation and analytical skills
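Of the ETL concepts listed above, slowly changing dimensions (SCD) are the least self-explanatory, so here is a minimal Python sketch of a Type 2 update: changed rows are closed out and a new current version is appended. The `apply_scd2` helper and the `valid_from`/`valid_to`/`is_current` bookkeeping columns are illustrative assumptions, not a specific product's API.

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Apply a Type 2 slowly-changing-dimension update (hypothetical
    helper): close changed current rows, append new versions."""
    today = today or date.today().isoformat()
    # Index the currently active row per business key.
    current = {r[key]: r for r in dimension if r["is_current"]}
    out = list(dimension)
    for row in incoming:
        existing = current.get(row[key])
        if existing is None:
            # Brand-new key: insert as the first current version.
            out.append({**row, "valid_from": today,
                        "valid_to": None, "is_current": True})
        elif any(existing[c] != row[c] for c in tracked):
            # Tracked attribute changed: close the old version,
            # append a new current one.
            existing["valid_to"] = today
            existing["is_current"] = False
            out.append({**row, "valid_from": today,
                        "valid_to": None, "is_current": True})
    return out
```

In a warehouse this logic would typically live in a MERGE statement or an SSIS/ADF dataflow rather than application code; the sketch just shows what "Type 2" means mechanically.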
Nice to Have:
• Experience with Azure Data Factory, Databricks, Snowflake
• Exposure to data domains like insurance/finance






