

Senior Data Analyst
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Analyst on a contract of unknown length, offering a day rate of $720 (USD) and located in Woodbridge, NJ. It requires 10+ years of experience in data analysis; expertise in SQL, DB2, and Snowflake; and knowledge of the Property and Casualty insurance industry.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
720
-
🗓️ - Date discovered
July 17, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Woodbridge, NJ
-
🧠 - Skills detailed
#Data Quality #Python #Data Analysis #Documentation #Deployment #Computer Science #ETL (Extract, Transform, Load) #SQL (Structured Query Language) #Data Lifecycle #Data Mart #Jira #Databases #Data Cleansing #AWS (Amazon Web Services) #Data Architecture #Data Transformations #Snowflake #Informatica #Data Mapping #Data Modeling #Data Warehouse
Role description
Position: Senior Data Analyst
Overview
We are seeking a results-driven Senior Data Analyst to drive the successful execution of enterprise data initiatives within our insurance technology landscape. This individual will play a critical role in overseeing the design, execution, and support of complex data warehouse solutions related to policies, claims, billing, quoting, agencies, and supporting data marts. As a seasoned analyst, you will serve as a strategic link between business needs and technical implementation—ensuring high-quality data solutions are delivered on time and aligned with organizational goals.
This role requires end-to-end project ownership, technical mentoring, deep data knowledge, and the ability to collaborate across cross-functional teams. You’ll also be responsible for managing your workload while coordinating deliverables across internal teams and stakeholders.
What You’ll Do
• Lead full-cycle data projects—from requirement gathering and analysis to deployment and post-implementation support.
• Collaborate with stakeholders to translate business needs into technical specifications and data models.
• Design and optimize complex data transformations and mappings across DB2, Snowflake, and other relational databases.
• Support testing strategies, including unit, integration, user acceptance, and performance testing phases.
• Develop documentation for system specifications, data flows, and transformation logic following SDLC best practices.
• Mentor and guide junior analysts, providing direction and oversight as needed.
• Interface with ETL developers, application teams, and QA resources to ensure cohesive data solutions.
• Troubleshoot data issues and perform performance tuning across production environments.
• Maintain a proactive stance by identifying opportunities for improvement and solving data quality challenges before they impact operations.
• Balance multiple initiatives, manage shifting priorities, and help ensure timely project completion.
• Serve as a subject matter expert in insurance data lifecycle, including policy and claims operations.
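To illustrate the kind of transformation, cleansing, and mapping work described above, here is a minimal Python sketch. All field names (POL_NO, EFF_DT, PREM_AMT) and mart column names are hypothetical examples, not part of this posting or any specific system.

```python
# Hypothetical sketch: cleanse a raw policy record and map its fields
# to data-mart column names. Field names are illustrative only.
from datetime import date

# Illustrative source-to-mart column mapping
RAW_TO_MART = {
    "POL_NO": "policy_number",
    "EFF_DT": "effective_date",
    "PREM_AMT": "written_premium",
}

def cleanse_policy_record(raw: dict) -> dict:
    """Map raw policy fields to mart names and normalize their values."""
    record = {}
    for src, dest in RAW_TO_MART.items():
        value = raw.get(src)
        if isinstance(value, str):
            value = value.strip() or None  # blank strings become NULLs
        record[dest] = value

    # Dates assumed to arrive as YYYYMMDD strings; parse into date objects
    eff = record["effective_date"]
    if eff is not None:
        record["effective_date"] = date(int(eff[:4]), int(eff[4:6]), int(eff[6:8]))

    # Premium amounts assumed to arrive as numeric strings; normalize to 2 dp
    if record["written_premium"] is not None:
        record["written_premium"] = round(float(record["written_premium"]), 2)

    return record
```

In practice this logic would typically live in an ETL tool or SQL transformation rather than standalone Python; the sketch only shows the shape of the mapping and cleansing rules an analyst would specify.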
Required Qualifications
• Bachelor’s degree in Computer Science or related field (or equivalent experience).
• 10+ years of hands-on experience working in data analysis roles, preferably within insurance or finance.
• Expertise in SQL and strong familiarity with DB2 and Snowflake environments.
• Strong understanding of data transformation, cleansing, and mapping techniques.
• Experience with project estimation, planning, and delivery under structured SDLC methodologies.
• Familiarity with Jira or other project tracking tools.
• Advanced knowledge in data modeling and performance tuning strategies.
• Demonstrated ability to manage projects and mentor teams.
• Proven success in working across functional teams and delivering production-ready data systems.
• Strong analytical mindset and attention to detail.
• Excellent communication skills with the ability to articulate complex technical issues to both technical and non-technical audiences.
• Must be comfortable working independently and collaboratively in a fast-paced environment.
• Knowledge of the Property and Casualty (P&C) insurance industry and understanding of policy and claim life cycles.
• Familiarity with Informatica or similar ETL technologies for code review and impact analysis.
• Working knowledge of AWS-based data architecture.
• Exposure to Python (a plus, not required).
• Hands-on experience in large-scale Data Warehouse projects using modern tools and methodologies.
Skills: data,data cleansing,performance tuning,data mapping,operations,aws,communication,informatica,data solutions,python,mentoring,sql,sdlc methodologies,data modeling,jira,property & casualty insurance,db2,data transformation,project planning,insurance,snowflake,project estimation,data warehouse