Data Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst III, requiring 5-7 years of experience, with a contract length of "X months" and a pay rate of "$X/hour." Key skills include SQL, Data Modeling, ETL tools, and Python proficiency.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
600
🗓️ - Date discovered
April 25, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Amazon Neptune #ETL (Extract, Transform, Load) #ADF (Azure Data Factory) #Data Modeling #Alation #Programming #NumPy #Azure Data Factory #Leadership #Data Analysis #SQL (Structured Query Language) #Azure #Strategy #Python #Libraries #Neo4J #Data Interpretation #Workday #Documentation #Pandas #Databases #Tableau #Presto #BI (Business Intelligence) #Scala #Talend #Airflow #Data Engineering #Informatica
Role description

Data Analyst III

Years of Experience: 5-7 years

Key Projects/Responsibilities:

   • Be adept at sourcing data and executing reporting deliverables.

   • Use tools and programming languages such as SQL (Presto/Hive), Tableau, Excel/Sheets, and many other internal tools (such as Workday) to work efficiently at scale.

   • Apply your technical expertise to understanding business needs, scoping requests, and synthesizing insights in collaboration with other members of the analytics team.

   • Apply your technical expertise to help solve for, inform, and influence how our HR tools operate, and to support decision-making for our HR technology team.

   • Capture and maintain reliable documentation to support ongoing project deliverables and alignment with stakeholders and engineering teams.

Minimum Qualifications:

   • 3+ years’ experience with SQL (or a similar language for querying relational databases).

   • 3+ years of data modeling experience, including designing reporting tables for consumption by BI software (see the sketch after this list).

   • Experience with ETL and orchestration tools for manipulating large data sets through software suites such as Informatica, Talend, or Azure Data Factory.

   • Experience initiating and driving projects to completion with minimal guidance.

   • Experience processing and analyzing data sets and interpreting them to inform business decisions.

   • Experience communicating the results of analyses to product and leadership teams to influence the overall strategy of the product.

   • The ability to have effective conversations with clients about their support needs and requirements.

   • Exceptional professionalism and customer-service skills.

   • Must be comfortable working in a fast-paced and demanding environment.
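
For illustration only (not part of the original posting): a minimal pandas sketch of the reporting-table work described in the data modeling requirement above, shaping raw query output into a BI-ready summary table. All table and column names here are hypothetical.

    import pandas as pd

    # Hypothetical raw extract, e.g. the result of a SQL query against an HR system
    raw = pd.DataFrame({
        "employee_id": [1, 2, 3, 4],
        "department": ["HR", "HR", "Finance", "Finance"],
        "hire_date": pd.to_datetime(["2021-01-04", "2022-06-15", "2020-03-01", "2023-09-20"]),
        "salary": [70000, 85000, 92000, 61000],
    })

    # Shape a reporting table aimed at BI consumption: one row per department
    reporting_table = (
        raw.groupby("department")
           .agg(headcount=("employee_id", "nunique"),
                avg_salary=("salary", "mean"),
                earliest_hire=("hire_date", "min"))
           .reset_index()
    )

    print(reporting_table)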

Preferred Qualifications:

   • Python Proficiency

   • Experience with Python libraries such as pandas, NumPy, and scikit-learn.

   • Experience with graph database platforms (e.g. Neo4j, Amazon Neptune, Stardog).

   • Experience with Python-native, fully programmable DAG orchestration tools such as Airflow (a minimal sketch follows this list).

   • Experience building reporting solutions with graph data structures.
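
For illustration only (not part of the original posting): a minimal Airflow 2.x TaskFlow sketch of the Python-native DAG work mentioned above. The pipeline name, tasks, and schedule are hypothetical.

    from datetime import datetime

    from airflow.decorators import dag, task

    # Hypothetical pipeline; "schedule" is the Airflow 2.4+ parameter name
    @dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
    def hr_reporting_pipeline():

        @task
        def extract():
            # Placeholder: pull rows from a source system (e.g. via SQL)
            return [{"employee_id": 1, "department": "HR"}]

        @task
        def load(rows):
            # Placeholder: write rows to a reporting table for BI consumption
            print(f"loaded {len(rows)} rows")

        load(extract())

    hr_reporting_pipeline()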

How will performance be measured?

   • Accuracy of Data Interpretation

   • Task comprehension and solution efficiency

   • Clarity of Written Communication

   • Data Hygiene and Audit Pass Rate

   • Escalation Awareness and Help-Seeking Behavior

Who would be a great fit?

- Familiar with the internal stack.

- Data engineering experience; able to work with customers and understands report-ready data (someone who thinks through the problem).

Disqualifiers:

People without SQL experience.

People who are not familiar with data.

People with no experience on BI projects.

Interview Process:

3 rounds of interviews:

1 – Phone screening. (10 mins)

2 – Technical: SQL screening, BI questions, and ability to do modeling and reporting. (45 mins)

3 – Behavioral interviews. (30 mins)