

Swoon
Data Analyst
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst on a 7-month contract in downtown Chicago, paying $368 per day. Key skills include Python, SQL, AWS S3, Snowflake, and dashboarding tools like Tableau. Hybrid work requires 3 days in-office weekly.
Country
United States
Currency
$ USD
-
Day rate
368
-
Date
April 28, 2026
Duration
7 months
-
Location
Hybrid
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Chicago, IL
-
Skills detailed
#Scripting #Data Processing #Tableau #Unix #Data Management #Linux #AWS (Amazon Web Services) #Data Engineering #Automation #Shell Scripting #Visualization #Data Lifecycle #Microsoft Power BI #Cloud #Data Manipulation #Datasets #Databricks #S3 (Amazon Simple Storage Service) #AWS S3 (Amazon Simple Storage Service) #Jupyter #Data Strategy #Snowflake #Python #PySpark #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Strategy #Data Analysis #BI (Business Intelligence)
Role description
Position: Data Analyst
Client: Fortune 100 Bank
Location: Chicago downtown - Hybrid 3x per week
Duration: 7-month contract to start
Pay Rate: $368 per day (USD)
Role Overview
We are seeking a technically inclined Data Analyst to join our team in downtown Chicago for a 7-month contract. In this role, you will be responsible for transforming raw data into actionable insights, using a modern cloud data stack to support enterprise-level decision-making. The ideal candidate is comfortable with both the coding and visualization aspects of the data lifecycle.
Key Responsibilities
• Data Engineering & Analysis: Utilize Python and SQL to retrieve, manipulate, and clean complex datasets for downstream analysis.
• Cloud Data Management: Interact with AWS S3 for data consumption and leverage Snowflake for cloud data warehousing needs.
• Visualization: Design and maintain intuitive dashboards to communicate key performance metrics to stakeholders.
• Automation: Use Linux/Unix shell scripting to navigate environments and automate routine data tasks.
• Collaboration: Work within a hybrid team structure to align data deliverables with business objectives.
Technical Requirements
• Python/PySpark: Hands-on experience using Databricks or Jupyter Notebooks for data processing.
• SQL: Advanced proficiency in data manipulation and complex query writing.
• Cloud Infrastructure: Experience with AWS (S3) and Snowflake cloud data warehousing.
• Operating Systems: Comfortable with Linux/Unix basics and shell scripting.
• Dashboarding: Experience with visual reporting tools such as Tableau, Power BI, or QuickSight.
Logistics & Benefits
• Work Schedule: Hybrid model requiring 3 days per week in our downtown Chicago office.
• Duration: 7-month initial contract.
• Impact: Direct involvement in high-level data strategy within a collaborative environment.
