

BCforward
Data Analyst
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst on a contract of unspecified length, at a pay rate of $50.00 - $70.00 per hour. Key skills include 5+ years in data modeling, expertise in Cogndum and Treeno, and proficiency in Databricks and SQL.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$560
🗓️ - Date
February 11, 2026
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Columbus, OH
🧠 - Skills detailed
#Visualization #AI (Artificial Intelligence) #Tableau #Python #Alteryx #Programming #Databricks #Data Wrangling #Jira #Data Processing #Model Deployment #Data Management #Redshift #Agile #Lambda (AWS Lambda) #Data Engineering #S3 (Amazon Simple Storage Service) #Documentation #Complex Queries #Scala #Security #ETL (Extract, Transform, Load) #Data Lake #Datasets #Data Governance #ML (Machine Learning) #Data Analysis #SQL (Structured Query Language) #AWS (Amazon Web Services) #Data Pipeline #Data Science #Compliance #SQL Queries #Data Modeling #Cloud #Java #Jupyter #Deployment
Role description
Job Responsibilities
We are standing up a modern data platform to support enterprise-wide data management, analytics, and advanced reporting. The platform will leverage data modeling, data lakes, and data science workflows to deliver scalability, reliability, and actionable insights. This role involves working with technologies such as Cogndum, Treeno, Databricks, and modern data science platforms.
Responsibilities
Design, develop, and maintain data models using Cogndum and Treeno, ensuring they align with business and technical requirements.
Build scalable data pipelines and ETL workflows using Databricks to ingest, process, and transform large datasets (see the pipeline sketch after this list).
Architect and manage Data Lakes for structured and unstructured data, ensuring data availability, integrity, and governance.
Write and optimize SQL queries for data analysis, reporting, and integration across multiple systems.
Develop and implement Python-based data science workflows, including preprocessing, modeling, and deployment of ML solutions.
Use Java for back-end integrations, APIs, or high-performance data processing where required.
Partner with business stakeholders, data scientists, and engineers to translate requirements into technical solutions.
Ensure compliance with data governance, security, and regulatory standards.
Prepare and maintain clear documentation of data models, workflows, and processes.
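To ground the Databricks pipeline responsibility above, here is a minimal PySpark sketch of an ingest-transform-load job. This is an illustration, not the team's actual pipeline: the bucket path, target table, and column names (example-lake, curated.orders, order_id, order_ts, amount) are all hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: raw JSON events landed in the data lake (path is hypothetical).
raw = spark.read.json("s3://example-lake/raw/orders/")

# Transform: deduplicate, enforce types, derive a partition key, drop bad rows.
curated = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Load: write a partitioned Delta table (hypothetical schema "curated")
# for downstream analytics and reporting.
(curated.write.format("delta")
        .mode("overwrite")
        .partitionBy("order_date")
        .saveAsTable("curated.orders"))
```

The resulting table can then be queried with plain SQL, e.g. spark.sql("SELECT order_date, SUM(amount) FROM curated.orders GROUP BY order_date"), which is the kind of analysis and reporting query the SQL responsibility above refers to.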
Required Skills & Qualifications
5+ years of experience in data modeling, data engineering, or data analytics.
Hands-on expertise in Data Modeling using Cogndum and Treeno.
Strong experience with Databricks for building pipelines, transformations, and analytics.
Proficiency in Data Lakes architecture and implementation.
Advanced knowledge of SQL for complex queries, tuning, and reporting.
Strong programming experience with Python (data wrangling, ML workflows) and Java (back-end integration, processing).
Experience in data science workflows, including exploratory data analysis (EDA), feature engineering, and model integration (a sketch follows this list).
Excellent problem-solving, analytical, and communication skills.
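As an illustration of the data science workflow requirement above, here is a minimal pandas/scikit-learn sketch covering EDA, one derived feature, and a deployable model pipeline. The input file and its columns (orders.parquet, amount, item_count, churned) are hypothetical placeholders.

```python
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

df = pd.read_parquet("orders.parquet")  # hypothetical extract

# Exploratory data analysis: quick profile and missing-value check.
print(df.describe(include="all"))
print(df.isna().mean().sort_values(ascending=False).head())

# Feature engineering: derive a simple per-item spend feature.
df["amount_per_item"] = df["amount"] / df["item_count"].clip(lower=1)

features = ["amount", "item_count", "amount_per_item"]
X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

# Model integration: scaler + classifier bundled as one pipeline object.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

Bundling preprocessing and the classifier into a single Pipeline object keeps the two in sync, which simplifies the model-integration and deployment step the requirement mentions.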
Nice to Have
Experience working with AWS cloud services (S3, Redshift, Glue, EMR, Lambda); see the S3 sketch after this list.
Familiarity with Jupyter Notebooks for prototyping, data analysis, and collaborative development.
Exposure to AI/ML projects and model deployment pipelines.
Experience in agile methodologies and tools such as Jira/Confluence.
Pay: $50.00 - $70.00 per hour
Benefits:
401(k)
Dental insurance
Health insurance
Vision insurance
Experience:
Alteryx: 3 years (Required)
Tableau: 3 years (Required)
SQL: 3 years (Required)
Data visualization: 3 years (Required)
Work Location: In person