

Senior Data Analyst
Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Analyst contractor, onsite in Dallas, TX, for a contract length of "X months" at a pay rate of "$X/hour". Requires 5+ years in data analytics, experience with Snowflake, SQL, and ETL processes. US Citizenship required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: September 3, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: Dallas, TX
Skills detailed:
#GitHub #Data Analysis #ETL (Extract, Transform, Load) #Agile #Data Accuracy #Libraries #Matillion #SQL Queries #SSIS (SQL Server Integration Services) #Jenkins #ML (Machine Learning) #SAP #Pandas #Scala #Liquibase #Databases #Azure #Alteryx #SnowPipe #Data Ingestion #Deployment #Datasets #Batch #SQL (Structured Query Language) #Data Engineering #Data Pipeline #ADF (Azure Data Factory) #Database Administration #DevOps #Data Cleansing #Databricks #Snowflake #Azure Data Factory #Python
Role description
Hi,
Looking for US Citizens and local candidates only.
Role: Data Analytics Contractor
Location: 2200 Ross Avenue, Dallas, TX, USA, 75201 - Onsite
Type: Contract
Description
The services you will provide to the client project team:
As a Data Analytics Contractor, you will analyze and interpret complex data sets to provide actionable insights and support decision-making processes. You will:
• Collect, process, and analyze large datasets to identify trends, patterns, and insights.
• Develop and maintain data models, dashboards, and reports to visualize data findings.
• Utilize statistical and analytical tools to perform data analysis and generate insights.
• Collaborate with project teams to understand data requirements and deliver relevant analytical solutions.
• Ensure data accuracy and integrity by performing data validation and quality checks.
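The data validation and quality checks described above can be sketched in plain Python. This is an illustrative sketch only: the column names (`order_id`, `amount`) and the specific rules (null keys, duplicate keys, negative amounts) are invented for the example and are not part of the posting.

```python
# Minimal data-quality check sketch: nulls, duplicate keys, and range checks.
# Column names and rules here are illustrative assumptions only.

def validate(rows, key="order_id", amount_field="amount"):
    """Return a dict counting data-quality issues in a list of row dicts."""
    issues = {"null_keys": 0, "duplicate_keys": 0, "negative_amounts": 0}
    seen = set()
    for row in rows:
        key_value = row.get(key)
        if key_value is None:
            issues["null_keys"] += 1
            continue
        if key_value in seen:
            issues["duplicate_keys"] += 1
        seen.add(key_value)
        amount = row.get(amount_field)
        if amount is not None and amount < 0:
            issues["negative_amounts"] += 1
    return issues

rows = [
    {"order_id": 1, "amount": 250.0},
    {"order_id": 1, "amount": 250.0},    # duplicate key
    {"order_id": None, "amount": 10.0},  # missing key
    {"order_id": 2, "amount": -5.0},     # out-of-range amount
]
report = validate(rows)
```

In practice such checks would run inside the warehouse or pipeline tooling the role names (Snowflake, Databricks), but the logic is the same.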
Additional Details for Role:
Key Responsibilities
• Develop and optimize complex batch and near-real-time SQL queries to meet product requirements and business needs.
• Design and implement core foundational datasets that are reusable, scalable, and performant.
• Architect, implement, deploy, and maintain data-driven solutions in Snowflake and Databricks.
• Develop and manage data pipelines supporting multiple reports, tools, and applications.
• Conduct advanced statistical analysis to yield actionable insights, identify correlations and trends, measure performance, and visualize disparate sources of data.
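As a sketch of the batch-query work in the first responsibility, the following uses Python's built-in `sqlite3` as a stand-in for Snowflake; the `sales` schema and its data are invented for the example.

```python
import sqlite3

# Illustrative batch aggregation query; SQLite stands in for Snowflake,
# and the sales(region, amount) schema is an invented example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("TX", 100.0), ("TX", 50.0), ("CA", 75.0)],
)

# Batch rollup: total revenue per region, highest first.
totals = conn.execute(
    """
    SELECT region, SUM(amount) AS revenue
    FROM sales
    GROUP BY region
    ORDER BY revenue DESC
    """
).fetchall()
conn.close()
```

The same `GROUP BY` rollup pattern scales to the warehouse setting; near-real-time variants would typically filter on an ingestion timestamp rather than scan the whole table.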
Required Skills and Experience
• 5+ years in operations, supply chain, data engineering, data analytics, and/or database administration
• Experience in design, implementation, and optimization in Snowflake or other relational SQL databases
• Experience with data cleansing, curation, mining, manipulation, and analysis from disparate systems
• Experience with configuration control (GitHub preferred) and Data DevOps practices using GitHub Actions, Jenkins, or other deployment pipelines that provide Continuous Integration and Continuous Delivery (CI/CD)
• Experience with Extract, Transform, and Load (ETL) processes using SQL as the foundation
• Experience with database structures and modeling implementations such as third normal form
• US Citizenship
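One required-skills item mentions third normal form. As a hypothetical illustration (again using `sqlite3` as a stand-in, with an invented customers/orders schema), a 3NF design stores each customer's attributes once and has orders reference the customer by key instead of repeating those details per row:

```python
import sqlite3

# Hypothetical third-normal-form sketch: customer attributes live in their
# own table, and orders reference customers by key. Schema is invented.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL,
        city        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        amount      REAL NOT NULL
    );
    INSERT INTO customers VALUES (1, 'Acme', 'Dallas');
    INSERT INTO orders VALUES (10, 1, 99.5), (11, 1, 42.0);
    """
)

# Customer details are stored once and joined back when needed.
rows = conn.execute(
    """
    SELECT o.order_id, c.name, c.city
    FROM orders o
    JOIN customers c USING (customer_id)
    ORDER BY o.order_id
    """
).fetchall()
conn.close()
```

Every non-key column depends only on its own table's key, which is the property the 3NF requirement is getting at.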
Preferred Skills
• Experience with Manufacturing, Operations, and/or Supply Chain processes and systems
• Experience using MRP/ERP systems (SAP)
• Experience with Python and related libraries such as Pandas for advanced data analytics
• Experience with schema deployment solutions such as SchemaChange or Liquibase
• Working knowledge of Agile software development methodologies
• Ability to filter, extract, and analyze information from large, complex datasets
• Strong verbal and written communication skills to collaborate cross-functionally
• Experience with data deployment solutions such as Snowflake Tasks, Matillion, SSIS, Azure Data Factory (ADF), or Alteryx
• Experience with Snowflake Streams, Stages, and Snowpipe for data ingestion
• Experience with VS Code and GitHub Desktop for integrated development
• Experience with web application relational database models and APIs
• Knowledge of statistical modeling and machine learning methods