

ektello
Data Analyst
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analyst with a 12-month contract in New York, NY, paying $43-47/hr. Requires a Bachelor's degree, 3+ years in data analysis, strong SQL and Python skills, and experience with data pipelines and Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
376
-
🗓️ - Date
April 17, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, NY
-
🧠 - Skills detailed
#IAM (Identity and Access Management) #A/B Testing #Jira #ETL (Extract, Transform, Load) #Agile #Snowflake #Computer Science #Mathematics #Lambda (AWS Lambda) #Athena #Automation #EC2 #SQL Queries #S3 (Amazon Simple Storage Service) #Data Pipeline #Big Data #SQL (Structured Query Language) #GitHub #AWS (Amazon Web Services) #Data Lake #Statistics #Python #Data Analysis
Role description
Title: Data Analyst
Client: Global leader in Technology/ Electronics
Duration: 12-month contract with a high chance of extension or conversion to permanent.
Location: One Pennsylvania Plaza, New York, NY (Onsite)
Pay: $43-47/hr W2 + Benefits/PTO
Top Required Skills
• Bachelor's degree
• 3+ years in data analysis or engineering
• Strong SQL skills – Minimum 3 years hands-on SQL experience.
• Minimum 2 years of hands-on Python experience
• Hands-on data pipelines experience
• Exposure to Snowflake or Big data
Responsibilities:
• Working closely with internal stakeholders, translate business requirements into SQL queries and Python scripts for ad hoc analysis and custom reporting solutions.
• Transform and cleanse outputs into meaningful analysis for business/sales teams; explore data to identify client ad performance opportunities, trends, and anomalies, and contribute to new reporting capabilities/solutions.
• Architect data pipelines to connect different data sources and automate reporting flows.
• Develop and maintain Extract, Transform & Load (ETL) processes tailored to evolving client needs.
• Tune and optimize current reporting solutions to meet updated client needs, new measurement methodologies, or improve query performance.
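The ad hoc SQL-to-Python workflow these responsibilities describe can be sketched as follows. This is a minimal illustration only; the table and column names are invented, and SQLite stands in for a warehouse such as Snowflake:

```python
import sqlite3

# Hypothetical example: extract rows with a SQL query, then transform them
# in Python into a summary suitable for a business report.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE ad_events (client TEXT, impressions INTEGER)")
conn.executemany(
    "INSERT INTO ad_events VALUES (?, ?)",
    [("acme", 120), ("acme", 80), ("globex", 50)],
)

# Extract: aggregate impressions per client in SQL.
rows = conn.execute(
    "SELECT client, SUM(impressions) FROM ad_events "
    "GROUP BY client ORDER BY client"
).fetchall()

# Transform: reshape into a dict for downstream reporting code.
report = {client: total for client, total in rows}
print(report)  # {'acme': 200, 'globex': 50}
```

In practice the extraction step would target the client's warehouse, and the transform step would typically feed a reporting tool rather than a print statement.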
Qualifications:
• Bachelor's Degree
• 3+ years of related engineering/analytics work.
• Strong SQL skills - 3 years using SQL.
• 2+ years of Python experience.
• Exposure to Snowflake or Big data.
• Experience with automation, and ability to build custom solutions for unique client needs across various software tools.
• Nice to have experience with AWS or Automation tools such as IAM, Lambda, EC2 and S3.
• Nice to have Jira, Confluence, GitHub and other agile development tool experience.
• Ability to communicate technical roadblocks to non-technical stakeholders.
Bonus:
• Degree in computer science/engineering or a quantitative discipline (Economics, Statistics, Engineering, Physics, Mathematics).
• Familiarity with media, TV measurement.
• Experience designing A/B tests.
• Big Data / Data Lake Experience.
• Exposure to Data Cleanrooms.
• Athena Experience.






