

Senior Data Analyst – Reporting & Automation
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Analyst – Reporting & Automation, offering a 6-month hybrid contract in NYC/Santa Monica, CA, with a pay rate of up to $67/hr. Key skills include advanced SQL, Python/JavaScript scripting, and BI tools like Looker/Tableau.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
536
-
🗓️ - Date discovered
September 18, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Hybrid
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
New York City Metropolitan Area
-
🧠 - Skills detailed
#JavaScript #Data Quality #Scripting #Statistics #Macros #Visualization #Data Modeling #Data Analysis #ETL (Extract, Transform, Load) #Regression #GitHub #Tableau #Automation #Git #Logging #BI (Business Intelligence) #Datasets #Streamlit #Version Control #Redshift #SQL (Structured Query Language) #Python #BigQuery #Snowflake #Looker #Airflow
Role description
City: NYC/ Santa Monica, CA
Onsite/ Hybrid/ Remote: Hybrid, 4 days onsite
Duration: 6 months with strong potential for extension/conversion (up to 24 months)
Rate Range: Up to $67/hr on W2, depending on experience (no C2C, 1099, or sub-contract)
Work Authorization: GC, USC, all valid EADs except OPT, CPT, and H1B
Must Have:
• Advanced SQL for analytical ETL/pipeline development (Snowflake or similar MPP).
• Scripting for automation (Python or JavaScript) including Google Apps Script.
• Experience automating recurring reporting (email sends, deck builds) end-to-end.
• Data quality engineering in pipelines (inline checks, alerting, exception handling).
• Dashboarding/BI (Looker and/or Tableau) with strong data modeling fundamentals.
• Version control and code review workflows (Git/GitHub).
• Excel workflow automation (macros, formula-driven processes refactored to code).
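To make the "data quality engineering in pipelines" requirement concrete, here is a minimal Python sketch of the kind of inline checks the posting describes (freshness, completeness, threshold checks). All names (`check_batch`, the `revenue`/`loaded_at` fields, the 24-hour freshness window) are illustrative assumptions, not details from the role.

```python
from datetime import datetime, timedelta, timezone

def check_batch(rows, now=None, max_age=timedelta(hours=24), min_rows=1):
    """Run inline freshness/completeness/threshold checks on a batch of rows.

    Returns a list of failure messages; an empty list means the batch passes.
    Hypothetical field names: 'revenue' (numeric) and 'loaded_at' (aware datetime).
    """
    now = now or datetime.now(timezone.utc)
    failures = []
    # Completeness: did we get enough rows at all?
    if len(rows) < min_rows:
        failures.append(f"completeness: expected >= {min_rows} rows, got {len(rows)}")
    for i, row in enumerate(rows):
        # Completeness: required field present?
        if row.get("revenue") is None:
            failures.append(f"completeness: row {i} missing 'revenue'")
        # Threshold: value inside the plausible range?
        elif row["revenue"] < 0:
            failures.append(f"threshold: row {i} has negative revenue {row['revenue']}")
        # Freshness: data loaded recently enough?
        loaded_at = row.get("loaded_at")
        if loaded_at is None or now - loaded_at > max_age:
            failures.append(f"freshness: row {i} older than {max_age}")
    return failures
```

In a scheduled job, a non-empty failure list would typically trigger alerting and exception handling (skip the send, page the owner) rather than shipping a bad report.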
Responsibilities:
• Build end-to-end reporting automation for business-critical initiatives (templated email sends, automated deck builds, scheduled jobs).
• Translate intake requests into clear data requirements and SLAs; design resilient pipelines that minimize manual touchpoints.
• Develop performant SQL and transformation logic against vetted analytical datasets; productionize jobs with parameterization and logging.
• Embed inline data-quality tests and safeguards (freshness, completeness, threshold checks) to ensure executive-grade accuracy.
• Create maintainable dashboards/visuals for operational through executive audiences; push beyond third-party tool limits when needed.
• Partner cross-functionally with upstream data/software engineering and domain analysts to align schemas, definitions, and cadence.
• Document solutions (runbooks, configs, data contracts) and contribute to reusable internal tooling/components for reporting enablement.
• Proactively identify opportunities to de-risk manual workflows and drive standardization across domains (e.g., Partnerships, Subscriber Planning).
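As a sketch of the "templated email sends ... with parameterization and logging" responsibilities above, the following Python fragment renders a recurring report body from vetted metrics, logging and failing loudly when inputs are incomplete. The template text, function name, and metric keys are all illustrative assumptions.

```python
import logging
from string import Template

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("reporting")

# Hypothetical report template; $$ renders a literal dollar sign.
EMAIL_TEMPLATE = Template(
    "Subject: $period KPI Report\n\n"
    "Signups: $signups\n"
    "Revenue: $$${revenue}\n"
)

def build_report_email(period, metrics):
    """Render the recurring report email from a dict of vetted metrics.

    Raises ValueError (after logging) if a required metric is missing, so a
    scheduled job fails visibly instead of sending an incomplete report.
    """
    missing = {"signups", "revenue"} - metrics.keys()
    if missing:
        log.error("missing metrics for %s report: %s", period, sorted(missing))
        raise ValueError(f"missing metrics: {sorted(missing)}")
    log.info("building %s report", period)
    return EMAIL_TEMPLATE.substitute(period=period, **metrics)
```

A scheduler (cron, Airflow, or Apps Script triggers in the Google Workspace case) would call this per cadence and hand the rendered body to the mail-send step.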
Qualifications:
• Bachelor’s degree in a STEM/analytical field.
• 5+ years in analytics or data product/enablement roles with measurable automation impact.
• 3+ years hands-on SQL in an analytical ETL environment (Snowflake/BigQuery/Redshift).
• 3+ years scripting (Python or JavaScript); experience with Google Apps Script for workflow automation.
• Proven track record converting manual Excel processes into robust, scheduled pipelines.
• BI experience (Looker/Tableau); strong ability to convey technical/analytical concepts to diverse audiences.
• Familiarity with Git/GitHub and collaborative dev practices.
Preferred:
• Advanced custom visualization frameworks (D3.js, Streamlit) to extend interactivity beyond standard BI tools.
• Experience prototyping “homegrown” reporting tools/products from scratch.
• Workflow orchestration familiarity (e.g., Airflow or equivalent).
• Applied statistics exposure (hypothesis testing, regressions) for KPI validation and QA.
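For the "applied statistics for KPI validation and QA" item above, one lightweight pattern (an assumption here, not something the posting specifies) is a z-score check of a new KPI value against recent history, using only the standard library:

```python
from statistics import mean, stdev

def kpi_looks_anomalous(history, new_value, z_cutoff=3.0):
    """Return True if new_value deviates from history by more than z_cutoff sigmas.

    A simple stand-in for formal hypothesis testing in report QA; the
    3-sigma default cutoff is an illustrative choice.
    """
    if len(history) < 2:
        return False  # not enough history to estimate spread
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return new_value != mu  # flat history: any change is a deviation
    return abs(new_value - mu) / sigma > z_cutoff
```

A flagged value would route to manual review rather than into an executive report, which is the QA behavior the bullet is pointing at.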