Data Analyst - SQL

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analyst in Smithfield, RI, with a contract length of "unknown". Pay rate is "unknown". Requires strong SQL, AWS, and batch scheduling skills, along with 5+ years in production/data support. Bachelor's degree required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
August 22, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
On-site
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Smithfield, RI
-
🧠 - Skills detailed
#SQL (Structured Query Language) #Snowflake #Data Analysis #GitHub #Batch #Oracle #AWS (Amazon Web Services) #Debugging #Linux #Scripting #Databases #Unix #Deployment #EC2 #Jenkins #Python #S3 (Amazon Simple Storage Service)
Role description
Job Description: Data Analyst – Batch & Data Services, Smithfield, RI

About the Role
We are seeking an experienced Data Analyst to join our Batch & Data Services team. This role provides exposure to a wide range of data applications and batch processes across our product portfolio. It requires strong analytical skills, technical expertise, and the ability to collaborate across diverse, global teams.

Key Responsibilities
• Support data applications, the Control-M batch environment, and AWS batch processes.
• Analyze and resolve production data and batch cycle issues.
• Participate in the on-call rotation for production support (every 3 weeks – weekend coverage).
• Perform change reviews and manage code deployments (GitHub, Jenkins).
• Automate manual processes and recommend efficiency improvements.
• Collaborate with engineering and infrastructure teams to maintain a stable production environment.

Top Skills (Must-Haves)
• Strong SQL skills (joins, debugging, log analysis).
• Hands-on experience with Oracle and Snowflake databases.
• AWS experience (S3, EC2).
• Batch scheduling with Control-M.
• CI/CD tools such as GitHub and Jenkins.

Nice-to-Have Skills
• Experience in financial services or large-scale enterprise environments.
• Exposure to multi-source, federated data environments.
• Python or other scripting languages.

Qualifications
• Bachelor's degree in a related field.
• 5+ years of IT experience, primarily in production/data/batch support.
• Hands-on experience with PL/SQL, Unix/Linux, AWS, Snowflake, and scripting.
• Strong incident management skills: ability to troubleshoot, lead recovery calls, and drive root cause analysis.
• Strong communication skills and the ability to work across global teams.
• Ability to work independently while staying aligned with team objectives.
Work Schedule
• Standard hours: 8:30 AM – 4:30 PM
• On-call rotation: every 3 weeks (weekend support for deployments)

The Team
You'll be part of a global team operating across the US and India, providing 24/7 production support including incident management, problem resolution, change deployments, and reporting for critical business applications. The team works closely with product engineering, business stakeholders, and enterprise infrastructure partners.
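The must-have skills above lead with SQL joins, debugging, and log analysis. As a rough sketch of the kind of production-support triage query that implies, here is a small, self-contained example using Python's sqlite3 module; the `batch_jobs` and `job_runs` tables and all column names are illustrative assumptions, not taken from the posting.

```python
import sqlite3

# Hypothetical batch-monitoring schema, invented for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
    CREATE TABLE batch_jobs (job_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE job_runs  (run_id INTEGER PRIMARY KEY,
                            job_id INTEGER REFERENCES batch_jobs,
                            status TEXT);
    INSERT INTO batch_jobs VALUES (1, 'daily_load'), (2, 'weekly_report');
    INSERT INTO job_runs  VALUES (10, 1, 'SUCCESS'), (11, 2, 'FAILED');
""")

# Join runs back to their job names to surface failed batch cycles --
# the join-based triage work a production-support analyst would do.
cur.execute("""
    SELECT j.name, r.status
    FROM job_runs r
    JOIN batch_jobs j ON j.job_id = r.job_id
    WHERE r.status = 'FAILED'
""")
failed = cur.fetchall()
print(failed)  # [('weekly_report', 'FAILED')]
conn.close()
```

In a real Oracle or Snowflake environment the same pattern would typically run against scheduler metadata tables rather than an in-memory database.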