Data Analyst

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an entry-level Data Analyst, remote, with a contract length of over 6 months, offering competitive pay. Key skills required include SQL, ETL tools (ADF, Synapse), and familiarity with Azure services. A Bachelor’s degree is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
August 22, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
Remote
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Data Analysis #Indexing #SSIS (SQL Server Integration Services) #Data Extraction #Monitoring #Microsoft Power BI #Data Pipeline #SQL (Structured Query Language) #Computer Science #DAX #Security #ADF (Azure Data Factory) #Python #Unit Testing #ETL (Extract, Transform, Load) #BI (Business Intelligence) #GIT #Documentation #Data Quality #Version Control #Base #Data Modeling #Azure DevOps #Synapse #SQL Server #DevOps #Azure SQL #Azure Data Factory #Scripting #Database Management #Databases #Databricks #Data Governance #Azure #Database Design
Role description
Data Analyst (Entry Level)

Location: Remote
Employment Type: Full-time, W2

Work Authorization
• Candidates must be authorized to work in the U.S. without current or future sponsorship.
• No C2C / No third-party vendors.

About the Role
We’re hiring an entry-level Data Analyst to support client teams with database management, ETL pipelines, and data quality so leaders can make reliable, data-driven decisions. You’ll collaborate with senior engineers, PMs, and business stakeholders and gain hands-on experience with the Azure data stack at enterprise scale.

What You’ll Do
• Partner with senior management, technical, and client teams to gather data requirements and define data models.
• Design, implement, automate, and maintain ETL/ELT pipelines (Azure Data Factory/Synapse/SSIS).
• Write and optimize SQL/T-SQL for data extraction, transformation, and validation.
• Document logical and physical database designs; help implement schema changes.
• Maintain and enhance existing databases and data processes; triage and resolve data issues.
• Test data workflows, monitor quality, and implement fixes to ensure accuracy and reliability.
• Create clear documentation and contribute to reporting/dashboards (often in Power BI).

Basic Qualifications
• Bachelor’s degree in Computer Science, Engineering, Information Systems, or a related field.
• 0–2 years of professional experience in a data, analytics, or database role (internships/co-ops count).
• Proficiency with SQL and familiarity with relational databases (SQL Server/Azure SQL).
• Exposure to ETL concepts and tools (ADF, Synapse, or SSIS preferred).
• Knowledge of a scripting language (Python or PowerShell).
• Strong communication, troubleshooting, and documentation skills.
• Passion for working with data and enabling data-driven decisions.
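For illustration only (not part of the posting): the day-to-day "validate data after an ETL load" work described above might look roughly like the sketch below. It uses Python's built-in sqlite3 in place of SQL Server/Azure SQL, and the `orders` table and its columns are hypothetical names invented for the example.

```python
import sqlite3

# Hypothetical example: run basic data-quality checks on an "orders" table
# after an ETL load. sqlite3 stands in for SQL Server/Azure SQL here.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (order_id INTEGER PRIMARY KEY, amount REAL, customer_id INTEGER)"
)
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 19.99, 101), (2, None, 102), (3, 5.00, None)],  # sample rows with gaps
)

# Simple checks: total row count and nulls in required columns.
row_count = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]
null_amounts = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE amount IS NULL"
).fetchone()[0]
null_customers = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE customer_id IS NULL"
).fetchone()[0]

issues = []
if null_amounts:
    issues.append(f"{null_amounts} order(s) missing amount")
if null_customers:
    issues.append(f"{null_customers} order(s) missing customer_id")

print(f"rows={row_count}; issues={issues or 'none'}")
```

In a real pipeline, checks like these would typically run as a validation step after an ADF/Synapse/SSIS load, with failures surfaced to monitoring or alerting rather than printed.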
Preferred Qualifications
• Experience working with or supporting Big Tech / large-scale enterprise environments (internships, projects, contract roles, or prior FTE).
• Familiarity with Azure data services (Azure Data Factory, Synapse Analytics, Databricks, Cosmos DB).
• Basics of Power BI (data modeling, DAX fundamentals) and version control (Git/Azure DevOps).
• Understanding of data modeling, indexing, and performance tuning.

Nice to Have
• Exposure to CI/CD for data pipelines, unit testing, and monitoring/alerting.
• Knowledge of data governance, privacy, and security best practices.

Compensation & Benefits
• Compensation: $[range] base + benefits (medical, dental, vision, PTO, 401(k)).
• Final offer depends on location, experience, and interview performance.

How to Apply
Please submit your resume and a brief note highlighting:
1. Your SQL + ETL experience,
2. Azure/Power BI exposure, and
3. Any experience supporting Big Tech or large-scale data environments.