aKUBE

Sr. Data Analytics Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Analytics Engineer in Glendale/Burbank, CA, for 12 months at up to $96/hr. Requires expert SQL, Snowflake, dbt, Python, AWS familiarity, and 5+ years in enterprise environments. Bachelor's in a STEM field is mandatory.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
768
-
πŸ—“οΈ - Date
October 22, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Glendale, CA
-
🧠 - Skills detailed
#Cloud #dbt (data build tool) #Data Modeling #Automation #Agile #Computer Science #GitLab #Documentation #GIT #Deployment #GitHub #Scala #Model Deployment #SQL (Structured Query Language) #Python #Scripting #Datasets #Lambda (AWS Lambda) #Scrum #Snowflake #DevOps #Data Engineering #AWS (Amazon Web Services) #Data Architecture #S3 (Amazon Simple Storage Service) #Version Control #ETL (Extract, Transform, Load)
Role description
City: Glendale, CA / Burbank, CA

Onsite/Hybrid/Remote: Onsite (4 days a week)

Duration: 12 months

Rate Range: Up to $96/hr on W2, depending on experience (no C2C, 1099, or sub-contract)

Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B

Must Have:

• Expert-level SQL (data modeling, optimization, query performance)

• Hands-on experience with Snowflake

• dbt for data transformation and analytics modeling

• Proficiency in Python for scripting and automation

• Familiarity with AWS cloud services

• Version control using GitHub/GitLab

• Experience in Agile/Scrum environments

Responsibilities:

• Build and maintain analytical data models and assets within Snowflake, transforming raw data into trusted, consumable datasets.

• Partner with Product Managers and stakeholders to translate business requirements into scalable data products.

• Develop and manage transformation workflows using dbt and SQL for analytics use cases.

• Ensure quality and performance of data assets through query optimization and validation.

• Collaborate with Data Architects, SRE, and Platform teams within an Agile pod structure.

• Support model deployment, version control, and documentation using Git-based workflows.

• Maintain communication and alignment with cross-functional partners, ensuring deliverables meet evolving business needs.

Qualifications:

• Bachelor's degree in Computer Science, Information Systems, or a related STEM field (required).

• 5+ years of experience as an Analytics Engineer or Data Engineer in enterprise-scale environments.

• Strong understanding of data warehousing principles, dimensional modeling, and pipeline design.

• Working knowledge of the AWS ecosystem (e.g., S3, Lambda, Glue).

• Strong communication and collaboration skills, with the ability to manage multiple priorities in a fast-paced environment.

• Experience working in Agile/Scrum teams with structured DevOps processes (pull requests, merge requests, version control).