Simplify Software Experts LLC / Simpli Software Solutions Inc

Data Analytics Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Analytics Engineer on a 12+ month contract, paying competitively, located in Elk Grove, CA, Sunnyvale, CA, or Austin, TX (Hybrid). Requires 2-5 years of Snowflake experience and 5-10 years in Data Migration, Advanced SQL, and Python.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 23, 2026
-
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Sunnyvale, CA
-
🧠 - Skills detailed
#Data Governance #Scala #SQL (Structured Query Language) #Spark (Apache Spark) #PySpark #Airflow #Data Quality #Data Migration #Documentation #AWS (Amazon Web Services) #dbt (data build tool) #Data Engineering #SQL Queries #S3 (Amazon Simple Storage Service) #Data Ingestion #Normalization #Pandas #Libraries #Anomaly Detection #Data Lake #ETL (Extract, Transform, Load) #Snowflake #Delta Lake #Compliance #Data Science #Python #Athena #Datasets #Version Control #Cloud #Migration
Role description
Data Analytics Engineer
Location: Elk Grove, CA or Sunnyvale, CA or Austin, TX (Hybrid)
Duration: 12+ Months

Job Description

About the Role
We are seeking a highly skilled Data Analytics Engineer, a data engineer with deep specialization in analytics, to support complex data infrastructure initiatives and enable data-driven insights at scale. This role sits at the intersection of data engineering and analytics and involves building pipelines, modeling datasets, and ensuring the quality, scalability, and accessibility of data systems across cloud platforms and tools.

Key Responsibilities

Data Engineering
• Build and maintain scalable, efficient ETL/ELT pipelines to move and transform data from diverse sources.
• Develop and manage orchestration workflows using tools like Airflow, dbt, or Prefect.
• Work with Snowflake, AWS, and modern data lake/warehouse architectures (e.g., S3 + Athena, Delta Lake).
• Leverage Python libraries such as pandas, pyarrow, or pyspark to manipulate and prepare data for analytics use cases.

Data Infrastructure & Management
• Design and optimize data models (star/snowflake schemas, dimensional modeling, normalization).
• Implement and monitor data quality, including validation, deduplication, and anomaly detection processes (an illustrative sketch follows at the end of this description).
• Ensure consistency through version control, CI/CD pipelines, and collaborative coding practices.

Analytics Enablement
• Support data migration efforts from legacy to modern cloud platforms.
• Write advanced SQL queries for reporting, analytics, and transformation.
• Partner with analysts, data scientists, and business users to deliver clean, curated, and reliable datasets.

Professional Competencies
• Communication: Clearly explain technical concepts to both technical and non-technical stakeholders; participate in peer reviews.
• Documentation: Maintain well-structured, reproducible documentation for logic, decisions, and workflows.
• Collaboration: Work closely with cross-functional teams and share ownership through knowledge transfer.
• Initiative: Proactively identify data challenges and suggest scalable, maintainable solutions.
• Attention to Detail: Deliver clean, maintainable code with a focus on long-term trust in data systems.

Required Skills & Experience

Skill Area | Experience | Preference
Snowflake | 2–5 Years | Required
Data Migration | 5–10 Years | Required
Database Technologies | 5–10 Years | Required
Advanced SQL | 5–10 Years | Required
Data Engineering | 5–10 Years | Required
Python | 5–10 Years | Required

Preferred Qualifications
• Experience in high-scale data environments with multiple stakeholders
• Exposure to real-time data ingestion, streaming pipelines, or large-scale data platform migrations
• Familiarity with data governance and compliance frameworks
• Previous contributions to team standards or open-source tooling in the data ecosystem
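As a rough illustration of the validation, deduplication, and anomaly-detection work described under Data Infrastructure & Management, the minimal sketch below uses Python with pandas; the file, column, and threshold names are hypothetical examples, not project specifics.

```python
# Minimal data-quality sketch in pandas: validate, deduplicate, and flag anomalies.
# All file and column names (orders_extract.csv, order_id, amount, updated_at)
# are hypothetical placeholders.
import pandas as pd

def clean_orders(raw: pd.DataFrame) -> pd.DataFrame:
    """Validate, deduplicate, and flag anomalies in a raw orders extract."""
    df = raw.copy()
    # Basic validation: drop rows missing the business key or the order amount.
    df = df.dropna(subset=["order_id", "amount"])
    # Deduplicate on the business key, keeping the most recent record.
    df = (
        df.sort_values("updated_at")
          .drop_duplicates(subset="order_id", keep="last")
    )
    # Simple anomaly flag: amounts more than 3 standard deviations from the mean.
    mean, std = df["amount"].mean(), df["amount"].std()
    df["is_anomaly"] = (df["amount"] - mean).abs() > 3 * std
    return df

if __name__ == "__main__":
    # Hypothetical local source file; in practice the data might come from S3 or Snowflake.
    raw = pd.read_csv("orders_extract.csv", parse_dates=["updated_at"])
    cleaned = clean_orders(raw)
    # Writing Parquet requires pyarrow (or fastparquet) to be installed.
    cleaned.to_parquet("orders_clean.parquet", index=False)
```

In practice, checks like these would typically run inside an orchestrated pipeline (Airflow, dbt tests, or similar) against Snowflake or S3-backed tables rather than local files.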