Aklip Technologies

Data Analytics Engineer (Only W2)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Analytics Engineer with 5+ years of data engineering experience, proficient in ETL/ELT, Python, and cloud platforms such as AWS. The contract runs 6-12 months, is based in Sunnyvale, CA (Hybrid), and is paid on a W2 basis.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date
February 17, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Sunnyvale, CA
🧠 - Skills detailed
#Spark (Apache Spark) #GIT #Data Migration #Data Lake #Data Pipeline #Delta Lake #Airflow #S3 (Amazon Simple Storage Service) #Version Control #Data Engineering #Migration #Snowflake #Pandas #PySpark #Athena #ETL (Extract, Transform, Load) #dbt (data build tool) #Normalization #Data Quality #Documentation #AWS (Amazon Web Services) #Data Manipulation #Anomaly Detection #Data Modeling #Monitoring #Cloud #SQL (Structured Query Language) #Python
Role description
Data Analytics Engineer
Duration: 6-12 Months
Location: Sunnyvale, CA (Hybrid)
Job type: Contract (Only W2)

Job Description
We are looking for a data analytics engineer (a data engineer who specializes in the data analytics space) with the following expertise:

Core Professional Competencies
• Communication: Clearly communicates technical work to diverse audiences, verbally and in writing. Participates in peer reviews and team discussions with clarity and purpose.
• Documentation: Maintains clear, structured documentation of project logic, decisions, and maintenance. Contributes to team standards for reproducibility and transparency.
• Collaboration: Works effectively with cross-functional partners. Values shared ownership and ensures continuity through knowledge sharing.
• Initiative: Comfortable with ambiguity; proactively identifies issues and opportunities. Demonstrates curiosity and critical thinking.
• Attention to Detail: Delivers high-quality, consistent code and documentation that supports long-term maintainability and trust in data systems.

Data Engineering Expertise (5+ years)
• Experienced in building and maintaining data pipelines (ETL/ELT)
• Proficient with orchestration tools (e.g., Airflow, dbt, Prefect)
• Comfortable working with cloud platforms (e.g., AWS) and tools like Snowflake
• Familiar with data lake and warehouse architecture (e.g., S3 + Athena, Delta Lake)
• Strong Python skills for data manipulation (e.g., pandas, pyarrow, pyspark)

Data Infrastructure & Management (5+ years)
• Expertise in data modeling (star/snowflake schemas, normalization, dimensional modeling)
• Skilled in maintaining data quality and integrity (data monitoring, validation, deduplication, anomaly detection)
• Familiar with version control and CI/CD practices for data workflows (e.g., Git)

Skill Sets
Skill                   Niche Skill   Experience   Preference
Snowflake               No            2-5 Years    Is Required
Data Migration          No            5-10 Years   Is Required
Database Technologies   No            5-10 Years   Is Required
Advanced SQL            No            5-10 Years   Is Required
Data Engineering        No            5-10 Years   Is Required
Python                  No            5-10 Years   Is Required