EPITEC

Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a Data Engineer position on an ongoing W2 contract in San Jose, CA, with a hybrid work arrangement. It requires 5+ years of software engineering experience, proficiency with big data technologies (Snowflake, Hadoop), and expertise in Java, SQL, and GoLang.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 16, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Research Skills #Data Manipulation #Kafka (Apache Kafka) #SQL (Structured Query Language) #Big Data #Data Storage #Code Reviews #Computer Science #Java #Snowflake #Agile #Hadoop #Golang #Data Engineering #Storage
Role description
Job Title: Data Engineer
Job Type: Ongoing W2 Contract
Location: San Jose, CA
Work Arrangement: Hybrid

RESPONSIBILITIES
• Interface with other technical personnel or team members to finalize requirements.
• Write and review portions of detailed specifications for the development of complex system components.
• Complete complex bug fixes.
• Work closely with other development team members to understand complex product requirements and translate them into software designs.
• Implement development processes, coding best practices, and code reviews.
• Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
• Resolve complex technical issues as necessary.
• Train entry-level software engineers as directed by department management, ensuring they are knowledgeable in critical aspects of their roles.
• Keep abreast of new technology developments.
• Design and work with complex data models.
• Mentor less senior software developers on development methodologies and optimization techniques.
• All other duties as assigned.

QUALIFICATIONS
• 5+ years of software engineering experience
• BS in Engineering/Computer Science or equivalent experience required

TECHNICAL SKILLS
• Proficiency in big data, with preference given to experience with Snowflake and Hadoop.
• Good understanding of Kafka.
• Advanced knowledge of software development methodologies (e.g., Agile).
• Strong proficiency with data manipulation languages, including optimization techniques.
• Strong understanding of normalized/dimensional data modeling principles.
• Strong knowledge of multiple data storage subsystems.
• Expertise in development languages including but not limited to Java/J2EE, SQL, and GoLang.
• Strong research skills.
• Strong knowledge of industry best practices in development.
• Knowledge of using and developing applicable tool sets.
• Ability to interface competently with other technical personnel or team members to finalize requirements.
• Ability to work well with internal and external technology resources.
• Knowledge of test-driven development.
• Ability to write and review portions of detailed specifications for the development of complex system components.
• Ability to complete complex bug fixes.
• Good oral and written communication skills.