Big Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Engineer with a contract length of "unknown," offering a day rate of $600 USD. Key skills include proficiency in big data technologies such as Snowflake and Hadoop, along with 5+ years of software engineering experience and a BS in Engineering/Computer Science.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
🗓️ - Date discovered
August 27, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
San Jose, CA
-
🧠 - Skills detailed
#Databases #Computer Science #Big Data #Data Manipulation #SQL (Structured Query Language) #Cloud #Storage #Java #Hadoop #Data Engineering #Golang #Snowflake #Data Storage #Research Skills #Code Reviews #Agile #Kafka (Apache Kafka)
Role description
Summary
The software engineer is responsible for architecting, implementing, and supporting data storage and retrieval solutions for fraud-related customer data, both on-prem and in the cloud. The team handles petabytes of data stored in various databases.
Responsibilities
• Interface with other technical personnel or team members to finalize requirements.
• Write and review portions of detailed specifications for the development of complex system components.
• Complete complex bug fixes.
• Work closely with other development team members to understand complex product requirements and translate them into software designs.
• Successfully implement development processes, coding best practices, and code reviews.
• Operate in various development environments (Agile, Waterfall, etc.) while collaborating with key stakeholders.
• Resolve complex technical issues as necessary.
Qualifications
• 5+ years of software engineering experience
• BS in Engineering/Computer Science or equivalent experience required
Technical Skills
• Proficiency in big data, with preference given to experience with Snowflake and Hadoop.
• Good understanding of Kafka.
• Advanced knowledge of software development methodologies (e.g., Agile).
• Strong proficiency with data manipulation languages, including optimization techniques.
• Strong understanding of normalized/dimensional data modelling principles.
• Strong knowledge of multiple data storage subsystems.
• Expertise in development languages including but not limited to Java/J2EE, SQL, and GoLang.
• Strong research skills.
• Strong knowledge of industry best practices in development.
• Knowledge of using and developing applicable tool sets.
• Ability to interface competently with other technical personnel or team members to finalize requirements.