Tential Solutions

Big Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Engineer with a contract length of "unknown," offering a pay rate of "unknown." Key skills include Hadoop, Spark, Hive, Kubernetes, Python or Scala, and AWS services. A Bachelor's degree and five years of experience are required, preferably in Financial Services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
May 6, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Rockville, MD
-
🧠 - Skills detailed
#Monitoring #Trino #AI (Artificial Intelligence) #Hadoop #Jenkins #Grafana #Kubernetes #Scala #GitHub #Computer Science #Big Data #Amazon EMR (Amazon Elastic MapReduce) #S3 (Amazon Simple Storage Service) #Data Engineering #ETL (Extract, Transform, Load) #Data Pipeline #Terraform #Data Processing #SQL (Structured Query Language) #Prometheus #Spark (Apache Spark) #Infrastructure as Code (IaC) #Python #AWS (Amazon Web Services) #Programming #Cloud #Complex Queries #Lambda (AWS Lambda)
Role description
Overview

Join our dynamic team as a Big Data Engineer, where you'll leverage cutting-edge technologies to design, develop, and optimize large-scale data processing systems. This thrilling role will have you collaborating with cross-functional teams to architect data pipelines and implement integration solutions that drive decision-making and enhance operational efficiency. As a cornerstone of our operations, your expertise in big data technologies and distributed systems will help us harness the power of our data for business transformation.

Required Skills
• Big Data Technologies: Proficient in Hadoop, Spark, Hive, and Trino.
• Container Orchestration & Kubernetes: Strong experience managing Kubernetes architecture and running Spark workloads on Amazon EMR on EKS.
• Programming Languages: Proficiency in Python or Scala, with a focus on modular and performant code.
• SQL Skills: Expertise in window functions, multi-table joins, and writing complex queries.
• Cloud Technologies: Familiarity with AWS services such as S3, Glue, Lambda, and CloudWatch.

Nice-to-Have Skills
• Experience with CI/CD pipelines (Jenkins, GitHub Actions).
• Knowledge of Infrastructure as Code (Terraform, CloudFormation).
• Familiarity with monitoring tools like Prometheus and Grafana.
• AWS certifications (e.g., AI Practitioner, Solutions Architect).

Preferred Education and Experience
• Bachelor's degree in Computer Science, Information Systems, or a related field, with at least five (5) years of relevant experience. A Master's degree and experience in the Financial Services industry are preferred.

Other Requirements
• Strong written and verbal communication skills.
• Ability to work effectively in fast-paced environments.
• Willingness to remain current with emerging technologies and best practices in the industry.