

Data Engineer (Python & AWS) - Local Preferred
We are seeking a skilled and enthusiastic Mid-Level Data Engineer to join our team. The ideal candidate will have a strong foundation in data engineering and excellent programming skills in Python; basic knowledge of Java is a plus. This role requires experience with data orchestration tools like Airflow, container orchestration platforms such as EKS and ECS, and infrastructure as code using Terraform. Experience with AWS enterprise implementations and big data platforms is also essential.
Qualifications:
o Bachelor’s degree in Computer Science, Software Engineering, or a related field, or equivalent experience
o 7+ years of experience spanning at least two IT disciplines, such as technical architecture, application development, middleware, or database management
o 3+ years of experience with AWS enterprise implementations, including EC2, Data Pipeline, and EMR
o Hands-on experience working with Spark and handling terabyte-scale datasets
o Experience implementing complex ETL transformations on big data platforms, including NoSQL databases (MongoDB, DynamoDB, Cassandra)
o 5+ years of programming experience in Python
o Strong understanding of Event-Driven Architecture (EDA) and event streaming, preferably with Apache Kafka
o Comfortable learning cutting-edge technologies and applying them to greenfield projects
Additional Skills:
· Strong analytical and problem-solving skills, with attention to detail.
· Ability to work independently and collaboratively in a team environment.
· Good communication skills, with the ability to convey technical concepts to non-technical stakeholders.
· A proactive approach to learning and adapting to new technologies and methodologies.