

Aklip Technologies
Data Engineer With Kafka And Flink
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with Kafka and Flink, offered as a hybrid position in New York, NY or Fort Mill, SC. The contract runs longer than 6 months and focuses on Apache Kafka, Apache Flink, Java/Scala programming, and distributed-systems expertise.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
February 11, 2026
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
New York, United States
-
🧠 - Skills detailed
#Python #Programming #Logging #Apache Kafka #Data Quality #Azure #Spark (Apache Spark) #Data Engineering #Documentation #Scala #Security #Docker #Monitoring #GCP (Google Cloud Platform) #Code Reviews #Kubernetes #AWS (Amazon Web Services) #Data Pipeline #Kafka (Apache Kafka) #Data Modeling #Cloud #Java #JSON (JavaScript Object Notation)
Role description
Job Title: Kafka / Flink Engineer
Location: New York, NY or Fort Mill, SC (Hybrid)
C2C / Full-time
Job Description:
We are looking for a Kafka / Flink Engineer to design, build, and support real-time streaming data platforms. The ideal candidate will have strong hands-on experience with Apache Kafka and Apache Flink, and a solid understanding of distributed systems, stream processing, and data pipelines in production environments.
Key Responsibilities:
Design, develop, and maintain real-time data streaming solutions using Apache Kafka and Apache Flink
Build and optimize Kafka producers, consumers, topics, and stream processing applications
Develop stateful and stateless Flink jobs for real-time analytics and event processing
Ensure high availability, fault tolerance, and scalability of streaming platforms
Monitor, troubleshoot, and resolve performance and reliability issues in production
Collaborate with data engineers, platform teams, and application teams to integrate streaming solutions
Implement data quality, security, and governance best practices
Participate in code reviews, design discussions, and technical documentation
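To make the "stateful jobs" responsibility concrete: the core pattern is maintaining per-key state as events arrive, which Flink provides via keyed state and checkpointing. A minimal framework-free Python sketch of that pattern (the event shape and field names are illustrative, not from the posting):

```python
from collections import defaultdict

def keyed_count(events):
    """Maintain a running count per key across a stream of events.
    This is the essence of a stateful stream-processing job; a real
    Flink job would hold this state in fault-tolerant keyed state
    with checkpointing rather than an in-memory dict."""
    state = defaultdict(int)
    for event in events:
        key = event["user"]          # key-by, as in stream.key_by(...)
        state[key] += 1              # update keyed state
        yield key, state[key]        # emit downstream

# Hypothetical click events for illustration
events = [{"user": "a"}, {"user": "b"}, {"user": "a"}]
print(list(keyed_count(events)))  # [('a', 1), ('b', 1), ('a', 2)]
```

A stateless job, by contrast, would map or filter each event independently with no `state` dict at all.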
Required Skills & Qualifications:
Strong hands-on experience with Apache Kafka (Kafka Streams, Connect, Schema Registry)
Solid experience with Apache Flink for real-time stream processing
Strong programming skills in Java or Scala (Python is a plus)
Experience working with distributed systems and event-driven architectures
Knowledge of message serialization formats (Avro, Protobuf, JSON)
Experience with monitoring and logging tools for streaming systems
Good understanding of data modeling and streaming data patterns
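On the serialization-formats requirement, the practical skill is round-tripping a record to bytes and back. A quick sketch using JSON from the Python standard library (Avro and Protobuf give compact binary encodings but need a schema and external libraries, so they are only noted here; the record fields are made up for illustration):

```python
import json

# Serialize a record to bytes for the wire (e.g. a Kafka message value),
# then deserialize it back. Avro/Protobuf would do the same with a schema.
record = {"event_id": 42, "type": "click", "ts": "2026-02-11T00:00:00Z"}
payload = json.dumps(record).encode("utf-8")   # dict -> bytes
decoded = json.loads(payload.decode("utf-8"))  # bytes -> dict
assert decoded == record
```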
Preferred / Nice to Have:
Experience with cloud platforms (AWS, Azure, or GCP)
Knowledge of containerization and orchestration (Docker, Kubernetes)
Experience with CI/CD pipelines for data platforms
Exposure to other streaming tools such as Spark Streaming or Pulsar
Understanding of security, authentication, and authorization in Kafka ecosystems





