

Flink Engineer
Position: Flink with DataStream API
Location: Dallas, TX (Hybrid)
F2F interview in Dallas, TX
Project Request: Flink with the DataStream API.
Requirements: Please provide supporting documentation or case studies confirming that you have successfully implemented Flink for previous clients, specifically using the DataStream API. Proof of experience is critical.
• Confirmation is required that you are currently and actively supporting a client using Flink with the DataStream API.
Requirements for Resource:
Proficient in building and supporting both the functional and non-functional aspects of Flink with the DataStream API.
Flink Functional Requirements:
• Expertise in Flink APIs (DataStream, Process functions, etc.).
• Competence in state management (checkpoints and savepoints) with local storage.
• Configuration of connectors like EventHub, Kafka, and MongoDB.
• Implementation of Flink aggregate functions.
• Handling watermarks for out-of-order events.
• Management of state using Azure Data Lake Storage (ADLS).
Flink Non-Functional Requirements:
• Set up a private Flink cluster within a designated AKS environment.
• Configure both session-mode and application-mode deployments.
• Define and provision nodes and task slots.
• Manage and configure JobManagers and TaskManagers.
• Establish necessary connectors, e.g., external storage for the Flink cluster.
• Configure heap memory and RocksDB for state management.
• Define and set up checkpoints and savepoints for state recovery.
• Enable Auto-Pilot capabilities.
• Integrate network resources, such as Azure EventHub and external databases like MongoDB.
• Implement integration with ArgoCD for job submissions.
• Install LTM agents for logging and Dynatrace agents for monitoring purposes.
• Provide access to the Flink Dashboard.
• Establish High Availability (HA) and Disaster Recovery (DR) configurations.
Additional Requirements:
• Fully manage the underlying AKS infrastructure that supports Flink and its applications.
• Ensure end-to-end observability of Flink and AKS using Dynatrace.
Thanks and Regards
Jayesh Bhati
Recruitment and Operations
Email: Jayesh@smartitframe.com
Smart IT Frame LLC.