Flink Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Flink Engineer with expertise in the DataStream API, offering a hybrid position in Dallas. Contract length and pay rate are unspecified. Key skills include proficiency in Flink APIs, state management, and AKS infrastructure management.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 25, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Dallas, TX
🧠 - Skills detailed
#Data Lake #Databases #API (Application Programming Interface) #Azure ADLS (Azure Data Lake Storage) #Disaster Recovery #Dynatrace #Azure #Deployment #Logging #Monitoring #Observability #MongoDB #Kafka (Apache Kafka) #Storage #Documentation #ADLS (Azure Data Lake Storage)
Role description

Position: Flink Engineer with DataStream API.

Location: Dallas, TX (Hybrid)

Face-to-face interview in Dallas, TX.

Project Request: Flink with DataStream API.

Requirements: Please provide supporting documentation or case studies confirming that you have successfully implemented Flink, specifically using the DataStream API, for previous clients. Proof of experience is critical.

   • Confirmation that you are currently and actively supporting a client using Flink with the DataStream API is required.

Requirements for Resource:

Proficient in implementing and supporting both the functional and non-functional aspects of Flink with the DataStream API.

Flink Functional Requirements:

   • Expertise in Flink APIs (DataStream, ProcessFunction, etc.).

   • Competence in state management (checkpoints and savepoints) with local storage.

   • Configuration of connectors like EventHub, Kafka, and MongoDB.

   • Implementation of aggregations with Flink's APIs (e.g., AggregateFunction).

   • Handling watermarks for out-of-order events.

   • Management of state using Azure Data Lake Storage (ADLS).
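The watermark requirement above can be illustrated with a small conceptual sketch. This is plain Python, not Flink code: it only mirrors the bookkeeping that Flink's bounded-out-of-orderness watermark strategy (`WatermarkStrategy.forBoundedOutOfOrderness`) performs when deciding which out-of-order events are still on time. Class and method names here are illustrative, not part of any Flink API.

```python
# Conceptual sketch (not Flink code): how a bounded-out-of-orderness
# watermark generator tracks event time. The watermark trails the
# highest event timestamp seen so far by the allowed lateness.

class BoundedOutOfOrdernessWatermarks:
    """Emits watermark = max_event_timestamp - max_out_of_orderness - 1,
    so events at or below the watermark are treated as late."""

    def __init__(self, max_out_of_orderness_ms: int):
        self.max_out_of_orderness_ms = max_out_of_orderness_ms
        self.max_timestamp = float("-inf")

    def on_event(self, event_timestamp_ms: int) -> None:
        # Track the highest event timestamp observed so far.
        self.max_timestamp = max(self.max_timestamp, event_timestamp_ms)

    def current_watermark(self) -> float:
        # Watermark lags the max timestamp by the allowed out-of-orderness.
        return self.max_timestamp - self.max_out_of_orderness_ms - 1

    def is_late(self, event_timestamp_ms: int) -> bool:
        # An event is late if its timestamp is at or below the watermark.
        return event_timestamp_ms <= self.current_watermark()


wm = BoundedOutOfOrdernessWatermarks(max_out_of_orderness_ms=5_000)
for ts in (1_000, 7_000, 4_000):  # 4_000 arrives out of order, within bound
    wm.on_event(ts)
print(wm.current_watermark())  # 7000 - 5000 - 1 = 1999
print(wm.is_late(1_500))       # True: at or below the watermark
print(wm.is_late(3_000))       # False: still within allowed lateness
```

In a real Flink job the equivalent behaviour is configured declaratively on the source (a watermark strategy plus a timestamp assigner), and windows then use the watermark to decide when to fire.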

Flink Non-Functional Requirements:

   • Set up a private Flink cluster within a designated AKS environment.

   • Configure both session-mode and application-mode deployments.

   • Define and size nodes and task slots.

   • Manage and configure JobManagers and TaskManagers.

   • Establish the necessary connectors (e.g., external storage for the Flink cluster).

   • Configure heap memory and RocksDB for state management.

   • Define and set up checkpoints and savepoints for state recovery.

   • Enable Auto-Pilot capabilities.

   • Integrate network resources, such as Azure EventHub and external databases like MongoDB.

   • Implement integration with ArgoCD for job submissions.

   • Install LTM agents for logging and Dynatrace agents for monitoring.

   • Provide access to the Flink Dashboard.

   • Establish High Availability (HA) and Disaster Recovery (DR) configurations.
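Several of the non-functional items above (RocksDB state backend, ADLS-backed checkpoints and savepoints, Kubernetes HA, memory sizing) are typically expressed in the cluster configuration. The fragment below is a hedged sketch of what such a `flink-conf.yaml` might look like; the storage account, container, and paths are placeholders, not details from this posting, and exact key names can vary by Flink version.

```yaml
# Illustrative flink-conf.yaml fragment (account/container names are
# placeholders; verify key names against your Flink version's docs).
state.backend: rocksdb
state.backend.incremental: true
# Checkpoints and savepoints stored durably in ADLS Gen2 (abfss scheme):
state.checkpoints.dir: abfss://checkpoints@<storage-account>.dfs.core.windows.net/flink/checkpoints
state.savepoints.dir: abfss://checkpoints@<storage-account>.dfs.core.windows.net/flink/savepoints
execution.checkpointing.interval: 1min
execution.checkpointing.mode: EXACTLY_ONCE
# Kubernetes-based high availability for JobManager failover:
high-availability.type: kubernetes
high-availability.storageDir: abfss://checkpoints@<storage-account>.dfs.core.windows.net/flink/ha
# TaskManager sizing (process memory is split between heap and the
# managed memory used by RocksDB):
taskmanager.memory.process.size: 4096m
taskmanager.numberOfTaskSlots: 2
```

In an AKS setup like the one described, this configuration would usually be supplied via the Flink Kubernetes operator or a ConfigMap, with ArgoCD syncing the manifests.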

Additional Requirements:

   • Fully manage the underlying AKS infrastructure that supports Flink and its applications.

   • Ensure end-to-end observability of Flink and AKS using Dynatrace.

Thanks and Regards

Jayesh Bhati

Recruitment and Operations

Email: Jayesh@smartitframe.com

Smart IT Frame LLC.