Data Processing Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a remote Data Processing Engineer on a 3-month contract, offering $50-$55/hr. Key skills include AWS Managed Flink, SNS, Glue, and advanced Python. Experience with Apache Beam and ETL pipeline development is required.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
$440
🗓️ - Date discovered
April 25, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
New York, United States
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Apache Airflow #AWS Glue #SNS (Simple Notification Service) #Data Storage #Storage #AWS IAM (AWS Identity and Access Management) #Data Architecture #Data Catalog #GCP (Google Cloud Platform) #Python #AWS (Amazon Web Services) #Cloud #Terraform #AWS Lambda #Apache Beam #IAM (Identity and Access Management) #Amazon CloudWatch #Documentation #Lambda (AWS Lambda) #Airflow #Data Processing
Role description

Our client in Rochester, NY, is looking to hire a remote Data Processing Engineer.

This is a fully remote, part-time, 3-month contract position (no third-party employers, please).

Hourly range based on experience: $50-$55/HR

Primary Responsibilities:

   • Implement an AWS SNS topic as the GCP Pub/Sub equivalent (illustrated in the sketch after this list)

   • Convert Apache Beam configurations to AWS Managed Flink for ~16 jobs

   • Integrate AWS Managed Flink with AWS Glue

   • Configure Glue Data Catalog for processed data storage

   • Set up and configure AWS Managed Airflow

   • Migrate workflows from Google Composer to AWS MWAA

   • Assist with testing and documentation of data processing components

   • Support AWS Marketplace integration for data processing aspects

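As an illustration of the SNS and Glue Data Catalog items above, here is a minimal, hypothetical boto3 sketch (not the client's actual code): it publishes to an SNS topic standing in for a GCP Pub/Sub topic and registers a Glue Data Catalog database and table for processed data; the region, topic name, database, table, and S3 location are all placeholder assumptions.

```python
import boto3

sns = boto3.client("sns", region_name="us-east-1")   # region is an assumption
glue = boto3.client("glue", region_name="us-east-1")

# SNS topic standing in for a GCP Pub/Sub topic: create_topic is idempotent,
# returning the existing ARN if the topic already exists.
topic = sns.create_topic(Name="processed-events")     # hypothetical topic name
sns.publish(TopicArn=topic["TopicArn"], Message='{"status": "processed"}')

# Glue Data Catalog entries for processed data storage (placeholder schema and bucket).
glue.create_database(DatabaseInput={"Name": "processed_data"})
glue.create_table(
    DatabaseName="processed_data",
    TableInput={
        "Name": "events",
        "StorageDescriptor": {
            "Columns": [{"Name": "event_id", "Type": "string"}],
            "Location": "s3://example-bucket/processed/events/",
        },
    },
)
```
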
Required AWS Service Skills:

   • AWS Managed Flink

   • AWS SNS

   • AWS Glue

   • AWS MWAA (Managed Airflow)

   • AWS Lambda

   • Amazon CloudWatch

   • AWS IAM

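For context on AWS MWAA and the Composer-to-MWAA migration work listed in the responsibilities, below is a minimal, hypothetical Airflow DAG of the kind that typically moves between Google Cloud Composer and MWAA largely unchanged; the DAG id, schedule, and task are placeholder assumptions, not details from this posting.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_and_load():
    # Placeholder for a real ETL step.
    print("running ETL step")


with DAG(
    dag_id="example_migrated_pipeline",   # hypothetical DAG id
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```
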
Required Technical Skills:

   • Python (advanced)

   • Terraform

   • Apache Beam

   • Apache Airflow

   • ETL Pipeline Development

   • Data Processing

   • Streaming Data Architecture

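To illustrate the Apache Beam and ETL pipeline skills above, here is a minimal, hypothetical batch pipeline using Beam's local DirectRunner; the input path, output prefix, and record format are placeholders, and in this role the conversion target for such pipelines would be AWS Managed Flink as described in the responsibilities.

```python
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_line(line: str) -> dict:
    # Placeholder transform: split a CSV-style line into a small record.
    event_id, payload = line.split(",", 1)
    return {"event_id": event_id, "payload": payload}


with beam.Pipeline(options=PipelineOptions()) as pipeline:
    (
        pipeline
        | "Read" >> beam.io.ReadFromText("input.txt")      # placeholder source
        | "Parse" >> beam.Map(parse_line)
        | "Drop empty" >> beam.Filter(lambda r: r["payload"].strip() != "")
        | "Write" >> beam.io.WriteToText("output")          # placeholder sink
    )
```
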
Required Soft Skills:

   • Problem-solving

   • Team Collaboration

   • Technical Documentation

   • Communication

   • Adaptability