

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer; the contract length and pay rate are unspecified. Key skills include AWS (Lambda, S3, Airflow), Kafka, Python, SQL (Snowflake), and Jenkins. Experience with integrations between PeopleSoft and Workday is essential.
Country
United States
Currency
$ USD
Day rate
Unknown
Date discovered
July 18, 2025
Project duration
Unknown
Location type
Unknown
Contract type
Unknown
Security clearance
Unknown
Location detailed
New York City Metropolitan Area
Skills detailed
#S3 (Amazon Simple Storage Service) #Pandas #API (Application Programming Interface) #PeopleSoft #Data Engineering #Lambda (AWS Lambda) #Python #AWS (Amazon Web Services) #Airflow #Workday #Snowflake #Spark (Apache Spark) #Kafka (Apache Kafka) #Batch #SQL (Structured Query Language) #Jenkins #PySpark #Data Warehouse #Automation
Role description
Must Have (see the sketch after this list):
• AWS: Lambda, S3, Airflow, Batch, Fargate
• Kafka
• Python: DataFrames and pandas
• Streaming/API development
• SQL: Snowflake
• Jenkins
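For context on how these pieces commonly fit together, the following is a minimal, illustrative sketch (not taken from the posting) of an AWS Lambda handler that reads a batch file landed in S3 into a pandas DataFrame and republishes the rows to Kafka. The bucket layout, topic name, and broker address are placeholder assumptions.

import io
import json

import boto3
import pandas as pd
from confluent_kafka import Producer

s3 = boto3.client("s3")
# Placeholder broker address; in practice this would come from configuration.
producer = Producer({"bootstrap.servers": "broker.example.internal:9092"})

def handler(event, context):
    """Triggered by an S3 put event: load the new CSV and publish each row to Kafka."""
    published = 0
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        df = pd.read_csv(io.BytesIO(body))
        for row in df.to_dict(orient="records"):
            # "worker-events" is a hypothetical topic name.
            producer.produce("worker-events", value=json.dumps(row, default=str))
        published += len(df)
    producer.flush()
    return {"rows_published": published}

In an Airflow-orchestrated setup, a task of the same shape could run on AWS Batch or Fargate instead of Lambda for larger files.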
Nice to Have:
• PySpark
• Test automation
Other Notes:
• PeopleSoft is being migrated to Workday.
• A small portion of the work is with ESM, doing a lot of integrations between Coupa/Beeline and PeopleSoft/Workday.
• A lot of the work is in OnePipeline.
• Data is published to the data warehouse, Snowflake, via OneStream (a generic loading sketch follows this list).
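The posting names OneStream as the publishing path into Snowflake. Purely as a generic illustration of the final loading step (not the OneStream flow itself), a pandas DataFrame can be written to a Snowflake table with the Snowflake Python connector's write_pandas helper; the connection parameters and table name below are placeholders.

import pandas as pd
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

def publish_dataframe(df: pd.DataFrame) -> int:
    """Load a DataFrame into a Snowflake table; all identifiers are hypothetical."""
    conn = snowflake.connector.connect(
        user="SVC_DATA_ENG",          # placeholder service account
        password="<secret>",          # normally pulled from a secrets manager
        account="<account_locator>",
        warehouse="LOAD_WH",
        database="ANALYTICS",
        schema="HR",
    )
    try:
        # write_pandas bulk-loads the DataFrame and reports rows written.
        success, _, nrows, _ = write_pandas(conn, df, "WORKER_FEED")  # hypothetical target table
        return nrows if success else 0
    finally:
        conn.close()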