

PTR Global
Big Data Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Big Data Developer, a 6-month contract position based in Alpharetta, GA, with a pay rate of "X". Required skills include 7+ years of data engineering experience and strong Python, Spark, SQL, and UNIX shell knowledge.
Country
United States
Currency
$ USD
-
Day rate
Unknown
-
Date
January 3, 2026
Duration
Unknown
-
Location
On-site
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Alpharetta, GA
-
Skills detailed
#DevOps #Scala #Deployment #Impala #Spark (Apache Spark) #Informatica #Python #Big Data #Automation #ETL (Extract, Transform, Load) #Informatica Cloud #Code Reviews #Migration #AWS (Amazon Web Services) #GitHub #Unix #Apache Spark #Data Integration #Cloud #Scripting #Snowflake #Batch #Data Engineering #SQL (Structured Query Language) #Azure #HDFS (Hadoop Distributed File System)
Role description
Job Description: CPS Data CoE - Big Data Engineer
• This position is for a Big Data Engineer on the client's Wealth Management Framework CoE team, based at the client's Alpharetta or New York offices.
• The CoE team is responsible for defining and governing the data platforms.
• We are looking for colleagues with a strong sense of ownership and the ability to drive solutions.
• The role is primarily responsible for automating existing processes and bringing new ideas and innovation.
• The candidate is expected to code, conduct code reviews, and test the framework as needed, and to participate in application architecture, design, and other phases of the automation work.
• The ideal candidate will be a self-motivated team player committed to delivering on time and able to work with minimal supervision.
Responsibilities:
• Design and develop a new automation framework for ETL processing
• Support the existing framework and become the technical point of contact for all related teams
• Enhance the existing ETL automation framework per user requirements
• Performance tuning of Spark and Snowflake ETL jobs
• POCs of new technologies and suitability analysis for cloud migration
• Process optimization through automation and new utility development
• Collaborate on issues and new features
• Support batch issues
• Support application teams with any queries
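The responsibilities above center on ETL automation. As a purely illustrative sketch of the kind of extract-transform-load step such a framework might wrap (toy data, hypothetical function and field names, stdlib only; no Spark or Snowflake involved):

```python
import csv
import io

# Illustrative only: a toy ETL step. The field names and the RAW
# sample data are invented for this sketch.
RAW = """trade_id,amount
1,100.50
2,200.25
"""

def extract(text):
    """Parse raw CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast amounts to float, rounded to cents."""
    return [
        {"trade_id": r["trade_id"], "amount": round(float(r["amount"]), 2)}
        for r in rows
    ]

def load(rows):
    """Stand-in for a warehouse write: just report the row count."""
    return len(rows)

rows = transform(extract(RAW))
print(load(rows))  # 2
```

In a real framework, `load` would write to HDFS/Hive or Snowflake and each stage would be driven by job metadata rather than hard-coded sample data.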
Required Skills:
• 7+ years of data engineering experience
• Strong UNIX shell and Python scripting knowledge
• Strong in Spark
• Strong knowledge of SQL
• Hands-on knowledge of how HDFS/Hive/Impala/Spark work
• Strong logical reasoning capabilities
• Working knowledge of GitHub, DevOps, CI/CD, and enterprise code-management tools
• Strong collaboration and communication skills
• Strong team-player skills and excellent written and verbal communication skills
• Ability to create and maintain a positive environment of shared success
• Ability to execute and prioritize tasks and resolve issues without aid from a direct manager or project sponsor
• Good to have: working experience with Snowflake and a data integration tool, e.g. Informatica Cloud
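As a small illustration of the SQL skill the list calls for, the sketch below runs a typical load-monitoring aggregation, with the stdlib sqlite3 module standing in for Hive or Snowflake; the table and column names are invented for this example:

```python
import sqlite3

# Illustrative only: sqlite3 stands in for the warehouse, and the
# etl_runs table is a made-up example of ETL run metadata.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE etl_runs (job TEXT, status TEXT, rows_loaded INTEGER);
INSERT INTO etl_runs VALUES
  ('positions', 'SUCCESS', 1200),
  ('positions', 'FAILED', 0),
  ('trades', 'SUCCESS', 800);
""")

# Total rows loaded per job, counting successful runs only.
result = conn.execute("""
    SELECT job, SUM(rows_loaded) AS total_rows
    FROM etl_runs
    WHERE status = 'SUCCESS'
    GROUP BY job
    ORDER BY job
""").fetchall()
print(result)  # [('positions', 1200), ('trades', 800)]
```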
Primary skills: Python, Big Data & Apache Spark
Desired skills: Snowflake/Azure/AWS (any cloud)
• IDMC or any ETL tool
Note:
• Role Name: Big Data Engineer
• Position Title: Consultant
• Location: Alpharetta, GA
• The department comprises 10 organizations: Sales, Banking & Corporate-Client Technology, Investment Products & Markets Technology, Client Reporting, Core Processing, Private and International Wealth Management Technology, Technology Integration Office, Enterprise Infrastructure & Production Management, Capital Markets Application & Data Services, Deployment Planning & Release Management, and the Chief Operating Office.
• Department Profile: Wealth Management's Core Platform Services group provides horizontal services to all Wealth Management development teams.
• The client's mission is to provide stable and scalable infrastructure and technology solutions for all of Wealth Management.
• The client serves as a centralized interface through which Wealth Management teams can obtain infrastructure solutions and project support, working closely with application owners throughout the SDLC to ensure that established products/services are leveraged and new requirements are fulfilled.




