Zensar Technologies

Senior GCP Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior GCP Data Architect in Atlanta, GA, with a contract length of "unknown" and a pay rate of "$unknown." Key skills include GCP, BigQuery, Airflow, Python, and experience in marketing and retail data ecosystems. A bachelor's degree is required; GCP certification is a strong plus.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 20, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Atlanta, GA
-
🧠 - Skills detailed
#Data Governance #Data Warehouse #Splunk #Migration #Airflow #Compliance #Data Lake #Data Engineering #Data Modeling #Leadership #Unix #Data Cleansing #Storage #Dataflow #Jira #Data Quality #Scala #Data Privacy #Cloud #Automation #Programming #Metadata #Scripting #Monitoring #Shell Scripting #SQL (Structured Query Language) #Security #Data Lineage #Apache Airflow #ETL (Extract, Transform, Load) #Datasets #Oracle #GIT #BigQuery #Agile #Data Pipeline #DBeaver #Databases #MySQL #Python #Data Migration #DataStage #GCP (Google Cloud Platform) #Hadoop #Data Architecture #Computer Science #Observability
Role description
Looking for a workplace where people realize their full potential, are recognized for the impact they make, and enjoy the company of the peers they work with? Welcome to Zensar! Read on for more details on the role and about us.

Job Title: Senior Data Engineer / Data Architect
Location: Atlanta, GA

Overview of the Role
The Senior Data Engineer / Data Architect will play a key role in designing, building, and optimizing modern data platforms and pipelines that support enterprise-level analytical, personalization, and operational use cases. This role involves architecting scalable data lake solutions, modernizing legacy data processes, enabling identity resolution, orchestrating cloud-based ETL workflows, and ensuring high standards of data quality, governance, and compliance.

The ideal candidate brings hands-on expertise in Google Cloud Platform (GCP), BigQuery, Airflow, Python, and enterprise data migration, along with strong experience supporting marketing, retail, and customer personalization ecosystems. The role requires strong collaboration skills, leadership in cross-functional environments, and the ability to translate business requirements into scalable technical solutions.

Key Responsibilities:

Data Architecture & Engineering
• Architect and develop scalable data lake and data warehouse solutions using GCP (BigQuery, Cloud Storage) and other modern cloud technologies.
• Design end-to-end ETL/ELT workflows using Apache Airflow (Cloud Composer), Python, SQL, and Unix shell scripting.
• Build, optimize, and maintain high-performance data models supporting analytics, personalization, and downstream business processes.
• Implement incremental and real-time ingestion patterns (including delta loads, streaming, and True Delta-based updates).

Marketing & Personalization Data Ecosystem
• Integrate enterprise datasets with Adobe Experience Platform (AEP), Customer Journey Analytics (CJA), and Adobe Journey Optimizer (AJO).
• Implement identity resolution workflows using platforms such as Amperity, ensuring accuracy, governance, and privacy compliance.
• Develop suppression logic and orchestration workflows that coordinate customer journeys across marketing channels to prevent duplicate targeting.

Data Migration & Modernization
• Lead migrations from legacy platforms (Oracle, DB2, on-prem systems) to modern GCP-based architectures.
• Re-engineer legacy ETL jobs (e.g., DataStage, PL/SQL pipelines) into scalable Python- and Airflow-based cloud workflows.
• Conduct detailed source-to-target mappings, data cleansing, validation, and reconciliation for high-volume migrations.

Data Governance, Compliance & Quality
• Implement and enforce CCPA, data privacy standards, and security best practices across pipelines and platforms.
• Ensure high data quality through automated validation, proactive monitoring, and observability dashboards.
• Establish and maintain data lineage, metadata standards, and domain-specific governance practices.

Production Support & Operational Excellence
• Provide L3 support for complex data pipelines, ensuring stability, scalability, and optimized performance.
• Troubleshoot production issues, conduct root-cause analysis, and implement preventive measures.
• Collaborate with cross-functional teams (engineering, analytics, marketing, product) to support ongoing data initiatives.

Leadership & Collaboration
• Work closely with onshore/offshore teams to manage priorities, guide development, and ensure timely delivery.
• Engage business stakeholders to understand requirements, define KPIs, and shape data solutions that align with strategic objectives.
• Provide architectural recommendations, technical mentorship, and thought leadership within the data engineering function.

Qualifications

Required Skills
• 14+ years of experience in data engineering, data architecture, or related disciplines.
• Hands-on experience with Google Cloud Platform (BigQuery, Dataflow, Composer/Airflow, Cloud Storage).
• Strong programming expertise in Python, SQL, and Unix shell scripting.
• Deep understanding of ETL/ELT pipelines, orchestration, and workflow automation.
• Experience with major databases such as Oracle, DB2, MySQL, and Cloud SQL.
• Proficiency with tools such as Control-M, Git, JIRA, DBeaver, and cloud development consoles.
• Strong background in data modeling (conceptual, logical, physical).
• Experience integrating with marketing platforms such as AEP, CJA, AJO, or Amperity.

Preferred Skills
• Experience in retail, e-commerce, telecom, or customer analytics domains.
• Familiarity with Hadoop, Hive, or other big-data technologies.
• Understanding of identity resolution workflows and customer 360 datasets.
• Exposure to ServiceNow, Splunk, Kibana, or similar monitoring platforms.
• Experience with Agile methodologies and enterprise SDLC practices.

Education & Certifications:
• Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
• Certifications in GCP (e.g., GCP Data Engineer) or equivalent cloud certifications are a strong plus.
• Additional certifications in Agile, data governance, or related areas are a plus.

Advantage Zensar
We are a digital solutions and technology services company that partners with global organizations across industries to achieve digital transformation. With a strong track record of innovation, investment in digital solutions, and commitment to client success, at Zensar you can help clients achieve new thresholds of performance. A subsidiary of RPG Group, Zensar has its HQ in India and offices across the world, including Mexico, South Africa, the UK, and the USA.

Zensar is all about celebrating individuality, creativity, innovation, and flexibility. We hire based on values, talent, and the potential necessary to fill a given job profile, irrespective of nationality, sexuality, race, color, and creed.
We also put policies in place to empower this assorted talent pool with the right environment for growth. At Zensar, you Grow, Own, Achieve, Learn.

Learn more about our culture: https://www.zensar.com/careers/who-we-are

Ready to #ExperienceZensar? Begin your application by clicking the 'Apply Online' button below. Be sure to have your resume handy! If you're having trouble applying, drop a line to careers@zensar.com.