

Database Architect
PRIMARY DUTIES & RESPONSIBILITIES:
Architect complex data solutions in an Azure and Snowflake Cloud environment.
Guide the team on migrating existing processes and data from on-premises Oracle/SQL Server and other environments to Azure Cloud and Snowflake.
Maintain the Azure components and services.
Drive and build out Data Quality and Data Lineage tools and concepts.
Create an ecosystem of Data Products and a Data Catalog that allows users to self-serve and explore the data in the Data Warehouse.
Enable end-to-end automation for deployment using continuous integration and continuous delivery.
Implement data pipelines to ingest data to the platform, standardize the data, and transform the data into business facing datasets.
Perform unit and integration testing of data pipelines.
Perform data integration at scale with Azure Data Factory.
Work with Analysts and business SMEs to translate business requirements into curated datasets suitable for analytics solutions.
Document data pipelines for maintainability.
Work with source system owners and business owners to incorporate business changes into data pipelines.
Design data architecture for future platform growth including data warehousing, streaming analytics, and data visualization.
Assist in testing, governance, data quality, training, and documentation efforts.
Actively engage in business stakeholder requirement workshops to understand, interpret, and translate requirements into effective technical solutions.
Job Requirements:
Bachelor's degree in Computer Science, Computer Engineering, or a related field AND 12 years of experience in data engineering, designing and building scalable and reliable data pipelines and infrastructure, OR a Master's degree AND 8 years of experience in data engineering, designing and building scalable and reliable data pipelines and infrastructure.
4+ years of experience with cloud-based data platforms, such as Azure.
4+ years of experience working on cloud data warehouses, such as Snowflake.
Hands-on data modeling experience (e.g., Kimball and Inmon methodologies).
Demonstrated experience with data architecture, working in all phases of the full life cycle of data analytics development: Requirements, Architecture, Design, Testing, and Deployment.
PREFERRED QUALIFICATIONS:
Strong experience with SQL and Python.
Proficiency in performance tuning ETL/ELT processes and database solutions for large data sets.
Proficiency with modern Agile development methodologies.
Hands-on experience with data ingestion tools such as Azure Data Factory (ADF).
Experience with transformation tools like Coalesce is highly preferred.
Ability to deliver maintenance and improvements for existing applications.
Exceptional team player who acts as a technical mentor, shares knowledge with other team members, and is passionate about learning new technologies.
Experience with Master Data Management tools like Profisee is highly desired.