JR Snowflake Data Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a JR Snowflake Data Architect with 2-4 years of experience in AWS, AI/ML, and banking. The contract is hybrid, onsite in Charlotte, with an unknown pay rate. Key skills include Snowflake, R, Python, and predictive analytics.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Hybrid
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#AI (Artificial Intelligence) #Ansible #Security #Computer Science #Jenkins #Storage #Matillion #Kubernetes #Cloud #JSON (JavaScript Object Notation) #Data Integration #Puppet #CHEF #Docker #Containers #AWS (Amazon Web Services) #ML (Machine Learning) #R #DataStage #Data Architecture #GIT #Deployment #DevSecOps #DevOps #GitHub #Python #Agile #API (Application Programming Interface) #Snowflake #Data Engineering
Role description

Snowflake Data Architect (Junior) with AI/ML, AWS, and banking domain experience (mandatory), plus integration and Agile experience.

Expertise in R, Python and predictive analytics

Onsite in Charlotte, hybrid model (onsite 4 days a week). Data Architect II – 3+ years of experience in the following:

   • AWS Cloud Services; strong development background

   • AI

   • Snowflake

   • TOGAF certification is not required, but is a nice-to-have to demonstrate architecture knowledge

   • Excellent communication skills, both verbal and written; a collaborative team player able to work in a matrixed environment across LOBs; proactive, with a positive attitude

   • Motivated and able to look for improvement opportunities

   • Provide location and interview availability

   • Three interviews: two panel interviews and a final interview with the EM; at least one interview will be onsite in Charlotte

Job Description

This associate will be responsible for researching and conducting POCs to ensure solutions are built in alignment with platform principles and standards, and for driving the architecture of key cross-team/cross-product development projects (via architecture/design documents and by developing code for key modules).

The Work Itself:

Develop data integration architectures and define architecture standards and guidelines (architecture and design patterns)

Define deployment topology by working with peers across the technology organization

Optimize use of the platform and technologies

Work with business and delivery partners to understand future requirements and their implications for the architecture

Identify architecture solution alternatives, driven by business requirements and the availability of new technologies

Identify new technologies and tools to support the architecture(s) and determine architectural roadmaps

Encourage early adoption of tools and architecture across the organization

Review solutions and projects to ensure they are consistent with standard methodologies in data and analytics

Primary Skills:

Bachelor’s Degree in a related field (Computer Science, AI/ML).

2 to 4 years of IT and business/industry development experience

Knowledge of data standards (security, ingestion, warehousing, storage)

Demonstrated success in design, architecture, and development using AWS Cloud technologies.

Experience with AWS product categories like Analytics, Application Integration, Compute, Database, Serverless.

Experience with Snowflake

Experience with AI/ML

Collaborate with Agile teams to recommend the right architectures, applying high-quality, standard software methodologies

Strong communication skills.

Nice to have:

Experience with one or more data integration technologies (Matillion, Pentaho, DataStage, etc.)

APIs, Apigee, developer portals; expertise in JSON, RESTful services, and related technologies

Expertise in R, Python, and predictive analytics

Containers: Docker, Kubernetes, OpenShift, Ansible, Nexus, software-defined networking

DevOps/DevSecOps: Chef, Puppet, Jenkins, Packer, Git, GitHub

Skill/Experience/Education

Mandatory Skills

AWS Data Engineer, Snowflake Data Engineer

Desired Skills

Bachelor’s Degree in a related field (Computer Science, AI/ML)

2 to 4 years of IT and business/industry development experience

Knowledge of data standards (security, ingestion, warehousing, storage)

Demonstrated success in design, architecture, and development using AWS Cloud technologies

Experience with AWS product categories like Analytics, Application Integration, Compute, Database, Serverless

Experience with Snowflake

Experience with AI/ML

Collaborate with Agile teams to recommend the right architectures, applying high-quality, standard software methodologies

Strong communication skills

Expertise in R, Python, and predictive analytics