

MM International, LLC
Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This is a 24+ month contract role for a Data Architect in Woodbridge Township, NJ (Hybrid). It requires 8+ years of experience in data architecture; 5+ years with Alteryx, Power BI, and GCP; and strong data visualization skills. Local candidates only.
🌎 - Country
United States
💱 - Currency
Unknown
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 4, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Woodbridge, NJ 07095
-
🧠 - Skills detailed
#API (Application Programming Interface) #SonarQube #Jira #POSTMAN #GCP (Google Cloud Platform) #Maven #Containers #Batch #Cloud #Data Lineage #GitLab #MS SQL (Microsoft SQL Server) #Python #Java #SAS #Data Dictionary #Kubernetes #REST (Representational State Transfer) #Alteryx #JSON (JavaScript Object Notation) #PostgreSQL #DevSecOps #Agile #Scripting #Data Access #Visualization #Data Governance #Dremio #Virtualization #Azure #Data Architecture #GitHub #SQL (Structured Query Language) #MongoDB #Teradata #Splunk #Data Analysis #Microservices #Monitoring #Databricks #Snowflake #Bash #Logging #Spark (Apache Spark) #Tableau #Deployment #Apache Spark #Microsoft Power BI #Scrum #Docker #BigQuery #Metadata #Data Management #Jenkins #JUnit #Liquibase #JMeter #Talend #Kanban #ETL (Extract, Transform, Load) #Scala #Kafka (Apache Kafka) #Data Modeling #REST API #MySQL #BI (Business Intelligence)
Role description
Title: Data Architect
Location: Woodbridge Township, NJ (Hybrid; 3 days onsite / 2 days WFH)
Duration: 24+ months contract
Local to NJ only. In-person interview (2nd round).
Must-have skills: Alteryx; Power BI or Tableau; GCP; Dremio is a plus. Needs 5+ years of experience with Alteryx / Power BI / Tableau / GCP (in lieu of Dremio experience, with strong data visualization) and strong data architecture.
Required Skills:
• 8+ years of hands-on experience as a Data Architect working on innovative, scalable, data-heavy, on-demand enterprise applications and guiding data analysts and engineers
• 5+ years of hands-on experience with greenfield solutions, leading data tracks and building data sourcing and integration from scratch
• Working closely with partners in Infosec, Infrastructure, and Operations to establish data and system integration best practices
• Identifying, managing, and remediating IT risk and tech debt
• 7+ years of hands-on experience building and delivering self-serve solutions leveraging various approved on-prem and cloud capabilities; split time between leading design, implementation, and software engineers and performing the hands-on work below:
• 7+ years of back-end and ETL experience consuming, producing, and integrating with enterprise data providers in batch or on demand
• 5+ years of experience with data analytics and reporting using Power BI, SAS Viya, Tableau
• 5+ years of experience with cloud-based data solutions like OpenShift Data Foundation (ODF), OpenShift Data Access (ODA), BigQuery, Snowflake, Talend Cloud, Databricks
• 5+ years of experience with ETL tools like Alteryx, Talend, Xceptor, Apache Camel, Kafka Streams
• 4+ years of experience with data virtualization using tools like Dremio, BigQuery Omni, Red Hat Virtualization
• 7+ years of experience with REST APIs, Apigee, Kafka, and JSON to consume and produce data topics (a Kafka sketch follows this list)
• 7+ years of experience with data modeling, metadata management, data governance, data lineage, and data dictionaries
• 7+ years of experience with MS SQL, MySQL, PostgreSQL, MongoDB, Teradata, Apache Spark, and deployment tools like Liquibase
• 3+ years of experience with Java 8+, Spring Framework, Spring Boot, and microservices (see the service sketch below)
• 3+ years of experience with Docker, Kubernetes, OpenShift containers
• 3+ years of experience with cloud providers like OpenShift (OCP), GCP, Azure
• 3+ years of experience with CI/CD tools like Jenkins, Harness, UCD, GitHub, Maven, Gradle
• 3+ years of experience with DevSecOps tools like SonarQube, GitLab, Checkmarx, Black Duck
• 3+ years of experience with logging and monitoring tools like the ELK Stack, Splunk, AppDynamics
• 5+ years of proficiency in scripting languages such as Python, Bash, shell
• 3+ years of experience with test frameworks like Jasmine, Karma, Selenium, JUnit, JMeter, REST Assured, Postman (an example test appears below)
• 3+ years of experience working in an Agile environment using Scrum/Kanban
• 3+ years of experience working with Jira and Confluence
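For illustration, here is a minimal Java sketch of producing and consuming JSON messages on a Kafka topic, as named in the REST API / Kafka / JSON requirement above. It uses the standard Apache Kafka Java client (kafka-clients); the broker address, topic name, and consumer group are illustrative assumptions, not details from this posting.

// Minimal sketch: produce and consume one JSON message on a Kafka topic.
// Broker, topic, and group id below are assumed values for illustration.
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class JsonTopicDemo {
    public static void main(String[] args) {
        String broker = "localhost:9092";   // assumed local broker
        String topic = "customer-events";   // hypothetical topic name

        // Produce one JSON payload, serialized here as a plain string.
        Properties prodProps = new Properties();
        prodProps.put("bootstrap.servers", broker);
        prodProps.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        prodProps.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProps)) {
            String json = "{\"customerId\": \"c-42\", \"event\": \"profile_updated\"}";
            producer.send(new ProducerRecord<>(topic, "c-42", json));
        }

        // Consume from the same topic and print each record.
        Properties consProps = new Properties();
        consProps.put("bootstrap.servers", broker);
        consProps.put("group.id", "demo-group"); // hypothetical consumer group
        consProps.put("auto.offset.reset", "earliest");
        consProps.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        consProps.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(consProps)) {
            consumer.subscribe(Collections.singletonList(topic));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("key=%s value=%s%n", record.key(), record.value());
            }
        }
    }
}

In practice the plain-string payload would typically be replaced with typed serializers and a schema registry.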
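Likewise, a minimal Spring Boot microservice sketch for the Java 8+ / Spring Framework / Spring Boot / microservices requirement. The class, endpoint, and field names are hypothetical, not taken from this posting.

// Minimal Spring Boot REST microservice sketch; names are illustrative.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class CustomerServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(CustomerServiceApplication.class, args);
    }

    // GET /customers/{id} returns a small JSON document; a real service
    // would call a repository or a downstream enterprise data provider.
    @GetMapping("/customers/{id}")
    public Customer getCustomer(@PathVariable String id) {
        return new Customer(id, "placeholder-name");
    }

    // Simple response type (records need Java 16+; use a POJO on Java 8).
    public record Customer(String id, String name) {}
}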
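Finally, a small JUnit 5 test using REST Assured against that hypothetical endpoint, illustrating the test-framework requirement above. The base URI and expected response body are assumptions.

// JUnit 5 + REST Assured sketch against the hypothetical service above.
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;
import org.junit.jupiter.api.Test;

class CustomerEndpointTest {

    @Test
    void returnsCustomerById() {
        given()
            .baseUri("http://localhost:8080") // assumed locally running service
        .when()
            .get("/customers/c-42")
        .then()
            .statusCode(200)                  // expect HTTP 200 OK
            .body("id", equalTo("c-42"));     // JSON field check via GPath
    }
}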