

Data Architect
Featured Role | Apply directly with Data Freelance Hub
This role is a Data Architect contract position in Iselin, NJ (Hybrid). Pay ranges from $75.00 to $80.00 per hour. Requires 8+ years of enterprise experience, proficiency in Spark/PySpark, and strong skills in Java, Python, or Scala.
Country
United States
Currency
$ USD
Day rate
640
Date discovered
June 12, 2025
Project duration
Unknown
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Iselin, NJ 08830
Skills detailed
#Scala #Data Modeling #Databases #Dremio #Data Analysis #MS SQL (Microsoft SQL Server) #Logging #AI (Artificial Intelligence) #GitHub #JUnit #SonarQube #Jira #Docker #Java #BigQuery #MySQL #Tableau #Data Architecture #Virtualization #REST (Representational State Transfer) #Bash #Batch #GCP (Google Cloud Platform) #Cloud #PySpark #Monitoring #REST API #Alteryx #Data Dictionary #ETL (Extract, Transform, Load) #Metadata #Microsoft Power BI #SQL (Structured Query Language) #Talend #DevSecOps #BI (Business Intelligence) #ML (Machine Learning) #Data Governance #POSTMAN #Azure #Databricks #Visualization #Scrum #Splunk #Data Management #GitLab #Snowflake #Data Lineage #PostgreSQL #Programming #MongoDB #Teradata #Maven #Kanban #Kafka (Apache Kafka) #Agile #Microservices #Spark (Apache Spark) #Data Integration #Python #Scripting #Kubernetes #Jenkins
Role description
Job Title: Data Architect
Location: Iselin, NJ (Hybrid - 3 days onsite)
Key Responsibilities:
Lead design, development, and implementation of innovative, scalable enterprise data applications.
Guide and mentor data analysts and engineers on best practices in data sourcing, integration, and pipeline engineering.
Build Greenfield data solutions from the ground up, working closely with Infosec, Infrastructure, and Operations teams.
Manage IT risk and technical debt through effective data governance and remediation strategies.
Develop and deliver self-service analytics solutions leveraging both on-prem and cloud environments.
Contribute hands-on to backend ETL development, data modeling, and integration with enterprise data sources.
Utilize REST APIs, Kafka, and other data streaming technologies for data consumption and production.
Lead metadata management, data lineage, and data dictionary development initiatives.
Required Skills & Experience:
8+ years as a Data Architect in enterprise environments.
5+ years hands-on experience with Greenfield data solutions and leading data tracks.
Proficiency in pipeline engineering using Spark or PySpark.
Strong programming skills in Java, Python, or Scala.
Experience with data visualization tools: Power BI, Alteryx, Tableau.
Expertise in backend and ETL processes (batch/on-demand) and data integration.
Familiarity with cloud data platforms such as OpenShift Data Foundation, BigQuery, Snowflake, and Databricks.
Experience with ETL tools: Alteryx, Talend, Apache Camel, Kafka Streams.
Knowledge of data virtualization platforms including Dremio, BigQuery Omni, and Red Hat Virtualization.
Proficient with databases: MS SQL, MySQL, PostgreSQL, MongoDB, Teradata.
Solid experience in microservices architecture using Java 8+, the Spring Framework, and Spring Boot.
Familiarity with containerization and orchestration tools like Docker, Kubernetes, and OpenShift.
Experience with cloud providers: OpenShift (OCP), GCP, Azure.
Working knowledge of CI/CD pipelines using Jenkins, GitHub, Maven, Gradle.
Experience with DevSecOps tools: SonarQube, GitLab, Checkmarx, Black Duck.
Familiarity with monitoring and logging tools such as ELK Stack, Splunk, AppDynamics.
Proficient in scripting languages like Python, Bash, Shell.
Experience with testing frameworks: Jasmine, Karma, Selenium, JUnit, RestAssured, Postman.
Agile/Scrum/Kanban experience with tools like Jira and Confluence.
Nice to Have:
Experience with Dremio and Alteryx specifically.
Background in financial services or financial data architectures.
Exposure to AI and machine learning integrations.
Job Type: Contract
Pay: $75.00 - $80.00 per hour
Expected hours: 8 per week
Benefits:
Life insurance
Professional development assistance
Retirement plan
Schedule:
8 hour shift
Experience:
Data Architect in enterprise environments: 8 years (Required)
Greenfield data solutions and leading data tracks: 6 years (Required)
Pipeline engineering using Spark or PySpark: 7 years (Required)
Location:
Iselin, NJ 08830 (Required)
Work Location: In person