

Info Way Solutions
Data Architect
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Architect in Boston, MA, for over 6 months at a competitive pay rate. Key skills include strong SQL, Snowflake, DataStage, and Java/Python for API development. Requires 7–10+ years of data engineering experience.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 23, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Boston, MA
-
🧠 - Skills detailed
#Vault #Data Engineering #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Clustering #SQL Server #Java #API (Application Programming Interface) #Data Architecture #Aurora #DataStage #Python #Oracle #MySQL #Linux #Snowflake #Jira #Jenkins #Migration #Automation
Role description
Role: Data Architect
Boston, MA (onsite)
6+ months
Role Summary:
Design and implement ingestion and transformation pipelines (DataStage/Snowflake/SQL Server/Aurora/MySQL), and build Java/Python-backed Data APIs for downstream consumers.
Key Responsibilities:
• Create ELT/ETL pipelines (DataStage, Python) into Snowflake and/or Aurora/SQL Server; enforce data contracts and DQ checks.
• Engineer dimensional or Data Vault-style models in OneView; optimize Snowflake cost and performance.
• Build/operate Data API services (Java/Python), including pagination, filtering, caching, and authentication.
• Implement CI/CD (Jenkins), unit/integration tests; infra-as-code where applicable; Linux automation.
• Execute migration waves: dual-run, reconcile, and cutover; generate decommission evidence.
Required Skills / Experience:
• Strong SQL across SQL Server, Aurora (MySQL/Postgres-compatible), MySQL, and Oracle SQL/PL/SQL.
• Snowflake performance patterns (clustering, virtual warehouses); DataStage ETL.
• API service usage/integration; Java/Python service development experience.
• JIRA, Jenkins; Red Hat Linux exposure.
• 7–10+ years of data engineering experience.
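The migration responsibility above (dual-run, reconcile, cutover) implies comparing the legacy and target systems while both run. A toy reconciliation pass might hash keyed rows from each side and report drift; the key name and row shapes here are illustrative assumptions, not the client's schema.

```python
import hashlib

def row_digest(row: dict) -> str:
    """Order-independent digest of a row's key/value pairs."""
    canon = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(canon.encode()).hexdigest()

def reconcile(source_rows: list[dict],
              target_rows: list[dict],
              key: str = "id") -> dict:
    """Compare keyed rows from the two dual-run systems and report drift."""
    src = {r[key]: row_digest(r) for r in source_rows}
    tgt = {r[key]: row_digest(r) for r in target_rows}
    return {
        "missing_in_target": sorted(set(src) - set(tgt)),
        "extra_in_target": sorted(set(tgt) - set(src)),
        "mismatched": sorted(k for k in src.keys() & tgt.keys()
                             if src[k] != tgt[k]),
    }

# Example: one changed row and one extra row in the target.
src = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
tgt = [{"id": 1, "v": "a"}, {"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
report = reconcile(src, tgt)
```

An empty report across a full migration wave is the kind of "decommission evidence" the responsibilities call for.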