

AI Developer
Featured Role | Apply directly with Data Freelance Hub
This role is for an AI Developer in Woburn, MA, with a contract of unspecified duration, offered on a W2/1099 basis (rate unspecified). Key skills include Python, Java, SQL, and data visualization tools. Requires 2+ years of relevant experience in data analysis or software development, preferably in financial services.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
September 5, 2025
Project duration
Unknown
Location type
On-site
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Woburn, MA
Skills detailed
#Data Pipeline #Computer Science #Datasets #PostgreSQL #Python #Data Science #Azure #Pandas #Data Manipulation #Data Wrangling #Synapse #Data Engineering #Data Extraction #Databases #SQL (Structured Query Language) #Programming #Data Lake #NumPy #Cloud #Libraries #ETL (Extract, Transform, Load) #Scala #Version Control #GIT #Data Cleansing #Documentation #Visualization #Microsoft Power BI #VBA (Visual Basic for Applications) #Java #BI (Business Intelligence) #Automation #BigQuery #Redshift #AWS (Amazon Web Services) #Data Management #Data Integrity #ML (Machine Learning) #Anomaly Detection #Data Modeling #Data Processing #Predictive Modeling #DAX #SciPy #Scripting #Tableau #Data Analysis #AI (Artificial Intelligence)
Role description
NO C2C; W2/1099 only
Job Title: AI Developer (2 openings)
Location: Woburn, MA
Primary Skills: AI
Required Skills
Job Description
• Must be able to work on-site in Woburn, MA five days a week.
• The Enterprise Data Management team is a nimble, cross-functional group that thrives on solving complex data challenges. We manage enterprise data assets, engineer scalable reporting solutions, and deliver actionable insights that inform strategic decisions across the bank.
We're seeking an AI Developer with a strong analytical mindset and a passion for coding, someone who can bridge the gap between raw data and business intelligence. This role offers the opportunity to work with modern data stacks and contribute to the evolution of our data infrastructure.
KEY RESPONSIBILITIES
Data Analysis & Insights Generation (30%)
• Perform exploratory data analysis (EDA) and statistical profiling using Python (leveraging libraries such as Pandas, NumPy, and SciPy) and Java for backend data processing tasks.
• Develop reusable scripts and modular code for data wrangling, anomaly detection, and KPI tracking.
• Apply object-oriented programming principles to build scalable data pipelines and analytical utilities.
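Purely as an illustrative sketch (not part of the posting), the kind of reusable anomaly-detection utility described above might look like this with Pandas; the function name, fence multiplier, and sample data are assumptions for the example:

```python
import pandas as pd

def iqr_anomalies(series: pd.Series, k: float = 1.5) -> pd.Series:
    """Flag values outside Tukey's fences (Q1 - k*IQR, Q3 + k*IQR)."""
    q1, q3 = series.quantile([0.25, 0.75])  # first and third quartiles
    iqr = q3 - q1
    return (series < q1 - k * iqr) | (series > q3 + k * iqr)

# Hypothetical transaction amounts with one obvious outlier.
amounts = pd.Series([100.0, 102.5, 98.0, 101.0, 99.5, 5000.0])
print(iqr_anomalies(amounts).tolist())  # only the 5000.0 entry is flagged
```

The IQR rule is used here rather than a z-score because, on small samples, a single extreme value inflates the standard deviation enough to mask itself.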
Data Visualization & Reporting (40%)
• Design and implement interactive dashboards using Tableau, Power BI, or custom-built web interfaces.
• Utilize DAX, Power Query, and Salesforce APIs to integrate disparate data sources into unified reporting layers.
• Translate complex datasets into intuitive visual narratives that support executive decision-making.
Data Extraction & Preparation (40%)
• Build and maintain ETL workflows using Python, Java, and SQL-based tools to extract data from PostgreSQL, MSSQL, and cloud-based data lakes.
• Automate data cleansing and transformation routines using Apache POI (for Excel automation in Java), VBA, and Power Query.
• Ensure data integrity through rigorous validation, schema enforcement, and exception handling.
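As a minimal sketch only (field names, rules, and sample rows are hypothetical, not from the posting), the row-level validation and exception handling mentioned above could take roughly this shape:

```python
def validate_rows(rows, required_fields, numeric_fields):
    """Split rows into (valid, rejected); each rejection records why it failed."""
    valid, rejected = [], []
    for row in rows:
        problems = [f"missing:{c}" for c in required_fields if row.get(c) in (None, "")]
        for c in numeric_fields:
            try:
                float(row.get(c, ""))  # empty string / bad text raises ValueError
            except (TypeError, ValueError):
                problems.append(f"non-numeric:{c}")
        if problems:
            rejected.append((row, problems))
        else:
            valid.append(row)
    return valid, rejected

rows = [
    {"account": "A-1", "amount": "125.40"},
    {"account": "", "amount": "88.00"},    # missing required field
    {"account": "A-3", "amount": "n/a"},   # non-numeric amount
]
valid, rejected = validate_rows(rows, ["account"], ["amount"])
print(len(valid), len(rejected))  # 1 2
```

Routing failures to a rejected list with reasons, rather than raising on the first bad row, keeps the load running while leaving an audit trail for remediation.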
REQUIREMENTS
• Programming Proficiency: Strong command of Python for data analysis and scripting, and working knowledge of Java for backend data processing and integration tasks.
• SQL Expertise: Advanced querying skills across relational databases (PostgreSQL, MSSQL).
• Data Engineering Mindset: Familiarity with ETL concepts, data modeling, and pipeline orchestration.
• Tool-Agnostic Flexibility: Comfortable switching between tools and languages to solve problems efficiently.
• Collaborative Communication: Ability to work closely with data scientists, business stakeholders, and technical teams to translate requirements into analytical solutions.
NICE TO HAVES (Or Things You'll Get To Learn)
• Experience with DuckDB, Polars, or other high-performance analytical engines.
• Exposure to cloud data platforms like AWS Redshift, Azure Synapse, or Google BigQuery.
• Familiarity with Git for version control and collaborative development.
• Interest in machine learning, predictive modeling, or statistical inference.
• Prior experience in financial services or other regulated industries.
QUALIFICATIONS
• 2+ years of experience in data analysis, software development, or business intelligence, preferably in financial services or a regulated industry.
• Proficiency in Python and Java, with experience in data manipulation, automation, and backend integration.
• Strong SQL skills and familiarity with relational databases (PostgreSQL, MSSQL).
• Experience with data visualization tools (Power BI, Tableau) and dashboard development.
• Familiarity with ETL processes, data modeling, and version control systems (e.g., Git).
• Experience with Excel (including Power Query and VBA) and process documentation tools (Visio, Lucidchart).
• Excellent communication and stakeholder management skills.
• Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field.
• High proficiency in technical writing and documentation.
Best Regards,
Grace Abinezer
Recruitment Consultant | H3 Technologies, LLC
Phone: (859) 287-0731
Email: grace@h3-staffing.com
Web: www.h3-technologies.com
71 Cavalier Blvd., Suite 208, Florence, KY 41042
linkedin.com/in/grace-ebenezer-bb66a0251