Python Developer (W2 Only)

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Junior Python Developer in Woburn, Massachusetts, with a contract length of 6-12+ months. Pay rate is competitive. Key skills include Python, Java, SQL, ETL, and data visualization tools like Power BI. A Bachelor’s degree and 2 years of relevant experience are required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
🗓️ - Date discovered
September 11, 2025
🕒 - Project duration
More than 6 months
-
🏝️ - Location type
On-site
-
📄 - Contract type
W2 Contractor
-
🔒 - Security clearance
Unknown
-
📍 - Location detailed
Woburn, MA
-
🧠 - Skills detailed
#Data Modeling #Python #GIT #Cloud #VBA (Visual Basic for Applications) #Databases #Documentation #Data Lake #PostgreSQL #Version Control #Visualization #Predictive Modeling #Computer Science #Data Integrity #API (Application Programming Interface) #Data Engineering #BI (Business Intelligence) #ML (Machine Learning) #Data Manipulation #ETL (Extract, Transform, Load) #Tableau #Data Cleansing #Data Science #Data Processing #Microsoft Power BI #Data Extraction #Azure #BigQuery #DAX #Data Management #AI (Artificial Intelligence) #Scala #SQL (Structured Query Language) #Datasets #Automation #Java #Data Analysis #Programming #Scripting #AWS (Amazon Web Services) #Synapse #Redshift
Role description
Job Title: Junior Python Developer
Location: Woburn, MA (on-site, 5 days a week)
Duration: 6-12+ months
Interview Process: Video

Qual Call Notes:
• His mission is the AI mission.
• Had a family intern who worked through the summer; he was very technical, and they found a role for him, but he is now off to grad school.
• So this is technically an addition to staff.
• Last year at this time, they also hired a traditional Data Analyst who, over the course of the year, dove into more automation and script writing and built their own AI environment and application.
• Starting to leverage AI; this resource moved into more of a Developer role.
• Was very proficient in Python coding, etc.
• Worked with use cases to build out; they have a framework and documentation.
• Developer role, publicly presented as a Data Analyst role.
• Python skills to write scripts/code.
• Can develop.
• Writes the scripts and automations to help execute work.
Tech skills:
• Write UI in Java and Rust.
• Leverage Python to provide the output.
• Using AI to write boilerplate code and then augmenting it.
• Need to understand how to make the code cohesive and standard.
• They do use APIs.
• Pipelines are not as important.
• Don't want someone single-threaded.
• Need to understand analysis and Power BI in order to craft the output.
• Primary focus is the development aspect and building a product.
Team:
• Manager and one other individual.
• Will interact with other folks on the team as well.
• About 6 people in various roles/capacities; it will be a collaborative environment (questions about the banking industry, whether this will be well received, etc.).
• Commercial lending: leases and agreements.

The Enterprise Data Management team is a nimble, cross-functional group that thrives on solving complex data challenges. We manage enterprise data assets, engineer scalable reporting solutions, and deliver actionable insights that inform strategic decisions across the bank.
We’re seeking a Data Analyst with a strong analytical mindset and a passion for coding: someone who can bridge the gap between raw data and business intelligence. This role offers the opportunity to work with modern data stacks and contribute to the evolution of our data infrastructure.

KEY RESPONSIBILITIES

Data Visualization & Reporting (40%)
• Design and implement interactive dashboards using Tableau, Power BI, or custom-built web interfaces.
• Utilize DAX, Power Query, and Salesforce APIs to integrate disparate data sources into unified reporting layers.
• Translate complex datasets into intuitive visual narratives that support executive decision-making.

Data Extraction & Preparation (40%)
• Build and maintain ETL workflows using Python, Java, and SQL-based tools to extract data from PostgreSQL, MSSQL, and cloud-based data lakes.
• Automate data cleansing and transformation routines using Apache POI (for Excel automation in Java), VBA, and Power Query.
• Ensure data integrity through rigorous validation, schema enforcement, and exception handling.

REQUIREMENTS
• Programming Proficiency: Strong command of Python for data analysis and scripting, and working knowledge of Java for backend data processing and integration tasks.
• SQL Expertise: Advanced querying skills across relational databases (PostgreSQL, MSSQL).
• Data Engineering Mindset: Familiarity with ETL concepts, data modeling, and pipeline orchestration.
• Tool-Agnostic Flexibility: Comfortable switching between tools and languages to solve problems efficiently.
• Collaborative Communication: Ability to work closely with data scientists, business stakeholders, and technical teams to translate requirements into analytical solutions.

NICE TO HAVES (Or Things You'll Get to Learn)
• Experience with DuckDB, Polars, or other high-performance analytical engines.
• Exposure to cloud data platforms like AWS Redshift, Azure Synapse, or Google BigQuery.
• Familiarity with Git for version control and collaborative development.
• Interest in machine learning, predictive modeling, or statistical inference.
• Prior experience in financial services or other regulated industries.

QUALIFICATIONS
• 2 years of experience in data analysis, software development, or business intelligence, preferably in financial services or a regulated industry.
• Proficiency in Python and Java, with experience in data manipulation, automation, and backend integration.
• Strong SQL skills and familiarity with relational databases (PostgreSQL, MSSQL).
• Experience with data visualization tools (Power BI, Tableau) and dashboard development.
• Familiarity with ETL processes, data modeling, and version control systems (e.g., Git).
• Experience with Excel (including Power Query and VBA) and process documentation tools (Visio, Lucidchart).
• Excellent communication and stakeholder management skills.
• Bachelor’s degree in Computer Science, Data Science, Information Systems, or a related field.
• High proficiency in technical writing and documentation.