

Hope Tech
Backend Development - Python
Featured Role | Apply directly with Data Freelance Hub
This role is for a Backend Development - Python position in McLean, VA, with a contract until Mar 24, 2026, offering a competitive pay rate. Requires 7+ years in Python, SQL, and Snowflake integration, along with strong database skills.
Country
United States
Currency
Unknown
Day rate
Unknown
Date
March 25, 2026
Duration
Unknown
Location
On-site
Contract
Unknown
Security
Unknown
Location detailed
McLean, VA
Skills detailed
#MySQL #GIT #Jira #SciPy #SQL Queries #Django #NoSQL #SQL Server #Data Extraction #NLTK (Natural Language Toolkit) #Sybase #Python #REST (Representational State Transfer) #Databases #Microservices #Data Integration #Business Analysis #Data Integrity #Data Pipeline #SpaCy #Data Migration #Data Architecture #Pandas #Migration #Snowflake #ETL (Extract, Transform, Load) #Datasets #Docker #Data Analysis #Computer Science #SQL (Structured Query Language) #Web Development #Data Warehouse #BitBucket #Jenkins #API (Application Programming Interface) #GraphQL #NumPy #Eclipse
Role description
McLean, VA
Contract
Mar 24, 2026
Backend Development - Python
Must Have Qualifications: 7+ years of backend development in Python required. Strong database skills (SQL Server or DB2), data extraction using Snowflake, and solid data analysis experience.
Responsibilities:
- Work closely with LPMS Business, SF Securitization, and I&CM BTO support teams on
technology delivery activities for the Data Transformation Program.
- Play a key role in analyzing and converting complex SQL queries to align with new data
sources, supporting large-scale data migrations, and ensuring seamless data integration
across multiple systems.
- Update LPMS applications to source securitization data from Nexus.
- Support and execute data migration activities, ensuring data integrity and accuracy.
- Work with Snowflake for data warehousing, migration, and advanced analytics.
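The query-conversion responsibility above could be sketched as a small helper that remaps table references from a legacy schema to a new source. All schema and table names here are illustrative assumptions, not details from the posting:

```python
import re

# Hypothetical mapping from legacy table names to new Nexus-side names;
# the real schemas are not given in the posting.
TABLE_MAP = {
    "legacy_db.securitization": "nexus.securitization_v2",
    "legacy_db.loans": "nexus.loan_master",
}

def convert_query(sql: str, table_map: dict) -> str:
    """Rewrite table references in a SQL query to point at new data sources."""
    for old, new in table_map.items():
        # Word-boundary match so e.g. 'legacy_db.loans_ext' is left untouched.
        sql = re.sub(rf"\b{re.escape(old)}\b", new, sql)
    return sql

query = "SELECT loan_id, balance FROM legacy_db.loans WHERE pool_id = 42"
print(convert_query(query, TABLE_MAP))
# SELECT loan_id, balance FROM nexus.loan_master WHERE pool_id = 42
```

A real conversion effort would likely need a SQL parser rather than regex rewriting, but the mapping-driven shape is the same.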
Basic Requirements:
- 7+ years of Python and microservices experience is a must.
- 5+ years of hands-on experience with SQL, including complex query analysis and
optimization.
- 1-3 years of experience in integrating with Snowflake APIs and/or GraphQL for data
extraction and manipulation.
- 7-10 years of IT experience in design, development, integration and testing of software
solutions based on Python, REST/SOAP Webservices, and web development.
- 7-10 years of experience with relational databases including Sybase, SQL Server, and
MySQL. Good knowledge of and experience with database technologies such as SQL and
NoSQL.
- Demonstrated ability to analyze and join large datasets across multiple sources.
- Proficient in integrating with Snowflake APIs and/or GraphQL for data extraction and
manipulation.
- 3-5 years of experience with managing, building, and deploying code through tools
including Jira, Eclipse, GIT, Bitbucket, Gradle, Docker, and Jenkins.
- 3-5 years of experience designing and executing unit tests using a Python testing framework,
based on business requirements and functional specifications.
- Experience with popular Python frameworks such as Django and FastAPI.
- In-depth understanding of Python software development stacks, ecosystems,
frameworks, and tools such as NumPy, SciPy, pandas, Dask, spaCy, NLTK, and scikit-learn.
- Bachelor's degree in computer science or a related field.
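The unit-testing requirement above might look like the following pytest-style sketch. The reconciliation function and test names are invented for illustration; they are not from the posting:

```python
def reconcile_row_counts(source_count: int, target_count: int) -> bool:
    """Data-integrity check after a migration: row counts must match exactly."""
    return source_count == target_count

# pytest-style unit tests derived from a (hypothetical) functional spec:
def test_matching_counts_pass():
    assert reconcile_row_counts(1_000, 1_000)

def test_mismatched_counts_fail():
    assert not reconcile_row_counts(1_000, 999)
```

Run with `pytest`, which discovers `test_*` functions automatically and reports bare-`assert` failures with introspected values.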
Preferred Skills:
- Work independently, contributing to the success of assigned project(s).
- Collaborate with cross-functional teams including data architects, business analysts, and
project managers.
- Demonstrated business acumen, problem-solving skills, intellectual maturity, and
relationship management skills.
- Prior Freddie Mac experience and understanding of the Corporate Data Warehouse are a plus.
- Experience with ETL tools and data pipeline orchestration.
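The ETL and orchestration bullet could be illustrated with a minimal hand-rolled pipeline. Real projects would typically use an orchestrator such as Airflow; the data and step functions here are placeholders, not details from the role:

```python
from typing import Iterable

Record = dict

def extract() -> list:
    # Stand-in for pulling rows from Snowflake or another source.
    return [{"loan_id": 1, "balance": "100.50"}, {"loan_id": 2, "balance": "200.00"}]

def transform(rows: Iterable) -> list:
    # Normalize types so downstream systems receive consistent data.
    return [{**r, "balance": float(r["balance"])} for r in rows]

def load(rows: list) -> int:
    # Stand-in for writing to a warehouse table; returns rows "loaded".
    return len(rows)

def run_pipeline(steps=(extract, transform, load)) -> int:
    data = steps[0]()
    data = steps[1](data)
    return steps[2](data)

print(run_pipeline())  # 2
```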