fusionSpan

Senior Data Engineer - B2B Contract

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer - B2B Contract, with a contract length of "unknown" and a pay rate of "unknown." Key skills include 3+ years of ETL experience, SQL, Python, and cloud databases (AWS/Azure). Experience with ERP and CRM solutions is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
December 11, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Maryland, United States
-
🧠 - Skills detailed
#Cloud #Fivetran #Matillion #Migration #SQL (Structured Query Language) #Azure cloud #AI (Artificial Intelligence) #Python #RDBMS (Relational Database Management System) #MLOAD (MultiLoad) #Azure #Snowflake #Documentation #Agile #Semantic Models #Databases #Data Conversion #ML (Machine Learning) #Data Analysis #Data Access #Data Engineering #Scala #Strategy #AWS (Amazon Web Services) #Talend #SAP #Requirements Gathering #ETL (Extract, Transform, Load) #Data Lake #CRM (Customer Relationship Management) #Data Quality
Role description
fusionSpan is a fast-growing company with development centers in the USA, Canada, El Salvador, Poland, and India. fusionSpan focuses on software development, enterprise CRM implementations, digital strategy, and enterprise middleware. A data engineer at fusionSpan joins the cross-functional Data & Integrations team, which handles data analysis and ETL work across multiple projects. The ideal candidate can work autonomously and adapt to an evolving work structure. If you love working with data, this is the position for you.

Responsibilities
• Utilize extract/transform/load (ETL) technologies with Snowflake and other cloud data platforms (a minimal illustrative sketch follows the Qualifications list)
• Interpret data, analyze results using statistical techniques, and provide ongoing reports
• Develop and implement databases, data collection systems, data analytics, and other strategies that optimize statistical efficiency and quality
• Acquire data from primary or secondary data sources and maintain databases/data systems
• Evaluate and optimize data structures
• Identify, analyze, and interpret trends or patterns in complex data sets
• Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems
• Monitor, troubleshoot, and improve pipeline transparency, performance, scalability, and reliability using Snowflake OpenFlow and related ELT/ETL tools
• Ensure AI/ML readiness of data by preparing and maintaining semantic models, ensuring robust data quality, and establishing and enforcing data access controls
• Produce field-mapping and translation documentation for use in both manual and scripted migrations
• Work within an Agile methodology, managing tasks and tickets as assigned
• Communicate with clients and team members for requirements gathering, clarification, and planning of data conversions
• Document work and work processes for use by team members

Qualifications
• 3+ years of experience with an ETL tool such as Talend, Fivetran, Matillion, or Airbyte (required)
• 4+ years of experience using SQL and an RDBMS (required)
• 2+ years of experience with AWS/Azure cloud databases/data lakes
• 1+ years of active, frequent coding experience in Python (required)
• Experience with ERP solutions such as SAP and CRM solutions such as Salesforce (preferred)
• Mastery of Excel
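The ETL responsibility above is the core of the role: moving data into Snowflake with Python and SQL. As a flavor of that work, here is a minimal, hypothetical sketch of one extract/transform/load step. It is not fusionSpan's actual pipeline; every name (the CSV paths, the CONTACTS_STG table, the ETL_WH/CRM/STAGING Snowflake objects, the environment variables) is an illustrative assumption, and it presumes the snowflake-connector-python package and an existing staging table.

```python
# Hypothetical single ETL step: extract a CRM contact export from CSV,
# apply a small cleaning transform, and load it into a Snowflake staging
# table. All identifiers and credentials below are placeholders.
import csv
import os

import snowflake.connector  # pip install snowflake-connector-python

RAW_PATH = "contacts_export.csv"    # assumed raw extract from the CRM
CLEAN_PATH = "contacts_clean.csv"   # cleaned output written locally

# Transform: normalize email casing, drop rows with no email, dedupe.
with open(RAW_PATH, newline="") as src, open(CLEAN_PATH, "w", newline="") as dst:
    reader = csv.DictReader(src)
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    seen = set()
    for row in reader:
        email = (row.get("email") or "").strip().lower()
        if not email or email in seen:
            continue  # skip blank and duplicate emails
        seen.add(email)
        row["email"] = email
        writer.writerow(row)

# Load: upload the file to the table's internal stage, then COPY it in.
# Assumes CONTACTS_STG already exists with columns matching the CSV.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ETL_WH",   # placeholder warehouse
    database="CRM",       # placeholder database
    schema="STAGING",     # placeholder schema
)
try:
    cur = conn.cursor()
    cur.execute(f"PUT file://{os.path.abspath(CLEAN_PATH)} @%CONTACTS_STG OVERWRITE = TRUE")
    cur.execute(
        "COPY INTO CONTACTS_STG "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1 FIELD_OPTIONALLY_ENCLOSED_BY = '\"')"
    )
finally:
    conn.close()
```

In a production version of this step, one of the ETL tools named in the Qualifications (Talend, Fivetran, Matillion, Airbyte) would typically replace the hand-rolled extract, with the transform pushed down into Snowflake as ELT.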