Senior Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer on a contract-to-hire basis, requiring 5+ years of data engineering and 2+ years in the mortgage industry. Key skills include Snowflake, Azure services, ETL processes, and programming in Python and SQL. 100% remote.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
600
-
πŸ—“οΈ - Date discovered
September 26, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Remote
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Deployment #Vault #Security #Visualization #Azure Data Factory #ADF (Azure Data Factory) #Data Quality #Agile #Spark (Apache Spark) #Synapse #Snowflake #DevOps #Oracle #Databricks #ETL (Extract, Transform, Load) #Data Integration #MySQL #.Net #GIT #Cloud #Monitoring #SQL Server #Automation #Azure #SQL (Structured Query Language) #Version Control #Tableau #Python #Data Warehouse #PySpark #Logic Apps #Azure DevOps #C# #MS SQL (Microsoft SQL Server) #Scala #SSIS (SQL Server Integration Services) #Computer Science #Programming #Spark SQL #Data Governance #Documentation #Azure cloud #Database Systems #Data Engineering #Data Modeling
Role description
Data Engineer - Snowflake / Azure • Contract to Hire • Must be a USC or GC Holder - No C2C or 1099 • 100% Remote

Optomi, in partnership with a leader in home lending, is seeking a data engineer with strong Snowflake experience to help integrate Snowflake into their current environment. The Senior Data Engineer leverages expertise in Azure cloud services and data engineering to design, implement, and optimize scalable data solutions that drive business value and insights.

Essential Duties and Responsibilities
• Designs and implements scalable ETL pipelines using Azure services (or equivalent cloud technologies)
• Implements and optimizes data warehouse architecture and performance
• Leads data integration projects and mentors junior engineers
• Establishes data quality standards and monitoring systems
• Collaborates with stakeholders to understand data requirements
• Develops and maintains documentation for data processes
• Performs miscellaneous duties as assigned

Position Requirements

Education
• Bachelor's degree in computer science, information technology, or a related field, or equivalent related work experience required.
Experience
• Minimum of 5 years of progressive data engineering experience
• Minimum of 2 years of mortgage industry experience

Functional/Technical Skills
• Azure Services: Data Factory, Synapse Analytics, Databricks, Logic Apps, Functions, Key Vault (or equivalent technologies from other cloud vendors)
• Programming: Python, PySpark, SQL, T-SQL, C#, .NET, Scala
• Database Systems: SQL Server, Azure Data Warehouse, MySQL, Oracle
• ETL/ELT Tools: Azure Data Factory, SSIS
• Version Control: Azure DevOps, Git
• Visualization Tools: Power BI, Tableau
• Experience with cloud-based data warehouse architecture
• Strong knowledge of ETL/ELT processes and best practices
• Expertise in data modeling and optimization
• Excellent problem-solving and analytical abilities
• Strong communication and collaboration skills
• Experience with CI/CD pipelines and deployment automation
• Knowledge of data governance and security practices
• Familiarity with agile development methodologies