SQL/SSIS/Data Lake Engineer @ Onsite

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a SQL/SSIS/Data Lake Engineer in Charlotte, NC, lasting 12 months+. Key skills include SQL, SSIS, Liquibase, and TFS for legacy pipelines, with future work involving Hive, Dremio, Python, and Airflow for data lake projects.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
August 22, 2025
🕒 - Project duration
More than 6 months
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#SQL (Structured Query Language) #Project Management #Spark (Apache Spark) #Virtualization #Data Integration #PySpark #Data Engineering #Data Lake #Liquibase #SSIS (SQL Server Integration Services) #Python #Data Pipeline #Programming #Airflow #Dremio #Datasets
Role description
SQL/SSIS/Data Lake Engineer
Location: Charlotte, NC
Duration: 12 Months+

Looking for someone who can manage existing data pipelines with SQL, SSIS, and Liquibase, and then, in the future, move these pipelines to a data lake.

The Immediate Project (Legacy Pipeline)
They need someone who can immediately step in and manage the existing data pipelines. This person should be highly proficient in:
• SQL: For database queries and data manipulation.
• SSIS: A Microsoft tool used to build data integration workflows.
• Liquibase: A tool used to manage and track database changes (a wrapper sketch appears after the role description).
• TFS (Team Foundation Server): A system for source code management and project management.
• Autosys: For scheduling and automating data jobs.

The Future Project (Data Lake Pipeline)
The candidate will also help with a new project to move data from the old systems to a new data lake. For this, they need skills in:
• Hive: A tool for querying data in a data lake using SQL-like commands.
• Dremio: A data virtualization platform that connects to data lakes.
• Python/PySpark: The programming language and framework used to process large datasets.
• Airflow: A platform to author, schedule, and monitor data workflows (a DAG sketch appears after the role description).
• Data Lake Architecture: A general understanding of how these new systems are built.

In short, they need a well-rounded Data Engineer who is an expert in established technologies like SQL and SSIS and also has a strong understanding of newer technologies like Python and data lakes.
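For illustration, here is a minimal sketch of how the legacy Liquibase step might be wrapped so an Autosys job can run it. It assumes the Liquibase CLI is installed and on PATH; the changelog filename, JDBC URL, and database name are hypothetical placeholders, not details from this posting, and credential flags are omitted.

```python
# A minimal sketch, not the client's actual job: wrap a Liquibase update
# so a scheduler such as Autosys can invoke it and capture its output.
import subprocess
import sys


def run_liquibase_update(changelog: str, url: str) -> int:
    """Apply pending changesets from `changelog` to the database at `url`."""
    # Assumes the Liquibase CLI is on PATH; username/password flags omitted.
    result = subprocess.run(
        ["liquibase", "update", f"--changelog-file={changelog}", f"--url={url}"],
        capture_output=True,
        text=True,
    )
    # Echo Liquibase output so the scheduler's job log captures it.
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    # Hypothetical changelog and SQL Server connection string.
    sys.exit(
        run_liquibase_update(
            "changelog.xml",
            "jdbc:sqlserver://localhost;databaseName=legacy",
        )
    )
```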
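And a minimal sketch of the future data lake work: an Airflow DAG running a PySpark job that lands a legacy extract as Parquet, which Hive or Dremio could then query. The DAG id, file paths, and schedule are hypothetical, and the `schedule` argument assumes Airflow 2.4+; this is an illustration of the stack named above, not the client's actual pipeline.

```python
# A minimal sketch: one Airflow task that copies a staged legacy extract
# into a data lake path as Parquet. All names/paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def load_to_lake():
    # Import inside the task so only the worker needs PySpark installed.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("legacy_to_lake").getOrCreate()
    # Hypothetical CSV extract staged by the existing SSIS job.
    df = spark.read.csv("/staging/orders_extract.csv", header=True, inferSchema=True)
    # Write Parquet that Hive or Dremio can query in place.
    df.write.mode("overwrite").parquet("/datalake/raw/orders")
    spark.stop()


with DAG(
    dag_id="legacy_orders_to_datalake",
    start_date=datetime(2025, 8, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    PythonOperator(task_id="load_orders", python_callable=load_to_lake)
```

In practice each legacy SSIS package would become its own task (or DAG), with Airflow replacing Autosys as the scheduler.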