

Sibitalent Corp
Senior Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown," offering a pay rate of "unknown." Key skills include SQL, Python, and experience with generative AI models, relational databases, and data engineering challenges.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
January 30, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Orlando, FL
-
🧠 - Skills detailed
#Snowflake #DataPipeline #DevOps #Git #ETL #KnowledgeGraph #AI #Databases #Deployment #PostgreSQL #SQL #Python #Airflow #Kubernetes #Docker #DataEngineering
Role description
Requirements
• 7+ years of data engineering experience across multiple environments (e.g., Dev, QA, Production), with DevOps practices for code deployment.
• Experience working with a range of generative AI models, tools, and concepts. (Must Have)
• 5+ years of proven experience using SQL and Python. (Must Have)
• 3+ years of experience designing and building relational databases (e.g., Snowflake, PostgreSQL, or similar). (Must Have)
• Experience translating high-level requirements into technical data engineering tasks.
• Experience developing solutions to complex data engineering challenges that support advanced analytics.
• Familiarity with concepts such as Knowledge Graphs, Data Mesh, and data-sharing platforms.
• 3+ years of experience managing and deploying code with source control tools (e.g., Git-based platforms).
• 2+ years of experience with job scheduling and orchestration tools (e.g., Airflow or similar).
• Several years of experience building and maintaining ELT/ETL data pipelines. (Must Have)
• Experience with containerization technologies (e.g., Docker, Kubernetes). (Must Have)






