Data Engineer with GENAI

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with GenAI skills in Charlotte, NC, offered as a contract position. Key skills include Spark, ETL, Snowflake, and Generative AI. Experience with Python is required; only limited knowledge of data pipelines and cloud data warehouses is expected.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
September 3, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
On-site
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
Charlotte, NC
🧠 - Skills detailed
#Redshift #Airflow #Data Management #Indexing #Metadata #Python #Data Lineage #dbt (data build tool) #AI (Artificial Intelligence) #Data Warehouse #Spark (Apache Spark) #Snowflake #Migration #Cloud #Data Engineering #Data Pipeline #ETL (Extract, Transform, Load)
Role description
Role: Data Engineer with AI Engineering
Location: Charlotte, NC (On-site)
Skills: Spark, ETL, Data Engineering, Snowflake, GenAI
Job Description:
· Limited experience with query optimization, indexing, and performance profiling; no background in database scaling, migration, or high availability
· Little to no hands-on experience with ETL/ELT tools such as Spark, Airflow, or dbt, or with maintaining large-scale ETL systems
· No experience designing end-to-end or real-time data pipelines or event-driven processing
· Limited experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift
· Minimal focus on data lineage, metadata management, or complex data transformation and optimization techniques
· Should have experience in Python
· Hands-on working experience with Generative AI patterns and frameworks, including LLMs
· Sound knowledge of RAG (Retrieval-Augmented Generation); see the sketch after this list
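For readers unfamiliar with the RAG pattern named in the last bullet, here is a minimal sketch of the retrieve-then-generate flow in Python. Everything in it is a hypothetical stand-in: the sample documents, the token-overlap retrieval, and the generate_answer() placeholder take the place of the embedding model, vector store, and LLM API a production pipeline would use.

```python
# Minimal, illustrative sketch of a Retrieval-Augmented Generation (RAG) flow.
# Hypothetical throughout: the documents, the token-overlap retrieval, and the
# generate_answer() placeholder stand in for the embedding model, vector store,
# and LLM API a real pipeline would use.
from collections import Counter

DOCUMENTS = [
    "Snowflake is a cloud data warehouse queried with SQL.",
    "Apache Spark runs distributed ETL jobs over large datasets.",
    "Airflow schedules and monitors data pipelines as DAGs.",
]

def tokenize(text: str) -> Counter:
    """Split text into lowercase word counts."""
    return Counter(text.lower().split())

def retrieve(question: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive token overlap with the question; keep the top k."""
    q = tokenize(question)
    ranked = sorted(docs, key=lambda d: sum((tokenize(d) & q).values()), reverse=True)
    return ranked[:k]

def generate_answer(question: str, context: list[str]) -> str:
    """Placeholder for an LLM call: a real system would send this prompt to a model."""
    prompt = "Context:\n" + "\n".join(context) + f"\n\nQuestion: {question}"
    return f"[model response to a prompt of {len(prompt)} characters]"

if __name__ == "__main__":
    question = "Which tool schedules data pipelines?"
    print(generate_answer(question, retrieve(question, DOCUMENTS)))
```

The point of the sketch is the separation of retrieval from generation: swapping the toy scorer for vector search, or the placeholder for a hosted LLM, leaves the surrounding pipeline code unchanged.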