

Intellectt Inc
Senior Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer in California, offered as a full-time contract for more than 6 months, with an unspecified pay rate. Key skills include strong Python, SQL, data pipeline experience, and familiarity with AWS and orchestration tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 17, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
California, United States
-
🧠 - Skills detailed
#Security #Big Data #Data Engineering #Data Modeling #Airflow #Hadoop #FastAPI #SQL (Structured Query Language) #Data Quality #Data Security #Redshift #Data Pipeline #Snowflake #Flask #BigQuery #Pandas #Python #Cloud #AWS (Amazon Web Services) #ETL (Extract, Transform, Load) #Spark (Apache Spark)
Role description
Position: Senior Data Engineer
Location: California
Duration: Full-time / Contract
We are seeking a Senior Data Engineer to design, build, and maintain robust data pipelines, warehouses, and models to support analytics and business needs.
Skills & Experience:
• Strong Python (Pandas, SQLAlchemy, Flask/FastAPI) & SQL
• Experience with data pipelines & warehousing (Redshift, BigQuery, Snowflake)
• Familiarity with Airflow or other orchestration tools
• Knowledge of AWS/cloud platforms
• Understanding of data modeling, integration, and architecture
• Bonus: Big Data (Spark/Hadoop), data security knowledge
Responsibilities:
• Build and maintain ETL pipelines and data models
• Integrate data from multiple sources
• Ensure data quality, performance, and accuracy
• Collaborate with analysts & stakeholders
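As a minimal sketch of the extract-transform-load pattern these responsibilities describe (using pandas; the function names, column names, and in-memory "warehouse" are illustrative only, not part of this role's actual stack):

```python
import pandas as pd

def extract(records):
    """Extract: load raw source records into a DataFrame."""
    return pd.DataFrame(records)

def transform(df):
    """Transform: normalize column names, drop rows missing an id,
    and coerce amounts to a numeric type (a simple data-quality pass)."""
    df = df.rename(columns=str.lower)
    df = df.dropna(subset=["id"])
    df["amount"] = df["amount"].astype(float)
    return df

def load(df, store):
    """Load: append cleaned rows to an in-memory store
    (a stand-in for a warehouse such as Redshift or Snowflake)."""
    store.extend(df.to_dict(orient="records"))
    return len(store)

# Hypothetical raw input with one bad row (missing id)
raw = [
    {"ID": 1, "Amount": "10.5"},
    {"ID": None, "Amount": "3"},
    {"ID": 2, "Amount": "7"},
]
warehouse = []
loaded = load(transform(extract(raw)), warehouse)
# The row with a missing id is filtered out, so 2 rows are loaded
```

In a production setting each of these steps would typically run as a task in an orchestrator such as Airflow, with the load targeting a real warehouse rather than a Python list.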