

Net2Source Inc.
Data Engineer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in Burbank, CA, on a W2 contract. Key skills include 7-10 years of experience in Python, SQL, AWS, and Snowflake, along with expertise in ETL/ELT concepts and data integration.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
April 1, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
📍 - Location detailed
Burbank, CA
-
🧠 - Skills detailed
#GIT #Python #Snowflake #AWS (Amazon Web Services) #Data Engineering #Data Integration #Data Modeling #Data Pipeline #ETL (Extract, Transform, Load) #JDBC (Java Database Connectivity) #SQL (Structured Query Language) #Cloud #Datasets
Role description
Role name: Data Engineer Studio Economics
Work site: Burbank, CA (Onsite, local candidates only)
Type: W2 only
• Hands-on experience in Data Engineering with Python
• Strong SQL skill set
• Experience with AWS
• Experience with Snowflake
• 7-10 years of experience as a Data Engineer, including Python and Snowflake
• Hands-on experience with Snowflake (SQL, data modeling, views, tasks, warehouse optimization).
• Solid understanding of ETL/ELT concepts, data pipelines and cloud-based data platforms.
• Proficiency in SQL and experience working with structured datasets.
• Experience building data integrations between systems (APIs, connectors, ETL tools or JDBC-based pipelines).
• Familiarity with source control tools (Git) and collaborative development practices.
• Strong problem-solving skills, attention to detail and ability to work independently.
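The ETL/ELT workflow described in the requirements can be sketched as a minimal Python pipeline. The table name, fields, and sample records below are illustrative, and SQLite stands in for a Snowflake or JDBC-based target, so this is a sketch of the pattern rather than a production implementation:

```python
import sqlite3

# Hypothetical sample records standing in for an API or connector extract step.
RAW_ROWS = [
    {"title": "Film A", "region": "US", "revenue": "1200.50"},
    {"title": "Film B", "region": "US", "revenue": "980.00"},
    {"title": "Film A", "region": "EU", "revenue": "n/a"},  # malformed record
]

def extract():
    # A real pipeline would pull from an API, connector, or JDBC source here.
    return RAW_ROWS

def transform(rows):
    # Cast revenue to float, dropping rows that fail validation.
    clean = []
    for row in rows:
        try:
            clean.append({**row, "revenue": float(row["revenue"])})
        except ValueError:
            continue
    return clean

def load(rows, conn):
    # Load the cleaned rows into a structured table (Snowflake in practice).
    conn.execute(
        "CREATE TABLE IF NOT EXISTS revenue (title TEXT, region TEXT, revenue REAL)"
    )
    conn.executemany(
        "INSERT INTO revenue VALUES (:title, :region, :revenue)", rows
    )
    conn.commit()

def run_pipeline(conn):
    load(transform(extract()), conn)

conn = sqlite3.connect(":memory:")
run_pipeline(conn)
count = conn.execute("SELECT COUNT(*) FROM revenue").fetchone()[0]
print(count)  # 2 (the malformed record is dropped in transform)
```

Separating extract, transform, and load into distinct functions mirrors how such pipelines are typically staged and tested independently.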





