Signature IT World Inc

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer on a W2 contract basis in Chicago, IL/Omaha, NE (hybrid). It requires 10+ years of experience and proficiency in Databricks, Snowflake, PySpark, SQL, and ETL tools. A bachelor's degree in a related field is essential.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
March 25, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Chicago, IL
-
🧠 - Skills detailed
#Databricks #PySpark #GIT #Deployment #dbt (data build tool) #Code Reviews #Data Modeling #Python #Data Processing #Infrastructure as Code (IaC) #Data Pipeline #Cloud #Data Architecture #Snowflake #Terraform #ETL (Extract, Transform, Load) #Data Engineering #Datasets #Computer Science #Spark (Apache Spark) #SQL (Structured Query Language) #Informatica #Database Design #Talend #Airflow #DevOps
Role description
Job Title: Data Architect
Location: Chicago, IL 60604 / Omaha, NE (first preference) - Hybrid
Type: W2 Contract

Job Details:
Minimum years of experience: 10+ years
Must-Have Skills: Databricks, Snowflake, PySpark (hands-on development experience)

Job Description:
• Bachelor's degree in Computer Science, Information Systems, or a related field (or equivalent experience).
• 5–8 years of experience in data engineering or a related role.
• Advanced proficiency in SQL for complex data transformation and analysis.
• Hands-on experience with cloud-based data platforms such as Databricks, Snowflake, or similar tools.
• Experience with ETL/ELT tools and frameworks (e.g., Informatica, Talend, dbt, or equivalent).
• Strong proficiency in Python and/or PySpark for data processing and pipeline development.
• Strong understanding of data modeling, database design principles, and building curated datasets for analytics and operational use cases.
• Experience with DevOps practices and Git-based development (branching strategies, pull requests, code reviews).
• Experience implementing CI/CD for data pipelines/workflows and managing deployments across environments.
• CPG domain knowledge is a plus.
• Familiarity with orchestration and workflow tools (e.g., Databricks Workflows, Airflow, or similar) is preferred.
• Familiarity with Infrastructure as Code (e.g., Terraform, CloudFormation) and/or containerization concepts is a plus.
• Strong problem-solving skills, attention to detail, and ability to troubleshoot complex issues end-to-end.
• Excellent communication skills and ability to collaborate across technical and non-technical teams.

Thanks and regards,
Ajay Kumar
ajay.k@sitwinc.com
Signature IT World Inc.
LinkedIn: https://www.linkedin.com/in/ajay-k-875ab212a/