

Snowflake ETL AWS Engineer - In-Person Interview Required
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer with 8-10 years of experience, focusing on AWS, Python, SQL, and Snowflake. Contract length is unspecified, with an on-site location in Providence, NJ. Key skills include orchestration tools, Spark, and Scala.
Country
United States
Currency
$ USD
Day rate
-
Date discovered
August 9, 2025
Project duration
Unknown
Location type
On-site
Contract type
Unknown
Security clearance
Unknown
Location detailed
New Providence, NJ
Skills detailed
#ETL (Extract, Transform, Load) #Data Lineage #Shell Scripting #SQL (Structured Query Language) #Automated Testing #Python #Jira #Informatica #Snowflake #Git #Data Warehouse #Data Ingestion #Airflow #Java #Data Management #Data Engineering #Metadata #AWS (Amazon Web Services) #Scala #Quality Assurance #Spark (Apache Spark) #Scripting
Role description
Data Engineer
Providence, NJ (face-to-face interview and onsite) - local candidates only
Top skill focus: AWS, Python, SQL, and Snowflake, plus experience with orchestration tools (e.g., Airflow, Informatica, Automic) and three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
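As context for the orchestration-tools requirement: tools like Airflow, Informatica, and Automic express a pipeline as a DAG of dependent tasks. A minimal sketch of that dependency ordering, using only Python's standard-library graphlib (the task names here are hypothetical, not part of the role description):

```python
from graphlib import TopologicalSorter

# Toy dependency graph standing in for pipeline tasks; an orchestrator
# such as Airflow would model each key as an operator/job in a DAG.
deps = {
    "transform": {"extract"},   # transform runs after extract
    "load": {"transform"},      # load runs after transform
    "report": {"load"},         # report runs after load
}

# static_order() yields tasks in an order that respects every dependency.
order = list(TopologicalSorter(deps).static_order())
print(order)  # extract, then transform, then load, then report
```

Real orchestrators add scheduling, retries, and backfills on top of exactly this kind of dependency resolution.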
Role: Sr. Data Engineer
Job Description
• An experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
• Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
• Work in tandem with our engineering team to identify and implement the optimal solutions
• Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
• Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
• Manage deliverables in fast-paced environments
Areas of Expertise
• At least 8-10 years of experience designing and developing data solutions in enterprise environments
• 5+ years' experience on the Snowflake platform
• Strong hands-on SQL and Python development
• Experience designing and developing data warehouses in Snowflake
• A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
• Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic
• Good understanding of metadata and data lineage
• Hands-on knowledge of SQL analytical functions
• Strong knowledge and hands-on experience in shell scripting and Java scripting
• Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
• Good understanding of and exposure to Git, Confluence, and Jira
• Good problem-solving and troubleshooting skills
• Team player with a collaborative approach and excellent communication skills
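To illustrate the SQL analytical (window) functions mentioned above, here is a small self-contained sketch using Python's bundled sqlite3 module; the table, columns, and data are invented for demonstration and are not from the role description:

```python
import sqlite3

# Hypothetical sample data, purely for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("east", 100), ("east", 300), ("west", 200), ("west", 50)])

# Analytical (window) functions rank and aggregate over partitions
# without collapsing rows, unlike GROUP BY: here, a per-region rank
# and a per-region running total ordered by amount descending.
rows = conn.execute("""
    SELECT region,
           amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk,
           SUM(amount) OVER (PARTITION BY region ORDER BY amount DESC) AS running
    FROM sales
    ORDER BY region, rnk
""").fetchall()

for row in rows:
    print(row)
```

The same OVER (PARTITION BY ... ORDER BY ...) syntax carries over to Snowflake, which is where this role would apply it.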