

Snowflake ETL AWS Engineer - In-Person Interview Required
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer with 8-10 years of experience, focusing on AWS, Python, SQL, Snowflake, and orchestration tools. Contract length and pay rate are unspecified, and local candidates are required for in-person interviews in Providence, NJ.
Country: United States
Currency: $ USD
Day rate: Unknown
Date discovered: August 23, 2025
Project duration: Unknown
Location type: On-site
Contract type: Unknown
Security clearance: Unknown
Location detailed: New Providence, NJ
Skills detailed: #Scripting #Data Management #Shell Scripting #Data Engineering #Snowflake #Automated Testing #Jira #SQL (Structured Query Language) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #AWS (Amazon Web Services) #Data Ingestion #Python #Quality Assurance #GIT #Data Lineage #Informatica #Java #Metadata #Scala #Airflow #Data Warehouse
Role description
Data Engineer
Providence, NJ (face-to-face interview and onsite); local candidates only
Top skill focus: AWS, Python, SQL, Snowflake, and experience with orchestration tools (e.g., Airflow, Informatica, Automic), plus three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
Role: Sr. Data Engineer
Job Description
• Experienced data management specialist responsible for developing, overseeing, organizing, storing, and analyzing data and data systems
• Participate in all aspects of the software development lifecycle for Snowflake solutions, including planning, requirements, development, testing, and quality assurance
• Work in tandem with our engineering team to identify and implement optimal solutions
• Ensure platform performance, uptime, and scale, maintaining high standards for code quality and thoughtful design
• Troubleshoot incidents, identify root causes, fix and document problems, and implement preventive measures
• Able to manage deliverables in fast-paced environments
Areas of Expertise
• At least 8-10 years of experience designing and developing data solutions in an enterprise environment
• At least 5 years' experience on the Snowflake platform
• Strong hands-on SQL and Python development
• Experience designing and developing data warehouses in Snowflake
• A minimum of three years' experience developing production-ready data ingestion and processing pipelines using Spark and Scala
• Strong hands-on experience with orchestration tools, e.g., Airflow, Informatica, Automic (a minimal sketch follows this list)
• Good understanding of metadata and data lineage
• Hands-on knowledge of SQL analytical functions
• Strong knowledge of and hands-on experience with shell scripting and JavaScript
• Demonstrated experience with software engineering practices, including CI/CD, automated testing, and performance engineering
• Good understanding of and exposure to Git, Confluence, and Jira
• Good problem-solving and troubleshooting skills
• Team player with a collaborative approach and excellent communication skills
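For illustration only, here is a minimal sketch of how the pieces named above are commonly combined: an Airflow DAG that submits a Spark/Scala ingestion job and then runs a Snowflake query using an analytical (window) function. It assumes Airflow 2.4+ with the Spark and Snowflake provider packages installed; the DAG name, jar path, entry-point class, table, and connection IDs are hypothetical placeholders, not details from this posting.

# Minimal sketch, assuming apache-airflow-providers-apache-spark and
# apache-airflow-providers-snowflake are installed. All names below are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="orders_ingestion_daily",   # hypothetical pipeline name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    # Submit a production-style Spark/Scala ingestion job (hypothetical jar and class).
    ingest_raw = SparkSubmitOperator(
        task_id="spark_ingest_raw",
        application="/opt/jobs/orders-ingestion-assembly.jar",
        java_class="com.example.ingestion.IngestOrders",
        conn_id="spark_default",
    )

    # Run a Snowflake query that uses an analytical (window) function to keep the
    # latest record per key; raw.orders is a placeholder table.
    dedupe_check = SnowflakeOperator(
        task_id="snowflake_latest_per_order",
        snowflake_conn_id="snowflake_default",
        sql="""
            SELECT order_id, status, updated_at
            FROM raw.orders
            QUALIFY ROW_NUMBER() OVER (
                PARTITION BY order_id ORDER BY updated_at DESC
            ) = 1
        """,
    )

    ingest_raw >> dedupe_check

The orchestration layer declares the schedule and the dependency (ingest before the Snowflake check), which is the kind of responsibility the orchestration-tools bullet above refers to.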