

Sr Data Engineer
Job Title: Data Engineer (Contract) - Job#5502
Worksite: Remote
The Data Engineer will be responsible for developing ETL and data pipelines using AWS, Snowflake, and dbt. The ideal candidate is an experienced data pipeline builder who works with an ELT methodology. The selected candidate will be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. (A brief illustrative sketch of this kind of pipeline work follows the responsibilities below.)
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Cloud Integration ETL Tools, Cloud Data Warehouse, SQL, and AWS technologies.
• Design and develop ELT, ETL, and event-driven data integration architecture solutions
• Work with the Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation and integration requirements
• Troubleshoot and tune complex SQL
• Utilize On-Prem and Cloud-based ETL platforms, Cloud Data Warehouses, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.
• Develop data validation processes to ensure data quality
• Demonstrated ability to work both independently and collaboratively as part of a team
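As a rough, non-authoritative illustration of the pipeline work described above, the Python sketch below copies raw files from an S3 stage into Snowflake, builds a cleaned table, and runs a simple data quality check. The connection settings, stage, and table names are hypothetical placeholders, and in practice the transformation and tests would more likely be expressed as dbt models.

    import os
    import snowflake.connector   # assumes the snowflake-connector-python package is installed

    def load_and_validate() -> None:
        # Connection details are hypothetical; a real job would pull them from a secrets manager.
        conn = snowflake.connector.connect(
            account=os.environ["SNOWFLAKE_ACCOUNT"],
            user=os.environ["SNOWFLAKE_USER"],
            password=os.environ["SNOWFLAKE_PASSWORD"],
            warehouse="TRANSFORM_WH",      # hypothetical warehouse
            database="ANALYTICS_DB",       # hypothetical database
            schema="RAW",
        )
        try:
            cur = conn.cursor()

            # Load: copy raw files from an external S3 stage into a landing table.
            cur.execute("""
                COPY INTO RAW.ORDERS
                FROM @RAW.S3_ORDERS_STAGE
                FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1)
            """)

            # Transform: build a cleaned staging table (in dbt this would be a model).
            cur.execute("""
                CREATE OR REPLACE TABLE STAGING.STG_ORDERS AS
                SELECT ORDER_ID, CUSTOMER_ID, ORDER_TS::TIMESTAMP AS ORDER_TS, AMOUNT
                FROM RAW.ORDERS
                WHERE ORDER_ID IS NOT NULL
            """)

            # Validate: fail the run if the business key contains duplicates.
            cur.execute(
                "SELECT COUNT(*) - COUNT(DISTINCT ORDER_ID) FROM STAGING.STG_ORDERS"
            )
            duplicates = cur.fetchone()[0]
            if duplicates:
                raise ValueError(f"Data quality check failed: {duplicates} duplicate ORDER_ID values")
        finally:
            conn.close()

    if __name__ == "__main__":
        load_and_validate()

In a dbt-based setup, the CREATE TABLE step would typically become a model and the duplicate check a schema test; the sketch only shows the shape of the load/transform/validate flow.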
Qualifications
• 8+ years of experience with Data Engineering, ETL, data warehouse/data mart development, and data lake development
• Advanced SQL knowledge and experience with relational databases, including query authoring and working familiarity with a variety of database platforms
• Cloud Data Warehouses like Snowflake, Google BigQuery, Amazon Redshift
• AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc. (an illustrative event-driven sketch follows this list)
• Cloud Integration Tools like Matillion, Dell Boomi, Informatica Cloud, Talend, AWS Glue
• GitHub and its integration with the ETL tools for version control
• Informatica PowerCenter, various scripting languages, SQL, querying tools
• Data management tools and platforms including Spark, Hadoop/Hive, NoSQL, APIs, streaming, and other analytic data platforms
• Experience with object-oriented/functional scripting languages such as Python, Java, Scala, etc. is a plus.
• Experience with Agile/Scrum is valuable.
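The qualifications above call out event-driven integration on AWS (see the EC2/S3/Lambda/SQS/SNS bullet). As a hedged sketch only, the Lambda handler below reacts to an S3 object-created event and publishes the file's location to an SNS topic so a downstream loader can pick it up; the topic ARN, environment variable name, and bucket layout are hypothetical.

    import json
    import os
    import boto3   # assumes the AWS SDK for Python available in the Lambda runtime

    sns = boto3.client("sns")
    # Hypothetical topic; in practice the ARN would come from configuration.
    TOPIC_ARN = os.environ.get("NEW_FILE_TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:new-raw-file")

    def handler(event, context):
        """Triggered by an S3 ObjectCreated event; announces each new file for downstream loading."""
        records = event.get("Records", [])
        for record in records:
            bucket = record["s3"]["bucket"]["name"]
            key = record["s3"]["object"]["key"]

            # Publish a small message that a downstream consumer (e.g. an SQS-fed
            # Snowflake loader) can use to decide what to ingest next.
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject="New raw file landed",
                Message=json.dumps({"bucket": bucket, "key": key}),
            )
        return {"statusCode": 200, "processed": len(records)}

This is one common pattern for decoupling file arrival from loading; an SQS queue subscribed to the topic would let the loader process files at its own pace.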
About Delphi-US
Delphi-US is a national recruiting firm based in Newport, Rhode Island. We specialize in IT, Engineering and Professional Staffing services for organizations from Main Street to Wall Street. Our mission is simple: to connect great people to great companies. We accomplish this with a proprietary skill-based and cultural matching process that results in higher-qualified submissions along with increased interview and offer rates. You’ll find our team is friendly, professional, and ready to advocate on your behalf, armed with industry trends and an understanding of employer expectations.