Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 4-month contract paying $76.39/hour. The position is remote and requires 8+ years of experience in Data Engineering, proficiency in AWS, Snowflake, SQL, and ELT methodologies, and a Bachelor's degree in a related field.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
Unknown
🗓️ - Date discovered
April 22, 2025
🕒 - Project duration
3 to 6 months
🏝️ - Location type
Remote
📄 - Contract type
W2 Contractor
🔒 - Security clearance
Unknown
📍 - Location detailed
Houston, TX
🧠 - Skills detailed
#Metadata #Informatica #S3 (Amazon Simple Storage Service) #Computer Science #Version Control #Scripting #Matillion #Databases #Cloud #SQL (Structured Query Language) #Hadoop #SNS (Simple Notification Service) #Data Integration #Data Mart #HTTP & HTTPS (Hypertext Transfer Protocol & Hypertext Transfer Protocol Secure) #Java #Informatica Cloud #Redshift #AWS (Amazon Web Services) #Boomi #SQS (Simple Queue Service) #Lambda (AWS Lambda) #Spark (Apache Spark) #EC2 #Data Architecture #Data Science #Data Analysis #Data Lake #dbt (data build tool) #Data Warehouse #Amazon Redshift #BigQuery #BI (Business Intelligence) #Scrum #ETL (Extract, Transform, Load) #Data Management #GitHub #Python #Talend #NoSQL #Agile #Informatica PowerCenter #Data Pipeline #Scala #Data Quality #Snowflake #AWS Glue #Data Engineering
Role description

   • Pay Rate: $76.39/hour (on W2)

   • Location: Houston, TX 77002 (Remote)

   • Work Hours: 9:00 AM - 5:00 PM

   • Contract Length: 4 months

Raise is currently hiring a contract team member on behalf of our client. They’re expanding their team to meet growing needs, making this a unique opportunity to work with an industry leader.

Overview

   • The Data Engineer will be responsible for developing ETL and data pipelines using AWS, Snowflake, and dbt.

   • The ideal candidate is an experienced data pipeline builder using ELT methodology.

   • Must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products.

Responsibilities

   • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Cloud Integration ETL Tools, Cloud Data Warehouse, SQL, and AWS technologies.

   • Design and develop ELT, ETL, Event-driven data integration architecture solutions.

   • Work with the Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation and integration requirements.

   • Troubleshoot and tune complex SQL.

   • Utilize on-prem and cloud-based ETL platforms, Cloud Data Warehouses, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.

   • Develop data validation processes to ensure data quality.

   • Work effectively both individually and as part of a collaborative team.
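As an illustration of the data-validation responsibility above, here is a minimal, hypothetical sketch (not part of the posting) of the kind of post-load quality check a pipeline might run; the column names and thresholds are invented for the example.

```python
# Hypothetical data-validation sketch: check a loaded batch for
# row-count and null-value issues before publishing it downstream.

def validate_load(rows, required_columns, min_rows=1):
    """Return a list of data-quality issues found in a loaded batch."""
    issues = []
    # Completeness check: did the load produce enough rows?
    if len(rows) < min_rows:
        issues.append(f"expected at least {min_rows} rows, got {len(rows)}")
    # Null check: every required column must be populated in every row.
    for i, row in enumerate(rows):
        for col in required_columns:
            if row.get(col) is None:
                issues.append(f"row {i}: missing value in '{col}'")
    return issues

# Example batch with one bad record (the column names are illustrative).
batch = [
    {"order_id": 1, "amount": 19.99},
    {"order_id": 2, "amount": None},
]
print(validate_load(batch, required_columns=["order_id", "amount"]))
```

In a real Snowflake/dbt stack, checks like these would more likely be expressed as dbt tests or warehouse-side SQL assertions; the sketch only shows the shape of the logic.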

Qualifications

   • Bachelor's degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.

   • 8+ years of experience with Data Engineering, ETL, data warehouse/data mart development, and data lake development.

   • Advanced SQL knowledge, including query authoring and experience with relational databases, as well as working familiarity with a variety of databases.

   • Experience working with Cloud Data Warehouses such as Snowflake, Google BigQuery, and Amazon Redshift.

   • Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.

   • Experience with Cloud Integration Tools such as Matillion, Dell Boomi, Informatica Cloud, Talend, and AWS Glue.

   • Experience with GitHub and its integration with the ETL tools for version control.

   • Experience with Informatica PowerCenter, various scripting languages, SQL, querying tools.

   • Familiarity with modern data management tools and platforms including Spark, Hadoop/Hive, NoSQL, APIs, Streaming, and other analytic data platforms.

   • Experience with object-oriented/functional scripting languages (Python, Java, Scala, etc.) is a plus.

   • Experience with Agile/Scrum is valuable.

   • Must speak English.

   • Ability to work with business as well as IT stakeholders.

   • Strong written and oral communication skills with the ability to work with users, peers, and management.

   • Strong interpersonal skills.

   • Ability to work independently and as part of a team to successfully execute projects.

   • Highly motivated, self-starter with problem-solving skills.

   • Ability to multitask and meet aggressive deadlines efficiently and effectively.

   • Extraordinary attention to detail and accuracy.

Looking for meaningful work? We can help

Raise is an established hiring firm with over 65 years of experience. We believe strongly in making the world a better place through work, which is why we’re a certified B Corporation and donate 10% of our profits to charity.

We strive to build teams that reflect the diversity of the communities we work in. We encourage all qualified applicants to apply, including people from traditionally underrepresented groups such as women, visible minorities, Indigenous peoples, people identifying as LGBTQ2SI, veterans, and people with visible/nonvisible disabilities.

We have a dedicated webpage for accommodations where you can learn more about what we offer, and request accommodation: https://raise.jobs/accommodations/

To submit candidates for roles, our clients will sometimes require personal information to confirm the identity of applicants and their legal status to work. Raise will never ask you for personal or banking information unless you have been selected for a job. If you are ever unsure about the legitimacy of this or another job posting by Raise (or have any other questions), please contact us at +1 800-567-9675 or hello@raiserecruiting.com



#USVM