

Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Data Engineer on an 8+ month contract in Houston, TX (Hybrid). Pay rate is $70-$72/hr. Requires 8+ years in data engineering, advanced SQL, AWS, and experience with ETL tools and Cloud Data Warehouses like Snowflake.
Country
United States
Currency
$ USD
Day rate
576
Date discovered
July 18, 2025
Project duration
More than 6 months
Location type
Hybrid
Contract type
W2 Contractor
Security clearance
Unknown
Location detailed
Houston, TX
Skills detailed
#S3 (Amazon Simple Storage Service) #Java #Boomi #Talend #Hadoop #Pandas #Metadata #Data Engineering #Cloud #Data Management #Lambda (AWS Lambda) #Data Mart #Version Control #Databases #Data Pipeline #NoSQL #Python #SQS (Simple Queue Service) #AWS (Amazon Web Services) #Agile #Informatica #SNS (Simple Notification Service) #Matillion #Scrum #Amazon Redshift #EC2 #NumPy #Snowflake #BigQuery #Spark (Apache Spark) #Data Integration #Scripting #Data Lake #Redshift #SQL (Structured Query Language) #Data Analysis #AWS Glue #Programming #Data Architecture #Computer Science #Data Science #GitHub #Data Quality #Informatica Cloud #Scala #SQL Queries #"ETL (Extract, Transform, Load)" #Data Warehouse #BI (Business Intelligence) #Automation
Role description
Akkodis is seeking a Sr. Data Engineer for an 8+ month contract position with our Direct Client located in Houston, TX (Hybrid, 4 days/week onsite).
Pay Range: $70-$72/hr on W2; the rate may be negotiable based on experience, education, geographic location, and other factors.
Job Description:
The ideal candidate for this role will be responsible for developing Python modules using NumPy, Pandas, and dynamic programming in AWS and Snowflake, and will expand and optimize our ETL and data pipeline architecture and data flow across our business portfolio. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our data product owners, developers, data architects, data analysts, and data scientists on BI and analytics initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems, and products. The right candidate will be excited by the prospect of optimizing or even re-designing our company's data architecture to support our next generation of products and data initiatives.
The candidate will also assist in issue resolution, job orchestration, automation, and continuous improvement of our data integration processes.
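The description above centers on Python modules that use NumPy and Pandas to move data through AWS into Snowflake. As a minimal, hedged sketch of that kind of pipeline step (not part of the posting), the Python below pulls a CSV extract from S3, reshapes it with Pandas/NumPy, and bulk-loads it into Snowflake; the bucket, key, credentials, column names, and table name are hypothetical placeholders.

    import io

    import boto3
    import numpy as np
    import pandas as pd
    import snowflake.connector
    from snowflake.connector.pandas_tools import write_pandas

    def extract_from_s3(bucket: str, key: str) -> pd.DataFrame:
        # Read one CSV object from S3 into a DataFrame.
        obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
        return pd.read_csv(io.BytesIO(obj["Body"].read()))

    def transform(df: pd.DataFrame) -> pd.DataFrame:
        # Example cleanup: normalize column names, derive a rounded numeric
        # field, and drop rows missing the business key (columns are illustrative).
        df = df.rename(columns=str.upper)
        df["AMOUNT_USD"] = np.round(df["AMOUNT"].astype(float), 2)
        return df.dropna(subset=["ORDER_ID"])

    def load_to_snowflake(df: pd.DataFrame, table: str) -> int:
        # write_pandas bulk-loads the DataFrame into an existing Snowflake table.
        conn = snowflake.connector.connect(
            account="my_account",       # placeholder
            user="etl_service_user",    # placeholder
            password="***",             # use a secrets manager in practice
            warehouse="ETL_WH",
            database="ANALYTICS",
            schema="STAGING",
        )
        try:
            _, _, nrows, _ = write_pandas(conn, df, table)
            return nrows
        finally:
            conn.close()

    if __name__ == "__main__":
        frame = transform(extract_from_s3("example-data-lake", "orders/2025/07/orders.csv"))
        print(f"Loaded {load_to_snowflake(frame, 'ORDERS_STG')} rows")

In practice, steps like these would be scheduled and monitored by an orchestration layer or one of the cloud ETL platforms named in the qualifications below.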
Key Responsibilities:
• Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Cloud Integration ETL Tools, Cloud Data Warehouse, SQL, and AWS.
• Design and develop ELT, ETL, and event-driven data integration architecture solutions.
• Work with Data Analysts, Data Architects, BI Architects, Data Scientists, and Data Product Owners to establish an understanding of source data and determine data transformation/integration requirements.
• Troubleshoot and tune complex SQL queries.
• Utilize On-Prem and Cloud-based ETL platforms, Cloud Data Warehouse, AWS, GitHub, various scripting languages, SQL, querying tools, data quality tools, and metadata management tools.
• Develop data validation processes to ensure data quality (a minimal sketch follows this list).
• Demonstrated ability to work individually and as part of a team in a collaborative manner.
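As a hedged illustration of the data validation responsibility above (not a prescribed implementation), the short Python sketch below runs a few basic quality checks against a Pandas DataFrame before it is loaded; the column names and sample data are made up.

    import pandas as pd

    def validate(df: pd.DataFrame, key_column: str = "ORDER_ID") -> list[str]:
        # Return human-readable data quality failures; an empty list means the
        # extract passed every check.
        failures = []
        if df.empty:
            failures.append("extract is empty")
        if df[key_column].isna().any():
            failures.append(f"{key_column} contains nulls")
        if df[key_column].duplicated().any():
            failures.append(f"{key_column} contains duplicate keys")
        return failures

    # Illustrative usage with made-up data: the duplicated key is reported.
    issues = validate(pd.DataFrame({"ORDER_ID": [1, 2, 2], "AMOUNT": [9.5, None, 3.0]}))
    print(issues or "all checks passed")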
Preferred Skills & Qualifications:
• Bachelor's degree (or foreign equivalent) in Computer Science, Computer Engineering, or a related field.
• 8+ years' experience with data engineering, ETL, data warehouse, data mart, and data lake development.
• Advanced working knowledge of SQL, experience with relational databases and query authoring, and familiarity with a variety of databases.
• Experience working with Cloud Data Warehouses like Snowflake, Google BigQuery, and Amazon Redshift.
• Experience with AWS cloud services: EC2, S3, Lambda, SQS, SNS, etc.
• Experience with Cloud Integration Tools like Matillion, Dell Boomi, Informatica Cloud, Talend, and AWS Glue.
• Experience with GitHub and its integration with ETL tools for version control.
• Familiarity with modern data management tools and platforms including Spark, Hadoop/Hive, NoSQL, APIs, streaming, and other analytic data platforms.
• Experience with object-oriented/functional scripting languages (Python, Java, Scala, etc.) is a plus.
• Experience with Agile/Scrum methodologies is valuable.
Other Skills and Abilities:
• Must speak English.
• Ability to work with business as well as IT stakeholders.
• Strong written and oral communication skills with the ability to work with users, peers, and management.
• Strong interpersonal skills.
• Ability to work independently and as part of a team to successfully execute projects.
• Highly motivated self-starter with problem-solving skills.
• Ability to multitask and meet aggressive deadlines efficiently and effectively.
• Extraordinary attention to detail and accuracy.
If you are interested in this role, then please click APPLY NOW. For other opportunities available at Akkodis, or any questions, please contact Ayush Garg at 610-735-6513 or ayush.garg@akkodisgroup.com.
Equal Opportunity Employer/Veterans/Disabled
Benefit offerings include medical, dental, vision, term life insurance, short-term disability insurance, additional voluntary benefits, commuter benefits, and a 401K plan. Our program provides employees the flexibility to choose the type of coverage that meets their individual needs. Available paid leave may include Paid Sick Leave, where required by law; any other paid leave required by Federal, State, or local law; and Holiday pay upon meeting eligibility criteria. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs which are direct hire to a client.
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.
The Company will consider qualified applicants with arrest and conviction records.