

Akkodis
Data Engineer
Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer in San Antonio, TX, offering a 9-month contract at $42-$44/hr. Key skills include ETL, Informatica, Mulesoft, Oracle PL/SQL, Apache Spark, and Python. A bachelor's degree in Computer Science or related field is required.
Country: United States
Currency: $ USD
Day rate: 352
Date: April 21, 2026
Duration: More than 6 months
Location: On-site
Contract: 1099 Contractor
Security: Yes
Location detailed: San Antonio, TX
Skills detailed: #Monitoring #Cloud #HBase #Pandas #Hadoop #Spark (Apache Spark) #NumPy #Apache Spark #EDW (Enterprise Data Warehouse) #Python #JSON (JavaScript Object Notation) #Web Services #NiFi (Apache NiFi) #Informatica #Security #ETL (Extract, Transform, Load) #Data Warehouse #Data Integration #Databases #Data Architecture #DevOps #TensorFlow #Data Engineering #MySQL #Big Data #Datasets #Data Science #API (Application Programming Interface) #Computer Science #Informatica PowerCenter #Libraries #SQL (Structured Query Language) #Data Quality #BI (Business Intelligence) #Automation #XML (eXtensible Markup Language) #Oracle #Data Analysis #Kafka (Apache Kafka) #MongoDB
Role description
Akkodis is seeking a Data Engineer for a client in San Antonio, TX 78215 (on-site) for a 9-month contract.
Title: Data Engineer
Location: San Antonio, TX 78215 (On-site)
Duration: 9-month contract
Pay Rate: $42-$44/hr (The rate may be negotiable based on experience, education, geographic location, and other factors.)
Position Summary
Develops and automates computing processes to detect, predict, and respond to opportunities in business operations. Works with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Implements true business integration by applying data integration best practices, merging and securing data in a way that reduces maintenance cost and increases the utilization of enterprise-wide data as an asset, and develops business intelligence.
Tasks and Responsibilities
• Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
• Provide daily production support for the Enterprise Data Warehouse, including jobs in Informatica/MuleSoft and Oracle PL/SQL, with the flexibility to manage high-severity incident and problem resolution.
• Participate in troubleshooting and resolving data integration issues, such as data quality problems.
• Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
• Provide work estimates and communicate the status of assignments.
• Assist in QA efforts by providing input for test cases and supporting test case execution.
• Analyze transaction errors, troubleshoot software issues, develop bug fixes, and participate in performance tuning efforts.
• Develop Informatica/MuleSoft mappings and complex Oracle PL/SQL programs for the Data Warehouse.
• Select and use DevOps tools for continuous integration, builds, and monitoring of solutions.
• May provide input to the area budget.
• Make some independent decisions and recommendations that affect the section, department, and/or division.
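As a rough illustration of the data-quality troubleshooting mentioned above, here is a minimal Python sketch of a quality gate that splits an incoming batch into loadable rows and rejects. All field names and rules are hypothetical, not taken from the job description:

```python
# Hypothetical data-quality gate for an ETL load: every field name and
# validation rule below is illustrative, not from the actual system.

REQUIRED_FIELDS = ("meter_id", "reading_kwh", "read_date")

def validate(record: dict) -> list:
    """Return a list of data-quality problems found in one record."""
    problems = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            problems.append(f"missing {field}")
    kwh = record.get("reading_kwh")
    if kwh is not None and not isinstance(kwh, (int, float)):
        problems.append("reading_kwh is not numeric")
    return problems

def split_batch(records):
    """Split a batch into clean rows and (record, problems) rejects."""
    good, rejects = [], []
    for rec in records:
        problems = validate(rec)
        if problems:
            rejects.append((rec, problems))
        else:
            good.append(rec)
    return good, rejects
```

In practice this kind of check would run inside an Informatica or MuleSoft flow rather than as standalone Python; the sketch only shows the shape of the logic.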
Minimum Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field from an accredited university.
• Experience in a data integration role.
• Experience using Apache Spark, NiFi, and/or Kafka.
• Experience using Python.
• Experience integrating enterprise software using ETL modules.
• Knowledge of data architecture, structures, and principles, with the ability to critique data and system designs.
• Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications, using the Informatica and/or MuleSoft platform.
• Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
• Ability to integrate data from web services in XML, JSON, flat-file, and SOAP formats.
• Knowledge of the core concepts of RESTful API Modeling Language (RAML 1.0) and of designing MuleSoft solutions.
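The kind of web-service integration described above — normalizing XML and JSON payloads into one record shape before loading — can be sketched in a few lines of standard-library Python. The payload structure and field names here are made up for illustration:

```python
# Illustrative only: normalize hypothetical JSON and XML web-service
# responses into one common record shape before an ETL load.
import json
import xml.etree.ElementTree as ET

def from_json(payload: str) -> dict:
    """Extract a common record from a JSON web-service response."""
    data = json.loads(payload)
    return {"id": data["id"], "value": float(data["value"])}

def from_xml(payload: str) -> dict:
    """Extract the same record shape from an XML response."""
    root = ET.fromstring(payload)
    return {"id": root.findtext("id"), "value": float(root.findtext("value"))}
```

Real integrations on this stack would typically do the equivalent mapping inside MuleSoft (e.g. DataWeave) or Informatica rather than hand-rolled Python.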
Preferred Qualifications
• Relevant certifications
• Experience in API management
• Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
• Knowledge of Test-Driven Development (TDD)
• Familiarity with cloud-based architecture
• Experience with data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow)
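To give a flavor of the model prototyping mentioned in the last bullet, here is a toy ordinary-least-squares fit written with only the standard library (standing in for NumPy/Pandas/scikit-learn). The temperature and load figures are invented for the example:

```python
# Toy model prototype: ordinary least-squares fit y ≈ slope * x + intercept,
# using only the standard library. All data below is made up.
def fit_line(xs, ys):
    """Return (slope, intercept) of the least-squares line through the points."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (
        sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
        / sum((x - mean_x) ** 2 for x in xs)
    )
    intercept = mean_y - slope * mean_x
    return slope, intercept

# Hypothetical daily temperature (°F) vs. electric load (MW).
temps = [70, 80, 90, 100]
loads = [500, 600, 700, 800]
slope, intercept = fit_line(temps, loads)
```

A real prototype on this role's stack would use NumPy/Pandas or Spark DataFrames for the same calculation at scale; the hand-rolled version just shows the idea.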
Equal Opportunity Employer/Veterans/Disabled
Benefit offerings available for our associates include medical, dental, vision, life insurance, short-term disability, additional voluntary benefits, an EAP program, commuter benefits, and a 401K plan. Our benefit offerings provide employees the flexibility to choose the type of coverage that meets their individual needs. In addition, our associates may be eligible for paid leave including Paid Sick Leave or any other paid leave required by Federal, State, or local law, as well as Holiday pay where applicable. Disclaimer: These benefit offerings do not apply to client-recruited jobs and jobs that are direct hires to a client.
To read our Candidate Privacy Information Statement, which explains how we will use your information, please visit https://www.akkodis.com/en/privacy-policy.
The Company will consider qualified applicants with arrest and conviction records in accordance with federal, state, and local laws and/or security clearance requirements, including, as applicable:
• The California Fair Chance Act
• Los Angeles City Fair Chance Ordinance
• Los Angeles County Fair Chance Ordinance for Employers
• San Francisco Fair Chance Ordinance






