Jobs via Dice

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Integration Developer in San Antonio, TX, for a 7-month contract at $40-$43 per hour. Key skills include ETL solutions with Informatica/Mulesoft, Apache Spark, Python, and data architecture experience. Bachelor's degree required.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
344
-
🗓️ - Date
April 21, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
San Antonio, TX
-
🧠 - Skills detailed
#Monitoring #Cloud #HBase #Pandas #Hadoop #Spark (Apache Spark) #NumPy #Apache Spark #EDW (Enterprise Data Warehouse) #Python #JSON (JavaScript Object Notation) #Web Services #NiFi (Apache NiFi) #Informatica #ETL (Extract, Transform, Load) #Data Warehouse #Data Integration #Databases #Data Architecture #DevOps #TensorFlow #Data Engineering #MySQL #Big Data #Datasets #Data Science #API (Application Programming Interface) #Computer Science #Informatica PowerCenter #Libraries #SQL (Structured Query Language) #Data Quality #BI (Business Intelligence) #XML (eXtensible Markup Language) #Oracle #Data Analysis #Kafka (Apache Kafka) #MongoDB
Role description
job summary:

Role Overview: Data Integration Developer
Location: San Antonio, TX (On-site)
Duration: 7-Month Project

Are you ready to transform complex data into actionable business intelligence? We are looking for a skilled Data Integration Developer to join our team in San Antonio for a high-impact, 7-month project. In this role, you will bridge the gap between disparate datasets, ranging from weather and grid infrastructure to market operations, to build automated solutions that predict and respond to real-time business needs.

What You'll Do
• Architect & Automate: Design, develop, and test robust ETL/data integration solutions using Informatica, Mulesoft, and Oracle PL/SQL.
• Innovate with Big Data: Leverage technologies like Apache Spark, NiFi, Kafka, and Python to manage large-scale data flows and secure enterprise-wide data assets.
• Solve Complex Problems: Troubleshoot data quality issues, fix bugs, and tune performance to ensure high-speed, high-quality application delivery.
• Modernize Operations: Use DevOps tools for continuous integration and monitoring while working in a collaborative, fast-paced environment.

location: San Antonio, Texas
job type: Contract
salary: $40 - 43 per hour
work hours: 8am to 5pm
education: Bachelor's

responsibilities:
• Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
• Provide daily production support for the Enterprise Data Warehouse, including jobs in Informatica/Mulesoft and Oracle PL/SQL, and be flexible in managing high-severity incident/problem resolution.
• Participate in troubleshooting and resolving data integration issues such as data quality.
• Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
• Provide work estimates and communicate the status of assignments.
• Assist in QA efforts by providing input for test cases and supporting test case execution.
• Analyze transaction errors, troubleshoot issues in the software, develop bug fixes, and contribute to performance-tuning efforts.
• Develop Informatica/Mulesoft mappings and complex Oracle PL/SQL programs for the Data Warehouse.
• Select and use DevOps tools for continuous integration, builds, and monitoring of solutions.
• May provide input to the area budget.
• Make some independent decisions and recommendations that affect the section, department, and/or division.
• Perform other duties as assigned.

qualifications:
• Bachelor's degree in Computer Science, Engineering, or a related field from an accredited university.
• Experience in a data integration role.
• Experience using Apache Spark, NiFi, and/or Kafka.
• Experience using Python.
• Experience integrating enterprise software using ETL modules.
• Knowledge of data architecture, structures, and principles, with the ability to critique data and system designs.
• Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications using the Informatica and/or Mulesoft platform.
• Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
• Ability to integrate data from web services in XML, JSON, flat file, and SOAP formats.
• Knowledge of the core concepts of RESTful API Modeling Language (RAML 1.0) and designing with MuleSoft solutions.

Preferred Qualifications
• Relevant certifications
• Experience in API management
• Proficiency with the following databases/technologies: Mulesoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL
• Knowledge of Test-Driven Development (TDD)
• Familiarity with cloud-based architecture
• Experience with data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow)
• Experience in a technology organization

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact

Pay offered to a successful candidate will be based on several factors, including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including: medical, prescription, dental, vision, AD&D, and life insurance offerings, short-term disability, and a 401K plan (all benefits are based on eligibility).

This posting is open for thirty (30) days.
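For candidates unfamiliar with the ETL pattern the role centers on, a minimal sketch in plain Python illustrates the extract-transform-load flow with a data-quality check. Everything here is hypothetical: the field names, the JSON feed, and the in-memory "warehouse" list are invented for illustration, and the actual role builds such pipelines in Informatica/Mulesoft and Oracle PL/SQL rather than hand-rolled code.

```python
# Hypothetical ETL sketch: extract JSON records, validate/transform them,
# and load the clean rows into a target table (a plain list stands in here).
import json

# Invented sample feed; one record has a bad reading to show rejection handling.
RAW_FEED = """
[{"meter_id": "M-001", "reading_kwh": "12.5", "ts": "2026-04-21T08:00:00"},
 {"meter_id": "M-002", "reading_kwh": "bad",  "ts": "2026-04-21T08:00:00"},
 {"meter_id": "M-003", "reading_kwh": "7.25", "ts": "2026-04-21T08:00:00"}]
"""

def extract(feed: str) -> list[dict]:
    """Parse the raw JSON payload into Python records."""
    return json.loads(feed)

def transform(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Cast fields and split out rows that fail the data-quality check."""
    clean, rejected = [], []
    for rec in records:
        try:
            clean.append({"meter_id": rec["meter_id"],
                          "reading_kwh": float(rec["reading_kwh"]),
                          "ts": rec["ts"]})
        except (KeyError, ValueError):
            rejected.append(rec)  # quarantine bad rows instead of failing the job
    return clean, rejected

def load(rows: list[dict], warehouse: list[dict]) -> None:
    """Append validated rows to the target table."""
    warehouse.extend(rows)

warehouse: list[dict] = []
clean, rejected = transform(extract(RAW_FEED))
load(clean, warehouse)
print(len(warehouse), len(rejected))  # → 2 1
```

In a production pipeline the quarantined rows would feed the troubleshooting and data-quality work the responsibilities describe, rather than being silently dropped.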