

HireTalent - Diversity Staffing & Recruiting Firm
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer in San Antonio, TX, for 8 months at $44.67/hr on W2. Requires strong Python, ETL/data integration experience, and familiarity with Informatica, MuleSoft, and big data tools. Relevant certifications preferred.
Country
United States
Currency
$ USD
-
Day rate
352
-
Date
April 22, 2026
Duration
More than 6 months
-
Location
On-site
-
Contract
W2 Contractor
-
Security
Unknown
-
Location detailed
San Antonio, TX
-
Skills detailed
#Data Analysis #MySQL #Data Warehouse #"ETL (Extract, Transform, Load)" #Computer Science #Monitoring #NumPy #SQL (Structured Query Language) #Informatica PowerCenter #Cloud #Data Modeling #Data Engineering #Kafka (Apache Kafka) #TensorFlow #REST API #Pandas #NiFi (Apache NiFi) #Informatica #Oracle #REST (Representational State Transfer) #MongoDB #Security #Data Integration #DevOps #Python #EDW (Enterprise Data Warehouse) #Data Architecture #Big Data #XML (eXtensible Markup Language) #Hadoop #Spark (Apache Spark) #Data Quality #API (Application Programming Interface) #Datasets #JSON (JavaScript Object Notation)
Role description
Job Title: Data Engineer - W2 only, no C2C (Green Card holders & US Citizens)
Location - San Antonio, TX
Duration - 8 months
Pay - $44.67/hr on W2
Position Summary
Develop and automate data-driven solutions to support business operations, enabling prediction, issue detection, and decision-making. Work with large, diverse datasets across domains (infrastructure, market, customer, security) while improving data integration, accessibility, security, and cost efficiency.
Key Responsibilities
• Design, develop, and maintain ETL/data integration solutions
• Provide production support for the Enterprise Data Warehouse (Informatica, MuleSoft, Oracle PL/SQL)
• Troubleshoot data integration and data quality issues
• Perform bug fixes, root cause analysis, and performance tuning
• Build Informatica/MuleSoft mappings and complex PL/SQL programs
• Support QA (test case design & execution)
• Deliver solutions within timelines and provide status updates
• Use DevOps tools for CI/CD, builds, and monitoring
• Handle high-severity incidents and system issues
• Contribute to technical decisions and improvements
Minimum Qualifications
• Bachelor's degree in Computer Science/Engineering or a related field
• Experience in ETL/data integration
• Strong Python skills
• Experience with Spark, NiFi, or Kafka
• Knowledge of data architecture and data modeling
• Experience with Informatica and/or MuleSoft
• Exposure to big data tools (Hadoop, Spark, Hive, MongoDB, etc.)
• Experience with XML, JSON, SOAP, and flat-file integrations
• Understanding of REST APIs and RAML
Preferred Qualifications
• Relevant certifications
• Experience in API management
• Hands-on experience with MuleSoft Anypoint, Informatica PowerCenter, Oracle, PL/SQL, and MySQL
• Knowledge of TDD and cloud architecture
• Experience with data analysis tools (NumPy, Pandas, TensorFlow, etc.)
• Experience in tech-driven environments
Core Competencies
• Ownership & initiative
• Strong communication
• Technical expertise
• Results-oriented mindset






