Primary Talent Partners

Data Engineer 2

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer 2 on an 8-month contract in San Antonio, TX, paying $40.00 - $45.00/hr. Key skills include advanced Python, SQL, ETL, and experience with Hadoop, Spark, and Mulesoft. A Bachelor's degree is required.
🌎 - Country
United States
πŸ’± - Currency
$ USD
-
πŸ’° - Day rate
360
-
πŸ—“οΈ - Date
April 17, 2026
πŸ•’ - Duration
More than 6 months
-
🏝️ - Location
On-site
-
πŸ“„ - Contract
W2 Contractor
-
πŸ”’ - Security
Unknown
-
πŸ“ - Location detailed
San Antonio, TX
-
🧠 - Skills detailed
#Pandas #Libraries #Web Services #XML (eXtensible Markup Language) #Datasets #Informatica #HBase #ETL (Extract, Transform, Load) #Computer Science #Data Engineering #MySQL #API (Application Programming Interface) #Cloud #Apache Spark #Automation #DevOps #EDW (Enterprise Data Warehouse) #Security #TensorFlow #MongoDB #Spark (Apache Spark) #Data Science #BI (Business Intelligence) #Big Data #Data Warehouse #SQL (Structured Query Language) #Databases #Informatica PowerCenter #JSON (JavaScript Object Notation) #Data Architecture #Alteryx #Data Quality #Python #Monitoring #Data Analysis #Oracle #Kafka (Apache Kafka) #Hadoop #Data Integration #NumPy #NiFi (Apache NiFi)
Role description
Primary Talent Partners has a new contract opening for a Data Engineer 2 with our energy and utilities client in San Antonio, TX. This is an 8-month contract with potential for extension. Pay: $40.00 - $45.00/hr; W2 contract, no PTO, no benefits. An ACA-compliant supplemental package is available for enrollment. Candidates must be legally authorized to work in the United States and must be able to sit on Primary Talent Partners' W2 without sponsorship.

Description:
Provide the development and automation of computing processes to detect, predict, and respond to opportunities in business operations. Work with a variety of disparate datasets spanning many disciplines and business units, including weather, transmission and distribution grid infrastructure, power generation, gas delivery, commercial market operations, safety and security, and customer engagement. Strive to implement true business integration by leveraging data integration best practices: merging and securing data in a way that reduces maintenance costs, increasing the utilization of enterprise-wide data as an asset, and developing business intelligence.

Tasks and Responsibilities
• Design, develop, and unit test new or existing ETL/data integration solutions to meet business requirements.
• Provide daily production support for the Enterprise Data Warehouse, including jobs in Informatica/MuleSoft and Oracle PL/SQL, and be flexible to manage high-severity incident/problem resolution.
• Participate in troubleshooting and resolving data integration issues such as data quality.
• Deliver increased productivity and effectiveness through rapid delivery of high-quality applications.
• Provide work estimates and communicate the status of assignments.
• Assist in QA efforts by providing input for test cases and supporting test case execution.
• Analyze transaction errors, troubleshoot issues in the software, develop bug fixes, and participate in performance tuning efforts.
• Develop Informatica/MuleSoft mappings and complex Oracle PL/SQL programs for the Data Warehouse.
• Select and use DevOps tools for continuous integration, builds, and monitoring of solutions.
• May provide input to the area budget.
• Make some independent decisions and recommendations that affect the section, department, and/or division.
• Perform other duties as assigned.

Minimum Qualifications
• Bachelor's degree in Computer Science, Engineering, or a related field from an accredited university.
• Administration of Hadoop platforms and solutions.
• Advanced Python skills.
• Alteryx, API, and SQL experience.
• Experience in a data integration role.
• Experience using Apache Spark, NiFi, and/or Kafka.
• Experience integrating enterprise software using ETL modules.
• Knowledge of data architecture, structures, and principles, with the ability to critique data and system designs.
• Ability to design, create, and/or modify data processes that meet key timelines while conforming to predefined specifications using the Informatica and/or MuleSoft platform.
• Understanding of big data technologies and platforms (Hadoop, Spark, MapReduce, Hive, HBase, MongoDB).
• Ability to integrate data from web services in XML, JSON, flat file, and SOAP formats.
• Knowledge of core concepts of RESTful API Modeling Language (RAML 1.0) and designing with MuleSoft solutions.

Preferred Qualifications
• Relevant certifications.
• Experience in API management.
• Proficiency with the following databases/technologies: MuleSoft Anypoint Studio, Informatica PowerCenter, Oracle RDBMS, PL/SQL, MySQL.
• Knowledge of Test-Driven Development (TDD).
• Familiarity with cloud-based architecture.
• Experience with data analysis and model prototyping using Spark/Python/SQL and common data science tools and libraries (e.g., NumPy, Pandas, scikit-learn, TensorFlow).
• Experience in a technology organization.

Primary Talent Partners is an Equal Opportunity / Affirmative Action employer committed to diversity in the workplace. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, age, national origin, disability, protected veteran status, gender identity, or any other factor protected by applicable federal, state, or local laws.

If you are a person with a disability needing assistance with the application or at any point in the hiring process, please contact us at info@primarytalentpartners.com

#PTPJobs