PTR Global

Senior Pipeline Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Pipeline Engineer in Plano, TX, on a contract-to-hire arrangement at $70/hr on W2, converting to a $125K base salary. It requires 3+ years in software engineering, AWS, SQL, Java/Python, and strong Agile experience. Financial services knowledge is preferred.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
560
🗓️ - Date
October 14, 2025
🕒 - Duration
Unknown
🏝️ - Location
On-site
📄 - Contract
W2 Contractor
🔒 - Security
Unknown
📍 - Location detailed
Plano, TX
🧠 - Skills detailed
#Data Analysis #SQL (Structured Query Language) #Debugging #Java #Storage #Data Pipeline #Scala #Athena #ML (Machine Learning) #S3 (Amazon Simple Storage Service) #Base #Security #Cloud #AI (Artificial Intelligence) #Snowflake #Airflow #Lambda (AWS Lambda) #Data Security #AWS (Amazon Web Services) #Python #Database Querying #JSON (JavaScript Object Notation) #Automation #MySQL #Data Lake #Agile #Data Framework #Oracle #IAM (Identity and Access Management) #Programming #"ETL (Extract, Transform, Load)"
Role description
Senior Pipeline Engineer
Onsite role in Plano, TX
Contract to hire: open to USC/GC; the client is not offering sponsorship.
$70/hr on W2; $125K base upon conversion.

The primary focus is on enhancing, building, and delivering secure, stable, and scalable data collection, storage, access, and analytics solutions for card data. The engineer will work in an agile environment, tackling complex challenges across multiple data pipelines, architectures, and consumers. They are expected to act as a firm-wide resource, an Agile leader, and a technical coach, and to contribute to cutting-edge technology trends. Key responsibilities include designing, developing, and troubleshooting software solutions, creating high-quality production code, producing architectural artifacts, managing data collection and analytics, and designing data pipelines and database strategies.

Must-Have Skills:
• Experience: 3+ years of applied experience with software engineering concepts.
• Cloud Technologies: AWS cloud experience, including S3.
• Database: Experience with SQL-based technologies (e.g., MySQL, Oracle DB).
• Programming Languages: Proficiency in Java or Python.
• Development & Debugging: Experience developing, debugging, and maintaining code in a large corporate environment using modern programming and database querying languages.
• Data Analysis: Experience with statistical data analysis and with selecting appropriate tools and data patterns.
• SDLC: Proficiency in all aspects of the Software Development Life Cycle.
• Agile Methodologies: Solid understanding of CI/CD, application resiliency, and security.
• Automation: Proficiency in automation and continuous delivery methods.
• Technical Discipline Knowledge: Demonstrated knowledge of software applications and technical processes within a discipline (e.g., cloud, AI, ML, mobile).

Desired Skills:
• Data Warehousing: Snowflake knowledge or experience.
• Financial Services: In-depth knowledge of the financial services industry and its IT systems.
• Data Platforms: Experience building data lakes, data platforms, and data frameworks, and/or designing Data-as-a-Service APIs.
• Advanced AWS Cloud Implementation:
  • AWS Data Services: Proficiency in Lake Formation; Glue ETL or EMR; S3; Glue Catalog; Athena; Kinesis or MSK; and Airflow or Lambda + Step Functions + EventBridge.
  • Data De/Serialization: Expertise in at least two of the following formats: Parquet, Iceberg, Avro, JSON-LD.
  • AWS Data Security: Good understanding of security concepts such as Lake Formation permissions, IAM, service roles, encryption, KMS, and Secrets Manager.