

Catapult Solutions Group
Data Platform Developer
⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Platform Developer on a long-term contract in Austin, TX, offering $50-55/hr. Requires 5+ years in data pipelines, strong BigQuery experience, proficiency in Python/JavaScript, and familiarity with automation tools.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
440
-
🗓️ - Date
April 4, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Austin, TX
-
🧠 - Skills detailed
#BigQuery #Tableau #Data Integrity #JavaScript #Python #Cloud #Data Analysis #Monitoring #Data Engineering #Scripting #Strategy #SaaS (Software as a Service) #Logging #CRM (Customer Relationship Management) #Looker #Data Pipeline #Observability #Scala #Automation #BI (Business Intelligence) #Data Warehouse #AI (Artificial Intelligence)
Role description
PLATFORM ENGINEER/SOFTWARE DEVELOPER
Data Infrastructure & Automation Systems
United States | Austin, TX Strongly Preferred
Must be a U.S. citizen or Green Card holder (role converts to permanent)
Looking for 2-8 years of experience with the skills below
Must be located near Austin, TX 78746 to work hybrid
Long-term contract-to-hire position
Starting pay between $50-55/hr
About the Role
Our client is hiring a Platform Engineer to design and operate internal systems that support data infrastructure, integrations, and automation across their operational stack. This role sits at the intersection of data engineering, platform architecture, and automation systems.
You will build integrations across multiple SaaS platforms, design scalable data pipelines, and develop automation frameworks that reduce manual operational work. Success in this role means data moves consistently between tools, workflows become automated, and teams can rely on the underlying systems to support day-to-day operations.
You will report to the Product Owner for platform systems and collaborate closely with analytics, strategy, and product teams. This is a U.S.-based role with a strong preference for engineers in Austin, TX. Remote candidates within the U.S. will be considered.
What You’ll Do
• Design and maintain integrations across operational tools including Accelo, BigQuery, Looker Studio, Notion/Podio, and automation platforms
• Build and maintain data pipelines that normalize operational, marketing, and reporting data into a centralized data warehouse
• Architect automation workflows that eliminate repetitive production, reporting, and campaign management tasks
• Develop internal platform utilities, APIs, and scripts that support reporting, analytics, and operational systems
• Ensure platform reliability through monitoring, logging, and data integrity across the stack
• Collaborate with analytics, strategy, and product teams to translate operational needs into scalable platform capabilities
• Continuously identify opportunities to replace manual workflows with reusable infrastructure and automation systems
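As a rough illustration of the normalization work described above, the sketch below maps records from two differently shaped SaaS exports onto one warehouse-ready schema. The payload shapes, field names, and `normalize_records` helper are assumptions for illustration only, not the client's actual schemas or the real Accelo/Podio APIs.

```python
from datetime import datetime, timezone

# Hypothetical raw exports: each tool names its fields differently
# (illustrative shapes, not the real Accelo/Podio API responses).
accelo_rows = [{"id": 101, "title": "Q2 campaign", "date_modified": "2026-04-01T12:00:00+00:00"}]
podio_rows = [{"item_id": 202, "name": "Landing page", "last_event_on": "2026-04-02T09:30:00+00:00"}]

def normalize_records(rows, source, id_key, name_key, ts_key):
    """Map one tool's export onto a shared warehouse schema."""
    out = []
    for row in rows:
        out.append({
            "source": source,
            "record_id": str(row[id_key]),
            "record_name": row[name_key],
            # Store timestamps as UTC ISO-8601 so downstream BI tools agree.
            "updated_at": datetime.fromisoformat(row[ts_key])
                          .astimezone(timezone.utc).isoformat(),
        })
    return out

unified = (normalize_records(accelo_rows, "accelo", "id", "title", "date_modified")
           + normalize_records(podio_rows, "podio", "item_id", "name", "last_event_on"))
```

From here, a loader such as the BigQuery client's `load_table_from_json` could append `unified` to a centralized warehouse table.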
Required Skills & Experience
• 5+ years building production-grade data pipelines, integrations, or internal platforms
• Strong experience working with cloud data warehouses (BigQuery strongly preferred)
• Experience integrating SaaS platforms via APIs and automation frameworks
• Proficiency in Python, JavaScript, or similar scripting languages
• Experience building or supporting BI and reporting pipelines (Looker, Looker Studio, Tableau, or similar)
• Experience designing scalable data models and data pipelines
• Strong systems-thinking mindset with the ability to architect durable infrastructure
• Experience working with automation tooling (Make, Zapier, n8n, or similar)
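Integrating SaaS platforms via APIs, as required above, usually means handling transient failures gracefully. Below is a minimal sketch of a retry-with-backoff wrapper; the function, parameters, and the simulated flaky endpoint are illustrative assumptions, not part of any tool listed in this posting.

```python
import time

def with_retries(fn, attempts=3, base_delay=0.01, retry_on=(ConnectionError,)):
    """Call fn(), retrying on transient errors with exponential backoff.

    attempts: total tries before giving up.
    base_delay: initial sleep in seconds, doubled after each failure.
    """
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise  # retries exhausted: surface the error to the caller
            time.sleep(base_delay * (2 ** attempt))

# Simulate a flaky SaaS endpoint that succeeds on the third call.
calls = {"n": 0}
def flaky_fetch():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient network error")
    return {"status": "ok"}

result = with_retries(flaky_fetch)
```

In a real integration, `fn` would wrap an HTTP call to the SaaS API, and `retry_on` would include the client library's timeout and rate-limit exceptions.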
Ideal Experience
Strong applicants typically have several of the following — candidates do not need all of them:
• Experience building internal developer platforms or operational systems
• Experience designing event-driven data pipelines or integration frameworks
• Experience integrating CRM, marketing, or operational SaaS systems
• Experience designing analytics-ready data models
• Experience implementing observability, logging, and monitoring systems
• Experience with modern automation platforms such as Make, n8n, or similar orchestration tools
AI-Native Tooling (Nice to Have)
Our client is actively experimenting with AI-assisted infrastructure and automation. Experience in any of the following is a plus:
• Building or integrating AI agents into operational workflows
• Working with MCP (Model Context Protocol) servers
• Developing AI-assisted automation pipelines
• Experience with agent orchestration frameworks
• Using AI tools to accelerate development, data analysis, or workflow automation
• Experience with LLM-enabled tooling inside internal platforms
What We Value in Engineers
• Think in systems, not scripts
• Prefer durable infrastructure over quick patches
• Translate messy operational problems into clean technical solutions
• Take ownership of the reliability and maintainability of the systems they build
• Communicate clearly with both technical and non-technical teammates
If you're in Austin and interested, apply today!