Palo Alto Networks

Data Visualization Engineer (Contract)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is a Data Visualization Engineer (Contract) for 12 months, offering $75-$80/hour, fully remote in the USA. Requires 3-5 years of Salesforce data architecture experience, proficiency in Tableau, and strong analytical skills. Bachelor’s degree or equivalent experience needed.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
640
-
🗓️ - Date
November 11, 2025
🕒 - Duration
More than 6 months
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Security #Spark (Apache Spark) #Data Architecture #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Data Quality #Data Engineering #SQL Queries #Cloud #Python #GCP (Google Cloud Platform) #Cybersecurity #API (Application Programming Interface) #Documentation #Dataflow #BI (Business Intelligence) #Data Pipeline #Visualization #Data Modeling #Tableau
Role description
Data Visualization Engineer (Contract)

This role is a contract assignment at Palo Alto Networks. Contractors will not be employed by Palo Alto Networks but through our trusted staffing partners. Palo Alto Networks is looking for a Data Visualization Engineer (Contractor) to work with the Marketplace Engagement Team. This is a great opportunity for a talented individual who wants to join a fast-growing company in a truly international environment.
• Location: USA (Remote)
• Duration: 12 months

As a Data Visualization Engineer (Contractor), you will directly and visibly contribute to our overall success and daily operations within the Marketplace Engagement Team and will report to the Sr. Director of Business Development. This role is pivotal to the success of the organization, is collaborative in nature, and will continue to flex and grow as the organization matures.

We are seeking a highly motivated Data Visualization Engineer to own the entire backend data pipeline for our Tableau reporting environment. This role requires deep technical expertise in Tableau's data layer and strong proficiency with Salesforce and other platforms as data sources. If you are a self-starter who thrives in fast-paced, high-growth environments with minimal supervision, this is the role for you.

Your Impact
• Data Modeling & Transformation: Architect and build optimized Tableau data sources by performing complex joins and strategic data blending across Salesforce (the primary source) and third-party platforms.
• Complex Logic & Cleansing: Write advanced calculations, including Level of Detail (LOD) expressions, and use functions to parse, clean, and standardize data from free-text fields (such as descriptions) into usable columns for analysis.
• Data Hygiene & Matching: Lead data quality initiatives, particularly for Marketing data, by developing fuzzy and exact matching logic to reconcile external partner-provided accounts with our internal Salesforce Account records.
• Strategic Oversight & Delivery:
• Gap Analysis & Resolution: Actively identify gaps in current reporting capabilities, clearly articulate data deficiencies to IT and stakeholders, and drive the necessary data or system changes to resolve these issues and deliver the required reports.
• Reporting & Delivery: Translate complex technical data into clear, unambiguous, and simple-to-use Tableau dashboards that deliver immediate, actionable insights to stakeholders.

Your Experience
• Advanced Salesforce Data Architecture: Expert knowledge (3-5 years) of Salesforce data structures, including managing complexities such as Account Hierarchies and cross-object relationships. Must understand Salesforce platform limitations (e.g., API limits, governor limits) and propose practical, high-performance alternative solutions.
• Experience working with SFDC data objects (Opportunity, Quote, Account, Subscription, Entitlement).
• Experience with BI tools and visualization platforms (e.g., Tableau).
• Strong analytical and problem-solving skills, with the ability to analyze complex data sets and derive actionable insights.
• Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
• Familiarity with cloud platforms such as Google Cloud Platform (GCP), and experience with relevant services (e.g., Dataflow, Dataproc, BigQuery, stored procedures, Cloud Composer).
• Self-Management: Must be a proactive self-starter capable of managing projects end-to-end, navigating technical challenges independently, and consistently meeting tight deadlines.
• Demonstrated readiness to leverage GenAI tools to enhance efficiency within the typical stages of the data engineering lifecycle (for example, generating complex SQL queries, creating initial Python/Spark script structures, or auto-generating pipeline documentation) is a nice-to-have.

Education & Compensation
Bachelor's degree or equivalent relevant work experience required.

The compensation offered for this position will depend on qualifications, experience, and work location. For candidates who receive an offer, this is the pay range that Magnit (the staffing agency) reasonably expects to pay for this position: $75/hour to $80/hour. Please note that the compensation information in this posting reflects the hourly wage only and does not include benefits. Magnit offers Medical, Dental, Vision, and 401K.

Information About Palo Alto Networks
At Palo Alto Networks®, everything starts and ends with our mission: being the cybersecurity partner of choice, protecting our digital way of life. We have the vision of a world where each day is safer and more secure than the one before. These aren't easy goals to accomplish, but we're not here for easy. We're here for better. We are a company built on the foundation of challenging and disrupting the way things are done, and we're looking for innovators who are as committed to shaping the future of cybersecurity as we are.
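For candidates wondering what the account-matching responsibility above involves in practice, here is a minimal, illustrative sketch (not part of the posting, and not Palo Alto Networks' actual implementation): exact matching on normalized names first, with a fuzzy fallback for near-misses. It uses only the Python standard library; the account names, suffix list, and similarity threshold are all hypothetical.

```python
# Hypothetical sketch: reconcile partner-provided account names against
# Salesforce Account records via exact-then-fuzzy matching (stdlib only).
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Lowercase a free-text account name and strip common legal suffixes."""
    suffixes = (", inc.", " inc.", " inc", " llc", " ltd", " corp.", " corp")
    n = name.strip().lower()
    for s in suffixes:
        if n.endswith(s):
            n = n[: -len(s)]
    return n.strip(" .,")

def match_account(partner_name, sfdc_accounts, threshold=0.85):
    """Return (matched SFDC account or None, match type)."""
    target = normalize(partner_name)
    # 1) Exact match on the normalized name.
    for acct in sfdc_accounts:
        if normalize(acct) == target:
            return acct, "exact"
    # 2) Fuzzy fallback: keep the highest similarity ratio above the threshold.
    best, best_score = None, 0.0
    for acct in sfdc_accounts:
        score = SequenceMatcher(None, target, normalize(acct)).ratio()
        if score > best_score:
            best, best_score = acct, score
    if best_score >= threshold:
        return best, "fuzzy"
    return None, "unmatched"

accounts = ["Acme Corp", "Globex Corporation", "Initech LLC"]
print(match_account("ACME Corp.", accounts))        # ('Acme Corp', 'exact')
print(match_account("Globex Corpration", accounts))  # ('Globex Corporation', 'fuzzy')
print(match_account("Wayne Enterprises", accounts))  # (None, 'unmatched')
```

In a real pipeline this logic would typically run against Salesforce exports or API pulls and feed cleaned match results into the Tableau data sources described above; a production system would also log unmatched records for manual review rather than silently dropping them.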