Falcon Smart IT

Sr. Data Engineer / Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. Data Engineer/Architect with a contract length of "unknown", offering a pay rate of "unknown", located onsite in Las Vegas. It requires 10+ years of experience; expertise in Python, Snowflake, and SQL; and strong data architecture skills in financial services.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
May 8, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Las Vegas, NV
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #Storage #ETL (Extract, Transform, Load) #Data Architecture #Data Processing #Data Lake #Cloud #Security #Spark (Apache Spark) #SQL (Structured Query Language) #Snowflake #Libraries #PySpark #Scala #Automation #Data Manipulation #Datasets #Data Security #SnowPipe #Python #Data Modeling #Clustering #Snowpark #Batch #Data Engineering #S3 (Amazon Simple Storage Service)
Role description
Job Title: Sr. Data Engineer / Architect
Location: Onsite - Las Vegas
Job Type: Contract
Experience Range: 10+ years
Must-Have Skills: Python, Snowflake, SQL
Primary Responsibilities:
• Data Architecture & Design: Define and implement enterprise-scale data architecture blueprints within the Snowflake ecosystem, ensuring alignment with financial services security and performance standards.
• Pipeline Engineering: Design, build, and maintain end-to-end ELT/ETL pipelines using Python (and PySpark) to automate the ingestion and transformation of large, complex financial datasets.
• Warehouse Optimization: Lead performance tuning efforts by leveraging Snowflake-specific features such as clustering keys, micro-partitioning, and query profile analysis to handle high-concurrency transaction volumes.
• Data Modernization: Partner with Risk Technology and other internal stakeholders to migrate legacy data process flows into modern, scalable cloud-native solutions.
• Security & Governance: Implement robust data security measures, including Role-Based Access Control (RBAC), dynamic data masking, and row-access policies to protect sensitive cardholder information.
• Workflow Automation: Orchestrate complex data flows using tools like Snowpipe, Streams, and Tasks to ensure real-time or micro-batch data availability.
Required Skills:
• Snowflake Mastery: Deep expertise in Snowflake's three-layer architecture (Storage, Compute, Services), Snowpark, Zero-Copy Cloning, and Time Travel.
• Expert Python: Advanced proficiency in Python for data processing, automation, and script development, with a strong grasp of data-centric libraries.
• Advanced SQL: Expert-level SQL skills for complex data manipulation, window functions, and advanced performance optimization.
• Data Modeling: Strong experience in Dimensional Modeling (Star and Snowflake schemas), specifically for large-scale enterprise environments.
• Cloud Infrastructure: Familiarity with AWS services (e.g., S3, Glue, EMR) as they integrate with the Snowflake data lake.
• Analytical Problem Solving: Ability to perform root cause analysis on data bottlenecks and translate business requirements into technical specs.
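To give candidates a concrete sense of the window-function SQL skills listed above, here is a minimal, self-contained sketch. It uses Python's standard-library sqlite3 purely as a stand-in for a Snowflake warehouse (the `txns` table, column names, and data are hypothetical, invented for illustration only; the same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` pattern works in Snowflake SQL):

```python
import sqlite3

# In-memory database standing in for a warehouse connection (illustrative only;
# requires the SQLite >= 3.25 bundled with modern Python for window functions).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, ts INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO txns VALUES (?, ?, ?)",
    [("A", 1, 100.0), ("A", 2, 50.0), ("B", 1, 200.0), ("B", 2, -25.0)],
)

# Running balance per account via a window function -- the kind of
# analytic SQL this role calls for when working with transaction data.
rows = conn.execute(
    """
    SELECT account, ts, amount,
           SUM(amount) OVER (PARTITION BY account ORDER BY ts) AS running_total
    FROM txns
    ORDER BY account, ts
    """
).fetchall()
for row in rows:
    print(row)
```

The `PARTITION BY account` clause restarts the running total for each account, while `ORDER BY ts` makes the sum cumulative over time within each partition.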