Brxio Inc

Data Warehouse Architect + Security Czar

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Warehouse Architect + Security Czar, a hybrid position in Seattle, WA, with a contract length of more than 6 months and a salary of $145,000 - $180,000 per year. Key skills include SQL, ETL, cloud services (AWS, Azure), and security frameworks.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
818
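A plausible reconstruction of this figure, assuming roughly 220 billable days per year (an assumption; the posting does not state how the rate was derived): $180,000 / 220 days ≈ $818 per day.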
🗓️ - Date
March 25, 2026
🕒 - Duration
More than 6 months
🏝️ - Location
Hybrid
📄 - Contract
Unknown
🔒 - Security
Unknown
📍 - Location detailed
Seattle, WA 98101
🧠 - Skills detailed
#ADLS (Azure Data Lake Storage) #Data Management #Java #Web Services #MongoDB #Azure ADLS (Azure Data Lake Storage) #Microsoft Azure #Data Lake #Scala #NoSQL #Oracle #SQL Server #Big Data #Agile #Linux #Data Modeling #Python #ADF (Azure Data Factory) #AWS (Amazon Web Services) #Data Integration #Databases #Compliance #Unix #Cloud #Data Architecture #Hadoop #ETL (Extract, Transform, Load) #Datasets #Spark (Apache Spark) #SQL (Structured Query Language) #Informatica #Database Design #Azure Data Factory #Azure #Data Warehouse #Automation #Bash #Security #Scripting #Programming #Storage
Role description
Overview

Ignite your career as a Data Warehouse Architect and Security Czar, where you will lead the design, development, and security of cutting-edge data warehouse solutions. This dynamic role combines technical expertise with strategic oversight, empowering organizations to harness the full potential of their data while safeguarding sensitive information. You'll be at the forefront of implementing innovative data architectures, ensuring robust security protocols, and driving data-driven decision-making across diverse business units. If you thrive in a fast-paced environment and are passionate about shaping the future of enterprise data management, this is your opportunity to make a significant impact.

Responsibilities

- Architect scalable, high-performance data warehouse solutions using technologies such as SQL Server, Oracle, Hadoop, Spark, and NoSQL databases to support complex analytics and reporting needs.
- Develop comprehensive data models and design efficient ETL (Extract, Transform, Load) processes using tools like Informatica and scripting languages such as Bash (Unix shell) to ensure seamless data integration (a generic sketch of this pattern appears at the end of this posting).
- Lead the implementation of big data solutions on cloud platforms including AWS and Azure, leveraging services such as Azure Data Factory and Azure Data Lake Storage for optimal performance.
- Establish and enforce security protocols for all data assets, including encryption standards, access controls, and compliance measures, serving as the organization's Security Czar.
- Collaborate within Agile teams to deliver iterative enhancements to the data architecture while maintaining alignment with business goals and technical standards.
- Integrate Linked Data principles and semantic web technologies to enable interconnected data ecosystems that enhance analytics capabilities.
- Monitor system performance, troubleshoot issues proactively, and optimize database design for efficiency and scalability.

Requirements

- Proven experience designing and implementing enterprise-level data warehouses using SQL Server, Oracle, Hadoop, or similar big data platforms.
- Strong expertise in data modeling, database design, and ETL development with tools such as Informatica or equivalent frameworks.
- Proficiency in programming languages including Java and Python for automation and advanced analytics tasks.
- Hands-on experience with cloud services such as AWS (Amazon Web Services) and Microsoft Azure, specifically Azure Data Lake, and familiarity with cloud security best practices.
- Deep understanding of NoSQL databases (e.g., MongoDB, Cassandra) for handling unstructured or semi-structured data sources.
- Knowledge of security frameworks related to data protection, encryption standards, access management, and compliance regulations.
- Familiarity with Agile methodologies to facilitate flexible project delivery cycles in a collaborative environment.
- Strong analytical skills, combined with experience in Hadoop ecosystem components such as Spark for processing large datasets efficiently.
- Ability to perform database design tasks using SQL and Bash scripting within Unix/Linux environments for automation purposes.
- Excellent communication skills to articulate complex technical concepts clearly across teams and stakeholders.

Embark on a journey where your expertise transforms how organizations leverage their most valuable asset, data, while ensuring its integrity and security are uncompromised!

Pay: $145,000.00 - $180,000.00 per year

Work Location: Hybrid remote in Seattle, WA 98101
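For candidates who want a concrete picture of the ETL pattern the Responsibilities section refers to, the sketch below shows the general shape of such a job. It is a minimal illustration only, not Brxio's actual pipeline: the orders table, its columns, and the CSV target are hypothetical, and Python's built-in sqlite3 and csv modules stand in for SQL Server and Azure Data Lake Storage.

```python
# Minimal, generic ETL sketch. NOT a real Brxio pipeline: table, columns,
# and file target are hypothetical; sqlite3/CSV stand in for the source
# warehouse (e.g., SQL Server) and the target zone (e.g., ADLS).
import csv
import sqlite3
from pathlib import Path


def extract(conn: sqlite3.Connection) -> list[tuple]:
    """Pull raw rows from the source system."""
    return conn.execute(
        "SELECT order_id, customer, amount_cents FROM orders"
    ).fetchall()


def transform(rows: list[tuple]) -> list[tuple]:
    """Apply simple business rules: normalize names, convert cents to
    dollars, and drop refund (negative-amount) rows."""
    return [
        (order_id, customer.strip().upper(), cents / 100)
        for order_id, customer, cents in rows
        if cents > 0
    ]


def load(rows: list[tuple], target: Path) -> None:
    """Write curated rows to the target zone."""
    with target.open("w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["order_id", "customer", "amount_usd"])
        writer.writerows(rows)


if __name__ == "__main__":
    # In-memory source database seeded with sample rows.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        "CREATE TABLE orders (order_id INTEGER, customer TEXT, amount_cents INTEGER)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, " acme ", 1999), (2, "globex", -500), (3, "initech", 120000)],
    )
    load(transform(extract(conn)), Path("curated_orders.csv"))
```

Running this produces curated_orders.csv with two rows; the negative-amount refund is filtered out by the transform step. A production version of this role's pipelines would swap in the real source and sink connectors (Informatica mappings, Azure Data Factory activities, or Bash-orchestrated jobs, per the posting) while keeping the same extract-transform-load structure.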