NITYA Software Solutions Inc

Azure Data Architect

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for an Azure Data Architect on a contract basis for an in-person position, paying $60.00 - $65.00 per hour. Key skills include Azure Data Lake, ETL processes, big data frameworks, and strong programming proficiency.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
520
-
🗓️ - Date
February 3, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Alpharetta, GA 30009
-
🧠 - Skills detailed
#MS SQL (Microsoft SQL Server) #Security #Java #Oracle #Data Warehouse #Agile #Data Architecture #Data Lake #ETL (Extract, Transform, Load) #Spark (Apache Spark) #Compliance #Microsoft Azure #Data Processing #Database Systems #Scala #Azure #Microsoft SQL #Data Modeling #Datasets #Data Integrity #Migration #SQL (Structured Query Language) #Data Framework #Microsoft SQL Server #Bash #Python #SQL Server #Unix #Big Data #AWS (Amazon Web Services) #Databases #Database Design #Hadoop #NoSQL #Programming #Cloud #Informatica #Scripting
Role description
Overview
We are seeking an experienced Azure Data Architect to lead the design, development, and implementation of scalable data solutions on the Microsoft Azure platform. The ideal candidate will have a strong background in big data technologies, data modeling, and cloud architecture, with a focus on building robust data warehouses and analytics solutions. This role offers an opportunity to work on cutting-edge data projects, leveraging a wide array of tools and technologies to enable data-driven decision-making across the organization.

Responsibilities
• Design and develop comprehensive data architectures using Azure Data Lake and other Azure cloud services to support enterprise analytics initiatives.
• Develop and optimize ETL processes using tools such as Informatica, SQL, and scripting languages like Bash (Unix shell).
• Implement scalable data models and schemas for data warehouses and big data environments, ensuring high performance and data integrity.
• Integrate diverse data sources, including NoSQL, Oracle, Microsoft SQL Server, and linked data, to create unified data platforms.
• Collaborate with cross-functional teams using Agile methodologies for iterative development and continuous improvement.
• Use frameworks such as Spark and Hadoop, along with analytics tools, to process large datasets efficiently.
• Ensure compliance with database design best practices and security standards across all data solutions.
• Provide technical guidance on cloud migration strategies involving AWS and Azure services.
• Stay current with emerging trends in big data, cloud computing, and analytics to recommend innovative solutions.

Skills
• Extensive experience with Azure Data Lake, Azure services, and cloud architecture design principles.
• Proficiency in programming languages including Java and Python, plus scripting with Bash (Unix shell).
• Strong knowledge of database systems such as Microsoft SQL Server and Oracle, and experience with NoSQL databases.
• Deep understanding of data modeling, ETL processes, and data warehousing concepts.
• Hands-on experience with big data frameworks like Spark, Hadoop, and related tools for large-scale data processing.
• Familiarity with integrating diverse datasets through linked data techniques.
• Knowledge of cloud platforms, including both the AWS and Microsoft Azure ecosystems.
• Experience working in an Agile development environment, emphasizing collaboration and iterative progress.
• Strong analytical skills, with the ability to translate complex business requirements into scalable technical solutions.

This position offers a dynamic environment where innovative thinking is valued, with opportunities to shape the future of enterprise data architecture through advanced cloud-based solutions.

Job Type: Contract
Pay: $60.00 - $65.00 per hour
Work Location: In person