Devout Corporation

Databricks Certified Data Engineer / AWS Solutions Architect

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Databricks Certified Data Engineer / AWS Solutions Architect, offering a hybrid contract in Washington, DC for 2-3 days on-site. Pay ranges from $70.00 to $85.00 per hour. Key skills include AWS, Databricks, ETL tools, and strong programming abilities.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
680
-
🗓️ - Date
March 10, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Hybrid
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Washington, DC 20004
-
🧠 - Skills detailed
#Data Quality #Datasets #EC2 #S3 (Amazon Simple Storage Service) #BI (Business Intelligence) #Shell Scripting #AWS (Amazon Web Services) #Cloud #HDFS (Hadoop Distributed File System) #Storage #Automation #Scripting #SQL Server #Big Data #Python #SQL (Structured Query Language) #ML (Machine Learning) #PySpark #Data Integration #Oracle #Agile #Data Lake #Data Pipeline #Data Warehouse #Database Design #Scala #Informatica #Databases #Hadoop #Data Architecture #Security #Talend #Bash #MS SQL (Microsoft SQL Server) #Redshift #Azure #Data Processing #Data Engineering #Programming #Data Framework #Lambda (AWS Lambda) #Looker #Databricks #ETL (Extract, Transform, Load) #Unix #Documentation #Spark (Apache Spark) #Apache Hive #Java #Data Storage
Role description
Job Overview

Devout, Inc. is seeking a highly skilled and motivated Databricks Certified Data Engineer / AWS Solutions Architect to join our dynamic data team. This role offers an exciting opportunity to design, develop, and optimize scalable data solutions leveraging cloud platforms and big data technologies. As a key player, you will architect robust data pipelines, implement innovative analytics solutions, and ensure seamless integration across diverse data sources. Your expertise will empower our organization to harness the full potential of data-driven insights, driving strategic decision-making and operational excellence.

Duties

• Design, build, and maintain scalable data pipelines using Databricks, Spark, and the Hadoop ecosystem to support complex analytics and machine learning models.
• Develop and optimize cloud-based data architectures on AWS, including services such as S3, Redshift, Lambda, and EC2, ensuring high availability and security.
• Collaborate with cross-functional teams to gather requirements and translate them into efficient ETL (Extract, Transform, Load) processes using tools such as Talend, Informatica, or custom scripting in Python, Bash, or shell.
• Implement database solutions with Microsoft SQL Server, Oracle, and Azure Data Lake for structured and unstructured data storage.
• Design and develop RESTful APIs for seamless data integration across platforms and applications.
• Analyze large datasets using SQL, Python, Spark, and other analytical tools to derive actionable insights.
• Support model training activities by preparing datasets and ensuring data quality for machine learning initiatives.
• Maintain comprehensive documentation of architecture designs, workflows, and best practices aligned with Agile development methodologies.

Requirements

• Proven certification as a Databricks Certified Data Engineer and AWS Solutions Architect, demonstrating expertise in cloud architecture and big data processing.
• Extensive experience with AWS cloud services, including S3, EC2, Lambda, Redshift, and related tools.
• Strong programming skills in Java, Python, or Bash/Unix shell scripting for automation and pipeline development.
• Deep understanding of big data frameworks, including Hadoop ecosystem components (HDFS, Apache Hive) and Spark (PySpark).
• Hands-on experience with ETL tools such as Talend or Informatica for efficient data integration workflows.
• Proficiency in database design principles for data warehouses using SQL Server or Oracle; familiarity with Looker or similar BI tools for reporting.
• Knowledge of Linked Data concepts for semantic web applications is a plus.
• Ability to analyze complex datasets with strong attention to detail; excellent problem-solving skills are essential.
• Experience working within Agile teams to deliver iterative solutions in fast-paced environments.
• Familiarity with analysis techniques involving model training and predictive analytics is advantageous.

Join us to leverage your expertise in cloud architecture and big data engineering! Be part of a forward-thinking organization committed to innovation through cutting-edge technology solutions that transform how we analyze and utilize data across industries.

US Citizen required; Public Trust clearance
Hybrid: 2-3 days on-site
Pay: $70.00 - $85.00 per hour
Work Location: Hybrid remote in Washington, DC 20006