

Smart IT Frame LLC
Data Engineer
⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6-10 years of experience, focusing on PySpark, AWS Lambda, and AWS Glue. It offers a remote contract with day shifts and requires strong problem-solving, communication skills, and a commitment to data security and compliance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 14, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
United States
-
🧠 - Skills detailed
#Documentation #Lambda (AWS Lambda) #AWS Glue #Data Engineering #Data Processing #Data Security #Spark (Apache Spark) #Data Architecture #PySpark #AWS (Amazon Web Services) #Storage #Compliance #Scala #AWS Lambda #Data Quality #Security
Role description
Job Title: Data Engineer
Remote
We are seeking an experienced Data Engineer with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in PySpark, AWS Lambda, and AWS Glue; the role follows day shifts.
Responsibilities
• Develop and implement scalable data architectures using PySpark, AWS Lambda, and AWS Glue to support business objectives and enhance data processing capabilities.
• Collaborate with cross-functional teams to understand data requirements and translate them into effective architectural solutions.
• Optimize data workflows and processes to ensure efficient data handling and storage, improving overall system performance.
• Provide technical guidance and support to development teams, ensuring best practices are followed in data architecture and implementation.
• Develop and maintain documentation for data architecture, processes, and best practices to ensure clarity and consistency across teams.
• Evaluate and recommend new technologies and tools that can enhance data processing and storage capabilities, keeping the company at the forefront of technological advancements.
• Ensure data security and compliance with industry standards and regulations, safeguarding sensitive information and maintaining trust.
• Monitor and troubleshoot data systems to identify and resolve issues promptly, minimizing downtime and ensuring seamless operations.
• Collaborate with stakeholders to identify opportunities for data-driven improvements and innovations, contributing to the company's growth and success.
• Lead efforts to improve data quality and integrity, implementing strategies to ensure accurate and reliable data across systems.
• Provide training and mentorship to team members, fostering a culture of continuous learning and development.
• Participate in strategic planning sessions to align data architecture initiatives with company goals and objectives.
• Contribute to the company's mission by developing data solutions that have a positive impact on society, enhancing the quality of life for communities served.
Qualifications
• Possess a strong understanding of PySpark, AWS Lambda, and AWS Glue, with proven experience in designing and implementing data architectures.
• Demonstrate excellent problem-solving skills and the ability to work collaboratively with cross-functional teams.
• Exhibit strong communication skills, with the ability to convey complex technical concepts to non-technical stakeholders.
• Have a track record of optimizing data workflows and processes to improve system performance and efficiency.
• Show commitment to data security and compliance, with a focus on safeguarding sensitive information.
• Display a proactive approach to learning and adopting new technologies and tools that enhance data capabilities.