

BrickRed Systems
Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer with 6+ years of experience delivering data engineering solutions, advanced ETL skills, and expertise in cloud data systems. Contract length is unspecified, with competitive pay. Location is flexible; strong SQL, Python, and big data experience is required.
Country
United States
Currency
$ USD
-
Day rate
528
-
Date
November 25, 2025
Duration
Unknown
-
Location
Unknown
-
Contract
Unknown
-
Security
Unknown
-
Location detailed
Massachusetts, United States
-
Skills detailed
#PySpark #Automation #Shell Scripting #Snowflake #Consulting #Cloud #SQL (Structured Query Language) #Data Quality #Matillion #Data Science #Jenkins #Data Modeling #AWS (Amazon Web Services) #Data Lake #GitHub #Databricks #Computer Science #Data Profiling #Scripting #Version Control #Terraform #Big Data #ETL (Extract, Transform, Load) #Data Engineering #Agile #Python #Scala #Scrum #Data Pipeline #Spark (Apache Spark) #Airflow #Libraries #Hadoop #Automated Testing
Role description
We are seeking a highly skilled data engineering professional who thrives in fast-paced environments, adapts quickly to new technologies, and demonstrates strong analytical and problem-solving skills. You should possess a deep understanding of data structures, algorithms, and modern engineering practices, along with the ability to lead, influence, and communicate effectively across technical and business teams.
Required Experience
• Bachelor's degree in Computer Science, Data Science, Software Engineering, or a related field (or equivalent combination of education and experience)
• 6+ years of experience designing and implementing end-to-end data engineering solutions
• Advanced skills in data modeling, data warehousing, and ETL development
• Hands-on experience with ETL tools; Matillion and/or PySpark preferred (see the illustrative sketch after this list)
• Expertise in building and operating distributed, highly available data systems for large-scale ingestion and processing
• Strong experience with cloud-based, scalable data lake solutions (Databricks, Snowflake, AWS)
• Proficiency with big data technologies: Hadoop, Hive, Spark, EMR, and orchestration tools such as Airflow
• Strong SQL skills and proficiency in Python and Shell scripting
• Experience with CI/CD tools: GitHub, Jenkins, Terraform, Databricks Asset Bundles
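As a rough illustration of the PySpark-based ETL work referenced in the list above, here is a minimal sketch. The bucket paths, table layout, and column names are hypothetical and are not part of this role description; it assumes a Spark runtime such as Databricks or EMR and only shows the general extract-transform-load shape.

```python
# Illustrative PySpark ETL sketch; paths and column names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Extract: read raw order events from a data lake location
raw = spark.read.parquet("s3://example-bucket/raw/orders/")

# Transform: keep completed orders and aggregate per customer per day
daily_totals = (
    raw.filter(F.col("status") == "COMPLETED")
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date", "customer_id")
       .agg(
           F.sum("amount").alias("daily_spend"),
           F.count("*").alias("order_count"),
       )
)

# Load: write a partitioned, curated dataset for downstream consumers
(daily_totals.write
    .mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_order_totals/"))
```

The same logic could equally be expressed in Matillion components or run as a Databricks job; the sketch only indicates the level of hands-on pipeline work expected.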
Key Responsibilities
• Define technical requirements, project scope, and estimates; translate backlog items into engineering tasks
• Lead development of technical solutions aligned with architectural standards
• Collaborate with architecture and platform teams; review and validate POCs
• Design and build data products and features within Agile/Scrum frameworks
• Develop distributed data pipelines using modern big data tools
• Conduct data profiling and analysis for scalable solution design
• Drive technical strategies for new initiatives and optimize existing systems
• Troubleshoot complex data issues and perform root cause analysis
• Build utilities, UDFs, and libraries to support scalable data flows
• Implement advanced automation through workflow orchestration tools (see the orchestration sketch after this list)
• Conduct collaborative design, code, and dataset reviews
• Mentor and guide junior engineers
• Proactively address data quality issues and build automated testing frameworks
• Drive best practices in version control, code review, CI/CD, and cloud development workflows
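To make the orchestration responsibility above concrete, the following is a minimal Airflow sketch, assuming a recent Airflow 2 release. The DAG id, schedule, and task callables are hypothetical placeholders rather than anything specific to this role.

```python
# Illustrative Airflow DAG sketch; DAG id, schedule, and callables are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from source systems")

def transform():
    print("run PySpark / SQL transformations")

def load():
    print("publish curated tables to the warehouse")

with DAG(
    dag_id="daily_orders_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    # Linear dependency chain: extract -> transform -> load
    t_extract >> t_transform >> t_load
```

A similar pattern applies to other orchestrators; in practice, data quality checks and automated tests would typically run as additional tasks in the same workflow.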
Characteristics We Value
• Strong collaboration and team-first mindset
• Ability to manage multiple projects in a fast-paced Agile environment
• Excellent communication and problem-solving skills
• Commitment to continuous learning and improvement
• Passion for mentoring and supporting team growth
About Brickred Systems:
Brickred Systems is a global leader in next-generation technology, consulting, and business process services. We enable clients to navigate their digital transformation. Brickred Systems delivers a range of consulting services to clients across multiple industries around the world. Our practices employ highly skilled and experienced individuals with a client-centric passion for innovation and delivery excellence.
With ISO 27001 and ISO 9001 certification and over a decade of experience managing the systems and operations of global enterprises, we harness the power of cognitive computing, hyper-automation, robotics, cloud, analytics, and emerging technologies to help our clients adapt to the digital world and succeed in it. Our always-on learning agenda drives our clients' continuous improvement by building and transferring digital skills, expertise, and ideas from our innovation ecosystem.






