Custom Business Solutions, Inc.

Data Engineer

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Data Engineer with a contract length of "unknown" and a pay rate of "unknown." Key skills include experience with AWS, Snowflake, SQL, R, Python, and data modeling. A Bachelor's Degree is required; USC/GC only.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
October 31, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Grand Rapids, MI
-
🧠 - Skills detailed
#AWS (Amazon Web Services) #XML (eXtensible Markup Language) #SQL Server #Data Exploration #R #Scripting #SAP #Version Control #Database Management #Programming #Agile #Python #GIT #Data Engineering #Lean #Stories #Tableau #Project Management #Snowflake #Visualization #Scrum #SQL (Structured Query Language) #Data Pipeline #BI (Business Intelligence) #Cloud #Physical Data Model #Web API #MS SQL (Microsoft SQL Server) #Microsoft SQL #Data Wrangling #Datasets #Oracle #Data Cleansing #Data Quality #Data Warehouse #JSON (JavaScript Object Notation) #ETL (Extract, Transform, Load) #Databases #Microsoft SQL Server #RDBMS (Relational Database Management System) #Jira #Data Design #Data Governance
Role description
Job Title: Data Engineer

TECHNICAL SKILLS

Must Have
• Experience interacting with and extracting data from modern web APIs
• Experience with cloud platforms such as AWS and Snowflake
• Experience with relational database management systems (Oracle/Microsoft SQL Server)
• Experience with scripting languages such as R and Python
• Experience with SQL
• Familiarity with at least one commonly used programming language
• Capability to design and document conceptual, logical, and physical data models for relational and dimensionally modeled databases
• Familiarity with modern source/version control tools (Git, CodeCommit, Subversion)
• Familiarity with various raw data source types and how to interpret them (unstructured JSON/BSON, flat files, XML, etc.)

Nice To Have
• Data visualization experience in common business intelligence tools (Tableau, D3.js, R/Shiny, SAP products)
• Data wrangling experience using a variety of tools and languages
• Experience with lean-agile principles and frameworks for project completion; ability to interpret process performance outputs and improve workflow performance for affected jobs
• Familiarity with project management workflow tracking software such as JIRA/Trello
• Knowledge of the Software Development Life Cycle and best practices
• Strong technical writing and presentation skills
Specific Skill Set Requirements:
• Experience with relational database management systems (Oracle/Microsoft SQL Server)
• Familiarity with at least one commonly used programming language
• Knowledge of the Software Development Life Cycle and best practices
• Data wrangling experience using a variety of tools and languages
• Experience with SQL
• Strong technical writing and presentation skills
• Data visualization experience in common business intelligence tools (Tableau, D3.js, R/Shiny, SAP products)
• Familiarity with project management workflow tracking software such as JIRA/Trello
• Experience with lean-agile principles and frameworks for project completion
• Capability to design and document conceptual, logical, and physical data models for relational and dimensionally modeled databases
• Ability to interpret process performance outputs and improve workflow performance for affected jobs
• Familiarity with modern source/version control tools (Git, CodeCommit, Subversion)
• Familiarity with various raw data source types and how to interpret them (unstructured JSON/BSON, flat files, XML, etc.)
• Experience with scripting languages such as R and Python
• Experience interacting with and extracting data from modern web APIs
• Experience with cloud platforms such as AWS and Snowflake

Job Summary:
Designs, configures, develops, tests, and deploys ELT/ETL processes and self-service reporting assets to provide data exploration capabilities to end users, while adhering to organizational, departmental, team, and governmental standards and regulations. Proactively communicates with data consumers to improve the performance of processes connecting to data. Designs and builds data models, contributing to an ongoing review of enterprise architectural standards and designs. Participates in data cleansing and profiling efforts and traces data quality issues to their source.
Provides operational support for systems and applications troubleshooting and maintenance, and ad-hoc support for incidents related to data warehouses, datasets, and data pipelines. Participates in organizational Data Governance for data assets and in enterprise-wide Agile ceremonies. Works with more experienced team members to capture, document, and groom features and stories for Agile/scrum teams. Team members in this position excel at working self-sufficiently in most activities and know when to engage others. Provides coaching to less-experienced Data Engineers.

Essential Functions:
• Designs, configures, develops, tests, and deploys ELT/ETL processes across multiple sources, targets, and tools, and self-service reporting assets to provide data exploration capabilities to end users.
• Demonstrates a deep understanding of one or more data domains, the processes that generate the associated data, and the potential biases within those data sets.
• Leads communications with data consumers with a skill-growth mindset to improve the performance of processes connecting to data, and shares best practices.
• Develops relationships and collaboration across multiple teams to deliver timely and reliable information that enables users to uncover insights.
• Provides operational support for systems and applications troubleshooting and maintenance, and ad-hoc support for incidents related to data warehouses, datasets, and data pipelines.
• Provides on-call support for data platforms, reporting assets, and ETL processes.
• Develops and implements data model designs, engaging less-experienced team members to grow their skills.
• Builds data models using a variety of source and target systems, growing a better understanding of enterprise architectural standards and designs.
• Applies best practices and organizational standards of the Software Development Lifecycle to ensure the production of high-quality, reliable data assets.
• Applies the organizational standards and laws governing the use of sensitive data relating to Protected Health Information (PHI) and Personally Identifiable Information (PII).
• Leads data cleansing and profiling to ensure the accuracy and quality of information.
• Participates in organizational Data Governance for data asset management.
• Develops Continuous Integration and Continuous Delivery (CI/CD) pipelines under the guidance of more experienced team members.
• Participates in enterprise-wide Agile ceremonies, determining and recommending technical programming approaches and solutions for complex applications.
• Works independently to capture, document, and groom features and stories for Agile/scrum teams to maximize visibility and readiness for upcoming planning and execution.
• Actively pursues opportunities to grow their business acumen and skills by learning from more experienced team members.
• Provides coaching to less-experienced team members.

Education: Bachelor's Degree or equivalent

NO C2C; USC/GC ONLY