Logix Guru

Senior Data Engineer (W2 Only)

⭐ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Senior Data Engineer (W2 Only) with an unspecified contract length and an unspecified hourly pay rate. It requires 4+ years in data management, proficiency in AWS tools, and strong collaboration skills. Location: remote.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
πŸ—“οΈ - Date
January 13, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
Unknown
-
📄 - Contract
W2 Contractor
-
🔒 - Security
Unknown
-
πŸ“ - Location detailed
Columbus, OH
-
🧠 - Skills detailed
#"ETL (Extract #Transform #Load)" #Data Science #Datasets #Data Management #AWS (Amazon Web Services) #Computer Science #Statistics #Athena #Agile #Data Pipeline #Data Integration #Business Analysis #Data Catalog #DevOps #Data Quality #Automation #Data Governance #Version Control #Data Analysis #Programming #Data Engineering
Role description
Job Description: This role plays a pivotal role in building and operationalizing the minimally inclusive data necessary for the enterprise data and analytics initiatives following industry standard practices and tools. Data Engineers will also test for data quality to ensure data conforms to business rules and is accurate, complete, consistent, and uniform. This role’s primarily focuses on building, managing and optimizing data pipelines and then moving these data pipelines effectively into production for key data and analytics consumers like business/data analysts, data scientists or any persona that needs curated data for data and analytics use cases across the enterprise. Duties and Responsibilities: β€’ Architecting, creating and maintaining data pipelines. β€’ Assist with renovating the data management infrastructure to drive automation in data integration and management. β€’ Work in partnership with data science teams and with business analysts in refining their data requirements for various data and analytics initiatives and their data consumption requirements. β€’ Train counterparts across the organization in data pipelining and preparation techniques, which make it easier for them to integrate and consume the data they need for their own use cases. β€’ Work with data governance teams and participate in vetting and promoting content created in the business and by data scientists to the curated data catalog for governed reuse. β€’ Demonstrated ability to communicate complex results to technical and non-technical audiences. β€’ Demonstrated ability to work effectively in teams as well as independently across multiple tasks while meeting aggressive timelines. β€’ Strategic, intellectually curious thinker with focus on outcomes. β€’ Professional image with the ability to form relationships across functions. β€’ Performs other duties as assigned. Basic Qualifications: β€’ Bachelor's Degree in computer science, statistics or related field, or equivalent related work experience. 
β€’ 4+ years of related experience in data management disciplines including data integration, modeling, optimization and data quality, and/or other areas directly relevant to data engineering responsibilities and tasks β€’ 4+ years of experience working in cross-functional teams and collaborating with business stakeholders in support of a departmental and/or multi-departmental data management and analytics initiative, experience with data preparation tools and database programming languages Preferred Qualifications: β€’ Master's degree in computer science, statistics or related field β€’ Hands on data testing experience β€’ Strong experience with data engineering tooling (e.g., Glue, Landa, Athena, AWS) β€’ Strong experience with open-source and commercial data science platforms β€’ Learn and/or Agile methodology β€’ Strong experience with various Data Management architectures and processes β€’ Strong ability to design, build and manage data pipelines for data β€’ Strong experience in working with large, heterogeneous datasets in building and optimizing data pipelines, pipeline architectures and integrated datasets β€’ Demonstrated success in working with large, heterogeneous datasets to extract business value β€’ Strong experience in working with DevOps capabilities like version control, automated builds, testing and release management capabilities β€’ Demonstrated ability to communicate complex results to technical and non-technical audiences β€’ Demonstrated ability to work effectively in teams as well as independently across multiple tasks while meeting aggressive timelines β€’ Strategic, intellectually curious thinker with focus on outcomes β€’ Professional image with the ability to form relationships across functions β€’ Willingness and ability to learn new technologies on the job
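For illustration only: a minimal sketch of the kind of data quality testing described above, checking a batch of records for completeness, uniqueness, and conformance to a simple business rule before the data is promoted downstream. The dataset, column names, and rules are hypothetical and not part of this posting.

import pandas as pd

# Hypothetical batch of curated records (stand-in for a pipeline output).
orders = pd.DataFrame({
    "order_id": [1001, 1002, 1002, 1004],
    "customer_id": [7, 8, 8, None],
    "amount_usd": [125.50, -3.00, 42.00, 18.75],
})

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return pass/fail results for a few illustrative data quality rules."""
    return {
        # Completeness: no missing customer identifiers.
        "customer_id_complete": df["customer_id"].notna().all(),
        # Uniqueness: order_id must be a unique key.
        "order_id_unique": df["order_id"].is_unique,
        # Business rule: order amounts must be non-negative.
        "amount_non_negative": (df["amount_usd"] >= 0).all(),
    }

results = run_quality_checks(orders)
failed = [name for name, passed in results.items() if not passed]
if failed:
    # In a real pipeline this would block promotion and alert the data owner.
    print("Data quality checks failed:", ", ".join(failed))
else:
    print("All data quality checks passed.")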