Sr. SAS/ETL Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Sr. SAS/ETL Data Engineer; the contract length and pay rate are unspecified. Key skills include SAS ETL, SQL, cloud platforms (Snowflake, AWS), and experience with data warehousing and integration tools.
🌎 - Country
United States
💱 - Currency
$ USD
💰 - Day rate
-
🗓️ - Date discovered
August 13, 2025
🕒 - Project duration
Unknown
🏝️ - Location type
Unknown
📄 - Contract type
Unknown
🔒 - Security clearance
Unknown
📍 - Location detailed
United States
🧠 - Skills detailed
#Data Pipeline #BigQuery #Cloud #SAS #Physical Data Model #Data Lake #SAP #dbt (data build tool) #Snowflake #Data Integration #ETL (Extract, Transform, Load) #Data Design #Scripting #Data Warehouse #Python #SQL Server #Teradata #IICS (Informatica Intelligent Cloud Services) #GitLab #Public Cloud #Data Engineering #Oracle #AWS (Amazon Web Services) #Fivetran #Databases #EDW (Enterprise Data Warehouse) #Redshift #AWS Glue #SQL (Structured Query Language) #Business Objects #Informatica Cloud #Informatica #Data Integrity #Data Architecture #Data Mart #BO (Business Objects) #Version Control
Role description
Short Description: Responsible for designing, building, and maintaining data pipelines that support data integrations for the Enterprise Data Warehouse, Operational Data Store, Data Marts, etc., following Client-defined guidelines.

Complete Description:

Data Engineering:
• SAS ETL skills for maintenance of and minor enhancements to current SAS scripts, but should be ready to also use the other ETL tools listed below for the future state
• Experience in designing and building Data Warehouses and Data Lakes; good knowledge of data warehouse principles and concepts
• Technical expertise working in large-scale Data Warehousing applications and databases such as Oracle, Netezza, Teradata, and SQL Server
• Experience with public cloud-based data platforms, especially Snowflake and AWS

Data integration skills:
• Expertise in the design and development of complex data pipelines (a minimal sketch follows this list)
• Solutions using industry-leading ETL tools such as SAP Business Objects Data Services (BODS), Informatica Cloud Data Integration Services (IICS), and IBM DataStage
• Experience with ELT tools such as dbt, Fivetran, and AWS Glue
• Expert in SQL, with development experience in at least one scripting language (e.g., Python), and adept at tracing and resolving data integrity issues (see the reconciliation sketch after this list)
• Strong knowledge of data architecture, data design patterns, modeling, and cloud data solutions (Snowflake, AWS Redshift, Google BigQuery)
• Data Model: expertise in logical and physical data models using relational or dimensional modeling practices, and in high-volume ETL/ELT processes
• Performance tuning of data pipelines and DB objects to deliver optimal performance
• Experience with GitLab for version control and CI/CD
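To make the pipeline expectations concrete, here is a minimal sketch of a single batch ETL step in Python: extract rows from a staging table, apply a light transform, and load a warehouse fact table in one set-based statement (the ELT style that tools such as dbt encourage). sqlite3 stands in for the actual platforms named above (Oracle, Snowflake, and so on), and the stg_orders / fact_orders tables and all column names are hypothetical.

```python
# Minimal ETL sketch. sqlite3 is a stand-in for the real databases;
# every table and column name here is hypothetical.
import sqlite3


def run_etl(conn: sqlite3.Connection) -> int:
    """Copy cleaned order rows from staging into the warehouse fact table."""
    conn.execute(
        """
        INSERT INTO fact_orders (order_id, customer_id, order_date, amount_usd)
        SELECT order_id,
               customer_id,
               DATE(order_date),          -- normalize to an ISO date
               ROUND(amount, 2)           -- enforce two-decimal currency
        FROM   stg_orders
        WHERE  amount IS NOT NULL         -- drop rows failing the basic check
        """
    )
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM fact_orders").fetchone()[0]


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE stg_orders  (order_id INT, customer_id INT,
                                  order_date TEXT, amount REAL);
        CREATE TABLE fact_orders (order_id INT, customer_id INT,
                                  order_date TEXT, amount_usd REAL);
        INSERT INTO stg_orders VALUES (1, 10, '2025-08-13', 19.995),
                                      (2, 11, '2025-08-13', NULL);
        """
    )
    print(f"loaded {run_etl(conn)} row(s)")  # the NULL-amount row is filtered out
```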
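As a companion to the data-integrity bullet, here is a minimal sketch of a post-load reconciliation check: compare row counts and amount totals between the filtered source and the target so that a mismatch can be traced back to a specific step. It assumes the same hypothetical stg_orders / fact_orders tables from the sketch above.

```python
# Reconciliation sketch for tracing data integrity issues after a load.
# Assumes the hypothetical tables created in the previous sketch.
import sqlite3


def reconcile(conn: sqlite3.Connection) -> bool:
    """Return True when the target matches the filtered source."""
    src_rows, src_total = conn.execute(
        "SELECT COUNT(*), ROUND(SUM(amount), 2) FROM stg_orders "
        "WHERE amount IS NOT NULL"
    ).fetchone()
    tgt_rows, tgt_total = conn.execute(
        "SELECT COUNT(*), ROUND(SUM(amount_usd), 2) FROM fact_orders"
    ).fetchone()
    if (src_rows, src_total) != (tgt_rows, tgt_total):
        # Surface the discrepancy so it can be traced to a specific step.
        print(f"mismatch: rows {src_rows} vs {tgt_rows}, "
              f"totals {src_total} vs {tgt_total}")
        return False
    return True
```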