Data Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Senior Data Engineer with a contract length of "unknown" and a pay rate of "$X/hour." Key skills include ETL, SQL, AWS, and data warehousing. Requires 7+ years of experience and a Bachelor's in a related field.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
-
πŸ—“οΈ - Date discovered
September 11, 2025
🕒 - Project duration
Unknown
-
🏝️ - Location type
Unknown
-
📄 - Contract type
Unknown
-
🔒 - Security clearance
Unknown
-
πŸ“ - Location detailed
United States
-
🧠 - Skills detailed
#Data Modeling #Data Conversion #Python #Elasticsearch #S3 (Amazon Simple Storage Service) #Matillion #Cloud #SNS (Simple Notification Service) #Data Pipeline #Databases #Snowflake #RDS (Amazon Relational Database Service) #Mathematics #Computer Science #MongoDB #Data Engineering #BI (Business Intelligence) #Spark (Apache Spark) #ETL (Extract, Transform, Load) #Oracle #Strategy #Azure #SSIS (SQL Server Integration Services) #DynamoDB #MS SQL (Microsoft SQL Server) #NoSQL #SQL Server #Talend #SQL (Structured Query Language) #Data Analysis #Informatica #AWS (Amazon Web Services) #EC2
Role description
Position Summary: We are looking for a Senior Data Engineer to expand our Data Platform stack, supporting our developers, database architects, and data analysts on data initiatives, and ensuring that an optimal data delivery architecture is consistent across ongoing projects in a dynamic environment.
Responsibilities
● Analyze and interpret complex data sets; identify data anomalies and resolve data issues
● Understand specific business processes and domain concepts and relate them to data subject domains
● Collaborate with Data Leads, Data Analysts, and QA analysts to validate requirements; participate in user requirement sessions
● Perform tests, validate data flows, and prepare ETL processes according to business requirements
● Perform ETL tuning and SQL tuning
● Document data flows representing business logic in ETL routines
● Design and implement the data conversion strategy from legacy to new platforms
● Perform design validation, reconciliation, and error handling in data load processes
● Design and prepare technical specifications and guidelines, including ER diagrams and related documents
Qualifications
● Must be well versed in data warehousing concepts, including design patterns (star schemas, snowflake schemas), and aware of data modeling concepts including the normal forms
● Knowledge of AWS infrastructure, including S3, SNS, EC2, CloudWatch, and RDS
● 7+ years working on ETL/data transformation projects with one or more related products, such as Informatica, Talend, or Microsoft SSIS
● 7+ years working on business intelligence and data warehousing initiatives with Python
● Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases such as Oracle, MS SQL Server, or Vertica
● Good to have: at least 1+ years working with the Matillion ELT tool and the Snowflake database
● Good to have: experience building AWS data pipelines using Python or Spark/Spark SQL in any cloud environment (AWS, Azure, or Google)
● Good to have: experience with any of the NoSQL datastores, such as Elasticsearch, MongoDB, DynamoDB, or Cassandra
Education/Experience: Bachelor's in Mathematics, Computer Science, or a related technical field; a postgraduate degree is preferred. Minimum 5 years of relevant experience.