

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Data Engineer on a 6-month contract, offering a pay rate of $X/hour. Candidates must have expertise in SQL, Informatica ETL, AWS Redshift, Python, and data visualization tools like Tableau. U.S. Citizenship or Green Card status is required.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 1, 2025
Project duration: Unknown
Location type: Unknown
Contract type: Unknown
Security clearance: Unknown
Location detailed: San Francisco Bay Area
Skills detailed
#SQL (Structured Query Language) #Data Modeling #Oracle #Security #Data Science #DMS (Data Migration Service) #AWS (Amazon Web Services) #Data Engineering #Scripting #Data Warehouse #RDBMS (Relational Database Management System) #Aurora PostgreSQL #SQL Queries #Automation #BI (Business Intelligence) #SAP #IAM (Identity and Access Management) #CLI (Command-Line Interface) #Cloud #Lambda (AWS Lambda) #SQL Server #Amazon Redshift #Aurora #EDW (Enterprise Data Warehouse) #Redshift #Data Integration #Computer Science #Data Architecture #Informatica #Terraform #AWS CLI (Amazon Web Services Command Line Interface) #PostgreSQL #SAP BusinessObjects #Python #Visualization #Tableau #BO (Business Objects) #S3 (Amazon Simple Storage Service) #Deployment #Microsoft SQL Server #ETL (Extract, Transform, Load) #MS SQL (Microsoft SQL Server) #Debugging #GitLab #Microsoft SQL
Role description
We are looking for a skilled Data Engineer with strong experience in designing, developing, and maintaining complex data warehouse and business intelligence solutions, with a focus on AWS cloud technologies and data architecture.
Ideal candidates must have expertise in SQL, data modeling, and ETL development using Informatica, along with hands-on experience in cloud-based data warehousing solutions such as AWS Redshift and Aurora. Proficiency in Python scripting and working knowledge of visualization tools like Tableau and SAP BusinessObjects are essential.
Responsibilities:
• Design, develop, test, and automate enterprise data warehouse and BI solutions.
• Build and maintain ETL pipelines, data models, and analytics dashboards.
• Translate complex technical details into clear, non-technical language for stakeholders.
• Analyze and troubleshoot data issues; ensure accuracy, consistency, and integrity.
• Develop implementation and transition plans and participate in on-call support as needed.
• Travel domestically on occasion, with rare overnight stays possible.
• Participate in the full lifecycle of complex data warehouse and business intelligence solutions, including design, development, testing, implementation, and review.
• Develop components of advanced data warehouse applications based on defined requirements, build reports and dashboards, create prototypes, and evaluate the feasibility of proposed system designs.
• Review and implement updates, patches, and new versions of data warehouse software.
• Handle code deployment and application integration.
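The ETL pipeline responsibility above can be sketched as a minimal extract-transform-load step. This is a hypothetical example with made-up column names; SQLite stands in for a warehouse target such as Redshift or Aurora, and a real pipeline would use a tool like Informatica or AWS Glue:

```python
import csv
import io
import sqlite3

# Extract: read raw order data (here an in-memory CSV; normally a source system or S3).
raw = io.StringIO("order_id,amount\n1,19.99\n2,5.00\n3,42.50\n")
rows = list(csv.DictReader(raw))

# Transform: cast types and derive a high-value flag.
records = [
    (int(r["order_id"]), float(r["amount"]), float(r["amount"]) > 20.0)
    for r in rows
]

# Load: insert into a warehouse fact table (SQLite used for illustration).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE fact_orders (order_id INTEGER, amount REAL, high_value BOOLEAN)"
)
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", records)

count, = conn.execute(
    "SELECT COUNT(*) FROM fact_orders WHERE high_value"
).fetchone()
print(count)  # 1
```

The same extract/transform/load split maps onto Informatica mappings or Glue jobs; only the connectors and scale change.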
Qualifications:
• Bachelor's degree in Computer Science, Information Systems, or a related technical field.
• At least 5 years of experience in data engineering, data science, and software engineering.
• Proven expertise in writing and optimizing complex SQL queries across multiple RDBMS platforms, including Microsoft SQL Server and Oracle.
• U.S. Citizenship or Green Card status is required for this position.
• Hands-on experience with ETL tools, particularly Informatica, for data integration and transformation.
• Proficiency in data visualization using Tableau and SAP BusinessObjects (BO).
• Experience with scripting and automation using Python.
• Extensive experience designing and supporting cloud-based data warehouse solutions with core AWS services, in alignment with AWS architecture best practices, using S3, DMS, Glue, and Lambda.
• Development and modeling experience with Amazon Redshift and Amazon Aurora PostgreSQL.
• Proficiency in using AWS service APIs, the AWS CLI, and SDKs to build, deploy, and debug applications.
• Experience developing cloud-native and serverless applications on AWS, with an emphasis on security, favoring IAM roles over embedded credentials.
• Experience with CI/CD pipelines and deployment tools, including GitLab, Terraform, and DBMaestro.
• Ability to maintain and debug modular AWS-based code.
• Understanding of application lifecycle management (ALM) and containerization technologies.
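The SQL optimization qualification above can be illustrated with a small, self-contained sketch. This hypothetical example uses SQLite for portability (SQL Server and Oracle have their own plan tools, such as SET SHOWPLAN and EXPLAIN PLAN); table and index names are made up:

```python
import sqlite3

# Build a small table with a column we will filter on repeatedly.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer, total) VALUES (?, ?)",
    [(f"cust_{i % 100}", i * 1.5) for i in range(1000)],
)

# Without an index, the filter forces a full table scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer = 'cust_7'"
).fetchall()
print(plan[0][3])  # e.g. "SCAN orders"

# An index on the filtered column lets the planner seek directly to matching rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer)")
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT total FROM orders WHERE customer = 'cust_7'"
).fetchall()
print(plan[0][3])  # e.g. "SEARCH orders USING INDEX idx_orders_customer (customer=?)"
```

Reading the plan before and after the change is the core loop of query optimization, whatever the RDBMS.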