

Cloud Data Engineer: 197273
Featured Role | Apply direct with Data Freelance Hub
This role is for a Cloud Data Engineer on a 12-month contract, based onsite in Columbus, OH. It requires a Bachelor's degree, 5+ years of experience, ETL and data modeling on AWS, and SQL Server and Informatica expertise.
Country
United States
Currency
$ USD
Day rate
624
Date discovered
June 13, 2025
Project duration
More than 6 months
Location type
On-site
Contract type
Unknown
Security clearance
Yes
Location detailed
Columbus, OH
Skills detailed
#Agile #Computer Science #Data Lineage #Data Management #Data Engineering #AWS (Amazon Web Services) #Data Processing #DevOps #Scala #Migration #Big Data #Security #Databases #Data Quality #Data Warehouse #Scrum #SQL Server #Business Analysis #Monitoring #Snowflake #Data Pipeline #Informatica Cloud #SQL (Structured Query Language) #ETL (Extract, Transform, Load) #Datasets #Informatica #Data Lake #Data Modeling #Cloud
Role description
We cannot accept Corp to Corp or 1099 candidates for this role.
Contract: 12 Month Contract
The position will be with HKA at the Columbus, OH facility.
1 PTO day per month on the 1st of each month
9 paid holidays offered
Outline:
ONSITE POSITION: candidates must be local to Columbus, OH at the time of resume submission, per the client. Expect 4 days per week onsite and 1 hybrid day during onboarding/training (~2 months); 3 days onsite will be the ongoing expectation.
Scope:
The Sr. Software Engineer is responsible for system analysis, design, development and testing for their assigned technical product(s) or application(s), within the context of an agile/DevOps delivery model. In addition, they will extend their DevOps responsibilities to take on Operations topics and may also fill the Agile delivery roles of Business Analyst and Scrum Master.
Key Responsibilities:
The Cloud Data Engineer will be responsible for developing, expanding, and optimizing our data pipeline architecture, as well as optimizing data flow and collection for cross functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. During various aspects of this process, you should collaborate with coworkers to ensure that your approach meets the needs of each project.
To ensure success as a data engineer, you should demonstrate flexibility, creativity, and the capacity to receive and utilize constructive criticism. A formidable data engineer will demonstrate insatiable curiosity and outstanding interpersonal skills.
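The extract-transform-load pattern this role centers on can be sketched in miniature. This is an illustrative example only, not the client's actual pipeline: the table names, columns, and cleaning rules are invented, and an in-memory SQLite database stands in for the SQL Server / AWS sources named elsewhere in this posting.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Copy cleaned rows from raw_orders into orders_clean; return rows loaded."""
    cur = conn.cursor()
    # Extract: read raw rows from the source table.
    rows = cur.execute("SELECT id, amount, currency FROM raw_orders").fetchall()
    # Transform: drop rows with missing amounts, normalize currency codes.
    cleaned = [(i, a, c.upper()) for (i, a, c) in rows if a is not None]
    # Load: write the cleaned rows into the target table.
    cur.executemany("INSERT INTO orders_clean VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

# Hypothetical source and target tables, with one bad row to be filtered out.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (id INTEGER, amount REAL, currency TEXT);
    CREATE TABLE orders_clean (id INTEGER, amount REAL, currency TEXT);
    INSERT INTO raw_orders VALUES (1, 10.0, 'usd'), (2, NULL, 'usd'), (3, 7.5, 'eur');
""")
loaded = run_etl(conn)
```

In practice a tool like Informatica PowerCenter or IDMC expresses the same extract/transform/load stages as mappings rather than hand-written code, but the data-quality concern (rejecting the NULL row) is the same.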
Key accountabilities of the function Leading Operations for Assigned Systems:
Designing, implementing, and operating assigned cloud technology platforms as the technical expert.
Leading internal and external resources in the appropriate utilization of cloud technology platforms.
Executing ITSM/ITIL processes to ensure ongoing stable operations and alignment with SLAs.
Steering providers in the execution of tier 2 and 3 support tasks and SLAs.
Resolving escalated support issues.
Performing routine maintenance, administering access and security levels.
Driving System Management & Application Monitoring.
Ensuring monitoring and correct operation of the assigned system.
Ensuring changes to the system are made for ongoing run and support.
Ensuring consolidation of emergency activities into regular maintenance.
Analyzing system data (system logs, performance metrics, performance counters) to drive performance improvement.
Supporting Agility & Customer Centricity.
Supporting the end user with highly available systems.
Participating in the support rotation.
Performing other duties as assigned by management.
Additional skills (special skills / technical ability, etc.):
Demonstrated experience in vendor and partner management.
Technically competent with various business applications, especially Financial Management systems.
Experience at working both independently and in a team-oriented, collaborative environment is essential.
Must be able to build and maintain strong relationships in the business and Global IT organization.
Ability to elicit cooperation from a wide variety of sources, including central IT, clients, and other departments.
Strong written and oral communication skills.
Strong interpersonal skills.
Education
Bachelor's Degree
BASIC QUALIFICATIONS:
This position requires a Bachelor's Degree in Computer Science or a related technical field, and 5+ years of relevant employment experience.
2+ years of work experience with ETL and Data Modeling on AWS Cloud Databases.
Expert-level skills in writing and optimizing SQL.
Experience operating very large data warehouses or data lakes.
3+ years SQL Server.
3+ years of Informatica or similar technology.
Knowledge of Financial Services industry.
MUST HAVE:
ETL Developer
Informatica PowerCenter and Informatica Cloud (IDMC) experience required
SQL Server experience required
Snowflake experience is a bonus
Strong organizational skills
Good communication skills
Experience migrating ETL from on-prem to cloud platforms is a bonus
PREFERRED QUALIFICATIONS:
5+ years of work experience with ETL and Data Modeling on AWS Cloud Databases.
Experience migrating on-premises data processing to AWS Cloud.
Relevant AWS certification (AWS Certified Data Analytics, AWS Certified Database, etc.).
Expertise in ETL optimization, designing, coding, and tuning big data processes using Informatica Data Management Cloud or similar technologies.
Experience with building data pipelines and applications to stream and process datasets at low latencies.
Efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data.
Sound knowledge of data management and knows how to optimize the distribution, partitioning, and MPP of high-level data structures.
Knowledge of Engineering and Operational Excellence using standard methodologies.
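The distribution/partitioning idea in the preferred qualifications above can be sketched as follows. This is a hedged illustration, not the client's method: hash-partitioning record keys into a fixed number of buckets is one common way MPP systems and data pipelines spread work evenly across nodes. The key format and bucket count here are arbitrary.

```python
import hashlib

def partition_key(key: str, num_partitions: int) -> int:
    """Map a record key to a stable partition id by hashing the key."""
    digest = hashlib.md5(key.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_partitions

# Distribute hypothetical records across 2 partitions; the same key always
# lands in the same bucket, which is what makes joins and lookups cheap.
records = ["order-1", "order-2", "order-3", "order-4"]
buckets: dict[int, list[str]] = {}
for r in records:
    buckets.setdefault(partition_key(r, 2), []).append(r)
```

A real warehouse would declare this via a distribution or partition key in DDL (e.g. a clustering key in Snowflake) rather than hashing by hand, but the stability property is the same.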
HKA Enterprises is a global workforce solutions firm. If you are seeking a new career opportunity or project experience, our recruiters will work to understand your qualifications, experience, and personal goals. At HKA, we recognize the importance of matching employee goals with those of the employer. We strive to seek credibility, satisfaction, and endorsement from all our applicants. We invite you to take time and search for your next career experience with us! HKA is an EEO Employer who participates in the US Citizenship and Immigration Services E-Verify Program.