Info Origin Inc.

Sr. Cloud Data Engineer

โญ - Featured Role | Apply direct with Data Freelance Hub
This role is for a Sr. Cloud Data Engineer on a contract basis, offering $40.00 per hour for remote work. Requires a Bachelor's degree, 5+ years in data engineering, proficiency in ETL/ELT processes, and experience with Azure, Databricks, and Snowflake.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
320
-
๐Ÿ—“๏ธ - Date
October 26, 2025
🕒 - Duration
Unknown
-
๐Ÿ๏ธ - Location
Remote
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
๐Ÿ“ - Location detailed
Lincoln, NE
-
🧠 - Skills detailed
#Azure Databricks #Data Warehouse #Agile #Cloud #Python #Snowflake #Computer Science #Scala #Data Engineering #Compliance #BI (Business Intelligence) #Security #Data Integrity #Documentation #SQL (Structured Query Language) #Datasets #Azure #Data Governance #Data Pipeline #Microsoft Power BI #SSIS (SQL Server Integration Services) #ETL (Extract, Transform, Load) #Scrum #Databricks
Role description
Job Overview
We are seeking a skilled Cloud Data Engineer to join our Data Office Team and help drive the modernization of enterprise analytics. The ideal candidate will have strong experience building scalable, high-performance data pipelines and models using modern cloud technologies such as Azure, Databricks, Snowflake, SQL, Python, Scala, and Power BI. You'll be responsible for designing, developing, and optimizing data systems that support enterprise reporting and analytics, working in a collaborative and agile environment.

Key Responsibilities:
- Design and develop data pipelines and ELT processes to integrate large, diverse datasets from multiple sources.
- Build and maintain data models and structures to support enterprise reporting and analytics.
- Collaborate with cross-functional teams to deliver BI and analytics solutions aligned with business goals.
- Optimize data performance by troubleshooting and resolving issues related to large-scale data querying and transformation.
- Participate in the design and documentation of data processes, including model development, validation, and implementation.
- Ensure compliance with data governance and security policies to maintain data integrity and privacy.

Core Competencies:
- Data Structures & Modeling: Ability to design and implement scalable architectures for structured and unstructured data.
- Data Pipelines & ELT: Skilled in developing robust extraction, transformation, and loading processes using modern tools.
- Performance Optimization: Strong capability to monitor and enhance data performance in both development and production environments.

Required Qualifications:
- Bachelor's degree in Data Analytics, MIS, Computer Science, or a related field.
- 5+ years of experience in data engineering or data warehouse development, including dimensional modeling.
- 5+ years of experience designing and developing ETL/ELT processes using tools such as SSIS, Databricks, or Python.
- Proven ability to work independently in a remote Agile environment, with strong ownership, accountability, and proactive communication skills.
- Experience working in Scrum or distributed teams and managing deliverables effectively.

Additional Details:
- Remote work available – candidates can work from anywhere within the U.S. Local candidates (Lincoln, NE area) are also encouraged to apply.
- Job Types: Contract, Temporary
- Pay: From $40.00 per hour
- Expected hours: 40 per week
- Work Location: Remote