

Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This role is for a Contract Data Engineer on a 12-month engagement (with a possible extension to 18 months), paying $49.00 - $64.00 per hour. Key skills include API integration, SQL, Snowflake, Databricks, and experience with BI tools such as Tableau. The position is remote, with the PST time zone preferred.
Country: United States
Currency: $ USD
Day rate: -
Date discovered: July 12, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: United States
Skills detailed: #Databases #Databricks #Data Storage #Data Quality #API (Application Programming Interface) #Database Management #Spark (Apache Spark) #Python #Data Governance #Visualization #Fivetran #Data Warehouse #Documentation #Scala #Programming #SQL (Structured Query Language) #Snowflake #PySpark #BI (Business Intelligence) #Database Design #Data Engineering #GCP (Google Cloud Platform) #Data Ingestion #Data Pipeline #Data Security #Airflow #Data Analysis #Cloud #Security #AWS (Amazon Web Services) #Azure #Tableau #Docker #Consulting #"ETL (Extract, Transform, Load)" #dbt (data build tool) #Compliance #Storage #Data Processing #REST (Representational State Transfer)
Role description
Dice is the leading career destination for tech experts at every stage of their careers. Our client, OSI Engineering, Inc., is seeking the following. Apply via Dice today!
A leading global technology company is looking for an experienced and highly skilled Contract Data Engineer. This role is focused on building robust data pipelines, primarily involving the integration of data from various APIs and managing critical data infrastructure. The successful candidate will work hands-on with Databricks, Snowflake, and/or relational databases, ensuring data is clean, reliable, and optimized for consumption by analytics tools like Tableau. This is a project-based contract position requiring someone who can ramp up quickly and deliver results. Qualified candidates are encouraged to apply!
Key Responsibilities:
Design, develop, and maintain scalable data ingestion pipelines to extract data from various third-party and internal APIs (REST, SOAP, etc.); an illustrative sketch of this kind of pipeline follows this list.
Implement efficient data transformation and loading processes (ETL/ELT) within the data platform.
Manage and optimize data storage and schemas in Snowflake and Postgres databases.
Utilize Databricks for data processing, transformation, and orchestration tasks.
Ensure data quality, accuracy, and integrity throughout the data pipelines.
Collaborate with data analysts and BI developers (particularly Tableau users) to understand data requirements and optimize data models for performance.
Monitor data pipelines and systems for performance issues, errors, and data discrepancies, implementing necessary fixes and improvements.
Develop and maintain technical documentation for data pipelines, processes, and data models.
Troubleshoot data-related issues and provide timely resolutions.
Implement best practices for data security and governance within the data platform.
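As a rough illustration of the API-to-warehouse work described above, the sketch below pulls paginated JSON from a hypothetical REST endpoint with requests and stages it in Snowflake via the Python connector. The endpoint URL, credentials, column names, and table names are all placeholder assumptions for illustration, not details taken from this posting.

```python
# Illustrative sketch only: endpoint, credentials, and table names are hypothetical
# placeholders, not details specified by this role.
import os
import requests
import snowflake.connector

API_URL = "https://api.example.com/v1/orders"  # hypothetical REST endpoint


def fetch_pages(url: str, token: str):
    """Yield records from a paginated REST API, with basic auth and error handling."""
    session = requests.Session()
    session.headers.update({"Authorization": f"Bearer {token}"})
    page = 1
    while True:
        resp = session.get(url, params={"page": page}, timeout=30)
        resp.raise_for_status()  # surface HTTP/auth errors early
        records = resp.json().get("results", [])
        if not records:
            break
        yield from records
        page += 1


def load_to_snowflake(records):
    """Bulk-insert parsed records into a Snowflake staging table."""
    conn = snowflake.connector.connect(
        account=os.environ["SF_ACCOUNT"],
        user=os.environ["SF_USER"],
        password=os.environ["SF_PASSWORD"],
        warehouse="LOAD_WH",
        database="RAW",
        schema="STAGING",
    )
    rows = [(r["id"], r["status"], r["amount"]) for r in records]
    cur = conn.cursor()
    cur.executemany(
        "INSERT INTO stg_orders (order_id, status, amount) VALUES (%s, %s, %s)",
        rows,
    )
    cur.close()
    conn.close()


if __name__ == "__main__":
    load_to_snowflake(fetch_pages(API_URL, os.environ["API_TOKEN"]))
```

In practice the staging load would typically be followed by ELT transformations inside the warehouse (for example with dbt or Snowflake tasks), with the raw insert kept as simple and idempotent as possible.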
Required Skills and Qualifications:
Proven experience as a Data Engineer, with a strong focus on API integration and database management.
Experience building data pipelines for API data ingestion, including handling authentication, error handling, and data parsing.
Strong proficiency in SQL and experience with database design, query optimization, and performance tuning in relational databases (e.g., Postgres).
Hands-on experience with Snowflake as a cloud data warehouse, including data loading, querying, and performance optimization.
Experience using Databricks for data processing, ETL/ELT, and pipeline orchestration (e.g., PySpark, notebooks); see the PySpark sketch after this list.
Proficiency in a programming language commonly used for data engineering (e.g., Python).
Experience working with cloud platforms (e.g., AWS, Azure, Google Cloud Platform) where Snowflake and Databricks are deployed.
Understanding of data warehousing concepts and best practices.
Experience supporting BI tools (like Tableau) by providing clean, structured, and performant data sources.
Excellent problem-solving skills and ability to work independently in a fast-paced environment.
Strong communication and collaboration skills.
Experience with specific ETL/orchestration tools (e.g., Airflow, Fivetran, dbt).
Familiarity with containerization technologies (e.g., Docker).
Experience in a contract or consulting role.
Knowledge of data governance and compliance standards.
Experience with BI tools like Tableau, including dashboard creation or assisting BI developers with visualization best practices related to data structure.
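To make the Databricks/PySpark expectation above more concrete, here is a minimal sketch of the kind of transformation the role describes: cleaning raw JSON and publishing an aggregated table that a Tableau data source could query. The paths, column names, and table names are assumptions for illustration only.

```python
# Illustrative sketch only: paths, column names, and table names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_transform").getOrCreate()

# Read raw JSON landed by an upstream ingestion job (hypothetical path).
raw = spark.read.json("/mnt/raw/orders/")

# Basic cleanup: deduplicate, cast types, and drop rows that fail a simple quality check.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("amount", F.col("amount").cast("double"))
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("order_id").isNotNull())
)

# Aggregate into a reporting-friendly shape for BI consumption.
daily = (
    clean.groupBy("order_date", "status")
         .agg(F.count("*").alias("order_count"),
              F.sum("amount").alias("total_amount"))
)

# Persist as a managed table (Delta by default on Databricks) for downstream dashboards.
daily.write.mode("overwrite").saveAsTable("analytics.daily_orders")
```

Pre-aggregating and typing the data this way is one common approach to keeping Tableau extracts and live connections performant, which is the kind of BI support the responsibilities above call for.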
Type: Contract
Duration: 12 months (with a possibility to extend up to 18 months)
Work Location: Remote (PST time zone preferred)
Pay Range: $49.00 - $64.00 per hour (DOE)