

Senior Data Engineer
Featured Role | Apply directly with Data Freelance Hub
This is a part-time Senior Data Engineer contract running for more than 6 months, paying $110,000 - $137,694 annually. Key skills include Informatica, Azure Data Lake, Python, SQL, and big data technologies. The position is fully remote.
Country: United States
Currency: $ USD
Day rate: $625.88
Date discovered: July 1, 2025
Project duration: More than 6 months
Location type: Remote
Contract type: Unknown
Security clearance: Unknown
Location detailed: Remote
Skills detailed: #Programming #SQL (Structured Query Language) #Agile #Data Science #Data Engineering #Data Processing #Hadoop #Scripting #Shell Scripting #Data Pipeline #SQL Queries #Automation #Data Lake #Cloud #Spark (Apache Spark) #Big Data #Database Design #Informatica #Datasets #Database Querying #Python #Databases #Storage #ETL (Extract, Transform, Load) #Scala #Azure
Role description
Job Overview
We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will design, implement, and maintain robust data pipelines and architectures that support our data analytics initiatives. You will work closely with data scientists and analysts to ensure that our data infrastructure is optimized for performance and scalability. The ideal candidate will have a strong background in database design and experience with big data technologies.
Duties
Design and develop scalable data models to support business requirements.
Implement ETL processes using tools such as Informatica to extract, transform, and load data from various sources.
Manage and optimize Azure Data Lake for efficient storage and retrieval of large datasets.
Write efficient SQL queries to manipulate and analyze data stored in relational databases.
Utilize Python for scripting and automation of data processing tasks (see the brief sketch after this list).
Develop shell scripts to automate routine tasks and enhance system performance.
Collaborate with cross-functional teams in an Agile environment to gather requirements and deliver solutions that meet business needs.
Monitor data pipelines for performance issues and implement necessary optimizations.
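The listing does not include any of the team's actual code, so as a rough illustration of the kind of extract-transform-load, SQL, and Python automation work these duties describe, here is a minimal standard-library sketch. Every name in it (sales.csv, the sales table, its columns) is hypothetical; in practice this step would run through Informatica or against Azure Data Lake rather than an in-memory SQLite database, but the shape of the work is the same.
```python
"""Minimal ETL sketch: extract rows from a CSV, apply a small transform,
load them into a SQL table, and run an aggregate query.
All names (sales.csv, the sales table, its columns) are hypothetical."""
import csv
import sqlite3


def extract(csv_path):
    """Yield raw rows from the source CSV as dictionaries."""
    with open(csv_path, newline="") as f:
        yield from csv.DictReader(f)


def transform(rows):
    """Normalize types and drop rows with missing amounts."""
    for row in rows:
        if not row.get("amount"):
            continue
        yield row["region"].strip().upper(), float(row["amount"])


def load(records, conn):
    """Create the target table and bulk-insert the cleaned records."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (region TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales (region, amount) VALUES (?, ?)", records)
    conn.commit()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")           # stand-in for a real warehouse
    load(transform(extract("sales.csv")), conn)  # assumes a local sales.csv exists
    # Example of the kind of SQL analysis the role calls for:
    for region, total in conn.execute(
        "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY 2 DESC"
    ):
        print(region, round(total, 2))
```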
Qualifications
Proven experience in database design and management.
Proficiency in Informatica or similar ETL tools.
Familiarity with Azure Data Lake or other cloud-based storage solutions.
Strong programming skills in Python, with the ability to write clean, efficient code.
Solid understanding of SQL for database querying and manipulation.
Experience with server management and maintenance.
Knowledge of shell scripting for task automation is a plus.
Familiarity with big data technologies such as Hadoop or Spark is advantageous (a short Spark sketch follows this list).
Ability to work effectively in an Agile team setting, demonstrating strong communication skills.
Experience with Azure Fabric is a plus.
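For candidates unfamiliar with Spark, the snippet below is a minimal PySpark sketch of the sort of distributed aggregation the role might involve. The file paths, column names, and output location are assumptions made up for illustration, not part of this employer's stack.
```python
# Hypothetical PySpark sketch: roll up a raw orders dataset by region.
# Paths and the (order_id, region, amount) schema are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("order-rollup-example").getOrCreate()

# Read a CSV of raw orders with a header row, letting Spark infer types.
orders = spark.read.csv("raw/orders.csv", header=True, inferSchema=True)

# Aggregate revenue and order counts per region.
revenue_by_region = (
    orders.groupBy("region")
          .agg(F.sum("amount").alias("total_amount"),
               F.count("*").alias("order_count"))
)

# Write the curated result as Parquet for downstream analytics.
revenue_by_region.write.mode("overwrite").parquet("curated/revenue_by_region")

spark.stop()
```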
Join us as we leverage the power of data to drive insights and innovation within our organization. We look forward to your application!
Job Types: Part-time, Contract
Pay: $110,000.00 - $137,694.00 per year
Benefits:
Flexible schedule
Schedule:
8-hour shift
Work Location: Remote