Compunnel Inc.

Database Developer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Database Developer in Westlake, TX, requiring 8+ years of database development experience, PL/SQL expertise, and knowledge of AWS. Contract length and pay rate are unspecified. Onsite work is required 2 weeks per month.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
November 18, 2025
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Dallas, TX
-
🧠 - Skills detailed
#Lambda (AWS Lambda) #Security #Informatica #Data Integrity #SQL (Structured Query Language) #RDBMS (Relational Database Management System) #IAM (Identity and Access Management) #Data Ingestion #GIT #AWS (Amazon Web Services) #Data Modeling #Python #ETL (Extract, Transform, Load) #Cloud #Kafka (Apache Kafka) #NoSQL #Oracle Exadata #Microsoft Power BI #Compliance #RDS (Amazon Relational Database Service) #Scala #Oracle #Data Security #Deployment #PostgreSQL #Data Governance #Computer Science #BI (Business Intelligence) #EC2 #Batch #Data Profiling #S3 (Amazon Simple Storage Service) #Jenkins #Tableau #Data Analysis #Databases #ThoughtSpot
Role description
Job Title: Database Developer
Location: Westlake, TX (2 weeks/month onsite)
Must have: PL/SQL; Co-Pilot, AWS, and Python are nice to have.
Job Description:
The Skills and Expertise You Bring
• Bachelor's degree in computer science, engineering, or equivalent
• 8+ years of relevant experience in database development
• 8+ years of hands-on RDBMS expertise (Oracle, Exadata, PostgreSQL)
• Extensive hands-on experience with ETL/ELT tools (Informatica, SnapLogic)
• Expertise in data analysis, data profiling, and data modeling
• Knowledge of data warehousing methodologies and concepts
• Build and maintain robust ETL pipelines to integrate data from multiple sources into OLAP data stores, ensuring data integrity and consistency
• Bring an innovative spirit in search of efficiencies, process improvements, technical improvements, and other ways to add value to the organization, with a focus on operations improvement
• Extensive experience with PL/SQL, database stored procedures, and performance tuning
• Proficiency in Python for data movement/transformation, including development of classes and object-oriented code (see the sketch below)
• Complex batch cycle orchestration (tools such as Control-M, Autosys, or Crontab)
• Implement best practices in data security, role-based access control, and data masking to maintain compliance and data governance standards
• Knowledge of cloud platforms and services (AWS: IAM, EC2, S3, Lambda, RDS)
• Knowledge of data streaming tools such as Kafka and Kinesis
• Experience designing scalable data models optimized for data ingestion and analytics requirements, including slowly changing dimensions (SCD)
• Experience developing and automating deployments with Git, Jenkins, and CI/CD processes
Skills that are an advantage:
• Knowledge of non-relational databases such as graph, NoSQL, and time-series databases
• Working knowledge of analytics and BI front-end tools such as Power BI, Tableau, and ThoughtSpot
• Financial services experience
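As a rough illustration of the Python data movement/transformation skill referenced above, here is a minimal sketch of an object-oriented ETL class. It uses the standard-library sqlite3 module as a stand-in for the Oracle/PostgreSQL sources named in the posting; all table, column, and class names are hypothetical and not taken from the role.

```python
import sqlite3


class OrderPipeline:
    """Illustrative ETL sketch: extract from a source table, aggregate, load into a target."""

    def __init__(self, source: sqlite3.Connection, target: sqlite3.Connection):
        self.source = source
        self.target = target

    def extract(self):
        # In practice this would query Oracle/PostgreSQL via a driver such as cx_Oracle or psycopg2.
        return self.source.execute("SELECT id, amount, region FROM orders").fetchall()

    def transform(self, rows):
        # Aggregate order amounts by region; a stand-in for OLAP-oriented shaping of source data.
        totals = {}
        for _order_id, amount, region in rows:
            totals[region] = totals.get(region, 0.0) + amount
        return list(totals.items())

    def load(self, rows):
        # Write the aggregated rows into the target table and commit the transaction.
        self.target.executemany(
            "INSERT INTO orders_by_region (region, total_amount) VALUES (?, ?)", rows
        )
        self.target.commit()

    def run(self):
        self.load(self.transform(self.extract()))


if __name__ == "__main__":
    # Self-contained demo with in-memory databases and sample data.
    src = sqlite3.connect(":memory:")
    tgt = sqlite3.connect(":memory:")
    src.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
    src.executemany(
        "INSERT INTO orders VALUES (?, ?, ?)",
        [(1, 100.0, "US"), (2, 250.0, "EU"), (3, 75.0, "US")],
    )
    tgt.execute("CREATE TABLE orders_by_region (region TEXT, total_amount REAL)")
    OrderPipeline(src, tgt).run()
    print(tgt.execute("SELECT * FROM orders_by_region").fetchall())
```

In a production setting the same pattern would presumably run against the Oracle/PostgreSQL and AWS services listed above and be scheduled by an orchestrator such as Control-M or Autosys, per the description.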