iTOTEM Analytics

Database Engineer

⭐ - Featured Role | Apply directly with Data Freelance Hub
This role is for a Database Engineer & Certification Project Leader in Houston or Brownsville, Texas, with a contract length of "unknown" and a pay rate of "unknown." Key skills include SQL, Python, Azure tools, and experience with ISO 27001/SOC 2 compliance.
🌎 - Country
United States
💱 - Currency
$ USD
-
💰 - Day rate
Unknown
-
🗓️ - Date
March 9, 2026
🕒 - Duration
Unknown
-
🏝️ - Location
On-site
-
📄 - Contract
Unknown
-
🔒 - Security
Unknown
-
📍 - Location detailed
Brownsville, TX
-
🧠 - Skills detailed
#Scala #Data Security #Data Cleaning #ADF (Azure Data Factory) #Storage #Programming #Azure Data Factory #Spatial Data #SQL (Structured Query Language) #Data Modeling #Azure #Compliance #Documentation #CLI (Command-Line Interface) #ETL (Extract, Transform, Load) #AI (Artificial Intelligence) #Indexing #Complex Queries #Azure SQL #GitHub #Scripting #Cloud #Computer Science #SQL Server #Logging #Data Engineering #Data Pipeline #Debugging #Schema Design #Azure Blob Storage #Python #SQL Queries #Data Science #Automation #Data Exploration #Security #Datasets
Role description
Job Title: Database Engineer & Certification Project Leader

Location: Houston or Brownsville, Texas offices.

About iTOTEM: iTOTEM turns complex capital project and operations data into hyper-local insights for regulated sectors. Our Local Logic AI-enabled engine integrates diverse sources (surveys, census, spatial, sentiment, models) to deliver verified, relatable impact intelligence by neighborhood or electoral boundary. At the heart of iTOTEM is the Canonical, the system that keeps our local impact insights consistent, reliable, and trustworthy no matter the industry, place, or time period. You'll build the data pipelines that supply the Canonical and drive Local Logic, taking messy real-world information and turning it into clean, properly tracked datasets ready for our models, reports, regulators, and decisions that affect real communities.

iTOTEM solves a critical problem for regulated companies in sectors like energy, utilities, mining, forestry, and large infrastructure projects. These organizations often struggle to show how their investments create real local benefits, such as jobs, nearby business spending, community support, and innovation, especially at the neighborhood level. Without clear proof, regulators and stakeholders see higher risk, which can delay approvals, stall funding, or reduce company value. Equally important, regulated companies are challenged by the lack of the hyper-local intelligence needed to make decisions that truly align with the unique values, priorities, and requirements of their host communities. Our Local Logic and Canonical system fixes both issues by turning complex data into trustworthy, hyper-local insights that not only make benefits visible and defensible to regulators and investors, but also empower companies to tailor their approaches for stronger community alignment, reduced opposition, and more sustainable, community-supported projects.
What you’ll do:
• Write, optimize, and maintain complex SQL queries across large, messy, multi-source datasets (census, vendors, surveys, admin, spatial files, CSV/Excel).
• Design and align schemas to support Canonical consistency in entity types, relationships, and definitions.
• Build and maintain scalable ELT/ETL pipelines (Azure Data Factory, Python scripting, Azure Blob Storage) for ingestion, cleaning, validation, transformation, and entity matching of raw/third-party data.
• Handle incomplete, inconsistent, or poorly structured data: cleaning, validating, and transforming it while documenting assumptions, history, and quality checks.
• Monitor and troubleshoot Azure SQL Server performance, indexing, query efficiency, and scaling as datasets grow.
• Work closely every day with the Database Engineer, Data Science team, Full Stack Developer, and Canonical leader to ensure high-quality data feeds Local Logic and downstream tools.
• Use AI-powered programming tools (Claude, GitHub Copilot, Cursor, or similar) every day to speed up scripting, debugging, data exploration, validation logic, and pipeline building.
• Lead and project manage iTOTEM's certification process for ISO 27001 (Information Security Management System) and SOC 2 (or equivalent), including planning timelines, coordinating gap assessments, implementing required controls (e.g., access, encryption, logging in data pipelines), gathering evidence, working with auditors/external consultants, and ensuring ongoing compliance/maintenance.
• Follow and help enforce data security, reliability, and best practices, with documentation of schemas, transformations, sources, decisions, and compliance artifacts.

What we’re looking for:
• Solid SQL proficiency (complex queries, optimization, joins, performance tuning).
• Strong Python skills for data cleaning, automation, validation, entity matching, and scripting.
• Understanding of data modeling, pipelines (ELT/ETL), warehousing, and handling multiple data sources and formats.
• Hands-on experience with Azure tools: Azure SQL Server, Azure Data Factory (ADF), Azure Blob Storage.
• Comfort working with messy or incomplete data (CSV, Excel, spatial/flat files) and basic performance concepts (indexing, schema design, query efficiency).
• Good documentation and communication skills; eagerness to learn and grow in a database-focused role with AI support.
• Basic understanding of cloud data infrastructure, security, and reliability practices.
• Experience or strong interest in project managing compliance/certification efforts (ISO 27001, SOC 2, or similar standards), including scoping, timelines, cross-team coordination, and audit preparation.
• Experience using AI coding assistants (Claude CLI, Copilot, etc.) to speed up data work.
• Familiarity with LLM use in data tasks (e.g., entity matching, validation, or augmentation).
• Exposure to spatial data or GIS concepts.

Experience:
• University/college training in a relevant field (Computer Science, Data Engineering, etc.) or equivalent professional experience.
• 5 years of hands-on experience in database engineering, pipelines, or related work (intermediate level; candidates with certification/compliance project experience encouraged).

Why iTOTEM: Work on live, real-world North American projects. Shape the direction of Local Logic. We've gone narrow and deep in AI, focusing intensely on hyper-local intelligence rather than broad general tools. We're building a scalable intelligence layer that can expand across industries. This is a rare ground-floor opportunity to join early, shape the core systems, and grow with a focused, high-potential AI company that's solving meaningful problems in regulated sectors. Join a mission-driven team. If you enjoy defining product direction, orchestrating delivery, and want stable, high-impact work over hype-cycle intensity, this is a great fit.
Words we live by:
• See the people behind the numbers.
• We’re data science. No lab coats required.
• Focus hyper-locally to support progress globally.
• Be relatable. Be interesting. Be share-worthy.
• Build win-wins and quick wins.