Building Healthcare AI That Doctors Actually Use
Most healthcare AI never makes it out of the lab. Here's how we're deploying AI into real dental and medical practices — the architecture, compliance hurdles, and lessons learned from building OralMind.
There's a gap between AI research and clinical practice. A wide one. Academic papers show 95% accuracy on curated datasets. Real clinics show something different: frustrated staff, workflow disruptions, and tools that sit unused after the pilot ends.
We've been building healthcare AI at Tacavar. Not research prototypes — production systems that dentists and doctors use daily. Our first product, OralMind, is an AI-powered dental workflow platform currently in pre-launch.
Here's what we learned about deploying AI into real clinical workflows.
The Problem: AI That Doesn't Fit
Most healthcare AI fails for the same reason: it's built backwards. Engineers start with the model, not the workflow. They optimize for accuracy metrics, not clinical utility.
We talked to dentists before writing a single line of code. The feedback was consistent:
- "I spend 2+ hours daily on documentation." — Charting, treatment notes, insurance codes, patient communication. All manual. All repetitive.
- "X-ray analysis is subjective." — Two dentists might read the same radiograph differently. Early caries detection varies by practitioner.
- "Patients don't understand their diagnosis." — Explaining treatment needs is time-consuming. Visual aids help, but most offices don't have good tools.
The opportunity wasn't "replace the dentist." It was: remove the paperwork burden, standardize diagnostic support, and improve patient communication.
The Solution: OralMind Architecture
OralMind is built around three core workflows:
1. AI-Powered X-Ray Analysis
The model processes dental radiographs (bitewings, periapicals, panoramics) and flags potential issues:
- Interproximal caries (cavities between teeth)
- Periapical pathology (infection at root tips)
- Bone loss patterns (periodontal disease)
- Existing restoration defects (failed fillings, crowns)
Key design principle: the AI suggests, the dentist decides. Every finding is presented as a "flag for review" — never a diagnosis. The dentist confirms, rejects, or modifies each finding before it enters the patient record.
2. Automated Clinical Documentation
The system generates draft treatment notes based on:
- X-ray findings (confirmed by dentist)
- Periodontal charting data
- Treatment codes (CDT codes for insurance)
- Patient history and contraindications
The dentist reviews and edits before signing. This cuts documentation time by an estimated 40% based on our pilot testing.
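To make the "AI drafts, dentist signs" flow concrete, here is a minimal sketch of a template-based note generator. The `Finding` fields and CDT codes shown are illustrative assumptions, not our production schema — the key property is that only dentist-confirmed findings ever reach the draft.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    tooth: str        # universal tooth number, e.g. "14"
    condition: str    # e.g. "interproximal caries"
    cdt_code: str     # hypothetical CDT code for the insurance claim
    confirmed: bool   # set True only after the dentist reviews the flag

def draft_treatment_note(findings: list[Finding]) -> str:
    """Assemble a draft note from dentist-confirmed findings only.

    The output is explicitly a draft: the dentist edits and signs it
    before anything enters the patient record.
    """
    confirmed = [f for f in findings if f.confirmed]
    if not confirmed:
        return "DRAFT - no confirmed findings; no treatment indicated."
    lines = ["DRAFT - pending clinician review and signature", ""]
    for f in confirmed:
        lines.append(f"Tooth #{f.tooth}: {f.condition} (CDT {f.cdt_code})")
    return "\n".join(lines)

note = draft_treatment_note([
    Finding("14", "interproximal caries", "D2392", confirmed=True),
    Finding("19", "periapical pathology", "D3310", confirmed=False),
])
```

Note that the rejected finding on tooth #19 never appears in the draft — the model's raw output is not a record of care.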
3. Patient Communication Tools
The AI generates patient-friendly explanations:
- Visual overlays on X-rays (highlighting problem areas)
- Plain-language descriptions (no jargon)
- Treatment urgency indicators (routine, soon, urgent)
- Cost estimates based on insurance fee schedules
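As a sketch of how a flagged condition might map to an urgency tier and a plain-language sentence, consider the toy lookup below. The condition-to-tier mapping is a made-up illustration — real triage rules would come from clinical advisors, not engineers.

```python
# Hypothetical mapping from a flagged condition to a patient-facing
# urgency tier (routine / soon / urgent), for illustration only.
URGENCY = {
    "periapical pathology": "urgent",   # possible infection at a root tip
    "interproximal caries": "soon",     # cavity likely to grow if untreated
    "restoration defect": "routine",    # monitor and schedule normally
}

DESCRIPTIONS = {
    "urgent": "needs attention as soon as possible",
    "soon": "should be treated in the next few weeks",
    "routine": "can be handled at a regular visit",
}

def patient_summary(condition: str) -> str:
    """Return a jargon-free, one-sentence explanation for the patient."""
    tier = URGENCY.get(condition, "routine")
    return f"This finding {DESCRIPTIONS[tier]}."
```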
This helps patients understand their diagnosis and accept treatment recommendations. Better case acceptance = better health outcomes.
Technical Deep-Dive: How We Deploy AI Into Workflows
Data Pipeline Architecture
Healthcare data is messy. X-rays come from different manufacturers (Dexis, Schick, Vatech) with varying formats, resolutions, and metadata. Our pipeline:
- Ingestion Layer — DICOM and JPEG ingestion via secure SFTP or direct API integration with practice management software (Dentrix, Eaglesoft, Open Dental).
- Preprocessing — Normalization, contrast enhancement, and artifact removal. Each X-ray type (bitewing, PA, pano) gets different preprocessing parameters.
- Model Inference — Ensemble of CNN models trained on annotated radiographs. Each model specializes in one pathology type (caries, perio, endo).
- Post-processing — Confidence thresholding, anatomical localization (tooth numbering), and finding aggregation.
- Output — Structured JSON findings + visual overlays sent to the dentist's dashboard for review.
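The five stages above can be sketched end to end as follows. The model stubs, threshold value, and field names are assumptions for illustration — in production each "model" is a specialized CNN and the threshold is tuned per pathology type.

```python
import json

# Assumed confidence cutoff; findings below it are dropped in post-processing.
CONFIDENCE_THRESHOLD = 0.60

def run_models(image: dict) -> list[dict]:
    """Stand-in for the ensemble: each specialist model emits raw findings."""
    # Faked outputs for illustration; a real pipeline calls one CNN per
    # pathology type (caries, perio, endo) on the preprocessed image.
    return [
        {"condition": "interproximal caries", "tooth": "14", "confidence": 0.87},
        {"condition": "bone loss", "tooth": "30", "confidence": 0.41},
    ]

def postprocess(raw: list[dict]) -> list[dict]:
    """Confidence thresholding + aggregation into reviewable flags."""
    kept = [f for f in raw if f["confidence"] >= CONFIDENCE_THRESHOLD]
    for f in kept:
        f["status"] = "flagged_for_review"  # never auto-entered as a diagnosis
    return kept

def analyze(image: dict) -> str:
    """Ingestion/preprocessing elided; output is the dashboard JSON."""
    findings = postprocess(run_models(image))
    return json.dumps({"image_id": image["id"], "findings": findings})

result = json.loads(analyze({"id": "bw-001", "type": "bitewing"}))
```

The low-confidence bone-loss finding is filtered out before it ever reaches the dentist's dashboard, which keeps review queues short and trust high.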
HIPAA Compliance Considerations
This isn't optional. Healthcare AI handles PHI (Protected Health Information). Here's what we built:
- Encryption at rest and in transit — AES-256 for stored data, TLS 1.3 for all API calls.
- Access controls — Role-based permissions. Dentists see their patients. Hygienists see assigned patients. Admin staff see billing data only.
- Audit logging — Every access, modification, and export is logged. Logs are immutable and retained for 6 years.
- BAA agreements — Business Associate Agreements with all cloud providers (AWS, Vercel, dashScope for LLM calls).
- Data minimization — We only store what's necessary. X-rays are processed and can be deleted after analysis if the practice chooses.
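One pattern worth showing for the audit-logging requirement is hash chaining: each log entry includes the hash of the previous one, so any tampering breaks the chain and is detectable. This is a minimal sketch with illustrative field names, not our production logger (which also ships entries to write-once storage).

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only audit log; entries are hash-chained for tamper evidence."""

    def __init__(self) -> None:
        self.entries: list[dict] = []

    def record(self, user: str, action: str, resource: str) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {"user": user, "action": action, "resource": resource,
                 "ts": time.time(), "prev": prev_hash}
        # Hash the entry body (before the hash field is added).
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)

    def verify(self) -> bool:
        """Recompute the chain; any edited or reordered entry fails."""
        prev = "0" * 64
        for e in self.entries:
            if e["prev"] != prev:
                return False
            body = {k: v for k, v in e.items() if k != "hash"}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != e["hash"]:
                return False
            prev = e["hash"]
        return True

log = AuditLog()
log.record("dr.smith", "view", "patient/123/xray/bw-001")
log.record("dr.smith", "edit", "patient/123/note/draft-7")
```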
We also completed a third-party HIPAA security audit before launch. Not because we had to — because trust matters.
Model Training on Domain-Specific Data
Generic image models don't work on dental X-rays. The domain is too specialized. We:
- Collected 50,000+ annotated radiographs — Sourced from dental schools and partner practices. Each image reviewed by 2+ board-certified dentists.
- Trained from scratch — Didn't fine-tune ImageNet models. Built custom CNN architectures optimized for radiographic features.
- Validated on held-out test sets — 20% of data reserved for testing. No overlap with training data.
- Measured clinical accuracy — Not just AUC/ROC. We track sensitivity (catch all pathology) vs. specificity (don't flag healthy teeth).
Result: 89% sensitivity, 94% specificity on interproximal caries detection. Better than general dentists on average, but we don't claim to replace clinical judgment.
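For readers less familiar with these metrics, sensitivity and specificity fall straight out of a confusion matrix. The counts below are invented to illustrate the arithmetic — they are not our validation data.

```python
def sens_spec(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Compute sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # share of true pathology that gets flagged
    specificity = tn / (tn + fp)  # share of healthy teeth left unflagged
    return sensitivity, specificity

# Example: 89 of 100 diseased surfaces caught; 940 of 1000 healthy left alone.
sens, spec = sens_spec(tp=89, fn=11, tn=940, fp=60)
```

The trade-off is the whole game: pushing sensitivity up (catch everything) tends to push specificity down (more false flags for the dentist to dismiss), so the operating point has to be chosen with clinicians, not just from an ROC curve.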
Lessons Learned: What Doesn't Work
1. Don't Automate Bad Workflows
Early on, we tried to automate the entire charting process. Dentists hated it. The AI missed nuance — patient anxiety, financial constraints, medical history that affects treatment planning.
Pivot: AI drafts, human finalizes. The dentist remains in control. The AI is an assistant, not a replacement.
2. Integration Is Harder Than the Model
We spent 60% of development time on integrations:
- Practice management software APIs (some are... vintage)
- Digital X-ray sensor drivers
- Insurance clearinghouses for claim codes
- E-signature platforms for treatment consent
The model was the easy part. Making it work in a real office — that's where the complexity lives.
3. Explainability Matters More Than Accuracy
A dentist won't trust a black box. We built explainability into every prediction:
- Heatmaps showing which pixels influenced the prediction
- Confidence scores for each finding
- Similar cases from the training set ("this looks like...")
- Citations to clinical guidelines supporting the finding
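The heatmap idea can be sketched in a model-agnostic way with occlusion sensitivity: mask one region at a time, re-score the image, and attribute the score drop to that region. (Gradient-based methods like Grad-CAM are the usual choice for CNNs; occlusion is shown here only because it needs no framework. The toy `score` function is an assumption standing in for a real model.)

```python
def score(image: list[list[float]]) -> float:
    """Toy 'caries model': responds to bright pixels in the image."""
    return sum(sum(row) for row in image) / (len(image) * len(image[0]))

def occlusion_heatmap(image: list[list[float]], patch: int = 2) -> list[list[float]]:
    """Mask each patch, re-score, and record the score drop per pixel."""
    base = score(image)
    h, w = len(image), len(image[0])
    heat = [[0.0] * w for _ in range(h)]
    for y in range(0, h, patch):
        for x in range(0, w, patch):
            masked = [row[:] for row in image]  # copy, then zero one patch
            for dy in range(y, min(y + patch, h)):
                for dx in range(x, min(x + patch, w)):
                    masked[dy][dx] = 0.0
            drop = base - score(masked)  # how much this patch drove the score
            for dy in range(y, min(y + patch, h)):
                for dx in range(x, min(x + patch, w)):
                    heat[dy][dx] = drop
    return heat

# Bright 2x2 "lesion" in the top-left corner of a 4x4 image.
img = [[1.0, 1.0, 0.0, 0.0],
       [1.0, 1.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0],
       [0.0, 0.0, 0.0, 0.0]]
heat = occlusion_heatmap(img)
```

Rendered as a colored overlay on the radiograph, the hot region gives the dentist a direct answer to "why did it flag this?"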
Transparency builds trust. Trust drives adoption.
4. Regulatory Approval Takes Time
OralMind is currently classified as a "clinical decision support" tool — not a diagnostic device. This means we don't need FDA 510(k) clearance (yet). But we're careful about our claims:
- We say "flags potential issues" — not "diagnoses cavities"
- We say "supports treatment planning" — not "replaces dentist judgment"
- We say "draft documentation" — not "automated medical records"
Language matters. Regulatory strategy matters more.
Getting Started: Checklist for Healthcare AI Adoption
If you're building healthcare AI, here's what we'd recommend:
- Start with workflow, not technology. Shadow clinicians. Understand their day. Find the pain points that actually matter.
- Build for augmentation, not replacement. The best healthcare AI makes clinicians better, not obsolete.
- Compliance is a feature, not a burden. HIPAA, SOC 2, FDA — bake these into your architecture from day one.
- Explainability > accuracy. A 90% accurate model that clinicians trust beats a 95% black box.
- Plan for integration hell. Healthcare IT is fragmented. Budget time and resources for APIs, drivers, and legacy systems.
- Get clinical advisors early. Work with dentists and doctors who will challenge your assumptions and keep you honest.
What's Next
OralMind enters beta with 5 partner practices in Q2 2026. We'll publish outcomes data after 90 days of real-world use:
- Time saved on documentation (target: 40% reduction)
- Case acceptance rates (target: 15% increase)
- Diagnostic consistency (target: 25% improvement)
- Dentist satisfaction scores (target: 4.5/5)
We'll share the results — good and bad. Building in public means accountability.
Healthcare AI has enormous potential. But potential doesn't help patients. Deployed systems do. That's what we're building.
Learn More About OralMind
OralMind is Tacavar's AI dental workflow platform — currently in pre-launch. Join the waitlist or read the full product announcement.