
Perplexity: When Health Search Becomes Health Infrastructure

March 23, 2026
The future of clinical care: human insight amplified by AI in a balanced, symbiotic partnership.

Brandon Amaito, Contributing Editor

Perplexity is entering one of the most consequential corners of consumer health technology with Perplexity Health, a new product that promises to combine medical records, lab results, and wearable data into one conversational layer. On the surface, that sounds like the natural next step for an AI search company. In reality, it signals something bigger. Consumer health AI is moving beyond answer retrieval and into the far riskier business of interpreting a person's fragmented health information in one place.

That shift matters because fragmentation has long been the opening that consumer platforms wanted to exploit. Patients have electronic records, but often across multiple portals. Lab data may sit in one service, prescription history in another, and fitness or sleep data in a third. A 2017 ASTP blog post summarizing ONC research on patient access found that patient portals often provide only a snapshot of the information people need and that both patients and health systems are burdened by a fragmented records process. Perplexity Health is betting that AI can finally make that fragmented landscape feel coherent.

The market opportunity is real

That is a plausible bet. The administrative and cognitive burden of assembling health information remains far too high for consumers, especially for people managing multiple conditions, multiple specialists, or multiple devices. ASTP’s health IT and exchange overview makes clear that federal policy has been pushing toward secure electronic sharing and access across care settings for years, including better patient access to records and stronger interoperability. A product that can gather that data and explain it in plain language is chasing a real need rather than an invented one.

That is also why the Perplexity launch should be taken seriously instead of dismissed as another AI feature drop. The company is not merely adding a symptom checker. It is trying to become the interpretive layer on top of clinical records, biomarker data, and consumer-generated health information. That is a materially different role. Once an AI product begins synthesizing personal health history, recent lab values, activity trends, and medication context, it starts to look less like search and more like infrastructure for decision support, even if the company continues to frame it as educational.

The commercial logic is obvious. Search results alone are becoming commoditized. Personalized interpretation is harder to replicate and more defensible. But the closer an AI platform gets to individualized health guidance, the less forgiving the market should be about transparency, boundaries, and evidence discipline.

The privacy line is where this gets harder

This is where many consumer health AI launches still speak in a language of convenience while the real issue is governance. The most important fact about health-data-connected apps is not that they can access records. It is what happens after the records leave a covered entity. In HHS guidance on the access right, health apps, and APIs, the department explains that once health information is received at an individual’s direction by an app that is neither a HIPAA covered entity nor a business associate, that information is no longer subject to HIPAA’s protections. That is not a technical footnote. It is the core structural issue in consumer health apps.

The same HHS guidance also says a covered entity generally cannot refuse to send a patient's electronic health information to a third-party app designated by that individual merely because of concerns about the app's downstream use or disclosure practices. That means interoperability can expand access while also shifting responsibility. The old healthcare assumption that sensitive health data necessarily lives inside the HIPAA framework becomes less reliable the moment a consumer-directed app enters the picture.

That does not make products like Perplexity Health inherently reckless. It does mean that the trust standard should be much higher than marketing language about encryption and user control. The Federal Trade Commission’s health privacy guidance and its Health Breach Notification Rule resources make clear that many health apps and connected services fall under a different federal regime, one that requires consumer notification after certain breaches and is particularly relevant when apps draw health information from multiple sources. A platform that connects records, wearables, and other health signals is not just offering convenience. It is assuming stewardship over one of the most sensitive consumer data assemblages in the economy.

The real challenge is clinical boundary management

The harder issue is not privacy alone. It is epistemic discipline. Health questions feel personal even when the right answer is uncertain, context-dependent, or not knowable from available data. That is precisely why generative AI products become more dangerous as they become more persuasive. A general search result can be wrong and still feel provisional. A synthesized answer based on records, trends, and clinical language can feel authoritative even when it is incomplete.

That is why the World Health Organization’s guidance on large multimodal models for health is so relevant here. WHO argues that health uses of generative AI require governance, transparency, human oversight, and careful attention to context-specific risks. The problem is not only hallucination. It is misplaced confidence, unclear provenance, and the temptation to let fluent systems stand in for more rigorous clinical reasoning than they actually possess.

The National Institute of Standards and Technology's AI Risk Management Framework makes a complementary point: trustworthiness has to be built into the design, development, use, and evaluation of AI systems. In consumer health AI, that principle should translate into concrete questions. What sources are privileged when personal data conflicts with general medical literature? How are outdated records handled? How is missing information surfaced rather than silently filled in? When does the system stop answering and redirect to urgent care or clinician review? How visible is the chain from source material to synthesized response?

Those questions matter more than whether the interface feels smooth. Consumer health AI will not earn durable trust by sounding intelligent. It will earn trust by making uncertainty legible and by keeping the line between education and quasi-diagnosis unmistakably clear.

A product category is being born in public

Perplexity is not alone in chasing this category, and that may be the strongest signal of all. Health AI is moving toward products that do not just search the web or summarize a chart. They attempt to unify records, apps, devices, and premium literature into a single personal health layer. That category will be attractive because it solves a real frustration. It will also be dangerous because it can overstate coherence where the underlying data remains partial, noisy, or out of date.

The next winners in this space are unlikely to be the products with the most dramatic claims about autonomy. They will be the ones that behave more like disciplined health infrastructure than like consumer AI theater. That means auditable sourcing, clinically credible escalation guardrails, strong deletion and disconnection controls, conservative use of data, and clear explanations of what legal protections apply once information moves into the app environment.

Perplexity Health deserves attention for exactly that reason. The launch is not merely another example of AI entering healthcare. It is a sign that the consumer front door to health information is being rebuilt around personalized synthesis. The opportunity is obvious. So is the risk. Once an AI product starts connecting the dots across a person’s records, labs, and wearables, the market should stop evaluating it as a better search engine and start evaluating it as a new layer of health infrastructure.