In a significant shift, Google has officially discontinued its “What People Suggest” feature, which aimed to provide users with health advice sourced from individuals sharing similar experiences. This decision arrives amidst increasing scrutiny regarding the safety and reliability of health-related information generated by artificial intelligence. Sources familiar with the matter confirmed the move, which Google described as part of a broader effort to simplify its search interface.
The Rise and Fall of “What People Suggest”
Launched to give users access to first-hand medical insights, “What People Suggest” let individuals draw on the experiences of others navigating similar health challenges. When the feature was announced at a New York event in March 2025, Google’s then Chief Health Officer, Karen DeSalvo, extolled its potential to combine expert medical information with real-life experiences, presenting a dual approach to health queries.
DeSalvo noted, “While people come to search to find reliable medical information from experts, they also value hearing from others who have similar experiences.” The feature aimed to curate diverse perspectives from online discussions, enabling users to glean insights tailored to their specific health conditions. Initially available on mobile in the US, it was positioned as a significant new tool for people seeking health information.
Concerns Over Safety and Reliability
Despite Google’s optimistic messaging around “What People Suggest,” the feature faced backlash, particularly over the accuracy of the health advice it disseminated. An investigation by the Guardian earlier this year revealed alarming instances of dangerous misinformation in Google’s AI-generated health summaries. These summaries, which reach 2 billion users monthly, were shown to contain misleading content, raising red flags among health experts and the public alike.
In response to the findings, Google attempted to reassure users by highlighting the links provided in its AI summaries to reputable sources. Shortly thereafter, the company began removing AI Overviews for certain medical queries, acknowledging the potential risks associated with the feature.
Clarifying Google’s Position
Following the discontinuation of “What People Suggest,” a Google representative stated that the decision was not driven by safety concerns but was part of a broader simplification of its search results. They maintained that the feature had been discontinued months earlier and that the company continues to work to surface reliable health information through various channels, including forums that offer personal insights.
Yet, when pressed for details on where this rationale was communicated publicly, the spokesperson pointed to a blog post that notably did not mention the feature. This lack of clarity has only intensified scepticism towards Google’s commitment to user safety in health-related matters.
Looking Ahead: Future Health Initiatives
As Google prepares for its next “Check Up” event, where executives, including the newly appointed Chief Health Officer Michael Howell, will discuss upcoming AI research and technological innovations, questions linger about the company’s strategy in the health sector. The focus will likely shift towards more robust partnerships and reliable solutions, addressing health challenges without relying on crowdsourced input that may compromise user safety.
Why it Matters
The discontinuation of “What People Suggest” marks a critical juncture at the intersection of technology and health. As digital platforms increasingly influence health decisions, ensuring the reliability of the information they surface is paramount. Google’s move to simplify its approach underscores the need for tech companies to balance innovation with user safety, particularly in an arena as sensitive as health. The implications of this decision resonate beyond Google, serving as a cautionary tale for other tech giants navigating the complex landscape of health information.