Can You Trust AI to Search for Information?

One of the most common ways people use AI chatbots like ChatGPT is for information search. I hear this over and over again in my workshops and consulting sessions:

“I mostly use ChatGPT instead of Google when I need a quick answer.”

And it makes sense. Unlike traditional search engines, ChatGPT:

  • Doesn’t flood you with ads or sponsored results

  • Offers concise, human-readable answers

  • Feels fast, convenient—and, perhaps surprisingly, trustworthy

People use ChatGPT for everything from comparing supplements (“Chat recommended collagen from this brand”) to finding courses and even AI consultants. I’ve personally had clients reach me based on ChatGPT’s recommendations.

Web Access Makes It Even More Powerful—In Theory

Since November 2023, the paid version of ChatGPT has included web browsing: the model can search the internet and include sources with links in its answers. In theory, this should reduce hallucinations and improve accuracy.

In practice, however, limitations remain.

Why Does AI Still Hallucinate?

Despite improvements, ChatGPT still “makes things up” more often than users expect. There are a few reasons for this:

  • It doesn’t always use the search function. Even when a question clearly requires current or factual data, ChatGPT may answer from its internal model instead of querying the web.

  • It gets facts wrong. A small but telling example: ask ChatGPT to list the left-bank tributaries of the Vistula River in Poland. It frequently (and confidently) names the San, Raba, and Soła, all of which are actually right-bank tributaries.

  • It’s a moving target. A year ago, asking ChatGPT “Who is Julia Krysztofiak-Szopa?” produced surreal answers—actress, journalist, or manufacturing company owner. Today, it gives a correct answer, pulled from actual internet sources—at least in the paid version.

These examples highlight a key point: AI is a generative model, not a fact-checking engine. It is trained to generate plausible-sounding language, not necessarily to provide verified truth.

So, Should You Use ChatGPT for Search?

Yes—but critically.

ChatGPT is an excellent tool for fast idea generation, summarization, and orientation in a topic. But it should never be treated as an authoritative source. Users must apply critical thinking and cross-check important facts—especially in business, education, or healthcare contexts.
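For technical readers, part of that cross-checking habit can even be scripted. The sketch below is only an illustration of the principle, not a prescribed workflow: it takes a factual query (the Vistula example from above, chosen here purely for demonstration) and pulls up matching Wikipedia articles through the public MediaWiki search API, so you can read the underlying source instead of relying on a generated summary alone.

```python
import requests

def wikipedia_search(query, limit=5):
    """Return titles of Wikipedia articles matching the query,
    using the public MediaWiki search API."""
    resp = requests.get(
        "https://en.wikipedia.org/w/api.php",
        params={
            "action": "query",
            "list": "search",
            "srsearch": query,
            "srlimit": limit,
            "format": "json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return [hit["title"] for hit in resp.json()["query"]["search"]]

if __name__ == "__main__":
    # Before accepting a chatbot's claim (e.g., about the Vistula's
    # tributaries), look up the primary articles and read them yourself.
    for title in wikipedia_search("Vistula tributaries"):
        print(title)
```

The same pattern works with any reference source that offers a search endpoint; the point is simply to keep a verification step between the generated answer and the decision you base on it.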


In my AI trainings, I teach professionals how to use tools like ChatGPT effectively and safely—maximizing their potential while avoiding the most common pitfalls.

Book an intro call to learn how I can help your team build reliable, informed AI practices.
