Much of human progress throughout history has been about creating efficiency, which is essential for the continued growth and complexity of our civilization. If we stop making our products, processes, and systems more efficient, innovation halts and civilization stagnates. One essential aspect of improving efficiency is reducing redundancy.

By the nature of the physical world, there will always be some redundancy. However, two factors are simultaneously acting to reduce it. First, services have been giving way to products: once something is created as a product, effort shifts to distributing it rather than recreating it again and again. Second, distribution itself is becoming more efficient with globalization, decreasing overall redundancy at a global level.

It's interesting to note that products and services are built on knowledge. Since the dawn of human history, someone, somewhere in the world, has created new knowledge, and it has taken a while for that knowledge to reach everyone on the planet. We can term this the process of knowledge distribution. Many times, because distribution was slow, others figured out the same knowledge on their own before receiving it from the original source. For instance, 10,000 years ago, when knowledge distribution was extremely slow, major human settlements around the planet discovered agriculture independently.

However, knowledge distribution has been continually improving. Writing, the printing press, radio, and television were all technological shifts that accelerated it.

The internet and search engines have greatly boosted this ancient process. A recipe blog published by a chef in Venezuela can be read by a housewife in Bangladesh. A Japanese developer stuck on a regex problem can look up a solution penned by a Serbian coder. Search engines eliminate much redundancy by allowing people to instantly find and consume existing knowledge. As a side effect, they also shift focus toward building upon that knowledge. The less redundancy there is, the less effort is wasted reinventing the wheel.

LLMs like ChatGPT and GitHub Copilot take this one step further. Their understanding of the world's online knowledge is deeper than Google's, so they answer queries more precisely. While Google points to high-level internet destinations (owned by humans) most likely to contain the answer, ChatGPT delivers the answer directly.

LLMs are similar to Google in that they have learned answers from the same sources, but they provide those answers with far less user effort. They also reduce redundancy on the supplier side; for instance, 100 websites don't need to exist to convey the same piece of information. Consider searching for "best places to visit in Spain": dozens of near-identical listicles compete for that query. The human labor, marketing effort, and server compute that went into creating and maintaining these duplicate information portals must be enormous.

Of course, LLMs won't completely replace search engines, as there will be areas where directly accessing web pages will be more relevant. However, for many other use cases, search engines might start to feel time-consuming.