- cross-posted to:
- hackernews@lemmy.smeargle.fans
- hackernews@derp.foo
I guess we all kinda knew that, but it’s always nice to have a study backing your opinions.
Before we had Google, we had AltaVista, and before that we had directories like Yahoo. Maybe we should consider going back. With the help of AI (I know…), it seems feasible to keep up with the ever-growing volume of content.
You can’t really go back. Those old engines worked on more naive algorithms against a significantly smaller pool of websites.
The more modern iteration of AltaVista/AOL/Yahoo has been aggregation sites like Reddit, where people still post and interact with the site to establish relevancy. Even that's been enshittified, but it's a far better source than some basic web crawler that just scans website text and metadata for the word "Horse" and returns a big listicle of results weighted by the number of link-backs.
That system was gamed decades ago and is almost trivial to undermine today. Never mind how hard you'd have to work to recreate the baseline index tables those old engines built up over decades of operation.
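To make concrete why that old scheme is so easy to game, here's a minimal sketch of the naive ranking described above: match pages whose text contains the keyword, then order them by inbound-link count. All the URLs and numbers are hypothetical, and real engines of that era were more elaborate than this.

```python
# Hypothetical crawl results: page text plus a count of inbound links.
pages = {
    "farm.example/horses": {"text": "Horse care and horse feed tips", "linkbacks": 12},
    "news.example/derby":  {"text": "Derby results: the winning horse", "linkbacks": 57},
    "spam.example/buy":    {"text": "horse horse horse buy now!!!",    "linkbacks": 3},
    "cars.example/ev":     {"text": "Electric vehicle reviews",        "linkbacks": 90},
}

def naive_search(query: str, index: dict) -> list[str]:
    """Return URLs whose text mentions the query, ranked by link-backs."""
    q = query.lower()
    hits = [(meta["linkbacks"], url)
            for url, meta in index.items()
            if q in meta["text"].lower()]
    # Highest link-back count first -- the entire "relevance" signal.
    return [url for _, url in sorted(hits, reverse=True)]

print(naive_search("horse", pages))
# → ['news.example/derby', 'farm.example/horses', 'spam.example/buy']
```

Note that the keyword-stuffed spam page still makes the list, and because the only ranking signal is link-backs, a link farm pointing at it would push it straight to the top.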