• madjo@feddit.nl

    Pretty much all of the AI tools available now have been shown to hallucinate, even when they start with an internet search.

    I’ve had AI tools spit out real-looking URLs that led to 404 pages, because they had hallucinated those links. It’s a place to start your research, to maybe refine your questions, but I wouldn’t trust it much with the actual research.

    An LLM, a large language model, which is what an AI tool like Mistral is, doesn’t really use knowledge. It predicts what the next text is likely to be, based on the data it was trained on. It doesn’t think, it doesn’t reason; it just predicts what the next words are likely going to be.
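
    To make that concrete, here’s a toy sketch of that prediction loop. The word table and its probabilities are invented for illustration; a real LLM does the same “pick the likeliest continuation” step with a huge neural network over subword tokens instead of a hand-written lookup table.

    ```python
    # Hypothetical bigram table standing in for trained model weights.
    # Each entry maps a word to the probability of possible next words.
    bigram_probs = {
        "the": {"cat": 0.5, "dog": 0.3, "<end>": 0.2},
        "cat": {"sat": 0.6, "ran": 0.3, "<end>": 0.1},
        "sat": {"<end>": 1.0},
    }

    def generate(start: str, max_tokens: int = 10) -> list[str]:
        tokens = [start]
        while len(tokens) < max_tokens:
            # No knowledge, no reasoning: just pick the likeliest next word.
            nxt = max(bigram_probs[tokens[-1]].items(), key=lambda kv: kv[1])[0]
            if nxt == "<end>":
                break
            tokens.append(nxt)
        return tokens

    print(generate("the"))  # ['the', 'cat', 'sat']
    ```

    A continuation can be statistically likely and still factually wrong, which is all a hallucination is.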

    It doesn’t even understand text; that’s why so many of them claimed that there were just 2 Rs in “strawberry”. It doesn’t treat text as text.
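
    Under the hood, the input is chopped into subword tokens and mapped to integer IDs before the model ever sees it. The split and the IDs below are made up for illustration (real tokenizers differ), but they show why counting letters is hard for a model:

    ```python
    word = "strawberry"
    print(word.count("r"))  # 3 -- trivial when you actually have the characters

    # Hypothetical subword split and token IDs, purely illustrative:
    pieces = ["str", "aw", "berry"]
    token_ids = [1042, 707, 15717]  # this is all the model receives
    # Nothing in [1042, 707, 15717] encodes how many 'r's are inside each
    # piece, so the model guesses from training data instead of counting.
    ```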

    You can use it to rewrite a text for you, perhaps even to summarize one (though there’s still a possibility of hallucinations there), but I wouldn’t ask it to do research for you.