General Discussion
In reply to the discussion: So, I'm gonna stir the pot a bit... don't be too harsh...

mike_c
Except for the occasional prodigy, that's the way humans learn, too. I learned my profession from the work of others, and I pay attention to new developments achieved by others so I can mimic their successes in my own work. I developed whatever sense of aesthetics I have by viewing and listening to the work of others. I judge AI performance by comparing it to what I know about the work of other humans. That's how we all learn-- AIs simply automate the process and accomplish it faster, often on the fly. If you detest AIs learning from human accomplishments, then you have to detest most human education as well, because that's how it works. We're not all innate prodigies, and even those who are still depend upon education-- "scraping" the work of others-- for the rest of their knowledge base.
I have no quibble with training LLMs on human examples. That's exactly how we train ourselves, too. It's not surprising that we developed a tool that mimics our own learning process.
That said, remember that current AIs don't "learn" anything. If an LLM says that an apple is red, that might be true, but it doesn't mean the AI knows what an apple is or what red looks like. All it does is predict the statistical likelihood of "red" being the correct response when the textual context involves the "color" of "apple." Does a wrench know anything about bolts? Can a ladder steal understanding of "elevation" from other tools?
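That "statistical likelihood" point can be sketched with a toy next-word model. This is only an illustrative bigram frequency count over an invented corpus (an assumption for clarity; real LLMs use neural networks over subword tokens, not raw word counts), but it shows how "red" can come out on top without the model having any concept of apples or color:

```python
from collections import Counter

# Hypothetical training corpus standing in for scraped human text.
corpus = [
    "the apple is red",
    "the apple is red",
    "the apple is green",
    "the sky is blue",
]

# Count which word follows the context word "is".
follows = Counter()
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        if prev == "is":
            follows[nxt] += 1

# "Probability" of "red" is pure frequency -- no understanding involved.
total = sum(follows.values())
print(follows["red"] / total)  # 0.5
```

The model would answer "red" simply because "red" followed "is" most often in its training data; change the corpus and the "knowledge" changes with it.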