General Discussion
In reply to the discussion: "So, I'm gonna stir the pot a bit... don't be too harsh..."
genxlib (6,135 posts)
10. Except this time
The buggy whips are people.
We are not ready to have vast swaths of our populace rendered economically obsolete.
I could see us hitting double-digit unemployment in the next couple of years. Rough times when it goes over 10%; societal breakdown when it reaches 20% or beyond.
Recommendations: 13 members have recommended this reply.
89 replies
#76 · Ms. Toad · 23 hrs ago: It is absolutely NOT better for AI to tell you about your symptoms than Google
#35 · highplainsdem · Yesterday: Those AI overviews are stealing traffic from the websites they stole the information from, and the...
#47 · highplainsdem · Yesterday: It still hallucinates. All genAI models do. It can hallucinate at any time, and for that reason its...
#66 · paleotn · 23 hrs ago: And current LLMs do exactly the same thing as Google search or YouTube algorithms. You just don't realize it.
#20 · highplainsdem · Yesterday: If you mean generative AI, the kind most hyped now, it's badly flawed tech based on stolen intellectual property...
#29 · highplainsdem · Yesterday: It works - to the extent it works when it's mindless and will always hallucinate - only because of IP theft.
#40 · highplainsdem · Yesterday: The AI companies who felt they had a right to take everyone else's IP have been quick to scream if...
#84 · highplainsdem · 22 hrs ago: I'm in favor of creatives owning their intellectual property, and that right being protected. It's as...
#87 · highplainsdem · 22 hrs ago: Legal judgments aren't always ethical, as everyone here is aware. Creatives and those who support...
#27 · Escurumbele · Yesterday: The problem is not a fork or a knife, the problem is who has it in their hand... An assassin with a knife is very...
#34 · Martin68 · Yesterday: I agree. I've been saying this about computers for decades. However, I think most of us agree that AI should be...
#45 · tinrobot · Yesterday: True, AI by itself is benign. The companies controlling it, however, are not.
#60 · Soul_of_Wit · Yesterday: I sometimes stir a pot in the kitchen and then walk away until dinner is served
#55 · Scrivener7 · Yesterday: I do agree with you there. One of my smartest friends, a tech professional, thinks like Joinformill.
#59 · highplainsdem · Yesterday: AI can be rejected - and should be, by ethical, smart people who have any choice in the matter.
#63 · highplainsdem · 23 hrs ago: It's genAI being hyped and used most widely. Which is why people need to know about how harmful...
#54 · JustABozoOnThisBus · Yesterday: Not like a fork: like a cruise missile with a spork instead of a warhead.
#67 · highplainsdem · 23 hrs ago: And using AI harms human intelligence. See this thread on yet another article about that:
#69 · MineralMan · 23 hrs ago: Sadly, few people are fully able to tell when AI provides facts or fallacies.