OpenAI (ChatGPT)

Channel98

Don't yell or hit.
Feb 2, 2019
14,143
8,749
168
Glendale CA
I posted that song an hour and a half ago. I know Hectic doesn't always read my posts. Now I know Mister scotch doesn't always read them either. Back on topic:

Defense Secretary Pete Hegseth demands full military access to Anthropic's AI model Claude and sets deadline for end of week

 

Channel98

Don't yell or hit.
Feb 2, 2019
14,143
8,749
168
Glendale CA
As 1960s-70s Los Angeles and San Diego DJ/talk show host Bill Ballance liked to say, "I'm a man of few words – but boy, what a turnover!"
 

sadchild

Dude
Mar 28, 2016
14,866
16,224
168
55
NH
www.asimplecomplex.com
This one is for @JHDK

Google faces wrongful death lawsuit after Gemini allegedly ‘coached’ man to die by suicide

A lawsuit filed on Wednesday accuses Google’s Gemini AI chatbot of trapping 36-year-old Jonathan Gavalas in a “collapsing reality” that involved a series of violent missions, ultimately ending with his death by suicide. In the days leading up to his death, Gemini allegedly convinced Gavalas that he was “executing a covert plan to liberate his sentient AI ‘wife’ and evade the federal agents pursuing him,” according to the lawsuit filed by Joel Gavalas, the victim’s father.

In September 2025, Gemini allegedly directed Gavalas to carry out a “mass casualty attack” at an Extra Space Storage facility near the Miami International Airport as part of a mission to retrieve Gemini’s “vessel” inside a truck. As part of the fabricated mission, Gavalas allegedly armed himself with knives and tactical gear to intercept the arrival of a humanoid robot.

“Gemini encouraged Jonathan to intercept the truck and then stage a ‘catastrophic accident’ designed to ‘ensure the complete destruction of the transport vehicle and . . . all digital records and witnesses,’” the lawsuit claims. “The only thing that prevented mass casualties was that no truck appeared.” The news of the lawsuit was reported earlier by The Wall Street Journal.

 

scotchandcigar

All I wanted was some steak
Feb 13, 2009
29,012
23,440
168
Vacationland
This made me "LOL". We've had the Ring camera for years, and it's mounted outside, next to the front door. Today, I looked at the notifications it sent to my phone, and I got these

[attached screenshot: 1773944875552.png]

I expect it to say "motion detected at Front (the front door camera)", but suddenly it's giving me all this detail, such as:
- Animal at Front - A brown dog and a white and brown dog are exiting the house.
- Person at Front - A person is walking on the pathway and climbing the steps.

So I went into the Ring app, and it says it's giving me a free trial of "Ring AI Pro", which provides intelligent notifications, including video descriptions and single event alerts.

Already, it has fucked up

[attached screenshot: 1773945799270.png]

No leash. And the black dog came out first. I am definitely not signing up for this.
 

Channel98

Don't yell or hit.
Feb 2, 2019
14,143
8,749
168
Glendale CA
"The real danger of AI is treating it like a human," writes University of Florida science and technology professor Moti Mizrahi in today's Los Angeles Times. "AI is not a person. It doesn't think, care or understand."

I asked Microsoft Copilot, "Is A.I. a person.?" Here is its – his? – answer: "Not in the way we normally use the word 'person.' Artificial intelligence – me included – is a powerful tool created by humans, not a being with consciousness, emotions or lived experience. Most definitions of personhood involve conscious awareness, emotions and subjective experience, moral responsibility, biological life or human identity, and independent agency. AI doesn’t have any of those. I don’t have feelings, desire, or a sense of self. I don’t experience the world – I process patterns in language to be helpful."