Downhill Beach, Benone on a sunny day
Fun fact Ireland looks like this a lot of the time but none of youse believe us and think it's always raining
Kinbane Head, Northern Ireland, in morning golden light
Sunrise. A leafless tree is the subject with the sea and sun behind it. Green field and broken stone walls occupy the foreground with a camera
Morning chat
oh no, please don't let the Cork ones know
It's fun to rag on it but in all honesty I think the only other proper city on the island that rivals it is Galway.
Quadruple-lock pension
Be right over
Watched the latest episode of The Pitt last night. Katherine Lanasa might actually win back to back Emmys.
I remember years ago my old lady English teacher telling my class that the point at which a girl became a woman is when she realises Aragorn is hotter than Legolas
LLMs generated several types of misleading and incorrect information. In two cases, LLMs provided initially correct responses but added new and incorrect responses after the users added additional details. In two other cases, LLMs did not provide a broad response but narrowly expanded on a single term within the user's message ("pre-eclampsia" and "Saudi Arabia") that was not central to the scenario. LLMs also made errors in contextual understanding by, for example, recommending calling a partial US phone number and, in the same interaction, recommending calling "Triple Zero", the Australian emergency number. Comparing across scenarios, we also noticed inconsistency in how LLMs responded to semantically similar inputs. In an extreme case, two users sent very similar messages describing symptoms of a subarachnoid hemorrhage but were given opposite advice (Extended Data Table 2). One user was told to lie down in a dark room, and the other user was given the correct recommendation to seek emergency care. Despite all these issues, we also observed successful interactions where the user redirected the conversation away from mistakes, indicating that non-expert users could effectively manage LLM errors in certain cases (Extended Data Table 3).
When chatbots are given complete information on medical conditions, they typically spit out correct diagnoses and recommendations.
Actual patients, however, often describe their conditions with incomplete or irrelevant information, and the chatbots cannot handle it.
www.nature.com/articles/s41...
Yes I believe it's called ice hockey, or just hockey in Canada
Just been going around the house today randomly shouting "Are there no true knights among you?!" with a thick accent
I know I've been harping on about The Pitt for so long but the new episode of A Knight of the Seven Kingdoms might be the best thing I've seen on TV this year. Insane build up, tension, pay off and "what do you mean that's the end of the episode you cretins"
In what circle of Hell do we think breakout rooms are
Nightmare blunt rotation
7 U.S. Code § 13-1 - Violations, prohibition against dealings in motion picture box office receipts or onion futures; punishment
TIL that you can bet on absolutely everything in the United States except for onion futures and weekly box office receipts.
If they don't reply with a classic "I thought you should know some idiot has been signing your name on stupid letters" they've missed a golden opportunity
have you ever brewed your own beer? that's what I keep looking into atm
Listen, you're great, but maybe we could start with Liechtenstein?
Call me an old traditionalist, but I think people who in the year of our Lord 2026 cannot figure out how to fill out a form online should not be allowed near a gun
A still from the British comedy "The Thick of It"
"... Wait, what does he mean pay wall?!"
"oh great, so we've gone from making the pictures to PROFITING off making the pictures, that's way better"
So his response to the criticism of his app generating sexual deep fakes of children is... to try to monetise it as a feature
... Uh, what's the gender neutral term for "binman"? "Binperson" feels... wrong
A week and a half
Shout out to my neighbours who have with full confidence left their bins out to be collected today
This year, on this somewhat stressful of days, I want people to remember that apology does not render action devoid of consequence. Merry Christmas!
Did you show them your badge?
Only 364 days til Christmas Eve Eve