According to OpenAI, their contract with the US DoW locks in current law, "even if those laws or policies change in the future".
Our legal analysis, with Virgil Law CEO Luke Versweyveld, shows that this is almost certainly incorrect.
www.answer.ai/posts/2026-0...
It'd be interesting to compare this to other companies that have grown as rapidly. While some of the blame is probably on AI coding, I think we'd struggle to find many companies that handled this sort of rapid growth without a bit of downtime.
For real though, this is a good feature.
You just know we're hours away from LinkedIn posts about "We saved $100k on our CI/CD pipeline by installing Claude code on our production server and now fix bugs faster than ever via WhatsApp"
code.claude.com/docs/en/remo...
This annoys me for 2 petty reasons unrelated to toppings.
1. 'overtaken' only makes sense if you are comparing this to the same chart in a previous year
2. 'Intergenerational' implies that this is some change in that cohort of people, and suggests these same 18-24 year olds will not flip as they age.
This, but for all software. 5 mins to check changes means at most 96 changes in an 8-hour day, even at 100% efficiency. You either need to be very confident in your changes or willing to accept failure. This applies to tests, manual and automated, as well as reviewing user flows. Speed matters.
Large and in charge? Not a chance. Medium and full of tedium.
I think the first time they did I'd probably be kinda impressed. Only the first time though.
I continue to be baffled by how clunky the Pandas syntax feels. You can do a lot of stuff but it always feels like such a hack. Years and years of using it, and it still feels like I am guessing half the time whether what I am doing is going to work.
One thing I wonder about is whether LLMs will result in more or less tiny packages like the famous left-pad.
When it was removed from npm it was 17 lines of code: trivial to implement, but previously "not worth the effort".
Is this going to make code supply chains stronger or weaker? Time will tell.
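For scale, the entire left-pad idea fits in a few lines. A from-memory sketch (not the original npm source, and the function name here is just illustrative):

```javascript
// Pad a string on the left with a fill character until it reaches
// the target length. Strings already at or past the length are
// returned unchanged.
function leftPad(str, len, ch = " ") {
  str = String(str);
  let pad = "";
  // Build up the padding until the combined length hits the target.
  while (pad.length + str.length < len) {
    pad += ch;
  }
  return pad + str;
}

console.log(leftPad("5", 3, "0")); // "005"
console.log(leftPad("abc", 2));    // "abc" (already long enough)
```

Modern JavaScript has `String.prototype.padStart` built in, which is part of why a package like this reads as redundant today.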
Arrogant Ignorance, the original AI
Knuckles falling in Sonic Blast
*begins intensive interpretative dance in front of a pile of lead*
works on my machine
Any plans to launch on other platforms (Linux in particular)? Looks slick in any case and will give it a go on my laptop.
The stumbling block that has led so many weaker men to start podcasts and talk for 4 hours instead of writing words.
I mean, the dude lives in a trash can. Are we sure this is an accurate statement?
As of right now what would you say is the 'best' model someone with a reasonably recent GPU could self-host and use with open code? Plus, what delta do you think exists between that model and something like opus 4.5 in terms of day to day code output?
If you can substitute "hungry ghost trapped in a jar" for "AI" in a sentence it's probably a valid use case for LLMs. Take "I have a bunch of hungry ghosts in jars, they mainly write SQL queries for me". Sure. Reasonable use case.
"My girlfriend is a hungry ghost I trapped in a jar"? No. Deranged.
I read the same HN comment and imagined cow-ork to be an off brand centaur.
Likely a combination of terrible marketing advice (all content = more SEO) and laziness. I'd even be willing to bet that half of these were never even seen by their authors and were just posted by the company's marketing team on their behalf.
there's vomit stderr already, claude's spaghetti
he's nervous, but assures it's production ready
who decided to call it Secret Santa when Nondisclosure Claus was right there
I wish my house had a huge walk
And once again Valve finds a way to take more of my money.
I can't help but feel this whole humanoid robot thing is folks trying to inflate a new bubble before the AI one pops
Reply THINKBOI for my secret ChatGPT prompt to turn that 2000 word think piece into seven instantly postable insights guaranteed to get you engagement.