the weirdest part about DTF St. Louis to me as an STL native is the made-up neighborhood of Twyla. like… that place is as boroughed up as could be, y’all didn’t have to invent your own white-collar enclave lmao
sickos:
this was clearly written by the same staffer who updated Ahmadinejad's Wikipedia entry
i need to work on getting my data pipeline framework to a releasable state, for no other reason than that amanda will kill me if i keep talking to her about "my pipelines"
“in the announcement of his retirement, the representative did want his constituents to know that he is not, in fact, a big baby”
was this headline written by somebody Owens cut off in traffic or stiffed on a tip? they did the man hilariously dirty here, i love it
tux is considering which window he should jump out of
i can't wait to marry her
huffin' a whole can of copium 2nite
*did i stutter?*
Followed back — should be fixed!
this is the exact analogy I use when trying to describe to people the way it's shifted lots of engineers' jobs. when you garden, you're not physically growing the plant — it can do that on its own. your job is to maintain the environment and make sure things are growing in a healthy way
in case you're wondering — lawmakers weren't prescient, this just... used to happen
during the AIDS crisis, it was legal to buy the life insurance payout of patients (e.g. we'll give you $10,000 now for your $100,000 payout), and then use those earnings to lobby against HIV research
A final post for clarity — obviously the initial 5 paragraph request didn’t hit the 32K context window. This is the rest of the test query.
Poor sucker got about 120 paragraphs in before it crashed out.
No problem! That is, of course, a *wildly* rough benchmark that isn’t tracking actual GPU usage, so other queries may use more or less active model layers.
I don’t think that folks realize that a machine that could run a 2025 AAA game is probably enough to run a capable local model
for a cap-out reference, my most recent AlphaFold2 runs are very likely the heaviest load my home server has had in a while, pushing both cards up to ~175W each, bringing the whole shindig up to ~450W.
This is a pretty loose experiment, but on 2x RTX 3060 12GB:
Running gpt-oss:20b up to a 32k context window draws about 65W. Running New Vegas at 2560x1440 seems to draw about 80W.
Stats are on the blue line, on a machine drawing 75W at base:
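for anybody who wants to pull similar numbers: a minimal sketch of the arithmetic, not my actual tooling. the per-GPU watts would normally come from `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits` (real flags), but the sample readings and the 75W baseline below are just illustrative figures matching the posts:

```python
# Hedged sketch: estimate whole-machine draw from per-GPU readings plus
# an idle baseline. In practice the per-GPU numbers would come from:
#   nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits
# Here we parse a hardcoded sample so the snippet is self-contained.

SAMPLE_NVIDIA_SMI_OUTPUT = """\
32.5
33.1
"""  # two GPUs, watts each (illustrative values, ~65W combined)

BASELINE_W = 75.0  # machine's idle draw, per the post above

def total_draw(nvidia_smi_output: str, baseline_w: float) -> float:
    """Sum per-GPU power draws and add the idle baseline."""
    gpu_watts = [float(line) for line in nvidia_smi_output.splitlines() if line.strip()]
    return baseline_w + sum(gpu_watts)

print(round(total_draw(SAMPLE_NVIDIA_SMI_OUTPUT, BASELINE_W), 1))  # → 140.6
```

note this only counts GPU package power plus a fixed baseline — it won't catch CPU spikes, so it'll undercount during anything CPU-bound.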
that’s also one of the wildest things about local LLMs. i’ve had people ask if i’m concerned about running my own AI workstation and it’s like… *believe me*, gpt-oss for 30 minutes can’t hold a candle power-wise to my New Vegas binges
it is absolutely insane to me that i can run dual instances of AlphaFold2 on a machine that could barely crank out Borderlands 4 at 60fps without flickering
Hey! Feel free to send a DM 😄
one of the most important lessons that i've learned in my career is that dark mode is nice for day-to-day use, but light mode is *way* better for demos
also, it was a very nice opportunity to take some time and make the annotation viewer... not completely suck? it's still horrifying on mobile — but getting there. slowly.
this one is truly chicken noodle soup for the AInxiety-ridden soul — certainly one of the denser sets of annotations, but highly worth a read if you need something to feel reassured about being human.
a huge shout-out to @stevekrouse.com for pointing me to the original source!
tempted to get this framed
a weird case: Apple was found liable for an IP violation of a medical device company’s blood ox model, so for some watches, they legally can’t have the watch do the statistics — the raw data has to be sent to your phone, where they *are* allowed to run the model
this is why, sometimes, older watches can get updates that provide new data — they build a new statistical model that can use the old watch’s sensors to approximate something new
ooooo! not a medical person, but a stats person who has worked on health data projects.
we don’t really know, because it’s a proprietary model. the strategy is to stuff the watch full of sensors, like the accelerometer, blood ox, mic, etc. and then fit a model that guesses the rate
this is my favorite driving thought. like, damn — humanity pulled off a single uninterrupted road connecting me and like 90% of everybody else on the continent
The “$22 + tax” in the link preview makes this feel like the blue shark’s PPV OnlyFans content
it’s not resting hate face either — if i tried to pet her right now she’d 100% hiss at me