Had there been (years ago) a consistent way to have chat history and multi-client login (like mobile and desktop), I think I'd be using IRC rather than matrix/discord/slack/zulip.
Is there a local-first CI yet? It'd be nice if there was a cryptographic way to ensure CI checks were actually run on someone else's computer. Someone surely is puttin' this "on da chain"
I sometimes just feel like I'm here training the AIs with stuff like this. Not sure how I feel about it.
I wrote some words on how one could write iOS app UI Tests in rust. simlay.net/posts/2026-0... Could have just as easily titled it "hacks on hacks". Also, I should make my site more mobile friendly.
I published a release of coreaudio-rs today. github.com/RustAudio/co.... This release adds tvOS and visionOS and fixes a leak from some of the objc-rust glue. This is a minor release so it'll take a bit to get into cpal and then eventually something like bevy. #rust
The general thing I've learned about the ecosystem is that it's a bit of the blind leading the blind in the github issues. You find yourself with some niche issue but all the other people commenting don't give any extra context as they're not usually engineers.
I've not tried opencode yet, but Zed's Agentic thing works okay. There's still some quirks. Like I told it to add a function to a file but it blew the whole thing away.
I've found reasonable performance on a 7 year old 1080Ti on linux (this required compiling llama.cpp) and my 2021 M1 Pro Max.
I've tried Ollama, LM Studio and now llama.cpp. Ollama's pretty easy and performant as an application but doesn't expose much of the details. LM Studio gives a lot more customization but is slower and llama.cpp is the most configurable and hardest to use but it is performant.
It's becoming clear to me that AI is mostly cool if I have it and no one else does.
I avoid iot stuff because the companies behind them frequently get acquired and the software becomes abandonware. With vibe coded projects, you get that just without VC's, customers and ~5 years of actual support.
Claude et al. have really turned so much software into fast fashion.
That dumb $20k walking robot could do this. Just walking around random fields with no friends in Santa outfit looking sad.
I've been trying out Ollama to self-host LLMs on my archlinux box. The thing I've noticed is how many non-engineers are creating/commenting on the GitHub issues. It kinda feels like the blind leading the blind.
Not that you should be taking feature requests, but there are many times I've wanted to see the dependency tree with a feature flag on a crate toggled. It'd be neat if one could show which of the sub-deps have a feature flag that changes the sub-sub-deps as well.
I have definitely done `cargo tree | vim -` so that I could "fold"/delete various dependencies I'm not trying to look at. It's still quite cumbersome of course.
I use Signal to text people far more than I use the native SMS/iMessage client and I've yet to see anyone use the "Stories" feature.
Yeah. I definitely had a job at a place writing Python/JavaScript; there was a project that was a perfect fit for Rust, and I was one of the first people to write Rust there.
In an early iteration of the internet, we built web APIs allowing engineers to write software against it and automate stuff. These days, the best (least bad) way to do this is wire up playwright and an AI agent and pray or something.
Yeah, I tried to use ROS in 2019 wanting to write components in rust. It's so big and so difficult to administrate, I gave up.
I've not used Bazzite but I use steam on archlinux and then SteamLink on my apple TV (attached to a $30 goodwill TV) and some generic bluetooth controllers. This rube goldberg machine works quite well.
A very old tube of silicone caulk that exploded out the side.
Visiting parents and their aging house is an exercise in yak shaving. Replacing a bulb in a yard light, I bumped the siding and it fell off, fixing the siding required some sealant, all the sealant was old and it exploded out the side. Going depth first would mean a dark work site with broken glue.
From the ~1 hour I played with the Apple container stuff, I was pleased. It worked almost identically to Docker. I think it's got better support for container-in-container stuff, so in theory one might be able to get local macOS CI to work. I'm tempted to give this a crack; I don't love Go.
I upgraded when this trash came out and then downgraded. From what I can tell, the only real benefit of Tahoe is the apple container (github.com/apple/contai...) being supported first class.
rust-analyzer now fully uses the new trait solver! rust-analyzer.github.io/thisweek/202...
I really need to keep a notepad of future tweet/skeet/toots. I don't need my recently learned lessons to trigger repercussions in my current situation.
"Watching" smaller dependencies on GitHub has taught me so much useful random crap. Basically any vim plugin or non-mainstream repo I use, I follow.
There's also the fact that tokio spawn takes a future that impl's Send docs.rs/tokio/latest..., but JsFuture is !Send - docs.rs/wasm-bindgen...
There's a bunch of rough edges with Rust browser stuff. Like, use (or have a dependency that uses) `tokio::time::Instant::now()` in the browser and you find a nice panic, since the standard library stubs `std::time::Instant::now()` as unimplemented on wasm32-unknown-unknown.
When constructing tuple structs, you can use curly braces with integer keys. These are identical:
let x = MyType(1, "hello", true);
let x = MyType { 1: "hello", 0: 1, 2: true };
Days since I've superglued my fingers together: 0