ia800809.us.archive.org/29/items/Bau...
Chess is the best use of duolingo
996 exists to capture the life force that you would put in to your own side projects
LMK if you need help writing the initial design doc. I think this would make waves in the neuro space, too: see alex.merose.com/neuro-dbs/, github.com/NeurodataWit...
You would make a lot of people's heads explode in the climate & weather + bio spaces if you remade Xarray.dev in Rust like this. We'd use it at OA, to say the least.
The things people offer as examples of LLM disruption in academia often strike me as insights about the way things already work. E.g. “Now we can mass-produce mediocre papers” or “Now the value of an RA is not that they do the grunt work but that they bring a different perspective.”
How I’ve been feeling lately, given the news
youtu.be/frAEmhqdLFs?...
I just realized I've spent most of my career working remote.
I’m migrating my town community grange away from WordPress. The version they use is so old that it doesn’t have a media export capability. While I build the static site, I had Claude write a big JS function I could paste into the Chrome console to download all the media files.
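A minimal sketch of that kind of console script (the selectors and file extensions are assumptions; adjust for whatever the old theme actually renders):

```javascript
// Sketch: collect media URLs from the current page and trigger downloads.
// Pure helper, so it's easy to sanity-check: de-duplicate candidate URLs
// and keep only likely media files.
function collectMediaUrls(urls) {
  const mediaExt = /\.(jpe?g|png|gif|pdf|mp3|mp4)$/i;
  return [...new Set(urls)].filter((u) => mediaExt.test(u));
}

// In the Chrome console (DOM required), something like:
// const urls = collectMediaUrls(
//   [...document.querySelectorAll('a[href], img[src]')]
//     .map((el) => el.href || el.src)
// );
// for (const u of urls) {
//   const a = document.createElement('a');
//   a.href = u;
//   a.download = '';  // hint to save rather than navigate
//   document.body.appendChild(a);
//   a.click();
//   a.remove();
// }
```

Same-origin files download directly; cross-origin links may just open in a tab, so this works best run on the site's own media pages.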
I make a lot of the best changes to my side projects when doing the dishes.
Eisenhower
My job is to figure out how to allocate $20 worth of tokens per month.
The hardest part of using Perceiver transformers is remembering the i before e rules.
Do I say goodbye?
Everything is on autopilot for the night. Is there anything else you'd like to work on, or shall we wrap up this session?
First time I had a session with Claude that had a natural wrap up.
With AI coding agents turning us all into wizards, the biggest problem we have is not any problem per se, but which problem is worthy of our time.
This is one major reason why I’m excited to work at openathena.ai. Fundamental scientific challenges are the ultimate benchmarks.
I believe itβs been the latter for a long time.
en.wikipedia.org/wiki/Bullshi...
banger
load-bearing @xkcd.com
This is awesome. Congrats, Ted!
My cat has never been happier.
One of my favorite math jokes:
What does the “B” stand for in “Benoit B. Mandelbrot”?
“Benoit B. Mandelbrot”
Will take a deeper look later; thanks for sharing
Forgive me, I'm not sure how to fork your code. You may want to add this contextmanager to cast to bfloat16 to make things faster:
```
import torch

with torch.autocast("cuda", dtype=torch.bfloat16):
    ...  # torch code in, say, a fwd pass
```
Reading the script quickly, I have a few ideas about how we could scale this up.
Iβd love to work on this with you. Have you heard of the TPU Research Cloud?
sites.research.google/trc/about/
I think this may address our compute power limitations.
Wow, I didn’t realize! One day, I’ll make my fractal sculpture. I have a dream of making a series of sculptures named after classic neural networks, like “MNIST digits” or “hotdog/not-hotdog”.
You can use Xarray-SQL in Python. Just `uv pip install xarray-sql`. Enjoy.
github.com/alxmrs/xarra...