Bro just accept the L and quit while you still can
Edit: I just saw the thread 💀
Developer, 11 year reddit refugee
Yes I do! It’s a pretty great overview that isn’t extremely math-heavy.
The book is “Deep Learning for Coders with Fastai and PyTorch: AI Applications Without a PhD”
I have a book on learning Pytorch, this XKCD is in the first chapter and implementing this is the first code practice. It’s amazing how things progress.
I’m really enjoying Otterwiki. Everything is saved as markdown, attachments are next to the markdown files in a folder, and version control is integrated with a git repo. Everything lives in a directory and the application runs from a docker container.
It’s the perfect amount of simplicity and is really just a UI on top of fully portable standard tech.
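For reference, a minimal Compose file for that kind of setup might look like this. This is a hedged sketch: the image name, tag, volume path, and port mapping are assumptions based on typical self-hosted wiki deployments, so check the project’s own docs before using it.

```yaml
# Hypothetical minimal docker-compose.yml for an Otterwiki-style setup.
# Image name, tag, and mount point are assumptions; adjust per the project docs.
services:
  otterwiki:
    image: redimp/otterwiki:2
    ports:
      - "8080:80"
    volumes:
      # Markdown files, attachments, and the git repo all live in one directory,
      # which is what makes the whole wiki portable.
      - ./app-data:/app-data
```

Because everything lives in that one mounted directory, backing up or migrating the wiki is just copying (or `git clone`-ing) the folder.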
Then you missed where they dropped an opportunity to show a new screwdriver variant coming to LLTStore.com 🤦
Thank you for this, I don’t normally use twitter but I read some people saying the Threadreader app wasn’t up to date with all the comments.
This situation sucks, and it was something I would have been willing to see through. But after reading the thread from Madison this morning, I’ve decided to cancel my Floatplane subscription. While her accusations are still just allegations, they’re damning enough to take seriously. I await LMG’s response to her thread, as that will be the deciding factor in whether or not I continue to consume and support anything LMG does going forward.
Her thread: https://twitter.com/suuuoppp/status/1691693740254228741
I was using Vanced for around a year, and immediately switched to Revanced when it became available. No issues so far
You could always try Asahi Linux if you’re on a newer MacBook
but if you need me to leave, I can. I get that a lot.
I don’t think OP is suggesting this. It’s simply a reminder to those who have the privilege of having extra income that contributing to the core devs improves the experience for everyone, regardless of their individual ability to contribute.
I’m personally happy to donate if it means everyone gets to continue enjoying the growth of the platform, as the real value of the threadiverse is user activity.
You’re not paying to remove ads from Lemmy. You can continue using Lemmy ad-free on mobile via the mobile site or any of the other PWAs or native apps. What you’re paying to remove ads from is Sync. The developer has decided that they need to be compensated to sustain the amount of effort developing and maintaining the app requires. If you don’t want to pay that price with cash or your eyeballs, then don’t use it.
Nobody is forcing you to use Sync, nobody is forcing you to see ads. The beauty of a platform like Lemmy is you have the choice to use whatever client you want. That doesn’t mean you’re entitled to any of them.
There’s an expression I think about a lot: “You can’t think when you’re hungry.”
Unfortunately principles and ideals are calorie-free
We may not like it, but this is what progress looks like?
This is a known bug in 0.18.3, a fix will be in the next release:
One would hope
The short answer is friction. The friction of overcoming the forces of violence the larger class has at its disposal and utilizes at the smallest hint of uprising is greater than the friction of accepting the status quo.
And yet here I am using Revanced, which is even better than Vanced was
Setting aside the obvious answer of “because capitalism”, there are a lot of obstacles to democratizing this technology. Training of these models is done on clusters of A100 GPUs, which are priced at $10,000 USD each. There’s also the fact that a lot of the progress is being made by highly specialized academics, often with the resources of large corporations like Microsoft.
Additionally, the curation of datasets is another massive obstacle. We’ve mostly reached the point of diminishing returns from just throwing all the data at the training of models; it’s quickly becoming apparent that the quality of data is far more important than the quantity (see TinyStories as an example). This means a lot of work and research needs to go into qualitative analysis when preparing a dataset. You need a large corpus of input, each item of which is above a quality threshold, but which as a whole also represents a wide enough variety of circumstances for you to reach emergence in the domain(s) you’re trying to train for.
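The quality-threshold idea above can be sketched in a few lines. This is purely illustrative: the scoring heuristic here is a stand-in I made up, while real curation pipelines use trained quality classifiers, perplexity filters, or human ratings.

```python
# Hypothetical sketch of quality-threshold filtering for a training corpus.
# quality_score() is a toy heuristic, not a real curation metric.

def quality_score(doc: str) -> float:
    """Toy heuristic: reward documents with longer complete sentences."""
    sentences = [s for s in doc.split(".") if s.strip()]
    if not sentences:
        return 0.0
    avg_words = sum(len(s.split()) for s in sentences) / len(sentences)
    return min(avg_words / 20.0, 1.0)  # normalize to [0, 1]

def filter_corpus(docs: list[str], threshold: float = 0.5) -> list[str]:
    """Keep only documents scoring at or above the quality threshold."""
    return [d for d in docs if quality_score(d) >= threshold]

corpus = [
    "Short. Bad.",
    "A longer, coherent passage that reads like real prose and carries enough "
    "context to be useful during training. It has complete sentences.",
]
kept = filter_corpus(corpus)
```

The hard part in practice is exactly what the comment describes: picking a scoring function and threshold that keep quality high without collapsing the variety of the corpus.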
There is a large and growing body of open source model development, but even that only exists because of Meta “leaking” the original Llama models, and now more recently releasing Llama 2 with a commercial license. Practically overnight an entire ecosystem was born creating higher quality fine-tunes and specialized datasets, but all of that was only possible because Meta invested the resources and made it available to the public.
Actually in hindsight it looks like the answer is still “because capitalism” despite everything I’ve just said.
And you haven’t already quit because you’re on an H-1B visa or a pending green card, and so your residence in the US is tied to your employment, effectively making you a corporate-owned slave.
Give NixOS a shot. It’s got a learning curve that may be difficult if you’ve never read code, but it’s my preferred immutable setup.
It even has more packages than Arch.
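To give a feel for that learning curve, the whole system is described declaratively in one file. The fragment below is a hedged sketch of a minimal `configuration.nix`; the option names follow the NixOS manual, but the package choices and state version are placeholders you’d adapt to your own machine.

```nix
# Hypothetical minimal /etc/nixos/configuration.nix fragment.
# Packages and stateVersion are placeholders; adjust for your system.
{ config, pkgs, ... }:
{
  # System-wide packages, declared rather than installed imperatively.
  environment.systemPackages = with pkgs; [
    git
    firefox
  ];

  # Services are toggled with declarative options.
  services.openssh.enable = true;

  system.stateVersion = "23.05";
}
```

Rebuilding the system from this file is what makes the setup reproducible and effectively immutable: the file is the source of truth, not the current disk state.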
Here’s the video that got me onto it:
https://youtu.be/CwfKlX3rA6E