  • This can be correct, if they’re talking about training smaller models.

    Imagine this case. You are an automotive manufacturer that uses ML to detect pedestrians, vehicles, etc. with cameras, like what Tesla does, for example. This needs to be done with a small model with a relatively low-power footprint that can run in a car, not a datacentre. To improve its performance you need to fine-tune it with labelled data of traffic situations with pedestrians, vehicles, etc. That labelling would be done manually…

    … except when we get to a point where the latest Gemini/LLAMA/GPT/whatever, which is so beefy that it could never run in that low-power application… is also beefy enough to accurately classify and label the things that the smaller model needs to be trained on.

    It’s like an older sibling teaching a small kid how to do sums: not an actual maths teacher, but they do the job, and a lot cheaper or semi-free.
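
    A minimal sketch of that older-sibling idea, assuming hypothetical stand-ins (big_model, small_model, and a loader of unlabelled traffic frames): the big model pseudo-labels each frame, and the small one is fine-tuned on those labels.

        import torch
        import torch.nn.functional as F

        def pseudo_label(big_model, frames):
            # the beefy datacentre model produces predictions we treat as labels
            with torch.no_grad():
                return big_model(frames).argmax(dim=1)

        def finetune_student(small_model, big_model, loader, epochs=3, lr=1e-4):
            opt = torch.optim.AdamW(small_model.parameters(), lr=lr)
            for _ in range(epochs):
                for frames in loader:  # unlabelled traffic images
                    labels = pseudo_label(big_model, frames)
                    loss = F.cross_entropy(small_model(frames), labels)
                    opt.zero_grad()
                    loss.backward()
                    opt.step()
            return small_model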




  • I bought a Ricoh GR III (not a GR Digital III, an actual GR III) off Facebook Marketplace for £350 a few years ago (at that point, the GR IIIx hadn’t been released). It was cheaper than most because it was missing the front decorative ring, though I added that back for £10-15.

    In the year or two that I had it, I realised it wasn’t right for me. It was a lovely camera, but I hated the focal length, the autofocus always seemed to pick up some random object I didn’t care about, and manual focus with the buttons was somewhere between impractical and downright infeasible. I put it on eBay with an auction start price of £150 or so… and it fetched £450.

    Another year or two later I was thinking of getting a GR IIIx, as I’d probably enjoy the focal length more. I looked at eBay and damn, both the IIIx and the normal III were selling for £800+, with only some poor-condition outliers below £700.

    Apparently Ricoh’s manufacturing hasn’t kept up with demand and used prices are absolutely bonkers. I don’t think this has happened to Sony (although I wish it had, as I need to sell my A7ii soon). I think it’s a bit of a case-by-case thing.




  • Also because, as a person who has studied multiple languages, I can say German is hard and English is Easy with a capital E.

    No genders for nouns (German has three), no declensions, no conjugation other than “add an s for the third person singular”, somewhat permissive grammar…

    It has its quirks, and pronunciation is the biggest one, but they’re nowhere near German (or Russian!) declensions, Japanese kanji, etc.

    Out of the wannabe-Esperanto languages, English is in my opinion the easiest one, so I’m thankful it’s become the technical lingua franca.




  • I’m talking about running them on a GPU, which favours the GPU even when the comparison is between an AMD Epyc and a mediocre GPU.

    If you want to run a large version of DeepSeek R1 locally, with many quantized models being over 50 GB, I think the cheapest Nvidia GPU that fits the bill is an A100, which you might find used for 6K.

    For well under that price you can get a whole Mac Studio with the 192 GB of unified memory that the first poster in this thread mentioned.
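
    As a rough back-of-the-envelope sketch of why that unified memory matters (my own illustrative numbers, not a benchmark):

        # Estimate whether a quantized model fits in memory.
        # Parameter count, bit width and the ~20% runtime overhead are assumptions.
        def quantized_size_gb(params_billion, bits_per_weight, overhead=1.2):
            # overhead loosely covers KV cache, activations and runtime buffers
            return params_billion * bits_per_weight / 8 * overhead

        size = quantized_size_gb(70, 6)  # a hypothetical 70B model at 6-bit: ~63 GB
        print(f"{size:.0f} GB needed")
        print("fits a 24 GB consumer GPU:", size <= 24)   # False: needs A100/H100-class VRAM
        print("fits a 192 GB Mac Studio:", size <= 192)   # True: unified memory covers it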

    I’m not saying this is for everyone, it’s certainly not for me, but I don’t think we can dismiss that there is a real niche where Apple has a genuine value proposition.

    My old flatmate has a PhD in NLP and used to work in research, and he’d have gotten soooo much use out of >100 GB of RAM accessible to the GPU.