

And I wouldn’t consider 8% of all prospective sales to be a joke
Don’t put words in my mouth. I never said it’s a joke.
It's kinda just irrelevant if you target the mass market. But not a joke.


all-time high
It’s 3.2%
I mean, sure, a big headline is cool. But it’s 3.2%.
How did that weirdo in the jacket put it once?
The more you buy, the more you save
Do you wanna talk about it? Do you perhaps need a pawjob?
Lemmy isn’t without problems either. Biased, weird mods; a small user base = not a lot of content; and the platform itself is lacking some crucial functionality, like choosing a default comment layout (for example: always showing top comments). They sure are fixable problems, but will they be fixed? Who knows.


Software needs hardware
Can I introduce you to the concept of installing Linux on a dead badger?


You forgot to add text like “me and who?”
Upd. Damn, someone already made this joke in the comments


60+ images
Joke’s on you, I wrote my own shader that I use in a simple script that shows it in the background as a wallpaper. The future is now, old man!
That makes me feel uncomfortable, it shall be undone


Wait, you mean using a Large Language Model, which was created to parse walls of text, to parse walls of text is a legit use?
Those kids at openai would’ve been very upset if they could read.


In russia there’s a specific law for that, allowing soldiers to ignore other military laws and restraints provided by contract, and to avoid “committing a crime due to superior command”. That clause also basically means that the entire russian military that attacked Ukraine are criminals. Funnily enough, it’s not the only russian law being ignored in that war, but that’s beyond the topic above.
I’m not well versed in the American legal system, but isn’t there something similar for situations like that?


I once wrote a small post on reddit about running FSR4 on RDNA3 (via a driver emulation hack that devs on linux added, before the INT8 version). That poor post was reused by multiple sites with bizarre titles like “guy on reddit hacked FSR4!” and other similar crap. I’m not sure if it’s even humans writing/doing that; probably some server with an LLM continuously scrapes google for new posts, rewrites them, and posts them on its own sites for engagement.
The future that awaits us sure looks fun
represents a minority
says another minority shouldn’t be tolerated
Many such cases


Wait, you’re telling me the volatile stock market is volatile?
Who could possibly predict such a thing?
In fact, I’m gonna steal the part about mommy and add it to my profile too. A focking masterpiece.
$20k
Real question is: do I bring my own lube or not?


That propaganda won’t work, comrade Ivan; we all know there are no gay farmers in Vladivostok. Our wise government made sure of it!
But yeah, jokes aside, it’s currently a crime to support LGBT people in russia. Being gay = being a rebel against the regime, ’cause the real rebels are either already dead, in prison, or fled. Next in line, if I had to guess, would be jews, I think.
is there a general term for the setting that offloads the model into RAM? I’d love to be able to load larger models.
Ollama does that by default, but prioritizes GPU above regular RAM and CPU. In fact, it’s another feature that often doesn’t work, ’cause they can’t fix the damn bug we reported a year ago - mmap. That feature lets you load and use a model directly from disk (although incredibly slowly - but it allows running something like deepseek, which weighs ~700GB, at at least 1-3 tokens/s).
num_gpu lets you specify how much of the model to load into GPU VRAM; the rest will be swapped to regular RAM.
You’d need ollama (local) and custom models from huggingface.
Half of the charm of using ollama is the ability to install models with one command, instead of searching for the correct file format and settings on huggingface.
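Not my exact setup, but a minimal sketch of pinning num_gpu via a Modelfile (the base model name here is just a placeholder):

```
# Modelfile - partial GPU offload (base model name is a placeholder)
FROM llama3
# num_gpu caps how much of the model sits in GPU VRAM;
# the remaining layers stay in regular RAM
PARAMETER num_gpu 20
```

Then something like `ollama create mymodel -f Modelfile` and `ollama run mymodel`; the same option can also be passed per-request via `"options": {"num_gpu": 20}` in ollama’s HTTP API.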
Of course not, your opinion is very important to us!
Please, feel free to voice your concerns in feedback form below: [insert feedback link here]