

In better times, this would at least get a class action.
“then” is used to depict time, sequence or a causal relationship; “than” is used with comparative adjectives, to express comparison.


I am considering using ???.
Will get to it someday.


I thought a ‘+’ sign was doubly symmetric.


Maybe their god wanted some peace and quiet and thought that if they were to annihilate each other, that might fulfil the requirement.
Yeah.
Although I usually tend to send a link directly to the post (which is relevant to what is being discussed), the things around it might change their impression. And considering that there is more political stuff than plain tech stuff almost everywhere on the internet rn, that kind of result is expected.
Honestly, it is not fun being flattered in a way that makes me try to give them an answer that they will ignore.
Imagine a C++ compiler with feelings, reading your code ignoring the return value of a [[nodiscard]] function.
Ah well, I am not good at that.
The best I have gotten people to say is how I “know so much” about stuff at work, and the best I can point them to is Wikipedia, Stack Overflow and the like, which of course they aren’t really interested in reading; their lines are probably just a way to flatter me.
I advertised Lemmy to my friends a few times and they have now stopped replying to my messages :P


When you’re talking about the PCIe peripheral, do you mean a separate dedicated graphics card or something else?
Yes, similar to what a PCIe Graphics Card does.
A PCIe slot is the slot on a desktop motherboard that lets you fit various cards: networking (Ethernet, Wi-Fi and even RTC-specialised stuff), sound cards, graphics cards, SATA/SAS adapters, USB adapters and all other kinds of things.
I guess the main point of NPUs is that they are tiny and built in
GPUs are also available built-in. Some of them are even tiny.
Go 11–12 years back in time and you’ll see video processing units embedded into the motherboard, instead of in the CPU package.
Eventually, some people will want more powerful NPUs with RAM better suited to neural workloads (GPUs have their own type of RAM too), won’t care about the NPU in the CPU package, and will feel like they are uselessly paying for it. Others will not need an NPU at all and will feel the same.
So, much better to have NPUs be made separately in different tiers, similar to what is done with GPUs rn.
And even external (PCIe) graphics cards can be thin and light instead of being fat packages. It’s usually just (i) the extra I/O ports and (ii) the cooling fins and fans that make them fat.


I guess an NPU is better off being a PCIe peripheral then?
And it can then have its specialised RAM too.


Aren’t there already regulations for casinos and the like?
Might as well apply the same to these. Then all lootbox games will become adult only.


Makes sense, considering DDR4 has only gone up 2x.
Though now I need to buy the motherboard sooner rather than later, lest there be no good stock by the time I get to it.


It’s not a privacy problem.
It is a stalking problem.
We’re using the wrong words.
If we end up getting privacy in public, the police will then use it to stop people from filming them in public. That is the long-term goal of setting this in motion.


Might as well use it to track ICE


I’m pretty sure that the “not entitled to privacy” part was not about being organisationally stalked, but that if someone were to randomly take a picture outside and post it somewhere, you don’t get to make them take it down.
Also, if you are creating a scene in public, others get to film you just as they get to see you.
This is not a problem about privacy in public. This is a problem of:


Add a compile flag!


Well, the server ECC variant is still pretty useful for desktop workloads, as long as AMD keeps supporting it in the next generations. If it’s still a DIMM, it can be sold right away.
GDDR7: again, if the chip has the required pins as in GPUs, then GPU manufacturers can simply buy them, test them for a few hours maybe, and pop them into their lineups with a bit of recalculation of the traces (in case the exact pinout differs). Of course you get some re-soldering damage, but there’s not much you can do about that. On the other hand, if the GDDR7 is already in GPUs, most of what companies would need to do is alter the firmware a bit and sell refurbished units.
HBM2: it seems possible to get slottable modules with HBM2. Pretty sure some industrious people in China will find a good use for them. Perhaps with RISC-V processors?
And the AI specialised units shouldn’t be fully useless either. Remember the cancer studies case?
It is still useful computing ability that can be used well by those who know how.


Maybe they just realised in time that the RAM price hike was coming and set the two up against each other, so that either one becomes an excuse for the other.


Are they using the same ICs in the AI modules as they are in DIMMs?
If yes, then we can still hope for some level of a second-hand market, where prices may at least manage to stay below the peak at that point.
I see ‘******’ though.
Maybe it’s just a different interface.