Hacker News | throwaway77385's comments

> spinning disks have been replaced by NVMe solid state drives with near-RAM I/O bandwidth

Am I missing something here? Even Optane is an order of magnitude slower than RAM.

Yes, under ideal conditions, SSDs can have very fast linear reads, but IOPS / latency have barely improved in recent years. And that's what really makes a difference.

Of course, compared to spinning disks, they are much faster, but the comparison to RAM seems wrong.

In fact, for applications like AI, even using system RAM is often considered too slow, simply because of the distance to the GPU, so VRAM needs to be used. That's how latency-sensitive some applications have become.
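For a sense of scale, here's a rough latency ladder (ballpark, order-of-magnitude figures for illustration only; real numbers vary widely by part and workload):

```python
# Approximate access latencies in nanoseconds (illustrative ballpark values,
# not benchmarks).
latency_ns = {
    "DRAM (DDR5)":        100,        # ~100 ns
    "Optane SSD":         10_000,     # ~10 us
    "NVMe NAND SSD":      80_000,     # ~80 us
    "Spinning disk seek": 8_000_000,  # ~8 ms
}

base = latency_ns["DRAM (DDR5)"]
for name, ns in latency_ns.items():
    print(f"{name:<18} {ns:>10,} ns  (~{ns / base:,.0f}x DRAM)")
```

Even with generous assumptions for the SSD, the latency gap to DRAM is a factor of hundreds, which is why fast linear reads alone don't make an SSD "near-RAM".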


>for applications like AI, even using system RAM is often considered too slow, simply because of the distance to the GPU

That's not why. It's because system RAM has a narrower bus than VRAM. If it were a matter of distance, it'd just have greater latency, but that would still leave you tons of bandwidth to play with.


You could be charitable and say the bus is narrow because it has to travel a long distance and this makes it hard to have a lot of traces.

It's not. It's narrow even between the CPU and RAM. That's just the way x86 is designed. Nvidia and AMD by contrast have the luxury of being able to rearchitect their single-board computers each generation as long as they honor the PCIe interface.

It is also true that having a 384-bit memory bus shared with the video card would necessitate a redesigned PCIe slot as well as an outrageous number of traces on the motherboard, though.


Traditionally, the width of the GPU memory interfaces was many times greater than that of CPUs.

However, the maximum width in consumer GPUs, up to 1024 bits, was reached many years ago.

Since then, the width of the memory interfaces in consumer GPUs has decreased continuously, and this decrease has been only partially compensated for by higher memory clock frequencies. The reduction has been driven by NVIDIA, to increase their profit margins by reducing memory cost.

Nowadays, most GPU owners must be content with a memory interface no better than 192-bit, like in RTX 5070, which is only 50% wider than for a desktop CPU and much narrower than for a workstation or server CPU.

The reason using main memory from the GPU is slow has nothing to do with the width of the CPU memory interface. It is caused by the fact that the GPU accesses main memory through PCIe, so it is limited to the throughput of at most 16 PCIe lanes, which is much lower than that of either the GPU memory interface or the CPU memory interface.
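A back-of-envelope sketch of that bottleneck (approximate per-lane peak figures after link encoding overhead, not measurements):

```python
# Approximate peak throughput per PCIe lane, in GB/s, after encoding overhead
# (128b/130b for Gen 3+). Illustrative values only.
per_lane = {"PCIe 3.0": 0.985, "PCIe 4.0": 1.969, "PCIe 5.0": 3.938}

# A GPU gets at most 16 lanes, so its main-memory traffic is capped here,
# far below the hundreds of GB/s of either the CPU's or the GPU's own
# memory interface.
for gen, gbps in per_lane.items():
    print(f"{gen} x16: ~{16 * gbps:.0f} GB/s")
```

Even PCIe 5.0 x16 at roughly 63 GB/s is well under a dual-channel DDR5 interface, let alone a GDDR7 one.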


ThreadRipper has 8 memory channels versus 2 for a desktop AMD CPU. It's not an x86 limitation.

"x86" as in the computer architecture, not the ISA. Why do you think they put extra channels instead of just having a single 512-bit bus?

The memory interface of CPUs is made wider by adding more channels because there are no memory modules with a 512-bit interface. Thus you must add multiples of the module width to the CPU memory interface.

This has nothing to do with x86, but it is determined by the JEDEC standards for DRAM packages and DRAM modules. The ARM server CPUs use the same number of memory channels, because they must use the same memory modules.

A standard DDR5 memory module has a memory interface width of 64, 72, or 80 bits, depending on how many extra bits are available for ECC. The module's interface is partitioned into 2 channels to allow concurrent accesses at different memory addresses. Even though current memory channels are therefore 32/36/40 bits wide, few people are aware of this, so by "memory channel" most people mean 64 bits (or 72 bits with ECC), because that was the channel width in older memory generations.

Not counting ECC bits, most desktop and laptop CPUs have a 128-bit memory interface, some cheaper server and workstation CPUs have a 256-bit memory interface, many server CPUs and some workstation CPUs have a 512-bit memory interface, while the state-of-the-art server CPUs have a 768-bit memory interface.

For comparison, RTX 5070 has a 192-bit memory interface, RTX 5080 has a 256-bit memory interface and RTX 5090 has a 512-bit memory interface. However, the GDDR7 memory has a transfer rate that is 4 to 5 times higher than DDR5, which makes the GPU interfaces faster, despite their similar or even lower widths.
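The arithmetic is just bus width (in bytes) times transfer rate. A sketch, assuming GDDR7 at ~28 GT/s and desktop DDR5 at ~6.4 GT/s (illustrative rates; actual SKUs vary):

```python
def peak_gbps(width_bits, rate_gts):
    # Peak bandwidth = (bus width in bytes) x (transfer rate in GT/s).
    return width_bits // 8 * rate_gts

desktop_cpu = peak_gbps(128, 6.4)  # 128-bit DDR5-6400  -> 102.4 GB/s
rtx_5070    = peak_gbps(192, 28)   # 192-bit GDDR7      -> 672 GB/s
rtx_5090    = peak_gbps(512, 28)   # 512-bit GDDR7      -> 1792 GB/s

print(f"5070 vs desktop CPU: {rtx_5070 / desktop_cpu:.1f}x")  # ~6.6x
```

So even the narrowest current GPU interface comes out several times faster than a desktop CPU's, entirely on the strength of the higher transfer rate.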


I can't edit my comment, but to the people responding here, thank you for adding all this information. It really helped elucidate why VRAM vs RAM is a distinction and also prevents my somewhat naive interpretation from being the only thing people see. Thanks!

How does this work for the myriad banks I've had to prove my identity to in the same way? I'll be attempting steps 1-4 and see what Persona comes back with.

To report back on this, I contacted the various email addresses given in OP's article.

For people with GDPR rights, this link helps make a DSAR (though interestingly the US and many other countries are also available from the country dropdown, maybe they follow these rules everywhere): https://withpersona.com/dsar

This of course brings up the problem of having to verify the ID of the person who is requesting that their ID be deleted from their DB. So I am probably going to stop here.

I also received a separate email stating that my data was already scheduled for deletion if I used Persona through LinkedIn.


I'm reasonably sure funding for the arts is globally as low as can be.

If we applied the rule of "it has to be good to be worth it", with money as the main indicator of "good", then what about the myriad products and services that are low quality or outright terrible, yet make tons of money because they can afford to shove marketing down everyone's throats and thus stay relevant?

Most popular music is downright awful to me. Do I want to take their money away because I don't think they deserve it? No. On the other side of that coin I'd like to see some kind of counter balance. How many artists were considered awful until they suddenly became the biggest deal ever? Often posthumously.

This tiny sliver of funding for some people you may not like won't take anything away from you.

The early internet was so great because it was full of weird things. We've lost ALL of it, due to commercialisation. We stand to lose even more if we don't do something to fund the people who dare to be weird.

This goes right back to the thing Bezos said about how we need to become interplanetary so we can inhabit the galaxy, because if we inhabit the galaxy we could have a thousand Mozarts. I think we could already have a thousand Mozarts if they weren't busy slaving away in Amazon's warehouses.

Once there's a trillion humans in the galaxy and they're still all slaving away in warehouses, we still won't have any Mozarts.

Not everything can or should be quantified by money and economics.


Yuck.

https://news.ycombinator.com/newsguidelines.html

> Be kind. Don't be snarky. Converse curiously; don't cross-examine. Edit out swipes.

> Comments should get more thoughtful and substantive, not less, as a topic gets more divisive.

> When disagreeing, please reply to the argument instead of calling names. "That is idiotic; 1 + 1 is 2, not 3" can be shortened to "1 + 1 is 2, not 3."

Honestly, if you just made your profile a day ago to yell overconfident and meaningless statements into the void, like a mandrill in the jungle trying to shout over all the others, go back to LinkedIn; they like that kind of stuff there.

I even agree that AI has a place in our world and can greatly increase productivity. But we should talk about the how and why, instead of attacking others ad hominem and just stopping any discourse with absolutist nonsense.


Things that can only be used by an exclusive elite don't tend to survive, unless we're talking super-yachts.

AI is only going to work if enough people can actually meaningfully use it.

Therefore, the monetisation model will have to adapt in ways that make it sustainable. OpenAI is experimenting with ads. Other companies will just subsidise the living daylights out of their solutions...and a few people will indeed run this stuff locally.

Look at how slow the adoption of VR has been, and how badly Meta's gamble on the metaverse went. It's still too expensive for most people. Yes, a small elite can afford the necessary equipment, but that's not a petri dish in which one can grow a paradigm shift.

If only a few thousand people could afford [insert any invention here], that invention wouldn't be common-place nowadays.

Now, the pyramid has sort of been turned on its head, in the sense that things nowadays don't start expensive and then become cheaper, but instead start cheap and then become...something else, be that more expensive or riddled with ads. But there are limits to this.

> People who are cut out to be software developers

You mean the people AI is going to replace? What's the definition of 'cut out to be' here?


> Now, the pyramid has sort of been turned on its head,

It has, and the financial system enables that, the self-restraint that was promised at the time of gutting Glass-Steagall never materialized.

> But there are limits to this.

Yes and no.

Yes, because there are limits, just as there's a limit to the load placed on a ship: if you overload it with cargo, something or somebody must be thrown overboard in order to preserve the ship.

No, because those who are responsible for loading and overloading the ship are the ones commanding and steering it. When they overload the ship, they get to throw you overboard and keep your stuff too... there's nothing to compel them to stay within limits and everything to tempt them to do the opposite.

We've already been thrown overboard with regard to hardware purchases, and that will spread to other areas with the BS AI excuse.

So many comments, other than yours, around here engage in deep thinking about the profits or losses of Captain Ahab... they miss the point entirely.


Salient point, regarding the 'no' bit. I agree completely. But since I was responding to a likely troll, there wasn't much point elaborating further. Thanks for the added information :)


Valid point, but would be stronger if 'their' had been spelled correctly.


Fixed. This is the type of comment that makes HN good. Fact-checking and spell-checking, and critical thinking.


Ads, which are the sole reason for the attention-grabbing-at-all-costs society we find ourselves in, are, in my opinion, one of the greatest cancers to ever befall us.


Ads are information. They're made up of fact and opinion. The facts are valuable. I would like to know if there's a new pizza place that opened in my town. We all, by necessity, have to buy lots of things in life, and we should know what the options are. We're also adults who can separate the fact that a pizza place exists from their biased claim that it's the best pizza.

We don't need to go overboard with calling advertising cancer. As is usually the case, we can ignore the most extremist takes. Ads are annoying more often than useful, but you can say that about lots of things in life.


Ads are to information what propaganda is to objective reporting. Informative ads used to exist: the content of the venerable Computer Shopper magazine was mostly ads and quite informative. What changed? Well, those Computer Shopper ads mostly consisted of lists of bits and parts and widgets followed by their sales price, some contact information, and that's it.

Not so for the blithering idiocracy which is the 'modern' advertising industry, where it is all about lifestyle and image and signalling and sex and anything else except just saying 'buy our widget for €XX.yy a piece, 10% off when buying 3 or more'. Nope, instead of an informative list of widgets and gizmos we get a diverse couple - black man, white woman - smiling happy smiles because of ${reasons} which have nothing to do with whatever they're trying to peddle. Add some bullshit about sustainability and building better worlds together and such, drape it in a rainbow flag and done, here's your ad for those ramen noodles.

Oh, you're selling cars instead of noodles? No problem, we'll ask the diverse couple to eat their noodles in a parking lot. What, no noodles? Fine, let them starve in the parking lot, smiling happy smiles because of $reasons. We'll throw in an angry fool of a white man who can be told off by the kind and wise black man; that'll sell those noodles - ehhh, sorry, cars. Yes, cars, or was it bathroom slippers? Doesn't matter. Here's your ad, now pay us.


This is why ads should be something you actively look for, not something that is shoved into your eyeballs on every medium conceivable.


If they'd protected their knowledge from AI crawlers before it was too late, they might stand a chance, but in this climate, they're just adding nails to their coffin.


Broadly, I agree with your sentiment. As soon as some people rule over others, given enough time, things creep towards total enslavement and disenfranchisement of the others. This has been proven over and over.

The question then becomes, how do we organise society instead?


The question is why we have to work 8h a day to begin with. Or why we don't earn more.

If productivity goes up, something has to give. We either work less or we earn more.

If productivity goes up and we work the same amount of time for the same amount of money (and let's not kid ourselves, if anything we'll end up working more time for less money), the social contract has been broken.

I don't care how rich some outlier becomes, so long as it isn't at the sacrifice of our own self-actualisation. But that is exactly what is (and has been since the 70s) occurring. That trend is unlikely to reverse and it won't lead anywhere good.

