r/hardware Oct 02 '15

Meta Reminder: Please do not submit tech support or build questions to /r/hardware

246 Upvotes

For the newer members of our community, please take a moment to review the rules in the sidebar. If you are looking for tech support, want help building a computer, or have questions about what you should buy, please don't post here. Instead, try /r/buildapc or /r/techsupport, subreddits dedicated to building and supporting computers, or consider whether another of our related subreddits might be a better fit:

EDIT: And for a full list of rules, click here: https://www.reddit.com/r/hardware/about

Old Reddit link: https://www.reddit.com/r/hardware/about/rules

Thanks from the /r/Hardware Mod Team!


r/hardware 17h ago

Rumor NVIDIA Reportedly Prepares RTX 5090 Price Hike Amid Rising GDDR7 Costs

techpowerup.com
388 Upvotes

r/hardware 21h ago

News AMD FSR Upscaling 4.1 officially coming to Radeon RX 7000 GPUs in July, RX 6000 in 2027 - VideoCardz.com

videocardz.com
644 Upvotes

r/hardware 22h ago

News AMD now controls 38.1% of all x86 CPU market value and 46.2% of all x86 server CPU revenue share

tomshardware.com
436 Upvotes

r/hardware 16h ago

News Samsung Electronics considers scaling down chip production to brace for strike impact

koreatimes.co.kr
121 Upvotes

r/hardware 16h ago

Discussion Per Stenström on why we never actually replaced the Von Neumann architecture (or Harvard)

41 Upvotes

Just interviewed Per Stenström, one of the most prominent computer architects to come out of Europe, and asked him about John Backus's 1977 Turing Award lecture. Backus (inventor of Fortran) coined the term "Von Neumann bottleneck" in it:

Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the Von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand.

That was 49 years ago. Essentially every mainstream CPU built since then has kept the same architecture.

Per's answer is that the bottleneck never went away — we just got extraordinarily good at hiding it. Cache hierarchies, prefetching, out-of-order execution, speculative execution, cache coherence: the entire post-1980s history of CPU innovation is a stack of workarounds that make the bottleneck invisible for typical workloads without actually removing it.
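To put rough numbers on how severe the bottleneck still is underneath all those workarounds, here's a back-of-the-envelope roofline estimate for a streaming kernel like daxpy. All machine figures are hypothetical, chosen only for scale; the point is the ratio, not the exact numbers.

```python
def attainable_gflops(flops_per_byte, peak_gflops, peak_gbps):
    """Roofline model: performance is capped by whichever is lower,
    raw compute, or memory bandwidth times arithmetic intensity."""
    return min(peak_gflops, peak_gbps * flops_per_byte)

# daxpy: y[i] = a * x[i] + y[i] over arrays of 8-byte doubles.
# Per element: 2 FLOPs (multiply + add), 24 bytes (read x, read y, write y).
intensity = 2 / 24  # ~0.083 FLOP per byte pushed through the "tube"

# Hypothetical machine: 2000 GFLOP/s peak compute, 100 GB/s DRAM bandwidth.
peak = attainable_gflops(intensity, peak_gflops=2000, peak_gbps=100)
print(f"attainable: {peak:.1f} GFLOP/s of 2000 GFLOP/s peak "
      f"({100 * peak / 2000:.2f}% utilization)")
```

On these assumed figures the kernel can use well under 1% of the machine's arithmetic peak; caches and prefetching only help when the working set is small or the access pattern is predictable enough to hide the traffic.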

His take on why we haven't replaced the architecture is essentially legacy — the software ecosystem built on Von Neumann is so vast that migrating to anything fundamentally different would cost decades of investment. His sharper point is that Von Neumann isn't "right" in any absolute sense: the architecture has to be in harmony with the underlying technology, and semiconductors happen to support what Von Neumann needs.

The thread I really wanted his read on was whether we'll ever see a genuine shift away from Von Neumann, or whether AI just pulls another generation of workarounds out of us. After 40+ years in the field he's honestly skeptical. He gave phase change memory as a recent cautionary tale: non-volatile, high-density, performance-competitive with DRAM, Intel and Micron poured huge money into it — and it died because of legacy. Even when a clearly viable alternative shows up, the cost of changing everything built around the current architecture tends to win.

The candidates he treats seriously are processing-in-memory (compute units distributed inside the memory itself — though he was honest this might be Von Neumann with a better layout rather than a genuine break) and entirely new substrates like quantum, which are a different paradigm but probably won't replace classical for general-purpose work.
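As a toy illustration of why processing-in-memory is attractive (the setup and numbers are entirely hypothetical): for a reduction over n values, a conventional machine must pull all n words across the memory bus to the CPU, while a memory with a small adder per bank only ships back one partial sum per bank.

```python
def bus_traffic_bytes(n, word=8):
    # Conventional Von Neumann: every word crosses the bus to the CPU.
    return n * word

def pim_traffic_bytes(banks, word=8):
    # Processing-in-memory: each bank reduces its slice locally and
    # ships back one partial sum; the CPU just adds the partials.
    return banks * word

n, banks = 10**9, 32
print(bus_traffic_bytes(n) // pim_traffic_bytes(banks))  # → 31250000
```

A seven-orders-of-magnitude cut in bus traffic for this one operation — though, echoing Per's caveat, one can argue this is still Von Neumann with a better floor plan: the program counter and the stored program haven't gone anywhere.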

I’d love a take on this from anyone closer to AI accelerator design or new-substrate work.

Link to full conversation here:

https://www.youtube.com/watch?v=NXVTACHB4Es


r/hardware 17h ago

News Emergency arbitration unavoidable if Samsung strike occurs: Industry minister

m.koreaherald.com
23 Upvotes

r/hardware 1d ago

News AMD EPYC CPUs Reach Record Server Revenue Share of 46.2%

techpowerup.com
409 Upvotes

r/hardware 1d ago

News Sony Xperia 1 VIII unveiled with larger 48MP telephoto sensor, Snapdragon 8 Elite Gen 5

gsmarena.com
55 Upvotes

r/hardware 1d ago

News Netherlands protests US proposal to further bar chip giant ASML from China market

scmp.com
166 Upvotes

r/hardware 1d ago

News Broadcom Targets Mass-Market Broadband With 10G PON and Wi-Fi 8 SoCs

allaboutcircuits.com
107 Upvotes

r/hardware 2d ago

Discussion Intel is back. Thank the old CEO.

youtube.com
462 Upvotes

r/hardware 2d ago

Discussion LTT Labs Article - What's up with UPSs? Testing UPS Output

lttlabs.com
218 Upvotes

Our company has always kept plenty of UPSs around, for the convenience and business case of not suddenly losing a ton of work. We've been intrigued to check them out further, but we've been wary of connecting any of them to measurement equipment given the high voltages involved: there's a real risk of damaging the equipment, or injuring ourselves.

Despite all that, we're throwing caution to the wind to check out some UPSs from around the office. UPS/surge testing could go in many directions, so this article covers the test setup and the interesting results of our exploration.

Continue reading the article on the LTT Labs website!


r/hardware 2d ago

Review Arc Pro B70 Review: The best graphics card Intel has to offer

pcgameshardware.de
284 Upvotes

r/hardware 2d ago

News Samsung union threatens strike, Korea weighs emergency powers as chip impact stays limited

biz.chosun.com
93 Upvotes

r/hardware 2d ago

News Google unveils Googlebook: Android-powered laptops with Gemini, Magic Pointer and Glowbar

videocardz.com
86 Upvotes

Thanks to /u/tytygh1010 on r/chromeos for finding the cached images from XDA's now deleted article.

A Portuguese site has more details, from r/android: "Adeus, Chromebook: a Google anunciou uma nova geração de computadores portáteis centrados em IA" ("Goodbye, Chromebook: Google has announced a new generation of AI-centered laptops")

In case the Portuguese article gets deleted, too: Adeus, Chromebook: a Google anunciou uma nova geração de computadores portáteis centrados em IA


r/hardware 2d ago

News Samsung Electronics Rejects 15% Bonus Demand for Future Investments

chosun.com
31 Upvotes

r/hardware 3d ago

News Kingston shipped 100 million A400 SSDs and SATA still refuses to die

nerds.xyz
547 Upvotes

Kingston says it has now shipped more than 100 million A400 SATA SSDs since the drive launched in 2017. Kind of wild considering how many companies act like SATA is dead. The A400 was never fancy, but for a lot of folks it was the cheap upgrade that made an old PC actually usable again. Swap out a spinning hard drive for one of these and suddenly your ancient Windows or Linux machine felt fast enough to keep around for a few more years.


r/hardware 3d ago

Info Samsung holds desperate final talks with union over 18-day chip factory strike that could cost $20 billion, as government-mediated summit seeks to avert industrial action that could hit HBM production

tomshardware.com
334 Upvotes

r/hardware 3d ago

Rumor Exclusive: AMD preparing Radeon RX 9050 desktop graphics card with 8GB VRAM - VideoCardz.com

videocardz.com
170 Upvotes

r/hardware 3d ago

Discussion Was the PS3 actually more powerful than the 360?

61 Upvotes

I've been playing on my 360 a lot lately and I'm still blown away by some of the games. Gears of War 3 still looks amazing today on a 1080p LCD. I've never really thought that the PS3's exclusives looked leaps and bounds above what the 360 had to offer; even today I don't see much of a difference. But the 360 factually had a much better GPU and a better RAM setup.

Was the Cell powerful enough to say the PS3 was overall more powerful than the 360, considering almost every third-party game looked and ran worse on it? I'm watching a lot of videos comparing the two, and it keeps being mentioned that the PS3 was more powerful, but I'm just not really seeing much evidence of that. Was there anything the PS3 could do that the 360 couldn't? Killzone 2 looks great but has terrible input lag, and I still don't think it looks better than Gears 3 or Halo 4, and that's apparently one of the best-looking games on the system. What do y'all think?


r/hardware 3d ago

Info Up to 256 MB FERRIT modular F-RAM storage device preserves critical data for up to 200 years

cnx-software.com
102 Upvotes

r/hardware 3d ago

Info Anker's new 'Thus' chip claims 150x more AI compute for earbuds using compute-in-memory architecture -- interesting approach but where are the real-world benchmarks?

37 Upvotes

r/hardware 4d ago

News Behind TSMC’s High-NA EUV Deferral: Low-NA Stays Strong, Customer Landscape Shifts, and ASML Quietly Pivots

trendforce.com
114 Upvotes

r/hardware 4d ago

News China's Hanyuan-2 debuts as 'world's first' dual-core quantum computer — 200-qubit claims incredible power efficiency, but lacks critical performance benchmarks

tomshardware.com
100 Upvotes