The PC is the last major open platform. While other platforms like Android are becoming less open, the PC in general is becoming more open than it has been in a long time: heavy macOS/Android/iOS competition is creating a focus on open standards, and all-time-strong Linux support gives people a place to land and tinker/hack to their heart's content.
I think we will see an abandonment of consumer-grade PC components, with individuals either pushed towards closed hardware like the PlayStation, MacBooks, and Android devices, or pushed towards server-grade components. I already have a home server rack, and would recommend it to other people.
You might be interested in the IBM PC compatible and Wintel Wikipedia pages. This is a super-high-level timeline, but it gets even more interesting when you dig into the details.
At a high level, the IBM PC platform was very well documented and sold well, to the effect of producing tons of software and peripheral add-ons. This led some other computer companies to reverse engineer the proprietary IBM BIOS, allowing their machines ("PC compatibles") to run the same software and use the same peripherals. Because these were clean-room reimplementations, IBM didn't have a legal case to prevent their sale.
Fast forward a bit: IBM's attempt at a new, closed platform, the PS/2, flopped. People wanted their more open hardware. Windows became dominant enough that all the demand was for x86-based hardware that could run Windows, and Microsoft was happy to work with many vendors.
The PC is very open today, but Apple survived. Atari ST and Amiga probably survived longer than you think as well.
In the whole history of computing, the PC is the only platform where buying a computer means a crazy number of options and configuration mixes to choose from, with a reasonable expectation that it will all work, and that the warranty will cover it too! You can run any OS of your choice on it, and that's also a reasonable expectation.
On any other platform (Sun, Be, Amiga, NeXT, Apple), buying a computer always meant buying from one company, from its list of products. And even running a different version of the OS means the warranty doesn't cover it.
I came back to this comment 12+ hrs later hoping to find someone making a great argument for some platform in the '70s that I didn't know enough about, or maybe a modern open-hardware movement that is building niche support.
> I already have home sever rack, and would recommend it for other people.
I just want to warn people who haven't heard server-grade hardware in person before: this is only for people who can put a server rack somewhere unpopulated like a garage or basement. Servers will make you think "wow, leafblowers sure are quiet". They are not suitable for apartment dwellers such as myself. When I was setting up my 1U before shipping it off to a colo, I wrote scripts and had detailed plans of the things I needed to run so I could minimize the time it was making my ears bleed.
This. At one company we ran out of space in the server room, so the excess machine temporarily landed next to my desk. Dear god. Noise cancelling headphones couldn't cope with the noise.
Yeah, it certainly wasn't the quietest choice of form factor, but the fact remains: server-grade hardware is not optimized for noise. It's meant to run in datacenters, not living rooms, so noise was never a design concern. A nice thing about consumer-grade hardware is that it's optimized for both sound and power consumption, because those devices are designed to be around humans. So I certainly hope consumer-grade hardware survives.
In my first job we worked in a room full of 4Us and it was always refreshing when we powered them all down for the weekend. So quiet. It’s almost like there was a reason why consumer-grade hardware existed.
You're right, I may have significantly over-estimated the percentage of people on hn that have dealt with server hardware. It's expensive, big, loud, power hungry and temperature sensitive.
I had to provision a 1U server in grad school. Turning that thing on in the office was a joke. It was completely impossible to work while it was on if you were anywhere in that part of the office.
You can buy server boards that don’t require loud fans. If you’re buying used server gear from a datacenter then it will be like what you said.
I have a 4U NAS with a supermicro board and an i3 chip with 6 WD Red NAS drives and it’s very quiet. The chassis came without fans so I installed the brand I like.
Tell me you've never owned a Supermicro board without telling me. They support regular 80mm/90mm Noctuas and function just fine. There are specific Supermicro mounts for them.
Reminds me of when, as a kid, I got one of those CPU coolers powered by a Delta 7000 RPM fan. My mom promptly asked what it would cost to make that noise (which could be heard in the entire apartment) go away. Got a Zalman (back when they were great) and everything was good.
It was a learning experience, and I think everyone should experience that kind of industrial noise at least once to appreciate how quiet consumer hardware is.
Kind of a random aside, but I never realized how obnoxious LEDs were until I got a studio apartment and started sleeping in the same room as my homelab / workstation / networking hardware. Electrical tape saved me, but wow. You sure can produce a lot of light with a milliwatt of electricity :)
(And yes, my workstation has a clear case and LED RAM. Yes, I'm an idiot. Whenever Windows applies an update late at night, I wake up if it turns back on. I don't know what I was thinking when I built that thing, but never again.)
I like to put a little red wax over LEDs (at least, ones that I don’t touch). That way you can still see them, but they are dimmer, and the red tint makes the light less annoying at night.
Even worse are phone chargers, intended to be used next to your bed, that light up like a Christmas tree when running. Black electrical tape is great for the worst of it, but you still need a few things available to tell you the operational status, if only they'd dim them a bit.
I always thought it would be low-grade hilarious to record a fairly long video of the unboxing and assembly of a ridiculously elaborate in-case LED setup, only to reveal with a straight face and at the absolute last minute that the case in question is entirely opaque.
The noise problem is pretty easy to mitigate by choosing 2U servers instead of 1U. The latter are forced by the form factor to use smaller, higher speed fans.
A bigger issue for enterprise hardware is that it's optimized for performance per watt under load, not idle power consumption. Running a mostly-idle rack server 24/7 can result in a pretty sizable electric bill. This also depends heavily on the model. Some will idle at ~50 watts, others at ~300, but both of these are significantly higher than a Raspberry Pi or an old laptop which for personal use will generally do the job.
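To put rough numbers on the idle-power point above, here's a quick back-of-the-envelope calculation. The $0.15/kWh electricity rate is an illustrative assumption; the wattages are the ballpark figures from the comment:

```python
# Yearly electricity cost of an always-on machine at a given idle draw.
# The $0.15/kWh rate is an illustrative assumption; use your own tariff.
def yearly_cost(idle_watts, usd_per_kwh=0.15):
    kwh_per_year = idle_watts / 1000 * 24 * 365
    return kwh_per_year * usd_per_kwh

# Raspberry Pi-ish, frugal rack server, hungry rack server
for watts in (5, 50, 300):
    print(f"{watts:>3} W idle -> ${yearly_cost(watts):.0f}/year")
```

At these rates the gap between a 50 W and a 300 W idle box is a few hundred dollars a year, which is why idle draw matters more than load efficiency for a mostly-idle homelab.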
Business class desktops are also a good alternative here. Many models have pretty reasonable idle power consumption (check this for yourself, I've seen 6W but also 60W) and then you get a couple of drive bays and PCIe slots and expandable RAM which you don't get from a Raspberry Pi.
These days, pretty much the only thing that makes sense is a mini PC. AMD laptop chips generally trade blows with Apple stuff on power efficiency when you thrash them, and you get a surprisingly capable machine for not very much money.
It's really not worth it to run old hardware 24/7 unless it's making money. Buying a new machine of equivalent capability is (normally) pretty cheap, and it doesn't take very long for the power savings to pay for themselves.
They can be had with fairly respectable specs too. Certainly enough to play around with small local models.
"When you thrash them" is kind of the issue. There are ten year old business desktops with a <10W idle power consumption. If your use for it is to have something to rsync files to and host your personal website and the like, even old hardware is going to average 99% idle. There is no meaningful power savings from newer hardware unless you're consistently putting it under significant load.
Some of the newer hardware is actually worse because the idle power consumption of PCs since around 2010 is determined in significant part by the low-load efficiency of the power supply. Brand new machines with the wrong power supply can use several times as much power at idle as ten year old machines with the right power supply. Annoyingly, power supply efficiency at idle is rarely documented so the only thing to do is measure it.
I built PCs for a number of years and then shifted to some combination of RPis, MacBooks, and (maybe) Mac Minis. It was a (long) phase that involved quite a bit of money, and often frustration, but I'm almost certainly not going to do it again.
You can make those rackmounted servers as loud or as quiet as you like. For home, optimize quiet (and low power consumption).
Even though my server rack is in the garage, I try to keep it quiet. A couple of the machines are fanless Atom-based boxes; others have fans, but they are built to be quiet. If you need hardware that generates a lot of heat, go with 4U for large fans that spin slowly and thus make less noise.
The "wow, leafblowers sure are quiet" happens when you stuff a lot of heat generation into a 1U chassis that then requires lots of tiny fans running at full speed. Those you don't want at home! But it is easy to avoid. Data centers do this to maximize density, but that's unlikely to matter at home.
Supermicro sells Atom-based SKUs with enterprise features like a BMC+IPMI, 10Gb SFP+ ports, ECC memory, SFF-8087 ports, chassis intrusion detection, etc.
And do you need a full-on enterprise-grade server? Given the choice between a 1U server whose fans even at minimal utilisation can still be heard three doors away and something with a low-power/laptop-grade CPU that does the same job silently and with little power use, I'll take the latter.
I sit next to my 4U server with all enterprise components apart from fans - these are consumer grade.
I had to mod the chassis slightly (with just pliers, tape and random inserts) to fit these fans in there, and add fans in front to push the air in. The PSU that came with it was obnoxiously loud, but thankfully, Supermicro has a quiet version that I can't even hear. Even if SM didn't have this PSU, I could have easily modified the PSU and fit some noctuas in there without any issue or safety concerns - like I did with my enterprise grade Mikrotik switch that also had obnoxious fans by default.
I even have an enterprise grade UPS that is dead silent when it's not running on battery power (I swapped the fans there too).
I essentially try to buy enterprise gear whenever possible. Not only is it usually much better than the consumer alternative, it's also frequently much cheaper thanks to the second-hand market. Before AI sucked the soul out of the hardware market in general, you could have bought enterprise SSDs with life expectancy (TBW) measured in petabytes and an MTBF of practically never, for half the price of a top consumer SSD with TBW measured in tens of TB and an MTBF of yesterday.
And the entire rack is just slightly louder than the PC I was using.
The only consumer-grade computers at my home are my MacBook and my phone.
Enterprise SSDs are all that. Just make sure you keep them powered up: for data retention without power, the spec is 3 months for enterprise vs 1 year for consumer grade.
I had exactly this problem, 1U server that sounded like a 747 taking off downwind. I solved it by getting a mini-PC that had more processing power than the eBayed 1U server (I just looked up what was available in terms of CPUs and got the best bang per buck, an 8C16T AMD CPU) and that runs essentially silent except when it's under load - they're designed for low-power/silent operation. If you're running your server at 100% load 24/7 then this isn't for you, but for home "server" use it was ideal.
Assuming this trend continues, I think people are going to start re-using older hardware rather than turning to server-grade hardware (which is often not convenient for the average residential situation).
At least, that's what I hope happens. What will probably happen is people will continue to migrate away from the PC platform and towards closed platforms for the convenience, if history is any indication.
That's what I've been doing for years. I buy (or get for free) enterprise PCs coming off lifecycle at surplus sales. Nothing I do at home needs a cutting-edge CPU. Unless you're a hard-core gamer or a serious hobbyist/tinkerer, a 5-year-old or even older PC running Linux is very adequate.
I think this is already happening, sort of. At least, people are hanging onto their older-but-not-yet-old components for much longer than they used to. I recently tried to build a NAS from eBay parts, and I was surprised to find that the newest stuff affordably available was 6th/7th generation Intel Core parts (retailed 2016/2017). I think people are trying to offload these CPUs in particular because they can't run an unmodified Windows 11 installation (no firmware TPM 2.0 implementation, and the corresponding consumer motherboards typically didn't have a discrete TPM module, either, if they had an LPC bus connector at all). Very little (reasonably-priced) availability of similar-aged Ryzen CPUs (which have firmware TPM support) or newer Intel CPUs.
Why would most people need a home server rack? That's a lot of noise, space, and electrical usage. For what most people would need a home "server" for a NUC PC or Mac Mini would do the job.
Ziply Fiber is offering 50 Gbps home internet connections in some US locations. You cannot utilize that type of speed with a Mac Mini. Even the more modest 8-10 Gbps connections offered by T-Mobile and Google probably require more.
VPNs. If you have a NAS and require high-speed access from/to your home files (dumping your Apple ProRes RAW rushes off your external SSD, so you can keep shooting your video, for instance), that kind of bandwidth cements your income.
> You cannot utilize that type of speed with a Mac Mini.
Mostly because the base Mini has Thunderbolt 4, which maxes out at 40 Gbps. Anything with a PCIe 4.0 x16 slot will take a 100 Gbps NIC. 100 Gbps is around 10 GB/s (8 bits per byte, plus encapsulation overhead). Desktop CPUs can do AES-GCM at 2.5+ GB/s per core, have up to 16 cores, and have around 50 GB/s of memory bandwidth (dual-channel DDR4-3200), so the NIC still seems like the bottleneck.
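The arithmetic in that comment can be checked directly. All of the per-core, core-count, and memory-bandwidth figures below are the commenter's rough numbers, not benchmarks, and the ~20% lumped overhead allowance is an assumption:

```python
# Rough check of where the bottleneck sits for a 100 Gbps NIC on a
# desktop box. Figures are ballpark numbers from the thread, not
# measurements.
nic_gbps = 100
nic_GBps = nic_gbps / 8 * 0.8            # ~20% lumped framing/encapsulation overhead -> ~10 GB/s

aes_gcm_GBps_per_core = 2.5              # per-core AES-GCM throughput (assumed)
cores = 16
crypto_GBps = aes_gcm_GBps_per_core * cores   # 40 GB/s of aggregate crypto

mem_GBps = 50                            # dual-channel DDR4-3200, roughly

bottleneck = min(nic_GBps, crypto_GBps, mem_GBps)
print(f"NIC ~{nic_GBps:.0f} GB/s, crypto ~{crypto_GBps:.0f} GB/s, RAM ~{mem_GBps} GB/s")
print("bottleneck:", "NIC" if bottleneck == nic_GBps else "CPU/RAM")
```

Under these assumptions the NIC is indeed the slowest stage by a factor of four or more, which supports the comment's conclusion.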
Wouldn’t recommend a home server rack in an apartment. For high wife approval factor, you can put Epyc hardware with Noctuas in a bigger case. I’ve got one at home. Runs my blog and a bunch of other things. Home is at 32 dB right now.
Realistically a Mac Mini will probably blow a lot of things out of the water on price / performance. Even an older one.
> ... or they are pushed towards server grade components. I already have home sever rack, and would recommend it for other people.
An actual rack with noisy 1U or 2U servers may be a bit overkill but on the plus side there's a guaranteed endless supply of such used servers.
Now there's a happy middle ground: used workstations with ECC memory, that you then use as servers.
People would be really wise not to underestimate what, for example, a 12-year-old dual-Xeon with 14 cores each, 56 threads in total, can do. And such a complete workstation can basically be found for less than what it takes to fill my car's gas tank (granted, it's got a big tank and it's a fancy car whose manufacturer recommends 98+ octane only).
A single-Xeon workstation with a shedload of memory in a tower form factor is basically silent. Mine is: dead quiet, sitting next to the vacuum cleaner and the cat's food in a tiny room. I use it as a headless server.
And that's with the default PSU and fans. There are, of course, people modding these with adapters for regular consumer PSUs and then putting ultra-quiet PSUs in those. Same with Noctua fans etc.
And as for the usual complaint, "but a server that is on 24/7 consumes too much electricity": I only turn on my servers at home when I begin to work. I don't need them to be on 24/7.
So yeah: "server CPU + ECC" doesn't imply noise. And "server CPU + ECC" doesn't imply it has to be on 24/7 either.
Contrary take: I believe we will see an expanded market for capable PCs that can be sanely put in a living space. By extension of the gaming PC niche to local AI. Both NVidia and AMD are developing product lines in that direction (DGX Spark, Ryzen AI Max). And Linux will be more prominent than ever, due to several independent reasons: MS dropping the ball hard on Windows, SteamOS making Linux attractive for gamers, 'digital sovereignty' as a trend, and Linux being the de facto standard for hosting AI (or anything really).
Well, the two chips I mentioned (DGX Spark uses the GB10) are both SoCs, so no motherboard needed there. I don't know if that's the full explanation, but it could be a factor.
The SoC design with unified memory is generally well suited for residential use because it's quite energy-efficient, quiet and small (compared to traditional GPU-powered gaming rigs). Great performance-per-annoyance, so to say.
The problem with all those devices you listed is that they have lost the "general purpose" ability. I guess you could define "general" to mean "carefully curated"...
"Despite this drop in sales, these companies aren’t exactly struggling. Asus, Gigabyte, and ASRock have pivoted some of their production towards AI servers, allowing them to capture some of the investments that hyperscalers are generously pouring into their data centers. But if you’re planning to build a completely new PC from scratch, you might be able to find good deals on motherboard combos, especially as retailers are keen on getting their inventories moving. "
----------------
1. Within a few months, these manufacturers will likely raise desktop mobo and CPU prices with the justification that "volumes are too low".
2. If you're upgrading from an older machine, it likely uses a generation of RAM that's not compatible with newer boards. Upgrading the cheap parts now and waiting for the expensive bits to come down is simply not an option. It's all or nothing.
Game and application developers should be paying close attention to this. You're used to the average user's system spec going up every year. That's stopped for now. The average memory in new systems may actually retreat!
Anecdata to add to the pile... I pulled three 1U EPYC Gen 2 servers from my production rack 1.5 years ago and replaced them with lower-power alternatives for a production storage cluster. I didn't need the extra CPUs for app server stuff, so they sat in my house for a while. Fast forward 1.5 years and it was making sense to upgrade some app servers to new-gen stuff and get a bump in frequency and core count. When I went to spec some new servers, my normal $15k-$20k build was $55k.
Instead, I hit up eBay, got six used Gen 3 processors, found a "good deal" on a couple TB of new RAM (still insanely expensive), and came out with the same overall horsepower for a total of $20k instead of $110k.
I know this is about consumer desktop, but seeing the comments about upgrading old hardware caused me to chime in. This is happening in the production/enterprise level in some segments.
Motherboards used to be $100, $200 for the high end. Now they want $300+; RAM is crazy, storage, video cards, etc. I'm not surprised sales of these components are hitting a wall.
I don't really agree with this. Motherboard prices haven't been moved at all by AI.
I would also say that most consumers, who are almost exclusively buying gaming-oriented boards, do not need anything high end. They can pretty much buy the cheapest board available.
I am shopping around for a mini ITX board and the difference between something at $180 and something at $400 is basically one to two faster USB ports, which are pretty much irrelevant on desktop computers, and a few minor conveniences that I imagine most people can do without.
The higher-end chipsets add no discernible advantage and there are no CPUs that are unsupported by the lower end chipsets (on the AMD side, at least).
The high end stuff is just available for people with a lot of money.
I am massively sick of gaming focused boards. I don’t want my board to be “tough” or “mil-spec” or be extra shiny or have fancy-proprietary-auto-overclock. I want a reliable board that complies with all the specs it claims to support. Low idle power consumption would be nice, too.
This is obnoxiously difficult to shop for in the desktop/workstation space.
The PCIe lanes are the worst. You have x16 slots that run at x1; you need to check slots that share lanes with M.2 to make sure an x8 doesn't become an x4 when you insert storage. Wait, if I plug something into the Thunderbolt port, my 10G network card runs at half speed? Obviously these are real physical limitations from PCIe lane counts, but it makes boards impossible to search for. Just painful.
My advice to anyone shopping for a motherboard is to read the manual on the manufacturer's site before deciding. The PCIe lane tradeoffs tend to be in the block diagram next to the contents page.
This is exactly why my comment goes over the heads of people who cry "just get a basic board". No, this is why the basic boards for $100 don't cut it. You need to dive into the technical data and realize that the $100 board seems like a deal for a reason, and suddenly the $300+ category is your only option if you want a PC that doesn't run on fake specs.
They exist to partition capability so that enterprises can’t connect all of their peripherals and some ECC memory to get the same functionality for 1/10 the price. It’s not a physical limitation.
Obviously market tiering is part of it, and you can play tricks with the chipset and PCIe switches (which adds cost), but a Ryzen board that advertises a PCIe 5.0 x16 GPU slot and a 5.0 x4 M.2 slot only has 4 lanes left to work with from the CPU (i.e., the CPUs only have 24 usable lanes). While you can play with generations to stretch those lanes, that's still effectively ~16 GB/s. That needs to cover networking, extra M.2 slots, USB, as well as the extra PCIe slots.
I don't mind having to work within those physical limits, but I do want to be able to search for boards that support N components, e.g. 1x 4.0 x8, 2x 3.0 x8, 4x 5.0 x4. But the best you can search for is the physical sizes of the PCIe slots, and then you dive into a spec sheet for each board, only to find that the six x16 slots only have 1.0 x1 of bandwidth each.
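The lane-budget check you end up doing by hand from the block diagram can be sketched as a few lines of arithmetic. The slot list below is hypothetical; the 24-lane figure is the consumer Ryzen number from the comment above:

```python
# Toy PCIe lane-budget check for a consumer Ryzen board: 24 usable
# CPU lanes, of which a Gen5 x16 GPU slot and a Gen5 x4 M.2 slot
# already consume 20. The device list is a hypothetical build.
CPU_LANES = 24

devices = {
    "GPU slot (Gen5 x16)": 16,
    "M.2 slot (Gen5 x4)": 4,
    "10GbE NIC (x4)": 4,
    "second M.2 (x4)": 4,
}

used = sum(devices.values())
print(f"lanes requested: {used}, lanes available: {CPU_LANES}")
if used > CPU_LANES:
    print(f"over budget by {used - CPU_LANES} lanes -> something gets "
          "bifurcated down or hangs off the chipset's shared uplink")
```

Even this modest build oversubscribes the CPU lanes by four, which is exactly the situation that forces the "x8 becomes x4 when you populate the M.2" footnotes in the manual.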
My parents bought a mid-tier PC for $3,000 (in 1995 dollars) and there was still a thriving PC industry at those prices! While things are getting more expensive now (and that sucks) we have had it really good for a long time.
you mean you don’t prioritise helping your landlord buy their newest mcmansion? i’m just happy to have a roof over my head and continue to pay ever increasing rent!
I've been buying whole 2020-era PCs for around the $200 mark. As long as you don't need crazy CPU or GPU grunt, which describes most people, they are almost indistinguishable from a new one.
Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP. Probably same or better on Linux.
Upgraded my CPU the other day; got a Ryzen 5 5600 for ~$100 new, can't complain. Still on my RTX 2060, can't complain either. As long as you don't fall for the 120 Hz and 4K memes, you can easily get by with 2020 hardware indeed.
> Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP
I don't remember Win10 being particularly lean (although I'm sure 11 is worse). And the M4 is definitely a much more powerful CPU. Can you not run Firefox and uBO on that? Or have they really weighed things down that much with the OS somehow?
> Probably same or better on Linux.
Even with the Cinnamon desktop environment I can vouch it uses considerably less RAM for just the desktop (ordinary applications are probably about the same) and offers much faster filesystem access by default. I'm sure this is at least partly due to not being weighed down by built-in anti-malware (that would do basically nothing for people who are comfortable using Linux in the first place).
> Motherboards used to be $100, $200 for the high end. Now they want $300+
Entry level motherboards are still $100.
$300+ is a very high end motherboard.
The existence of very high end products is confusing because it can give the impression that you have to buy a $300 motherboard because it exists. If you compare features side by side you're rarely missing anything important for the entry level motherboards.
Some people really want the best of the best and feel the need to buy motherboards with Thunderbolt 4 and other future-proofing measures just in case they might need them, but it's premium and luxury territory.
Future proofing is an expensive way to pay for features you don't need and will probably never use.
It's smarter to buy a cheap motherboard that meets your needs now. If in the future you find the need for USB4 or some other feature, upgrade the motherboard.
More often than not, builders will try to future proof for eventualities that don't arrive before it's time to upgrade to the next CPU socket anyway. There are a lot of people with expensive, outdated "futureproofed" builds who would have been better off saving the money on the original purchase so they could upgrade sooner instead.
This. In 2017 I bought the cheapest AM4 motherboard with a USB-C port (a Gigabyte X370 Aorus Word Salad). I'm still using it because BIOS updates gave it Zen 3 support.
Wanna guess how many times I've used that USB-C port? Maybe once or twice in the 9 years I've owned it. Never needed it. I also couldn't tell you what X370 is getting me that B350 wouldn't have gotten me.
When you try to future proof, you are basically hedging. It's a kind of insurance; sometimes it pays off, sometimes it does not. Having more disposable income now than I did 10 years ago, I tend to pay more attention to this sort of thing, but everyone can choose where they put the cursor. Someone who overestimated their RAM needs when buying a computer last year is probably pretty happy about it, but it could have swung the other way.
I future proofed by stepping back to high end components from last generation (except for GPU). My memory speed is slightly lower, but I have 32 cores and 128 GB ECC RAM on 4 channels. I doubt I will need to upgrade this thing any time soon for my typical use cases.
Note that this was before the RAM shortage, but I bet you could still do this now and save a little versus mid-tier current gen gear.
Buy a $300 motherboard now in case you need future features, or buy a $100 motherboard now that does everything you currently need and then buy a second or even third $100 motherboard if you ever actually need those future improvements.
Then you get a new board designed for the new features instead of something several years old and you come out $100 on top.
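The trade-off above works out as simple arithmetic, using the thread's own round numbers (which are illustrative, not quotes for any specific board):

```python
# "Buy premium now" vs "buy cheap now, replace when actually needed".
# Prices are the round numbers from the thread, purely illustrative.
premium_now = 300
cheap_board = 100

# Even after one replacement you're $100 ahead of the premium board,
# and only a second replacement brings the totals level.
for replacements in (0, 1, 2):
    total = cheap_board * (1 + replacements)
    print(f"{replacements} replacement(s): ${total} total vs ${premium_now} premium")
```

And, as the comment notes, each replacement is a board that's newer than the "futureproofed" one would have been by then.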
Futureproofing is nonsense. PCs just don't work that way, and haven't for decades.
> Buy a $300 motherboard now in case you need future features, or buy a $100 motherboard now that does everything you currently need and then buy a second or even third $100 motherboard if you ever actually need those future improvements.
Right, but the problem is that by now your $100 new motherboard requires a new CPU and new RAM. Which is very much not $100.
In the past we got away with PCI cards to add features without changing the motherboard, but we still ended up changing everything every 2 years anyway…
I would only agree if you already plan on doing major hardware upgrades within like 3 years at the latest. Past that and you will inevitably be missing new features that will be shipping even on budget hardware and won't be saving on anything.
Entry level motherboards used to be just fine to use. The last time I was shopping, they all had a random deal breaker in terms of a missing feature. Maybe I’m just pickier now, but I doubt it.
I mean, if you think about all the motherboard does, and how many layers the PCB needs to support all those features, such that for the vast majority of users the only things you need besides the motherboard are a CPU, some RAM, storage (either M.2 or SATA), and maybe a dGPU, it's wild that it's often the cheapest item in the PC.
Not just motherboards. Cases, PC accessories (fans, etc), consumer SSDs, and more. Cases are especially hard hit, apparently, as they're already quite a low margin business.
Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
Last year I said I should have upgraded my 1060 last year.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
Sometimes you just have to accept the current pricing and buy what you need to buy (assuming you need to buy anything at all).
7 years ago it was the same price, but then again, the last 7 years have involved accelerated inflation. So, the same price is actually a lower price.
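That "same price is a lower price" claim is easy to make concrete. The ~28% cumulative inflation figure below is an illustrative assumption (roughly in the ballpark for the US over recent years), as is the $200 sticker price:

```python
# Same sticker price 7 years apart is a real-terms price cut.
# Both numbers here are illustrative assumptions, not data.
nominal_price = 200          # hypothetical GPU price, then and now
cumulative_inflation = 0.28  # assumed cumulative inflation over the period

real_price_today = nominal_price / (1 + cumulative_inflation)
print(f"${nominal_price} today is roughly ${real_price_today:.0f} "
      "in year-of-purchase dollars")
```

So a card that held its nominal price over that span effectively got about a fifth cheaper.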
If you're looking for a card in the sane $300 area, the Intel Arc B580 (12GB) and the RX 9060 XT (8GB) are reasonable value. If you want 12GB+ from Nvidia or AMD, the used market in previous generations is a good place to look: maybe something like an RTX 3060 (12GB) or RX 6800 XT (16GB).
I personally don't think the GPU market is incredibly miserable. Maybe I am just used to the pain or something? Nvidia has a bit of a tax, but something like the RX 9070 XT is basically the 3rd-fastest gaming GPU money can buy, and it's around $700. (I'm not sure why the 5070 Ti costs $200 more, even given Nvidia's software advantages. It performs almost identically; it just doesn't make purchase sense.)
Agreed. I build a system every ten years and I've got 6 years to go. AM4 works great, and I've managed to hoard enough ram and drives to hopefully cover any concerns for the next 4 years. Things work, they are stable, and I feel super lucky for that.
I invested quite a bit in enterprise-level homelab equipment from 2020 to 2025 (about $10k). Happy I made it before the big bang. E.g., my SAS He8 drives will last at least till 2035. But what then? I want my children to be free, too.
When investors stop to ponder whether they are ever going to see any return on their superhot AI investments, you'll have all the cheap hardware you could ever want.
I still think it pretty much was the last major generational upgrade in graphics. An early PS3 game looked night and day better than a late PS2 game. Meanwhile, an early PS4 game looked only marginally better than a late PS3 game, and most PS5 games don't look noticeably better than a PS4 game.
I don't mind that graphics have plateaued, because they aren't the important bit. If anything, I would rather that devs stop trying to chase graphics and make more games with shorter dev cycles.
> Meanwhile, an early PS4 game looked only marginally better than a late PS3 game, and most PS5 games don't look noticeably better than a PS4 game.
Partially this is because there was usually an overlap in sales between the early PS4 and late PS3, etc. If a game has to support both console generations, it won't truly be able to take advantage of the newer-gen stuff.
Texture resolution and shadow resolution do a lot to make a game look better. The big difference between the PlayStation 2 and 3 was the massive jump in texture resolution, shadow resolution, and model polygon count. Play Gran Turismo 5 and look at one of the cars imported from Gran Turismo 4 for a good example. That said, the PlayStation 2 was capable of some very high-polygon models, as evidenced by Lulu's cutscene model in Final Fantasy X, which rivals most PlayStation 3 player models in detail.

Those resolution upgrades, the number of objects (not just polygons) displayable on screen, and the increased distance at which low-poly LOD models kick in all made that giant leap possible and very visible. Since then it's mostly been camera effects such as depth of field and ambient occlusion, which are much less noticeable. Though for those with keen eyes, only in the current generation are there textures without noticeable aliasing artifacts, a result of being able to split the UVs now that higher resolutions make small UV faces possible.
Since we're 10 years on at this point, I feel pretty confident saying the plateau to my eyes landed somewhere between the PS4 in 2013 and Pascal (GeForce 10-series) in 2016.
I've kept playing games and upgrading my GPU every other generation, and they're still fully utilized, but I can't really see where the additional compute and money is going. My biggest visual upgrade during that time was actually going from LED to HDR OLED which is something that requires virtually no additional processing power.
What's wild is with all this craziness going on, it is sounding like AMD is bringing back the 5800X3D for another kick at the can. AM4 has got to be one of the greatest platforms to ever exist.
I'm one of the collapsed sales. My desktop had died, and I had been thinking about rebuilding it.
But RAM prices went to the moon, so I instead opted to repair the desktop. (It's only ~15 years old.) It's alive, again, and performs well enough.
The HDD in it is pretty old (not as old as the rest of it, it's on its second drive; 15 years would be quite impressive!), and still works for now, but there too, prices are silly and well above inflation. (I looked it up again: the same HDD is 50% more expensive today than when I bought it, in real, accounting-for-inflation dollars.)
I've been replacing older systems with last gen hardware off ebay. I'm typing this on a Thinkpad T14 i5-1250p 512GB 32GB WWAN I picked up last week for $370 all in.
Since this mess started, I've bought dozens of unused and like-new systems for clients. All with modern hardware - in the $250-$600 range.
Craigslist still exists, too. Found a really nice ~10 year old HP workstation for $100 and crammed as much DDR3 ECC RAM (still cheap) and the best Ivy Bridge CPUs (cheap) I could find into it, and it shreds.
Shortages or not, there's little demand for cool new motherboards and CPUs from the enthusiast corner of the market because hardware platforms themselves are stagnating performance-wise.
13th/14th-gen Intel Core CPUs are still more than enough for your average home gamer; Zen 5 shows only marginal improvement over Zen 4 except for a very narrow range of workloads; getting wider than a 128-bit memory bus is prohibitively expensive, while relatively cheap consumer boxes like the Mac Mini run circles around dual-channel DDR5 setups; and so on.
Sure, presenting this as a consequence of the AI boom is convenient for a news outlet, but even before the craze both Intel and AMD were dragging their feet.
I'm not buying it. Both the premise and the new motherboard, that is.
I wanted to build a Threadripper 9965WX system, and the math worked out until DDR5 prices came into play. Instead I got a used Lenovo P620 with a 5975WX, and still had to buy DDR4 from Shenzhen to get anything remotely affordable. Zen 5's IPC is a meaningful uplift, especially for single thread, but it is out of reach.
10-12 months ago I commented here that people were not realising that AI is going to price us normal people out of computer hardware, and that we need China to actually reach parity on node size. And sadly it looks like I was correct in my prediction.
But in a way I do agree with you, though I doubt it is as organized as you imply. Yes, companies and governments do not want anyone on a General Computing Device at all. They want to see exactly what content you are viewing and responding to.
Microsoft and Apple have been slowly adding various forms of spyware and locking down what applications you can use. And cell phones? Those are the Holy Grail of where Microsoft and Apple want to move your laptop/PC.
Right now Linux and BSD are the only games in town for non-spyware systems. But the new age verification laws seem to be a first attempt to lock down even Linux :( Since the Linux Foundation is backed by large corporations, I feel that attempt will succeed. The BSDs? Right now they seem to be flying under the radar.
Why do you doubt this when the rich also have Signal? They meet and talk out of view? The insider trading coming out of Washington?
Why, when emails from discovery in the labor disputes between Google and Apple in the 2010s revealed they engage in exactly the sort of manipulation you disbelieve?
Because they own nothing but make believe stocks and life works great for them.
The mega-rich are 100% decoupled from physical reality. May as well treat them more like tribal shaman, priests, preachers, and rabbis.
Just parroting memes the likewise idiot politicians believe are the magic chants that keep gravity itself pulling together the Earth.
"Omg he said the thing! Cut his taxes! Give him welfare!"
Our generation of leaders was raised in a pre-science-and-information world. They rely entirely on cult of personality, as their meat suit never sees itself engage in the labor it relies on to live. It's well aware, intuitively, of how fucked it is. Must continue to stand in the pulpit!
I think treating them as the fae, vampire or demons is sort of insulting. Those creatures are at least bound by supernatural laws and can be negotiated with in some way.
Bold claim given all the hate out there for covering up Christian leaders diddling kids, slaughter of Palestinian kids for not being Israeli Jews, and the beheadings and assassinations coming out of Muslim-landia over trite offenses.
I think you conflate informed consent with "brainwashed as children into fealty via allegory of the end times, and threats of violence if they don't comply."
Nah. The first two thirds of the 20th century was the science and information world. Man gained mastery of the skies, the depths of the sea, the void of space, the atom. We were taming diseases and found a way to end hunger. We started building thinking machines. We were playing with the fire of the gods. Science was working miracles on the daily.
It still is, but nobody gives a shit anymore, we are in the financialization and rent-seeker world now.
> we are in the financialization and rent-seeker world now
sometimes i wonder if this is what happens when organic growth stops/slows; when (for lack of a better word) desperate people just start looking for any alternative to keep the growth train running...
It might just be frustrated young people. They're getting squeezed real hard by a system that was set up to put them on an impossible trajectory before they were even born.
You can see the divide everywhere. People with lots of money think supply and demand, congestion pricing, etc. are great tools because those don't impact them at all compared to people on the bottom. Those are only good solutions if you're not the one falling off the bottom rung of the ladder.
Is it really shocking that people are upset to see the supply of resources being cornered and hoarded by the ultra rich with the most likely outcome being the only way to get access to those goods will be to pay forever?
The possibility of AI becoming a must-have knowledge repository or memory assistant is scary if you couple it with the idea of never being able to own it. How much is your memory worth? What if you can't compete in terms of productivity without having access to AI? What about the people that can't afford the "first month of rent"?
People come in and make angry posts like the GP because they know they're getting disenfranchised and don't have the power to do anything to change it.
I think it’s probably mostly what your sibling comment said, it’s very cheap to sow division and discord now.
I get what you’re saying, and there definitely are people who are angry about the US slipping, and standards of living reverting to the mean a bit, and looking to blame someone. The True Believer came out in the aftermath of WW2 and tried to analyze why it happened, and laid out that the most dangerous group of people aren’t the ones who’ve been poor for a long time, but those who were recently poor, who remembered a more prosperous time. Those people get tremendously angry about it, and represent fertile ground for politicians and motivated groups to plant the seeds of hate.
People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
> The True Believer came out in the aftermath of WW2 and tried to analyze why it happened, and laid out that the most dangerous group of people aren’t the ones who’ve been poor for a long time, but those who were recently poor, who remembered a more prosperous time.
Is it just people trying to sow division when you're potentially describing an entire upcoming generation?
> People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
I don’t agree. The current generation of young people can’t afford housing and education without taking on decades of debt. Buying a pair of 3090s for local AI isn’t even on the radar. Even if they could, it’s unlikely they’d be able to make productive use of them. The big AI companies haven’t even scratched the surface when it comes to memory, specialized knowledge, etc.
I see people downvoted my comment and I’m not sure why. I’m not trying to pile on to create drama. I’m trying to explain there’s a growing cohort of people that have a right to be angry because they’re watching global productivity increase as their standard of living is decreasing. Who wouldn’t be upset?
The dangerous part is that people angry about it are easy to sway with propaganda. It’s not the billionaire families colluding to fix food prices, which happened with bread in Canada, it’s the “insert another marginalized group here” that’s causing the problem.
I think commenters with new accounts who comment only on political topics, and never technical ones, on Hacker News are a bit suspect. Not saying that there aren't a lot of disaffected youths out there, there totally are, and I agree with you about that bit from The True Believer. I just have a suspicion that a lot of these new politics-focused accounts aren't real. But maybe there are real young people who come to HN just to discuss politics; I guess tech has become more political over the last few decades.
I didn't mean that most people are going to go out and drop $1,000 and run their own models locally, I meant that it's pretty good evidence that they're not permanently locked out of owning access to AI, if that's a priority to them.
I agree with most of the rest, I'm a strong proponent of all sorts of safety nets, and higher top tax rates/cap gains tax rates. But it's also important to maintain perspective. A lot of what's happened is that citizens of very rich countries are maybe seeing their standards of living decrease somewhat while many more people globally are seeing their standards of living skyrocket. Visiting family in China every 5 years, the difference is astounding every time.
Upvoted that comment, fwiw, you answered in good faith, not sure why it's downvoted.
You wave off systemic issues as no big deal and discuss the potential of a 3090 graphics card. Tell us you're a privileged first worlder without telling us...
That you refuse to discuss solutions to political problems impacting a lot of people who, in our society, are off the hook for you too; you're deciding to take the risk that your own life doesn't vanish.
You're not relevant to others. Americans lack of political action to ensure a safety net exists for everyone just leaves everyone indifferent should you too end up giving blow jobs behind a Burger King for a portion of kids meal someone threw out a car window should it come to that for you.
So go ahead and pretend reality doesn't exist outside your own experience, little Dark Triad. But if you end up penniless in the gutter, you'll only have yourself to blame.
I totally agree that we need to be taking better care of each other, our system's a mess, but I wasn't planning on getting into a big discussion about that tonight from my phone.
The point about 3090s was that reasonably good local AI costs on the order of $1,000, so Americans aren't structurally locked out of owning the means to run their own models like the person I was responding to seemed to be claiming. If you can afford a desktop, local AI is in reach if owning it is a priority for you. I don't recommend that route, but it's possible.
From your other comments, sounds like you're also a "privileged first worlder" who got to go to college and attend Burning Man, so let's not fling stones. I'm extremely lucky, I'm extremely aware of it, a visit to some of the actual poorest parts of the world, where people wash themselves and their clothes in rivers that stink so badly of sewage that it's hard to breathe without gagging made me very aware of how lucky even the poorest Americans are, despite how bad it can feel to be in close proximity to some of the richest people in the world when you're not.
And if you're not an account who's part of an "AI-fueled agitprop campaign", I'm sorry for whatever's happened to you that's given you so much rage that you're feeling the need to come here and dump on nearly everyone you've interacted with. I hope things go better for you in the future, I really do.
Hobbyist equipment is still relatively cheap. You can get previous-gen hardware for formerly current-gen prices, you can run lots of “hobbyist” software on low RAM and no GPU.
Or live with lower specs. You can still get a Chromebook for $200 and install Xubuntu on it. I did that, and it's perfect for video conferencing and web surfing. You certainly can't play CoD on it, but even if you're looking to play games there are some older games that run on it just fine.
Who cares if Qualcomm owns Arduino. It has never been cheaper to get into embedded computing. You can buy Arduino-compatible STM32 Nucleo boards straight from STMicroelectronics for $15-20, and that's first party. If you're willing to buy third party clones there are boards on AliExpress for $10 or less.
At current prices, Chinese companies could even produce everything possible (~anything but current gen CPUs and GPUs) on slightly older nodes and make a stonking profit while lowering market prices.
If only the rest of the world could buy it, it would probably work almost as well (edit: to lower prices in the US). Besides, I'm in the rest of the world ;)
It'll be the same for Canada. We're already seeing satanic panic style action against things like TP-Link networking equipment and Hikvision cameras. Funny how those are a couple of the brands that can run 100% locally without a connection to the internet.
I'm not sure that's going to change unless the LLM stuff slumps. Chip makers get burned every few decades by building boom-time capacity that comes online just after the crash. They're going to be reluctant to make big capex spends unless they think the demand is durable.
After the recent run-up, where are prices on a per-performance basis? Back to 2019?
Computers were incredibly more expensive when I was growing up. People bought them anyway.
Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
I think your idea of what "normal people" can afford is a bit off. Normal people aren't buying $1500 computers. And they definitely aren't buying $3000 computers.
A 4K Apple ][ cost the equivalent of around $7K when released. A C64 cost the equivalent of around $2K when released. Both were fairly popular and vastly less useful than a computer today.
If the cheapest useful computer ends up costing $3K, it will still be purchased and will still be worth it at around $1/day of useful life.
The C64 sold "between 12.5 and 17 million units" in its lifetime [0], vs. worldwide PC shipments of "71.5 million units in the fourth quarter of 2025." (emphasis mine) [1] It's truly an apples (hehe) to oranges comparison, and in my opinion it only reinforces the point that "normal people" will no longer be able to purchase computers, just like the C64 was not a mainstream product.
That seems like a false equivalence to me, even if we ignore the fact that only 21% of that revenue came from non-US countries. There are enormous chunks of the world where the local equivalent of $1500 is a life-changing amount of money.
How do you not understand the difference between spending $5 once or twice a week and having to cough up $3k all at once.. or paying 20-30% interest if they can't afford to pay that.
It's extremely obvious from your flippant attitude that you are doing quite well financially and are completely out of touch with the financial realities of the vast majority of people. Congratulations on your financial success, but maybe lay off on thinking that everyone else can afford the luxuries that you can.
It was nice, in the 90s and 00s when computing hardware's cost was just falling so rapidly. I think it was like what, 1.5x "stuff" each year? Like RAM going 1.5x bigger every 12 months, CPU frequencies increased by that much. Per-unit prices were falling.
Now, per-unit costs are rising faster than inflation. The WD HDD I bought in 2017 for $65 real ($49 nominal) is now $95 real, 50% more expensive after inflation.
Trust me when I say my income has not increased by 50% post-inflation since then! (Also … I really should not have checked that number. Needless to say, it's not positive.)
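For anyone who wants to reproduce that kind of comparison, the inflation adjustment is a one-liner. The CPI index values below are approximate assumptions for illustration, not exact figures:

```python
# Napkin math for the HDD example above. CPI index values are rough
# assumptions; plug in real CPI-U figures for an exact answer.
nominal_2017 = 49.0                 # purchase price in 2017 dollars
cpi_2017, cpi_now = 245.1, 325.0    # assumed CPI index values, then and now

real_2017 = nominal_2017 * (cpi_now / cpi_2017)  # 2017 price in today's dollars
price_now = 95.0                    # today's sticker price
increase = price_now / real_2017 - 1
print(f"2017 price in today's dollars: ${real_2017:.0f}, real increase: {increase:.0%}")
```

With these assumed index values the real increase comes out in the neighborhood of the ~50% figure quoted above.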
"Normal people" were not purchasing Apple II or C64 computers in the 70s and 80s.
What you're showing me is that you are completely out of touch with the financial realities the vast majority of people face.
There is a reason that the Macbook Neo has been a smashing success.
If the cheapest useful computer ends up costing $3k, then most people will simply no longer own a computer whenever their current computer dies unless their livelihood depends upon it, which for most people it does not.
> Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
Maybe it's different in the US. In Canada, the median income for 25-54 years old was just under $60k / year in 2024. When you're talking about a $3k USD computer, it's pushing 10% or more of the median after tax income. My gut reaction to that is that most people don't even end up with that much disposable income in total, let alone for a single purchase.
HN is skewed with people way at the top end of income earners, especially on a global scale. Imagine getting $30k / year to spend on everything you need and then consider how much $3k on a computer is.
My dad had to take a loan to buy our first computer. Who wants that? It's dumbfounding to see the number of people cheering on backwards progress where we end up where we were 3+ decades ago.
> When you're talking about a $3k USD computer, it's pushing 10% or more of the median after tax income.
If it lasts for 10 years, it's more like 1% of the after tax income of a median individual earner over that period.
I think a computer is clearly valuable enough that people will entirely rationally spend 1% of their income on it if that's what it costs. (I'm not "cheering it on"; I'm just observing and predicting that lots of normal people will still buy computers.)
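As a rough sanity check on that amortization argument. The income figure here is an assumption (an approximate after-tax median for an individual earner), not a number from the thread:

```python
# Cost-of-ownership sketch: upfront hit vs. amortized annual share of income.
# All figures are illustrative assumptions.
computer_cost = 3000.0       # USD, the hypothetical "cheapest useful computer"
useful_years = 10
after_tax_income = 45000.0   # assumed median individual after-tax income, USD/yr

upfront_share = computer_cost / after_tax_income            # pain in purchase year
amortized_share = computer_cost / useful_years / after_tax_income
print(f"upfront: {upfront_share:.1%} of one year's income, "
      f"amortized: {amortized_share:.2%} per year")
```

Under these assumptions the upfront hit is several percent of a year's income, but amortized over the machine's life it lands under 1% per year, which is the core of the disagreement above.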
Computers really aren't that valuable to the average person who already has a smart phone. For everyone else, many probably have a work issued computer, and don't need one at home.
The market for high end home hardware is really only gamers and tech workers, and gamers will fall back to closed hardware fast if price/perf pushed them to do it. A big reason PC gaming thrived 2010-2020 was PCs were better on a price/perf basis.
This is really a two-for-one for the AI companies: they lock up the hardware market for their growth while also making sure no-one can buy hardware to host models locally.
High end resins and epoxies are in a critical supply shortage right now. I suspect that there are going to be some serious resource driven PCB shortages in the very near future.
When photosynthesis first appeared, the oxygen it produced poisoned the existing life. Sulfur-breathers basically disappeared. In the geologic record the oxygen shows up as massive layers of iron oxide which we mine and turn into steel now. New things can radically shake up the existing environment, the degree of shakeup is the measure of how radical it is.
I'm thinking about how I jumped on getting a new PC a little over a year ago anticipating tariffs would balloon prices. Turns out I made the right choice but for the wrong reasons (not like the tariffs are helping either, but just wasn't as big of a factor).
I'm sure the AI shortages are hurting, but also I'm still using my same motherboard from 2020 and I see no reason why I should have to upgrade in the next 2-3 years (whenever I buy my RTX 7070Ti, it might be time, but maybe not even then).
hope you're handy with a soldering iron (reflow station?), because eventually the passive components are going to start failing, and I don't imagine you'll be able to plug off-the-shelf components into a Dell
AI is simultaneously the reason you can't buy a motherboard and the reason you don't need to build a PC anymore. The industry is eating itself from both ends.
AI is exactly the reason why I would like to build a high end PC. I'm interested in this technology but I don't want to have anything to do with AI subscriptions and big tech in general.
Progress, in any meaningful sense, has to mean we are more capable of sustaining ourselves than we were before. Burning down the commons to train and serve a mythomaniac chatbot is not that. The consumer markets that still worked will shrink, and some will die.
I was looking into self-hosting DeepSeek v4 pro, since frankly cache reads are an absolute scam and they're 90% of the cost. But then I looked at the ROI, and it will never pay off fast enough, because the hardware will become obsolete first, even if you were running 10 token generation streams 24/7.
The napkin math worked out to renting being around 27 times cheaper than owning (not including power). I think we're really screwed when it comes to owned access to AI unless Intel comes out swinging with a C-series card that has 128GB of VRAM so we can run them in a 4x128GB configuration, but that seems unlikely since Nvidia has a large stake in them.
This was calculated expecting around 30 tok/s. Of course you can get 2-5 tok/s much, much cheaper, but that's unusable for my workflow.
Ironically, the few people not scamming you for cache reads are DeepSeek.
Everyone else charges a ridiculous amount, but DeepSeek's API is $0.003625 / M tok.
I'm surprised no one talks about this, given how significant it is. GPT 5.5, for example, costs a ridiculous $0.50 / M tok cached. DeepSeek is literally almost 140 times cheaper, which matters a lot for tool calls.
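The arithmetic in that comparison checks out; a one-liner to verify it, using the prices as quoted in this thread (which may be out of date):

```python
# Cached-input pricing comparison, per million tokens, figures as quoted above.
deepseek_cached = 0.003625   # $/M tok (DeepSeek, as quoted)
other_cached = 0.50          # $/M tok (the quoted "GPT 5.5" figure)

ratio = other_cached / deepseek_cached
print(f"{ratio:.0f}x")       # ≈ 138x, i.e. "almost 140 times"
```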
Not OP, but basically take GiB/s and divide by 30.
You need at least 128GiB to hold the model, too.
It's expensive to get 200 GiB/s, very expensive to get 400 GiB/s and above that you are looking at DC-grade GPUs. Multiple, in fact.
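A quick sketch of that rule of thumb. Decode speed is roughly memory-bandwidth-bound: each generated token has to stream the model's active weights from memory once, so "divide by 30" implies an assumption of roughly 30 GiB read per token for this model at whatever quantization is used:

```python
# Back-of-envelope decode speed: tok/s ≈ memory bandwidth / bytes read per token.
# The ~30 GiB-per-token figure is an assumption implied by the rule of thumb
# above, not a measured number.
bytes_per_token_gib = 30.0

for bandwidth_gibps in (200, 400, 900):
    tok_s = bandwidth_gibps / bytes_per_token_gib
    print(f"{bandwidth_gibps} GiB/s -> ~{tok_s:.1f} tok/s")
```

Under this assumption, hitting the ~30 tok/s target mentioned upthread takes on the order of 900 GiB/s of memory bandwidth, which is why it lands in DC-grade GPU territory.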
I know it's going to be extremely painful, but the sooner this ridiculous unsustainable AI bubble pops the better off we'll be. The more it inflates the more collateral damage it will cause, and we're probably already looking at 2008 levels of financial chaos.
I think it comes down to scale (there's like 2 trillion invested so far by very large institutions) and also AI hollowing out foolish companies that decided to go "AI native" and downsize and lose institutional knowledge. When the rug pull inevitably comes and the AI subsidies are gone, the entire idea of "efficiency gains" in a lot of places is going to look pretty bad as soon as they look at their bill.
There are probably multiple goals of AI investment. It's entirely possible that they are deliberately killing the affordability of how personal electronics like home computers are made and will instead replace them with terminals that stream everything to the cloud. You can make a lot more money off consumers if you can turn their entire computing experience into a utility.
> replace them with terminals that stream everything to the cloud
they've been trying for a looong time on that one. i still remember those junky "net appliance"s from the early 2000s [0] and oracle and sun making big statements about them...
They should try releasing one with a futuristic space-age design and flashing rainbow lights. Maybe give it a name like HARDTEK and put some random techy shapes all over it.
The brief window between the covid gaming bubble pop/PoS ETH switch and the AI hardware blackhole will be fondly remembered as the last golden age of consumer PC hardware accessibility.
If China keeps releasing decent copies of SOTA models that only take 20% of the resources, then we may get some relief when those models become "good-enough"
I've been using deepseek and it's good enough for my personal use. It takes way more time/tokens/course-correcting to get things done, but I spend in a month what I spend in a day with opus 4.6
>copies of SOTA models that only take 20% of the resources
They might be 20% of the price (because they don't have to invest that much in training), but are probably not 20% of the resources (ie. inference), considering they take more tokens to do the same task, and have slower inference speeds.
This makes me sad. My son was just about to be old enough to build his first PC, and was showing interest. I guess I'm going to have to match his savings 1:1 to make it possible now.
ASRock has created a "HUDIMM", which is basically a half-bandwidth DDR5 DIMM: half the number of chips per DIMM. So it's kind of a modern-day 386SX with its 16-bit bus. Presumably they're hoping you'll be able to get fewer, higher-capacity DRAM dice for a competitive price versus a normal DIMM.
On modern systems (all 64 bit AMD, and Intel Core "i" onwards, so quite old now) the memory controller is integrated into the CPU, so what the CPU supports is what you get, and the latest CPUs are DDR5 only. Intel did have a transitional phase of CPUs that can do both DDR4 or 5 depending on motherboard, but AMD it's AM4 = DDR4, AM5 = DDR5.
Even if volume and hype decrease among the general population, there doesn't seem to be much of a cap on model requirements, so at least one sector will be pushed into purchases one way or another.
Smaller manufacturers will fold, and larger ones will leave the consumer market (like Micron/Crucial did), before the market has a chance to bounce back. If and when it does recover, it will be a market of much fewer choices.
A somewhat comparable historical example is the destruction of the Swiss watch industry in the 70s with the advent of quartz and digital watches.
A Rolex Daytona today is known as a very fancy and even hard-to-get watch. In the 70s they were practically giving them away with other watch purchases because electronic watches were taking their lunch.
The bigger takeaway, I think, is that the destruction and folding eventually led to the Swatch Group. People forget Rolex, Omega, et al. were tool watches that were expensive but fairly attainable. Even into the 90s you could walk into a Rolex store and walk out with the watch you wanted. Nowadays you basically have to buy a watch to prove you're good enough to get the one you want.
I foresee a similar thing happening with computing hardware. There will be a small, high-end side industry for non-datacenter customers.
The digital watch user will be renting time for a thin client via a datacenter provider. The wealthy or high status user will be able to purchase the expensive boutique home computing hardware they want.
Besides watches becoming expensive trinkets, a Rolex Daytona in the 70s was basically the same watch as what you could get from other manufacturers with the same movement inside. Today you have to spend at least 30k to get something comparable to it which is part of the reason that it's in a permanent demand crunch.
Computers were like that twice already. That always ends.
The only reason you have those watch brands to mention is because they are non-functional status symbols. People that want a watch buy something else.
The same way, people that want a computer will buy from whoever is actually selling them. Manufacturers that want to sell only to datacenters won't last for long.
I think the shocking part is that it's only projected to go down by 25%. That's quite mild given the increase in memory, storage and GPU prices, in my view.
Shortage of RAM and SSDs, and soon, CPUs. Motherboards aren't selling because there's no point buying a motherboard if you can't buy the RAM or SSD it needs.
It’s brutal. I’ve just built a workstation with DDR4 and a two-generation-old CPU. I paid more for the DDR4 than it originally cost four years ago. The same amount of RAM for the latest motherboard would have been 10x ($10,000). So used DDR4 has gone through the roof, which impacts hobbyists who used to rely on “hand-me-downs”.
15 months ago I saw writing on the wall on several fronts. I suggested my community commit to their buys/builds ASAP and be forward-looking, before things changed.
My high-end HEDT would now be +$2300 to build mostly due to memory and SSD pricing. 96GB of memory going from $430 -> $1800 is wild. One community member literally wouldn’t be able to buy their Mac Mini configuration anymore, plus the self-upgrade SSD would be price hiked.
Where I blanch most is my storage server running TrueNAS. Built it 3.5 years ago with future-proofing in mind: a strong SSD cache layer, plus two HDDs as spares. It wasn’t cheap then, but I think between disks, SSDs, ECC memory, etc. it’s +$7000 to rebuild now, +$9000-$10000 on last-generation hardware.
I assume manufacturers were making enough motherboards in 2025 to fulfill demand, so what happens when the demand is the same but the production is 25% less? Crazy.
Maybe with AI we can finally kill user-owned computing, and make almost everyone renters.
It's really wrong that the common people have access to things like PCs. It leaves a lot of money on the table the corporations can extract, and makes control much harder. PCs should cost at least as much as a car, so only the right people can afford them.
AI bros and crypto bros: one and the same. Same optimism. Same arguments. Same blind faith. Same zero knowledge of how the economy, society, or even the technology they are evangelizing works under the hood, and of the shortcomings that are impossible to overcome because physics won't allow it.
> what are its shortcomings that are impossible to overcome because physics won't allow.
Did you mean to say "because reasons"? There should be a long chain of reasoning connecting "LLMs will never be able to strictly follow instructions written in natural language (as agreed by a 90% consensus of experts or some such, because you can't formally verify adherence to informal natural-language instructions)" and "physics doesn't allow that." And I can't find it anywhere: neither in your comment history, nor in the literature.
Because it is next-token prediction, and there is no logical component to it. It is like saying you cannot find any research paper on the fact that a car collision might cause a fire because gasoline is in the tank.
But the fact is that there's plenty of literature out there on hallucination and unreliability of LLMs already. If you know otherwise, let us inform Dario before next funding round.
Next-token prediction is a pretraining objective; it doesn't tell you anything about the behavior and activation structure of the resultant network. The literature that explores the hallucination problem has little to do with your claim about physical impossibility.
You sound like a useless-eater manager. Just the kind of roles we'll be happy to have in our future Utopia. The people will be happy to be led by such visionaries such as yourself.
We were both being satirical. It seems people couldn't extend me the same generosity and benefit of the doubt as the OP. Funny, and yet the starkness of the hostility toward my post, and not his, is real.
Those who earn their living from their labor, and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
Also the root of most of society in the first place. We would probably not be able to sustain our current standard of living without this horrible system.
If any of these people don't work, or don't work enough, they are undeserving, immoral moochers and should be miserable and in pain.
> and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
It's totally fine if these people never lift a finger in their lives. In fact, they deserve it. NEVER question that. N-E-V-E-R! It's great! Capitalism is great! Capitalism is fair!
if you are on land, you are (or someone is) still paying rent to the government. rent can be raised and you will be evicted if you don't pay.
if you are living mobile, you probably need gas or batteries for warmth or cooling. if your climate is currently comfortable, temperatures can be raised.
or maybe you are a nomad hunting and gathering your own food? the wilderness can be pillaged and sold and "secured" until there's nothing left to eat.
99% of people who follow that are still completely dependent on the world order and will be just as screwed as everybody else if everything goes tits-up.
There's a lot of bullshit that can happen between the status quo and everything going tits-up. Having FU money means you can say, all things being equal, "Fuck You" when it's appropriate, instead of worrying about becoming homeless or not having enough to eat.
It will be the same two classes there are now and always have been. Those who need to sell their labor and those they sell it to. Class struggle is the only way out. Find some solidarity, you aren't exempt.
PCs are also made by corporations, together with PC parts. The reason computing became so cheap over the last 50 years was competition between said corporations. Competition that is also pushing the AI token price down, and also encouraging corporations to come up with models that can run on user hardware.
So what are you ranting against?!
> Own nothing and be happy.
Ah, here it is. Only governments can confiscate our property and force us into that. Governments and politicians that keep telling us how evil corporations are…
The “own nothing and be happy” quote is from a blog post made by the World Economic Forum. I find meta-governmental organisations even more troublesome, and you can’t vote them out.
It isn’t only conspiracy theorists who should be disturbed by whatever politico-corporate freemasonry that goes on in Davos.
In the digital age corporations can and do confiscate things we thought we owned. Amazon removing paid-for kindle books, devices getting bricked, paying a recurring fee to use some features in your car.
Every year, the government comes around, reassesses my house's value (always up, never down), and asks me to pay a percentage (always increasing), or they will take away the house which shelters my kids and family.
So, no, I am not too worried about Amazon removing my $9.99 book.
I kinda felt like this was coming, so mid last year I built a local rig with the best parts I could afford at the time, lol (RTX 5090 / Ryzen 9, etc.). Now I just need to build out my inference setup (sadly, M3 Ultras are insanely expensive now). I have a feeling they will try to lock down usage of open-source LLMs too. I don’t get how a token moat can exist if local inference rigs can be built out to serve open-source models for nothing (besides power cost).
I'm sure this makes the billionaire class happy, but there are some legit economics involved. We all want frontier-class models running on our home PCs, but that takes 100x or 1000x the compute we've been running. The market isn't going to instantly adapt and make that kind of machine for the same price we've been paying for 1% of it, so that we can leave it sitting idle 99% of the day.
I can't wait for this to happen. I live far enough from the US to not be super affected by this crash and hopefully I will be able to stock up on disks and parts once the prices drop.
> required it isn't really a consumer choice, is it?
No one really resists or pushes back. When I resist I hear "that's what consumers want", "it's for security", or that I'm the problem. There is no one to complain to even, except to low paid kiddos in customer service.
It's not just new hardware; even used hard drives manufactured a decade ago have at least doubled in price. Scam Altman has effectively killed personal computing for all but the most affluent.
PCs and higher-end IoT devices (like RPi) are becoming less and less affordable. The web is saturated with slop and bot traffic. I wonder how many people think about what this means. We are rapidly losing several of our major technological ecosystems that have driven economic growth in the past few decades. It's not at all clear what economic benefits we will be getting in return.
Will demand for computing ever go down from where it is now? Even if the AI bubble temporarily pops, in the long run I think the demand for computers will be practically infinite.
Market forces will probably bring the price of hardware down in the next decade. Whether it is in a form that is useful for regular people/hobbyists is another question. If not, then hopefully the "cloud" starts to look a lot different.
I think it's possible (10-15%) that the AI bubble pops and we all live without 50M token/day OpenClaw installs and running Opus to do things that should have been done by a shell script to the point that it causes a dip in total compute demand. I think it's likely (75% likely if the AI bubble pop causes a dip in compute demand) that this dip extends longer than the median lifespan of the hardware currently being installed in datacenter.
Of course in 20 years we'll be using more compute than today (99% likely).
EDIT: Of course, cryptocurrencies provide a floor on compute pricing.
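Chained together, those estimates sketch out roughly like this (taking 12.5% as the midpoint of the parent's 10-15% range is my assumption):

```python
# Joint probability that the AI bubble pops AND causes a compute-demand dip,
# AND that the dip outlasts today's datacenter hardware.
p_pop_causes_dip = 0.125   # midpoint of the "10-15%" estimate above (assumption)
p_dip_outlasts_hw = 0.75   # the "75% likely" figure, conditional on the dip
p_joint = p_pop_causes_dip * p_dip_outlasts_hw
print(round(p_joint, 3))   # 0.094, i.e. under a one-in-ten overall chance
```

So even on these numbers, the stranded-hardware scenario is a tail risk, not the base case.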
It looks like we are going to see consumer components stop being made in the first place, as demand dries up to the point where it's no longer worth serving the consumer market at all. Almost purely because supply is already so constrained that prices have risen beyond what almost anyone is willing to pay.
I would say that since more and more of the world wants/needs sovereignty in this space, more and more options will come up in the next 5 years. They will probably not be the cutting edge we have now, but personal computing will not die. It will just get a (healthy) reset.
"Fueled by greed". It would be trivial to say no to AI companies, because dollars are dollars; it doesn't matter who pays them. Prioritizing literally all of humanity instead of "five companies" is a choice that every single supplier could make but decided not to. This problem was 100% manufactured by suppliers.
Reminds me of how ever since egg prices went to the moon we've all had to give up dessert and subsist on thin gruel for breakfast.
What's that? Egg prices are back down after suppliers cranked up their output? Surely nothing like that is possible with hardware... Personal computing is dead forever...
This will happen eventually but there is a much longer lag for hardware supply than for egg supply so I wouldn’t expect a ton of improvement until late 2027 or even 2028.
That’s the issue: RAM suppliers have not started building new fabs, which means they expect this demand to be temporary, and they’re just going to make a killing on it while they can. It takes years to get a fab up, and they think demand will be gone by then. So that means ridiculous prices now, and if demand doesn’t drop, ridiculous prices until someone believes the demand will continue for four more years after that moment. Whenever that moment comes, building for four years out is a risk. So this could last forever.
> SK Hynix has reportedly broken ground on a new advanced memory packaging facility in West Lafayette, Indiana, that should boost the supply of US-made high-bandwidth memory (HBM)
The PC is the last major open platform. While other platforms like Android are becoming less open, the PC in general is becoming more open than it's been in a long time: heavy macOS/Android/iOS competition is creating a focus on open standards, and all-time-high Linux support gives people a place to land and tinker/hack to their heart's content.
I think we will see an abandonment of consumer-grade PC components, and individuals will either be pushed towards closed hardware like the PlayStation, MacBooks, and Android devices, or towards server-grade components. I already have a home server rack, and would recommend it to other people.
> While other platforms like Android and becoming less open
ok....
> PC in general is becoming more open than it's been in a long time as heavy MacOS/Android/iOS competition is creating a focus on open standards ...
I'm so confused by what you're trying to say here.
This will surely bring new energy into opening these platforms, as it did in days before
why?
I'm interested to know, WHY is PC so open? what led to that?
Many vendors. Because that means you need specs, and that in turn allows for interoperability.
Because Microsoft commoditized their complement in the 1980s to break the back of IBM.
An agreement IBM had to make with the DoJ etc. in the '80s to open the PC platform to avoid antitrust prosecution. That was the key event.
I would argue that the key event was Columbia Data Products’s clean room implementation of the BIOS.
https://en.wikipedia.org/wiki/Columbia_Data_Products
That, and I’m pretty sure the DOJ had ended the antitrust suit (which was about bundling) by the time the PC was released.
You might be interested in the IBM PC compatible and Wintel Wikipedia pages. That's a super-high-level timeline, but it's even more interesting to get into the details.
At a high level, the IBM PC platform was very well documented and sold well, to the effect of producing tons of software and peripheral add-ons ("PC compatibles"). This led some other computer companies to reverse-engineer the proprietary IBM BIOS, allowing them to run the same software and use the same peripherals. Because these were clean-room reimplementations, IBM didn't have a legal case to prevent their sale.
Fast forward a bit: IBM's attempt at a new, closed platform, the PS/2, flopped. People wanted the more open hardware. Windows became dominant enough that all the demand was for x86-based hardware that could run Windows, and Microsoft was happy to work with many vendors.
The PC is very open today, but Apple survived. Atari ST and Amiga probably survived longer than you think as well.
> PC is the last major open platform.
In the whole history of computing, the PC is the only platform where buying a computer means a crazy number of options and configuration mixes to choose from, with the expectation that it will all work! And the warranty will support it too! You can run any OS of your choice on it, and that's also a reasonable expectation.
On any other platform (Sun, Be, Amiga, NeXT, Apple), it was always buying from one company, only from its list of products. And even running a different version of the OS means the warranty doesn't cover it.
I came back to this comment 12+ hrs later hoping to find someone make a great argument for some platform in the 70s that I didn't know enough about, or maybe a modern open hardware movement that is building niche support.
I guess it really is just the PC.
> I already have home sever rack, and would recommend it for other people.
I just want to warn people who haven't heard server-grade hardware in-person before: this is only for people who can put a server rack somewhere unpopulated like a garage or basement. Servers will make you think "wow, leafblowers sure are quiet". They are not suitable for apartment dwellers such as myself. When I was setting up my 1U before shipping it off to a colo, I wrote scripts and had detailed plans of the things I needed to run so I could minimize the time it was making my ears bleed.
This. At one company we ran out of space in the server room, so the excess machine temporarily landed next to my desk. Dear god. Noise cancelling headphones couldn't cope with the noise.
> my 1U
1Us have the most compromised ventilation and compensate with loud fans running at high speeds.
Sure. But are there actual limits on how much noise they're allowed to make?
This is all built to be put in a place where noise is not an issue
Yeah, it certainly wasn't the quietest choice of form factor, but the fact remains: server-grade hardware is not optimized for noise. It is meant to be running in datacenters, not living rooms, so noise was never a concern. A nice thing about consumer-grade hardware is that it's optimized for both sound and power consumption, because those devices are designed to be around humans. So I certainly hope consumer-grade hardware survives.
In my first job we worked in a room full of 4Us and it was always refreshing when we powered them all down for the weekend. So quiet. It’s almost like there was a reason why consumer-grade hardware existed.
I have a 4U with noctua fans and the loudest part of my rack is the harddisks
Not only the loudness, the small fans have subjectively more annoying sound even if they were the same volume. Much more shrill than a large fan.
You're right, I may have significantly over-estimated the percentage of people on hn that have dealt with server hardware. It's expensive, big, loud, power hungry and temperature sensitive.
I had to provision a 1U server in grad school. Turning that thing on in the office was a joke. It was completely impossible to work if you were anywhere in that part of the office while it was on.
You can buy server boards that don’t require loud fans. If you’re buying used server gear from a datacenter then it will be like what you said.
I have a 4U NAS with a supermicro board and an i3 chip with 6 WD Red NAS drives and it’s very quiet. The chassis came without fans so I installed the brand I like.
no, you definitely cannot. you probably have a consumer board in a 4u case
Tell me you've never owned a Supermicro board without telling me. They support regular 80mm/90mm Noctuas and function just fine. There are specific Supermicro mounts for them.
Completely wrong.
Reminds me of when, as a kid, I got one of those Delta 7000 RPM CPU coolers; my mom promptly asked what it would cost to make that noise (which was heard in the entire apartment) go away. Got a Zalman (back when they were great) and everything was good.
It was a learning experience, and I think everyone should experience that kind of industrial noise at least once to appreciate how quiet consumer hardware is.
I remember a review from back then which contained the phrase "strap on the 7000rpm from hell". Those Deltas definitely weren't quiet.
Kind of a random aside, but I never realized how obnoxious LEDs were until I got a studio apartment and started sleeping in the same room as my homelab / workstation / networking hardware. Electrical tape saved me, but wow. You sure can produce a lot of light with a milliwatt of electricity :)
(And yes, my workstation has a clear case and LED RAM. Yes, I'm an idiot. Whenever Windows applies an update late at night, I wake up if it turns back on. I don't know what I was thinking when I built that thing, but never again.)
Is it even possible to buy computers these days that don't look like they're intended to be the lighting system at a rave?
Yes it is.
I like to put a little red wax over LEDs (at least, ones that I don’t touch). That way you can still see them, but they are dimmer, and the red tint makes the light less annoying at night.
Even worse are phone chargers, intended to be used next to your bed, that light up like a Christmas tree when running. Black electrical tape is great for the worst of it, but you still need a few things available to tell you the operational status, if only they'd dim them a bit.
I always thought it would be low-grade hilarious to record a fairly long video of the unboxing and assembly of a ridiculously elaborate in-case LED setup, only to reveal with a straight face and at the absolute last minute that the case in question is entirely opaque.
The noise problem is pretty easy to mitigate by choosing 2U servers instead of 1U. The latter are forced by the form factor to use smaller, higher speed fans.
A bigger issue for enterprise hardware is that it's optimized for performance per watt under load, not idle power consumption. Running a mostly-idle rack server 24/7 can result in a pretty sizable electric bill. This also depends heavily on the model. Some will idle at ~50 watts, others at ~300, but both of these are significantly higher than a Raspberry Pi or an old laptop which for personal use will generally do the job.
Business class desktops are also a good alternative here. Many models have pretty reasonable idle power consumption (check this for yourself, I've seen 6W but also 60W) and then you get a couple of drive bays and PCIe slots and expandable RAM which you don't get from a Raspberry Pi.
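The idle-draw numbers above translate directly into an electric bill; a minimal sketch, assuming a $0.15/kWh rate (substitute your local rate):

```python
# Rough annual electricity cost of an always-on machine at a given idle draw.
# Idle wattages are the figures cited above; the rate is an assumption.
def annual_idle_cost(idle_watts, usd_per_kwh=0.15):
    kwh_per_year = idle_watts * 24 * 365 / 1000  # watt-hours -> kWh over a year
    return kwh_per_year * usd_per_kwh

print(round(annual_idle_cost(50)))   # 66  -- a frugal server or business desktop
print(round(annual_idle_cost(300)))  # 394 -- a power-hungry enterprise box
```

At those rates, the difference between a 50 W and a 300 W idler is over $300/year, which is why idle consumption matters more than load efficiency for a mostly-idle home server.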
These days, pretty much the only thing that makes sense is a mini PC. AMD laptop chips generally trade blows with Apple stuff on power efficiency when you thrash them, and you get a surprisingly capable machine for not very much money.
It's really not worth it to run old hardware 24/7 unless it's making money. Buying a new machine of equivalent capability is (normally) pretty cheap, and it doesn't take very long for the power savings to pay for themselves.
They can be had with fairly respectable specs too. Certainly enough to play around with small local models.
"When you thrash them" is kind of the issue. There are ten year old business desktops with a <10W idle power consumption. If your use for it is to have something to rsync files to and host your personal website and the like, even old hardware is going to average 99% idle. There is no meaningful power savings from newer hardware unless you're consistently putting it under significant load.
Some of the newer hardware is actually worse because the idle power consumption of PCs since around 2010 is determined in significant part by the low-load efficiency of the power supply. Brand new machines with the wrong power supply can use several times as much power at idle as ten year old machines with the right power supply. Annoyingly, power supply efficiency at idle is rarely documented so the only thing to do is measure it.
I built PCs for a number of years and then shifted to some combination of RPis, MacBooks, and (maybe) Mac Minis. It was a (long) phase that involved quite a bit of money, and often frustration, but I'm almost certainly not going to do it again.
You can make those rackmounted servers as loud or as quiet as you like. For home, optimize quiet (and low power consumption).
Even though my server rack is in the garage I try to keep it quiet. A couple of them are fanless Atom-based and others have fans but they are built to be quiet. If you need hardware that generates a lot of heat, go with 4U for large fans that spin slow, thus low noise.
The "wow, leafblowers sure are quiet" happens when you stuff a lot of heat generation into a 1U chassis that then requires lots of tiny fans running at full speed. Those you don't want at home! But it is easy to avoid. Data centers do this to maximize density, but that's unlikely to matter at home.
>Atom-based
Not exactly enterprise-grade servers then?
Supermicro sells Atom-based SKUs with enterprise features like a BMC+IPMI, 10Gb SFP+ ports, ECC memory, SFF-8087 ports, chassis intrusion detection, etc.
And do you need a full-on enterprise-grade server? Given the choice between a 1U server whose fans even at minimal utilisation can still be heard three doors away and something with a low-power/laptop-grade CPU that does the same job silently and with little power use, I'll take the latter.
I sit next to my 4U server with all enterprise components apart from fans - these are consumer grade.
I had to mod the chassis slightly (with just pliers, tape and random inserts) to fit these fans in there, and add fans in front to push the air in. The PSU that came with it was obnoxiously loud, but thankfully, Supermicro has a quiet version that I can't even hear. Even if SM didn't have this PSU, I could have easily modified the PSU and fit some noctuas in there without any issue or safety concerns - like I did with my enterprise grade Mikrotik switch that also had obnoxious fans by default.
I even have an enterprise grade UPS that is dead silent when it's not running on battery power (I swapped the fans there too).
I essentially try to buy enterprise gear whenever possible. Not only is it usually much better than the consumer alternative, it is also frequently much cheaper because of the second-hand market. Before AI sucked the soul out of the hardware market in general, you could have bought enterprise SSDs with a life expectancy (TBW) measured in petabytes and an MTBF of practically never, for half the price of a top consumer SSD with a TBW measured in tens of TB and an MTBF of yesterday.
And the entire rack is just slightly louder than the PC I was using.
The only consumer grade computer at my home is my MacBook and my phone.
Enterprise SSDs are all that. Just make sure you power them up regularly: for data retention without power, the requirement is 3 months for enterprise vs 1 year for consumer grade.
If you’re living in an apartment, I could definitely see it being non-viable, but if you’re in a house I don’t think it’s a big deal.
Every house I’ve lived in has had machinery for water pumping and heating and we just put our server along with them.
A 4U case is basically just a midsized tower with rack ears (or rails).
A 1U case runs the gamut in noise from vacuum to jet-engine.
And I don't know what the reliability gain is from consumer to "server" hardware. One extra 9? Hot-plug power supplies? Definitely more RAM slots, I guess...
Meanwhile... I see Axiomtek industrial computers that don't even have a power button sold with 7-year warranties...
I had exactly this problem, 1U server that sounded like a 747 taking off downwind. I solved it by getting a mini-PC that had more processing power than the eBayed 1U server (I just looked up what was available in terms of CPUs and got the best bang per buck, an 8C16T AMD CPU) and that runs essentially silent except when it's under load - they're designed for low-power/silent operation. If you're running your server at 100% load 24/7 then this isn't for you, but for home "server" use it was ideal.
If you build your own servers you can make them silent.
I had a 2U Xeon beast I kept water-cooled. Before I installed the water cooling, a bit noisy and 60C. Afterwards, total silence and 30C.
Assuming this trend continues, I think people are going to start re-using older hardware rather than turning to server-grade hardware (which is often not convenient for the average residential situation).
At least, that's what I hope happens. What will probably happen is people will continue to migrate away from the PC platform and towards closed platforms for the convenience, if history is any indication.
That's what I've been doing for years. I buy (or get for free) enterprise PCs coming off lifecycle at surplus sales. Nothing I do at home needs a cutting-edge CPU. Unless you're a hard-core gamer or a serious hobbyist/tinkerer, a 5-year-old or even older PC running Linux is very adequate.
I think this is already happening, sort of. At least, people are hanging onto their older-but-not-yet-old components for much longer than they used to. I recently tried to build a NAS from eBay parts, and I was surprised to find that the newest stuff affordably available was 6th/7th generation Intel Core parts (retailed 2016/2017). I think people are trying to offload these CPUs in particular because they can't run an unmodified Windows 11 installation (no firmware TPM 2.0 implementation, and the corresponding consumer motherboards typically didn't have a discrete TPM module, either, if they had an LPC bus connector at all). Very little (reasonably-priced) availability of similar-aged Ryzen CPUs (which have firmware TPM support) or newer Intel CPUs.
Why would most people need a home server rack? That's a lot of noise, space, and electrical usage. For what most people would need a home "server" for a NUC PC or Mac Mini would do the job.
Ziply Fiber is offering 50 Gbps home internet connections in some US locations. You cannot utilize that type of speed with a Mac Mini. Even the modest 8-10 Gbps connections offered by T-Mobile and Google probably require more.
Doesn't really answer the question though, why would someone be trying to utilize that much bandwidth out of their house?
Is this for people trying to start the next netflix out of their garage before they have any money to put the servers in a colo?
VPNs. If you have a NAS and require high-speed access to/from your home files (dumping your Apple ProRes RAW rushes off your external SSD so you can keep shooting your video, for instance), that kind of bandwidth earns its keep.
You and 49 other people all simultaneously working from locations with gigabit uplinks
> You cannot utilize that type of speed with a Mac Mini.
Mostly because the base Mini has Thunderbolt 4 which maxes out at 40Gbps. Anything with a PCIe 4.0 x16 slot will take a 100Gbps NIC. 100Gbps is around 10GBps (8 bits per byte plus encapsulation overhead). Desktop CPUs can do AES-GCM at 2.5GBps+ per core and have up to 16 cores and around 50GBps of memory bandwidth (dual channel DDR4-3200), so the NIC still seems like the bottleneck.
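A quick sanity check of that arithmetic, using only the figures cited in the comment above:

```python
# Back-of-envelope check: can a desktop CPU keep up with a 100 Gbps NIC
# doing AES-GCM? All inputs are the estimates quoted above.
line_rate_gbps = 100
payload_gBps = line_rate_gbps / 10       # ~10 GB/s after 8 bits/byte + encapsulation overhead
aes_gcm_gBps_per_core = 2.5              # cited per-core AES-GCM throughput
cores_to_keep_up = payload_gBps / aes_gcm_gBps_per_core
print(cores_to_keep_up)                  # 4.0 -- well within a 16-core part
```

Four cores of crypto against 16 available, and ~10 GB/s of traffic against ~50 GB/s of memory bandwidth, which is why the NIC, not the CPU, ends up as the bottleneck.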
Why would most people need a NUC PC or Mac Mini when a pencil would do the job?
Degrowthing is dumb, people will find use if they have more.
Wouldn’t recommend a home server rack in an apartment. For high wife approval factor, you can put Epyc hardware with Noctuas in a bigger case. I’ve got one at home. Runs my blog and a bunch of other things. Home is at 32 dB right now.
Realistically a Mac Mini will probably blow a lot of things out of the water on price / performance. Even an older one.
Buy steam deck, and steam box.
> ... or they are pushed towards server grade components. I already have home sever rack, and would recommend it for other people.
An actual rack with noisy 1U or 2U servers may be a bit overkill but on the plus side there's a guaranteed endless supply of such used servers.
Now there's a happy middle ground: used workstations with ECC memory, that you then use as servers.
People would be really wise not to underestimate what a 12-year-old dual-Xeon, 14 cores each, 56 threads in total, can do, for example. And such a complete workstation can basically be found for less than what it takes to fill my car's gas tank (granted, it's got a big tank and it's a fancy car whose manufacturer recommends only using 98+ octane).
A single-Xeon workstation with a shitload of memory in a tower form factor is basically silent. Mine is. Dead quiet, next to the vacuum cleaner and the cat's foot in a tiny room. I use it as a headless server.
And that's with the default PSU and fans. There are, of course, people modding these with adapters for regular consumer PSUs and then putting ultra-quiet PSUs in those. Same with Noctua fans etc.
And as for the usual complaint, "but a server that is on 24/7 consumes too much electricity"... I only turn on my servers at home when I begin to work: I don't need them to be on 24/7.
So yeah: "server CPU + ECC" doesn't imply noise. And "server CPU + ECC" doesn't imply it has to be on 24/7 either.
I recommend this too!
I like my Dell Precision T7910 (dual-socket Xeon FTW) a lot.
What are you using?
For most people, I’d recommend a NUC or a NAS with an unlocked bootloader (so you can put Linux on it) for a home server.
Most home users need a small amount of compute, and are sensitive to noise and power use.
Contrary take: I believe we will see an expanded market for capable PCs that can be sanely put in a living space. By extension of the gaming PC niche to local AI. Both NVidia and AMD are developing product lines in that direction (DGX Spark, Ryzen AI Max). And Linux will be more prominent than ever, due to several independent reasons: MS dropping the ball hard on Windows, SteamOS making Linux attractive for gamers, 'digital sovereignty' as a trend, and Linux being the de facto standard for hosting AI (or anything really).
Great take, but if the market is expanding for capable PCs why are motherboard sales decreasing?
Well, the two chips I mentioned (DGX Spark uses the GB-10) are both a SoC, so no motherboard needed there. I don't know if that's the full explanation, but it could be a factor.
The SoC design with unified memory is generally well suited for residential use because it's quite energy-efficient, quiet and small (compared to traditional GPU-powered gaming rigs). Great performance-per-annoyance, so to say.
Mini PCs (NUC-ish form factor) are selling a lot now too, small, quiet, most people don't need expansion over what you can get from eg USB4.
According to the article, because components are really expensive right now, particularly RAM and storage.
And flying pigs gonna fall from the sky too.
The problem with all those devices you listed is that they have lost the "general purpose" ability. I guess you could define "general" to mean "carefully curated"...
"Despite this drop in sales, these companies aren’t exactly struggling. Asus, Gigabyte, and ASRock have pivoted some of their production towards AI servers, allowing them to capture some of the investments that hyperscalers are generously pouring into their data centers. But if you’re planning to build a completely new PC from scratch, you might be able to find good deals on motherboard combos, especially as retailers are keen on getting their inventories moving. "
----------------
1. Within a few months, these manufacturers will likely raise desktop mobo and CPU prices with the justification that "volumes are too low".
2. If you're upgrading from an older machine, it likely has a format of RAM that's not compatible with newer boards. Upgrading the cheap parts now and waiting for the expensive bits to come down is simply not an option. It's all or nothing.
Game and application developers should be paying close attention to this. You're used to the average user's system spec going up every year. That's stopped for now. The average memory in new systems may actually retreat!
Anecdata to add to the pile... I pulled three 1U Epyc Gen2 servers from my production rack 1.5 years ago and replaced them with lower-power alternatives for a production storage cluster. I didn't need the extra CPUs for app-server stuff, so they sat in my house for a while. Fast forward 1.5 years, and it was making sense to upgrade some app servers to new-gen stuff and get a bump in frequency and core count. When I went to spec some new servers, my normal $15k-$20k build was $55k.
Instead, I hit up eBay, got six used Gen3 processors, found a "good deal" on a couple of TB of new RAM (still insanely expensive), and came out with the same overall horsepower for a total of $20k instead of $110k.
I know this is about consumer desktop, but seeing the comments about upgrading old hardware caused me to chime in. This is happening in the production/enterprise level in some segments.
Motherboards used to be $100, $200 for the high end. Now they want $300+; RAM is crazy; storage, video cards, etc. I'm not surprised sales for these components are hitting a wall.
I don't really agree with this. Motherboard prices haven't been moved at all by AI.
I would also say that most consumers, who are almost exclusively buying gaming-oriented boards, do not need anything high end. They can pretty much buy the cheapest board available.
I am shopping around for a mini ITX board and the difference between something at $180 and something at $400 is basically one to two faster USB ports, which are pretty much irrelevant on desktop computers, and a few minor conveniences that I imagine most people can do without.
The higher-end chipsets add no discernible advantage and there are no CPUs that are unsupported by the lower end chipsets (on the AMD side, at least).
The high end stuff is just available for people with a lot of money.
I am massively sick of gaming focused boards. I don’t want my board to be “tough” or “mil-spec” or be extra shiny or have fancy-proprietary-auto-overclock. I want a reliable board that complies with all the specs it claims to support. Low idle power consumption would be nice, too.
This is obnoxiously difficult to shop for in the desktop/workstation space.
The PCIe lanes are the worst. You have x16 slots that run at x1, and you need to check slots against the M.2 layout to make sure an x8 doesn't become an x4 if you insert storage. Wait, if I plug something into the Thunderbolt port my 10G network card runs at half speed? Obviously these are actual physical limitations from PCIe lane counts, but it makes it impossible to search. Just painful.
My advice to anyone motherboard shopping is to read the manual off the manufacturer's site before deciding. The PCIe lane tradeoffs tend to be in the block diagram next to the contents page.
This is exactly why my comment goes over the heads of people who cry "just get a basic board." No, this is why the basic boards for $100 don't cut it. You need to dive into the technical data and realize that the $100 board seems like a deal for a reason, and suddenly the $300+ category is your only option if you want a PC that doesn't run on fake specs.
They exist to partition capability so that enterprises can’t connect all of their peripherals and some ECC memory to get the same functionality for 1/10 the price. It’s not a physical limitation.
Obviously market tiering is part of it, and you can play tricks with north/south bridges and PCIe switches (which adds cost), but a Ryzen board that advertises a PCIe 5.0 x16 GPU slot and a 5.0 x4 M.2 slot only has 4 lanes left to work with from the CPU (i.e., the CPUs only have 24 usable lanes). While you can play with generations to get more lanes, it's effectively still 16 GB/s. That needs to cover networking, extra M.2 slots, USB ports, as well as the extra PCIe slots.
I don't mind working within those physical limits, but I do want to be able to search for boards that support N components, e.g., 1x 4.0 x8, 2x 3.0 x8, 4x 5.0 x4. But the best you can search for is the physical sizes of PCIe slots, and then you dive into a spec sheet for each one, only to find that the six x16 slots only have 1.0 x1 of bandwidth each.
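The lane-budget arithmetic above can be sketched in a few lines. This is a rough, illustrative calculator, not a real configurator: the per-lane throughput numbers are approximate post-encoding figures, and the 24-usable-lane budget matches the mainstream Ryzen example in the comment, so check your actual CPU's datasheet.

```python
# Approximate one-direction throughput per PCIe lane, in GB/s, by generation.
PER_LANE_GBPS = {3.0: 0.985, 4.0: 1.969, 5.0: 3.938}

def link_bandwidth(gen: float, lanes: int) -> float:
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

def fits_cpu_budget(links, cpu_lanes=24):
    """Check whether a list of (gen, lanes) links fits within the CPU's lane count."""
    used = sum(lanes for _, lanes in links)
    return used <= cpu_lanes, used

# The build from the comment: a 5.0 x16 GPU slot plus a 5.0 x4 M.2 slot.
ok, used = fits_cpu_budget([(5.0, 16), (5.0, 4)])
print(ok, used)  # True 20 -> only 4 CPU lanes left for everything else
print(round(link_bandwidth(5.0, 4), 1))  # 15.8 -> the "effectively 16 GB/s" above
```

Nothing here accounts for chipset-attached lanes or switches, which is exactly why reading the board's block diagram remains necessary.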
$180 for a basic motherboard is damn expensive though.
My parents bought a mid-tier PC for $3,000 (in 1995 dollars) and there was still a thriving PC industry at those prices! While things are getting more expensive now (and that sucks) we have had it really good for a long time.
How much money was left to spend on hobbies considering the cost of education and real estate at the time?
Similar boat here, and more than enough for hobbies. The difference between income and housing cost was significantly less.
you mean you don’t prioritise helping your landlord buy their newest mcmansion? i’m just happy to have a roof over my head and continue to pay ever increasing rent!
Buying whole 2020-era PCs here for around the $200 mark. As long as you don't need crazy CPU or GPU grunt, which is most people, they are almost indistinguishable from a new one.
Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP. Probably same or better on Linux.
Windows 10 LTSC isn't a realistic option for most people.
You can install Windows 11 on an i5-9400 though and it'll run fine if you clean it up a bit.
Why not?
There's even Win 11 LTSC now
Upgraded my CPU the other day, got a Ryzen 5 5600 for ~$100 new, can't complain. Still on my RTX 2060, can't complain either. As long as you don't fall for the 120 Hz and 4K memes you can easily get by with 2020 hardware indeed.
I plugged a 60hz monitor in at work and discovered I can't ever go back. I need 32" 4k 120hz at least now.
I am "easily getting by" with 2014 hardware.
> Windows 10 LTSC + Firefox + uBlock Origin on an i5-9400 feels faster than my M4 Pro MBP
I don't remember Win10 being particularly lean (although I'm sure 11 is worse). And the M4 is definitely a much more powerful CPU. Can you not run Firefox and uBO on that? Or have they really weighed things down that much with the OS somehow?
> Probably same or better on Linux.
Even with the Cinnamon desktop environment I can vouch it uses considerably less RAM for just the desktop (ordinary applications are probably about the same) and offers much faster filesystem access by default. I'm sure this is at least partly due to not being weighed down by built-in anti-malware (that would do basically nothing for people who are comfortable using Linux in the first place).
Yeah I can and I do run Firefox on the M4 Pro. It's almost indistinguishable on most tasks when there's an ad blocker running surprisingly.
LTSC is a whole different animal than any other windows. Super lean.
Let's just take a moment to appreciate how important uBlock is for performance. I pity the fools without.
> Motherboards used to be $100, $200 for the high end. Now they want $300+
Entry level motherboards are still $100.
$300+ is a very high end motherboard.
The existence of very high end products is confusing because it can give the impression that you have to buy a $300 motherboard because it exists. If you compare features side by side you're rarely missing anything important for the entry level motherboards.
Some people really want the best of the best and feel the need to buy motherboards with Thunderbolt 4 and other future-proofing measures just in case they might need them, but it's premium and luxury territory.
If you’re dropping that kind of cash, you definitely want to future proof it and not go with the budget lastgen motherboards.
Future proofing is an expensive way to pay for features you don't need and will probably never use.
It's smarter to buy a cheap motherboard that meets your needs now. If in the future you find the need for USB4 or some other feature, upgrade the motherboard.
More often than not, builders will try to future proof for eventualities that don't arrive before it's time to upgrade to the next CPU socket anyway. There are a lot of people with expensive, outdated "futureproofed" builds who would have been better off saving the money on the original purchase so they could upgrade sooner instead.
This. In 2017 I bought the cheapest AM4 motherboard with a USB-C port (a Gigabyte X370 Aorus Word Salad). I'm still using it because BIOS updates gave it Zen 3 support.
Wanna guess how many times I've used that USB-C port? Maybe once or twice in the 9 years I've owned it. Never needed it. I also couldn't tell you what X370 is getting me that B350 wouldn't have gotten me.
It's a gamble. I take the opposite mindset now; scarcity mindset.
"$1600 is too much for a video card" - me a few years ago on not buying an RTX4090 from nvidia's website.
"I only need 32 GB of RAM. If I want more later, I'll just upgrade" - Me a year ago.
Both mistakes, with hindsight. I will always future proof from here on out.
Counterpoint:
"$100 is a reasonable amount for a video card, I know this is on the budget side but at least I have a card this way" — me 12 years ago.
"I guess it's worth it to spring for 8GB of RAM..." — me 12 years ago.
Still using the same machine, with no regrets (just the occasional bit of envy).
Different people have different expectations and requirements.
When you try to future proof, you are basically hedging. It's a kind of insurance; sometimes it pays off, sometimes it does not. Having more disposable income now than I did 10 years ago, I tend to pay more attention to this sort of thing, but everyone can choose where they put the cursor. Someone who overestimated their RAM needs when buying a computer last year is probably pretty happy about it, but it could have swung the other way.
I future proofed by stepping back to high end components from last generation (except for GPU). My memory speed is slightly lower, but I have 32 cores and 128 GB ECC RAM on 4 channels. I doubt I will need to upgrade this thing any time soon for my typical use cases.
Note that this was before the RAM shortage, but I bet you could still do this now and save a little versus mid-tier current gen gear.
Buy a $300 motherboard now in case you need future features, or buy a $100 motherboard now that does everything you currently need and then buy a second or even third $100 motherboard if you ever actually need those future improvements.
Then you get a new board designed for the new features instead of something several years old and you come out $100 on top.
Futureproofing is nonsense. PCs just don't work that way, and haven't for decades.
> Buy a $300 motherboard now in case you need future features, or buy a $100 motherboard now that does everything you currently need and then buy a second or even third $100 motherboard if you ever actually need those future improvements.
Right, but the problem is that by now your $100 new motherboard requires a new CPU and new RAM. Which is very much not $100.
In the past we got away with PCI cards to add features without changing the motherboard, but we still ended up changing everything every 2 years anyway…
The only two parts that even make sense to "future proof" are the power supply and case.
I would only agree if you already plan on doing major hardware upgrades within like 3 years at the latest. Past that and you will inevitably be missing new features that will be shipping even on budget hardware and won't be saving on anything.
Entry level motherboards used to be just fine to use. The last time I was shopping, they all had a random deal breaker in terms of a missing feature. Maybe I’m just pickier now, but I doubt it.
> Entry level motherboards are still $100
Entry level motherboards were $50 (meaning $40 on sale).
I mean, if you think about everything the motherboard does, and how many PCB layers it takes to support all those features (for the vast majority of users, the only other things you need are a CPU, some RAM, storage in M.2 or SATA, and maybe a dGPU), it's wild that it's often the cheapest item in the PC.
Well, to be fair, processors, RAM, and modern storage are parts that require leading-edge lithography to make. Not sure how much lithography the chipset requires.
I just priced out a new motherboard for $130.
Not just motherboards. Cases, PC accessories (fans, etc), consumer SSDs, and more. Cases are especially hard hit, apparently, as they're already quite a low margin business.
Personally, I see little reason to upgrade from my AM4 platform. It's never been easier to hold on to aging hardware with the advent of DLSS stretching older cards further, diminishing returns on the newer gen GPUs, and the 'realism' of video games plateauing.
I should have upgraded my GTX 1060 6GB last year.
Last year I said I should have upgraded my 1060 last year.
I bought it second hand 7 years ago and it is still the same price.
I don't do much gaming, and it runs Immich / etc light inference just fine. One thing I don't regret is getting 32GB of DDR4 when I built the system around the time of the GPU upgrade.
Sometimes you just have to accept the current pricing and buy what you need to buy (assuming you need to buy anything at all).
7 years ago it was the same price, but then again, the last 7 years have involved accelerated inflation. So, the same price is actually a lower price.
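The real-price point above is just a deflation calculation. As a sketch: the ~25% cumulative inflation figure below is illustrative, not an official CPI number, so substitute the actual figure for your currency and period.

```python
def real_price(nominal_now: float, cumulative_inflation: float) -> float:
    """Express today's nominal price in dollars from the start of the period."""
    return nominal_now / (1 + cumulative_inflation)

# A card that cost $250 seven years ago and still lists for $250 today,
# assuming ~25% cumulative inflation over those seven years (illustrative).
print(real_price(250, 0.25))  # 200.0 -> same sticker price, ~20% cheaper in real terms
```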
If you're looking for a card in the sane $300 area, the Intel ARC B580 (12GB) or the RX 9060XT (8GB) are a reasonable value. If you want 12GB+ from Nvidia or AMD the used market in previous generations is a good place to look: maybe something like a RTX 3060Ti (12GB) or RX6800XT (16GB).
I personally don't think the GPU market is incredibly miserable. Maybe I am just used to the pain or something? Nvidia has a bit of a tax, but something like the RX 9070XT is basically the 3rd-fastest gaming GPU money can buy and it's around $700. (I'm not sure why the 5070 Ti costs $200 more even given Nvidia's software advantages. It performs almost identically; it just doesn't make purchase sense.)
3060ti only has 8GB, 3080ti has 12GB. That’ll make a difference for prices/comparison.
I’m in almost exactly the same boat.
2017 GTX 1070 and 32GB ram. I don’t run games 4K and still haven’t had any problems running reasonably pretty recent stuff.
Agreed. I build a system every ten years and I've got 6 years to go. AM4 works great, and I've managed to hoard enough ram and drives to hopefully cover any concerns for the next 4 years. Things work, they are stable, and I feel super lucky for that.
I invested quite a bit in enterprise-level homelab equipment from 2020 to 2025 (about $10k). Happy I made it before the big bang. E.g., my SAS HE8 drives will last at least till 2035. But what then? I want my children to be free, too.
You cannot be fully free if you're attached to physical goods.
It may sound like pseudo-Buddhist claptrap, but it's also true. Or, I suppose, Fight Club claptrap. It's still true.
The choice is "do you want to participate in society, its benefits and drawbacks". You can't have only one side of that.
The first rule of Buddhist Fight Club is that attachment causes suffering.
You're not supposed to talk about it!
it also makes the economy go round (unfortunately)
When investors stop to ponder if they are ever going to see any return on their superhot AI investments, you'll have all the cheap hardware you could ever want.
> and the 'realism' of video games plateauing.
I used to think the plateau was here when the Xbox 360 and PS3 came out.
I still think it pretty much was the last major generational upgrade in graphics. An early PS3 game looked night and day better than a late PS2 game. Meanwhile, an early PS4 game looked only marginally better than a late PS3 game, and most PS5 games don't look noticeably better than a PS4 game.
I don't mind that graphics have plateaued, because they aren't the important bit. If anything, I would rather that devs stop trying to chase graphics and make more games with shorter dev cycles.
> Meanwhile, an early PS4 game looked only marginally better than a late PS3 game, and most PS5 games don't look noticeably better than a PS4 game.
Partially this is because there was usually an overlap in sales for early PS4 and late PS3, etc. if you have to support both console generations, it won’t truly be able to take advantage of the newer gen stuff.
Texture resolution and shadow resolution do a lot to make a game look better. The big difference between the PlayStation 2 and 3 was the massive jump in texture resolution, shadow resolution and model polygon count. Play Gran Turismo 5 and go look at one of the cars imported from Gran Turismo 4 for a good example.

However, the PlayStation 2 was capable of some very high polygon count models, as evidenced by Lulu's cutscene model from Final Fantasy X, which rivals most PlayStation 3 player models in detail. Those resolution upgrades, the number of objects (and not just polygons) displayable on screen, and the increase in the distance required before low-poly LOD models kick in all made that giant leap possible and very visible.

Since then it's mostly been adding camera effects such as depth of field and ambient occlusion that are much less noticeable. Though for those with keen eyes, only in the current generation are there textures without noticeable aliasing artifacts, which came as a result of being able to split the UVs thanks to higher resolutions making small UV faces possible.
Since we're 10 years on at this point, I feel pretty confident saying the plateau to my eyes landed somewhere between the PS4 in 2013 and Pascal (GeForce 10-series) in 2016.
I've kept playing games and upgrading my GPU every other generation, and they're still fully utilized, but I can't really see where the additional compute and money is going. My biggest visual upgrade during that time was actually going from LED to HDR OLED which is something that requires virtually no additional processing power.
What's wild is with all this craziness going on, it is sounding like AMD is bringing back the 5800X3D for another kick at the can. AM4 has got to be one of the greatest platforms to ever exist.
I'm one of the collapsed sales. My desktop had died, and I had been thinking about rebuilding it.
But RAM prices went to the moon, so I instead opted to repair the desktop. (It's only ~15 years old.) It's alive, again, and performs well enough.
The HDD in it is pretty old (not as old as the rest of it, it's on its second drive; 15 years would be quite impressive!), and still works for now, but there too, prices are silly and well above inflation. (I looked it up again: the same HDD is 50% more expensive today than when I bought it, in real, accounting-for-inflation dollars.)
I've been replacing older systems with last-gen hardware off eBay. I'm typing this on a ThinkPad T14 (i5-1250P, 512 GB, 32 GB, WWAN) I picked up last week for $370 all in.
Since this mess started, I've bought dozens of unused and like-new systems for clients. All with modern hardware - in the $250-$600 range.
Craigslist still exists, too. Found a really nice ~10-year-old HP workstation for $100 and crammed as much DDR3 ECC RAM (still cheap) and the best Ivy Bridge CPUs (cheap) I could find into it, and it shreds.
Shortages or not, there's little demand for cool new motherboards and CPUs from the enthusiast corner of the market because hardware platforms themselves are stagnating performance-wise.
13th/14th-gen Intel Core chips are still more than enough for your average home gamer, Zen 5 shows only marginal improvement over Zen 4 except for a very narrow range of workloads, getting wider than a 128-bit memory bus is prohibitively expensive while relatively cheap consumer boxes like the Mac Mini run circles around dual-channel DDR5 setups, and so on, so forth.
Sure, presenting this as a consequence of AI boom is convenient for a news outlet, but even before the craze both Intel and AMD were dragging their feet.
I'm not buying it. Both the premise and the new motherboard, that is.
I wanted to build a Threadripper 9965WX system and the math worked out until DDR5 prices came into play. Instead I got a used Lenovo P620 with a 5975WX and still had to buy DDR4 from Shenzhen to get anything remotely affordable. The IPC of Zen 5 is a meaningful uplift, especially for single thread, but it is out of reach.
Where/how did you buy your DDR4 from SZ? Interested in doing the same, but want reputable source/supplier.
10-12 months ago I commented here that people were not realising that AI is going to price us normal people out of computer hardware, and that we need China to actually reach parity on node size. And sadly it looks like I was correct in my prediction.
It's an active attack on the Hobbyist space. Qualcomm buying Arduino solidified this idea in my head. They literally want us to own nothing.
Tin foil hat :)
But in a way I do agree with you; I doubt it is as organized as you imply. Yes, companies and governments do not want anyone on a general computing device at all. They want to see exactly what content you are viewing and responding to.
Microsoft and Apple have been slowly adding various forms of spyware and locking down what applications you can use. And cell phones? Those are the Holy Grail of what Microsoft and Apple want to move your laptop/PC to.
Right now Linux and BSD are the only games in town for non-spyware systems. But the new age verification laws seem to be a first attempt to lock down even Linux :( Since the Linux Foundation is owned by large corporations, I feel that will succeed. For the BSDs? Right now they seem to be flying under the radar.
Why do you doubt this when the rich also have Signal? They meet and talk out of view? The insider trading coming out of Washington?
Why when emails from discovery in labor disputes between google and apple in the 2010s revealed they engage in exactly the sort of manipulation you disbelieve?
Because they own nothing but make believe stocks and life works great for them.
The mega-rich are 100% decoupled from physical reality. May as well treat them more like tribal shaman, priests, preachers, and rabbis.
Just parroting memes the likewise idiot politicians believe are the magic chants that keep gravity itself pulling together the Earth.
"Omg he said the thing! Cut his taxes! Give him welfare!"
Our generation of leaders was raised in a pre-science-and-information world. They rely entirely on cult of personality, as their meat suit never sees itself engage in the labor it relies on to live. It's well aware intuitively how fucked it is. Must continue to stand in the pulpit!
>May as well treat them more like tribal shaman, priests, preachers, and rabbis.
Why associate them with roles that have a degree of positive association and human connection?
Treating them as faeries, vampires, or demons seems more accurate.
I think treating them as the fae, vampire or demons is sort of insulting. Those creatures are at least bound by supernatural laws and can be negotiated with in some way.
Bold claim given all the hate out there for covering up Christian leaders diddling kids, slaughter of Palestinian kids for not being Israeli Jews, and the beheadings and assassinations coming out of Muslim-landia over trite offenses.
I think you conflate informed consent with "brainwashed as children into fealty via allegory of the end times, and threats of violence if they don't comply."
Nah. The first two thirds of the 20th century was the science and information world. Man gained mastery of the skies, the depths of the sea, the void of space, the atom. We were taming diseases and found a way to end hunger. We started building thinking machines. We were playing with the fire of the gods. Science was working miracles on the daily.
It still is, but nobody gives a shit anymore, we are in the financialization and rent-seeker world now.
Now we are just playing with fire.
sometimes i wonder if this is what happens when organic growth stops/slows; when (for lack of a better word) desperate people just start looking for any alternative to keep the growth train running...
Why is it that new accounts these days always seem to come out swinging about politics, class warfare, etc?
It might just be frustrated young people. They're getting squeezed real hard by a system that was set up to put them on an impossible trajectory before they were even born.
You can see the divide everywhere. People with lots of money think supply and demand, congestion pricing, etc. are great tools because it doesn't impact them at all compared to people on the bottom. Those are only good solutions if you're not the one falling off the bottom rung of the ladder.
Is it really shocking that people are upset to see the supply of resources being cornered and hoarded by the ultra rich with the most likely outcome being the only way to get access to those goods will be to pay forever?
The possibility of AI becoming a must-have knowledge repository or memory assistant is scary if you couple it with the idea of never being able to own it. How much is your memory worth? What if you can't compete in terms of productivity without having access to AI? What about the people that can't afford the "first month of rent"?
People come in and make angry posts like the GP because they know they're getting disenfranchised and don't have the power to do anything to change it.
I think it’s probably mostly what your sibling comment said, it’s very cheap to sow division and discord now.
I get what you’re saying, and there definitely are people who are angry about the US slipping, and standards of living reverting to the mean a bit, and looking to blame someone. The True Believer came out in the aftermath of WW2 and tried to analyze why it happened, and laid out that the most dangerous group of people aren’t the ones who’ve been poor for a long time, but those who were recently poor, who remembered a more prosperous time. Those people get tremendously angry about it, and represent fertile ground for politicians and motivated groups to plant the seeds of hate.
People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
> The True Believer came out in the aftermath of WW2 and tried to analyze why it happened, and laid out that the most dangerous group of people aren’t the ones who’ve been poor for a long time, but those who were recently poor, who remembered a more prosperous time.
Is it just people trying to sow division when you're potentially describing an entire upcoming generation?
> People need to have some perspective. You’re not permanently locked out of useful AI models, it’s within reach of most who can save a bit to go get a pair of used 3090s on eBay and run some pretty useful models.
I don’t agree. The current generation of young people can’t afford housing and education without taking on decades of debt. Buying a pair of 3090s for local AI isn’t even on the radar. Even if they could, it’s unlikely they’d be able to make productive use of them. The big AI companies haven’t even scraped the surface when it comes to memory, specialized knowledge, etc..
I see people downvoted my comment and I’m not sure why. I’m not trying to pile on to create drama. I’m trying to explain there’s a growing cohort of people that have a right to be angry because they’re watching global productivity increase as their standard of living is decreasing. Who wouldn’t be upset?
The dangerous part is that people angry about it are easy to sway with propaganda. It’s not the billionaire families colluding to fix food prices, which happened with bread in Canada, it’s the “insert another marginalized group here” that’s causing the problem.
I think the commenters with new accounts who comment only on political topics, and not technical ones, on Hacker News are a bit suspect. Not saying that there aren't a lot of disaffected youths out there, there totally are, and I'm agreeing with you about The True Believer. I just have a suspicion that a lot of these new politics-focused accounts aren't real. But maybe there are real young people who come to HN just to discuss politics; I guess tech has become more political over the last few decades.
I didn't mean that most people are going to go out and drop $1,000 and run their own models locally, I meant that it's pretty good evidence that they're not permanently locked out of owning access to AI, if that's a priority to them.
I agree with most of the rest, I'm a strong proponent of all sorts of safety nets, and higher top tax rates/cap gains tax rates. But it's also important to maintain perspective. A lot of what's happened is that citizens of very rich countries are maybe seeing their standards of living decrease somewhat while many more people globally are seeing their standards of living skyrocket. Visiting family in China every 5 years, the difference is astounding every time.
Upvoted that comment, fwiw, you answered in good faith, not sure why it's downvoted.
You are people and I agree you need perspective.
You wave off systemic issues as no big deal and discuss the potential of a 3090 graphics card. Tell us you're a privileged first worlder without telling us...
That you refuse to discuss solutions to political problems impacting a lot of people who, in our society are off the hook for you too, you're deciding to take the risk your own life doesn't vanish.
You're not relevant to others. Americans lack of political action to ensure a safety net exists for everyone just leaves everyone indifferent should you too end up giving blow jobs behind a Burger King for a portion of kids meal someone threw out a car window should it come to that for you.
So go ahead and pretend reality doesn't exist outside your own experience, little Dark Triad. But if you end penniless in the gutter, you'll only have yourself to blame
I totally agree that we need to be taking better care of each other, our system's a mess, but I wasn't planning on getting into a big discussion about that tonight from my phone.
The point about 3090s was that reasonably good local AI costs on the order of $1,000, so Americans aren't structurally locked out of owning the means to run their own models like the person I was responding to seemed to be claiming. If you can afford a desktop, local AI is in reach if owning it is a priority for you. I don't recommend that route, but it's possible.
From your other comments, sounds like you're also a "privileged first worlder" who got to go to college and attend Burning Man, so let's not fling stones. I'm extremely lucky, I'm extremely aware of it, a visit to some of the actual poorest parts of the world, where people wash themselves and their clothes in rivers that stink so badly of sewage that it's hard to breathe without gagging made me very aware of how lucky even the poorest Americans are, despite how bad it can feel to be in close proximity to some of the richest people in the world when you're not.
And if you're not an account who's part of an "AI-fueled agitprop campaign", I'm sorry for whatever's happened to you that's given you so much rage that you're feeling the need to come here and dump on nearly everyone you've interacted with. I hope things go better for you in the future, I really do.
AI-fueled agitprop campaigns.
Hobbyist equipment is still relatively cheap. You can get previous-gen hardware for formerly current-gen prices, you can run lots of “hobbyist” software on low RAM and no GPU.
It’s bad, but it’s not “literally own nothing”.
The second hand market is going to have much much more lag. But it's very unclear that this is going to sustain indefinitely.
Yeah, I'm not sure that fewer people will own computers, I do think people will shift to much longer upgrade cycles.
it just depends on how you define computer.
people will own an increasing number of dumb terminals connected to rented services.
does that reduce the number of computers? well, no..
so, imo : the trick isn't to reduce physical ownership of devices, the trick is to make it so that you need Big Iron in order to do anything.
One way that might be achieved is by forming social and cultural dependence on models so large that no one individual could possibly run them...
Or live with lower specs. You can still get a Chromebook for $200 and install Xubuntu on it. I did that, and it's perfect for video conferencing and web surfing. You certainly can't play CoD on it, but even if you're looking to play games there are some older games that run on it just fine.
Who cares if Qualcomm owns Arduino. It has never been cheaper to get into embedded computing. You can buy Arduino-compatible STM32 Nucleo boards straight from STMicroelectronics for $15-20, and that's first party. If you're willing to buy third party clones there are boards on AliExpress for $10 or less.
Most things were cheaper last year. But you can still get an RP2040-Zero for less than a buck each and run FUZIX. Neat.
At current prices, Chinese companies could even produce everything possible (~anything but current gen CPUs and GPUs) on slightly older nodes and make a stonking profit while lowering market prices.
It would unfortunately be considered contraband in the US or tariffed 500%
If only the rest of the world could buy it, it would probably work almost as well (edit: to lower prices in the US). Besides, I'm in the rest of the world ;)
It'll be the same for Canada. We're already seeing satanic panic style action against things like TP-Link networking equipment and Hikvision cameras. Funny how those are a couple of the brands that can run 100% locally without a connection to the internet.
China is also short on supply... Capex for these are planned years ahead and just not flexible enough to deal with the supply squeeze right now.
China is famously fast at building things, but maybe not semiconductor factories... especially with long lead times from Western suppliers.
I'm not sure that's going to change unless the LLM stuff slumps. Chip makers get burned every few decades by building boom-time capacity that comes online just after the crash. They're going to be reluctant to make big capex spends unless they think the demand is durable.
After the recent run-up, where are prices on a per-performance basis? Back to 2019?
Computers were incredibly more expensive when I was growing up. People bought them anyway.
Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
I just looked at some low-end NAS-oriented HDDs. The $/TB cost is 2x that of similar ones I bought 5 years ago.
That has never before happened in the history of computing, and it violates long-held, fundamental assumptions.
It happened in 2011 when massive flooding in Thailand stopped a huge chunk of global production. Hard drive prices pretty much doubled overnight.
No, storage prices were still much lower 5 years later.
I think your idea of what "normal people" can afford is a bit off. Normal people aren't buying $1500 computers. And they definitely aren't buying $3000 computers.
A 4K Apple ][ cost the equivalent of around $7K when released. A C64 cost the equivalent of around $2K when released. Both were fairly popular and vastly less useful than a computer today.
If the cheapest useful computer ends up costing $3K, it will still be purchased and will still be worth it at around $1/day of useful life.
The C64 sold "between 12.5 and 17 million units" in its lifetime [0], vs. worldwide PC shipments of "71.5 million units in the fourth quarter of 2025." (emphasis mine) [1] It's truly an apples (hehe) to oranges comparison, and in my opinion it only reinforces the point that "normal people" will no longer be able to purchase computers, just like the C64 was not a mainstream product.
[0] https://web.archive.org/web/20160306232450/http://www.pageta... [1] https://www.gartner.com/en/newsroom/press-releases/2026-1-20...
> "normal people" will no longer be able to purchase computers,
Starbucks' revenue was almost $10B in the last quarter. Most people can clearly afford $1/day for something as useful as a computer.
That seems like a false equivalence to me, even if we ignore the fact that only 21% of that revenue came from non-US countries. There are enormous chunks of the world where the local equivalent of $1500 is a life-changing amount of money.
How do you not understand the difference between spending $5 once or twice a week and having to cough up $3k all at once, or paying 20-30% interest if they can't afford that?
It's extremely obvious from your flippant attitude that you are doing quite well financially and are completely out of touch with the financial realities of the vast majority of people. Congratulations on your financial success, but maybe lay off on thinking that everyone else can afford the luxuries that you can.
It was nice, in the 90s and 00s when computing hardware's cost was just falling so rapidly. I think it was like what, 1.5x "stuff" each year? Like RAM going 1.5x bigger every 12 months, CPU frequencies increased by that much. Per-unit prices were falling.
Now, per-unit costs are rising faster than inflation. The WD HDD I bought in 2017 for $65 real ($49 nominal) is now $95 real, 50% more expensive after inflation.
Trust me when I say my income has not increased by 50% post-inflation since then! (Also … I really should not have checked that number. Needless to say, it's not positive.)
"Normal people" were not purchasing Apple II or C64 computers in the 70s and 80s.
What you're showing me is that you are completely out of touch with the financial realities the vast majority of people face.
There is a reason that the Macbook Neo has been a smashing success.
If the cheapest useful computer ends up costing $3k, then most people will simply no longer own a computer whenever their current computer dies unless their livelihood depends upon it, which for most people it does not.
People spend $1500 on phones. I'd be surprised if they weren't willing to spend that on computers if they see a need.
But from what I can see a lot of people aren't really interested in PCs. Most of the non-techie, non-gamer people I know do everything on mobile.
> Is a computer that lasts 5-8 really productive years (and is still serviceable for another 5-7) and costs $1500 really a deal-breaker just because it was $1000 and on sale for $850 a year ago? Even if it doubled again, it still doesn’t price normal people out, IMO.
Maybe it's different in the US. In Canada, the median income for 25-54 year olds was just under $60k/year in 2024. When you're talking about a $3k USD computer, that's pushing 10% or more of the median after-tax income. My gut reaction is that most people don't even end up with that much disposable income in total, let alone for a single purchase.
HN is skewed with people way at the top end of income earners, especially on a global scale. Imagine getting $30k / year to spend on everything you need and then consider how much $3k on a computer is.
My dad had to take a loan to buy our first computer. Who wants that? It's dumbfounding to see the number of people cheering on backwards progress where we end up where we were 3+ decades ago.
> When you're talking about a $3k USD computer, it's pushing 10% or more of the median after tax income.
If it lasts for 10 years, it's more like 1% of the after tax income of a median individual earner over that period.
I think a computer is clearly valuable enough that people will entirely rationally spend 1% of their income on it if that's what it costs. (I'm not "cheering it on"; I'm just observing and predicting that lots of normal people will still buy computers.)
Computers really aren't that valuable to the average person who already has a smartphone. For everyone else, many probably have a work-issued computer and don't need one at home. The market for high-end home hardware is really only gamers and tech workers, and gamers will fall back to closed hardware fast if price/perf pushes them to it. A big reason PC gaming thrived in 2010-2020 was that PCs were better on a price/perf basis.
And currently the US government is actively trying to ban chinese hardware from the consumer market [1]. So gonna be real fun.
Maybe we'll get a chinese hardware black market.
[1] https://www.reuters.com/sustainability/boards-policy-regulat...
This is really a two-for-one for the AI companies: they lock up the hardware market for their growth while also making sure no-one can buy hardware to host models locally.
High end resins and epoxies are in a critical supply shortage right now. I suspect that there are going to be some serious resource driven PCB shortages in the very near future.
...no. If anything, the GPU situation would cause it to ease up as less low-to-mid-range hardware even gets built.
Is anything not in a shortage?
> Is anything not in a shortage?
Technofascism
The supply-chain disruptions didn't get there yet, but you just wait...
No more tech as all tech components are in short supply, so it'll just become old fashioned fascism.
Authoritarianism
bad news
Slop
Labor
Is there an industry newsletter or source where one can read up on this? All I can find is a single article from Reuters.
When photosynthesis first appeared, the oxygen it produced poisoned the existing life. Sulfur-breathers basically disappeared. In the geologic record the oxygen shows up as massive layers of iron oxide which we mine and turn into steel now. New things can radically shake up the existing environment, the degree of shakeup is the measure of how radical it is.
Let's shake up AI by taxing its profits to ensure nobody in our country goes homeless.
Great analogy! Unfortunately we are the sulfur breathers.
It’s the ultimate NIMBYism. The idea is fine but nobody actually wants it when they’re the sulphur breather.
If you really want to see a radical shakeup that would have some very exciting effects, could I interest you in a little Total Atomic Anihilation?
I'm thinking about how I jumped on getting a new PC a little over a year ago anticipating tariffs would balloon prices. Turns out I made the right choice but for the wrong reasons (not like the tariffs are helping either, but just wasn't as big of a factor).
I'm sure the AI shortages are hurting, but also I'm still using my same motherboard from 2020 and I see no reason why I should have to upgrade in the next 2-3 years (whenever I buy my RTX 7070Ti, it might be time, but maybe not even then).
I guess my hasty purchase meant as only a temporary, times were tight, placeholder Dell Inspiron in 2015 has to do me for another ten years.
hope you're handy with a soldering iron (reflow station?) because eventually the passive components are going to start failing, and I don't imagine you'll be able to plug off-the-shelf components into a Dell
No point in buying motherboards if you can't afford to put any RAM in it.
AI is simultaneously the reason you can't buy a motherboard and the reason you don't need to build a PC anymore. The industry is eating itself from both ends.
Why don't I need to build a PC? I wanna run some stuff in it.
AI is exactly the reason why I would like to build a high end PC. I'm interested in this technology but I don't want to have anything to do with AI subscriptions and big tech in general.
Progress, in any meaningful sense, has to mean we are more capable of sustaining ourselves than we were before. Burning down the commons to train and serve a mythomaniac chatbot is not that. The consumer markets that still worked will shrink, and some will die.
I was looking into self-hosting deepseek v4 pro, since frankly cache reads are an absolute scam and they're 90% of the cost, but then I looked at the ROI and it will never pay off fast enough: the hardware will become obsolete before it pays for itself, even if you were running 10 token-generation streams 24/7.
The napkin math showed that renting is around 27 times cheaper than owning (not including power). I think we're really screwed when it comes to owned access to AI unless Intel comes out swinging with a C-series card that has 128GB of VRAM so we can run them in a 4x128GB configuration, but that seems unlikely since Nvidia has a large stake in them.
This was calculated expecting around 30tok/s, of course you can get 2-5tok/s much much cheaper, but it's unusable for my workflow.
Ironically the few people not scamming you for cache reads are Deepseek.
Everyone else charges a ridiculous amount, but Deepseek's API is $0.003625 / M tok.
I'm surprised no one talks about this because of how significant it is. GPT 5.5 for example costs a ridiculous $0.50 / M tok cached. It's literally almost 140 times cheaper which matters a lot for tool calls.
it's a temporary promo, deepseek will return to only 10x cheaper after.
Yes Deepseek V4 pro is currently on discount.
> The deepseek-v4-pro model is currently offered at a 75% discount, extended until 2026/05/31 15:59 UTC.
However, even when the discount ends it's still very cheap. It will go back to $0.0145 / M cache hit. That's still 34x cheaper than GPT 5.5.
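To sanity-check the ratios quoted in this subthread, here's a quick sketch. The prices are the ones claimed by commenters above (per million cached tokens), not verified against any provider's current pricing page, so treat them as assumptions:

```python
# Cache-hit pricing gap, using the per-million-cached-token figures
# quoted in the thread above (unverified; check current pricing pages).
deepseek_promo = 0.003625   # $/M tok, promotional rate
deepseek_regular = 0.0145   # $/M tok, claimed post-discount rate
gpt_cached = 0.50           # $/M tok, claimed GPT 5.5 cached rate

print(f"promo ratio:   {gpt_cached / deepseek_promo:.0f}x")    # ~138x
print(f"regular ratio: {gpt_cached / deepseek_regular:.1f}x")  # ~34.5x
```

The ratios match the "almost 140 times" and "34x" figures claimed above, so the comments are at least internally consistent.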
Would you mind sharing the napkin maths?
Not OP, but basically take GiB/s and divide by 30. You need at least 128GiB to hold the model, too. It's expensive to get 200 GiB/s, very expensive to get 400 GiB/s and above that you are looking at DC-grade GPUs. Multiple, in fact.
The only way to profitably serve AI is with large batch sizes: running 500 requests at the same time.
If you serve a single user you'll never make back your electricity cost, never mind the hardware costs.
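To make the rule of thumb above concrete: single-stream decoding is roughly memory-bandwidth-bound, so tokens/s is about (bandwidth in GiB/s) divided by (GiB of weights streamed per token), which is the same relation as "take GiB/s and divide by 30" rearranged. The 37 GiB active-weight figure below is purely an illustrative assumption (a large MoE model at roughly 8-bit weights), not the spec of any particular model:

```python
# Bandwidth-bound decode estimate: each generated token streams the
# active weights through memory once, so
#   tok/s ~= memory bandwidth (GiB/s) / active weights per token (GiB).
# The 37 GiB figure is an illustrative assumption, not a measurement.
def tokens_per_second(bandwidth_gib_s: float, active_weights_gib: float) -> float:
    return bandwidth_gib_s / active_weights_gib

for bw in (200, 400, 800):
    print(f"{bw} GiB/s -> {tokens_per_second(bw, 37):.1f} tok/s")
```

Under these assumptions, even 800 GiB/s (well into DC-grade GPU territory) lands around 21 tok/s for one stream, which is why the batching argument above dominates the economics.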
I know it's going to be extremely painful, but the sooner this ridiculous unsustainable AI bubble pops the better off we'll be. The more it inflates the more collateral damage it will cause, and we're probably already looking at 2008 levels of financial chaos.
What’s the feedback loop that leads to a total financial collapse? This looks much more like dotcom bubble. Everyone knows where the exposure is.
I think it comes down to scale (there's like 2 trillion invested so far by very large institutions) and also AI hollowing out foolish companies that decided to go "AI native" and downsize and lose institutional knowledge. When the rug pull inevitably comes and the AI subsidies are gone, the entire idea of "efficiency gains" in a lot of places is going to look pretty bad as soon as they look at their bill.
if that happens, it's going to be one hell of a mess of dominoes to clean up...
There are probably multiple goals of AI investment. It's entirely possible that they are deliberately killing the affordability of how personal electronics like home computers are made and will instead replace them with terminals that stream everything to the cloud. You can make a lot more money off consumers if you can turn their entire computing experience into a utility.
they've been trying for a looong time on that one. i still remember those junky "net appliance"s from the early 2000s [0] and oracle and sun making big statements about them...
[0] https://www.ecommercetimes.com/story/sonys-evilla-joins-audr...
And flying pugs gonna fall from teh sky too.
They should try releasing one with a futuristic space-age design and flashing rainbow lights. Maybe give it a name like HARDTEK and put some random techy shapes all over it.
Weren't we supposed to be living in the post-scarcity era?
We are, it's just very unevenly distributed.
The brief window between the covid gaming bubble pop/PoS ETH switch and the AI hardware blackhole will be fondly remembered as the last golden age of consumer PC hardware accessibility.
If China keeps releasing decent copies of SOTA models that only take 20% of the resources, then we may get some relief when those models become "good-enough"
I've been using deepseek and it's good enough for my personal use. It takes way more time/tokens/course-correcting to get things done, but I spend in a month what I spend in a day with opus 4.6
>copies of SOTA models that only take 20% of the resources
They might be 20% of the price (because they don't have to invest that much in training), but are probably not 20% of the resources (ie. inference), considering they take more tokens to do the same task, and have slower inference speeds.
https://x.com/scaling01/status/2050616057191072161
Even at 2x the tokens (the max from that tweet), that makes them 40% of the resources, which is still a substantial saving.
This makes me sad. My son was just about to be old enough to build his first PC, and was showing interest. I guess I'm going to have to match his savings 1:1 to make it possible now.
Who will be the first motherboard maker to put out a board with 12 slots for legacy RAM?
ASRock has created a "HUDIMM", which is basically a half-bandwidth DDR5 DIMM with half the number of chips per module. Kind of a modern-day 386SX with its 16-bit bus. Presumably they're hoping you'll be able to get fewer, higher-capacity DRAM dice for a competitive price versus a normal DIMM.
On modern systems (all 64 bit AMD, and Intel Core "i" onwards, so quite old now) the memory controller is integrated into the CPU, so what the CPU supports is what you get, and the latest CPUs are DDR5 only. Intel did have a transitional phase of CPUs that can do both DDR4 or 5 depending on motherboard, but AMD it's AM4 = DDR4, AM5 = DDR5.
We are in an AI mania right now. I don't think this will continue forever.
hard to tell.
even if volume and hype decreases from the general pop there doesn't seem to be much of a cap on model requirements -- so at least one sector will be pushed into purchases one way or another.
Smaller manufacturers will fold, and larger ones will leave the consumer market (like Micron/Crucial did), before the market has a chance to bounce back. If and when it does recover, it will be a market of much fewer choices.
A somewhat comparable historical example is the destruction of the Swiss watch industry in the 70s with the advent of quartz and digital watches.
A Rolex Daytona today is known as a very fancy and even hard-to-get watch. In the 70s they were practically giving them away with other watch purchases because electronic watches were taking their lunch.
The bigger takeaway, I think, is that the destruction and folding eventually led to the Swatch Group. People forget Rolex, Omega, et al. were tool watches that were expensive but fairly attainable. Even into the 90s you could walk into a Rolex store and walk out with the watch you wanted. Nowadays you basically have to buy a watch to prove you're good enough to get the one you want.
I foresee a similar thing happening with computing hardware. There will be a small high-end side industry for non-datacenter customers.
The digital watch user will be renting time for a thin client via a datacenter provider. The wealthy or high status user will be able to purchase the expensive boutique home computing hardware they want.
Besides watches becoming expensive trinkets, a Rolex Daytona in the 70s was basically the same watch as what you could get from other manufacturers with the same movement inside. Today you have to spend at least 30k to get something comparable to it which is part of the reason that it's in a permanent demand crunch.
Computers were like that twice already. That always ends.
The only reason you have those watch brands to mention is because they are non-functional status symbols. People that want a watch buy something else.
The same way, people that want a computer will buy from whoever is actually selling them. Manufacturers that want to sell only to datacenters won't last for long.
It's sad when our best hope is that the pumped economy dumps and tanks all the other industries so we can buy computers again.
I think the shocking part is that it's only projected to go down by 25%. That's quite mild given the increase in memory, storage and GPU prices, in my view.
Shortage of RAM and SSDs, and soon, CPUs. Motherboards aren't selling because there's no point buying a motherboard if you can't buy the RAM or SSD it needs.
It’s brutal. I’ve just built a workstation with DDR4 and two-gen old cpu. I paid more for the ddr4 than it originally cost, four years ago. The same amount of ram for the latest motherboard would have been 10x ($10,000). So used DDR4 has gone through the roof, which impacts hobbyists who used to rely on “hand-me-downs”.
15 months ago I saw writing on the wall on several fronts. I suggested my community commit to their buys/builds ASAP and be forward-looking, before things changed.
My high-end HEDT would now be +$2300 to build mostly due to memory and SSD pricing. 96GB of memory going from $430 -> $1800 is wild. One community member literally wouldn’t be able to buy their Mac Mini configuration anymore, plus the self-upgrade SSD would be price hiked.
Where I blanch most is my storage server running TrueNAS. Built it 3.5 years ago with future-proofing in mind. Strong SSD cache layer, plus two HDDs as spares. It wasn't cheap then, but I think between disks, ECC memory, etc. it's +$7000 now to rebuild it again, +$9000-$10000 on last-generation hardware.
I assume manufacturers were making enough motherboards in 2025 to fulfill demand, so what happens when the demand is the same but the production is 25% less? Crazy.
See Permacomputing https://news.ycombinator.com/item?id=48044638
When RAM and an SSD cost more than an entire system used to it's not surprising to see this.
Maybe with AI we can finally kill user-owned computing, and make almost everyone renters.
It's really wrong that the common people have access to things like PCs. It leaves a lot of money on the table the corporations can extract, and makes control much harder. PCs should cost at least as much as a car, so only the right people can afford them.
Own nothing and be happy.
On HN it's not always clear when sarcasm is in use. Especially given that I have seen AI bros basically cheering this on.
AI bros and crypto bros. One and the same thing. Same optimism. Same arguments. Same blind faith. Same zero knowledge of how economy, society or even the technology they are evangelizing works under the hood and what are its shortcomings that are impossible to overcome because physics won't allow.
> what are its shortcomings that are impossible to overcome because physics won't allow.
Have you intended to say "because reasons"? There should be a long chain of reasoning connecting "LLMs will never be able to strictly follow instructions written in natural language (as agreed by 90% consensus of experts or some such, because you can't formally verify adherence to informal natural language instructions)" and "physics doesn't allow that." And I can't find it anywhere. Neither in your comment history, nor in literature.
Because it is next-token prediction and there is no logical part to it. It is like saying I cannot find any research paper on the fact that a car collision might cause a fire because gasoline is in the tank.
But the fact is that there's plenty of literature out there on hallucination and unreliability of LLMs already. If you know otherwise, let us inform Dario before next funding round.
Next token prediction is a pretraining objective that doesn't tell anything about behavior and activation structure of the resultant network. The literature that explores a hallucination problem has little to do with your claim about physical impossibility.
If you want it to be sarcasm, treat it like sarcasm. What's the worst thing that could happen?
Well, the last time I assumed everyone was being sarcastic and ironic, until they actually elected the guy president.
Haha, ah yes, good old GWB.
Not to mention the comment agents. The true AI bro/sis won't waste her/his time commenting when an agent can do it for him/her.
Have to efficiently trash the internet of course.
You sound like a useless-eater manager. Just the kind of roles we'll be happy to have in our future Utopia. The people will be happy to be led by such visionaries such as yourself.
Hey, I'm pretty sure the person you're replying to is being satirical, and you're both in alignment on this one.
We were both being satirical. Seems like people couldn't extend me the same generosity and benefit of the doubt as the OP. Funny, and yet the stark hostility toward my post, and not his, is real.
Sarcasm aside, yep. There will be three classes: the owned, the owners and the unownable. I aim to be unownable.
There will be two classes: those who are part of the perpetual underclass and those who are not. And 98 percent of the population will be part of it.
There are _already_ two classes:
Those who earn their living from their labor, and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
The root of most of society's problems right there...
Also the root of most of society in the first place. We would probably not be able to sustain our current standard of living without this horrible system.
> Those who earn their living from their labor
If any of these people don't work, or don't work enough, they're undeserving immoral moochers and should be miserable and in pain.
> and those whose income is derived simply by owning things they (often) didn't create themselves and charge for access.
It's totally fine if these people never lift a finger in their lives. In fact, they deserve it. NEVER question that. N-E-V-E-R! It's great! Capitalism is great! Capitalism is fair!
You talk like this is new; it's the way it's always been.
> There will be two classes
That confident "will" in that prognosis may ultimately stimulate a consensus "why?" response in the population to explore alternative outcomes ..
How will you do that?
To be owned, someone needs leverage over you or the ability to coerce you.
I spent the last half a century making sure they have no leverage and I am not interested in being coerced.
It's called security.
Hey guys, Richard Stallman is on HN!
Hey I clean my toes before they become snackable.
If you don't care to go into detail, that's fine, but your answer is vague. E.g. do you have a hidden, off-grid, underground lair?
No. I don't need to work. I outright own my accommodation. I have no hard technology dependencies.
if you are on land, you are (or someone is) still paying rent to the government. rent can be raised and you will be evicted if you don't pay.
if you are living mobile, you probably need gas or batteries for warmth or cooling. if your climate is currently comfortable, temperatures can be raised.
or maybe you are a nomad hunting and gathering your own food? the wilderness can be pillaged and sold and "secured" until there's nothing left to eat.
there is no perfect security.
Correct. I just aim to get eaten last.
And I aim to be the one that eats you
Good luck :)
> If you don't care to go into detail, that's fine, but your answer is vague
No lairs necessary. You can read up on people who do FIRE.
99% of people who follow that are still completely dependent on the world order and will be just as screwed as everybody else if everything goes tits-up.
The aim is to be screwed last. That's a reasonable survival strategy.
There's a lot of bullshit that can happen between the status quo and everything going tits up. Having FU money means you can say -all things being equal - "Fuck You" when it's appropriate, instead of worrying about becoming homeless or not having enough to eat
It will be the same two classes there are now and always have been. Those who need to sell their labor and those they sell it to. Class struggle is the only way out. Find some solidarity, you aren't exempt.
Sounds like my kid's friends talking about betas, alphas and sigmas. I think they aim to be sigmas.
i've said something similar, but i shorten it to labour and capital. similar conclusion.
PCs are also made by corporations, together with PC parts. The reason computing became so cheap during the last 50 years was competition between said corporations. Competition that is also pushing the AI token price down and also encouraging - corporations - to come up with models that can run on user hardware.
So what are you ranting against?!
> Own nothing and be happy.
Ah, here it is. Only governments can confiscate our property and force us into that. Governments and politicians that keep telling us how evil corporations are…
> Only governments can confiscate our property and force us into that.
... Do you want corporations to have that power too or something? What are you saying here?
The “own nothing and be happy” quote is from a blog post made by the World Economic Forum. I find meta-governmental organisations even more troublesome, and you can’t vote them out.
It isn’t only conspiracy theorists who should be disturbed by whatever politico-corporate freemasonry that goes on in Davos.
In the digital age corporations can and do confiscate things we thought we owned. Amazon removing paid-for kindle books, devices getting bricked, paying a recurring fee to use some features in your car.
Every year, the government comes around, reassesses my house value (always up, never down) and ask me to pay a percentage (always increasing) or they will take away my house which shelters my kids and family.
So, no, I am not too worried about Amazon removing my $9.99 book.
It's unfortunate, but personal general-purpose computing has been under attack for a long time; this is only another nail in the coffin.
I kinda felt like this was coming, so mid last year I built a local rig with the top-of-the-line parts I could afford at the time lol (rtx5090/ryzen9 etc). Now I just need to build out my inference setup (sadly M3 Ultras are insanely expensive now). I have a feeling they will try to lock down usage of open-source LLMs too. I don't get how a token moat can exist if local inference rigs can be built out to serve open-source models locally for nothing (besides power cost).
I'm sure this makes the billionaire class happy but there are some legit economics involved. We all want frontier class models running on our home PC but that takes 100x or 1000x the computer we've been running. The market isn't going to instantly adapt and make that kind of machine for the same price we've been paying for 1% of it, so that we can leave it sitting idle 99% of the day.
considering how unsustainable this whole AI Business is and how much it ruins everything, I can't wait for the crash.
Why did we listen to the Worldcoin guy again?
Because he killed the Open part of openai and the PEs went nuts
I can't wait for this to happen. I live far enough from the US to not be super affected by this crash and hopefully I will be able to stock up on disks and parts once the prices drop.
Waiting for the future where the only computing devices you can buy as a consumer are locked-down phones and PCs are simply not available anymore...
This looks like consumer choice, with producers following the demand.
Given that the AI companies are still operating at a loss and running on investor money, I'm not sure it's actually driven by consumer choice.
Consumers are not helping though. They do everything on a smartphone and don't even blink when mobile app is required for most trivial things.
To echo the post you are responding to: when the app is required it isn't really a consumer choice, is it?
> required it isn't really a consumer choice, is it?
No one really resists or pushes back. When I resist I hear "that's what consumers want", "it's for security", or that I'm the problem. There is no one to complain to even, except to low paid kiddos in customer service.
That does seem to be where this is headed, especially after Google's last announcement possibly blocking 99% of phones from accessing the internet.
It's not just new hardware, even used hard drives manufactured a decade ago have at least doubled in price. Scam Altman has effectively killed personal computing for all but the most affluent
PCs and higher-end IoT devices (like RPi) are becoming less and less affordable. The web is saturated with slop and bot traffic. I wonder how many people think about what this means. We are rapidly losing several of our major technological ecosystems that have driven economic growth in the past few decades. It's not at all clear what economic benefits we will be getting in return.
I think people are slowly realizing this and that's why projects like Meshtastic are so popular. Too bad none of that crap actually works at scale.
Will demand for computing ever go down from where it is now? Even if the AI bubble temporarily pops, in the long run I think the demand for computers will be practically infinite.
Market forces will probably bring the price of hardware down in the next decade. Whether it is in a form that is useful for regular people/hobbyists is another question. If not, then hopefully the "cloud" starts to look a lot different.
I think it's possible (10-15%) that the AI bubble pops and we all live without 50M token/day OpenClaw installs and running Opus to do things that should have been done by a shell script to the point that it causes a dip in total compute demand. I think it's likely (75% likely if the AI bubble pop causes a dip in compute demand) that this dip extends longer than the median lifespan of the hardware currently being installed in datacenter.
Of course in 20 years we'll be using more compute than today (99% likely).
EDIT: Of course cryptocurrencies provide a floor on compute pricing.
It looks like we are going to see consumer components stop being made in the first place, as demand dries up to the point where serving the consumer market is no longer worth it. And demand is drying up almost purely because supply is already so constrained that almost no one is willing to pay the price.
I would say that since more and more of the world wants/needs sovereignty in this space, more options will come up in the next 5 years. They will probably not be the cutting edge we have now, but personal computing will not die. It will just get a (healthy) reset.
As an example?
"fueled by AI"
...just as the kleptocrats wanted, to stop the spread of self-hosting and desktop computing now that society is tentatively starting to go digital.
Is now a good time to upgrade a CPU and motherboard?
"Fueled by greed". It would be trivial to say no to AI companies because dollars are dollars, it doesn't matter who pays them, and prioritizing literally all of humanity instead of "five companies" is a choice that every single supplier could make, but decided not to. This problem was 100% manufactured by suppliers.
It would be equally "trivial" to say no to personal compute as well. So maybe the problem is manufactured by all of us.
poor Joe Rogan
Reminds me of how ever since egg prices went to the moon we've all had to give up dessert and subsist on thin gruel for breakfast.
What's that? Egg prices are back down after suppliers cranked up their output? Surely nothing like that is possible with hardware... Personal computing is dead forever...
This will happen eventually but there is a much longer lag for hardware supply than for egg supply so I wouldn’t expect a ton of improvement until late 2027 or even 2028.
It takes 5 months for a newly-hatched chicken to start laying eggs. It takes 5 years after breaking ground on a new fab to start producing chips.
That's the issue: RAM suppliers have not started building new fabs, which means they expect this demand to be temporary and plan to make a killing on it while they can. It takes years to get a fab up, and they think the demand will be gone by then. So that means ridiculous prices now, and, if demand doesn't drop, ridiculous prices until someone believes the demand will continue for four more years beyond that moment. Whenever that moment comes, building for four years out is a risk. So this could last forever.
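The fab-timing gamble above can be made concrete with a toy back-of-the-envelope model. All the numbers here (capex, annual profit, fab lifespan) are hypothetical, picked purely to illustrate the shape of the decision, not real fab economics: a fab that takes five years to come online only pays off if elevated demand outlasts the lead time by a comfortable margin.

```python
# Toy model of the fab-timing decision. All figures are hypothetical.
FAB_COST_B = 20.0      # capex in $B (assumed)
LEAD_TIME_YRS = 5      # years from breaking ground to first chips
PROFIT_B_PER_YR = 6.0  # annual profit while demand stays elevated (assumed)
FAB_LIFE_YRS = 10      # productive years once the fab is online (assumed)

def payoff(demand_lasts_yrs: float) -> float:
    """Net payoff ($B) if elevated demand lasts `demand_lasts_yrs` from
    today: the fab only earns during the overlap between its productive
    life and whatever demand remains after the construction lead time."""
    earning_years = max(0.0, min(FAB_LIFE_YRS, demand_lasts_yrs - LEAD_TIME_YRS))
    return earning_years * PROFIT_B_PER_YR - FAB_COST_B

for years in (3, 5, 7, 9, 15):
    print(f"demand lasts {years:2d} yrs -> payoff {payoff(years):+6.1f} $B")
```

Under these made-up numbers, demand has to persist well past the five-year lead time just to break even, which is why a supplier who expects the spike to be temporary will happily pocket high prices today rather than break ground.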
> SK Hynix to invest about $13 bln in a new South Korea plant to meet AI memory demand
https://www.reuters.com/world/asia-pacific/sk-hynix-invest-a...
> SK Hynix has reportedly broken ground on a new advanced memory packaging facility in West Lafayette, Indiana, that should boost the supply of US-made high-bandwidth memory (HBM)
https://www.theregister.com/on-prem/2026/04/22/sk-hynix-brea...
> Samsung to advance mega-fab expansion by 6 months to get ahead in capacity race; SK Hynix follows suit
https://www.kedglobal.com/korean-chipmakers/newsView/ked2026...