Build a PC while you still can - PCs are changing whether we like it or not.

Lil air fryer

This all sounds like absolute hell for consumers' rights, repairability, upgradability, and overall variety in the PC space.

1 month ago
Tempo Fugo

Currently, you can't repair a GPU, a PSU, a CPU, or a motherboard. It's hard. All you can do is replace it. So it's not that different from those new kinds of PC.

8 days ago
staraffinity

@Stefan My main computer is a Mac Pro from 2010 and it disagrees with you. It was upgraded several years ago with 32 GB of RAM (used, from eBay), has RX 5700 XT graphics from AMD and a USB 3.2 Gen 2 PCIe card from Sonnet Technologies with two USB-C ports, and macOS is stored on a Samsung 970 EVO NVMe drive, which Apple added support for booting from in the Boot ROM just a few years ago. It still runs the latest macOS and Windows 10 very well. So much for planned obsolescence in that case (well, the latest macOS isn't officially supported, it requires some hacking to work, but still; a long-lasting machine). Sure, they have computers that aren't that upgradable, but to say that their stuff doesn't last isn't fair, I think.

9 days ago
Drakesfortune

@Braden Richardson If the PC is just as powerful and costs 1/4 the price, though, that's a trade-off I'd take.

26 days ago
spunkinater

You will own nothing and be happy

27 days ago
Sloppy Puppy

And the answer to all that is RISC-V.

27 days ago
Rob Lugo

Anthony has honestly been my favorite addition to the LTT crew, a pleasure to watch. I'd take a computer class if he taught it.

29 days ago
David Chen

I already liked this guy the first time I watched him talk.

10 days ago
have a cigar

Self-discipline would be a class he could use, I am sure. He should cut down on the cupcakes.

14 days ago
Metrion Void

@Timberwolf CY Literally makes no sense.

15 days ago
Timberwolf CY

@Metrion Void Tell us you're jealous without telling us you're jealous, lol

15 days ago
Thomas Wiley

Actually this is an old story; LTT is just observing one of the cycles. They aren't old enough to remember 'big iron,' the massive mainframes that everyone saw in movies from the 50s to the 90s. These are the grandparents of the SoCs we see now. IBM's VTAM came along in the 70s as a comm/app layer within big iron, a way of connecting the different subsystems within a mainframe. It was also a way for IBM to make more money, because you had to pay more to "unlock" features, i.e. open communication to those subsystems. The base level got you one or two "features," while paying premiums got you many or the rest of the system. The add-ons were already there; they just had to be unlocked by software. You also paid for how many terminals you bought as physical add-ons.

The first IBM PCs were mostly integrated too, with perhaps a SCSI controller or add-on RAM board as aftermarket purchases. The only "build" items were the physical items that wore out, like the keyboard or mouse. (Except for the IBM Model M keyboard; you could kill a person in the desert with one of those and still have a functioning keyboard. Just sayin'.)

Also, we've had these SoCs for years now. We've been calling them 'appliances' because, well, the term is apt. People know them as 'thin clients.' Basically, these are the replacements for terminals, the things that connected to mainframes. With an appliance you get localized processing power and lockable systems, which are needed in quality/sensitive areas of businesses. It's only because of the advent of smartphones that we have businesses considering marketing appliances to the general population. Phones are technically SoCs, right? And while people bitch about a phone becoming obsolete in four years, they go along with it. So now we've come back around the circle, where appliances are set-top boxes or dongles [Roku, Xbox, Fire, etc.], AKA 'little iron,' with unlockable features (comm/app layer for games, movies, Internet access, etc.) for a monthly fee.

And yes, the only add-ons are the physical parts: controllers, keyboards. Eventually, people will tire of disposable computing, some company will offer an appliance with the ability to snap in upgrades (an all-new PCI bus), and the circle will continue.

1 month ago
Cntrl

up

1 day ago
ccSleepy

Yeah I already can’t build a computer. So this wouldn’t change anything for me. But there’s no way I’ll believe my ability to build a computer will be compromised because of what LTT describes. My big concern is price creep on parts which has outpaced inflation.

2 days ago
Daniel H

@Coaxill I respect your cordiality and I believe you have good intentions... as do I. We just have very different ideas on getting there. So let's just say that everything you say is absolutely 100% correct. Let's say it is easier to pursue imperialism than pure socialism. What makes you think (if this is what you think) it will ever turn out more pure than every time it's been tried in the past? In the U.S., for example, both the left wing and the right wing hate each other's guts (present company excluded). Why do you think either side will suddenly start wishing good will toward the other (again, if that is what you think)? If that's not what you think, then feel free to disregard these questions.

4 days ago
Coaxill

@Daniel H I've studied it in some detail. Meaning no insult, I think perhaps you could benefit from further study, particularly with an eye for the actors involved and their intentions. I think the problem is that using Socialist ideals to promote imperialism is easier than using Socialist ideals to promote Socialism. People kinda like imperialism. They liked Lenin's Vanguard rhetoric until it disadvantaged them, not realizing that from the beginning Lenin was clearly taking exception to the most crucial parts of socialism. In other words, a dictatorship of the proletariat, represented by a single person, is just a dictator. This is why every single one of the countries you're thinking about (USSR, China, Cuba, Vietnam) calls itself Marxist-Leninist, rather than Marxist or simply Socialist. Leninism, Vanguardism, call it what you want, is essentially the idea that socialism is best implemented by a small and all-powerful intellectual elite who can shape society according to their whims. This is quite evidently an atrocious idea, antithetical to the very concept of socialism, but people liked Lenin. Modern Socialist theory considers this notion monstrous, and an example of how easy it can be for people to dress their own desires in Socialist terms. Hope that was worthwhile; I respect your cordiality.

4 days ago
Enchanter Eddie

I cannot even recall in which year the saying "the PC is dying" first came out. And obviously, it still hasn't finished this process. What I believe is that there will always be a need for the highest possible performance out of the desktop form factor, whether from enthusiasts or from some businesses. Lower-end PCs have been made into very small boxes for years and are used as family rigs as well as business PCs, but they are not killing the bigger desktop boxes. Computing technology can be built in many sizes, from supercomputers to mobile phones, and the desktop is one sweet spot among them. Even if ATX and x86 are replaced by something better (possibly ARM), people will always want/need their computers to be customizable and able to grow for the sake of better performance.

1 month ago
Adam Lawson

You're not considering that "high performance" and "small size" can be the same thing. Ergo, Apple Silicon.

23 hours ago
Joey_BK_86

@Keyser soze Samsung also went the same route as the iPhone with no headphone jacks.

3 days ago
Jack Dillinger

I always think like this: better cooling equals better performance. Liquid-hydrogen cool any reasonable CPU or some shit, or a room full of ice packs from the freezer and somebody who replenishes them every 2 minutes; whatever gives better performance. So a bigger case means better cooling, and a smaller, lame laptop means worse cooling and worse performance.

5 days ago
Redslayer86

The PC started dying the year the first PC hit a customer's desk; it's been dying ever since. Now, that said, there was a stretch from about 2002-2012 where PC gaming actually was being pretty solidly outdone by console gaming (not on power, obviously). But since about 2016, the shift back towards PC has become pretty clear, and that's why you're seeing so many "console exclusives" on PC; even Sony is starting to send their exclusives to PC. The age of PC gaming has arrived again, and it's because of AAA company greed making trash for consoles, and indie devs making gold on PC.

5 days ago
Russell Smith

I agree with Anthony that the salient question is "Is it worth it?" My answer for Apple Silicon was "No" (but that had more to do with being allergic to Apple's infrastructure). If Intel or AMD went ARM (assuming a relatively equal capacity to run older applications in emulation, a big assumption considering that Apple really did an amazing job on Rosetta 2), I'd be more sanguine about making the move. The remaining issue is losing incremental upgrading, a hallmark of the current PC. Albeit the upgrade path on a CPU is limited by eventually needing a new motherboard to support the newer CPU, and there are advancements in Thunderbolt and such that also push new CPUs and motherboards, but the basic concept of being able to replace your graphics subsystem with an updated, more powerful model is an important consideration when graphics cards are steadily increasing in capability.

1 month ago
Dragonrabbit

Thumbs up for using "salient" and "sanguine" within the first two sentences. Five internet points acquired!

2 days ago
CalikoTube

@BlackAxe101 Or the entire tech industry can stop copying Apple. But you know that never happens.

9 days ago
BlackAxe101

@CalikoTube Not always. I built a system in 2019. I managed to grab a used 2070 for like $300 and a new 7700K + mobo for like another $200. Maybe $800 total for a bomb 1080p / decent 1440p game machine. Just replaced the mobo + processor from 7th to 12th gen. I was able to reuse everything, including the RAM, since I had considered dragging it along to a future Ryzen build, so the speeds are good. If I had waited instead of trying to "future proof," I would've saved money on the RAM too, lol. Both of these situations would be erased with ARM. How would I source used/cheaper old parts in paragraph one if all parts are basically soldered together? How would I upgrade anything in paragraph two even if I can solder? Presumably the PC breaks detecting new hardware that isn't OEM, and just straight up doesn't work. So until Apple starts to consider a little maintenance/upgradability, or other major players that already do so start playing with ARM (on a consumer level), I'll be right here with x86, where I have a plethora of options, not just "buy a new one," which is where this looks like it's going.

9 days ago
Russell Smith

@CalikoTube "eventually, you have to replace every single part" Agreed. The thing is, I can replace them as new hardware becomes available, at a lower cost per upgrade than buying a whole new unit. There are, of course, advantages in integration that sometimes offset this. For instance, new processors with big and little cores require a new motherboard to support them, which might also be helped by newer memory. Sometimes these things snowball.

19 days ago
CalikoTube

@Russell Smith I do see your point; the problem is, components get outdated. Even the motherboard. So eventually you have to replace every single part. The only reason I would want easy-to-upgrade RAM is so I could buy an M2 MacBook with only 8 GB of RAM and upgrade it later to 32 GB when I can afford it. Also, second-hand Macs would be more enticing. Also, I wasn't aware that Apple developed their own hard drives. I always thought they were 3rd-party drives.

19 days ago
Sebastian Loskarn

Hearing this, I imagine a dark future where PCs are handled like phones today: "Sorry, your 2-year-old PC is now irrelevant because we're not giving it any more updates."

1 month ago
horizon

@A W Once you can game on a laptop using low power, I think all non-desktop-enthusiasts who like gaming will switch, and if it can game, it can probably do most everything else that most people use. I can see people picking mini PCs instead if they just use external monitor(s), but maybe the laptops could be detachable from the screen and sold separately, to be more economical and versatile for that reason.

8 days ago
DJ Soru

Bruh, you clearly don't know you can install other OSes on your Android phone.

8 days ago
slimdunkin117

@tippyc2 My iPhone is jailbroken and I do what I want... I can spoof the version.

9 days ago
guitarplayerforu

@ArtisChronicles If companies do away with outright ownership of products, I think they'll soon go bust/bankrupt, because leasing/subscriptions are just not feasible or doable for the vast majority of people. It doesn't matter how much they try to force it, it's simply not going to work, full stop. By scrapping the option of outright ownership, I see the majority of companies going bankrupt very quickly. It's one thing having subscription or leasing available as an option alongside outright ownership so people can make their own choice, but to do away with the latter would finish a lot of companies off, and it certainly wouldn't make them more money, that's for sure 👍🏻 Then you have the sustainability argument: how can trading in (throwing away) a perfectly good working product after just 2 or 3 years be better for the planet than owning a product outright, keeping it until it reaches the end of its useful life, and then recycling it? Am I missing something? Not owning a product and swapping it every 2-3 years is far more wasteful than owning a product and keeping it for as long as it lasts, then recycling it. Making products last longer should be the primary focus, not scrapping the outright ownership option, if they want less waste and to be more sustainable/environmentally friendly or whatever you want to call it.

16 days ago
guitarplayerforu

@Cringe with Minish I'd never, ever lease products, because 1. it's a constant ongoing cost which simply isn't sustainable for me, and 2. the products then aren't mine, so I have no say in how I use them. Like a gaming PC, for example: the lessor would decide what games you could play, not you, etc., which is totally unacceptable to me. It's different paying monthly for stuff like streaming services, TV, broadband, cloud storage, etc., because you decide what service you want and then pay a monthly fee for it. I think if companies did away with outright ownership of products, they'd soon go bust, as subscription/leasing just isn't feasible or doable long term for the many; it doesn't matter how much they try to force it on the public, it's simply just not going to work. I'd happily keep the same smartphone or gaming PC for 5+ years, or as long as it lasts, before recycling it. So me owning my phone, PC, or other product outright and keeping it for a long time is far more sustainable, environmentally friendly, and less wasteful than trading in a perfectly good PC/phone/vehicle/product just because it's 2-3 years old. I mean heck, even Google is realising that, by supporting their Pixel 6 series and beyond for at least 5 years.

16 days ago
Demonmutantninjazombie

People seem to miss the point of the video regarding efficiency and power draw. CISC is incredibly complicated to implement and iterate on compared to RISC-type architectures like ARM. Die space and power draw can't keep increasing the way they are now, especially for areas like high-performance computing. CISC requires more transistors to implement than RISC due to the way instructions are handled, which is one reason why the cores are bigger. The other issue is that larger cores mean longer distances to communicate with other places on the device. Even a µm is such a large distance that it results in higher power draw and heat, due to the increased resistance, in addition to significantly increased latency.

Then there's the issue of supporting the ever-increasing number of instructions that do wacky things in a single instruction, sometimes the work of multiple RISC instructions in one. This is a nightmare to implement in hardware via control logic. Multiply that by hundreds of complex instructions added over the years, and a lot of die space gets taken up just for the control logic. It's just inefficient and isn't sustainable if we want greater computing performance. Sure, transistor sizes are still decreasing, but come on, they can't keep decreasing forever, and we are obviously incredibly close to a fundamental limit. CISC owes its success to ever-decreasing process nodes, whose efficiency gains let its inefficiencies be ignored. Now that we are reaching a fundamental limit, to increase computational power we must explore other architectures and ideas rather than just sticking to the idea that smaller transistors will solve our problems. It's funny how people see this as the end of customizable PCs, though, and get so angry at LTT pushing a "narrative." Quite simply, SoCs are just a more efficient architecture.

I can guarantee that people here really won't want ever-increasing power consumption by a PC once they see the power bill. Like, seriously, we are approaching PCs that legit consume as much power as a fucking microwave, or more, except I'm going to guess gaming lasts longer than the minute it takes to warm up a meal. Even if someone is ready to fork over that bill, the people who run HPC clusters are not. Servers suffer from the power bill, and those operators will buy anything that can offer the same performance at lower wattage. Regardless, it's not like it would be the end of customizable PCs; that's just kinda dumb. I can easily see architectures where SoCs are used for the main processor and integrated with other components on the main motherboard. One of the things my professor talked about was stacking DRAM on top of the actual processor die and having wires vertically connect the two. This would save so much time and power regarding latency, and it's only possible if we abandon the way we see PCs now.
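The "multiple RISC instructions in a single instruction" point above can be sketched in a few lines. This is an illustrative toy decoder, not a real ISA: a CISC-style read-modify-write add expands into three RISC-style micro-ops, and real decode hardware has to encode an expansion like this, in control logic, for every complex instruction it supports.

```python
# Toy sketch: a CISC-style "add [mem], reg" touches memory and the ALU in one
# instruction; a RISC pipeline would issue it as separate load/add/store ops.

def decode(cisc_instr):
    """Expand one CISC-style instruction into RISC-style micro-ops."""
    op, dst, src = cisc_instr
    if op == "ADD_MEM":             # add [dst], src (read-modify-write memory)
        return [
            ("LOAD",  "tmp", dst),  # tmp <- mem[dst]
            ("ADD",   "tmp", src),  # tmp <- tmp + src
            ("STORE", dst, "tmp"),  # mem[dst] <- tmp
        ]
    if op == "ADD":                 # a plain register add is already one micro-op
        return [("ADD", dst, src)]
    raise ValueError(f"unknown opcode {op}")

program = [("ADD_MEM", "0x1000", "r1"), ("ADD", "r2", "r3")]
micro_ops = [u for instr in program for u in decode(instr)]
print(len(micro_ops))  # 4 micro-ops for 2 instructions
```

Two opcodes make for a trivial table; hundreds of them, each with its own expansion and edge cases, is the control-logic burden the comment describes.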

22 days ago
Sophia Antonopoulou

I love your whole team and I really mean that (no true favourites as you’re all awesome) - this channel is really fun and informative - but Anthony is such a great guy and I could listen to him talk about computers all day. He reminds me of a lot of my friends :)

4 days ago
The Dream Traveler

If this were happening any time soon, Intel wouldn't have started developing their own fully fledged GPUs.

1 month ago
LemonSqueez

It's easier to develop something big and then reduce the size later to fit into a small form factor. They need time to catch up with the other two on hardware and software (drivers).

4 days ago
stephen allen

It always strikes me every time I watch these videos: a genuinely large factor in me enjoying them is the stretched aspect ratio, which looks beautiful on mobile. Now if only they could go all the way to 21:9, that would be killer.

28 days ago
Nabusco

While I can see this change as an inevitability, I just hope the serviceability and upgradeability will not be impacted as hard as I think they will be

1 month ago
jaka alatas

@Lil air fryer not...yet

6 days ago
Armin Rigatoni

@Zoltán Nyikos "Subscription based hardware" Well said, but I've got news for you bud. They want subscription based EVERYTHING

1 month ago
Commander ZiN

I disagree that it is inevitable. People have said this before, and it never happened. While there is demand for PCs, PCs will exist. I know many who will be willing to pay more for more power, upgradability, repairability, and an open platform.

1 month ago
mfrey0118

@Ben van Broekhuijsen I guess house repair is a bit more important than PC upgrades 🤣. And yes, when you choose a motherboard, go to the manufacturer's website and check the RAM QVL before buying your memory. Upgrading, at least for me, has not been hell but quite fun, and if you don't choose the bare minimum but go a step above on your components, you won't even need to overclock and run into all the headaches associated with it.

1 month ago
mfrey0118

@Ben van Broekhuijsen I had a budget-build AMD A10-7850K Kaveri APU up until 2 years ago, when I couldn't take the lag of the old tech anymore. You should not still be on that old architecture if PCs are important to you. Those chips came out years ago, and you could've been putting 20 bucks aside per paycheck or something and been able to get a high-end build today. The AM4 socket has been with us for years, over multiple generations of CPUs, but you missed it all and complain about compatibility with ancient components that you couldn't even give away at a yard sale today, because you "don't have the money". That's an excuse; if something's important enough to you, you'll find a way. I mean, really, even a budget build today with like a 3400G on a B450 board would be a MASSIVE increase in performance at all levels. That was my first APU since Kaveri, and it went into my budget build in 2020; the difference in speed and responsiveness was literally night and day. And somehow, while still being "poor", I was able to upgrade my system piece by piece over the last two years, while still enjoying compatibility on the AM4 platform, to where I now have a beast of a high-end gaming system. Lots of work put into it, patience, saving money, and overtime... was it worth it? Hells yeah, and I appreciate it every time I hit that power button.

1 month ago
Exinferris

The bleeding edge will always be power-inefficient. If I wanted to play Star Citizen on a thin laptop, I would give it 10-15 years. If I wanted to remove miners from the economy, I would release 600 W GPUs. *Anthony, I hope Linus put you up to this.

1 month ago
yothere1209

I enjoy hearing you unpack and elaborate on computer architecture stuff.

1 month ago
harshbarj

People have been saying for DECADES that the PC was doomed. I remember people saying the same when I was a kid in the '90s. ARM systems are great, but they all still lack real power. Even the new Apple ARM chip: it's a massive step up, but it's still well behind higher-end x64 chips. Plus, just try to upgrade an ARM system in any way. My Raspberry Pi 4 is great and I love my Samsung Tab S8 Ultra, but at the end of the day, when I need some real power, I use my Ryzen system. ARM will no doubt eat away at the low end and may even eat a little into the mid-range. But for the foreseeable future, ARM is no threat to the high end.

25 days ago
paradox

@CalikoTube You make a lot of points in favour of ARM, yet you fail to mention the massive lack of software that actually supports ARM, not to mention the fact that compatibility layers such as Wine have a bunch of issues running software (especially games). Good luck waiting a few years or even decades for software and game developers to actually support ARM; a lot of older software likely won't ever see this kind of support at all. Also, ARM has existed for decades (nearly as long as x86), so there's no "new" technology here; you're still defending old technology. Apple just implemented a CPU that's based on this instruction set.

3 days ago
CalikoTube

@Richard Clay Ummm, NO. Apple never said "that's not fair, it doesn't run on Apple Silicon." To think that Intel is panicking for no reason is ignorant. ARM runs a lot cooler than x86, so that temperature argument is out.

9 days ago
Richard Clay

@CalikoTube If you are talking about the Intel chips in MacBooks, yes. But not any other x86 computers. You could even run the comparison at 5 W, where Apple Silicon would look 10000% faster than x86, because x86 can't even run at that wattage. It's stupid to say that Apple's chips are 2 times faster than Intel's. The fact is, if you try to run chips at 500 W, x86 can provide boosted performance, while Apple Silicon would reboot due to the high temperature. Apple Silicon still cannot run much software. Whenever somebody shows the super strong performance of x86, "Apple" always says "It's unfair, this is not natively supported by Apple Silicon." I don't know whether Apple knows the future, but it's clear that Apple understands how to make money by selling an extra 8 GB for a hundred dollars.

9 days ago
Dhruv Pandya

@CalikoTube The headphone jack is still a feature people will look for. Second, they are removing unnecessary tech, not making new tech. ARM is not as scalable and versatile; the versatility of traditional hardware is an advantage in itself. Number 2, Apple has never been massively successful on non-mobile computers, i.e. desktops. Apart from creators, any other office looks for repairability, and forgoing some power is not a difficult decision. I am not saying ARM will never overtake x86, but it seems unlikely. If there is no ecosystem for repairing a machine, it is not reliable. I have seen companies stick with gasoline cars because they are repairable. They also tend to pick cars which are easily and cheaply repairable by the local mechanics.

17 days ago
CalikoTube

No. Apple made a huge leap over Intel. If they hadn't, they would have stayed with Intel. Intel is sweating right now and mocking Apple in ads. Apple Silicon saves a ton of energy compared to high-end x86 chips. This is why your knockoff iPad from Samsung doesn't use x86. I wouldn't be surprised if within a decade Apple has a chip as fast as or faster than the highest-end x86 chip while using half the power. Apple knew the future and is making it real. This comment is gonna age as well as "Lol, Apple removed the floppy disk!" "Lol, Apple removed the SCSI drive!" "Lol, Apple removed the mobile keyboard!" "Lol, Apple removed the CD-ROM!" "Lol, Apple removed the headphone jack!" I know you're not mocking Apple, but defending old tech never ends well.

21 days ago
Gerard S.

So long to the numerous combinations of CPU, GPU, RAM, and storage that we enjoy, and to the ability to build a PC according to our budget and needs. If this is the future, the PC makers will only sell us what they think will make the most money and, as some of the other comments correctly predict, force us to upgrade when they stop supporting our old PCs.

1 month ago
HunterDrone

My sole complaint about SoC systems is how frequently their makers seem to not give a shit about long-term maintenance, expecting you to just buy a new model rather than maintain or reconfigure your existing machine.

1 month ago
neeneko

@The_DankSmith There has always been a tradeoff with repairability in PCs. There used to be a debate about closing things off in ICs, since you could no longer replace blown transistors. The thing is, the curve between repairability and e-waste is a complicated one. The more 'repairable' a system is, the more e-waste it produces. This might be offset by repairs, or it might not... and to put it bluntly, even when systems are repairable, most people don't repair them. Go down to any e-waste facility and it is filled with machines that are 'repairable,' but no one is bothering, so they simply end up creating more waste than more integrated systems do.

28 days ago
adbrooks95

They dgaf. Simple. They'd just tell you to buy a new one.

1 month ago
Mail Pigeon

Apple in a nutshell

1 month ago
Squintis

@Peter Breis Most PC parts have a warranty of up to 5 years, meaning if your GPU fries, you can send it back to AMD or Nvidia and get a replacement. Apple? Sure, you got to the 10-month mark; literally every other manufacturer has a one-year warranty, that's standard and nothing special. If you had waited legit 3 more months, they would have told you it's not repairable (even though it probably is a loose cable or something stupid) and asked you for $1.6k to fix it, or to buy the new one for $2k. Maybe it's because I am a "fiddle monkey" who has no problem fixing their own stuff, but calling Nvidia for a new GPU if mine breaks within 5 years and doing a quick swap is leagues ahead of spending $2k every 5 years.

1 month ago
Trauma

I could see PCs becoming smaller and smaller, but customization will still be an option.

28 days ago
Harry

If AMD manages to improve performance per watt by 50% every two years, these GPUs are not dead but in their prime.
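Taking that hypothetical rate at face value (+50% perf/W every two years is the commenter's assumption, not AMD's roadmap), the improvement compounds, so the multiplier stacks up quickly:

```python
# Compound perf/W growth: improving by `rate` each `period` years multiplies,
# it doesn't add, so after n periods the gain is (1 + rate) ** n.

def perf_per_watt_gain(years, rate=0.5, period=2):
    """Relative perf/W after `years`, improving by `rate` every `period` years."""
    return (1 + rate) ** (years / period)

for y in (2, 4, 6):
    print(f"after {y} years: {perf_per_watt_gain(y):.2f}x")
# after 2 years: 1.50x
# after 4 years: 2.25x
# after 6 years: 3.38x
```

Three such cycles would more than triple efficiency, which is the sense in which a 50%-per-generation cadence keeps a GPU line "in its prime."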

1 month ago
Forbsi Sauras

I'm a gaming enthusiast and built my own PC (EVGA 3090). I felt very uninformed about most of what was said in this video, but I understand that PCs may change to a more compact, less modular design. Sure, go for it, as long as competition flourishes and it doesn't become a monopoly.

21 hours ago
William keenan

Imagine having your own custom GameCube, though, that doubles as a PC. If this goes far enough, it'd be incredible.

17 days ago
Fatty

And the great thing about a small, integrated system is that when it breaks you get to buy a whole new system! Wait...

1 month ago
Brad

@Joe Marais You are missing the point. There won't be a need for upgrading for a while now. The cost would be extremely reduced if everything were on a single die. We are on the brink of capping out silicon; 5 nm is gonna be a pause.

1 month ago
Shannon O.

@Obywatel CG The screen is a problem when folks throw their Nintendo wand at it. What really goes bad is the backlight inverter (cheap electronics) or the screen driver board (cheap electronics), and yes, most TVs are repairable, but out of warranty it's not economical.

1 month ago
truejim

@Dave102693 I think we’re agreeing with a lot of old dudes who remember when being a car enthusiast meant you could do all your own work on your engine. They probably feel the same way about a once great era now gone.

1 month ago
truejim

@Jordan Chang There used to be a shit ton of TV repair shops in the Toronto area. Now it’s a niche business. What Anthony is saying is that in a few years PC repair will be a niche business too, because all new PCs will be SoC.

1 month ago
Dave102693

@IrisCorven Apple already has

1 month ago
GrahamAtDesk

In my experience, whether companies will make products for enthusiasts comes down to the size of the market. The number of environmentally conscious consumers has been increasing, and I see no reason why that should slow down soon. I think modular machines are here to stay, though they'll evolve (as they always have) to suit the available tech. I can't see power-hungry architectures surviving that much longer, though. When chips similar to Apple's become available to the PC market, things are going to change fast, and there'll be a large number of enthusiasts wanting to put them in their existing machines. And I expect the companies that sell to those enthusiasts now will want a slice of that pie. What am I missing?

1 month ago
Guy

Hopefully we will have upgradable sockets for the system-on-a-chip, or a cluster. Maybe the DIY space will get even more creative 🤷‍♂️

13 days ago
Arctic Sun

I appreciate the in-depth philosophical exploration in this video. I have more questions than answers after watching and reflecting on it. Thanks for creating and sharing this content.

24 days ago
greyグレェ

The lessons Anthony mentions with tighter integration as seen in the M1 Ultra and such, were already evident decades ago in the Commodore Amiga, as they iterated from the 1000 to the 4000. Unfortunately, we all lost out when Commodore went bankrupt. At least maybe this time we will learn from history instead of repeating its mistakes?

1 month ago
BorlandC452
BorlandC452

Sometimes I feel a little old-fashioned for still using a tower for my main computing when I'm not much of a gamer. But I still really like having one because of how I can customize it. I love being able to swap out individual components, or even adding new ones. You really can't mix and match with a laptop, and especially not a phone.

1 month ago
Sky Cloud
Sky Cloud

I'm crap at working in small spaces, so my pc housing is stupidly big for what is needed.

20 days ago
Dawserdoos Grahamcracker
Dawserdoos Grahamcracker

@Riccardo It also stands with a menacingly large price, standing far far out of reach for many to overcome. My nondescript rectangles may have their issues, who doesn't? Let me also tell you that they can show their true colors if allowed the chance. Mayhaps none of my nondescript rectangles will ever be as mighty as the wonderous tower... But they sure do try their best, considering 1/4 the price.

22 days ago
Victor Almanzar
Victor Almanzar

The Framework laptop, while not as modular as a desktop, is miles more repairable/upgradable than anything else on the market.

24 days ago
Yuna Ch
Yuna Ch

True. The best thing about a tower is that you can build the PC according to your very own specifications. You can build it to be as unique as yourself.

24 days ago
CavalloDiSpade
CavalloDiSpade

Yeah, I'm still using the tower I got in 2013. This thing is turning into the Ship of Theseus with all the parts I've added and replaced. It's now coming to the end of its life (the motherboard is getting really dated and some things have failed), but it's lasted almost 10 years now, so I've really gotten my value out of it. And when the time really comes I'll do what I did before, and build a high-end PC that hopefully will again last me 10 years.

26 days ago
JustGiveNoFox
JustGiveNoFox

I’m glad I built my PC when I did. It’s not top of the line now, it wasn’t back then, but the feeling of pulling a brand new 2070S XC Ultra out of the box along with a brand new ASRock X570 Taichi and 3950X. She’s my pride and joy. No point in getting a prebuilt. That’s all these are, prebuilts

1 month ago
JustGiveNoFox
JustGiveNoFox

@Ryan Tornai yeah but today way too many people have more dollars than sense. So people will say “I built that” when they really mean “I paid someone to build it for me”

18 days ago
Ryan Tornai
Ryan Tornai

Back then? The car scene is still huge

18 days ago
JustGiveNoFox
JustGiveNoFox

@Sartenazo working on cars is something I do for a living so there’s definitely a correlation there 😆 Some things do not change; I may not have a fast car but I’m proud of the computer that I built. Every day I get to look at it and admire it because I built it

26 days ago
Sartenazo
Sartenazo

I find it funny that back then people were proud of the cars/motorcycles they built themselves, and now we do the same thing but with PCs. Some things don't change.

27 days ago
R R
R R

With the advent of the mighty APUs we're seeing this year from the Ryzen 6000 series and Intel's 12th gen, this is slowly creeping in. I love my PC to death, but I'd love to game on the go just as much. That's why I still lug around my Switch, and soon a 6000 series handheld. To me, the definition of worth is different for each individual. I hope and I pray that Blue and Red don't go this route of monopolistic obsolescence. A 600 W GPU is worth it for gaming at home. On the road or at work? Not so much. As the saying goes, "it depends". I don't need 600 W 240 Hz ultra on the go. At least not in 2022. For that use case, the Switch or a Windows handheld (or Steam) would still be the go-to.

28 days ago
Jaff Fox
Jaff Fox

This 'news' is older than most people watching. I remember my IT teacher banging on about this back in the early '90s, telling us the RISC-based Acorn Archimedes computers we were all forced to use would soon replace the 'inferior' x86 DOS PCs we had at home. But the fact is people will always want the freedom to choose what they want and how they use it. That's how true innovation and progress occur.

20 days ago
popehentai
popehentai

Look who it's coming from: the same guy who swears more home users will start using Linux because of Windows 11.

15 days ago
Sam Speed
Sam Speed

7:54 10 GHz from a Pentium by 2005? I want to know what the engineers had been drinking that day when they came up with that figure! Interesting video. Personally, I don't mind. If it can deliver the performance and it's reliable enough that it won't go wrong (because it would be practically impossible to fix without sending it off somewhere), then great.

15 days ago
Aefweard
Aefweard

My issue with the idea of the everything-chip: say you're two generations down the line and want to upgrade your graphics, but the CPU side is still chugging along fine. Having to replace the whole thing is not only wasteful but more expensive. The same goes for a component dying.

1 month ago
Vpn Vpn
Vpn Vpn

Yep, there’s always a trade off.

24 days ago
古明地恋
古明地恋

@Ruan Ellis I will take 3x power bills rather than 3x upgrading costs thanks

1 month ago
arno nabuurs
arno nabuurs

@SkyDivingMoose everything is already soldered down, why would they change this?

1 month ago
Proliloli
Proliloli

@Ruan Ellis Especially now that electricity and gas prices are up, your GPU would probably cost its value in electricity prices within 2 years haha

1 month ago
The Smuggest One
The Smuggest One

They want that. It's not about you having a better product, it's about siphoning more of your money into their offshore bank accounts, yachts and second, third, fourth luxury homes.

1 month ago
M Cabrera
M Cabrera

Excellent topic, and a total pleasure to have this kind of hypothetical/future tech conversation without bias toward companies. Great video.

7 days ago
NotKray
NotKray

The amazing part about PCs is that we can swap, downgrade, and upgrade the components. I doubt that will happen to all components in this form factor. Laptops alone only offer limited upgradability, let alone this type of form.

1 month ago
Daryl Cheshire
Daryl Cheshire

The reason I build my PCs is that I was frustrated with my 2000 HP Pavilion computer with its bespoke motherboard, and I was impressed by how much better it ran with its own graphics card rather than the built-in one. At the end of the day, the PC has about 20 components which only fit one way. I was intrigued by this guy who had his GF build her PC, and there was this ASUS YouTube video of a bikini-clad girl who excitedly built her own PC, explaining how everything goes together, and claps her hands when finished.

1 month ago
David de Goede van Hasselt
David de Goede van Hasselt

I agree that the higher power consumption of many PC components, and so of PCs in general, is a subject by itself that needs to be addressed. But then to say it has to change to a small form factor with an M1 or similar idea in design? That means that, again, consumers will be paying the price to have a computer that will stay powerful enough to keep up with the ever-changing ways to project moving images for our games and fantasies in whatever form of media. Ending up buying a new machine every few years... In turn causing more and more waste, as you cannot simply swap out a main processor with all its integrated stuff, since the outer package most probably cannot utilize it. You are egging people on with this to buy a new phone every year, and buy a new tablet, and oh heck, go get that new small superpowered (but ever so limited) new computer too! If that isn't wasteful, I guess I am totally clueless what waste is... So... Linus, what the heck???

22 days ago
Dick Waggles, P.I.
Dick Waggles, P.I.

I won't ever give up the ability to modify my computer.

1 month ago
Truth Be Told
Truth Be Told

@Rokmononov I bought a computer that's not easily upgraded. It will be the last, after the power supply turned out not to be designed to actually charge the batteries while the machine is being used at a fraction of max. Oh, and batteries you can't replace without melting glue. And a graphics card that overheats and disconnects itself randomly. Adding an SSD? Same issues. Anything I cannot replace parts in, I'm not buying in a PC. Sure, they did the same junk with phones, but phones aren't meant to be as utilitarian as my PC.

12 days ago
Dick Waggles, P.I.
Dick Waggles, P.I.

@P X yeah, those two things are totally comparable. and phones totally aren't getting worse every single year. Not much of a point you made there

23 days ago
Vanquished
Vanquished

@Kratos reag haha. What a bad take.

23 days ago
P X
P X

Own a smartphone? You already did.

23 days ago
ArcangelZero7
ArcangelZero7

@Kratos reag My Thinkpad X230 still go brrrrr.

23 days ago
JoeKool
JoeKool

the intel 6500k cpu fm2 is still to this day over powered in terms of when it came out and what it can do and run. It legit runs pubg and can do some of the quickest rendering at base speeds. no cap look into it

12 days ago
kasa
kasa

I do think Intel, AMD and Nvidia should definitely invest heavily in ARM too. It would enrich the computing market. Also, there is always going to be demand and a need to customize your computing setup. So even if ARM takes over, you will just have some standardized way to connect separate CPU, GPU and other component chips in a smaller form. Maybe it wouldn't be as efficient as one integrated chip, but it wouldn't need to be in a desktop application.

1 month ago
B. L.
B. L.

Anthony is the man. So knowledgeable and excellent at conveying the information.

7 days ago
DrummClem
DrummClem

The power consumption is, IMO, what manufacturers should focus on. Okay guys, you can make a lot of things with that nuclear power plant of a PSU, but we need to do something about that. Apple had a point. (Although I'll never buy a Mac Studio, it's way too expensive.)

22 days ago
smakfu
smakfu

At 47, I’m used to hearing that the traditional desktop form factor is dead. I don’t think so. I’d also be careful with assuming closely coupled system modules (aka MCMs posing as SoCs) are the sole optimization route; that’s true for Apple and the ARM universe because load-store RISC-style ISAs are highly sensitive to memory subsystem latency issues. CPU core-wise, they achieve great efficiency, but flexibility is highly limited, and scaling exotic architectures gets expensive and difficult. But Apple silicon and the other mobile SoC-style producers are stuck in a “when all you have is a hammer” situation. Apple’s main business is mobile; the Mac business represents ~12% of their revenue, versus mobile devices at 60%, services at 20% and accessories at 8%. The desktop portion of that Mac business is minuscule. Through simple necessity, their desktops are going to follow the patterns established by the rest of the company’s hardware designs. That’s their business model driving design decisions, but don’t assume those same decisions work for Intel, AMD, etc., because they probably don’t. Also, the Mac as a platform has always been defined as a sealed box, no tinkering allowed, especially when Steve Jobs or his acolytes have been in charge of the platform. The expandable “big box” Macs have been the exception, not the rule. The Mac and the PC (defined by its open, slotted-box roots) are two very different platforms philosophically. I don’t think you’ll see closely coupled system modules replacing big honking discrete GPUs, sockets dedicated to big discrete CPUs, and slotted RAM and PCIe slots for the desktop and workstation workloads that demand “bigger everything” (gaming, workstation, etc.). IMHO, you’ll see more chiplets (more closely coupled) in each of the compute buckets (CPU & GPU) and you’ll see a lot more cache, but the basic box with slots isn’t going anywhere. Where you will see more SoC-style system module solutions is in the laptop space.
However, that’s just an extension of a pattern that’s existed for a long time; it’s just that Intel’s iGPUs and interest in closely coupling memory have been limited by their being generally lazy. Keep in mind, the vast majority of all x86 “PCs”, in both laptop and desktop form, already (poorly) implement a closely coupled GPU (mostly on-die), memory controller, cache hierarchy, etc. TL;DR: I doubt the desktop form factor, sockets, slots and all, is going away. This all seemed a bit click-baity.

1 month ago
RJARRRPCGP
RJARRRPCGP

@Brian Vickery Same kind of thing, at least since sometime after the original Windows 8 came out. :(

12 days ago
RJARRRPCGP
RJARRRPCGP

@RiezeXeero News about PCs being dead? IIRC, I've been seeing news of that kind since sometime in the 2010s, especially since around 2014, but I didn't let that stop me from building PCs! 👍 I think those terrible news stories were because of Windows 8.

12 days ago
smakfu
smakfu

@Dave Nordquist ROFL

1 month ago
Dave Nordquist
Dave Nordquist

That said, you should start a spintronic sockets company that hobbyists can't soil. Or ramp up embedded users of Intel's Optical I/O. Or get that quantum backplane rolling with mobo makers. Get it on all the types of storage and memory (write-once flash, I'm looking at you,) FPGAs, and then hobbyists have a chance in hell of not being killed by DDR5 sensitivity to air composition (and/or losing WiFi because Windows 11 can't believe they'd fart in the middle of PvP gaming and deauthenticates all the things.)

1 month ago
smakfu
smakfu

@KLAP TV I said their market for desktops (e.g. not laptops) is minuscule. Of the 35 billion in revenue generated by the Mac product line, I’d be generous in allocating 25% of that to desktop products (mini, Studio, MacPro, iMac), which means it’s likely somewhere around an 8.75 billion dollar business. For a company that recorded 365 billion in revenue last year, that’s a footnote. Thus my point that their desktop strategy is built around reuse of mobile-first architecture, and they’re (understandably) not going to be particularly interested in building flexible open platforms. And that’s perfectly okay, as the traditional PC market exists to give you that flexibility (and that’s why it isn’t going anywhere).

1 month ago
Peter W
Peter W

I disagree with this. Surely different architectures have different strengths and weaknesses, but the high power consumption is not really the fault of the architecture, or at least most of it is not. Performance per watt has very steep diminishing returns at the high end; it always has. You need almost exponentially more power to push a little more clock speed at the upper end of your hardware's limits, as most overclockers can tell you. It's just that the attention of the market, and therefore the money, is quite heavily focused on MOAR POWAH right now, so it is more beneficial for the manufacturers to squeeze every bit of performance into their chips with little regard to power consumption. And we get what is expected: moderately more performance than efficiency-focused products like the M1, with a lot more power consumption. But I bet you that if Apple wanted to attempt Ryzen 5950X / RTX 3090 levels of performance, they would inevitably have to take the exponential power penalty just like everyone else. That is not to say that it is impossible to make efficient products with current architectures. They just have to try. We see from the rapid growth of thin-and-light laptops and handheld gaming consoles that if the manufacturers have performance per watt in mind, they will deliver products that make the most of their limited power budgets, regardless of architecture.
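The steep scaling described above can be sketched with the standard CMOS dynamic-power relation P ≈ C·V²·f: near the top of the frequency curve, voltage typically has to rise roughly in step with frequency, so power grows roughly with the cube of clock speed. A minimal sketch — the 4 GHz / 1.0 V / 100 W baseline and the linear V-f scaling are illustrative assumptions, not measurements of any real chip:

```python
# Simplified CMOS dynamic-power model: P ~ C * V^2 * f.
# Assumption: voltage must scale roughly linearly with frequency
# to keep the chip stable at the top of its curve.
def dynamic_power(freq_ghz, base_freq=4.0, base_volt=1.0, base_power=100.0):
    """Estimate power at freq_ghz relative to a 4 GHz / 1.0 V / 100 W baseline."""
    volt = base_volt * (freq_ghz / base_freq)  # assumed linear V-f scaling
    # Power scales with frequency and with the square of voltage.
    return base_power * (freq_ghz / base_freq) * (volt / base_volt) ** 2

for f in (4.0, 4.5, 5.0, 5.5):
    print(f"{f:.1f} GHz -> ~{dynamic_power(f):.0f} W")
```

Under these assumptions, a ~37% clock bump (4.0 → 5.5 GHz) costs roughly 2.6× the power, which is the overclocker's "exponential penalty" in miniature.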

1 month ago
Erik Lindgren
Erik Lindgren

@tiefensucht But the Xbox is still a weak device; most games have to be seriously dumbed down to run on consoles, even impacting those who use a PC, because it's easier to make a game consistently decent than to make it use the full power available.

2 days ago
TheTyphoon365
TheTyphoon365

@Synoxa I would recommend growing up

11 days ago
tecanec
tecanec

@Demonmutantninjazombie A rule of mine is to never expect everyone to be an expert on something. And I think that breaking this rule is one of the major shortcomings of the PC market. Way too many people think that, say, having more RAM will let them play video games at higher framerates, even though RAM generally doesn't work that way. But who could blame them? I only know because I'm a programmer who likes to work with that sort of thing directly.

22 days ago
Demonmutantninjazombie
Demonmutantninjazombie

@tecanec Ye, I think many people here don’t necessarily know about what exactly computer architecture is or the design principles behind it. There isn’t anything wrong with that but it results in people missing the point of the video. To explain all the technical details it requires some prior knowledge that not everyone has.

22 days ago
Demonmutantninjazombie
Demonmutantninjazombie

High power consumption is the result of the x86 architecture. Implementing a CISC-type architecture requires more silicon, especially the control logic that has to deal with the more complicated CISC instructions. Not to mention the die space wasted implementing the control logic for the hundreds of extension CISC instructions. This results in larger cores and longer traces required to connect different components, which increases resistance and latency. The increased resistance increases power draw and heat. The inefficiencies of CISC could be hidden thanks to the long decrease in transistor sizes: while the control logic and die space got worse over time, the efficiency improvement from shrinking the process node was greater than the inefficiencies above. Until, of course, we reach a limit to how small transistors can get. We can't rely on process nodes to overcome the inefficiencies of x86; simply doing the same thing won't help. ARM is a more efficient architecture regarding power draw, and SoCs are even more efficient, integrating DRAM and other components into a small package, decreasing resistance and latency. This is only possible since ARM can afford smaller cores due to a simplified instruction set. Regardless, these power increases are not sustainable. Those who run high-performance computing clusters will ditch whatever they are running if something else offers the same performance for less power.

22 days ago
Canon McLarnon
Canon McLarnon

I like the idea of diversification in the PC industry due to ARM becoming more popular, but my worry is that Apple's head start on Samsung, HP, Huawei, etc. would lead to a monopoly that the iPhone Hive Mind fully supports.

10 hours ago
MrSlowestD16
MrSlowestD16

I feel like a more likely change will be the goal of making "computers" as low-power and as baseless as possible, with the actual computing done in the cloud, and they'd charge you a subscription for it. And to that end, servers need swappable components. It's far too much money to just swap out large sections of the data center every few years. The form factor may change a bit, but swappable components will always be necessary; there's too large a market for them. Also, OEMs will always want swappable components. Take a company like HP: they have like 20 different variants of a desktop with different CPUs and graphics. If that were all integrated, that'd be 20 different mobos, but it's not; it's likely "made to order" in the sense that they just slap the RAM in and ship it out the door. So I hear the argument, and I realize they don't care about gamers or enthusiasts, but I think the data center market plus the OEM demand will keep swappable components a thing in some fashion or another. And as for the benchmarks, 66% of an RTX is pretty awesome for sure (is that also an embedded variant?), if we assume "gaming" is reflective of graphically intensive applications, but it's also 2x the cost, at least. In terms of peak performance it's efficient, but it can't really showcase next-generation graphics, the "ooo's" and "ahhhhs" that are needed for keynotes and the like. I think discrete GPUs will always be necessary in some form or another.

20 days ago
Jarls Tenibal
Jarls Tenibal

Just saying, Intel has been putting backdoor 'security' chips in their chipsets since like 2009. I don't think this is as much of a problem as people think. Proprietary hardware drivers are way more of a risk for security and consumer rights. Who here is rocking custom drivers?

1 month ago
Vpn Vpn
Vpn Vpn

I rock custom drivers

24 days ago
Benjamin McLean
Benjamin McLean

> "Just saying intel has been putting back door ‘security’ chips in their chip sets since like 2009" That's not OK either. Some days when I look at how blatantly evil the Big Tech companies are, it makes me want to go join the Amish.

28 days ago
Etienne Maheu
Etienne Maheu

Hey Anthony, I don't know how deep your CPU design knowledge goes, but if the nitty-gritty detail is something that interests you, there's a small company formed from ex-ARM and VLIW employees that decided to start from scratch 15 years ago. They're currently doing simulation validation in FPGAs and are pushing loads of improvements into the LLVM compiler to support their architecture, with the goal of getting a small micro-kernel running on the thing, and eventually Linux; so it's quite a serious project. They are called Mill Computing and they have 13 videos published on YouTube documenting their design and why it works. I thought it might be right down your alley :) EDIT: I didn't expect this to get that popular, so just a small FYI: if you're looking into this on YouTube, make sure to watch the playlist, as it starts with part 1 hosted on the Stanford Uni's channel. Else you might miss some important context. Cheers!

1 month ago
Deoxal
Deoxal

I remember reading about it and people said the design made no sense. The belt forced a lot of unnecessary cycles on the back and forth of data.

1 month ago
Doublin
Doublin

Processor technicals are way above my head but still very cool sounding, so I'll look up that channel.

1 month ago
Pirojf Mifhghek
Pirojf Mifhghek

@Scooty Efficiency is all well and good, but the architecture is all locked down. Can't overclock it. Can't use third-party cooling solutions. You get what you pay for and not a single MHz more. If you're not shopping for a mobile device, that's not an appealing platform, at least not at the price Apple's charging. Four grand for a computer that doesn't actually represent "top of the line" is a fail in the desktop computing space. I really want to see what other companies can do with this. If using a RISC architecture gives us all this thermal and power headroom, let us use it.

1 month ago
Scooty
Scooty

@Etienne Maheu I appreciate the thoughtful and well-articulated response, thank you!

1 month ago
Etienne Maheu
Etienne Maheu

@Scooty if you have the patience, you can always watch the videos ;) But, as a TL;DW, it's all about order-of-magnitude perf and efficiency improvements, as well as some interesting security improvements and new software paradigms. It's nothing like x86 vs ARM. This thing doesn't even have registers in the traditional sense.

1 month ago
TheTyphoon365
TheTyphoon365

I confidently disagree, for the same reason that with engines "there's no replacement for displacement". With larger components, you get better heat dissipation and cheaper manufacturing. The future won't be closed-shell micro builds, at least not completely. There will be more room for performance in large builds.

11 days ago
13orrax
13orrax

Turbos exist

7 days ago
TheTyphoon365
TheTyphoon365

@Bifta right, true

8 days ago
Bifta
Bifta

His point is that you can theoretically have like 4 small PCs in the size of a bigger one, so the big PCs of the future will be really powerful.

8 days ago
slimdunkin117
slimdunkin117

The smaller builds will outperform the larger builds... the "larger" builds will just be obsolete.

9 days ago
Mario S
Mario S

Sure, SoC sounds fun until you realize you have to upgrade CPU, GPU, RAM & motherboard (& sometimes drives) at the same time if one malfunctions or becomes obsolete. During the last 2 years I kept my PC going by selectively adding components and reselling old ones, something that would be impossible with an SoC setup.

1 month ago
Yuna Ch
Yuna Ch

These are all still speculations right now. The things that are important about a PC, aside from performance, are modularity, upgradability and repairability. Unlike a compact box of hardware like the Apple M1, which probably has horrible repairability and almost nonexistent hardware upgradability (I mean, ever heard of an Apple product with hardware that can be upgraded?), traditional desktops are designed to be modular. Sure, it's wasteful as it is, but if you look at it in the long run, it becomes the more economical solution. If built correctly and with top-grade components, you can use a desktop indefinitely. You can easily swap components that either break down or become outdated, which is something a mini-PC-style system unit can't boast about. But the biggest edge desktops have versus anything these companies are currently developing is.... RGB. Yes. RGB. Say all you want about your technology, but gamers won't bite unless you incorporate RGB. Now imagine an Apple product with RGB lighting. I dare you to imagine.

24 days ago
IN SHORTS
IN SHORTS

Well, the Buster understood like 10% of that, tops. What I take from this is: regular CPU chips are dying out and new technology is taking over. Now what I care about is: HOW LONG until this new technology is cost-effective? And WHY would I NOT want this new technology to take over? Does it stop the system as a whole from being upgradeable or replaceable? Please keep in mind real-world consequences for those of us that aren't balls-deep into all of this and just want some sound advice for investments, as professional users. Tell me the future!

7 days ago
8lec Roe
8lec Roe

Yesss. Let's put the entire system on a single PCB so we need to throw everything away when one part goes bad.

1 month ago
Snooze you Lose
Snooze you Lose

But guys you can rent it. The cost will be reducing portions of your bug caloric intake for a month. No big deal. Just tighten that belt.

1 month ago
Eugenio Finizzi
Eugenio Finizzi

So when you crack a GPU you replace it without even trying to fix it?

1 month ago
8lec Roe
8lec Roe

@Yaroslav Semeniuk I meant the time, not saying the different parts of a CPU die at different times. Like, a pin can get bent or a cap can get dislodged somehow before the CPU stops working, but the CPU itself doesn't really die; it just keeps on going for years, often more than 10. I can keep swapping out different parts while keeping my CPU, and the other parts are functional and can be reused by others or for different projects.

1 month ago
Yaroslav Semeniuk
Yaroslav Semeniuk

@8lec Roe Like all transistors synchronously dying all at once? You have reverse causation here. Because a CPU needs 100% of its parts to work, when some small part dies, it won't work anymore. And if you're talking about wear and tear, meaning that if one core died today the other cores would die in a few months either way, then your original comment doesn't line up, because the same would apply to an SoC with CPU and GPU combined.

1 month ago
David
David

Intel and AMD have moved toward SoC already so this is just the logical progression. If you build a PC today, the mobo will look like a clean sheet of silicon compared to one made in the early 00s. For example you wont find a northbridge or southbridge on the modern one cuz those controllers have been moved onto the chip, and of course integrated graphics have been viable for certain systems since the intel Sandy Bridge days. It's too bad PC building might decline just as it reaches a more mainstream audience though, and we still need the modularity and consumer choice that having split components brings. I've long been a fan of RISC architectures, but the Mac Studio really is a black (well, silver) box. Gonna be hard to swap out parts or repair that one at all, but that's Apple for you. I'd like to see some slightly less evil companies catch up in the desktop ARM race.

4 days ago
TAMPABLACK
TAMPABLACK

The Cell processor was amazing for the PS3, but the learning curve was big for gaming; it seems only Sony first-party devs mastered it by the end of the generation. It was way ahead of its time.

14 days ago
Location3
Location3

I like how you say the issue is less the instruction set and more the modularity of the overall system architecture, so maybe a high performance, energy efficient mini desktop with an X86 SoC running Windows is possible. And if PCs have to switch to another instruction set, maybe they'd prefer to use RISC-V which is open source and has fewer competing chipmakers than the ARM ecosystem has.

2 days ago
Yuyah
Yuyah

Considering I'm already a laptop gamer for convenience (and yeah, I know performance hurts), the way it evolved to make smaller, compact systems run at decent performance is good in its own way. Like consoles, they keep consistency over the rest. So if we can make high performance more affordable, it will benefit everyone. ATM PCs are suffering from shortages, inflation and abuse from manufacturers; it's NOT ok.

1 month ago
Rac3r4Life
Rac3r4Life

Even with a switch to ARM, I believe the socketed chip on motherboard paradigm will stick around. People will still want upgradability and expandability in their desktop computers.

1 month ago
ItsGraywulf
ItsGraywulf

@me5383 true, I do concede.

22 days ago
me5383
me5383

@ItsGraywulf htc is garbo

22 days ago
ItsGraywulf
ItsGraywulf

@me5383 HTC has had a bootloader unlocker for quite a while

22 days ago
babybirdhome
babybirdhome

@Mario W I’m not meaning to knock on you, but I am going to be pedantic with your comment so that people have the opportunity to learn and know. And it has nothing to do with technology. You don’t actually have a return from your investment because you don’t have an investment. An investment is something that you put X money into in order to get X+Y money out of. What you did wasn’t “investing”, you just sold off your old video card before it was fully depreciated in value so you could roll that money back into your new video card. It can be a smart thing to do, and it can be a better way of spending your money, but it isn’t actually an investment, it’s an expense. The same is true of cars - they lose value as they age, so buying one is rarely an investment. The exceptions are when you’re buying something that’s especially rare and likely to appreciate in value so you can sell it for more than you bought it for. If you could reliably predict and expect that to happen, then that could be an investment, but normally cars are expenses because they lose value instead of gaining value. Houses, on the other hand, are considered investments because they tend to appreciate in value over time so you can buy one for like $250,000 and sell it 10 years later for $450,000. It’s the same thing for stocks - you buy shares of a stock because you think who/what you’re giving that money to is going to make a lot of money once it’s in the marketplace, and at that point, the shares of stock you bought are worth more than you paid if you sell them, and if you don’t sell them, the dividends you earn from owning those shares will pay you more back than you paid buying them. So what you’re doing is smart, but you’re just using the wrong terminology to describe it. 😊

1 month ago
Egg-Roll
Egg-Roll

@Russell Doty That was once true, but now it's different since bands are SoC based. I know Xiaomi wants to enter the NA market (Officially) but due to dominance of the big players here they've not really bothered (yet they have the FCC approval on most phones lol). Ah yes the brand loyalty that brings random battery draining issues and the explosively good times of the Note 9, used to be loyal with LG till they shit their pants with the Stylo series and made them crap, and won't touch Sammy ever again thanks to the S3/S4 issues my family had. You follow the expensive brand loyalty and peer pressure and I'll follow fairly regular updates decent build quality and cheap prices.

Vor Monat
Ravvij
Ravvij

I'd prefer the major players to look into RISC-V. Last I heard, its versatility and purpose-built design were better than ARM's. Not sure if that's still the case.

Vor 13 Tage
bob hope
bob hope

That was a great presentation. Exactly at my level

Vor 6 Tage
Josiah Whitfield
Josiah Whitfield

As a lover of small form factor PCs and power efficiency, I'm all on board for this future, but I can already see this becoming a major problem for the wallets of every consumer.

Vor Monat
MrSpaceMan
MrSpaceMan

I don't know, I still think we'll skip the SoC desktops and move straight to VMs. We just need the national network infrastructure to support it.

Vor 3 Tage
CarthagoMike
CarthagoMike

I have no doubt ARM will play a larger role in the PC market in the future, but I don't see it dominating x86 anytime soon.

Vor Monat
Jesus Barrera
Jesus Barrera

@Goobfilm cast jajajaja yeah right.. try 10 years

Vor Monat
Goobfilm cast
Goobfilm cast

I give x86 laptops 3 years max before ALL makers have to resort to ARM-style SoCs. Increasing performance will mean bigger thermal issues and larger batteries, which is just not a solution for portables. Modular PCs will stick around a little longer, but prices will go up and market share will go down.

Vor Monat
Jesus Barrera
Jesus Barrera

@Ryan Thompson 90% of internet users do so on their phones.... They have absolutely no interest in a PC for such stuff now

Vor Monat
sylvia m
sylvia m

@Jovan Malic - At this rate, we’ll end up running our old games on Linux with some emulation layer.

Vor Monat
sylvia m
sylvia m

@Jovan Malic - Same here. My favorite game is from 1998.

Vor Monat
PlainOldCheese
PlainOldCheese

I really hope some company (maybe AMD or Intel) develops a motherboard and socket standard for ARM-based chips that allows for PCIe and upgradable RAM. The benefit of "custom" PCs is their longevity. If your rig gets slow because programs are using more RAM, add more RAM. If your CPU is becoming too weak, slap a new one in the same motherboard. If your storage kicked the bucket, you can plug a new drive in.

A super-integrated system would just end up like phones, where people toss them after 2-3 years for the new one. Massive piles of e-waste and lack of repairability are big problems that companies need to think of when they develop new products (even though they really don't seem to care). I hope that this is just a new branch for PCs and that it will open up more opportunities for variety and even more specialized and power-efficient systems. Otherwise I'm just gonna hope Raspberry Pis can catch up and become truly viable as desktop replacements.

Vor 12 Tage
Elumio Merk
Elumio Merk

Speaking of Raspberry Pis, I just watched this video yesterday: https://www.youtube.com/watch?v=fLYEleyXOtg What a coincidence.

Vor 12 Tage
jpogi gtxcr1
jpogi gtxcr1

This is me. I always upgrade my rig as a whole. I tend to balance out all the components.

Vor 22 Tage
KessilRun
KessilRun

Microsoft was once going to manufacture their own chips, starting with the Xbox. They even designed some aspects of the Xbox 360/ Xbox One. I believe they were going to design and implement a processor in Kinect, before deciding to tie into the Xbox One’s existing CPU core. But I believe they did design the audio chip in the original Xbox One.

Vor 13 Tage
The Garden of Eatin
The Garden of Eatin

I think the main thing keeping x86 relevant is Windows. Windows, and more to the point Windows' ecosystem, really can't abandon x86, because there's so much software that REQUIRES it. Apple has such a tight grip on their platform that they can dictate unilateral and very sudden platform shifts. Linux is source-available so as soon as the new architecture is supported by GCC someone somewhere can start pressing the compile button. Windows? There's gonna have to be an end of a decades-long era.

Vor 25 Tage
Sai Sibi
Sai Sibi

As far as these changes go, I really, really hope the modular nature of PCs does not go away. I would very much like to be able to install the GPU of my choice and the RAM of my choice, and build it all on a motherboard of my choosing, all in an enclosure that I like. So I hope it doesn't all become a bunch of boxes doing things monotonously.

Vor Monat
Greg Daweson
Greg Daweson

@filleswe91 "looking forward to a much environmentally friendlier world of computers" ~live in the pod, eat the bugs. It's for mother gaia , just don't mention the rich, who continue to live in mansions and eat caviar.

Vor Monat
JU5TNTIM3
JU5TNTIM3

@Mike Loeven they won’t.

Vor Monat
Emerson Vella
Emerson Vella

Consider this: how many times do you put a PC together, take it apart, and put it together again in its lifecycle? A couple of times. Would you rather have the opportunity to enjoy that 1 hr of playtime and lose all the performance benefits? I'd rather have a better-performing system. The M1 has cut my render times by more than 50%.

Vor Monat
Hex
Hex

@filleswe91 I was about to say that computers do not represent a significant portion of power used, but then I remembered server farms. Since those requirements shape the consumer space, and server farms need to perform for the least cost, it is obvious that we will all move to SoCs. Great for laptops, really. But unless I can stack chips or have adjacent sockets, computers as we know them will become very obsolete.

Vor Monat
filleswe91
filleswe91

The problem with the PC's modular design is that we lose performance by having the components so far away from each other; we have LOTS of latency between them, and you count each signal's latency in nanoseconds (ns), not milliseconds (ms). That's why Apple put almost every component on one silicon chip: everything is super close together. Personally, I've been looking forward to a much environmentally friendlier world of computers (RISC/ARM everywhere) to put less strain on the electrical grids and the planet, with all the oil, coal, and other fuels humanity burns to charge our electronics. Way to go, everyone working to bring ARM to replace x86! ♥

Vor Monat
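The nanosecond point above can be roughed out with a back-of-the-envelope flight-time calculation. This sketch assumes roughly 15 cm/ns signal propagation on FR4 PCB (about half the speed of light) and illustrative distances; real memory latency is dominated by DRAM access and controller overhead, so wire flight time is only one ingredient.

```python
# Rough flight-time comparison: signal on a PCB trace vs. on-package.
SPEED_CM_PER_NS = 15.0  # assumed propagation speed in FR4, ~c/2

def flight_time_ns(distance_cm, round_trip=True):
    """Time for a signal to cover the given distance (round trip by default)."""
    trips = 2 if round_trip else 1
    return trips * distance_cm / SPEED_CM_PER_NS

pcb = flight_time_ns(10.0)  # CPU to a DIMM slot ~10 cm away
soc = flight_time_ns(0.5)   # memory on the same package, ~5 mm away

print(f"PCB: {pcb:.2f} ns, on-package: {soc:.2f} ns, ratio: {pcb/soc:.0f}x")
# PCB: 1.33 ns, on-package: 0.07 ns, ratio: 20x
```

The ratio scales linearly with distance, which is why tightly integrated packages shave a fixed slice off every memory transaction.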
Daniel Hebard
Daniel Hebard

Even though the vast majority of consumer PCs may eventually go the Apple route, there will always be user-customizable PC components available, as long as there's demand. What will likely happen is the price of being a PC DIYer will go up a lot.

Vor Monat
iamthemoss
iamthemoss

Speaking as a person who has worked in tech for decades: the best technology does not always win. I am pulling for ARM; x86/x64 is a security and tech nightmare.

Vor 6 Tage
Walter B.
Walter B.

The die-area comparison of the M1 (5 nm) and the Ryzen 5700G (7 nm) isn't really a level playing field, given that you use it to make the point that the architecture allows the M1 to be smaller and more dense. Other than that, nice video and topic.

Vor 13 Tage
Lars
Lars

Yeah, x86 should be replaced at some point; its insanely good compatibility is what keeps it going. But that doesn't mean the end for hardware as we know it. As long as there's a market for highest-performance graphics cards, they won't go away. An SoC can never reach the pure performance of dedicated GPUs. The best example is laptops, which still run a wide variety of dedicated graphics chips even though basically all of them have integrated graphics.

Vor 24 Tage
James Lake
James Lake

Hm. I'd say there's a mountain of "IF"s that need to be cleared before modular PCs and x86 go away.

Apple Silicon is power-efficient, but those designs have been criticized for poor repairability and upgradability. Apple has sort of gotten away with it because expectations of repairability were already low, but every PC becoming a black box that requires an authorized technician and specialized tools won't sit well with many people. The DIY market isn't large, but it does represent billions of dollars per year; there's going to be pushback from both sides against any attempt to eliminate it.

Qualcomm has failed to compete, and I feel like I've heard them claim their next model is gonna be the one several times. Also, jumping from a duopoly to a near-monopoly known for poor long-term driver support doesn't seem like a move companies will be lining up for, and Apple Silicon/Qualcomm-powered devices so far haven't been cheap (although at least the AS ones have been fast).

Windows has a lot of legacy software that either needs to be emulated or left behind. That's either a difficult technical challenge, or it costs every business that depends on some specialty piece of software significant time and money while angering every gamer whose favorite older title becomes unplayable.

So IF people decide power consumption matters more than e-waste, and IF the major players decide to squeeze out a multi-billion-dollar industry and succeed, and IF the ARM processor market gets more competitive in terms of both performance and available vendors, and IF Microsoft can get emulation right... then maybe every Windows PC will be an SoC with soldered-down everything. Not to say that things aren't going to look different in the future, but following Apple into an all-ARM future is easier said than done.

Vor Monat
Jim Bernard
Jim Bernard

@James Savent Good point. On upgradability, I feel like most of us are taken for a ride when it comes to our PCs. I've been building PCs since forever, but every time I'm ready to upgrade I can't, because the better CPU requires a different socket, and the new RAM won't work to its fullest potential if it's not matched with the right processor, and that's IF my motherboard supports it, etc. etc.

Vor Monat
Paul Hughes
Paul Hughes

@Ansuria Yihada Or we could build computers that loop into your home's water tank to heat water. You could do distributed computing like Folding@home when you take a shower or whatever.

Vor Monat
Ansuria Yihada
Ansuria Yihada

Look beyond apple and focus on the message: we really need a shift to reduce power draw. It doesn't need to be apple, cause fuck em. The higher and higher power needs of our stuff need to stop, as we are nearing various (electrical, thermal, etc.) limits. Again, its not about supporting apple.. its supporting lower powered electronics.

Vor Monat
Taxed Burner
Taxed Burner

@Paul Hughes Whether it's subsidized or privatized, cost to the end user doesn't really change. If it's not corporate suits skimming off the top, then it's government bureaucratic nonsense delaying production and ballooning costs. You bring up gas fields but please find me a wind, solar or hydro farm that didn't also cost millions to produce. Let's also not forget that the lifetime environmental impact to produce and run these is only marginally better compared to coal or gas plants. Nuclear is the only truly clean energy generation method but it's also a limited resource like oil and you still have to do something with the waste materials. The biggest problem with nuclear is, at the end of the day, how comfortable do you think the average person is with having a nuclear plant in their back yard?

Vor Monat
MegaMark0000
MegaMark0000

It's going to be 2005-2013 all over again for pc gaming. Was it just me, or did it seem like price to performance got much better around 2014-2015 for pc components?

Vor 24 Tage
nik Lazz
nik Lazz

Very informative video. I don't agree with all of it, but if things keep going this way with AMD, Intel, and NVIDIA, we'll need a dedicated 40 A breaker in our house panel just for the PC, not to mention ridiculously large electric bills if you use the rig a lot. With current energy prices only climbing, having a PC will be like paying a $60+ monthly subscription on your electric bill!

Vor 8 Tage
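The "$60+ a month" figure above is easy to sanity-check. A quick sketch, where the wattage, daily hours, and price per kWh are assumptions rather than measurements:

```python
def monthly_cost(watts, hours_per_day, price_per_kwh, days=30):
    """Electricity cost of running a device at a constant draw for a month."""
    kwh = watts / 1000 * hours_per_day * days
    return kwh * price_per_kwh

# A high-end rig drawing ~800 W under load, used 8 h/day at $0.30/kWh:
gaming_rig = monthly_cost(watts=800, hours_per_day=8, price_per_kwh=0.30)
print(f"${gaming_rig:.2f} per month")  # 800 W * 8 h * 30 d = 192 kWh -> $57.60
```

So at European-style energy prices, heavy use of a high-draw rig does land right around the commenter's $60/month mark; at half the draw or half the rate it drops proportionally.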
Jack Powell
Jack Powell

I think moving to ARM was almost never going to happen, until Apple did it. And I'm not sure anyone could have pulled it off successfully unless the entire market collaborated on it together. Apple, who I hate (though I fucking LOVE my M1 Pro MBP; work paid for it), were able to push it through due to their controlled, closed-box nature and brand power. Now we're seeing a lot of software available on Mac as ARM-native. If Windows were to try again now, it might be a LOT more successful than previous attempts, if the other hardware vendors match.

Fact is, web traffic is now weighted to mobile devices (phones, tablets, etc.), and those are ARM-based. The desktop is increasingly rare, and even laptops to a lesser extent. Many homes just have mobile devices, and a few maybe also have laptops, where M1-style efficiency is crucial, as is driving down costs. Sure, for us high-end gamers it's not where we want to go, but we're not the market majority, and I'm sure for some time our products will remain, until it all goes... cloud?

Either way, for 99% of consumers, I'd suspect a low-power-draw, reasonable-performance, low-cost ARM device would suffice. For some reason our phones can often have more performance than budget laptops, despite the huge miniaturisation in those devices driving up cost, with insane screens on top. I can't fathom why the $500-700 laptop market sucks ass so bad when a $500-700 phone can be quite amazing.

Vor 26 Tage
depth386
depth386

Since the late 90s I have swapped GPU and kept the PSU, CPU and motherboard at least once per computer. Sometimes RAM and Storage changed, sometimes on individual timetables. I briefly even used SD RAM with a Pentium 4 because I was on a budget and the motherboard had one slot. The SoC approach has some merits but unless something changes majorly with the evolution of game graphics over time then you’re gonna be buying all new a lot. Offsetting this may be economy of scale at the modest spec level.

Vor Monat
itsky
itsky

I've been building my PCs since the early 90s. My lab consists of custom-built PCs/workstations or customized server-platform systems. The only non-customizable desktop that I have is an M1 Mac Mini, which I turn on once a month to run updates and crunch an occasional video. Going totally SoC would limit my customization and add expense. SoC systems are fine for those who want to surf the web, check email, and do other simpleton tasks.

Vor 2 Tage
Wayne Anderson
Wayne Anderson

... and Qualcomm's pricing model is borderline predatory. The last 2 generations of chips have seen Qualcomm escalate pricing so much that nearly every manufacturer with access or deals with multiple suppliers has attempted to use Samsung sourcing or co-development to manage BOM cost. Qualcomm is the "best of the not-Apple", but with such aggressive pricing it has forced its own customers, the phone and device makers, to find alternatives.

Vor 22 Tage
Breck the Yeen
Breck the Yeen

Apple needs to get software! Love my M1 air and M1 macmini but I'm in a gaming drought since switching over. The stuff that has been made is fantastic though. I might go back to a PC for home desktop just because of that, but my laptop it's a no brainer, it lasts days!

Vor Monat
Yeezet
Yeezet

Linux has better support than Mac for gaming. Most M1 stuff is supported now.

Vor Monat
Nicholas
Nicholas

I'd argue that this has already happened: mainstream PC use has become laptop use, and laptops are exactly these devices, built on an Intel SoC and relatively unrepairable and compact. Gaming PCs exist only for an enthusiast crowd, and I believe they will stay this way. No SoC will compete with a dedicated GPU in performance, not for a very long time. Gaming PCs may move to ARM; perhaps this is what Nvidia was aiming for by trying to buy ARM. Yet I don't think they will move to an SoC; there are too many compromises in doing that. Consider: a massive single die used entirely for a GPU is still going to be better than a massive single die split between CPU and GPU.

Vor 14 Tage
Adam Flannigan
Adam Flannigan

I’m confused. Did I miss the part where ARM is required to be a soldered cpu? I figured that ARM or RISC-V would take over the compute space eventually but that we would see ARM/RISC-V components just like we see x86 components.

Vor Monat
Brad
Brad

@matt anderson Consumers breathe oxygen. There won't be breathable air left if we continue with this waste of energy.

Vor Monat
Goobfilm cast
Goobfilm cast

@Chris H Just like your current 4k TV from Sony, LG, etc.....

Vor Monat
mutedmutiny
mutedmutiny

@matt anderson keep projecting. Keep coping.

Vor Monat
matt anderson
matt anderson

@mutedmutiny Because they aren't. Because you made up some hypothetical situation doesn't change that. You're confused. I won't bother with quoting your posts tbh idc about the entire conversation because, and I know you want to matter badly, you don't.

Vor Monat
dc ocz
dc ocz

Hybrid will come first, probably emulations of x86 on ARM with hardware support. Remember they aren't CUDA cores

Vor 24 Tage
DFX2KX
DFX2KX

I suspect that PC makers might take a cue from the Raspberry Pi Compute Modules; Nvidia and a few others also made them. This is essentially an SoC on a small daughterboard that can be removed like RAM or one of those eMMC storage sticks. The RAM is sometimes on-die with ARM chips, but it doesn't have to be, and I suspect memory will also be something that's expandable in at least some cases. GPUs are going to stick around for a while, I think, since there are always going to be people who simply want graphical power at any price; my 5700 XT heats up the room and I honestly don't much care. Either that, or you'll end up with ARM chips that still pull 400 W as said GPU gets integrated.

Vor 25 Tage
Lance
Lance

Maybe they will still allow upgradable components? The dimension we are reaching into already knows whats best for us i would imagine.

Vor 21 Tag
Ro Bones
Ro Bones

Always watched Linus, never seen this guy. After not watching for a while I came back, and honestly I like this guy better. Good to see Linus has a great team on his side.

Vor 23 Tage
Undercover237
Undercover237

I’ve seen him many times before on the short circuit channel if I’m right! He’s great tbh

Vor 10 Tage
Mercenary7
Mercenary7

Cool. Super looking forward to the future where everything is one chip and if you want to upgrade or even replace a broken component you need to throw the whole thing out and buy another system from a vendor. Super exciting.

Vor Monat
Cas Cas
Cas Cas

@LukasL34 Yeah, but they don't cost 1k

Vor Monat
AbuBakr Akram
AbuBakr Akram

@Demon Llama It's not really comparable. People are too stubborn to admit it, but Apple really did _invent_ the modern smartphone market and captured a huge segment of it. So, everyone looked to see what they did and got away with so that they could do the same. The PC market is too big and established for that. Apple already made soldered, glued laptops and desktops for years with x86, and there was pretty limited copying of them by the PC world.

Vor Monat
Demon Llama
Demon Llama

@AbuBakr Akram Probably because of what Apple did to the phone market. Apple went anti-consumer and all the other phone manufacturers followed suit. Based on that, and the greed of big companies in general people are assuming if Apple's M1 takes off and the industry shifts away from x86 they're going to follow Apple's anti-consumer practices.

Vor Monat
Skye
Skye

I reckon people will just start making open source hardware. I imagine there will be a LOT of hardware designers and experts who are not happy with this.

Vor 21 Tag
John Madsen
John Madsen

I just got a beelink. Running virtualization on it. It’s tiny 8c/16t 4.2ghz 64gb 3200 ddr4 sodimm ram. Replacing two full tower dual octos. Still need the tower for 12x8tb sas raid 6. So far, it’s much much cooler. And quieter. I like building but I’d do an overkill if I did a build and end up spending triple.

Vor Monat
Dee
Dee

Anyone else here prefer the Windows XP interface? I LOVE how the XP interface looked and operated, and I have used every version of Windows since 3.1 (1992)... except Millennium (a disaster).

Vor Monat
ArcangelZero7
ArcangelZero7

Love XP's aesthetic. I run Manjaro KDE with an XP theme on my laptop. :)

Vor 23 Tage
Zachary
Zachary

It will be worth it. That said, we won't see these smaller-footprint PCs predominantly used for at least another couple of years.

Vor 7 Tage
swiftpotato
swiftpotato

Been through this a few times during my 52 years of life. CPUs come and go, and backwards compatibility is the key issue here. However, if the architecture can also provide that compatibility through some form of emulation or virtualisation, I can see PCs of the future being smaller and less power hungry. In the meantime I am waiting for the next gen of GPUs and CPUs to be released.

Vor Monat
Mud Kip
Mud Kip

@Garrus Vakarian I think you are overestimating people's internet speeds. At least in America there are a lot of rural areas that lack high speeds. They luckily just installed fiber in my area, but before that I was getting 0.7 to 1 Mbps on a good day (and paying $90 a month, because monopolies). 240p-360p was how I had to watch YouTube. Cloud gaming is a good idea; I just think it's really only for people with a solid connection.

Vor Monat
Dave Gunner
Dave Gunner

@Lawyer Lawyer Yeah, definitely. I don't get why they paint PCs as a lot of e-waste in this video. TBF, most Apple tech isn't even designed to last longer than 5 years; that's almost commonplace at this point.

Vor Monat
Lawyer Lawyer
Lawyer Lawyer

@Dave Gunner Yeah, kitchens usually have these kinds of wattages in mind. Just as a piece of useful info, even the most standard electrical installation (in first-world countries) can take at least 2,500 watts. A regular-sized wall vacuum cleaner, which is something you can see in any house, uses around 1,400 watts. There are smaller 700-watt models, but there are also some 3,000-watt models, so 1,400 is very standard. And there has to be some headroom for other electronic devices, so that you don't end up in a house where, if you turn the TV and the vacuum cleaner on at the same time, the lights go off 😅 So yeah, it's very likely that an average upper floor couldn't handle a 5,000-watt oven, but there should be enough headroom for a PC even if it were a 1,500-watt monster.

But this can't continue. If it keeps on, there will come a moment when a year of using a high-end PC costs the same money the PC itself cost. I hope there is a way to keep improving DIY PCs without jumping into APU-type PCs. The PC-building community is huge. It would suck to have PCs become like phones, where high-end means the exact same computer and brand for everyone and it lasts 2 years.

Vor Monat
Benicio Navaira
Benicio Navaira

@Garrus Vakarian True, both solutions have pros and cons. But have you seen how certain software and old games cost a small fortune to purchase? Really, if there is a compatibility issue, you could be sitting on a gold mine, so hold on to things that you think might be worth money in time. Anyway, best to you, and keep space criminals in check!

Vor Monat
Dave Gunner
Dave Gunner

@Lawyer Lawyer Yeah, no problem! I think he had his settings on high for that part of his test, but the moment he moves it up to 4K ultra the performance drops to 50ish, I think. And I didn't know that, actually, but most kitchens and such are wired to compensate for high power draws; depending on the house, some upper floors probably aren't (I'm guessing mine isn't). But yeah, it's the cost that's the issue. I reckon in the future it will come down to machines like the one in this video, but I can't see it wiping out DIY PCs; I just think it'll potentially push out the entry-level market. Being able to build, disassemble, clean, and service my system myself is very important to me.

Vor Monat
Brachra2055
Brachra2055

Honestly, i love my custom PC with all the lights and customization. But If this tech can meet actual needs for my uses (Gaming mostly) and well not just "Playable" and also reduce my power bill by 80% and not turn my room into a space heater? Im not gonna complain. Edit: Unless I have to buy apple products. No. just no.

Vor Monat
Kuza
Kuza

@transformer stuff Why do you think i'm here sharing the gospel? Do you think I want to be doing this or because a supernatural factor is encouraging me to? Look at the Book of Daniel and the Book of Revelation and then look at how history has unfolded, specifically the Roman Empire. Its a major redpill

Vor 29 Tage
transformer stuff
transformer stuff

@Kuza gods are for those who have little knowledge.

Vor 29 Tage
Kuza
Kuza

This world is rapidly passing away and I hope that you repent and take time to change before all out disaster occurs! Belief in messiah alone is not enough to grant you salvation - Matthew 7:21-23, John 3:3, John 3:36 (ESV is the best translation for John 3:36) if you believed in Messiah you would be following His commands as best as you could. If you are not a follower of Messiah I would highly recommend becoming one. Call on the name of Jesus and pray for Him to intervene in your life - Revelation 3:20. Contemplate how the Roman Empire fulfilled the role of the beast from the sea in Revelation 13. Revelation 17 confirms that it is in fact Rome. From this we can conclude that A) Jesus is the Son of God and can predict the future or make it happen, B) The world leaders/nations/governments etc have been conspiring together for the last 3000+ years going back to Babylon and before, C) History as we know it is fake. You don't really need to speculate once you start a relationship with God. Can't get a response from God? Fasting can help increase your perception and prayer can help initiate events. God will ignore you if your prayer does not align with His purpose (James 4:3) or if you are approaching Him when "unclean" (Isaiah 1:15, Isaiah 59:2, Micah 3:4). Stop eating food sacrificed to idols (McDonald's, Wendy's etc) stop glorifying yourself on social media or making other images of yourself (Second Commandment), stop gossiping about other people, stop watching obscene content etc. Have a blessed day!

Vor Monat
transformer stuff
transformer stuff

apple will never understand gaming.

Vor Monat
Casson Dennison
Casson Dennison

Honestly, I don’t believe it will kill traditional PCs. The main reason traditional PCs are alive right now is it’s ability for personal customization and utilization. Apple is notorious for not being able to fix things cheaply, usually buying a new one makes more sense. And all of the PC component companies will most likely lobby and work something out to keep them alive and thriving. Idk

Vor 17 Tage
MrLeva115
MrLeva115

I finally upgraded my PC after having it for almost 8 years lol. I think I’m set for another 8

Vor 27 Tage
DJRonnieG
DJRonnieG

If we can fit the performance and utility of a high-performance PC inside a small package, imagine if we used that same tech but put "more" of it into a larger package. That's the way I see it and attempt to implement my builds. On the other hand, I have a friend who does all of his gaming on a home server (a used rack-mounted enterprise-grade machine) through his laptop. This works for him since he has two kids playing Minecraft on laptops: instead of maintaining separate installations, he just runs it on a Windows Server instance with some other front-ends and things like Guacamole. I still much prefer to have a powerful machine near my desk, but I do split the workloads between it and my server. For example, I used to keep my desktop on 24/7; nowadays I put it to sleep and only wake it up remotely when needed.

Vor Monat
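Waking a sleeping desktop remotely, as described above, is commonly done with Wake-on-LAN. A minimal sketch of the magic-packet format (6 bytes of 0xFF followed by the MAC address repeated 16 times, conventionally broadcast on UDP port 9); the MAC below is a placeholder, and the target machine's NIC and BIOS/UEFI must have WoL enabled:

```python
import socket

def magic_packet(mac):
    """Build a Wake-on-LAN magic packet: 6x 0xFF, then the MAC repeated 16 times."""
    mac_bytes = bytes.fromhex(mac.replace(":", "").replace("-", ""))
    if len(mac_bytes) != 6:
        raise ValueError("expected a 6-byte MAC address")
    return b"\xff" * 6 + mac_bytes * 16  # 6 + 6*16 = 102 bytes total

def wake(mac, broadcast="255.255.255.255", port=9):
    """Broadcast the magic packet on the local network."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(magic_packet(mac), (broadcast, port))

# wake("aa:bb:cc:dd:ee:ff")  # placeholder MAC; substitute your machine's
```

The packet contents are all the protocol requires; the NIC watches for the pattern at the link layer, which is why this works even while the OS is asleep.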
Toby Roberts
Toby Roberts

To be honest, I'm surprised that enthusiasts have managed to keep the DIY build PC game going for so long. So many other hobbies and just things in general are taking away input and choice for consumers. I hope it hangs on a bit longer - I love my stupid PC because it was me who put it together.

Vor Monat
Spartan1312
Spartan1312

@Gamer Guy Hence why I will never buy one of those cars. I will never allow a machine I own to work against me or out of my control and if I can not "own it" I will never buy it.

Vor Monat
Ryan Thompson
Ryan Thompson

@Gamer Guy No... it doesn't "unlock" horsepower if it's using specific code made by Tesla that was hidden. I hate when people that don't understand programming pretend they do. Seat heaters don't magically go "brrrrr"; they required lines of code to operate. Even if you "unlock" the heater, you HAVE TO PROGRAM THE ENTIRE THING TO WORK. Unlocking a seat heater is just unlocking glorified chunks of metal; it doesn't do anything until you tell it to do something. If you're not creating your OWN program, you're stealing theirs, PERIOD. By the way, they don't actually tell you whether their programmers created their own code; they just claim it unlocks features... features that have code built BY Tesla. Beyond that, not only does that company operate in Canada, it operates in QUEBEC, Canada, which presents some very big issues when it comes to enforcing UNITED STATES LAW. Even Canada barely has any say over Quebec, so how would the US?

Vor Monat
TriggeredPandora
TriggeredPandora

@Ryan Thompson Yes I know that CPUs and GPUs are limited in their clocks to not fry themselves but that doesn't mean you can't overclock them a bit if you desire slightly more performance with less stability, for FREE. There shouldn't be anything stopping you from doing the same with your car. If you want to increase performance at the cost of lifespan it should be your decision to make but not the default setting. The fact such a feature is behind a paywall in the first place could also be misleading customers into thinking it must be a no compromises upgrade. Now you have unknowing customers who are paying money to kill their batteries.

Vor Monat
Dusty Floor
Dusty Floor

Over the years I've watched Linus Tech Tips completely destroy valuable technology... as a joke.

Vor 2 Tage
Gigawatt
Gigawatt

Anthony has come so far! I’m proud and happy to see his personality shine through!

Vor 22 Tage
Wotzinator
Wotzinator

I would love it if we could have both options. I love arm and want it to replace x86 and think socs to be pretty cool, however do we really need to kill graphics cards for that? Or modularity in general?

Vor 29 Tage
Cid Sapient
Cid Sapient

@Wotzinator PC hardware used to be pushed to the edge. Even just 10 years ago, when I was a kid, we pushed hardware to the point we needed graphics accelerators and then discrete GPUs; then cryptography/mining started pushing GPUs even further.

Vor 25 Tage
Wotzinator
Wotzinator

@Cid Sapient pc hardware isnt pushed. Console devs regularly push hardware

Vor 25 Tage
Cid Sapient
Cid Sapient

Really, I don't think it should. The problem is the software industry (why I moved away from Windows entirely). I've been gaming since the 90s, and games are not pushing the limits of hardware anymore; they are pushing the limits of graphics engines. So many games I'm playing right now need to just give up and switch to the new Unreal. MWO is one game that could have a major resurgence right now, but it's so outdated.

Vor 25 Tage
josephjdesouza
josephjdesouza

The initiative to get away from the x86 CISC architecture is a long-standing one. Last attempted in the PowerPC and Itanium era, it never pans out, because forced upgrading only seems to work when baked into the ecosystem. Layering features into apps based on architectural design and generation is intrinsic to the mobile platforms, but desktops are another story. What this really means is that the desktop platform must go away, and all performance benchmarks will be normalized to the platform the manufacturer sets. The public is not used to being beholden to the manufacturer for support, as the Windows App Store has proven.

Vor Monat
dylf14
dylf14

Every time I hear about how RISC is a much simpler instruction set and how CISC is simplifying with extensions, remember that RISC is increasing its own complexity as well; both sets are moving towards a common center. A RISC-based system requires API makers, driver teams, and OS teams to reorient themselves to use more instructions to accomplish the same task. Software designers don't want to spend lots of time optimizing, and would rather move on to another project, as that works better from a revenue perspective. Think about how much we ask game developers to optimize games for certain archs, and they just abandon the game and move on to the next title...

1 month ago
Anony Mousse

@Schule04 They are ABIs, and as such they have very little to do with how the processor is implemented; they only define the programmer's interface it exposes. People need to quit repeating that line, because CPUs aren't CISC or RISC internally; those labels describe the external interface.

1 month ago
Schule04

Not to mention, all modern CISC CPUs are internally RISC anyway

1 month ago
FrenchFriedLegion

@Democrab Yeah, that was also my thought; I could not see how it would be irrelevant. I also recently worked on custom extensions for a RISC-V based CPU to accelerate certain tasks in software, so my guess was that either he was an actual engineer on a CPU design team somewhere or he didn't really know what he was talking about. Yes, haha, that was also something I had to look into on my former project, since data paths should not become too long. I'm glad I didn't just mix everything up, thanks!

1 month ago
Silver Knight PCs

@Kaz Jim Keller has barely been involved with CPU design in the last decade. He's quite an enigma: some say he's a genius, others that he claims credit for his subordinates' work. History will tell

1 month ago
sznikers

@Roger Coker Yeah, but we've been hearing that for a long time now, and there are still no CPUs with that tech on the market. So either it's too costly to manufacture as an end product, or they know current tech still has enough headroom that they don't care about using light yet. BTW, didn't the first Intel dual-core use light guides as the interconnect between the two dies? They were pushed to the wall back then, though.

1 month ago
