RTX 4090 & A770 Margins Leak: Are Nvidia and Intel making big profits this Fall?

classified information

I LOVE how the whole community is all in on the 4080 12GB *cough* 4070 joke

2 months ago
Eric Witte

The card itself is a joke.

2 months ago
Ben C

@David Ngo right, that was an odd gen

2 months ago
David Ngo

@Ben C you forgot the 1660.

2 months ago
Michael Toney

I don't think of it as just a joke, but more of a warning for buyers: if you buy this card you are not really getting an 80-class card but a 70-class one. I will be going with AMD this time unless Nvidia makes some huge changes to pricing. Nvidia seems to think crypto prices were OK, but doesn't want to face the fact that cryptocurrency is dead when it comes to needing high-end cards for hash rates. If I had to bet, I would say over 70% of their cards went to crypto mining and new miners trying it out, a few to end users who had money from government checks, and the rest to well-off high-end enthusiasts. But at the prices they are asking now, I can't help but think they've priced out a huge amount of their market. With the current state of the US economy, I really think they shot themselves in the foot.

2 months ago
unmaker

If AMD has balls, they will name their entire line "7900xt" with some other modifier to differentiate between them.

2 months ago
James Smith

The problem I have with Mr. Jensen on exploding video card prices is this: a 4c/8t 2600K CPU launched in 2011 at $320 (the GTX 580 was $479). Today, a 6c/12t Intel CPU is $250 (RTX 4080 = $1200). RAM is faster today than 10 years ago and is cheaper. Storage is faster today than 10 years ago and is cheaper. Every component is cheaper today... EXCEPT video cards. What has grown the most since 2016 is Nvidia's market cap, as much as 40x. I know it's not apples to apples, but I am still VERY suspicious.. 🤔

2 months ago
AwesomeBlackDude

@BeetleBuns hmmm...

2 months ago
Rudy Sal 14

@AwesomeBlackDude pricey? Pricey is a stick of DDR PC100 256MB for about 100 bucks in the early 2000s.

2 months ago
Camilo Buritica

And then Nvidia aiming for a Mac/Apple like ecosystem... Not scummy at all!!! /s

2 months ago
BeetleBuns

@AwesomeBlackDude why tf would you cherry pick expensive versions of shit to claim that it's all expensive? That's like someone saying that all meat smells bad because they saw a dead deer on the side of the road...

2 months ago
AwesomeBlackDude

@MrBashem hmmm...

2 months ago
necrotic256

Damn, AMD really hit Intel hard with the latest price drop. The A770 8GB even at $320 is DOA with current 6650 XT prices.

2 months ago
Duke49th

It's not an AMD price drop. AMD is still at the "old" price; it's Newegg having a sale, and too many YouTubers spreading misleading information.

2 months ago
John Scaramis

@Matthew Bossert Does the AV1 encoder actually work? From what I know/heard, people who bought an Arc card specifically for that have had huge problems getting the AV1 encoder running, if they get it running at all.

2 months ago
Andy Ruse

I’d say it depends. I’m betting Intel tries to compete for OEM contracts. Buy an intel CPU and A770 and pay less than the combined price. It doesn’t have to be the cheapest GPU on the market to sell. It just has to appeal to corporations like HP, Dell, maybe Lenovo or Acer.

2 months ago
aquapendulum

Considering the state of the Intel Xe driver, I'd still pass on it at a $250 price tag.

2 months ago
Preda Alex

@TummyTums It performs better until it turns your computer into a brick.

2 months ago
Noah Mayer

AMD was smart. From the leaks, the high-end cards (7900, 7800?) are on 5nm chiplets, but only the heavy-lifting parts. That keeps the expensive portion minimal, with the rest on established 6nm. The 7600/7700 on monolithic 6nm will be interesting to see against the 4060 and 4070/4080 12GB.

2 months ago
Lemon Juice

@Twan Duvigneau I see a lot of people saying the same thing, but in fact Navi 33 will be the 7700 XT, because it has half the CUs of N31 and will perform like a 4080 12GB, so there's no point in making a 7600 XT with this level of performance given the competition.

2 months ago
Twan Duvigneau

The 7700 is not monolithic, I believe; Navi 31 and 32 are chiplet, Navi 33 is monolithic.

2 months ago
Lemon Juice

@Nathan Gamble Navi 33 has a little more perf per clock than the RX 6800 but will have much higher clock speeds and will be way cheaper to manufacture; I think it will land around RX 6900 XT performance in rasterization. And with the 4080 12GB performing around the same speed as the 3090 Ti, Navi 33 will be at 3090/3080 Ti level. All this with a 203mm² die at 6nm. Holy shit, this will be the legendary graphics card of the coming years; everyone will buy it.

2 months ago
Антон Краценюк

@Lemon Juice I think that means they'll rebrand the 50-series, like the 6650 and 6750.

2 months ago
Антон Краценюк

@Lemon Juice True, I don't know why they can't do RDNA 3 across the whole lineup.

2 months ago
CasualPhilosopher

NVIDIA BOM cost factors in a 40%+ margin. How sad and unprofitable would it be if they didn't get their massive margins.

2 months ago
HenryViii Fake

@i077 "Clearance stock"? Give this guy a gig, he's a top-tier comedian! 🤣🤣🤣

2 months ago
Camilo Buritica

This channel gives me a big "don't mess with the multi billion dollar company" *fat guy wielding a sword* meme vibe every time he defends the pricing of hardware.

2 months ago
Si Smith

@i077 Think you'll find the wafer manufacturers like TSMC are the ones doing the research into 4nm processes; they build the fabs, not Nvidia, who just buy the wafers. I find it funny how every other IT sector doesn't make Nvidia's margins but still innovates just fine. Maybe it's that Nvidia has the monopoly at the high end, the monopoly on graphics cards for 3D rendering, the monopoly on cards designed for AI... that is why they have massive margins: they can, as long as they don't have any real competition.

2 months ago
BeetleBuns

@Todd Peterson he's not wrong, they have a right to profit off what they make, theoretically. However, we also have a right to refuse to buy their overpriced garbage.

2 months ago
Sneg Mann

@i077 oh shut up about "R&D", if they even spent 50% of what they paid their top management on R&D we'd be flying around in emissionless cars by now. R&D is a lame excuse for overcharging.

2 months ago
Crazy Beatrice

Let intel do some stuff in the low-end for a few generations. Maybe they can compete after they get software fixed up and hardware issues solved.

2 months ago
GraveUypo

If they do it well, more power to them, because AMD and Nvidia seem to be neglecting it.

2 months ago
SuperFlamethrower

The A380 is priced the lowest at $140, $10 cheaper than the RX 6400 whose performance it matches. If it weren't for driver issues, it would beat the RX 6400 in value. That makes me optimistic about the pricing of the rest of their lineup; the wildcard is of course those driver issues. If I were a betting man, my money would be on them pricing a little better than the competition in terms of raw performance. Like the A770 for $280, and losing money on every card sold.

2 months ago
Jon Martin

I think pride is a problem. It must be very difficult for Intel to say that it is going to take them at least five years to get a handle on discrete GPUs and that they are going to focus exclusively on the low end during that time. That's sort of what the messaging was in the beginning, but then they started aiming high. I don't know how much of that can be blamed on Koduri and how much on Intel's whole culture not being able to restrain itself. I do hope they stay with it (of the three, Intel is consistently the most open-source friendly), and I do think that going OEM-only for at least the next few years is probably the best move. Intel's top brass needs to give AXG a strict power cap, maybe 50 W. Figure out how to make a 50 W card that consumers start demanding a retail version of; only then would Intel be ready to slowly move upmarket.

2 months ago
Adam Kallin

But if it turns out they can’t compete they will have thrown a lot of money in the trash.

2 months ago
Tri Nguyen

@Dralor D Agreed. Optane only lasted 5 years before being killed off.

2 months ago
OneDolla Bill

Nvidia has basically pushed their arch to the edge. They need to pump in a ton of power and add more compute units to get performance gains; the architecture itself gains minimal uplift. Meanwhile, AMD has been pretty good at getting solid gains at the same power while moving to a smaller node. So RDNA seems to be scaling a lot better than what Nvidia has to offer.

2 months ago
Twan Duvigneau

Scalability was a huuuuuge part of the RDNA design

2 months ago
lasthopelost

I think, just like Intel, they're pushing it with their hardware, whereas AMD has a better layout that just gets better and better. But I guess we will see with RDNA 3.

2 months ago
BeetleBuns

Yup. And yet I still get called poor by Nvidia fanboys when I say the new cards are shit.

2 months ago
aisolutions india

@Anmol Agrawal Yeah, a given mark-up works out to a lower margin, so a 40% gross margin is higher than a 40% mark-up; that's simple math. Restaurants actually aim for 40% gross margins [on materials], not mark-ups.
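(A quick sketch of the distinction being argued in this thread, with made-up numbers purely for illustration: margin is profit over revenue, mark-up is profit over cost, so the same percentage means different profit.)

```python
# Illustrative only: 40% gross margin vs. 40% mark-up on an assumed $100 cost.
cost = 100.0

# 40% gross margin: margin = profit / revenue
margin = 0.40
price_margin = cost / (1.0 - margin)   # revenue needed for a 40% margin
print(f"40% margin  -> price ${price_margin:.2f}, profit ${price_margin - cost:.2f}")
# 40% margin  -> price $166.67, profit $66.67

# 40% mark-up: markup = profit / cost
markup = 0.40
price_markup = cost * (1.0 + markup)   # revenue from a 40% mark-up
print(f"40% mark-up -> price ${price_markup:.2f}, profit ${price_markup - cost:.2f}")
# 40% mark-up -> price $140.00, profit $40.00
```

On these assumed numbers, a 40% margin yields $66.67 of profit where a 40% mark-up yields only $40, which is the comment's point.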

2 months ago
Anmol Agrawal

there's a difference between *mark-up* and *profit margin*, you know

2 months ago
Bob Dole

Wow the margins of Arc just show the genius of AMD and how making a profitable GPU division is incredibly difficult even for billion dollar companies 🎉 ❤

2 months ago
WingofTech

Yeah, AMD's been really hard at work on their GPUs, since Nvidia is an employee magnet!! 🧲

2 months ago
ARM-POWER

I'm gonna buy Intel's Arc as a rare collector's item. One day that card will be as expensive as an Apple 1 is today 🙂

2 months ago
RobBCactive

@lll I don't believe it's purely a driver issue with Arc; there are holes in what the AXG marketing said. Dr. Ian Cutress (TechTechPotato) has an interview with Lisa Pearce that shows they knew years ago.

2 months ago
lll

@RobBCactive Yeah, whoever thought reusing (crappy) iGPU drivers on a discrete card was a good idea is a fool. Maybe they thought it should "just work" like on a console, forgetting that there are a million games that need driver fixes.

2 months ago
SuperFlamethrower

@Tri Nguyen They're going to have to take a loss on them if they're going to move them. I think this is mentioned in the video.

2 months ago
A Martin

Just picked up a 6650 XT for $280. Huge improvement over my 1060 3GB. I was fully ready to drop $700 if it was worth it, but it isn't right now. Nvidia lost my money.

2 months ago
phonetic

@The Echelon Magine being a gaymer boy

2 months ago
The Echelon

@phonetic Imagine being an Nvidia fanboy in this day and age, just sad.

2 months ago
Wawa Weewa

@RAUL 🤣

2 months ago
TK

I have ATI graphics cards that still work. A pair of HD 7770s work great, and I keep them on hand as bridge/test cards.

2 months ago
Dominic M

@phonetic My buddy has been running his reference 6900xt he got from the AMD website lottery(he wanted 6800xt, but was only offered 6900xt) since it launched. He hasn’t had a single issue with it.

2 months ago
traingp7

The A770 16 GB card is going to be a collector's item Linus talks about in 10 years, when he's in his mid-40s.

2 months ago
Earth Taurus

@White_Bunny I do hope Intel shapes up, as we seriously need the competition in the GPU space; sooner or later AMD will also hit the wall. These things happen in cycles, ones Nvidia is trying to re-invent by becoming the sole supplier and sole manufacturer of Nvidia GPUs.

2 months ago
White_Bunny

We still need Intel in the GPU business, because we have a duopoly and it's not going to be good if both of them can get away unchallenged. This is good for consumers in any case.

2 months ago
harry bryan

I am hoping to get one.

2 months ago
Obnoxious Potato

I doubt I'll even be considering a 40-series card. Prices are far too high, the new power cable is badly designed, and I wouldn't be surprised if it causes a fire or two.

2 months ago
beep2bleep

Really, thinking about it, there's nothing wrong with Intel using Arc to build better low-end GPUs (possibly integrated again) and keeping the in-house knowledge alive while waiting for another time to strike. Obviously they need to work on getting good drivers that cover the whole spectrum of games. They can use the next two years to keep releasing some GPUs and refining drivers. The next two years seem likely to be rough, with the used/extra GPUs needing to filter through the market. It really seems like Intel's best move is to wait and prepare instead of jumping into the buzz saw of the current market.

2 months ago
50 Shades of Beige

I'd buy an RX 580 or even a 5500 XT before the 6500 XT. Seems silly to me to buy a discrete card that lacks hardware encoding.

2 months ago
Canalp Daşer

The ball is in AMD's court now. All they have to do is not screw it up, provide what they say when they say it, and this could be a very interesting GPU generation.

2 months ago
PandemicGameplay

Your videos are extremely useful. I hope to see you continue on the platform with your hardware news; it's the best on the internet right now, and I don't say that lightly.

2 months ago
Geofrey Marcel

Quite easy to see that they are milking the 4080 12GB when the 4090 is twice the card for considerably less than double the price. It's highly unusual for the high end to be cost-efficient. It happened with Zen 3 too, where the 16-core was roughly double the price of the 8-core. But 5-15 years ago, the top end was more like 10% more performance for double the price.

2 months ago
LordBattleSmurf

If AMD was smart they'd use this opportunity to keep prices significantly lower in order to gain mindshare. Unfortunately they'll probably raise prices for higher margins and be just slightly cheaper than Nvidia

2 months ago
CheruPrime

@gtawarlord Yeah, the "community" always hopes that AMD creates something good for a good price, so that Nvidia has to lower prices and they can buy a GeForce.

2 months ago
gtawarlord

The last time AMD was beating Nvidia and kept their prices significantly lower, everyone still bought Nvidia. They might as well sell for better profits so they can keep innovating and beat them in the long run.

2 months ago
Mickaël's Flow

We know the price will be higher. It has to be (sadly). Yet... how will we know why? Of a 20% increase, how much is "AMD wants higher profit" and how much is "the die costs more, passed through 1:1"? How do you expect to know, not just see, the difference?

2 months ago
Benjamin Lynch

Need to remember that even after the gong show of the last earnings release from Nvidia, their gross profit margin as a company is still north of 60%. For every $1,000 in revenue, they only have $400 of cost, and the remaining $600 is profit. They could cut prices by 20% (from $1,000 to $800), and even with the same $400 cost they're still making 50% margins. For the AIB partners it's obviously a bit different, as Nvidia (and AMD) set the AIB cost and then simultaneously put a ceiling on their profit margins by also establishing MSRP. There's still 60-65% margin in these cards, but Nvidia is taking the first 60% and leaving the scraps for AIB partners. I for one am not going to shed any tears for Nvidia if they have to lower pricing to compete. Even with higher wafer costs and memory pricing, they can absolutely still make an acceptable profit margin (20-30%) on every consumer GPU in their lineup with pricing under $1,000 and midrange cards at or under $500.
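(A minimal sketch of the arithmetic in this comment: how gross margin falls when price is cut but unit cost stays fixed. The $1,000/$400 figures are the commenter's illustration of a company-wide margin, not per-card data.)

```python
# Gross margin after a price cut, holding unit cost constant.
def gross_margin(price: float, cost: float) -> float:
    """Gross margin = (price - cost) / price."""
    return (price - cost) / price

cost = 400.0                       # the comment's assumed cost per $1,000 of revenue
for price in (1000.0, 800.0):      # before and after a 20% price cut
    print(f"price ${price:.0f} -> margin {gross_margin(price, cost):.0%}")
# price $1000 -> margin 60%
# price $800  -> margin 50%
```

A 20% price cut only drops the margin from 60% to 50% on these numbers, which is why the comment argues there is room to cut.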

2 months ago
Feel The Bern

@Aero I'm not saying that RTX should operate at a loss. I'm saying that they don't. Their margins are more than fine, and nothing justifies this big of a jump in pricing.

2 months ago
Aero

@Feel The Bern You misunderstood what my response meant in relation to the original comment. RTX cards cannot operate at a loss just because data center products carry the profits of the company. RTX cards do not sell with 60% margin just because the Nvidia reports a total gross margin of 60%. Those were my main points.

2 months ago
Feel The Bern

@Aero You're comparing apples to oranges. Data center clients are typically corporations (usually rich and profitable), whereas GPU clients are just regular average Joes with regular disposable incomes. The reason we (average people) are not served 128-core CPUs is that those CPUs are reserved for high-end clients who can afford to pay tens or hundreds of thousands of dollars for the best processors and servers out there. Does the fact that Intel serves the data center market mean they have to increase prices for regular-consumer CPUs by 70%? No. No it doesn't. Nvidia has no way to justify forcing regular people to pay the same extortionate prices that data center clients do, because regular people don't have that kind of money. It's that simple. You're suggesting that just because Nvidia makes data center products, they should either ditch the GPU market entirely or squeeze all the blood from that stone. That's absurd. It's a completely false dichotomy. These two markets can co-exist at different price points and profit margins, because they serve two completely different target audiences.

2 months ago
PAcifisti

@J B NVIDIA's profit margins have been steadily going up. Yes, the larger cards cost more to manufacture but they also take a higher cut for each. Don't whiteknight for the megacorporations, they're not your friends nor do they care about you. Their real customers are the shareholders - you're just a paypig. Unless your kink is capitalist findom, don't enjoy it.

2 months ago
wuerger 32

@J B Yeah, all the prices have risen significantly in the last couple of years, and it will only get worse. I didn't calculate that because he didn't; his calculation is based off the finished card, and component shipping should be calculated within component costs. I remember in his video about shipping costs he kept going on about how ocean freight costs went up 6x, which is why prices have gone up so much. If terrestrial rates had gone up that much too, I would buy it, but they haven't; I checked with friends who still drive trucks to make sure. Freight costs coming out of LA are relatively cheap compared to going into California. Take the 1,500 miles from LA to Kansas City: say $2 per mile before the rate increases; at a 6x increase it would be $18K, about the same as the ocean freight at the time. That would add approximately another $1.80 or so per card, putting you at a little over $3 total shipping to a distribution center and leaving $27 or so for the trip to the final destination. Only specialized loads get anywhere near $12-per-mile freight costs; there are a lot of loads out there at the $4 mark, and some still at $1 per mile without the fuel surcharge. I'm sure when they calculate shipping costs it's to the furthest point on the chain, say LA to New York City; even then you are looking at around $5 per card at the 6x rate. Even figuring 5 lbs per retail box on a 3060 (a bit high), you can still fit 10K of them in a dry van; semis weigh around 30K pounds empty. My 3060s had the worst packaging I've ever seen: a plastic bag wrapped in a foam sheet about 1/4 inch thick, in a box just big enough to fit it. I bought two of them to play around with rendering 3D things. I know it's a simplified version; it's just to give an example. My conclusion is based on the principle of Gell-Mann Amnesia: if I see something that is completely off or wrong, I can't believe the rest of the data.

I agree with you about inflation. Some of the food I buy has gone up 100% where I live: cheese, frozen pizzas, and a lot of meat especially. With inflation factored in, the prices have actually come down, even if you only figure 15%, and I'm sure it's a lot higher than that, minimum I'd say 30%. It's hard to tell with a lot of it because of rampant shrinkflation; I noticed the other day the chips I buy lost another 0.5 ounces from the bag.

You are upset about the naming of the 80; I'm the opposite, I'm upset at the 90. I consider it to just be an 80 card in disguise, especially this gen. To me, it appeared last gen in order to raise the price of the top SKU by $500: the MSRP of the 2080 Ti was $1,000, a price it never hit. If people would pay that, they might pay a little more, so why not, and it worked: add more VRAM and poof, it's a new SKU. At least the 3090 had basically the same specs as the 80 Ti, so you could save a little money if you didn't need the VRAM. Now the 80 Ti and 90 won't even be close in specs; there is a 6K difference in CUDA cores just between the 80 and the 90. And everyone is only focused on the 12 GB version. Nvidia won again: last gen all that was talked about was the $700 80 ("look at that price drop! Value has returned!"); now all they are talking about is the 12 GB ("that should be a 70 card!"), completely ignoring the glaring gulf between the 80 16 GB and the 90. The 80 Ti used to be the enthusiast class; now the 90 Ti is.

Imagine it was still called the 80 Ti and then take a second look at the prices, and ask yourself if prices have remained close to the same. It hasn't launched, of course; based on current prices I'd guess a minimum of $1,800.

And the last thing: I don't buy for a minute that Nvidia designed the Lovelace die to be expensive based on sales of the last-gen cards. If I remember correctly, AMD stated at one time that the team of designers that worked on, say, the 2000-series CPUs would finish that design and then start working on the next one in development, not the 3000 series but what would become the 5000 series. Why? Because a different team was already working on the 3K series long before they finished the 2K. They share information to design the next gen better along the way, but people from one team don't also design on the other team. I'd imagine Nvidia is the same: Ada was in the design process long before Ampere launched. The added cost for Ada could come from the fact that Nvidia had to crawl back to TSMC, though; I don't doubt that one bit. TSMC has said they are a pain to work with, and TSMC raised manufacturing costs too. But designed to be expensive? No, I don't buy that for one minute. Forced to raise costs by pushing the design to the max? That I would buy, because AMD is catching up. That, and Nvidia completely over-ordered wafers for the 4K cards; there are articles out there where TSMC told them to suck it up: they ordered them, they are paying for them. I'd guess a lot of those wafers may never make it to market due to lack of demand, driving up prices of the cards we would buy. Edit: One last note: even though AMD's new GPUs aren't on Nvidia's "specialized" node, we'll get a comparison of how TSMC's costs look based on their prices when they launch.
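(A sketch of the per-card land-freight arithmetic in the comment above. Every number here is the commenter's assumption: the $2/mile rate, the 6x increase, and 10K cards per trailer. None of it is verified data.)

```python
# Illustrative per-card land-freight cost, using the commenter's assumptions.
miles = 1500                 # assumed LA -> Kansas City distance
rate_per_mile = 2.00         # assumed pre-increase trucking rate, $/mile
increase = 6                 # the claimed 6x rate increase
cards_per_trailer = 10_000   # assumed GPUs that fit in one dry van

load_cost = miles * rate_per_mile * increase    # $18,000 per truckload
per_card = load_cost / cards_per_trailer        # $1.80 per card
print(f"load ${load_cost:,.0f} -> ${per_card:.2f} per card")
```

Even at the inflated 6x rate, land freight adds under $2 per card on these assumptions, which is the comment's point about freight not explaining the price hikes.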

2 months ago
Big D

I do not think Nvidia will sell as many RTX 4000 cards as they want to, and I think they are overestimating how much people are willing to pay for their cards.

2 months ago
Big D

@Saverio Gizzi Considering how bad the economy is and will become in the next few months, very few people will have money left after buying necessities. A currency crisis is also brewing in Asia, the UK, and other places around the world.

2 months ago
Saverio Gizzi

Don't underestimate how many idiots walk among us 😂

2 months ago
Big D

And of course there is practically zero demand from miners now. Ethereum had well over 90% of the market cap of the GPU mineable coins.

2 months ago
Big D

@ogaimon The economy is much worse now than it was just 1-2 years ago. Here in Europe the economy is completely fucked now, and it is most likely only going to get worse. I just don't see demand for luxury products such as graphics cards staying at the level it has been for the last 3 years or so when the economy is this bad.

2 months ago
ogaimon

No, they will sell just as much, and I have no doubt about that. Like every gen, consumers don't disappoint: they will pay anything at any price. You are underestimating how dumb first-world people are, how much money they make, and how much they can afford to throw away.

2 months ago
B Fish

I'm sure Nvidia is going to get taken apart for the price/performance of the "4080" 12 & 16GB. The "should have been a 4070" looks like what a 70-class card usually is: previous-gen flagship performance. However, at almost double the price, I think we'll see them bloating shelves. By trying to price Lovelace so Ampere would sell at bloated margins, I think they've ensured neither will sell. Micro Center has 3070 Tis for $700, and the 6900 XT roflstomps it for the same money. I'd hope that's what AMD does this gen: price their top card around $1200 and just murder Nvidia at every price tier.

2 months ago
CoreyAtoZ

The biggest issue that I have seen seems to be the fact that Nvidia calls it a 4080 12GB when it's clearly a completely different tier of card than the 16GB variant. They're not even the same die!!!

2 months ago
GMax's Food & Fitness

@JJ LW After Fermi (GTX 500 series) was when Nvidia started putting non-Ti 80-series cards on mid-range dies (usually xx104), starting with the GTX 680 (which should've been a 660). Fermi was so bad that the next gen's midrange card was so much faster than the previous high end that they could get away with it. That's where it all started, because it was not the norm to sell middle-of-the-road GPUs for high prices. AdoredTV pointed this out in a couple of his OG Nvidia history and anti-consumer videos.

The GPU number and bus width can help indicate where in the stack a card should go. The GTX 480 was on the top GF100 die (as was the GTX 490) and used a 384-bit memory bus, like the high-end 3090 or 4090 today. The GTX 580 was on the GF110 die (the die the GTX 590 was on; the GTX 560 was on GF114) and used a 384-bit bus. The follow-up GTX 680 was on the GK104 die (the GTX 660 rev. 2 was on the same die) and was on a mid-range 256-bit bus. The follow-up GTX 780 went back to a high-end configuration on the top GK110 die (the GTX Titan used this die) and a 384-bit bus:
https://www.techpowerup.com/gpu-specs/geforce-gtx-480.c268
https://www.techpowerup.com/gpu-specs/geforce-gtx-490.c3332
https://www.techpowerup.com/gpu-specs/geforce-gtx-580.c270
https://www.techpowerup.com/gpu-specs/geforce-gtx-680.c342
https://www.techpowerup.com/gpu-specs/geforce-gtx-780.c1701

That's 3 out of 4 generations with the 80 SKU on the top die and a high-end memory bus. I'm tired, so I'm not doing any more, but you can look it up on TechPowerUp for more info on later and earlier gens.

2 months ago
Power5

@ThunderingRoar I was referring to the OP talking about how they called the 4070 a 4080 12GB. The 12GB has less RAM, fewer CUDA cores on a different piece of silicon, and a narrower memory bus. It is a 4070, not a 4080.

2 months ago
ThunderingRoar

@Power5 Lovelace has a lot more L2 cache; it's kinda like RDNA 2's Infinity Cache. So no, it has more effective memory bandwidth; otherwise the cards would be slower at 4K lol

2 months ago
Power5

It has a narrower memory bus as well. Total BS, especially at double last gen's MSRP. The silicon chip doubling in cost does not mean the entire card is now double the cost; there is a whole lot of other stuff in a graphics card besides the GPU chip.

2 months ago
JJ LW

@ThunderingRoar Oh I agree, pricing on especially that card is bonkers! I just think there will be more cards released and prices adjusted once 30 series supply dwindles.

2 months ago
Geier S

AMD should really have a huge production cost advantage with their chiplet designs in both GPUs and CPUs, so they should be much cheaper than Lovelace and still have good margins.

2 months ago
Benjamin Oechsli

BOM costs are something that fascinates me. How much profit does the company make on a card, on a generation? And how much of that money goes back into R&D? All leading to a question I daydream about a little: how small a profit margin could a company run and still make competitive products every generation? It might not have a proper answer, but it's fun to mull over. Thanks for the video, MLID crew! Good to put a finger on the pulse of the market in this way.

2 months ago
Karkarov

@GMax's Food & Fitness He said BOM, and that shipping was included in the BOM, which implies it includes shipping from suppliers to Nvidia or third-party makers like MSI, but not necessarily from them to distributors. He made no claims at all about knowing the full markup and was simply making an "educated guess" in that regard.

2 months ago
Michael Gray

What we are seeing now from the Radeon group is only possible because of Zen (Ryzen and Epyc). Were those two CPU segments not successful, they wouldn't have the money to do what they are doing with RDNA. It's proof that reinvestment is key to survival as a company.

2 months ago
GMax's Food & Fitness

@Karkarov Tom did include shipping cost in those calculations, as shown in the video. Everything else you mention is valid as well, but that would be included in the company markup, which is also factored in. Did you watch the whole video? It's all there.

2 months ago
Karkarov

Bill of materials isn't even half the story. Shipping costs, building rental, cost of electricity, government fees, R&D, marketing, maintenance and repair, etc. The cost of the physical parts is often not even the biggest cost factor.

2 months ago
SuperFlamethrower

They have to make back the R&D cost. I'm sure it's easy for Nvidia and AMD to quantify, but for us, not so much. After the R&D cost has been made back, they can accept a margin of 0 and stay in business indefinitely; it's really about maximizing profit. It's guaranteed that RTX 30 and RX 6000 have made back their R&D costs by this point. R&D costs are substantial: Jon Peddie estimates AXG spent $3.5B between Q1 2021 and now, so about $0.5B per quarter.
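(A minimal sketch of the break-even logic in this comment: units sold times per-unit gross profit must cover R&D before margin can, in principle, drop to zero. The R&D figure is the comment's AXG estimate; the price and unit cost are made-up assumptions.)

```python
# Illustrative R&D break-even: units needed to recoup a fixed R&D outlay.
rnd_cost = 3.5e9        # the comment's estimated AXG R&D spend, $
price = 350.0           # hypothetical average selling price per card
unit_cost = 250.0       # hypothetical per-unit BOM + logistics cost

gross_per_unit = price - unit_cost              # $100 gross per card
units_to_break_even = rnd_cost / gross_per_unit
print(f"{units_to_break_even:,.0f} units")      # 35,000,000 units
```

On these made-up numbers it takes 35 million cards just to recoup R&D, which illustrates why $100 of gross per card is nowhere near enough for a newcomer at low volume.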

2 months ago
Deneguil - ジョハン

I wonder if Intel GPUs would be good for creators, since I remember benchmarks showing they have really good encoders when they work, like the A380 beating the 3060 in encoding. That doesn't sound that impressive, but seeing as the A380 is around $150, it's a really good value proposition for that task specifically.

2 months ago
sflxn

These gamers paying over $1000 for a gfx card are insane.

2 months ago
Wasmachineman

​@sflxn Good thing I don't have a life and absolutely need to see my two autismal station wagons in 3840x2400 8xAA :^)

2 months ago
sflxn

@Wasmachineman I'll wait till you get some brains and see that I wrote I don't play the types of games that require $1500 graphics cards. You are among the few thousand people out there willing to give this kind of money to these companies. I love gaming, but I also try to have a life.

2 months ago
Tissaye

@Wasmachineman ok

2 months ago
Wasmachineman

​@sflxn Now go try GTA V in 4k with all settings maxed out. I'll wait. Even my 6900 XT struggles at times unless I lower AA to 4x.

2 months ago
sflxn

@Wasmachineman I have a gaming laptop with a 3070 Ti. It runs all my games at max settings at 100-300 fps. I'm good for now; I stopped playing the CoDs and Battlefields long ago. Good luck to you guys buying $1.5k-3k graphics cards.

2 months ago
Lutijen K.

Another interesting fact about Arc cards: Intel waited a long time to tell us that only the A750 and A770 Limited Edition cards have a PCON converter chip on the PCB for a real HDMI 2.1 signal. This chip converts a DP 1.4 signal into HDMI 2.1 with the FRL protocol, allowing higher bandwidth than the 18 Gbps of HDMI 2.0. It is unclear whether those two cards output 24, 32, 40, or 48 Gbps until tested by reviewers. All other cards have the old HDMI 2.0 port with 18 Gbps speed, and the PCON chip is optional for AIBs. The article is in PC Gamer.
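(To see why the 18 Gbps ceiling matters, here is a rough uncompressed-bandwidth estimate. This ignores blanking intervals and TMDS/FRL link encoding overhead, so real requirements are somewhat higher; it's a sketch, not a spec calculation.)

```python
# Rough uncompressed RGB video bandwidth, ignoring blanking and link overhead.
def gbps(width: int, height: int, fps: int, bits_per_channel: int) -> float:
    return width * height * fps * bits_per_channel * 3 / 1e9  # 3 color channels

print(f"4K60  8-bit:  {gbps(3840, 2160, 60, 8):.1f} Gbps")    # ~11.9
print(f"4K120 8-bit:  {gbps(3840, 2160, 120, 8):.1f} Gbps")   # ~23.9
print(f"4K120 10-bit: {gbps(3840, 2160, 120, 10):.1f} Gbps")  # ~29.9
```

4K120 already blows past HDMI 2.0's 18 Gbps even before overhead, which is exactly what the PCON/FRL path exists for.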

2 months ago
Xp4

Nice to know, thanks

2 months ago
AmpEdition

I wanna see performance leaks and more talk about RDNA 3 😁 I understand why Lovelace costs more; its 4nm wafers are 4x the cost they were last year.

2 months ago
Dralor D

People got it wrong: Nvidia is not using 4nm, they are using 4N, which is a 5nm node optimized for Nvidia.

2 months ago
Ben C

The 4nm wafer hasn't increased in price 4 times, ever. If you mean over Ampere, the jump from Ampere's Samsung 8LPP wafer to Ada's TSMC N4 wafer is at least 2.2 times the cost of 8LPP.
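(A sketch of why a wafer price jump hits per-die cost directly. Neither wafer price below is a real quote; only the 2.2x ratio comes from the comment, and the die count is a placeholder.)

```python
# Hypothetical per-die silicon cost under a 2.2x wafer price jump.
old_wafer = 5_000.0           # assumed Samsung 8LPP wafer price, $ (placeholder)
new_wafer = old_wafer * 2.2   # the comment's >= 2.2x multiplier
dies_per_wafer = 150          # assumed good dies per wafer (same for both nodes)

print(f"old: ${old_wafer / dies_per_wafer:.2f}/die")   # ~$33
print(f"new: ${new_wafer / dies_per_wafer:.2f}/die")   # ~$73
```

If yields and die counts are comparable across nodes, per-die silicon cost scales with the wafer price, but note the silicon is only one line in the card's full BOM.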

2 months ago
Mr Bob

Nvidia's not making shit… profits are down dramatically with the miners gone, and a 4000-series failure will not help them 😀

2 months ago
ThexWITCHxMaster

8:52. Don't be surprised if AMD comes in and says they are going to offer the 7900 XT, their flagship GPU, for way less money than Nvidia. I fully expect Lisa Su to be standing on a stage somewhere announcing how their GPUs are "cheaper than the competition and still give equal or better performance". Nvidia's greed may come back on them if AMD is willing to undercut them.

2 months ago
Foxx

AMD is willing to burn capital to burn Nvidia.

2 months ago
Blue Max

@ThunderingRoar As stated, that was two 290s on a single PCB in CrossFire, and it was still cheaper than Nvidia's equivalent at the time, further proving my thesis.

2 months ago
Outrider42

@ThunderingRoar Less VRAM, no DLSS, and it's only good for playing video games. The 3090 is an excellent production card; the 6900 XT is not. This man right here, Tom, kept a 3070 instead of a 6800 XT because it worked better for his video production. Tom later grabbed a 3090, and he did so for his video production. Tom does not own a 6900 XT; he could have bought that instead, but he didn't. Most YouTubers and content creators are using Nvidia because it really is that much better. This market is not just gamers, folks. There's your price difference.

2 months ago
WayStedYou

@ThunderingRoar That's two GPUs, so $750 each.

2 months ago
ThunderingRoar

@Blue Max That is also not true; the Radeon R9 295X2 launched with a $1500 MSRP back in 2014.

2 months ago
olternaut

I get why the decisions on Intel Arc were made, but I'm getting the feeling now that Intel can't make products that people want anymore.

2 months ago
Michael Carson

@Rafa Allegretti The CPUs themselves don't have the pins; they have contact points. You clearly didn't read what I wrote.

2 months ago
Rafa Allegretti

@Aleksander Trubin Yeah, that's what I meant. PGA is better for the end user: harder to bend and easier to fix by yourself.

2 months ago
Aleksander Trubin

@Rafa Allegretti depends on what your definition of better is. LGA has much higher pin density, while PGA is somewhat more resilient and can be fixed by hand

2 months ago
Rafa Allegretti

@Michael Carson It's not pinless lol. The pins are on the board, and btw PGA is way better than LGA.

2 months ago
Paul John

Intel are dinosaurs and should/will just fade away, even faster if US politicians stop giving them US taxpayer dollars. But hey, Intel keeps making promises, so that won't end anytime soon, I think.

2 months ago
GHOST D0G

I would like to see more benchmarks include the 1080/Ti, a phenomenal GPU. The games I play continue to work exceedingly well even now. AAA titles are not worth paying the extreme price of a NEXT-GEN PC. VR comes to mind, and productivity over gaming, when it comes to spending large amounts of cash!

2 months ago
RobBCactive

Arc becoming an OEM and mobile product is a very Intel outcome. The plan to go from DG1, an iGPU on a card, to a full low-to-70-class range in one generation always seemed far-fetched. As for Nvidia's AIBs making a profit on a $1599 card: no chit, Sherlock! Effectively Nvidia has one real card on the market without improving price/performance, and they've destroyed the '80 class's meaning. AMD's 7900 & 7800 launch is now easier than if Nvidia had held AD103 & AD104 back from the market.

2 months ago
eugkra33

Prices are inflated to sell Ampere. They'll drop $200 off the 4080 12GB cards in 4 months.

2 months ago
XxOblongxX

Yesterday I browsed Amazon for a 6900 XT. I was just curious, and there was a Gigabyte version for sale for 610 euros, an absolutely crazy price. I hesitated and decided to wait for RDNA 3; a few hours later they were all gone. My 5700 XT is still a very good card anyway, and my focus is on the new 7700X to replace my 2700.

2 months ago
XxOblongxX

@Saverio Gizzi Yeah, by the time I looked again later at night they were all gone. I just told myself better luck next time. Right now I'm waiting for motherboard prices to stabilize to start my new 7700X build.

2 months ago
Saverio Gizzi

610... I think it should have been an instant buy xD PS: I did the same, saw one at 670 and decided to wait for the 7800, but if I see one at 599 I will instabuy.

2 months ago
WayStedYou

Bro, the replacement for the 6900 XT is only going to be like 100 euros less than that, with half the VRAM.

2 months ago
Pitivier Bag

Damn. Waiting too, as I have a Vega 56. But for 610 euros I would have bought without hesitation.

2 months ago
o lo

good job holding off!

2 months ago
psionx1

The way I see it, Intel can't afford NOT to keep improving Arc, because AMD's APUs will only be more of a threat going forward.

2 months ago
Mendoras

As expected. We knew that the 4080 12 GB is a borderline scam. I'll wait for AMD, and maybe we will see a "real" 4070 next year.

2 months ago
Marc M.

It's upsetting to see Intel effectively exiting the market. Hopefully they can keep the Arc lineup on life support until a better version of it is designed and the market is a little more favorable. Going from the low end to the mid level (or in their case, more accurately, the mid-end) is actually not necessarily a bad strategy. If they can cover the research and development and all the associated costs by aiming for that part of the market, especially in mobile, until such time as they can ramp up to better-margin areas, that might be the only really effective way of entering this duopoly. It is not a high-margin path, though...

2 months ago
Alice Osako

One thing to remember - and to their credit, MLID did mention it at the end - is that Intel's strategy for Xe is based primarily on its use in datacenters and HPC. The discrete cards were never the main focus, they are a spin-off product. This affects how Intel positions them, and how willing they are to push them.

2 months ago
Fabs

AMD's systematic price cuts, without going crazy, are a good indication that: 1) the supply chain is healthy on the high end; 2) old price slots will open up for the new gen, though I would expect $50-$100 of inflation.

2 months ago
The Echelon

@haukionkannel That's not how percentages work..

2 months ago
haukionkannel

@Patrick Bateman 8% per month... so in two months it can be 16%, and so on... Another thing is that I don't know what inflation is in the USA; maybe it is less there? Third, inflation is not evenly distributed among all consumer products. Some prices go up faster than others; for example, food prices are going up faster than the general inflation number right now.

2 months ago
GMax's Food & Fitness

@Patrick Bateman There's the problem right there: "according to the Federal Reserve." Anyway, it's about the inflation since 2020, when RDNA 2 launched, since inflation has skyrocketed in the last 2 years. I discovered that inflation from 2006-2017 is equivalent to the inflation from 2017-2022, and most of that increase is from 2020 to now.

2 months ago
Patrick Bateman

@haukionkannel Where is it 20%? US inflation is 8% according to the Federal Reserve.

2 months ago
haukionkannel

Inflation is over 20%, so prices have to go up about that much...

2 months ago
radoxx

I honestly see no way Intel can sell these cards at the prices shown, especially since they will be in direct competition with RDNA 3 and Ada Lovelace. They would have to be substantially cheaper for me to even consider being a beta tester. But in any case, let's wait for reviews first.

2 months ago
Winter

I wonder what the longer-term outlook is for Intel with Arc; they would be able to control both the CPU and GPU side for laptops, OEMs, etc. Stopping all production because you stumbled out of the gate with your first product, after billions down the drain in search of that CPU/GPU combo they so wanted, would be an interesting choice.

2 months ago
chaon93

It's important to note that Intel can afford to eat a loss on each GPU die sold to AIBs if they want. They aren't a GPU company and can play the long game. Intel can control the BOM costs to sway AIBs as needed.

2 months ago
Silver Joystix

I could definitely see myself buying a laptop with an Arc-based gpu. I think that makes a lot more sense for Intel than desktop gpus.

2 months ago
Kane Bunce

While it is true that it is a bad business decision for Intel to compete with AMD and Nvidia at the high end right now, it is also a bad idea from a product and reputation perspective to sell high-end cards before their driver problems are solved.

2 months ago
Stoned Clueless

AMD is brutally precise in cost effectiveness. Multi-chiplet cards will bring so much sweet $ for AMD.

2 months ago
Jieren Zheng

I was wondering whether Battlemage being a mobile GPU could be a way for Intel to continue to somewhat fund the AXG division and develop drivers/software features with a live product. And like you said in a previous video, it is a segment NVIDIA is worried about. So it might actually be worth it for them to lie low and work on that segment (gaining market share) until the day it becomes worth it to launch a desktop card again.

2 months ago
Bat_Daddy

Intel starting with laptop GPUs isn't necessarily a bad thing, because it would be better for them to nail the drivers and software of their GPUs first before scaling up performance for desktop.

2 months ago
Siyzerix

@ModernOddity Not sure if he's gonna even see or consider my comment.

2 months ago
ModernOddity

@Siyzerix Ask that next time Tom is live. I don't think any of us in the comments could answer that yet.

2 months ago
Siyzerix

Will Intel offer 3060 performance at the low end with Battlemage? Because that's what Nvidia is targeting.

2 months ago
ModernOddity

@N Vignesh 17:30 Possibly.

2 months ago
Cold Runner

When it comes to Nvidia, something has to change with its arch, because there is no way it can continue to just use raw power to keep ahead of RDNA. This gen should have allowed Nvidia to lower power and improve performance as they moved from Samsung's 8nm to TSMC's 4nm. Instead, they just jammed more transistors onto the chip and pumped up the clocks. Even if they move to 3nm, this approach will still likely mean higher power consumption as they try to convince people that you need 240fps at 4K or your games are going to play like s**t.

2 months ago
Cold Runner

@Dralor D - I agree that the question will be what Jensen decides he wants his GPUs to be, and this is going to depend a lot on what the reception of Ada is. Obviously, there will be people that will always buy the latest and greatest, but what is the average gamer going to do?

2 months ago
Dralor D

@Cold Runner I agree they "should" if they actually wanted to serve the gaming market to best effect; I just don't think Jensen feels that's necessary. I mean, would it make that big of a difference to Nvidia's gaming GPU sales? Not really. But it would be a lot more expensive to produce two lines of cards. If/when Nvidia gets their chiplet tech working, then we might see it from them. But it seems their strategy is AI/programming more than hardware innovation. I'm not saying they haven't done an amazing job with the hardware, just that it's not their main idea of what the future of GPUs will be.

2 months ago
Cold Runner

@Dralor D - Not true. I actually agree with you that it appears to be a data server card that was hastily cut down to be a gaming card; I've thought this ever since Ampere came out, due to its fp64/32 cores. I also 100% agree that Jensen has a shut-up-and-buy-it attitude. This is still part of my point: the arch needs to change. AMD already realized this, so it split its GPUs into graphics and compute. Nvidia needs to do the same, but will they?

2 months ago
Wawa Weewa

@Dralor D 👌

2 months ago
Dralor D

You assume Nvidia is designing cards for gaming; it doesn't seem that way. It seems the architecture is designed for servers etc. and gets shoehorned onto gaming. Look at Jensen's launch: 20 minutes to launch the card, very little excitement, very boring. Then over an hour talking about data center, AI, cars, etc. It came off as a real "gamers, just buy the GPUs and shut up; we need you to fund us so we can do these other things."

2 months ago
Prashanth Doshi

Nvidia left a bad taste by gatekeeping DLSS 3 to the RTX 40 series.

2 months ago
grospoulpe

Seems they did that because there would be too much latency (from the DLSS 3 computation or something) on RTX 10/20 cards. In fact, it's officially the same story with FSR 1.0 vs FSR 2.0 (FSR 2.0 needs more computation, therefore more latency, especially on the GTX 10 / RX 400 series).

2 months ago
Adrenaline Junkie

Perfectly stated.

2 months ago
Apollo

In the UK we are heading into another recession, and we already have energy bills going through the roof, so just running these cards will be like burning money. Then there are the prices of the 4090, the 4080, and the 4070 (aka 4080 12GB); in GBP they're just insane. Then there are articles on needing to buy an ATX 3.0 PSU if you don't want to risk melting your power cables or ending up with a fire under your desk. These cards seem to be accidents waiting to happen. I hope I'm wrong, but I'm not risking it.

2 months ago
John Thunder

If Nvidia again gives us 60 Ti and 70-class cards with 8 GB of VRAM, then AMD wins. I just don't care about 4090s and 4080s; my money is best spent elsewhere, maybe on a good midrange card from AMD.

2 months ago
John Thunder

@ThexWITCHxMaster Yes, I agree. Amd will be a better choice this generation.

2 months ago
ThexWITCHxMaster

You might even be able to get a higher-end GPU with AMD if they price GPUs closer to what they did with RDNA 2.

2 months ago
Bjørns Time is Relative

$750 for the 4070 Ti would've still been fine for them, though, with 3090s having double the VRAM. I guess they just want to milk as much as possible before they have to lower the price.

2 months ago
cracklingice

The 4060 *cough* 4080 12GB is $100 more than I expected they would price it at. The 4070 *cough* 4080 16GB is right where I expected the price to land, and the 4090 is actually $200 cheaper than I expected. They did not release a 4080 in my eyes, as that would have been a cut-down 102 with around 12K CUDA cores and the memory bus cut back to 256-bit.

2 months ago
Pro720HyperMaster720

I really, really hope Intel at least releases low-end discrete desktop GPUs for the next couple of years or generations, keeping a foot in the market while they improve their drivers and attempt to step up to midrange with Druid or something like that. They could also manage this without desktop discrete cards by using the new designs for laptops, but as I said, "I hope," because releasing something beyond Alchemist for desktop would look less like a total give-up and more like: we've hit a wall, we're going to keep a low profile, but we'll stay active and prepare to come back stronger.

2 months ago
Anthony Lipke

I'm waiting for benchmarks, but I'm not expecting the 4080 12GB to make sense compared to the 3080 12GB, let alone the higher models being sold at $1000 now, assuming you must go Nvidia. If AMD is an option for you, I expect the 4080 12GB won't make sense at all.

2 months ago
Sadiqur Rahman

RDNA 3 will sell a lot in the coming 3 years.

2 months ago
Paul John

@haukionkannel I absolutely agree that on price/performance the 6000 series is a clear winner; that's why I want a 6700 XT, but I'm not willing to pay $700 for it. AMD supposedly had an MSRP reduction a day or two ago, but that looks to be only in MSRP, which is all fictional at this stage, and it most likely only affects a few markets like the USA. If AMD doesn't actually sell these cards at a reasonable price, people still won't go to them.

2 months ago
haukionkannel

@Beretteres That would only mean that AIBs would put the price near Nvidia's anyway... And if the price difference is $100, AMD is the better choice...

2 months ago
haukionkannel

@Paul John In our area AMD has been the best price/performance option for years! Still, even here, Nvidia has a much bigger market share!

2 months ago
Paul John

@haukionkannel If AMD doesn't compete in your market on either price or stock, then of course Nvidia wins. Right now consumers have 2 main choices: AMD or Nvidia. I want a 6700 XT; they're $700 in my region with very low stock. A 3060 is $420 and has been dropping in price all year with good supply; the 3060 Ti is $450-470, also with good supply and price drops all year.

2 months ago
Beretteres

@haukionkannel If the price difference were like $100 between AMD and Nvidia, I would get Nvidia just because it offers more features, better drivers, and fewer hassles. But this time AMD has the opportunity to significantly undercut Nvidia with prices that are way more consumer-friendly; if they don't, I will just get a $500 6800 XT or a $600 3080.

2 months ago
BiscuitPuncher

I've been arguing and trying to explain this to people for a few months now: Intel is late to the game and will have to bow out or take a heavy loss. Not a single person agreed with me concerning Intel. They need to start developing for the next generation, to launch when the 50-series cards are released in a couple of years.

2 months ago
Ali Shaikh

I don't know about you guys, but I think that according to the supposed specs of Navi 31, it should comfortably beat the 4090 in rasterization. The 4090 increases core count by just about 60% vs. the 3090, whereas AMD is increasing core count by 2.4x. Nvidia is likely to have higher clocks, but can that really compete with such a vast increase in cores from AMD, who are also expecting higher clocks? The 3090 was already struggling with core utilization, as seen by its smaller-than-expected performance drop at 4K compared to other cards. I don't know if the 4090 will change that or if AMD will experience the same thing, but I would be shocked if AMD couldn't at least achieve 4090 parity.
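(The scaling ratios this comment cites, computed from the public 3090/4090 CUDA core counts and the leaked Navi 31 shader count. The AMD figure is rumor at this point, not a confirmed spec.)

```python
# Core-count scaling behind the "about 60%" vs "2.4x" claim.
nv_3090, nv_4090 = 10_496, 16_384    # official CUDA core counts
amd_n21, amd_n31 = 5_120, 12_288     # Navi 21 official; Navi 31 per leaks

print(f"Nvidia: {nv_4090 / nv_3090:.2f}x")   # ~1.56x, i.e. roughly +60%
print(f"AMD:    {amd_n31 / amd_n21:.2f}x")   # 2.40x
```

Caveat: raw shader count is not performance. Clocks, utilization, and the dual-issue nature of the new architectures could swing the comparison either way.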

2 months ago
Buggerlugz

Love how Nvidia is now deliberately trying to damage AMD; by god, Nvidia must be worried!

2 months ago
Sam The Multimedia Man

Those prices for the silicon are for the printed chips; the actual silicon material isn't that expensive and has no supply issues. Hemlock Semiconductor reported they have been easily keeping up with demand and could make more if needed. This just goes to show the need for chip manufacturers in the USA is great; it would help a lot with keeping down the cost of making electronic devices.

2 months ago
ThexWITCHxMaster

I have a 3090 and will not be getting a new GPU this generation, and probably not the 5000 series either, but I am still interested in seeing how AMD answers Nvidia's graphics cards. How will a 7900 XT stack up against a 4090 in cost and performance 🤔🤷‍♂️

2 months ago
Scroopy Nooperz

@fireflame ft Yeah, I just want something that can game at 4K with high eye-candy settings at a constant 60 FPS or above. I game on a 4K projector, and I'm too used to a big screen and highly detailed res now xD. Don't get me wrong, for $499 the 3070 is a beast for 1440p and can do 4K too, but I think the big limit at 4K is that 8GB of VRAM: older games are fine in 4K, but the latest AAA stuff can be hit and miss. That said, upscaling technology like DLSS and FSR may eventually mean GPUs don't even need that big of a VRAM buffer. I guess we'll see soon enough.

2 months ago
fireflame ft

@Scroopy Nooperz May I ask why you want something better than your 3070? 4K gaming?

2 months ago
Golf R

@MrKillswitch88 That's my biggest worry, but that's not Nvidia's fault.

2 months ago
L D

@MrKillswitch88 Only the adapters have the issue. Just buy a new PSU and you're fine.

2 months ago
Scroopy Nooperz

@Dominic M Yeah man, I just want something roughly equivalent to a 3090/Ti, but I don't want to pay more than $700. Got an RTX 3070 that I'm fine keeping for a while unless the tech industry meets my targets and pricing.

2 months ago
HenryViii Fake

If Intel _fix their drivers,_ they stand to have a solid impact in the low-to-midrange part of the GPU market. IF.

2 months ago
Jingyang Lin

You see the problem: the 4080 12 GB makes the 4090 sound like a good deal. At a 78% price premium, they are offering double the VRAM and more than double the CUDA cores. Usually, as you scale higher up the lineup, you pay an exponential premium for a much smaller bump in CUDA cores; it's insane how the situation has flipped. Not saying the 4090 is cheap. If you ask me, the 4090 is only worth $999.
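(Checking the comment's premium figure against the launch MSRPs and core counts: the 4080 12GB is $899 with 7680 CUDA cores, the 4090 is $1599 with 16384.)

```python
# Price premium vs. hardware scaling between the 4080 12GB and the 4090.
price_4080_12, cores_4080_12 = 899.0, 7_680
price_4090,    cores_4090    = 1_599.0, 16_384

print(f"price premium: {price_4090 / price_4080_12 - 1:.0%}")   # ~78%
print(f"core ratio:    {cores_4090 / cores_4080_12:.2f}x")      # ~2.13x
print(f"$/core: {price_4080_12 / cores_4080_12:.3f} vs {price_4090 / cores_4090:.3f}")
# $/core: 0.117 vs 0.098
```

Per CUDA core, the 4090 is actually the cheaper card, which is exactly the inversion the comment describes.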

2 months ago
Siyzerix

If Intel can make a low-end budget Battlemage laptop GPU that packs RTX 3060 performance at 100W, they could be competitive.

2 months ago
StingyGeek

AMD for the win, if they make cards with the features people actually want and can afford to buy. Looking forward to them moving off monolithic dies for GPUs.

2 months ago
Wapn Perfo

Well, you have to consider that the prices of memory, electrical components, and even materials like heatsinks have skyrocketed over the last couple of years. These prices reflect inflation, and people buying into everything that went down.

2 months ago
MOLE MAN

Can't wait for AMD to release their GPUs and for Nvidia's greed during the crypto craze to be punished.

2 months ago
Banzai431

Sad that Intel couldn't come up with something competitive to shake the market up a bit. Having just 2 major players is difficult for us consumers.

2 months ago
Spank Buda

@Xp4 But you're proving my point. Nvidia went about it the correct way by trying to buy ARM at the time, a company that holds tons of patents and is known for CPU architectures. Did Intel, during their recent buying spree, purchase a company that dealt with GPU architecture? No, they just hired a few ex-employees from AMD and Nvidia. Now, I understand that Intel has held a license to make GPUs since decades ago, but it's not the same as AMD, which was a CPU designer and then purchased ATI when jumping to the GPU side, to make both GPUs and CPUs. Has Intel ever done that since the early 2000s? Also, I'm speaking of Nvidia designing their own CPUs without help from Arm, and not for AI data centers; I'm speaking of the consumer level, not commercial. Hell, I don't believe Nvidia has a license to make CPUs like Intel and AMD do.

2 months ago
Xp4

@Spank Buda For end users, market image, and competition, those things don't matter at all. Oh, and NVIDIA did enter the CPU market, building their own SoC using ARM, btw... That's when they realised they needed to buy ARM if they wanted to keep improving on that front.

Vor 2 Monate
Banzai431
Banzai431

@Spank Buda All the same, I was hoping they would at least be a little competitive. It would have been so much better for us if there were a third player to keep the big 2 honest.

Vor 2 Monate
Spank Buda
Spank Buda

AMD & Nvidia have been designing GPUs for how long? It's like Nvidia trying to enter the CPU market today. Do you understand that there are patents, drivers, hiring new staff, outside contracts, manufacturing, etc. that these companies must pay for before seeing their ROI? If it were that simple for a 3rd player to get involved, it would've happened a while ago.

Vor 2 Monate
Matthew Bossert
Matthew Bossert

I hope Intel just takes their time mastering the craft on the GPU side. I know they want a stake in something GPU-related for AI, and a solution that can eventually compete with Apple's M series in graphics, or at least in efficiency. And maybe, if they master the tech of desktop GPUs, they'll have Arc cards in all areas by their E or F, maybe I or N, series, assuming they don't stop after Arc A.

Vor 2 Monate
Andy S
Andy S

Besides all the bashing, I must say Intel has done a great job with oneAPI in the professional workspace. AMD is way behind Intel's solutions.

Vor 2 Monate
ProfessorAdonisCnut
ProfessorAdonisCnut

If they're going to sell them all binned down as A580s anyway, they could try doing it in a way that is software-only and make sure the unsupported BIOSes/drivers that can reverse it find their way out into the community. That would be the best option left to them: it creates a lot of positive enthusiast sentiment, there's no obligation to support those cards at full capability, and no lost revenue if they were going to sell them that way anyway. Get people to see it the way they did the Phenom II X3. Intel doesn't normally like to position themselves as the mythologized hackable alternative like that, but if they swallow their pride it might be the least bad option left. Might not hold true if they really are killing off Arc, though.

Vor 2 Monate
RegisteredBlindGamer
RegisteredBlindGamer

I got a 6700 10GB brand new for 329.99. Best-value GPU around. The 6700 Fighter went for 299.99 for a while too. 3060s are still going for 450 or more in the UK, and 3080s are still 740-850.

Vor 2 Monate
RegisteredBlindGamer
RegisteredBlindGamer

@Aleks Developer if you just bought from a retailer, try to get a refund for the difference. I got one for my 6700; it was only 20, but it still sucked seeing the price drop the same day I placed my order.

Vor 2 Monate
Aleks Developer
Aleks Developer

@Hold The Truth Hostage Let's hope for the best. I just learned AMD lowered all their prices this morning.

Vor 2 Monate
Paul John
Paul John

Those prices are the opposite in my region. A shame, as I really want a 6700 XT, but I'll likely get a 3060 by the end of the year based on how Nvidia's prices are dropping here. Meanwhile, the AMD 6700 XT hasn't had any price drops all year.

Vor 2 Monate
Hold The Truth Hostage
Hold The Truth Hostage

Found a 6700 XT on eBay for $324

Vor 2 Monate
Hold The Truth Hostage
Hold The Truth Hostage

@Aleks Developer hmm 🤔 good point, but with the miners tossing cards, it should drop below $500 even before November

Vor 2 Monate
Justin Van Horne
Justin Van Horne

News flash: corporate shills defend multi-billion-dollar companies that gouge customers and thrive on predatory pricing practices. Oh, and don't forget GPU reviewers: I want to see 4070 trending on release day, none of this 4080 12GB crap, please.

Vor 2 Monate
Justin Van Horne
Justin Van Horne

@The Phantom Channel yeah, I was originally leaning towards 4060 Ti personally, but the techtubers would never go that far; it's enough of a stretch harassing them to band together and label all their launch reviews as 4070. Not that any of them are willing to sacrifice their warehouses of free stuff to do so.

Vor 2 Monate
The Phantom Channel
The Phantom Channel

They should call it what its feature set represents: a 60-series card. The 4090 should be the 4080, the 4080 16GB should be the 4070, which puts the 4080 12GB as the 4060.

Vor 2 Monate
Sal3600
Sal3600

Demand and supply, boy. Can't fight economics.

Vor 2 Monate
DrunkSupportCharacter
DrunkSupportCharacter

I don't believe Intel is aiming for the high end. They know they have work to do regarding drivers. I believe it's more of a push for a graphics focus: implementing generation-relevant GPU performance in their CPUs, with a unified driver that covers that too. They need something on the market that works and gets their graphics technologies used; an example is the new NUC Extreme, which I'm likely going to replace my PC with. The other thing is that it's likely going to be better supported on a Linux-based system than NVIDIA, at least.

Vor 2 Monate
Deliveredmean42
Deliveredmean42

Well, at the very least I'll look forward to seeing the RX 8700 in a year or two. Depends how AMD prices the RX 7800, though.

Vor 2 Monate
Hardcore Hardware
Hardcore Hardware

Selling a 4070 as a 4080 12GB is very scummy, especially with a 192-bit bus, so the stack has shifted once again. Now how much will the 60-series card be, $599? Considering how Nvidia treated EVGA, I don't think this is a bid to shift old stock, because I don't think they really care about partners at all. It's up to board partners to shift old stock; Nvidia already made money selling the chips, so why would they care if partners can't shift them? I think this is them just raising prices because they really enjoyed the last 2 years of high profits. Shareholders expect profits to increase, not decrease, and how would leather jacket man explain a huge profit loss? If you look at this from a corporate perspective, it's plain to see what's going on here: high prices are here to stay, and Jensen expects us to pay for their future losses.

As for AMD's prices on their new cards, I'm not expecting significantly lower prices. Just look at the X670 board prices, which are a strong indication that Radeon 7000-series cards won't be much cheaper than Nvidia's. Probably like $100 cheaper on the top-end card, max; I can feel it in my bones they won't be much cheaper. If the 7900 XT releases at $1500 AUD I'll buy one, but I know damn well it won't be $1500 AUD, more like $2200-$2500 AUD, with some reference-PCB cards at $1999 AUD.

Imo people should be picking up second-hand 3000-series, 2000-series, and Radeon 6000-series cards. You can get great deals on a 2080 Super or 2070 Super, and tbh a 2070 Super is all you really need to enjoy your games. I saw a 3090 Ti selling on a site here for $1500 AUD; just for reference, I paid $3250 AUD for my FTW3 Ultra 3090. My 2070 Super in my HTPC runs all of my games perfectly fine at 4K. Tbh I regret not waiting and picking up a 6900 XT for $1299 now :(

Vor 2 Monate
An An
An An

I consider myself lucky for not falling into the graphics madness. The hardware producers' monopoly will probably get the best of us, and I will hide in the Dota 2 community to avoid being blindly robbed by absurd pricing.

Vor 2 Monate
nubnubbud
nubnubbud

I can't imagine the A780 not selling as a budget workstation card if it were made non-OEM. Sadly, no manufacturer is willing to offer anything above 16 GB of memory unless it's super expensive. A slower, cheap card with massive VRAM and ray-tracing cores would be the go-to for freelancers, indie artists and devs, probably a good number of AI and software engineers, and gamers who just can't get pretty enough mods. At about $500 you could see people buying 2-3 of them and solidly outcompeting the 40 series in 3D rendering speeds, especially because Intel's got some real beef in its denoising algorithms.

Vor 2 Monate
kian moiny
kian moiny

If Intel starts using their own nodes, they can make a big difference in their cost of production

Vor 2 Monate
Richy Rich
Richy Rich

Considering that Moore's Law is truly dead, I think Intel could make significant profits in the graphics card market if they don't try to play the same game as AMD and Nvidia. We don't need bigger and "better" cards every year. We need affordable cards tailored to the end-user experience. For example, if they put high-end ray tracing on a low-end, budget card, I know a ton of Minecraft players and streamers who would absolutely eat them up. High-end cards without ray tracing would also be quite popular at the right price point, since ray tracing is a bit underwhelming in most AAA titles anyway.

Vor 2 Monate
Vijayender Joshi
Vijayender Joshi

You can't lose money on silicon that's already sitting in your inventory, and if you cannot offload it, you risk a huge write-off anyway. It should be thought of as recovering some of the capital.

Vor 2 Monate
Hoang D
Hoang D

I really hope AMD takes less margin this generation and embarrasses Nvidia by showing a Navi 32 7700 XT matching the 4080 12GB at a $300-$400 lower MSRP. But they'll probably just do the same as Nvidia: move product names a tier up and undercut the equivalent-performing Nvidia GPUs by a tiny bit.

Vor 2 Monate
fabitosan
fabitosan

Awesome summary, analysis, and information, thanks man. I feel sorry for Intel; they arrived really late. Maybe they could do better, but it depends on how much they are willing to lose for a few years.

Vor 2 Monate
Mariusz Melerski
Mariusz Melerski

"Only" 10-20% more for liquid cooler or probably also for very good air cooler is 90-180$ for 4080 12GB, so it is still a lot of money 🙂

Vor 2 Monate
Michael Gray
Michael Gray

Tom, you have said again and again that Nvidia has warehouses full of 30-series cards. What about AMD? Are they just as bad off, or are they in a better position?

Vor 2 Monate
Yunus Emre N.
Yunus Emre N.

330 for an A770 16GB Limited Edition on the same day as the 4090. Interesting strategy, if the BOM cost is true. Pricing seems competitive. We have to see the reviews to confirm the state of the drivers, though

Vor 2 Monate
Eduardo Santiago
Eduardo Santiago

330 is for the 8GB version

Vor 2 Monate
Benjamin
Benjamin

Looking forward to RDNA3. Still using my AMD 480 even though it's on its way out... I'm not going green this time either... and I did have some hope for Intel's Arc, but I'm not a laptop user... so I'ma wait a few months and hope it's worth it c:

Vor 2 Monate
Владимир
Владимир

Great series, Tom. Thanks for your thoughts on the matter.

Vor 2 Monate
Nelson Nunes
Nelson Nunes

Honestly, Intel could make a partnership with PowerVR for a combined solution. Intel's encoding and PowerVR's efficiency know-how could work wonders in the GPU ecosystem... I'm starting to think that in the long run (the next 10 years), cost/watt/fps will matter more than raw fps alone... more and more, game development is standardizing on common engines and APIs (like Vulkan). Also, DLSS, XeSS, and FSR are the best solution for 4K gaming... something Intel can exploit and build around, even for lower resolutions. Games nowadays are getting gimmick features that only try to justify a $1500 GPU... For example, at 4K, does anti-aliasing even matter? I remember when anti-aliasing was the poor man's "DLSS" for those who couldn't run at native resolution... Ray tracing is an unnecessary gimmick best used for photo mode in games... GPU marketing pushing it baffles me...

Vor 2 Monate
Alexander N
Alexander N

Even if they called it a 4070 Ti at $699, I still wouldn't buy it.

Vor 2 Monate
Y P
Y P

Intel will have a huge advantage in the future if they deliver the A lineup successfully, as they manufacture their own chips.

Vor 2 Monate
Tri Nguyen
Tri Nguyen

@Spank Buda Agreed. Intel tried to do chiplets for Alchemist and Battlemage but failed completely! $3.5 billion in R&D for nothing! AMD will launch a chiplet GPU with RDNA3 in October.

Vor 2 Monate
Spank Buda
Spank Buda

No, they do not! Why do you all think these companies enjoy dumping money into a department that's not generating revenue? Intel cannot break into this intense gaming-GPU market when AMD & Nvidia have been doing this for decades! AMD has only just now somewhat gotten their drivers on track. Intel's mistake was that they should've been working on just 1 or 2 GPUs that were mid-range.

Vor 2 Monate
Matthew Bossert
Matthew Bossert

I almost think Nvidia's pricing could let Intel decide whether they want to make a profit or get as close to AMD's pricing as they can and lose some money. I see them landing in the middle of the road.

Vor 2 Monate
Hold The Truth Hostage
Hold The Truth Hostage

I just hope secondary-market prices are dropping, because Nvidia is out of their minds, at a time when ray tracing has been proven doable on older cards without their new hardware, & ray patching is a better alternative to ray tracing. Plus AMD has its own counter to DLSS 3, which I guarantee will support 2000- and 3000-series cards, because it would be stupid for game devs to adopt DLSS 3 when fewer cards support it. Jensen is losing his mind, especially during inflation & the end of mining ⛏️

Vor 2 Monate
Lo-Res Gamer
Lo-Res Gamer

I suppose the "4080" 12 GB _could_ drop to $500 if the 4 nm die costs drop significantly before the end of the generation. But that's a long way away even if it does happen.

Vor 2 Monate
James Deluxa
James Deluxa

They want their margins. The same was true for the GTX 680 and GTX 1080: both around 300 mm².

Vor 2 Monate
Yash Khd
Yash Khd

Intel must play the entry-level/mid-level card game for at least a couple of years until that software driver stack is stable enough. Now, people can bash about how the heck the biggest CPU company, with a big-money war chest, can't get stable drivers out the door, but the fact of the matter is that graphics drivers are an extremely complex piece of tech. They're not something that can be fixed by throwing big money at the problem; they will take their own time to stabilize. Both AMD and Nvidia went through the same cycle in their early days. Remember how crazy/buggy Nvidia drivers used to be... but now they're fairly stable.

Vor 2 Monate
Yohan MESTRE
Yohan MESTRE

Nvidia will regret it when AMD puts a card between the two 4080s and calls it stronger than a 4080

Vor 2 Monate
k910
k910

@Yohan MESTRE if Nvidia thinks their investors won't notice, it's not working, bro; their stock has been crashing hard. They have to atone for their sins. PoW is gone, like really gone, and if you think other crypto will replace it, you might not know about the gas fees on the ETH network. Every day the dollar rises, it gets harder to do business with other countries; the value of native currencies is falling at alarming rates, and the Fed is helping that along. Where I could afford a 4070 in normal times, I'm now stuck at the 4050 or 4060 price bracket. I can't afford the card I want, and that will knock people like me off the potential-buyer list. It'll hammer Nvidia hard.

Vor 2 Monate
Yohan MESTRE
Yohan MESTRE

@k910 the way I see it: Nvidia is playing a difficult game. They need high margins to maintain their stock price, and they need to deal with an anticipated overstock of their old GPUs. NVIDIA still has goodwill, but it is shifting slowly. Nvidia won't sell much now, but once the market has normalized they will drop a high-selling card, something like the 3060, I guess. AMD has learned that being the budget brand doesn't do much for them. For example: everyone should have bought an RX 570 when it was below, I don't know, $160, yet people chose to buy the GTX 1050 Ti. So they do the next best thing: high margins, but not stupid prices, plus good drivers + features.

Vor 2 Monate
k910
k910

@Yohan MESTRE yep, AMD did not get this, yet they are clearly playing that game. At this rate AMD will not be a market leader. What AMD basically does is apply a $50-100 discount and slide their GPU between two Nvidia SKUs. What AMD needs to do is carve out their own segment, kick Nvidia's ass, and show they are the price-to-performance brand. Instead AMD is just making huge margins by being the 2nd player while letting Nvidia ride its euphoria; soon Nvidia runs out of steam, and AMD grabs the market, keeps the price rises, and acquires market share when Nvidia cracks. AMD is playing very clever, and dirty too: they are making their margins while closing the gap, and just like with Intel, they will snap Nvidia the way they did Intel. AMD's cards are way cheaper to make than Nvidia's; they could offer reasonable prices, but no, they throw out one SKU that makes them look good, like the 6600 XT, pretending to be the good guys while letting all the blame fall on Nvidia.

Vor 2 Monate
Yohan MESTRE
Yohan MESTRE

@k910 or at the price of the weaker 4080

Vor 2 Monate
k910
k910

and price it at $650

Vor 2 Monate
jwdickieson
jwdickieson

I disagree with it being a 4070 Ti; I believe it should have been a 4070, because the compute-core ("CUDA core") difference between the "4080 12 gig" and the 4080 16 gig is enough to create a 4070 Ti 12-gig version with around 8,600 compute cores
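
The core-count gap behind that argument is easy to verify; a minimal sketch using the announced core counts for the two 4080s (the midpoint SKU is purely illustrative):

# Where a hypothetical in-between SKU would land. Core counts are the
# announced figures for the two 4080s; the midpoint card is illustrative.
cuda_4080_12gb = 7680
cuda_4080_16gb = 9728

gap = cuda_4080_16gb - cuda_4080_12gb   # 2048 cores of headroom
midpoint = cuda_4080_12gb + gap // 2    # 8704, i.e. the "around 8,600" figure
print(f"gap: {gap} cores, midpoint SKU: ~{midpoint} cores")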

Vor 2 Monate
Lord Farquaa
Lord Farquaa

Thanks for the insight and monetary breakdown on how the video cards are manufactured and their TRUE cost!

Vor 2 Monate
Suomynona
Suomynona

I have already committed myself to AMD this year. I am not liking Intel's or Nvidia's decisions this cycle, so I'm not buying them. AMD is making all the right choices.

Vor 2 Monate
Thomsen
Thomsen

If even AMD has a lot of cards to sell, imagine Nvidia

Vor 2 Monate
Hotrob
Hotrob

I wonder if the honest ray-tracing uplift isn't as much as they're saying. If the 4070 were on par with the 3080 Ti (or even the 3080) on raster but had a big bump in ray tracing, the price premium would be worth it. I'm a bit leery that they're afraid to call it an x70-class card

Vor 2 Monate
Charles Horseman
Charles Horseman

We'll get to see how street pricing goes this round. I hope the $1500 GPUs actually put cash in their slush fund.

Vor 2 Monate
