Friday, September 30th 2022

ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750

The Intel Arc 7-series performance-segment graphics cards announced earlier this week are all priced within $60 of each other. The series begins with the Arc A750 at $289. $40 more gets you the Arc A770 Limited Edition 8 GB, at $329. The top-of-the-line Arc A770 Limited Edition 16 GB is priced just $20 higher, at $349. This puts the Intel flagship at least $30 below the cheapest NVIDIA GeForce RTX 3060 available in the market right now, which can be had for $380. The dark horse here is the AMD Radeon RX 6650 XT, which is going for as low as $320.

Intel extensively compared the A770 to the RTX 3060 in its marketing materials, focusing on how its ray tracing performance is superior even to that of NVIDIA RTX in this segment, and on how XeSS is technologically on par with second-generation upscaling techs such as FSR 2.0 and DLSS 2. If Intel's performance claims hold, the A770 has the potential to beat both the RTX 3060 and the RX 6650 XT in this segment. The Arc A750, A770 8 GB, and A770 16 GB go on sale from October 12. Stay tuned for our reviews.
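
For a quick look at the spread, here is a minimal, illustrative Python sketch using only the prices and memory sizes quoted above (the RTX 3060 and RX 6650 XT figures are the street prices cited in this post, not MSRPs) to work out each card's premium over the A750 and its rough dollars per gigabyte of VRAM.

# Illustrative only: prices and VRAM sizes are the figures quoted in this post.
cards = {
    "Arc A750 8 GB":          (289, 8),
    "Arc A770 LE 8 GB":       (329, 8),
    "Arc A770 LE 16 GB":      (349, 16),
    "GeForce RTX 3060 12 GB": (380, 12),  # cheapest street price cited above
    "Radeon RX 6650 XT 8 GB": (320, 8),   # low street price cited above
}

baseline = cards["Arc A750 8 GB"][0]
for name, (price, vram) in cards.items():
    delta = price - baseline
    print(f"{name:24s} ${price}  (+${delta} vs. A750)  ~${price / vram:.1f} per GB of VRAM")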

41 Comments on ICYMI: 16GB Arc A770 Priced Just $20 Higher than 8GB; A770 Starts Just $40 Higher Than A750

#1
Solaris17
Super Dainty Moderator
sick. also is it just me or do these cards like actually look nice? like idk just gets me. I like my dual fan minimalism.
#2
btarunr
Editor & Senior Moderator
Solaris17sick. also is it just me or do these cards like actually look nice? like idk just gets me. I like my dual fan minimalism.
They look just as good IRL.
#3
nguyen
ARC looks to be very price competitive; the A750 is a nice mainstream GPU in today's market
#4
The Quim Reaper
I really don't see the point of having both an 8GB & 16GB 770 card with only a $20 price difference...very odd.
#5
Keullo-e
S.T.A.R.S.
The Quim ReaperI really don't see the point of having both an 8GB & 16GB 770 card with only a $20 price difference...very odd.
There's been this "pay a little more for a little better card" thing for as long as I can remember. On a tight budget, that $20 may be worth saving.
#6
usiname
The Quim ReaperI really don't see the point of having both an 8GB & 16GB 770 card with only a $20 price difference...very odd.
A limited quantity of 1,000 16GB units to be sent out for reviews and raise the hype further, while the 8GB will be available on the market
#7
ZetZet
The Quim ReaperI really don't see the point of having both an 8GB & 16GB 770 card with only a $20 price difference...very odd.
16GB is probably barely better, if at all. Considering the RTX 3070 is fine with 8GB of VRAM, I don't see how a card in the 3060 performance tier would need more. It's not a 4K card anyway.
#8
Hyderz
Solaris17sick. also is it just me or do these cards like actually look nice? like idk just gets me. I like my dual fan minimalism.
unlike those lumbering behemoths team green has
#9
ZoneDymo
ZetZet16GB is probably barely better, if at all. Considering the RTX 3070 is fine with 8GB of VRAM, I don't see how a card in the 3060 performance tier would need more. It's not a 4K card anyway.
Future texture requirements and RT, perhaps
#10
john_
Performance per dollar against the 3060, but what about against a Radeon card?
Also, the prices are too close between those models. Many look at the glass as half full, saying that the 16GB is only $20 more than the 8GB model. But what if the glass is in fact half empty and the 8GB model is only $20 cheaper?
Also, I was expecting the A750 to be priced much lower. I guess Intel can't go any lower. They were expecting better performance from those cards; they were probably expecting to sell the A770 for $400 or more, but they can't.
#11
dragontamer5788
16GB for $349?

This is incredible for any compute-oriented programmer.
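
As a rough illustration of why the extra VRAM appeals for compute work, here is a minimal back-of-the-envelope sketch in Python comparing how large a single float32 working set fits in 8, 12, and 16 GB; the 10% headroom for the driver/runtime is just an assumption for the example.

# Rough illustration only: how many float32 values fit in a given VRAM budget,
# leaving ~10% headroom for the driver/runtime. Figures are examples, not specs.
BYTES_PER_FLOAT32 = 4

def max_float32_elements(vram_gb: float, headroom: float = 0.9) -> int:
    """Approximate float32 element count that fits in vram_gb of VRAM."""
    usable_bytes = vram_gb * (1024 ** 3) * headroom
    return int(usable_bytes // BYTES_PER_FLOAT32)

for vram in (8, 12, 16):
    n = max_float32_elements(vram)
    print(f"{vram:>2} GB VRAM -> roughly {n / 1e9:.2f} billion float32 values")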
#12
GunShot
btarunrThey look just as good IRL.
Ha!

Alright, we ALL get it now!

*walk away all jelly*
#13
Tsukiyomi91
I think the A770 8GB model will be more than enough for gaming and some rendering. 16GB seems a little too much for a card with performance comparable to a 3060 IMO. But a $20 increase for double the VRAM is kinda alright.
#14
GunShot
Tsukiyomi91I think the A770 8GB model will be more than enough for gaming and some rendering. 16GB seems a little too much for a card with performance comparable to a 3060 IMO. But a $20 increase for double the VRAM is kinda alright.
Yeah, until you reach a title with a lot of eye candy, etc., and you'll quickly notice that 8GB (or lower) VRAM bottleneck (hell, even 12GB of VRAM) that next-gen titles will be pushing.
#15
Tsukiyomi91
GunShotYeah, until you reach a title with a lot of eye candy, etc., and you'll quickly notice that 8GB (or lower) VRAM bottleneck (hell, even 12GB of VRAM) that next-gen titles will be pushing.
Probably in another 2-3 years, assuming game studios don't optimize their games, the assets used in them, and the engines running in the background. Gonna be a little off-topic: if game studios use UE5 or make an engine that behaves like UE5, then maybe we won't need GPUs that have more than 8 or 12GB of VRAM and a minimum of 150GB of free space on a storage drive.
#16
GunShot
Tsukiyomi91Probably in another 2-3 years, assuming game studios don't optimize their games, the assets used in them, and the engines running in the background. Gonna be a little off-topic: if game studios use UE5 or make an engine that behaves like UE5, then maybe we won't need GPUs that have more than 8 or 12GB of VRAM and a minimum of 150GB of free space on a storage drive.
~2 to 3 years from now? Nah, those titles are already here... today, e.g. Godfall, FC6, etc.
#17
Vayra86
Almost interesting. Almost!
#18
wolf
Performance Enthusiast
Is it just me or is anyone else FAR more interested in the ARC A770 launch than the 4090?

Yeah, we get it, it's big, loud, hungry and fast, cool - but the A770 is such a fascinating product. It just seems so much more nuanced and interesting: features need to be explored, and perf across a massively varied suite will be a roller-coaster.

Tempted to pick one up just to play with it.
#19
Hyderz
Once they start selling I hope they get some traction, and hopefully that'll be enough for their next-gen GPUs to compete with team red n green in the upper GPU segments. I'm tempted to try their GPUs but I already have a 3090.
#20
Solid State Brain
I might be interested in the 16GB A770 version if it works well on Linux and actually renders reasonably fast in Blender, as Intel suggested a while back.
That's where the more VRAM, the better, and most other GPUs at a similar price only have 8GB. The non-Ti 3060, which should be slower, has 12GB.
#21
Crackong
So.
Judging from Xe core count vs. price,
I think the A380 should drop to $80?
#22
Vayra86
Living life on the edge... will I sidegrade from GTX 1080 to an A770 and probably break even on sale/purchase? Hmmmmmmmmm

I kinda know it'll probably not be a good idea, but still tempted :D
#23
DeathtoGnomes
Vayra86Living life on the edge... will I sidegrade from GTX 1080 to an A770 and probably break even on sale/purchase? Hmmmmmmmmm

I kinda know it'll probably not be a good idea, but still tempted :D
I would. FOR SCIENCE!
Solaris17sick. also is it just me or do these cards like actually look nice? like idk just gets me. I like my dual fan minimalism.
IDK if that's normal RGB or just Intel blue lighting, but yeah, it's not overdone to the point of stealing focus from everything else.
#24
Bwaze
If the A770 16 GB really is a limited-edition model, it won't be just $20 more expensive, no matter what the MSRP says.
#25
AnarchoPrimitiv
HyderzOnce they start selling I hope they get some traction, and hopefully that'll be enough for their next-gen GPUs to compete with team red n green in the upper GPU segments. I'm tempted to try their GPUs but I already have a 3090.
Based on history, I have a feeling that Intel will only be cutting into AMD's marketshare and not Nvidia's, and if this occurs, Intel's presence in the GPU market will do absolutely nothing to improve conditions for consumers.

In the past (as far back as 2008), even when AMD has offered GPUs that perform better and even at a lower price, everyone still buys Nvidia. Say what you want about AdoredTV, but a couple of years ago he did a great three- or four-part video that used market research, sales figures, etc. to demonstrate this very phenomenon, which he described as Nvidia's "mentia", or mindshare. Basically it goes like this: the vast majority of PC hardware consumers are not enthusiasts who compare specs and benchmarks for hours and days prior to making a purchase; just like consumers in any other market, they base their purchasing decisions on a lot of non-empirical, irrational factors. These consumers are therefore more influenced by the fact that they notice more people own Nvidia than AMD, and that Nvidia has more fans who are more vocal online; and despite it not being true since the 290X, those fans constantly repeat that AMD runs hot and has bad drivers (despite there being no real comprehensive empirical data to back up such a claim), and because they're not the type to do research, they just accept the accusations as fact. Now, in their mind, they've associated Nvidia with the winning side and therefore want to associate themselves with the winning side as well.

These consumers will not look at the benchmarks, see Intel performing better in their price tier, and buy an Intel videocard. They will only consider Intel when they see a bunch of other people willing to buy Intel, or when, psychologically speaking, they've come to associate Intel videocards with the "winning side". The same irrational considerations that prevent them from buying an AMD videocard will also prevent them from buying an Intel videocard, and therefore Intel's sales will predominantly come at AMD's expense, because the people most willing to take a chance on an Intel videocard are the ones willing to buy an AMD videocard... most likely because their willingness to do the research and look at benchmarks is probably what brought them to buy an AMD videocard, and would therefore also make them open to the idea of buying an Intel videocard.

Your diehard types who only buy Nvidia and will not even consider AMD (whether by habit or active choice), who I feel make up a large portion of Nvidia's marketshare and the consumer GPU market as a whole, are probably not going to consider Intel either; they are only hoping that Intel's entry into the market will allow them to continue to buy Nvidia, but at a lower price. If Intel stays in the market, then years down the road that may change, but for the first six months or year, or even Intel's first couple of generations, they'll predominantly take marketshare from AMD, and this will do absolutely nothing to improve the conditions of the GPU market for consumers. As long as Nvidia holds on to 80% marketshare (or probably anything over 50%), they're going to have the monopolistic power to keep their prices high and even maintain the trend of constantly increasing prices every generation (like how the 4080 12GB is basically a 4070, so now xx70-tier cards are priced around $800, and because AMD's shareholders will expect the same profits and profit margins, AMD will follow suit to some degree).

Bottom line, everyone hoping that Intel will correct the course of the GPU market is going to be disappointed because if it happens at all, it won't be happening any time in the immediate future.