Specifications and Design
We dive into the liquid-cooled version of Vega and find it performs better than expected.
Just a couple of short weeks ago we looked at the air-cooled Radeon Vega Frontier Edition 16GB graphics card. The results were interesting: gaming performance fell somewhere between the GTX 1070 and the GTX 1080 from NVIDIA’s current generation of GeForce products. That is below many of the estimates from players in the market, including media, fans, and enthusiasts. But before we get to the RX Vega product family targeted at gamers, AMD has another data point for us to look at with a water-cooled version of the Vega Frontier Edition. At a $1500 MSRP, which we shelled out ourselves, we are very interested to see how it changes the face of performance for the Vega GPU and architecture.
Let’s start with a look at the specifications of this version of the Vega Frontier Edition, which will be…familiar.
| | Vega Frontier Edition (Liquid) | Vega Frontier Edition | Titan Xp | GTX 1080 Ti | Titan X (Pascal) | GTX 1080 | TITAN X | GTX 980 | R9 Fury X |
|---|---|---|---|---|---|---|---|---|---|
| GPU | Vega | Vega | GP102 | GP102 | GP102 | GP104 | GM200 | GM204 | Fiji XT |
| GPU Cores | 4096 | 4096 | 3840 | 3584 | 3584 | 2560 | 3072 | 2048 | 4096 |
| Base Clock | 1382 MHz | 1382 MHz | 1480 MHz | 1480 MHz | 1417 MHz | 1607 MHz | 1000 MHz | 1126 MHz | 1050 MHz |
| Boost Clock | 1600 MHz | 1600 MHz | 1582 MHz | 1582 MHz | 1480 MHz | 1733 MHz | 1089 MHz | 1216 MHz | – |
| Texture Units | ? | ? | 224 | 224 | 224 | 160 | 192 | 128 | 256 |
| ROP Units | 64 | 64 | 96 | 88 | 96 | 64 | 96 | 64 | 64 |
| Memory | 16GB | 16GB | 12GB | 11GB | 12GB | 8GB | 12GB | 4GB | 4GB |
| Memory Clock | 1890 MHz | 1890 MHz | 11400 MHz | 11000 MHz | 10000 MHz | 10000 MHz | 7000 MHz | 7000 MHz | 1000 MHz |
| Memory Interface | 2048-bit HBM2 | 2048-bit HBM2 | 384-bit G5X | 352-bit | 384-bit G5X | 256-bit G5X | 384-bit | 256-bit | 4096-bit HBM |
| Memory Bandwidth | 483 GB/s | 483 GB/s | 547.7 GB/s | 484 GB/s | 480 GB/s | 320 GB/s | 336 GB/s | 224 GB/s | 512 GB/s |
| TDP | 300 watts / 350 watts | 300 watts | 250 watts | 250 watts | 250 watts | 180 watts | 250 watts | 165 watts | 275 watts |
| Peak Compute | 13.1 TFLOPS | 13.1 TFLOPS | 12.0 TFLOPS | 10.6 TFLOPS | 10.1 TFLOPS | 8.2 TFLOPS | 6.14 TFLOPS | 4.61 TFLOPS | 8.60 TFLOPS |
| Transistor Count | ? | ? | 12.0B | 12.0B | 12.0B | 7.2B | 8.0B | 5.2B | 8.9B |
| Process Tech | 14nm | 14nm | 16nm | 16nm | 16nm | 16nm | 28nm | 28nm | 28nm |
| MSRP (current) | $1499 | $999 | $1200 | $699 | $1200 | $599 | $999 | $499 | $649 |
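As a sanity check, the headline peak-compute and bandwidth figures in the table fall straight out of the raw specs: FP32 throughput is cores × 2 FLOPs (one fused multiply-add per clock) × boost clock, and bandwidth is bus width × effective data rate. A minimal sketch (the helper names are ours, not AMD’s):

```python
def peak_tflops(cores: int, boost_mhz: float) -> float:
    """Peak single-precision throughput in TFLOPS: 2 FLOPs/core/clock via FMA."""
    return cores * 2 * boost_mhz * 1e6 / 1e12

def bandwidth_gbs(bus_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times data rate."""
    return bus_bits / 8 * data_rate_gbps

# Vega Frontier Edition: 4096 cores at 1600 MHz boost,
# 2048-bit HBM2 at 1890 MHz effective (1.89 Gbps per pin)
print(round(peak_tflops(4096, 1600), 1))   # 13.1 TFLOPS
print(round(bandwidth_gbs(2048, 1.89)))    # ~484 GB/s (AMD quotes 483)
```

The same arithmetic reproduces the NVIDIA columns, e.g. the GTX 1080’s 320 GB/s from a 256-bit bus at 10 Gbps.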
The base specs remain unchanged: AMD lists the same memory frequency and even the same GPU clock rates across both models. In practice, though, the liquid-cooled version runs at higher sustained clocks and overclocks a bit more easily as well (more details later). What does change with the liquid-cooled version is a usable BIOS switch on top of the card that lets you move between two distinct power draw states: 300 watts and 350 watts.
First, it’s worth noting this is a change from the 375-watt TDP this card was listed at during the launch and announcement. AMD was touting a 300-watt and a 375-watt version of Frontier Edition, but it appears the company backed off a bit, erring on the side of caution to avoid breaking any of the specifications of PCI Express (board slot or auxiliary connectors). Even more conservatively, AMD chose to make 300 watts, rather than the more aggressive 350 watts, the default state of the switch on the Vega FE Liquid card. AMD claims this avoids problems with lower quality power supplies that may struggle with slightly over 150 watts of power draw (and the resulting current) on the 8-pin power connections. I would argue that any system fitted with a $1500 graphics card can and should be prepared to provide the necessary power, but for the professional market, AMD leans toward caution. (It’s worth pointing out that the RX 480 power issues that may have prompted this internal decision making were more problematic because they impacted power delivery through the motherboard slot, while the 6- and 8-pin connectors are generally much safer to push past their ratings.)
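The original 375-watt figure is exactly what the PCI Express spec allows for this connector layout: 75 W from the x16 slot plus 150 W per 8-pin connector. A quick back-of-the-envelope sketch of that budget (spec figures; the helper name is ours):

```python
# In-spec power limits from the PCIe CEM specification
PCIE_SLOT_W = 75    # x16 slot delivery
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector

def board_power_ceiling(num_8pin: int) -> int:
    """Maximum in-spec board power for a card with the given 8-pin count."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W

ceiling = board_power_ceiling(2)
print(ceiling)           # 375 W for two 8-pin connectors
print(ceiling - 350)     # the 350 W mode keeps 25 W of spec headroom
```

Running the card right at 375 W would leave zero margin anywhere in the delivery path, which is presumably why AMD pulled the high state back to 350 W.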
Even without clock speed changes, the move to water cooling should result in better and more consistent performance by removing the overheating concerns that surrounded our first Radeon Vega Frontier Edition review. But let’s dive into the card itself and see how the design process created a unique liquid cooled solution.
The Radeon Vega Frontier Edition Liquid Cooled Card
The liquid cooled card shares dimensions with the air-cooled card, though without an integrated blower fan, the likeness stops there. The color scheme is reversed, with a yellow brushed metal body and blue accents and illumination. The top Radeon logo and the blue R cube on the end light up in blue, and as I stated on Twitter, I really hate blue LEDs. They are just uncomfortable to my eyes and I know I’m not the only one. Otherwise, the design of this card is just as sexy as the first Vega FE we looked at.
It still requires a pair of 8-pin power connections to run and the liquid cooling tubing and power to the radiator comes from the front of the card. There is plenty of length to the tubing and cabling, allowing for installation in nearly any chassis configuration.
On the back is a full cover back plate with an exposed area for the GPU tach, a set of LEDs that defaults to blue and indicates the GPU workload of the card. The blue on these is particularly piercing…
Internally we have a unique liquid cooler design. On the left is the pump and block covering the GPU and HBM2 stacks, with a blue block covering the power delivery on the card as well. Liquid flows in from the top into the GPU block, out through the GPU block outlet on the upper right, down through the VRM cooling, around to the far left, and then back out to the radiator.
This unit on the right is part of the diaphragm pump design that makes this card interesting. Think of it as a flexible reservoir with a high-tension spring that creates pressure back into the system. A diaphragm pump works with one-way check valves and a reciprocating diaphragm to create alternating areas of pressure and vacuum. The T-split you see at the top of the primary pump allows the liquid stored in the overflow area to keep the cooler working reliably through the natural evaporation of fluid over time. This is very similar to the kinds of pumps used in fish tanks and artificial hearts, and likely a more expensive solution than what was found on the Radeon Pro Duo or Fury X, as an attempt to correct the deficiencies (noise, reliability) of those older generations.
This kind of cooler design was only made possible by the extended PCB of the Vega Frontier Edition, either by design or as a happy accident. The noise made by this pump is very different than traditional AIO coolers we have used in the office, more of a “gurgle” than any kind of “whine”. It’s more muted than the Radeon Pro Duo or Fury X, that’s for certain.
After reading this review of the WC Vega FE, I am fairly sure AMD is hitting a wall with their GPUs. The fact that AMD is slapping an AIO water cooler on their “pro-sumer” card tells me their ability to provide efficient cards is out the window. Granted, the air cooled version is a bit of a *fart in the wind* situation.
After reading many reviews of the Vega FE (both air and WC), AMD’s drivers are once again the main culprit behind their fairly poor GPU performance in games. IMO, the review needed professional/computational benchmarks. Other than that, seeing AMD’s new high end GPU falling below a 1080 is a bit concerning. I have lost a lot of hope in AMD’s RX Vega lineup of GPUs. But I’ll remain optimistic and ultimately hope AMD steps up to the plate with their upcoming gaming oriented GPUs. Good review as always. Thanks for posting the reviews Mr. Shrout and take care (same goes for the rest of the PCPer crew).
The only reason anyone would buy this or the air cooled model is a fanatical hatred of Nvidia. Even if you went crazy and bought the Titan Xp, you’d still be in for better performance and lower costs than the liquid cooled card. I’m not sure what AMD’s game is here but it doesn’t look good. The release drivers had better produce some impressive gains or they’ve wasted 2+ years of engineering time with this “new” architecture. I will continue to recommend the Polaris parts to anyone with a limited budget that asks. As things stand now I can’t do the same for Vega.
AMD
Another
Major
Disappointment.
Yeah, Ryzen/EPYC have been soooo disappointing
Something does not seem right with this launch; I’m going to reserve my judgment until the gaming cards launch.
Ryan, can your team run some High Bandwidth Cache test or is that feature disabled?
From Gamers Nexus:
Screaming “it’s not a gaming card!” from the front page of the internet doesn’t change the fact that, in actuality, AMD does intend for Vega: FE to sit wedged between gaming and production. This is something the company has confirmed with GamersNexus. It also does not change the fact that the card clearly exhibits driver inadequacies, and it is not on the press to “wait” to review an incomplete product – it’s on the company to ship a complete device which is representative of its marketing. We agree, and have published, that this is not a device best used for gaming. We’re not saying it is a gaming device, and have shown some production workloads where Vega: FE does well. What we’re saying is that, having now spent a week on various aspects of this device, giving AMD a free pass by repeatedly copying and pasting “it’s not a gaming card” into comment boxes isn’t going to encourage the company to fix its inadequacies. The drivers are inadequate to an extent that makes it difficult to tell whether the hardware is any good.
Maybe some commercial bank funding from GamersNexus to AMD, or asking the GamersNexus readers to purchase AMD stock? Nvidia and JHH are seeing mad revenue growth in the data center and even the automotive markets, and AMD probably should focus on the Epyc and Radeon Pro WX/Radeon Instinct markets until AMD can design a gaming-only focused GPU micro-arch!
AMD’s Epyc server SKUs need AMD matching GPU accelerator/AI products with even more compute, and ROPs/TMUs be damned, those data center revenues and margins are where the growth is for Nvidia, and where AMD is also going to get the most revenue growth, from Epyc revenues even more than any Radeon Pro WX/Radeon instinct GPU or gaming only GPU sales over the next few years.
When those Epyc revenues arrive, then AMD can divert some funding over to RTG to design some gaming-only focused SKUs. And Navi is going to bring that very same Zen/Zeppelin-die economy of scale to AMD’s GPU development process and allow AMD to go back to providing yearly updates across all of its GPU lines and markets, mainstream to flagship, and even mobile GPUs, where AMD really lacks a presence. Zen/Vega APUs will help there also.
JHH over at Nvidia, sure has spent extra time talking up other markets, and it’s those GPUs for Data Center/Automotive where Nvidia is getting its largest Y/Y revenue growth increase figures from.
GamersNexus has just published (7/18) some new undervolting/etc. benchmarks for the Vega FE, and there are some better gaming and power usage figures to be had. But AMD does need to hire more software/driver engineers, and the Epyc revenue stream will provide some much needed funding for that and other R&D. AMD still needs to plow all its revenues back into the business, including forgoing any dividend payouts. Let the AMD shareholders earn via equity over the next few years; it’s reinvest like mad for AMD over the next few years.
With AI and other “Big Data”-type applications really taking off, coupled with nVidia withholding its newest architectures for gamers in favor of releasing datacenter-oriented devices with logically huge pricetags, it makes a lot of sense for AMD to position Vega for non-gaming use first (to me at least.) Unfortunately, that leads to lots of high-pitched squealing on the Internet … and not from the professional crowd.
Yep, AMD fanbois out in force for this article.
I’ve purchased AMD video cards for the last few generations and was waiting to see if Vega would leapfrog Pascal… and it looks like it won’t unless AMD really does some magic with their drivers.
I like reading about magic in fantasy books, but don’t believe in it in real life… so after the first Vega benchmarks came out, I decided to fork over $700 for a 1080 Ti.
I am now back with the green team… and am very satisfied with the card. I don’t have to create a custom resolution to support my Cinema 4K monitor, it just works, unlike with my AMD/ATI graphics card…
I’ve put my money where my mouth is and committed to a 1080 Ti. I really doubt that the highest end RX Vega gaming card will come close to 60fps gaming on a 4K monitor with decent quality settings.
I bet you that the bullshit AMD said about “RX Vega will be better compared to Vega Frontier in gaming” is just empty words. Nobody makes a different GPU for gaming versus compute. So the only chance RX Vega has of being better in gaming is to have more CUs, which I highly doubt.
Also, RX Vega is not yet on the market because HBM2 is very buggy and costs a lot, so they figured (and rightfully so) that it is better to launch the more expensive “compute” cards for professionals first, and then, when HBM2 production ramps up (which hasn’t happened yet), launch the same graphics card under a different name, with 8GB of VRAM, the same 300W TDP, and the same close-to-but-not-quite GTX 1080 performance.
If I were them I would sue Micron and whoever works on making HBM2, because they lost an awful lot of time, money, and market share. I would also fire all the engineers in the power-optimization department, because consuming 300W for performance comparable to a 180W GTX 1080 is just lame. GlobalFoundries might have something to do with that, but as we saw with Ryzen, the 14nm process from GF has the potential to be very energy efficient.
Read the review again. It needs 350W and a water cooling solution just to get close to a reference 1080 with a blower-type cooler. Unbelievable, right?
I haven’t seen anything from AMD to indicate that they’d be using a different GPU for RX Vega cards. Different memory, software and power delivery choices on the reference cards, though- sure.
omg!
Well, AMD really made a useless, earth-scorching, erhh… ‘thing’.
It’s still crystal clear that even the GTX 1070 is much better than ANY version of AMD’s Vega GPUs.
I wonder why PCPer left overclocked cards and the higher 11Gbps memory versions out of the test.
Also, because the AMD Vega tested was the LIQUID cooled, overclocked version, the GTX 1080 comparison should have been something like the ZOTAC GeForce® GTX 1080 AMP Extreme+ (11Gbps GDDR5X), and it’s clear that card beats the liquid cooled Vega.
And we can’t look only at 3DMark scores; we MUST check efficiency too, and there AMD Vega is the worst in many years, same as the AMD Fury X.
If a 165W GTX 1070 loses only by a little to the liquid cooled Vega monster, that tells you everything about AMD Vega’s old, lousy tech.
It’s crystal clear that with those numbers and specs, AMD Vega, in any of its variants, can’t and shouldn’t get a good review score; they are that lousy.
The score is average at best, not even close to ‘recommended’ or ‘good buy’.
2.5-3 stars.
AMD has made a lousy GPU, and a very expensive one at that.
I wonder what the heck AMD was thinking when it made, let alone released, this kind of inefficient, old-tech, and (for its power draw per FPS) frankly slow GPU?!
Yeah, its TDP is like something from the 1990s!
It’s clear its value will drop quickly, and its resale value even quicker… if someone really buys it.
I wouldn’t take that earth-eater into my PC even for free.
Shame on you, AMD. Why do you offer people this kind of… ‘thing’?!
> yea,its about 1990 years tdp!
Um, no, not even close. TDPs were never this high in 1990 because the processors in use then weren’t nearly as powerful.
You’re complaining about its cost? This isn’t the gaming card. Pro cards are expensive. AMD’s recent demonstration in Budapest indicates they’re going for a lower price point than nVidia (but performance might not be as good.)
Seeing as I typically run games on medium-high settings at 1080 (or 720), a card that performs close to a 1070 but is cheaper would be great for me. I concur the power consumption seems really high, though.
… that and GPUs didn’t exist in 1990. Vector processors, yes. 2D graphics cards? Yes. GPUs? … absolutely not
The number of fanboys attempting to give AMD a free pass on Vega’s performance is laughable. When Nvidia pulled the same crap with “Titan” and tried justifying charging $1000 for a GPU because it was a “pro” card, those same fanboys saw right through it. Call this chip what it is and stop sugar coating it: it’s AMD’s lame attempt at a Titan, charging a massive early adopter tax and hiding behind the guise of “pro” branding. The fact that it’s not branded as an RX card means absolutely nothing. If anything, AMD should be ashamed to be releasing a $1500 GPU with such lame drivers.
Drivers aren’t going to make up for the fact that this card consumes 2x as much power as a 1080, even if they do eventually close some of the gap. The only way RX Vega would perform substantially better than FE Vega would be to add more shaders, otherwise we’re talking <10% improvements from drivers in all likelihood. At the end of the day this card is too hot and too slow to compete with cards Nvidia has had out for well over a year.
I'm sick of paying Nvidia's bullshit prices, but if this is AMD's best shot I really have no choice.
Well, that was crap for gamers, but look where Nvidia is seeing the greatest revenue growth. Some folks used that Titan X semi-pro SKU (and saved thousands over Quadro prices) for its extra compute, the same way some will use the Radeon Pro FE for its extra compute in non-gaming workloads. Nvidia has the extra data center market revenues, as well as gaming market revenues, to afford a gaming-only focused line of GPUs with the ROP/TMU counts to really fling out the FPS at the cost of any compute that is unnecessary for gaming-only workloads.
If I were AMD I would not worry about gaming as much and would instead focus on the Vega/full professional data center GPU/AI focused SKUs to go along with the Epyc data center focused CPU SKUs. The data center market is where both AMD and Nvidia will see the greatest revenue growth potential for their GPUs and CPUs (AMD only). Business in the PC and gaming markets is relatively mature, without much growth potential; it’s just AMD and Nvidia competing for a share of a relatively fixed-size, stagnant market that also suffers wild demand swings. Gaming does not have enough revenue growth to support both AMD’s and Nvidia’s needs, and they both have to look toward the data center/AI/automotive/semi-custom console markets for any revenue growth.
Gamers are going to have to give up their illogical sense of entitlement, because the gaming-only GPU market is not enough for both AMD and Nvidia to survive on as businesses.
The DRIVERS are fairly up-to-date, so any optimizations that can be done apply to NEWER game titles. The only thing that could affect performance ACROSS THE BOARD is if there is some specific hardware change that is messing things up with the current state of drivers.
But… this isn’t like the RYZEN CPU issues where code-jumping to CCX or other issues can affect performance. VEGA is a slightly tuned (AFAIK) version of the prior GPU so not sure what we can expect.
Plus, if there was some “game changing” update to be had with how code is manipulated it should affect non-gaming applications too so it seems unlikely AMD would launch VEGA-FE with some crippling bug in the drivers.
*So, I think sitting just below the GTX1080 on AVERAGE is what we can expect, with the cards remaining HOT and having throttling issues in many instances which may cause more STUTTER on average than the NVidia cards.
With the cost of HBM2 memory AMD’s in a tough spot and frankly will probably resort to cherry-picked benchmarks and “magic” promises like the High Bandwidth Memory Controller etc.
I was rooting for AMD, but unfortunately they seem to have severe engineering difficulties getting the HEAT LOW and the FREQUENCY HIGH (I’m aware that frequency vs pipeline is a separate issue but the fact remains the chips put out too much heat and thus have cooling issues).
**Raja claimed that the RX-480 GPU was at lower frequencies because the GPU was initially meant for mobile, and that was why scaling up the frequency was problematic. Okay. Fine. So I had HIGH HOPES that they would get the heat/frequency issue sorted out when working with the fabrication engineers. But 350W to get similar results to a GTX1080? Almost 2x the power? What happened?
Raja Koduri said in Reddit AMA that his software team “wishes [it] were true” that Vega was a revision of Polaris. He said it’s a new GPU architecture and the first able to utilize Infinity Fabric. I don’t know all of the ways it’s different, though. I’ve always taken more of an interest in CPUs than GPUs.
Blue LEDs (and bluish lights in general) cause rhodopsin-mediated photoreversal, which physically damages your eyes. This effect is wavelength and spectral power distribution dependent, NOT intensity dependent.
I did not know that, interesting info.