AMD RX 480

Wozza63

Biblical Poster
http://www.engadget.com/2016/05/31/amds-radeon-rx480-gpu-is-vr-ready-for-just-199/

Priced at only $199, it performs on par with a GTX 980, and two of them combined offer better performance than a single GTX 1080 for around the cost of a single 1070.

Think AMD has really stepped up their game here, and two 480s could be a winning combo.

AMD also touted an 8-core, 16-thread Zen CPU with a 40% per-clock speed improvement over Excavator cores, which are already doing pretty well in the APUs.

[Image: Radeon RX 480 DX12 benchmark slide]


The announcement vid from LinusTechTips is entertaining too.

[video=youtube;eqCo0LTUhsE]https://www.youtube.com/watch?v=eqCo0LTUhsE[/video]
 

Everon

Enthusiast
AMD Polaris 10 reveal at Computex

Hi guys,

No comparison benchmarks with the new 1080 or 1070 yet, of course, so it will be interesting to see those when they come out.

Anandtech info: http://www.anandtech.com/show/10389/amd-teases-radeon-rx-480-launching-june-29th-for-199

WccfTech: http://wccftech.com/amd-radeon-rx-480-polaris-10-launch/

Arstechnica: http://arstechnica.co.uk/gadgets/2016/06/amd-rx-480-polaris-release-date-price-specs/

I have already decided and preordered a 1080 FTW from EVGA, but it's still interesting to see the new AMD price point in comparison to Nvidia's and the different market they are going for.

All the best, Jay.
 

Everon

Enthusiast
Ah, I did look for an existing thread but didn't see one lol, apologies.

Feel free to edit my post and delete the vid if you want.
 

SpyderTracks

We love you Ukraine
It's certainly an interesting card and the price point is unbelievable; it will totally invalidate any of their previous-gen cards as cheaper alternatives.

Performance-wise though, I'm intrigued that they only talked about Ashes as a benchmark, which we know is heavily optimised for AMD. I wonder how it will perform in titles like Tomb Raider or Crysis 3.
 

Wozza63

Biblical Poster
It's certainly an interesting card and the price point is unbelievable; it will totally invalidate any of their previous-gen cards as cheaper alternatives.

Performance-wise though, I'm intrigued that they only talked about Ashes as a benchmark, which we know is heavily optimised for AMD. I wonder how it will perform in titles like Tomb Raider or Crysis 3.

That's only because Nvidia haven't added asynchronous compute support with DX12.
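
(For context: "async compute" in DX12 terms just means the game feeds compute work to the GPU on a separate compute-type queue so it can overlap with the graphics work. A rough C++ sketch of the API side, purely illustrative and assuming an already-initialised ID3D12Device* — not code from Ashes or any real game:)

[code]
// Illustrative sketch: async compute in D3D12 means creating a second
// command queue of COMPUTE type alongside the usual DIRECT (graphics)
// queue. GPUs whose schedulers support it (e.g. GCN) can execute work
// from both queues concurrently; others effectively serialise it.
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

ComPtr<ID3D12CommandQueue> MakeQueue(ID3D12Device* device,
                                     D3D12_COMMAND_LIST_TYPE type) {
    D3D12_COMMAND_QUEUE_DESC desc = {};
    desc.Type = type;  // DIRECT = graphics, COMPUTE = compute-only queue
    ComPtr<ID3D12CommandQueue> queue;
    device->CreateCommandQueue(&desc, IID_PPV_ARGS(&queue));
    return queue;
}

// Draw calls go on the direct queue, compute shaders (lighting, particles,
// post-processing) on the compute queue, with fences to synchronise:
// auto gfxQueue     = MakeQueue(device, D3D12_COMMAND_LIST_TYPE_DIRECT);
// auto computeQueue = MakeQueue(device, D3D12_COMMAND_LIST_TYPE_COMPUTE);
[/code]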
 

Wozza63

Biblical Poster
Agreed, that's definitely something AMD have got right. Will be interesting. Am I right in thinking the embargo on these is lifted on 29th June?

That's the release date; not sure about the review embargo, which usually lifts a week or two earlier. Seems like LinusTechTips will probably get exclusive first dibs as well, like they did earlier today. As much as I like most LTT videos, in my opinion being exclusively given permission to review it and release their video early would make for a flawed review.
 

SpyderTracks

We love you Ukraine
That's the release date; not sure about the review embargo, which usually lifts a week or two earlier. Seems like LinusTechTips will probably get exclusive first dibs as well, like they did earlier today. As much as I like most LTT videos, in my opinion being exclusively given permission to review it and release their video early would make for a flawed review.

Ok, will keep an eye out. Can you keep us updated? I take it you'll be following this closely :)
 

SmokeDarKnight

Author Level
Yeah, I got the feeling that JayzTwoCents was a little biased towards the Nvidia card he was gifted by Nvidia. There was a bit of a comment storm after some users started questioning his review.
 

Wozza63

Biblical Poster
Ok, will keep an eye out. Can you keep us updated? I take it you'll be following this closely :)

I generally see it on Hexus/KitGuru or one of the tech channels I follow on YouTube. I didn't even know about it until the LinusTechTips vid early this morning.

Yeah i got the feeling that Jayztwocents was a little biased towards the Nvidia card he was gifted by nvidia. There was a bit of a comments storm after some users started questioning his review.

I've felt that way about LTT and Nvidia for a while, not just this release, though it seems like they might even have been swayed the other way now. None of their videos have had AMD cards to benchmark against, or if they do it's a token one or two compared to a dozen different Nvidia cards, although they really liked the Fury Nano.
 

SmokeDarKnight

Author Level
I guess it's going to be hard to get valuable reviews until it gets into the hands of the users.

I mean, LTT and J2C aren't going to want to say ANYTHING bad about anyone giving them free merchandise to test, as they run the risk of losing future collaborations etc. From a business point of view I guess they have to work that way, as it's what brings in the crowds and pays their bills.

TotalBiscuit sort of discussed this from the software side too. Since he flamed a few game and software developers, he seems to have been blacklisted from developer review copies.
 

Wozza63

Biblical Poster
I guess it's going to be hard to get valuable reviews until it gets into the hands of the users.

I mean, LTT and J2C aren't going to want to say ANYTHING bad about anyone giving them free merchandise to test, as they run the risk of losing future collaborations etc. From a business point of view I guess they have to work that way, as it's what brings in the crowds and pays their bills.

TotalBiscuit sort of discussed this from the software side too. Since he flamed a few game and software developers, he seems to have been blacklisted from developer review copies.

It's alright for TB; he has a tonne of subscribers and doesn't have any employees. LTT has to pay, I think, about a dozen employees from the YouTube channel, Vessel etc., so I imagine it can be a bit of a struggle to keep a profit. I don't really understand why they have so many people, if I'm honest.
 

Oussebon

Multiverse Poster
That's only because Nvidia haven't added asynchronous compute support with DX12.
It's worth noting that some AMD titles seem to be making a big play on that at the moment in order to produce big numbers for AMD relative to Nvidia, while skipping out on other elements of DX12.

Take WH:TW. The R9 390 loses badly to the GTX 970 in DX11, while beating it comfortably in DX12 (presumably due to async compute). But in the WH:TW DX12 builds we've seen so far, SLI isn't really supported while Crossfire is, despite CA saying that one of their three major assets for DX12 is improved multi-GPU support. DX12 is also supposed to make better use of the CPU, and again we're just not seeing much of that.
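
(Aside, for anyone wondering what "better use of the CPU" means mechanically: DX12 lets an engine record command lists on several threads at once, instead of funnelling everything through one driver thread as in DX11. A rough illustrative C++ sketch, assuming an already-created device and queue — not code from CA's engine or anyone else's:)

[code]
// Illustrative sketch of DX12 CPU scaling: each worker thread gets its
// own command allocator and command list, records draw calls in
// parallel, and everything is then submitted together on one queue.
#include <d3d12.h>
#include <wrl/client.h>
#include <thread>
#include <vector>
using Microsoft::WRL::ComPtr;

void RecordAndSubmit(ID3D12Device* device, ID3D12CommandQueue* queue,
                     UINT threadCount) {
    std::vector<ComPtr<ID3D12CommandAllocator>> allocs(threadCount);
    std::vector<ComPtr<ID3D12GraphicsCommandList>> lists(threadCount);
    std::vector<std::thread> workers;

    for (UINT i = 0; i < threadCount; ++i) {
        device->CreateCommandAllocator(D3D12_COMMAND_LIST_TYPE_DIRECT,
                                       IID_PPV_ARGS(&allocs[i]));
        device->CreateCommandList(0, D3D12_COMMAND_LIST_TYPE_DIRECT,
                                  allocs[i].Get(), nullptr,
                                  IID_PPV_ARGS(&lists[i]));
        workers.emplace_back([&lists, i] {
            // ... each thread records its own slice of the frame here ...
            lists[i]->Close();
        });
    }
    for (auto& w : workers) w.join();

    // One submission for all the lists recorded in parallel.
    std::vector<ID3D12CommandList*> raw;
    for (auto& l : lists) raw.push_back(l.Get());
    queue->ExecuteCommandLists(static_cast<UINT>(raw.size()), raw.data());
}
[/code]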

Studios making use of hardware that AMD had the good sense to invest in is great, and they deserve their wins for that, but skimping on features that DX12 is supposed to bring to the rest of us is kinda annoying.

And Ashes has separate DX12 benches for the GPU and the CPU, and has multiple levels of benchmarks.

The performance gap between the 1070 and the Fury X is wiped out the more that happens on the screen:
http://hothardware.com/reviews/nvidia-geforce-gtx-1070-review-high-performance-lower-pricing?page=5
But a lot of websites just publish one Ashes benchmark with no explanation whatsoever of which bench they used (which isn't AMD's or Ashes' fault; it's the given website's for being stupid).

Also, the benchmark for WH:TW is DX12-only and is being released to select reviewers only. Unlike every other TW title since Shogun II, there's no built-in benchmark tool. I have somewhat cynical feelings about that. Games Workshop appear to have had a hand in some of the more unsavoury elements of that game, but I won't be laying this one at their door. With regards to what the benchmark can tell us about CPU performance increases with DX12: http://www.overclock3d.net/reviews/gpu_displays/total_war_warhammer_pc_performance_review/8

So I'll wait for a full range of benchmarks before making my mind up on Polaris based on how it behaves in these kinds of titles.

It does present a worrying polarisation of games though, particularly if the Nvidia titles we see later on behave in a similar way. This is more of an issue than whether we wanted Geralt's or Lara's hair to be fluffy.

Edit: And we do see Pascal cards getting benefit from Async, unlike Maxwell: http://www.computerbase.de/2016-05/geforce-gtx-1080-test/11/
So it's not actually correct to say they haven't added it, though it is correct to say that the async features they have added won't net them the same gains AMD gets in Ashes (e.g. +2 fps vs +5).
Though I don't speak enough German to tell you which particular Ashes bench was used.

Since Polaris Tech Day is embargoed until 29th June I expect we might not see reviews before then
http://videocardz.com/60373/amd-polaris-tech-day-nda-ends-on-june-29th
 

Oussebon

Multiverse Poster
Heh, I hadn't noticed before, but in the GTX 1080 vs RX 480 slide + video you can see the GTX 1080 is being run with higher settings: https://youtu.be/ZwlQvjwYFEM?t=2185
There's more detail on the ground etc. on the one on the right.

That's not to undermine the apparently huge value of a single RX 480 in the middle market, but it is a touch cheeky :)
 

Wozza63

Biblical Poster
Heh, I hadn't noticed before, but in the GTX 1080 vs RX 480 slide + video you can see the GTX 1080 is being run with higher settings: https://youtu.be/ZwlQvjwYFEM?t=2185
There's more detail on the ground etc. on the one on the right.

That's not to undermine the apparently huge value of a single RX 480 in the middle market, but it is a touch cheeky :)

Hmm, that's interesting and certainly noticeable. Although he did also say that the AMD cards averaged 52% usage while the 1080 averaged around 98%, so it goes both ways. It confused me as to why they would not be utilising their full power.
 

Oussebon

Multiverse Poster
Crossfire will be another reason for the 52% utilisation; that might improve with better drivers.

Either they wanted to show ~50% utilisation so nerfed the settings, or they can't actually push better settings because that's the best CF RX 480s can do in Ashes (yet, at any rate).

We don't know what the 1080's FPS and/or utilisation would be at the same settings as the RX 480 CF.

The more I think about it, the less it swings both ways and the cheaper a stunt it is, tbh. It's just a bit too blatant.

If they wanted to make a point about FPS vs £, they could have just shoved an R9 390 or a GTX 1070 in against the 1080 with a bar chart showing FPS vs $. But I guess everyone else has already done that.
 

NilSatis

Bright Spark
Crossfire will be another reason for the 52% utilisation; that might improve with better drivers.

Either they wanted to show ~50% utilisation so nerfed the settings, or they can't actually push better settings because that's the best CF RX 480s can do in Ashes (yet, at any rate).

We don't know what the 1080's FPS and/or utilisation would be at the same settings as the RX 480 CF.

The more I think about it, the less it swings both ways and the cheaper a stunt it is, tbh. It's just a bit too blatant.

If they wanted to make a point about FPS vs £, they could have just shoved an R9 390 or a GTX 1070 in against the 1080 with a bar chart showing FPS vs $. But I guess everyone else has already done that.

AMD could have had their absolutely useless and quite frankly stupid "power efficiency" feature on, which has been implemented since Crimson (although now there is at least a toggle for it). They claim it simply reserves power when the card is not being pushed hard (i.e. for older games where the card isn't stressed, and for newer ones to conserve power), but it gimps performance in every scenario you can try it on, full clocks or not, 50% usage or not. The best part about this toggle is... it's set to "on" by default in any new driver you download. If you are a novice user or simply aren't aware of it (or choose not to change it for benchmarks and reviews... no names mentioned), performance in most games is ruined and it isn't obvious why. Whoever decided this should be a default is a bit deluded. I would also add that in most benchmarks and reviews I have seen, sites make no mention of it. My own testing suggests at least some sites have left this setting on (judging from the figures they tout in 300-series card benchmarks), and it will definitely alter performance figures. I would normally be sure AMD knows this themselves, but it makes other people's benchmarks interesting.

A little surprised the R9 390 loses out to the 970 so badly in Total War in DX11. In nearly every game now the 390 is ahead of the 970 in DirectX 11 too on the latest beta driver; the only limiting factors are normally OpenGL titles (Doom), where Nvidia comes out on top, and poor (as in late...) initial driver support from AMD, which is nothing new. Total War games were always CPU-limited, so this is an odd result for AMD, but maybe it again shows their heavier driver overheads until they release new drivers optimised for new games. They are doing great with DX12 as you mentioned (and we all know why, tech-wise), but as we have seen in the past, many game devs use older technology to release games on older engines etc. and quite frankly make more money. We will see how well DX12 is used in major titles this year. Hopefully a lot. Just Cause 3 and a couple of others promised DX12 support as a future patch, but it hasn't happened yet. At the moment I'm wondering why.

For the 480, I am reserving judgement until it comes out. As an owner of a 390, this card really does nothing for me at 1080p/1440p mid settings, which is where it is aimed, but for others who haven't got this sort of GPU it sounds almost too good to be true. Which is where I am with it at the moment. We will see. It is also a bit silly of them to use two 480s as the comparison, as every gamer knows SLI is worse than ever at the moment, purely due to lazy developers (I should say greedy publishers) not using it or even including support in some of the latest games. AMD is also notoriously slow with Crossfire profiles, although they arguably work better scaling-wise than Nvidia cards when paired up. For people on a budget wanting to stay at 1080p with absolute max settings, however, it looks fantastic value at the moment.
Looking at the video, the difference in settings is more than barely apparent; it's glaringly obvious. I'm not sure the FPS cost of bumping those settings up to match was worth the cheap look of this method of demonstration. Bit silly, AMD. The simple fact is that if they hadn't done this, the end result would not have been above the 1080. I can see why they did it, but it's still a bit silly.
 

Wozza63

Biblical Poster
AMD could have had their absolutely useless and quite frankly stupid "power efficiency" feature on, which has been implemented since Crimson (although now there is at least a toggle for it). They claim it simply reserves power when the card is not being pushed hard (i.e. for older games where the card isn't stressed, and for newer ones to conserve power), but it gimps performance in every scenario you can try it on, full clocks or not, 50% usage or not. The best part about this toggle is... it's set to "on" by default in any new driver you download. If you are a novice user or simply aren't aware of it (or choose not to change it for benchmarks and reviews... no names mentioned), performance in most games is ruined and it isn't obvious why. Whoever decided this should be a default is a bit deluded. I would also add that in most benchmarks and reviews I have seen, sites make no mention of it. My own testing suggests at least some sites have left this setting on (judging from the figures they tout in 300-series card benchmarks), and it will definitely alter performance figures. I would normally be sure AMD knows this themselves, but it makes other people's benchmarks interesting.

A little surprised the R9 390 loses out to the 970 so badly in Total War in DX11. In nearly every game now the 390 is ahead of the 970 in DirectX 11 too on the latest beta driver; the only limiting factors are normally OpenGL titles (Doom), where Nvidia comes out on top, and poor (as in late...) initial driver support from AMD, which is nothing new. Total War games were always CPU-limited, so this is an odd result for AMD, but maybe it again shows their heavier driver overheads until they release new drivers optimised for new games. They are doing great with DX12 as you mentioned (and we all know why, tech-wise), but as we have seen in the past, many game devs use older technology to release games on older engines etc. and quite frankly make more money. We will see how well DX12 is used in major titles this year. Hopefully a lot. Just Cause 3 and a couple of others promised DX12 support as a future patch, but it hasn't happened yet. At the moment I'm wondering why.

For the 480, I am reserving judgement until it comes out. As an owner of a 390, this card really does nothing for me at 1080p/1440p mid settings, which is where it is aimed, but for others who haven't got this sort of GPU it sounds almost too good to be true. Which is where I am with it at the moment. We will see. It is also a bit silly of them to use two 480s as the comparison, as every gamer knows SLI is worse than ever at the moment, purely due to lazy developers (I should say greedy publishers) not using it or even including support in some of the latest games. AMD is also notoriously slow with Crossfire profiles, although they arguably work better scaling-wise than Nvidia cards when paired up. For people on a budget wanting to stay at 1080p with absolute max settings, however, it looks fantastic value at the moment.
Looking at the video, the difference in settings is more than barely apparent; it's glaringly obvious. I'm not sure the FPS cost of bumping those settings up to match was worth the cheap look of this method of demonstration. Bit silly, AMD. The simple fact is that if they hadn't done this, the end result would not have been above the 1080. I can see why they did it, but it's still a bit silly.

I don't think I have that option... :(

Guess I'll have to do a driver update.

Also, you say SLI/Crossfire is becoming irrelevant, but I'd say the opposite. DX12 and Vulkan both bring better multi-GPU compatibility and the ability to pool resources and processing power regardless of card design, and regardless of drivers beyond the driver exposing the functionality in the first place, which I'd guess will be implemented relatively quickly by all three sides (inc. Intel).
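
(For illustration: "explicit multi-GPU" in DX12 means the game itself enumerates every adapter and can create a device on each, even across vendors, rather than relying on a driver-side SLI/Crossfire profile. A rough C++ sketch of just the enumeration step, my own illustration rather than any real engine's code:)

[code]
// Illustrative sketch of DX12 explicit multi-adapter: enumerate all GPUs
// via DXGI and create a device per adapter. Which work goes to which GPU,
// and how the results are combined, is then entirely up to the game
// rather than a driver profile.
#include <d3d12.h>
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>
using Microsoft::WRL::ComPtr;

std::vector<ComPtr<ID3D12Device>> CreateAllDevices() {
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<ID3D12Device>> devices;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0;
         factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        ComPtr<ID3D12Device> device;
        // Succeeds for any DX12-capable GPU: AMD, Nvidia or Intel alike.
        if (SUCCEEDED(D3D12CreateDevice(adapter.Get(),
                                        D3D_FEATURE_LEVEL_11_0,
                                        IID_PPV_ARGS(&device))))
            devices.push_back(device);
    }
    return devices;
}
[/code]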
 

Oussebon

Multiverse Poster
they arguably work better scaling-wise than Nvidia cards when paired up.
Thought it was the other way around, not that I paid it much attention.

http://www.techspot.com/review/1033-gtx-980-ti-sli-r9-fury-x-crossfire/page7.html
And one of those games is Attila, where SLI support is pretty bad. I tried it once and it was actually okay, I saw meaningful gains, but everyone else says it's truly awful or in some cases doesn't work at all. It's terrible to non-existent in WH:TW, but Crossfire is fine, or else those with an R9 295X2 would be pretty mad at AMD and CA.

Fallout 4, for example, appears to hate Crossfire. It also doesn't like SLI, but it hates Crossfire more. http://www.gamersnexus.net/guides/2285-amd-r9-390-crossfire-vs-sli-gtx-970-benchmark
But in the other results it seems to lean perhaps a bit towards Nvidia. It's a Bethesda game though, using a similar engine to Skyrim I think, and Skyrim doesn't like SLI much.

Hard to think what a fair set of games to test on would be really. *headscratch*

Of course, there's this article that concludes quite early on that Crossfire is the winner, having benched mostly AMD titles:
http://wccftech.com/multi-gpu-nvidia-sli-amd-crossfire-performance-value-comparison/

Cross-vendor setups (an AMD card paired with an Nvidia one) are what's particularly interesting, and it will be good to see how SLI vs CF vs mixed CF+SLI compares when more DX12 titles come out. For now the benches show some nice benefits, with slightly bigger gains with the AMD card as the primary adapter, but that's probably because they were benched in Ashes, which regardless of async is an AMD title, and it's only one game. :)
 