AMD Vega MegaThread! FAQ and Resources - page 5

  1. Find me one game that uses a GPU-accelerated feature where the developer did not also engage in a marketing effort with AMD or Nvidia. TressFX was first introduced with the Tomb Raider reboot. If Crystal Dynamics had not engaged in any marketing effort with AMD, would they ever have used TressFX? AMD most likely pushed CD to use TressFX (which was still not open source back then) because AMD knew that, without being pushed, game developers were not going to implement it in their games. After all, it was none other than AMD who said game developers have no interest in GPU-accelerated physics features:

    http://aphnetworks.com/news/2011/03/25/amd-game-developers-not-exactly-interested-hardware-accelerated-physics

    By the way, TressFX only deals with hair simulation (and grass, as of TressFX 2); it is not a complete physics solution like PhysX and Bullet are.
    Reply to renz496
  2. renz496 said:
    Find me one game that uses a GPU-accelerated feature where the developer did not also engage in a marketing effort with AMD or Nvidia. TressFX was first introduced with the Tomb Raider reboot. If Crystal Dynamics had not engaged in any marketing effort with AMD, would they ever have used TressFX? AMD most likely pushed CD to use TressFX (which was still not open source back then) because AMD knew that, without being pushed, game developers were not going to implement it in their games. After all, it was none other than AMD who said game developers have no interest in GPU-accelerated physics features:

    http://aphnetworks.com/news/2011/03/25/amd-game-developers-not-exactly-interested-hardware-accelerated-physics

    By the way, TressFX only deals with hair simulation (and grass, as of TressFX 2); it is not a complete physics solution like PhysX and Bullet are.


    Any game using Bullet?

    http://bulletphysics.org/wordpress/

    That thing runs using OpenCL and... that's about it XD

    Cheers!
    Reply to Yuka
  3. Yuka said:

    Any game using Bullet?

    http://bulletphysics.org/wordpress/

    That thing runs using OpenCL and... that's about it XD

    Cheers!


    GTA V. There are a few others, but to my knowledge none of them ever used the GPU-accelerated features offered by the engine.
    Reply to renz496
  4. renz496 said:
    GTA V. There are a few others, but to my knowledge none of them ever used the GPU-accelerated features offered by the engine.


    Because enabling compute imposes a huge performance penalty on everything else. Cue the tinfoil hats: AMD is less affected by it! LOL.

    Anyway, I have to say nVidia is very bad at "sharing" their stuff. Remember, even Linus gave them the middle finger, so...

    Cheers!
    Reply to Yuka
  5. Quote:
    Because enabling compute imposes a huge performance penalty on everything else. Cue the tinfoil hats: AMD is less affected by it! LOL.


    Nah, more likely because GPU-accelerated physics is mostly exclusive to PC. Remember that 7th-gen consoles most likely could not dedicate extra resources just to pretty physics effects that did not affect gameplay mechanics at all. With 8th-gen consoles, GPU-accelerated physics was supposed to be a feature game developers could finally use thanks to the increased raw power (during the PS4's official unveiling they showed a demo of GPU physics being done using Havok). But because MS and Sony wanted the consoles to be more affordable, they skimped on the hardware, and as a result new games have trouble just getting to 1080p at 60 FPS. Some games already struggle just to run at "medium" settings on both consoles, so most of these developers likely did not want to add another frame-rate-killing effect to their games. And then, boom: the 4K craze :P

    Quote:
    Anyway, I have to say nVidia is very bad at "sharing" their stuff. Remember, even Linus gave them the middle finger, so...

    Cheers!


    That is simply how Nvidia rolls, but they tend to be very clear about their intentions. I still remember when Nvidia was asked about licensing their G-Sync tech to others, and their answer was a straight no.
    Reply to renz496
  6. renz496 said:
    That is simply how Nvidia rolls, but they tend to be very clear about their intentions. I still remember when Nvidia was asked about licensing their G-Sync tech to others, and their answer was a straight no.


    And that is why we don't really give nVidia the benefit of the doubt in this DX12 multi-GPU conversation, I'd say. If they can find a way to "protect their IP" (not technically incorrect, but still annoying), they will. And what "protecting the IP" means can vary from locking everyone else out to just standing in their own corner, sometimes both at the same time.

    Cheers!
    Reply to Yuka
  7. Yuka said:
    And that is why we don't really give nVidia the benefit of the doubt in this DX12 multi-GPU conversation, I'd say. If they can find a way to "protect their IP" (not technically incorrect, but still annoying), they will. And what "protecting the IP" means can vary from locking everyone else out to just standing in their own corner, sometimes both at the same time.

    Cheers!


    If Nvidia had no intention of letting their cards be used in mixed configurations, they would most likely have made such a statement by now. DX12 launched in 2015 with Windows 10; fast forward to 2017, and Nvidia has still done nothing to stop it. They probably won't help game developers who ask how to make their GPUs work better with AMD GPUs in DX12 multi-GPU, but they won't stop it either. Besides, mixed multi-GPU is not really a new idea; it has been done before, and back then many people also expected Nvidia to stop it from working, but Nvidia simply let it happen.
    Reply to renz496
  8. goldstone77 said:


    We will probably get a better answer by the end of the month. Hopefully this is not a deliberately made-up rumor to inflate the hype; there are already people expecting Vega to overclock up to 1800 MHz.
    Reply to renz496
  9. goldstone77 said:


    Uhm, the Vega Core specs don't add up: it has a higher clock speed and core count than the Eclipse, but lower performance numbers?

    I think the leaks we have seen suggest the Core should run at around 1200 MHz, giving GTX 1070-level performance (unknown number of shaders, though I suspect it'll be somewhat cut down).
    Reply to cdrkf
  10. renz496 said:
    If Nvidia had no intention of letting their cards be used in mixed configurations, they would most likely have made such a statement by now. DX12 launched in 2015 with Windows 10; fast forward to 2017, and Nvidia has still done nothing to stop it. They probably won't help game developers who ask how to make their GPUs work better with AMD GPUs in DX12 multi-GPU, but they won't stop it either. Besides, mixed multi-GPU is not really a new idea; it has been done before, and back then many people also expected Nvidia to stop it from working, but Nvidia simply let it happen.


    If they flat out say that, then it's kicking the hornet's nest. It's one thing for them to keep others at bay from something no one else is licensing, but DX12 is another ball game.

    I think they're waiting until the very last moment to say it, if they indeed want to. But I think the bad press and all the problems associated with taking such a stance are not worth it to nVidia... or at least, I'd hope that is the case.

    Cheers!
    Reply to Yuka
  11. Yuka said:
    If they flat out say that, then it's kicking the hornet's nest. It's one thing for them to keep others at bay from something no one else is licensing, but DX12 is another ball game.

    I think they're waiting until the very last moment to say it, if they indeed want to. But I think the bad press and all the problems associated with taking such a stance are not worth it to nVidia... or at least, I'd hope that is the case.

    Cheers!


    The DX12 spec is something that MS and the IHVs discussed extensively among themselves to decide which features should be included. If Nvidia did not want their GPUs to be used with other vendors' products, they could just say so outright, just like AMD came out clean about why they are not supporting FL12_1 in their hardware.
    Reply to renz496
  12. cdrkf said:
    Uhm, the Vega Core specs don't add up: it has a higher clock speed and core count than the Eclipse, but lower performance numbers?

    I think the leaks we have seen suggest the Core should run at around 1200 MHz, giving GTX 1070-level performance (unknown number of shaders, though I suspect it'll be somewhat cut down).


    Does this mean they are going with the slower HBM2? I thought it was 512 GB/s?? I was expecting high clock speeds, all right... the 1200 MHz figure was only an early engineering sample, and the Vega architecture is designed to run faster than Polaris. But I think they were supposed to be using stacks of 256 GB/s HBM2.
    Reply to jaymc
  13. Intel is licensing AMD's graphics
    by FUAD ABAZOVIC on 15 MAY 2017
    "Done deal

    We can now confirm the rumours that Intel has given up on Nvidia because it has written a cheque to license AMD's graphics."
    http://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics
    Reply to goldstone77
  14. goldstone77 said:
    Intel is licensing AMD's graphics
    by FUAD ABAZOVIC on 15 MAY 2017
    "Done deal

    We can now confirm the rumours that Intel has given up on Nvidia because it has written a cheque to license AMD's graphics."
    http://www.fudzilla.com/news/graphics/43663-intel-is-licensing-amd-graphics


    I'm not sure "given up" is the right word for it; it's more like they don't want it? Initially, Intel and Nvidia signed a cross-licensing deal in 2004 so both could get what they wanted. Things started to go sour between them in 2008/2009. The initial licensing deal was supposed to end in 2011, but in the end Intel decided to settle the issue out of court and pay Nvidia 1.5 billion in installments. Both used this settlement as a stepping stone to "renew" the licensing deal they made in 2004, but the catch was that Nvidia had to ditch the idea of making their own x86 CPU, permanently. This pretty much confirms that Intel barred Nvidia from making chipsets for new Intel platforms not so much over licensing issues, but to kill Nvidia's attempt at entering the x86 market directly.
    Reply to renz496
  15. renz496 said:
    I'm not sure "given up" is the right word for it; it's more like they don't want it? Initially, Intel and Nvidia signed a cross-licensing deal in 2004 so both could get what they wanted. Things started to go sour between them in 2008/2009. The initial licensing deal was supposed to end in 2011, but in the end Intel decided to settle the issue out of court and pay Nvidia 1.5 billion in installments. Both used this settlement as a stepping stone to "renew" the licensing deal they made in 2004, but the catch was that Nvidia had to ditch the idea of making their own x86 CPU, permanently. This pretty much confirms that Intel barred Nvidia from making chipsets for new Intel platforms not so much over licensing issues, but to kill Nvidia's attempt at entering the x86 market directly.


    Well, I think Intel definitely sees Nvidia as a strong competitor to its autonomous-vehicle business. AMD's GPUs are cheaper, and the quality is really attractive compared to Nvidia's. I think AMD is just a better deal for Intel's bottom line on multiple fronts.
    Reply to goldstone77
  16. goldstone77 said:
    Well, I think Intel definitely sees Nvidia as a strong competitor to its autonomous-vehicle business. AMD's GPUs are cheaper, and the quality is really attractive compared to Nvidia's. I think AMD is just a better deal for Intel's bottom line on multiple fronts.


    About automotive, I'm not too sure; Nvidia has probably been in auto longer than Intel (since the very start of Tegra). In any case, Intel does not like the GPU itself. Intel did not like the fact that GPUs are simply better than CPUs at certain types of tasks. With Xeon Phi, Intel tried to convince the market that x86 can also be very good at massively parallel tasks; in Intel's dream, the fastest supercomputers in the world would consist only of x86 cores. But Nvidia disturbed the market by introducing the GPU into the equation, where 80-90% of the computational performance comes directly from the GPU.
    Reply to renz496
  17. AMD launches Radeon Vega Frontier Edition
    by Hilbert Hagedoorn on: 05/17/2017 12:26 AM

    "AMD just launched their Radeon Vega Frontier Edition card. It is a 'machine intelligence' enterprise graphics card for professional usage (data-crunching) and product designers, and yes, it is not for PC gamers.

    The card comes with 16 GB of HBM2 graphics memory and will perform in the 13 TFLOP (fp32) performance bracket. It will be available late June. At this point there have been no consumer announcements regarding Radeon RX Vega graphics cards. From the looks of it, the announcements will be made during Computex."

    http://www.guru3d.com/news-story/amd-launches-radeon-vega-frontier-edition.html

    "Earlier on in the presentations, Mark Papermaster mentioned Radeon RX Vega becoming available in June. Meanwhile, chief architect of the Radeon Technologies Group Raja Koduri is talking a lot about the benefits of the new Vega architecture and heterogeneous computing (CPU+GPU) and its various possible implementations. AMD thinks that Vega is going to be big in the data center; he shows DeepBench examples comparing the NVidia P100 and Vega. Nvidia scored 133 ms, the Vega setup 88. In this score, lower is better.
    AMD launches Radeon Vega Frontier Edition - the card is a mission intelligent enterprise graphics card for professional usage (data-crunching) and product designers. The card comes with 16 GB of HBM2 graphics memory and will perform in the 13 TFLOP (fp32) performance bracket. The 16GB Radeon Vega Frontier Edition will become available late June 2017."

    http://www.guru3d.com/news-story/10pm-cest-amd-financial-analyst-presentation-live-feed.html
    Reply to goldstone77
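As an aside, the DeepBench figures quoted in the post above (133 ms for the P100 setup vs. 88 ms for Vega, lower is better) work out to roughly a 1.5x advantage for Vega on that one test. A minimal sketch of the arithmetic, assuming the quoted numbers are directly comparable:

```python
# DeepBench times as quoted above (milliseconds; lower is better).
p100_ms = 133.0
vega_ms = 88.0

# With times, the speedup is the inverse ratio: how many times faster
# the Vega setup finished compared to the P100 setup.
speedup = p100_ms / vega_ms
print(f"Vega advantage on this test: {speedup:.2f}x")  # ~1.51x
```

Note this is one vendor-chosen benchmark, not a general performance claim.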
  18. Still, I want to see how Vega's 13 TFLOPS goes up against the 12 TFLOPS of Nvidia's Titan Xp.
    Reply to renz496
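For what it's worth, those headline TFLOPS numbers fall straight out of shader count and clock speed: peak FP32 throughput is conventionally counted as 2 FLOPs (one fused multiply-add) per shader per clock. A quick sketch using commonly reported figures (the clocks and shader counts here are assumptions based on leaks and announcements, not confirmed specs):

```python
def fp32_tflops(shaders: int, clock_ghz: float) -> float:
    """Peak FP32 throughput in TFLOPS: 2 FLOPs (one FMA) per shader per clock."""
    return 2 * shaders * clock_ghz / 1000.0

# Assumed figures: Vega FE with 4096 shaders at ~1.6 GHz,
# Titan Xp with 3840 shaders at its ~1.58 GHz boost clock.
print(f"Vega FE:  {fp32_tflops(4096, 1.6):.1f} TFLOPS")    # ~13.1
print(f"Titan Xp: {fp32_tflops(3840, 1.582):.1f} TFLOPS")  # ~12.1
```

Peak TFLOPS is a ceiling, of course; how much of it a game actually extracts depends on the rest of the architecture.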
  19. AMD Vega: Frontier Edition in June, Gaming Likely Later
    By Steve Burke Published May 16, 2017 at 8:17 pm

    "Vega: Probably July or Later for Gaming"

    http://www.gamersnexus.net/news-pc/2914-amd-vega-frontier-edition-in-june-gaming-later
    Reply to goldstone77
  20. July... does that mean Q3 instead of Q2 for "RX Vega"? It seems AnandTech has the same sentiment (second-to-last paragraph).

    Quote:
    For AMD gamers who have been holding out for Vega, it’s clear that they’ll have to hold out a bit longer. AMD is developing traditional consumer gaming cards as well, but by asking gamers to hold off a little while longer when the Vega FE already isn’t launching until late June, AMD is signaling that we shouldn’t be expecting consumer cards until the second half of the year.


    Anandtech
    Reply to renz496
  21. renz496 said:
    July... does that mean Q3 instead of Q2 for "RX Vega"? It seems AnandTech has the same sentiment (second-to-last paragraph).

    Quote:
    For AMD gamers who have been holding out for Vega, it’s clear that they’ll have to hold out a bit longer. AMD is developing traditional consumer gaming cards as well, but by asking gamers to hold off a little while longer when the Vega FE already isn’t launching until late June, AMD is signaling that we shouldn’t be expecting consumer cards until the second half of the year.


    Anandtech


    Well, AMD did say Vega by 2H, not "Vega gaming product by 2H"... If the FE card comes out in June, they will have met their target, just.
    Reply to cdrkf
  22. jaymc said:
    Does this mean they are going with the slower HBM2? I thought it was 512 GB/s?? I was expecting high clock speeds, all right... the 1200 MHz figure was only an early engineering sample, and the Vega architecture is designed to run faster than Polaris. But I think they were supposed to be using stacks of 256 GB/s HBM2.


    Well, it looks like 256 GB/s (per-stack) HBM2 isn't available yet; the current HBM2 runs at 1.6 Gbps, not the 2.0 Gbps required for that 256 GB/s speed.

    Still, 400+ GB/s of bandwidth should be ample (just look at a GTX 1080 vs a Fury X: the latter has 512 GB/s of bandwidth vs 320 GB/s on the 1080, yet the 1080 is a much faster card). Yes, high bandwidth helps, but it has to be balanced against the rest of the card; Fury had way more bandwidth than it really needed given its shader performance.

    As for the speeds, what I'm saying is that the *bottom* model (the cheapest, the 'Core') is likely to use a lower clock speed (e.g. 1200 MHz), and that will go up for the higher SKUs. That is the point of the table: it suggests that Vega will have three gaming models aimed at different price/performance points. The issue I have is that the table also suggests the entry-level 'Core', intended only to go up against the GTX 1070, is faster than the more expensive part aimed at the GTX 1080. That doesn't make sense.

    To my mind the specs would be something like this:
    - Vega Core: 1200 MHz clock, circa 3500 shaders, 400 GB/s memory bandwidth
    - Vega Eclipse: 1400 MHz clock, circa 3500 shaders, 480 GB/s memory bandwidth (based on the specs of the recently announced FE)
    - Vega Nova: 1500+ MHz clock, the full 4096 shaders, 480 GB/s memory bandwidth
    Reply to cdrkf
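The bandwidth figures in the post above follow directly from HBM2's wide interface: peak bandwidth is simply the per-pin data rate times the total bus width. A minimal sketch, assuming two 1024-bit HBM2 stacks (the standard stack width) and the data rates discussed in the thread:

```python
def hbm2_bandwidth_gb_s(pin_rate_gbps: float, stacks: int,
                        bus_bits_per_stack: int = 1024) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate (Gbps)
    times total bus width in bits, divided by 8 bits per byte."""
    return pin_rate_gbps * stacks * bus_bits_per_stack / 8

# Two stacks at the full 2.0 Gbps HBM2 spec -> 512 GB/s total.
print(hbm2_bandwidth_gb_s(2.0, 2))  # 512.0
# Two stacks at the 1.6 Gbps parts mentioned above -> 409.6 GB/s,
# i.e. the "400+ GB/s" figure in the thread.
print(hbm2_bandwidth_gb_s(1.6, 2))  # 409.6
```

Per stack, the full 2.0 Gbps spec gives 256 GB/s, which is why 1.6 Gbps parts cap a two-stack card at around 410 GB/s rather than 512 GB/s.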
  23. cdrkf said:
    Well, AMD did say Vega by 2H, not "Vega gaming product by 2H"... If the FE card comes out in June, they will have met their target, just.


    If they can launch the "semi-pro" Vega next month, then the gaming version must also be ready right now, because Vega is not like GP100, where the focus is 100% on the professional market. So what exactly is AMD's reason for holding the gaming Vega back from being released at the same time as the semi-pro Vega?
    Reply to renz496
  24. renz496 said:
    If they can launch the "semi-pro" Vega next month, then the gaming version must also be ready right now, because Vega is not like GP100, where the focus is 100% on the professional market. So what exactly is AMD's reason for holding the gaming Vega back from being released at the same time as the semi-pro Vega?


    My guess would be availability of HBM2.
    Reply to cdrkf
  25. Volta: Poor Vega ?

    https://www.youtube.com/watch?v=BH-lOKfinDw

    Edit:
    fixed link..
    Reply to jaymc
  26. jaymc said:
    Volta: Poor Vega ?

    https://www.youtube.com/watch?v=BH-lOKfinDw

    Edit:
    fixed link..


    We can only hope that its performance is competitive with Nvidia's, or AMD is going to be riding the short bus for at least another year, until Navi.
    Reply to goldstone77
  27. Wonderful...
    Reply to axlrose
  28. Yeah, messing up the specs in the table would suggest it's not quite reliable. If the TDP figures are reliable, though, it's a bit of a shocker: at 275 W for basically everything (very doubtful!), it would look like nVidia has a lead of 100 W or more with their cards.
    Reply to varis
  29. jaymc said:


    Computex is May 30th, so saying Vega won't be available that week doesn't mean there is a long delay. Some sites are speculating June 4th, others June 5th, and yet others are saying late June.

    There is just too much speculation and very little fact. All we really know now is that Vega will be at Computex but won't be immediately available, and that it will launch for the workstation and server market first. That, and Vega will use two stacks of HBM2.

    Any way you look at it, Vega is looking to compete against Volta rather than Pascal. There's no telling how much faster Volta will be over the current Nvidia lineup; it's supposedly yet another new architecture.
    Reply to Martell1977
  30. Volta is a monster; it's huge. It looks to be more powerful, all right, but it may still suffer considerably when it comes to DX12 and async compute.

    If AMD does manage to get ahead, it looks like it will be short-lived. We'll have to wait and see, I guess...

    Once again, time will tell... it seems like we're constantly waiting, eh!!! Wait, wait, wait...

    At least we will finally get some specs and benchmarks.
    Reply to jaymc
  31. jaymc said:
    Volta is a monster; it's huge. It looks to be more powerful, all right, but it may still suffer considerably when it comes to DX12 and async compute.


    Nvidia isn't stupid; Volta won't suffer in DX12 and async compute. In fact, I'm pretty certain that is the exact reason Volta was moved up: as you can see, AMD leads in DX12 on CPUs with 8-plus cores. I expect Vega to really shine a light on this when DX12 tests are done on high-core-count systems that let AMD's highly threaded driver shine.

    May 30th is almost here. I'm like a kid in a candy store with Vega, AMD's Threadripper, and Intel's Skylake-X. This is just turning out to be a lovely year for the PC.
    Reply to JamesSneed
  32. Martell1977 said:
    Computex is May 30th, so saying Vega won't be available that week doesn't mean there is a long delay. Some sites are speculating June 4th, others June 5th, and yet others are saying late June.

    There is just too much speculation and very little fact. All we really know now is that Vega will be at Computex but won't be immediately available, and that it will launch for the workstation and server market first. That, and Vega will use two stacks of HBM2.

    Any way you look at it, Vega is looking to compete against Volta rather than Pascal. There's no telling how much faster Volta will be over the current Nvidia lineup; it's supposedly yet another new architecture.


    We'll probably see an improvement similar to the one from Kepler to Maxwell: the die size might end up bigger, but more power-efficient. Just look at GV100 itself: roughly 815 mm², yet it consumes about the same as the 610 mm² GP100. Also, with TSMC's 12nm FFN process, Nvidia significantly raised the limit on how big they can make their GPUs. The mystery is what comes after Volta. We know that for AMD, Navi will succeed Vega on a 7nm node. If this were the old Nvidia, by now we would already know the names of Nvidia's next two architectures after Volta; there is probably something after Volta that Nvidia is unwilling to tell the public about yet.
    Reply to renz496
  33. jaymc said:
    Volta is a monster; it's huge. It looks to be more powerful, all right, but it may still suffer considerably when it comes to DX12 and async compute.

    If AMD does manage to get ahead, it looks like it will be short-lived. We'll have to wait and see, I guess...

    Once again, time will tell... it seems like we're constantly waiting, eh!!! Wait, wait, wait...

    At least we will finally get some specs and benchmarks.


    I don't think Nvidia will keep ignoring their issues with DX12, just as Nvidia addressed Kepler's issues with Maxwell back when they didn't have any hardware in the current-gen consoles. With async compute, Nvidia fixed some of the issues in Pascal, and those looking at the GV100 details already speculate that there are further improvements that might help Nvidia with async compute.
    Reply to renz496
  34. jaymc said:
    Volta: Poor Vega ?

    https://www.youtube.com/watch?v=BH-lOKfinDw

    Edit:
    fixed link..


    Vega will be out well before Volta as a gaming GPU. That is a specialist, dedicated compute product (like GP100, it probably doesn't even have display outputs).

    Vega will be out over the summer; I don't expect to see a gaming product based on Volta from nVidia until 2018 at the earliest.
    Reply to cdrkf
  35. renz496 said:
    I don't think Nvidia will keep ignoring their issues with DX12, just as Nvidia addressed Kepler's issues with Maxwell back when they didn't have any hardware in the current-gen consoles. With async compute, Nvidia fixed some of the issues in Pascal, and those looking at the GV100 details already speculate that there are further improvements that might help Nvidia with async compute.


    It's been known for a while that Volta is supposed to be nVidia's first design to include hardware schedulers for async compute (which is what they currently lack).

    That said, Volta as a gaming GPU is a long way off, and I'd put money on the fact that we will *never* see GV100 as a gaming product, just as there are no traditional 'graphics' boards using GP100 either. GV104 will likely be the first true Volta gaming product, and I doubt we'll see that until 2018.

    That gives Vega a 5- to 6-month run against Pascal.
    Reply to cdrkf
  36. Personally, I'd expect gaming Volta to be a 1H 2018 product, but in the end it all depends on how fast Vega is. Nvidia is surely not going to let AMD hold the single-GPU crown for more than one quarter; the last time they did, AMD almost became the most dominant discrete-GPU maker in the world.
    Reply to renz496
  37. renz496 said:
    Personally, I'd expect gaming Volta to be a 1H 2018 product, but in the end it all depends on how fast Vega is. Nvidia is surely not going to let AMD hold the single-GPU crown for more than one quarter; the last time they did, AMD almost became the most dominant discrete-GPU maker in the world.


    Remember, if you include consoles and integrated GPUs, AMD already ships more GPUs than nVidia, second only to Intel :P
    Reply to cdrkf
  38. cdrkf said:
    Remember, if you include consoles and integrated GPUs, AMD already ships more GPUs than nVidia, second only to Intel :P


    Maybe so, but if we only count the PC market, right now nvidia sells more discrete GPUs than AMD's combined sales of discrete GPUs and APUs.

    https://jonpeddie.com/press-releases/details/attach-rate-up-amd-and-nvidia-increased-on-all-fronts-in-q4-2016
    Reply to renz496
  39. We need AMD to tell us exactly when RX Vega CE is going to launch (with exact dates) at Computex. If they can't even provide that, then we have reason to worry. Meanwhile, there are already rumors circulating about Titan Volta.
    Reply to renz496
  40. In the absence of Computex, here are some minor information nuggets/more conjecture for the most information-starved:

    http://wccftech.com/amd-ceo-gaming-rx-vega-launching-soon-frontier-edition/
    https://www.extremetech.com/computing/249761-amd-clarifies-vega-launch-dates-server-rollouts-talks-zen-follow-ups

    TL;DR: the optimist's version is that things start getting busy in the Vega gaming space in July; for pessimists, September (assuming you actually believe what you read on the Internet).
    Reply to varis
  41. renz496 said:
    We need AMD to tell us exactly when RX Vega CE is going to launch (with exact dates) at Computex. If they can't even provide that, then we have reason to worry. Meanwhile, there are already rumors circulating about Titan Volta.


    Titan Volta? Wouldn't they release that next year? Their next line is the 2000 series: 2060, 2070, and 2080.
    Reply to goldstone77
  42. goldstone77 said:
    Titan Volta? Wouldn't they release that next year? Their next line is the 2000 series: 2060, 2070, and 2080.


    It's very possible it could be the 11XX series; it depends on what they decide to do this time.
    Reply to Martell1977
  43. So Vega FE is June 27th; no pricing details yet.
    RX Vega is late June or early August. AMD plans to launch it at the SIGGRAPH 2017 event; also no hint at pricing yet.
    Reply to renz496
  44. Siggraph is the 30th of July, so wouldn't that put the actual introduction of the various Vega tiers around August-October?

    I wager that the powers that be have been ramping up HBM2 production, since the introduction of new GPUs is by now pretty much a certainty and there is a market for the units. One would think that around the launch dates I give above, the supply problems would be almost a non-issue already?

    But the introduction of the card could be even later, if AMD keeps having issues with the Vega line, so Siggraph would just be the unveiling and maybe (hopefully, likely?) the availability of some high-end line from that point onwards.

    Guru3D recaps some earlier info including pricing rumours. http://www.guru3d.com/news-story/computex-2017-amd-press-conference-%E2%80%93-radeon-rx-vega-news.html

    Quote:

    The first card would be called the RX Vega Core which will start at $399. It will deliver performance on par with or better than Nvidia’s GeForce GTX 1070.
    Then RX Vega Eclipse which will be priced at $499 and will compete head to head with the GTX 1080.
    The RX Vega Nova will be the Big Vega that will retail at $599 and rival the GTX 1080 Ti.


    If that's correct, it's actually a bit underwhelming. AMD is positioning itself as the price leader, only matching competing products and slightly undercutting prices. Or the people who forward rumours are not very well informed about the eventual performance figures :)

    AMD's piece at Computex was quite CPU-centric. There were also new announcements from Intel. Do we have a Computex thread on the forums? :)

    http://www.anandtech.com/show/11476/computex-2017-amd-press-event-live-blog-starts-10pm-et
    Reply to varis
  45. Quote:
    If that's correct it's actually a bit underwhelming. AMD is positioning itself as the price leader, only matching competing products and slightly undercutting prices.


    AMD doesn't really do it because they like it. Look at what they did with the 7970, for example: instead of pricing the card at $400 like their previous flagships (HD 5870 = $400, HD 6970 = $380), they chose to price it even higher than nvidia's GTX 580. Back then that was something unthinkable for AMD, because the majority of people expected AMD to always undercut nvidia's pricing. But it's a very clear sign that AMD also wants to price their products like that, given the chance. And since AMD decided to bring HBM to consumer-grade products, it has become even harder for AMD to undercut nvidia.
    Reply to renz496
  46. So now it's not June or even July, but August or later? By the time something comes out, nvidia will have already surpassed themselves again, won't they? Starting to think the red team will never be competitive again...
    Reply to axlrose
  47. axlrose said:
    So now it's not June or even July, but August or later? By the time something comes out, nvidia will have already surpassed themselves again, won't they? Starting to think the red team will never be competitive again...


    They can... just look at Ryzen right now. The problem is that nvidia is quite relentless even without opposition. IMO, once intel knew AMD couldn't beat them after Sandy Bridge, they shifted most of their focus to conquering ARM next, and they burned a lot of money on that venture without visible success. Meanwhile, AMD kept working with their limited budget, and after five years they finally have something that really caught intel off guard. With AMD's financials literally in the red every quarter, intel most likely did not suspect AMD could come up with a very competitive product that could challenge their existing lineup.

    nvidia might hold the performance crown by brute force now, but they are very aware that AMD has always tried to hit them hard in unexpected ways. Back in 2008 it was the HD 4800. And more recently, with AMD winning all the console contracts, they have tried to influence game development towards their hardware by introducing low-level APIs in a way that gives them quite an advantage but not nvidia. But so far, in the PC gaming landscape, nvidia can still hold their own while slowly trying to neutralize the advantage AMD has gained.
    Reply to renz496