
Thread: What Mantle means for Nvidia's GPU market.

  1. #11
    Quote Originally Posted by Rottis View Post
    OpenGL supports vendor-specific extensions, and all parties have shipped several of those each. Nvidia's answer to Mantle was to make a slightly different kind of extension, one that bypasses the standard OpenGL way of doing things: essentially their own API inside OpenGL. This is by no means any more portable than Mantle is, just a slightly messier approach.
    If AMD and Nvidia each write their own extensions to do essentially the same thing, then it's likely that the functionality will be standardized and incorporated into some future core version of OpenGL. That has happened a number of times in the past. Microsoft will probably incorporate it into DirectX, too.

    And then if you can get all of the benefits of Mantle in an industry-standard API without actually having to use Mantle, then Mantle becomes pointless. Of course, Microsoft will probably make the DirectX version exclusive to Windows 9, while OpenGL will take about three years to add it to the core standard. But still, if Nvidia can offer the same benefits as Mantle via OpenGL and/or DirectX extensions, then that nearly guarantees that Mantle will be short-lived at best.

  2. #12
    8-bit overflow
    Join Date
    Feb 2012
    Posts
    376
    Quote Originally Posted by psoomah View Post
    It just won't make sense to buy Nvidia with a flood of Mantle games on the way.


    Post-APU13 will see Nvidia AIB sales plummet and AMD AIB sales skyrocket.
    I'm sorry, but this is very, very wishful thinking, and very naive.
    At worst for AMD: Mantle accomplishes very little.
    At best for AMD: Nvidia fanboys will simply skip the next generation and hibernate until the generation after that.

    They're not going to buy AMD cards, you're not going to see huge market-share shifts, and neither will developers.

    Even now you can see Nvidia's marketing machine handing out promises to its fanboys:
    "They have Mantle, but we have G-Sync!! Inferior frame rates don't matter because G-Sync is just sooo much better!!!"
    And their fanboys will eat this promise up, because they want to believe.
    They will listen to Nvidia's marketing over the benchmarks, and simply skip the generation in anticipation of this promised G-Sync.

    That's how it'll go.

  3. #13
    Senior Member
    Join Date
    Dec 2009
    Posts
    1,149
    Are any vendor's OpenGL drivers fully thread-safe, allowing draw calls to be dispatched from multiple threads simultaneously? Because that is one of the major points of Mantle...

  4. #14
    8-bit overflow
    Join Date
    May 2010
    Posts
    518
    Quote Originally Posted by tensz View Post
    I'm sorry but this is very, very wishful thinking, and very naive.
    At worst for AMD: Mantle does absolutely not much.
    At best for AMD: Nvidia fanboys will simply skip the next generation and hibernate until generation after that.

    They're not going to buy AMD cards, and you're not going to see huge marketshare shifts, and neither will developers.

    Already now you see nvidia's marketing machine giving out promises to their fanboys.
    "they have Mantle but we have G-sync!!, inferior frame rates doesn't matter because g-sync is just sooo much better!!!"
    And their fanboys will eat this promise, because they want to believe.
    They will listen to nvidia's marketing over the benchmarks, and simply skip the generation in anticipation for this promised g-sync.

    That's how it'll go.
    Compelling arguments. Any more where those came from?

  5. #15
    640k who needs more?
    Join Date
    Sep 2010
    Posts
    905

    Fanboy percentage

    Quote Originally Posted by tensz View Post
    I'm sorry but this is very, very wishful thinking, and very naive. [snip]
    What percentage of the customers who will upgrade their PCs for the current wave of new games (BF4, CoD: Ghosts, etc.) are fanboys? I would estimate around 20%.
    IIRC, BF3 generated something like $1B in upgrade sales; I doubt BF4 will come close, given the competition from the new consoles. But if NV loses heavily at every price point, everybody but the NV fanboys will go with AMD.

  6. #16
    Senior Member
    Join Date
    Nov 2011
    Posts
    2,593
    Quote Originally Posted by tensz View Post
    I'm sorry but this is very, very wishful thinking, and very naive. [snip]
    Indeed. As much as I have used Radeons in the past, you just have to visit a lower-level forum like OCN and see how users with an overclocked 3930K and two 780s complain that a single 290X makes too much heat.

    There are definitely people who will never switch to AMD. Nvidia could have a card that is marginally faster yet costs 30% more and people would still buy it. Oh wait, that's happening already.

    I personally feel that there are so many Intel/Nvidia users out there who hate AMD simply because they don't like the idea of people spending less money and getting performance close to theirs.

    Take a peek at a less knowledgeable forum like Tom's or OCN or wherever and you'll notice a lot of people whining about power consumption as if their 100 W Intel chip were still 100 W at 5 GHz.

    AMD has a massive image problem. They constantly let Nvidia and Intel beat them up, and AMD never does anything evil back to them, AFAIK. I feel like AMD is stuck in the early-90s mentality of "our customers are all highly educated clients who will do their research and understand computers thoroughly," and hasn't adjusted to the current market of "I bought a closed-loop cooler, changed my multiplier, and bought the most expensive graphics card out there; I'm so amazing with computers that even my mom and grandparents say so!"

    Rory Read is executing fantastically, but he's not used to competing in a market where your competitors spread FUD and bash your product; he's used to a market where, if you make something decent at an affordable price, people will buy it and not complain.

    AMD is going to be stuck where they are no matter what kind of products they release, because unless they release an absolutely perfect product, they're going to get hammered with FUD.

    And I'm talking along the lines of: quieter than Nvidia's offerings, cooler than Nvidia, faster than Nvidia, and without a single driver problem.

    AMD is going to have to step up and start getting more aggressive towards its competitors if it ever wants to gain serious traction.

    They could easily have spread some FUD of their own about the TIM under the IHS of Ivy Bridge chips and questioned how long those chips will live as the TIM degrades. There are lots of things they could have done, but they just keep trying to nail price-to-performance and assuming that will instantly solve their problems.

    So, end rant, I suppose. I appreciate the graphics API talk; the number of people who don't understand APIs in the slightest yet declare that Mantle is a failure, or that it will double the work for game developers, is astounding. Which, once again, shows AMD not grasping how to handle the market. A simple infographic with a flow chart of a DirectX draw call next to a Mantle one is all it would have taken to help people understand how these things work.

  7. #17
    AMD doesn't need to talk to the typical user, whose attention span and technical knowledge amount to reading review charts and conclusions.

    They are doing the right thing by talking to developers first, which is what matters most when you want adoption of new tech; in fact, Mantle is AMD responding to the wishes of developers. People saying that AMD is failing because they haven't shown numbers three months prior to the API's official launch are just as asinine as the fanboys attacking it.

    Anyways, numbers for the drooling masses will be here in a couple of days.

    P.S.: The funny thing is that the attacks will not stop. The goalposts will just move: they'll say too few games will ever use it to matter, or that Nvidia will come out with their own API (as if that negated the existence of Mantle in the first place). If Mantle bothers you in any way, shape, or form, get over it. It's here, it will happen, and nothing anyone says will change that.

    As my favorite Vorlon once said: the avalanche has already started; it is too late for the pebbles to vote.

  8. #18
    Quote Originally Posted by BioSehnsucht View Post
    Are any vendor's OpenGL drivers fully thread-safe, allowing draw calls to be dispatched from multiple threads simultaneously? Because that is one of the major points of Mantle...
    OpenGL doesn't let you specify which thread you're issuing commands from. As far as OpenGL is concerned, all commands arrive on the same thread; a context can only be current on one thread at a time. Which means a game engine had better actually work that way, or else different threads are going to trip over each other and break things.

    DirectX 11 does support multi-threaded rendering, via deferred contexts. But as far as I'm aware, Civilization V is the only game to ever use it, and I suspect that this is why that game performed so erratically in benchmarks. At the very least, it's an option that DirectX 11 games mostly avoid.

    And for good reason. There's a big downside to multi-threaded rendering that people tend to ignore. If the CPU gets ahead of the GPU, submitting work faster than the GPU can render it, an OpenGL command may take longer to return as the video driver makes the CPU wait before accepting more work. This is necessary: without it, draw calls could be submitted many frames ahead of time, and the resulting display latency would make a game completely unplayable.
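    To make the throttling concrete, here's a minimal sketch of the mechanism described above, with a plain counter standing in for the driver's internal bookkeeping. FrameThrottle, submit(), and retire() are all illustrative names, not any real driver's API:

    ```cpp
    // Hypothetical sketch (not a real driver API): a cap on how many frames
    // the CPU may queue ahead of the GPU. A fast game thread ends up blocked
    // inside submit(), which the OS counts as that core being busy.
    #include <cassert>
    #include <condition_variable>
    #include <mutex>

    class FrameThrottle {
    public:
        explicit FrameThrottle(int max_queued) : max_queued_(max_queued) {}

        // Game thread: submit a frame's worth of commands. Blocks once the
        // GPU is max_queued_ frames behind.
        void submit() {
            std::unique_lock<std::mutex> lk(m_);
            cv_.wait(lk, [&] { return queued_ < max_queued_; });
            ++queued_;
        }

        // "GPU" side: a frame finished rendering; wake a blocked submitter.
        void retire() {
            {
                std::lock_guard<std::mutex> lk(m_);
                --queued_;
            }
            cv_.notify_one();
        }

        int queued() {
            std::lock_guard<std::mutex> lk(m_);
            return queued_;
        }

    private:
        std::mutex m_;
        std::condition_variable cv_;
        int queued_ = 0;
        const int max_queued_;
    };
    ```

    With max_queued set to 2 and a slow consumer, a loop of submit() calls spends most of its wall-clock time parked in cv_.wait, which is exactly the "busy but doing nothing" core the post describes.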

    But when the GPU makes the CPU wait, as far as the OS is aware, that core is busy and isn't available to be used by anything else. Set up multi-threaded rendering and you could easily have several CPU cores spending most of their time waiting for the driver to say it's okay to continue, while being unavailable for other CPU work the game engine needs. Even "wasting" a single CPU core this way is substantial; it's a big part of why a Core i3 often beats a Pentium dual-core in games by so much, even though they're nearly identical apart from Hyper-Threading and games don't scale well to very many cores.

    It's not very hard to structure a game engine so that you have a rendering thread that does little else besides spam OpenGL commands. This isn't just draw calls; it's more like: pass along new values for this uniform, that uniform, some other uniforms, switch to this texture, that vertex array object, this sampler, and then finally issue a draw call. Then repeat all of that for the next thing you want to draw, occasionally mixing in other OpenGL commands. It's pretty trivial to move the work of calculating what the uniforms should be, among other things, to other threads, so having a single rendering thread is only a meaningful bottleneck if you need more OpenGL commands than a single CPU core can handle.
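    The single-render-thread pattern above can be sketched as a command queue: worker threads compute per-object state and push closures, and one render thread drains the queue and would be the only thread ever touching OpenGL. RenderQueue and its methods are illustrative names, and the closures stand in for real glUniform*/glBindTexture/glDrawElements sequences:

    ```cpp
    // Hypothetical sketch: a queue of render commands drained by one thread.
    #include <condition_variable>
    #include <functional>
    #include <mutex>
    #include <queue>

    class RenderQueue {
    public:
        // Any worker thread: enqueue a render command.
        void push(std::function<void()> cmd) {
            {
                std::lock_guard<std::mutex> lk(m_);
                q_.push(std::move(cmd));
            }
            cv_.notify_one();
        }

        // Signal that no more commands will be pushed.
        void close() {
            {
                std::lock_guard<std::mutex> lk(m_);
                closed_ = true;
            }
            cv_.notify_all();
        }

        // Render thread: run commands in submission order until closed and empty.
        void drain() {
            for (;;) {
                std::function<void()> cmd;
                {
                    std::unique_lock<std::mutex> lk(m_);
                    cv_.wait(lk, [&] { return !q_.empty() || closed_; });
                    if (q_.empty()) return;
                    cmd = std::move(q_.front());
                    q_.pop();
                }
                cmd();  // in a real engine: set uniforms, bind, draw call
            }
        }

    private:
        std::mutex m_;
        std::condition_variable cv_;
        std::queue<std::function<void()>> q_;
        bool closed_ = false;
    };
    ```

    Because drain() runs on exactly one thread, every "GL call" inside the closures executes serially on that thread, which is what OpenGL's context model requires, while all the expensive per-object computation stays on the workers.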

  9. #19
    Quote Originally Posted by sdlvx View Post
    And I'm talking along the lines of: quieter than Nvidia's offerings, cooler than Nvidia, faster than Nvidia, and without a single driver problem.
    AMD's Linux drivers are still a pile of sh*t. So I keep buying Nvidia, and I keep regretting it. What I hate is that I have to buy Nvidia because AMD can't write good drivers for Linux, yet would rather create a new dedicated API for game makers.

    Quote Originally Posted by sdlvx View Post
    Which, once again, shows AMD not grasping how to handle the market. A simple infographic that had a flow chart of a DirectX drawcall alongside one next to a Mantle one would have been all it would have taken to help people understand how these things work.
    If you think end users care about the API and what calls are being made or how, then it's you who doesn't understand how to handle the market. The people interested in that, game programmers, have already expressed their interest.

    For the wider market, what's needed is FPS comparisons of games using Mantle versus games using OpenGL or DirectX.
    Speaking for myself.

  10. #20
    8-bit overflow
    Join Date
    Sep 2009
    Posts
    305
    Quote Originally Posted by tensz View Post
    I'm sorry but this is very, very wishful thinking, and very naive. [snip]
    Hardcore gamers outnumber fanbois 1000 to 1. The problem is that fanbois are 1000 times more vocal. The buying public will sway to whatever gives them the experience they're seeking, while fanbois will sway to whoever is perceived to be number one. Being a fanboi is mostly about ego and attaching yourself to something that is number one. In most cases.

    With the console market taking the majority of the pie, there is a real probability of a huge market shift towards AMD. I know we all like to think things will stay the same, but they never do. With a grip of at least 5-7 years, and the possibility of that extending further down the road, AMD will become relevant to even the most hardcore Nvidia fan. There is no skipping generations here: this isn't a yearly GPU product refresh, but a long-term strategy.
