
Thread: Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

  1. #1

    Batman: Arkham Asylum Enables AA Only on NVIDIA Hardware on PCs

    Quote Originally Posted by techPowerup
    Anti-aliasing has been one of the most basic image-quality enhancements available in today's games. PC graphics hardware manufacturers regard it as something of an industry standard, and game developers echo them by integrating anti-aliasing (AA) features into the game as part of its engine. This allows the game to selectively apply AA to parts of the 3D scene, so the overall image quality improves while performance is preserved, since not every object in the scene is given AA. It seems that one of the most heavily marketed games of the year, Batman: Arkham Asylum, doesn't like to work with ATI Radeon graphics cards when it comes to its in-game AA implementation.

    Developed under NVIDIA's The Way It's Meant to be Played program, and featuring NVIDIA's PhysX technology, the game's launcher disables in-game AA when it detects AMD's ATI Radeon graphics hardware. AMD's Ian McNaughton said in a recent blog post that they had confirmed this by running ATI Radeon hardware under changed device IDs. Says McNaughton: "Additionally, the in-game AA option was removed when ATI cards are detected. We were able to confirm this by changing the ids of ATI graphics cards in the Batman demo. By tricking the application, we were able to get in-game AA option where our performance was significantly enhanced." He adds that the trick is not possible in the retail game, as it is protected by SecuROM.

    With no in-game AA available to ATI Radeon users, even though the feature technically works on ATI Radeon hardware, the only way to use AA is to force it in Catalyst Control Center. This causes the driver to apply AA to every 3D object in the scene, reducing performance compared to using the game's in-game AA. "To fairly benchmark this application, please turn off all AA to assess the performance of the respective graphics cards. Also, we should point out that even at 2560×1600 with 4x AA and 8x AF we are still in the highly playable territory," McNaughton adds. Choose with your wallets.
    http://www.techpowerup.com/104868/Ba...re_on_PCs.html
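    As a rough illustration, the gate McNaughton describes boils down to checking the reported PCI vendor ID. This is a hypothetical sketch, not actual launcher code; the vendor IDs are real, but the function and its logic are assumptions based on the article:

    ```python
    # Hypothetical reconstruction of the launcher's vendor gate.
    # Real PCI vendor IDs; the gating logic itself is an assumption.
    VENDOR_NVIDIA = 0x10DE
    VENDOR_ATI = 0x1002

    def ingame_aa_available(vendor_id: int) -> bool:
        """Return True if the launcher would expose the in-game AA option."""
        return vendor_id == VENDOR_NVIDIA

    # ATI card detected normally: option hidden.
    print(ingame_aa_available(VENDOR_ATI))     # False
    # "Tricking the application" by reporting an NVIDIA ID: option appears.
    print(ingame_aa_available(VENDOR_NVIDIA))  # True
    ```

    Changing the device/vendor IDs, as AMD did in the demo, amounts to making the second call succeed on ATI hardware.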

    Wow, a new low...
    My opinions are my own and do not represent those of my employer.

  2. #2
    GROAN.

    http://www.semiaccurate.com/forums/s...=6906#post6906

    This was already posted about in the Games sub-forum. This wasn't Rocksteady's doing, and nVidia had no reason to debug their solutions on ATI hardware. The problems are AMD not providing QA for low-level fixes not provided by the engine, and the engine licensor not having a cross-licensing agreement that encourages developers to share solutions.
    Or something like that.

  3. #3
    8-bit overflow
    Join Date
    Sep 2009
    Posts
    605
    It is not in a game developer's interest to p*ss off a large part of the market.
    "Oh, btw, we only checked Nvidia hardware, and most of the stuff won't work on ATI... they didn't give us any money."

    As has been said before, this is not the first time Nvidia has been caught out.
    People have not forgotten all the texture-filtering fiascos and such.

    This is just more of the same.

    They should just put a very big "NVIDIA ONLY" sticker on the box....with a dollar sign in the background..

  4. #4
    Quote Originally Posted by Evildead666 View Post
    It is not in a game developer's interest to p*ss off a large part of the market.
    "Oh, btw, we only checked Nvidia hardware, and most of the stuff won't work on ATI... they didn't give us any money."

    As has been said before, this is not the first time Nvidia has been caught out.
    People have not forgotten all the texture-filtering fiascos and such.

    This is just more of the same.

    They should just put a very big "NVIDIA ONLY" sticker on the box....with a dollar sign in the background..
    If nVidia subsidizes low-level implementations of features and certifies it for their hardware, and AMD doesn't come to the table, then the developer has no reason to exhaust additional resources propagating the changes. Compare the cost of making games to making movies...anything a developer omits is due 80% of the time to one of two factors: 1) Time constraints, 2) Determination that the cost of the feature would not be offset by additional sales.

    Do lazy omissions happen? Yes. Those almost always manifest as gameplay or game logic glitches, however. Visual consistency across platforms is expensive and time-consuming...if that were the genesis of this problem, it would manifest itself much more starkly than by a lack of optimized AA.
    Or something like that.

  5. #5
    >intel 4004
    Join Date
    Jun 2009
    Posts
    11,281
    Quote Originally Posted by olivergringold View Post
    GROAN.

    http://www.semiaccurate.com/forums/s...=6906#post6906

    This was already posted about in the Games sub-forum. This wasn't Rocksteady's doing, and nVidia had no reason to debug their solutions on ATI hardware. The problems are AMD not providing QA for low-level fixes not provided by the engine, and the engine licenser not having a cross-licensing agreement that encourages developers to share solutions.
    Quote Originally Posted by olivergringold View Post
    If nVidia subsidizes low-level implementations of features and certifies it for their hardware, and AMD doesn't come to the table, then the developer has no reason to exhaust additional resources propagating the changes. Compare the cost of making games to making movies...anything a developer omits is due 80% of the time to one of two factors: 1) Time constraints, 2) Determination that the cost of the feature would not be offset by additional sales.
    Unreal Engine 3 without AA again.

    So you're saying that if a vendor doesn't pump money into a game's development, their product doesn't deserve even basic features like AA? And that the game studio should listen to the vendor(s) who fund their project, and be prepared to be used as black propaganda against those vendors' competitors?

    Isn't this a kind of "anti-competitive" measure? It looks like one to me.

    P.S. Unreal Engine 3 is now becoming a new way of getting money and resources invested into your game's development, all thanks to NVIDIA's "The way it's meant to be payed" scheme. Thank you NVIDIA, for letting the game industry know where to get funds and support from now on when developing DirectX 10 games.

  6. #6
    Quote Originally Posted by 265586888 View Post
    Unreal Engine 3 without AA again.

    So you're saying that if a vendor doesn't pump money into a game's development, their product doesn't deserve even basic features like AA? And that the game studio should listen to whichever vendor funds their project, and be used as black propaganda?
    Basic stuff ain't so basic when you read the starting salaries of people who'd be qualified to code that kind of thing.

    Quote Originally Posted by 265586888 View Post
    Isn't this a kind of "anti-competitive" measure? It looks like one to me.
    Only if nVidia required Rocksteady not to implement the changes on ATI hardware. They didn't have to mandate it because it would've been too expensive anyways. Chalk that up to industry politics.
    Or something like that.

  7. #7
    >intel 4004
    Join Date
    Jun 2009
    Posts
    11,281
    Quote Originally Posted by olivergringold View Post
    Basic stuff ain't so basic when you read the starting salaries of people who'd be qualified to code that kind of thing.

    Only if nVidia required Rocksteady not to implement the changes on ATI hardware. They didn't have to mandate it because it would've been too expensive anyways. Chalk that up to industry politics.
    Care to explain why Ian McNaughton pointed out that they could trick the game into enabling the in-game AA option, and saw performance improvements with AA enabled, when they changed the device ID of their hardware and ran the pre-release version of the game?

    Don't use "expensive" as an excuse; if you want something badly enough, then nothing is too expensive (in this capitalist market). In this case, NVIDIA bought itself a chance at black propaganda with that money. Who's to say whether it was expensive?

  8. #8
    Quote Originally Posted by 265586888 View Post
    Care to explain why Ian McNaughton pointed out that they could trick the game into enabling the in-game AA option, and saw performance improvements with AA enabled, when they changed the device ID of their hardware and ran the pre-release version of the game?

    Don't use "expensive" as an excuse; if you want something badly enough, then nothing is too expensive (in this capitalist market). In this case, NVIDIA bought itself a chance at black propaganda with that money. Who's to say whether it was expensive?
    If ATI and nV have both made the same progress implementing DX, then yes, you would expect something coded for one vendor to work on the other vendor's card (excluding proprietary APIs like PhysX and CUDA). The reason the fix wasn't opened up to ATI cards is because (brace yourself for this) making sure that there are no bugs with cross-configuration performance is expensive.

    "[If] you so wanted to do one thing, then nothing is expensive." What the heck does that even mean? Being a capitalist doesn't mean having infinite capital.
    Or something like that.

  9. #9
    >intel 4004
    Join Date
    Jun 2009
    Posts
    11,281
    Quote Originally Posted by olivergringold View Post
    If ATI and nV have both made the same progress implementing DX, then yes, you would expect something coded for one vendor to work on the other vendor's card (excluding proprietary APIs like PhysX and CUDA). The reason the fix wasn't opened up to ATI cards is because (brace yourself for this) making sure that there are no bugs with cross-configuration performance is expensive.

    "[If] you so wanted to do one thing, then nothing is expensive." What the heck does that even mean? Being a capitalist doesn't mean having infinite capital.
    Nonsense. By that logic you shouldn't fix ANY bugs after releasing a game, since bug fixes are also expensive (in terms of developer time), usually involving a team of programmers.

    But if you want to win badly enough, you won't care about the price. That, as a corporate image, is what NVIDIA is showing people right now. NVIDIA certainly doesn't have infinite capital, but it has enough to fund a few black-propaganda projects under the "The Way It's Meant To Be Payed" scheme.

    Fine examples are discussed in Ian McNaughton's post about recent titles that look bad on ATI hardware. Apparently it was too expensive for those games to respond to AMD's feedback, but not NVIDIA's... hm, intriguing...

  10. #10
    Let's see... popular game, nVidia-sponsored, delayed on PC until the release of the HD 5870, bundled with nVidia cards, in-game AA working but disabled for ATI cards, uses PhysX...

    Huang's wet dream: when the 5870 is released, everyone will benchmark it against nVidians in global AA vs in-game AA and CPU PhysX vs GPU PhysX, and they will all see which way it's meant to be played!!!!!

    Unfortunately for Huang, reviewers don't seem to like being manipulated.
