I tried the FXAA, and it works marvelously well!

A true revolution for me. The HUD blurring is noticeable, but not annoying at all, and it certainly doesn't alter the gameplay; that's all we need ^^

The effect is so close to the Ubersampling setting that it's hard to tell the difference, and honestly, between losing 40% of my fps and losing 4 fps, the choice is easy to make...
The effect is even better with in-game AA enabled: edges are more blended and seem slightly more accurate than with FXAA alone.

Powerful algorithm; I don't understand why this technique wasn't used more before, considering how old it is... maybe to sell more powerful GPUs? :p

I also noticed that TW2's AA wasn't getting any better with driver options; I finally have the answer to that...
Fenris78: The effect is even better with in-game AA enabled: edges are more blended and seem slightly more accurate than with FXAA alone.
You might want to manually set "AllowAntialias=0" in the Documents\Witcher 2\Config\user.ini file, while keeping "AllowSharpen=1".

That's because in-game AA + FXAA causes too much blur; check Geralt's eyes when you move the camera in front of him.
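
If it helps, the relevant lines in user.ini end up looking something like this (the [Rendering] section name is from memory and may differ in your file; just change the two values wherever they already appear):

    [Rendering]
    AllowAntialias=0
    AllowSharpen=1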

----------
Fenris78: Powerful algorithm; I don't understand why this technique wasn't used more before, considering how old it is... maybe to sell more powerful GPUs? :p
It's not so much powerful as lean... you might want to check the original paper:

http://developer.download.nvidia.com/assets/gamedev/files/sdk/11/FXAA_WhitePaper.pdf

And the researcher behind the technique maintains a blog -> http://timothylottes.blogspot.com/ <-

This technique also has a downside that I think you might have missed (like when you said in-game AA + FXAA = great quality, when it actually produces too much blurring).
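
For anyone curious what "lean" means in practice: the whitepaper's first step is just a local-contrast test on luma, and most pixels in a typical frame exit right there, untouched. Here's a rough C sketch of that early-exit test, purely to illustrate the idea, not the injector's actual code (the thresholds are around the whitepaper's "high quality" presets; the luma formula here is the standard Rec. 601 weighting, whereas the paper actually uses an even cheaper approximation that ignores blue):

    #include <math.h>

    #define EDGE_THRESHOLD      (1.0f / 8.0f)   /* relative contrast cutoff */
    #define EDGE_THRESHOLD_MIN  (1.0f / 16.0f)  /* absolute cutoff for dark areas */

    /* Approximate perceptual brightness of an RGB pixel (Rec. 601 weights). */
    float luma(const float rgb[3])
    {
        return 0.299f * rgb[0] + 0.587f * rgb[1] + 0.114f * rgb[2];
    }

    /* m is the centre pixel, n/s/e/w are its direct neighbours.
       Returns 1 if there is enough local contrast to be worth filtering,
       0 if FXAA would skip the pixel entirely. */
    int needs_filtering(const float m[3], const float n[3], const float s[3],
                        const float e[3], const float w[3])
    {
        float lm = luma(m), ln = luma(n), ls = luma(s), le = luma(e), lw = luma(w);
        float lumaMin = fminf(lm, fminf(fminf(ln, lw), fminf(ls, le)));
        float lumaMax = fmaxf(lm, fmaxf(fmaxf(ln, lw), fmaxf(ls, le)));
        return (lumaMax - lumaMin) >= fmaxf(EDGE_THRESHOLD_MIN, lumaMax * EDGE_THRESHOLD);
    }

Only the pixels that pass this test go on to the edge search and blend, which is a big part of why it costs a few fps instead of Ubersampling's ~40%.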

----------
Fenris78: I also noticed that TW2's AA wasn't getting any better with driver options; I finally have the answer to that...
Ah, not only The Witcher 2; I know a few more games this DLL wrapper works with: Risen, Divinity 2: Dragon Knight Saga, Dragon Age 2 in DX11 (though I don't like that game), etc... :)

-Edited-
Post edited August 04, 2011 by Anarki_Hunter
http://hotfile.com/dl/125853856/6607130/injectFxaa_by_some_dude_6.7z.html
Here's the latest update, if anyone's interested.
yaleblor: http://hotfile.com/dl/125853856/6607130/injectFxaa_by_some_dude_6.7z.html
Here's the latest update, if anyone's interested.
I even found 8.7 now: http://hotfile.com/dl/125887302/d5a49f2/injectFxaa_by_some_dude_8.7z.html
For your "Sharp" setting, are you using the User.ini AllowSharpening setting or are you using the FXAA sharpening?

The thing I really like about FXAA is that, more so than the in-game AA, it really takes the edge off the dithered look that results from the shadowing implementation.

I didn't think to try the game with both FXAA and in-game AA on at the same time; I'll have to try that. I don't see much difference in texture quality (at least none that would be visible while playing) with both activated and sharpen on.
Post edited August 05, 2011 by Teiresias
Teiresias: For your "Sharp" setting, are you using the user.ini AllowSharpen setting, or the FXAA sharpening?
At the time of writing (the screenshots), the sharpen filter was not yet implemented in the FXAA injector, so it's the in-game sharpen.
I gotta admit, I never thought about the overall detail of the game increasing just from sharpening (I don't know whether the final viewport image is sharpened, or only the image assets that get mapped onto the geometry; see the little sketch at the end of this post for the kind of filter either approach would use).

"^_^, hehe...technically smart guys in CDPR.

We usually had to depend on fan-made mods (or do it ourselves), which involved taking each individual image asset (or batch processing them) to bump up the sharpness of the textures.

The option in the game (or in FXAA) totally removes the need to download gigabytes of graphic assets again just to increase the detail of the game. :D
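
On the full-screen vs. per-texture question above: the usual cheap trick either way is an unsharp mask, i.e. subtract a small blur from the image and add the difference back on top. A minimal single-channel C sketch, just to illustrate the general technique (this is my guess, not CDPR's AllowSharpen or the injector's actual filter):

    /* Sharpens one channel of a w x h image: reads src, writes dst.
       strength = 0 leaves the image unchanged; higher values exaggerate
       local contrast (and, pushed too far, produce visible halos at edges). */
    void sharpen(const float *src, float *dst, int w, int h, float strength)
    {
        for (int y = 0; y < h; ++y) {
            for (int x = 0; x < w; ++x) {
                /* Clamp neighbour coordinates at the image borders. */
                int xl = (x > 0) ? x - 1 : x;
                int xr = (x < w - 1) ? x + 1 : x;
                int yu = (y > 0) ? y - 1 : y;
                int yd = (y < h - 1) ? y + 1 : y;

                float centre = src[y * w + x];
                /* Small box blur around the pixel. */
                float blur = (centre +
                              src[yu * w + x] + src[yd * w + x] +
                              src[y * w + xl] + src[y * w + xr]) / 5.0f;

                /* Add the high-frequency detail back on top. */
                dst[y * w + x] = centre + strength * (centre - blur);
            }
        }
    }

Run over the final frame, it sharpens everything, HUD included (and pushing the strength too far is where halos come from); run over each texture as it's loaded, it only touches the assets, which is basically what the batch-processed texture mods used to do offline.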
Post edited August 14, 2011 by Anarki_Hunter
Those are shader-made fake details, lulz... useless, sorry.

Real HIGH RES / HIGH DEFINITION user-made textures are a different thing, and neither TW1 nor TW2 has anything like that yet :(

FO3: http://www.fallout3nexus.com/downloads/file.php?id=12056

Oblivion: http://planetelderscrolls.gamespy.com/View.php?view=OblivionMods.Detail&id=7219&id=2363

etc
Licaon_Kter: Those are shader-made fake details, lulz... useless, sorry.

Real HIGH RES / HIGH DEFINITION user-made textures are a different thing, and neither TW1 nor TW2 has anything like that yet :(

FO3: http://www.fallout3nexus.com/downloads/file.php?id=12056

Oblivion: http://planetelderscrolls.gamespy.com/View.php?view=OblivionMods.Detail&id=7219&id=2363

etc
Yeah... real high-resolution or unique textures (used as replacements for the original assets) are a different thing (they're on another level).

But a bit of sharpness does wonders compared to being plain! (The Witcher 2 with SSAA at value 2 doesn't look as detailed as The Witcher 2 with AA + AllowSharpen.)

(Ah, I want to know how CDPR implemented AllowSharpen: a full-screen sharpen, or sharpening only the textures in memory before mapping them onto the geometry? Because full-screen sharpening haloing is almost nonexistent in The Witcher 2! Or my eyes are deceiving me.)

(I also wonder if there's a way to execute a hack when any* image asset is read into memory via either the DirectX API or OpenGL, a hack to apply the sharpening there...)
Licaon_Kter: Those are shader-made fake details, lulz... useless, sorry.

Real HIGH RES / HIGH DEFINITION user-made textures are a different thing, and neither TW1 nor TW2 has anything like that yet :(

FO3: http://www.fallout3nexus.com/downloads/file.php?id=12056

Oblivion: http://planetelderscrolls.gamespy.com/View.php?view=OblivionMods.Detail&id=7219&id=2363

etc
I don't think TW2 needs one... it has really good textures!