STEALTH NINJA Wrote:
Yer monitor won't display it, but yer console is gonna tell ya how fast it can make the pictures...
Bob, here's an example of when you would use this...
Old setup = 299fps... new setup costing £500 = 299fps... If ya turn fps_max up to 500 then you will see an increase. Same goes for v-sync: if it's enabled it will limit yer FPS.
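Just to sketch what a cap like fps_max actually does (a toy illustration, not Source engine code; render() here is a hypothetical stand-in for drawing a frame): the loop sleeps off whatever is left of each frame's time budget, so raising the cap raises the ceiling the counter can report.

    import time

    FPS_MAX = 500                 # the cap, as you'd set with fps_max
    FRAME_BUDGET = 1.0 / FPS_MAX  # seconds allowed per frame

    def render():
        pass                      # hypothetical stand-in for rendering a frame

    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < 1.0:   # run for roughly one second
        frame_start = time.perf_counter()
        render()
        # sleep off the remainder of this frame's budget so the loop
        # can't exceed FPS_MAX frames per second
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
        frames += 1

    # sleep granularity means the real count comes in a bit under the cap
    print(f"rendered {frames} frames in ~1s (capped at {FPS_MAX})")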
If ya ever used Novabench, that displays the max FPS on its stress test; I think I got about 1000fps, which is higher than my older card managed... although I am aware that my screen is only 60Hz.
OK - FPS displayed in-game is how many frames (full screens) are being rendered by the graphics card at that time.
Your monitor will simply ignore any extra frames over its refresh rate, e.g. 160fps from the hardware on a 60Hz screen means the monitor ignores 100 of those frames every second and only displays the ones generated at the moment it updates itself. This is also where screen tear occurs: if your card sends out frame data as the monitor picks up the scan (or sync) signal, then sends another frame whilst the first is still being drawn, the monitor can get out of sync and display part of the second frame as it refreshes instead of all of the first. A good quality monitor should lower the chance of screen tear, though.
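To put numbers on that, here's a toy simulation of the 160fps-into-60Hz case (it assumes the monitor simply grabs whichever frame finished most recently at each refresh, which glosses over the line-by-line scan-out that actually causes the tearing):

    REFRESH_HZ = 60     # monitor refresh rate
    RENDER_FPS = 160    # frames the card produces per second

    # timestamps (seconds) of each rendered frame and each screen refresh
    frame_times = [i / RENDER_FPS for i in range(RENDER_FPS)]
    refresh_times = [i / REFRESH_HZ for i in range(REFRESH_HZ)]

    displayed = set()
    for t in refresh_times:
        # the monitor shows the most recently completed frame at each refresh
        latest = max(i for i, ft in enumerate(frame_times) if ft <= t)
        displayed.add(latest)

    ignored = RENDER_FPS - len(displayed)
    print(f"rendered {RENDER_FPS}, displayed {len(displayed)}, ignored {ignored}")
    # -> rendered 160, displayed 60, ignored 100

Those 100 ignored frames are the wasted work that the v-sync argument below is about.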
Bottom line: I leave v-sync on. There's no point my graphics card running flat out, using electricity and potentially shortening its lifespan (what do you think will last longer, a card running at 30°C or one at 75°C?), when the monitor will discard loads of the data I just paid for. Couple that with the fact that the human eye wouldn't notice the difference anyway...
I turn it off for L4D2 though, because my single-core CPU is arse and I only get 32fps with it.
Ah yeah, my GTX460 wasn't recognised by L4D2 until a recent driver update. No problem, it just means you have to dip into the settings to get things looking nice.