Ivan Tuzikov Posted March 10, 2016 (edited)
I have a Radeon HD 7870 GPU in my PC. The display is set to 60 Hz (the maximum it supports). Could somebody please explain the following to me:
1. When I turn Vsync ON, set the FPS limit to 60, and the GPU renders fewer than 60 FPS, for example 43, what will be the actual number of frames displayed per second?
2. When I turn Vsync OFF, set the FPS limit to 30, and the GPU renders more than 30 FPS, for example 33, what will be the actual number of frames displayed per second?
Thanks in advance!
Edited March 10, 2016 by Ivan Tuzikov
fiveworlds Posted March 10, 2016 (edited)
According to http://www.geforce.com/hardware/technology/adaptive-vsync/technology, when Vsync is on the frame rate is synchronized to your monitor's refresh rate, so it cannot exceed it. As you may or may not know, your monitor can only display a fixed number of frames per second (its refresh rate). When Vsync is off the frame rate can go above the maximum your monitor can show, which causes tearing and other glitchy effects.
Edited March 10, 2016 by fiveworlds
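To make the two modes easier to compare, here is a minimal sketch of a render loop. It is my own illustration, not tied to Radeon Settings or any real graphics API: a software FPS cap simply sleeps so frames are not produced faster than the limit, while Vsync would replace that sleep with a swap/present call that blocks until the monitor's next refresh.

```cpp
// Minimal sketch (generic loop, no real graphics API) of how a software
// FPS cap differs from Vsync. The cap paces frames with a sleep; Vsync
// instead blocks the buffer swap until the monitor's next refresh.
#include <chrono>
#include <iostream>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    const double fps_limit = 30.0;                              // cap set in driver/game
    const std::chrono::duration<double> frame_budget(1.0 / fps_limit);  // ~33.3 ms at 30 fps

    auto last = clock::now();
    for (int frame = 0; frame < 10; ++frame) {
        // ... rendering work would happen here (placeholder) ...

        // FPS cap: sleep out the rest of the frame budget so the loop never
        // produces more than fps_limit frames per second.
        auto elapsed = clock::now() - last;
        if (elapsed < frame_budget)
            std::this_thread::sleep_for(frame_budget - elapsed);
        last = clock::now();

        // With Vsync ON, this pause would instead be a blocking swap that
        // waits for the next refresh (every 1/60 s on a 60 Hz panel),
        // which is what prevents tearing.
        std::cout << "presented frame " << frame << "\n";
    }
}
```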
Ivan Tuzikov Posted March 11, 2016 Author
I think my original post was worded rather unclearly. Say I set a 30 FPS limit through Radeon Settings with V-sync OFF, and my display tops out at 60 Hz. Will the display show each rendered frame twice to match the 60 Hz refresh? Will that eliminate tearing?
fiveworlds Posted March 11, 2016 (edited)
Say I set a 30 FPS limit through Radeon Settings with V-sync OFF, and my display tops out at 60 Hz. Will the display show each rendered frame twice to match the 60 Hz refresh? Will that eliminate tearing?
If your monitor is a 60 Hz display, it can show up to 60 frames per second. If you set your graphics card to a 30 FPS limit, it will not render more than 30 frames per second, and it will not render each frame twice. For instance, if you watch a 60 FPS movie under that limit, you will only ever see 30 of its frames per second: only half of the frames are displayed, and the movie does not increase in duration. So your graphics card would effectively "discard" every second frame. That is fine for most content, but certain things, such as fast motion like a dropped ball bouncing, need a higher frame rate to look natural.
Edited March 11, 2016 by fiveworlds
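To illustrate the movie example with some made-up numbers (my own toy sketch, not anything the driver actually does), here is a loop showing which source frames of a 60 FPS video reach the screen when presentation is capped at 30 FPS: every second source frame is skipped, and playback time is unchanged.

```cpp
// Toy illustration of a 60 fps source shown under a 30 fps cap: only every
// second source frame is ever presented, so half the frames are discarded
// while the playback time stays the same.
#include <cstdio>

int main() {
    const int source_fps = 60;   // frame rate of the movie
    const int cap_fps    = 30;   // frame-rate limit set in the driver

    // Step through the first few presented frames and see which source
    // frame lands on screen each time (integer math: shown * 60 / 30).
    for (int shown = 0; shown < 6; ++shown) {
        int source_frame = shown * source_fps / cap_fps;  // 0, 2, 4, ...
        std::printf("presented frame %d at %5.1f ms comes from source frame %d\n",
                    shown, shown * 1000.0 / cap_fps, source_frame);
    }
}
```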