Camera rotation speed affected by framerate using New Input System

Cheo

Hello, I saw several other threads about this issue, which mention a Unity limitation and suggest using either Rewired or the new Input System. I was already intending to use the latter and imported the integration, which works just as expected and is quite convenient, except for the fact that it doesn't solve the framerate issue ! When limiting the framerate to 30 with Application.targetFrameRate in the CharacterControllerSampleScene scene, the camera takes about 4 seconds to make a whole turn around the Nolan character, while it can be done quite literally in the blink of an eye with no limitation (which amounts to 200+ fps on my computer).
Can someone run a quick test in this sample scene to confirm whether they have the same issue and perhaps provide a fix ? Thanks in advance !
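For reference, this is roughly how I'm capping the framerate for the test - just a throwaway component I drop into the scene, so the class name is my own :
C#:
using UnityEngine;

// Throwaway test component: caps the framerate so the camera rotation speed
// can be compared between a capped and an uncapped run.
public class FramerateCapTest : MonoBehaviour
{
    [SerializeField] private int m_TargetFrameRate = 30; // set to -1 to remove the cap

    private void Awake()
    {
        QualitySettings.vSyncCount = 0;                  // V-Sync would otherwise override the cap
        Application.targetFrameRate = m_TargetFrameRate;
    }
}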
 
Are you running the latest version of the controller? This issue was fixed a while ago, so the movement is framerate independent now.
 
Well, I just created a new test project in Unity 2021.3.10f1 (LTS) with Input System 1.4.3 and UCC 2.4.8, and I can confirm that the exact same issue is present as in my current project (in Unity 2022.1.17f1, with UCC and the Input System up to date as well), but the contrast is in fact even worse due to the uncapped fps sometimes reaching over 500 !
I tried with V-Sync on and off, but it changed nothing. If you have no clue what might be the root of this issue, perhaps I could send you my test project ?
 
Hello, allow me to bump this thread. I double-checked my test project and maintain that there is a framerate issue despite all of the assets and packages being up to date. If nobody else can reproduce this issue, I once again offer to share the test project.
This issue is particularly annoying: I have a game I'd like to publish by the end of the year, but it wouldn't be acceptable with such an inconsistent rotation speed. I really hope someone can help me out with this !
 
I thought that I responded to this. On the camera controller handler there is a checkbox called Adjust for Framerate that you can deselect and it'll work as you expect.
 
Ugh, that was a false hope - I had no idea this parameter existed, but turning it off and on doesn't seem to change anything :(

I believe there's one lead though - the CameraControllerHandler script contains a reference to the old PlayerInput script as m_PlayerInput. It is used several times inside the script to get inputs, so maybe this could be the root of the issue ? I'm gonna try a quick test to figure this out.
 
So far I've mainly been testing with my Xbox One controller, and the framerate issue is definitely present when using it ! If there is a difference for the mouse, it can't be a big one. The CharacterInput asset is left exactly as you designed it. Did you test the input demo scene using a controller, with and without a framerate cap ?
 
Actually, the framerate issue is definitely present with the mouse as well, but the gap is way more noticeable with a controller: the rotation speed is vastly lower when capping to 30 fps.
I must have misunderstood the Player Input thing - apparently it's an abstract class, and the original script doesn't need to be on the character. I really don't have a clue where this issue comes from !
 
So you aren't seeing much of a difference with that toggle and the mouse? You should definitely see a fairly large difference. As a test I set the framerate to 20, and when I move my mouse a set distance in the same amount of time as at a framerate of 200, the camera moves a lot less compared to when that toggle is disabled.
 
After lowering the look sensitivity I can confirm the Adjust for framerate bool has an effect and can raise the rotation speed by a factor of about 1.5 when using a mouse. The issue still remains with the controller, as evidenced by a print of m_PlayerInput.GetLookVector after line 145 of the CameraControllerHandler : regardless of the look vector mode (although it is worse with Smoothed selected), the value is clearly lower when capping the framerate, whether Adjust for framerate is true or false. It seems worth noting that the look vector is obtained just before the check for Adjust for framerate, so there must be something wrong between GetLookVector and the controller. May I please ask you to run another quick test with a controller and a debug of GetLookVector ?
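For reference, the debug line I added looks roughly like this, right after the look vector is retrieved (the exact line number may differ between versions) :
C#:
var lookVector = m_PlayerInput.GetLookVector(true);
// Temporary debug output to compare the raw look vector at different framerates:
Debug.Log($"Look vector: {lookVector}  |  deltaTime: {Time.deltaTime}");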

Here is a quick video so you can see how it looks on my end :

 
I currently don't have a controller to test against, but from the PlayerInput side of things there isn't a difference. It doesn't know what device the input is coming from.

Have you tried using the Input Manager system? I'm curious if that's any different for you. Also, if you divide the look vector by the delta time, that should normalize it no matter what the framerate is. You could try adding that adjustment to see if it helps with your scenario.
 
Hi Cheo, I had noticeably slowed input using a cheap USB controller when I did some tests to ensure Split Screen worked for UCC, and I had to manually make up the difference, but I figured the cheap controller was the issue, or my mouse being on max sensitivity on my system (not in Unity). So for one part I wonder: what is your mouse sensitivity set to outside of Unity? Maybe this is the difference between the speed of mouse input vs controller input. I am using the new Input System as well.
 
Hello, thanks for your answers. Going back to the old input system was what I should have done in the first place - I completely forgot I had this issue with it to begin with and was hoping the new Input System would fix it. So the issue is the same in both cases, and if I understand it correctly, the higher the framerate, the more often GetLookVector will be called, which would explain the gap, correct ?
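To illustrate what I mean, here is a rough sketch of my mental model - this is not the actual UCC code, just the accumulation I have in mind :
C#:
// Mental model only: a constant stick deflection sampled every frame adds up with the
// number of frames, unless each frame's contribution is scaled by the frame time.
float stickValue = 0.5f;  // constant deflection on the stick
float sensitivity = 2f;
foreach (var fps in new[] { 30, 200 })
{
    float totalRotation = 0f;
    for (int frame = 0; frame < fps; frame++)       // one simulated second
    {
        totalRotation += stickValue * sensitivity;  // no deltaTime scaling
    }
    Debug.Log($"{fps} fps -> {totalRotation} rotation units per second");
}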
I don't understand where exactly I'm supposed to divide this look vector - I don't even know where this property is ! Are you just talking about lookVector in CameraControllerHandler's FixedUpdate ?

My mouse's DPI is about 800 and the Xbox One controller has a lower sensitivity by default, but that's not the point - regardless of the controller's sensitivity, the final rotation values should remain consistent whatever the framerate.
 
Within the camera controller handler this value is retrieved:

Code:
var lookVector = m_PlayerInput.GetLookVector(true);

It is this value that is causing a problem for you. If you divide by the delta time then it will normalize it per frame.
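In other words, the adjustment would go right after that value is retrieved, something along these lines :
C#:
var lookVector = m_PlayerInput.GetLookVector(true);
// Normalize the per-frame value by the frame time so the total rotation
// no longer depends on how many frames were rendered:
lookVector /= Time.deltaTime;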

I really wish that I could reproduce this on my end. It's strange that I'm not able to even with a mouse.
 
Dividing this value by deltaTime, which is 0.02, results in a much bigger number - a look vector of just 1, for example, becomes 50 ! Setting the look sensitivity in Unity Input to 0.001 doesn't even solve the issue, and it wouldn't be a desirable solution even if it did. And once again this is just using an Xbox One controller, which has a limited sensitivity. Dividing the high result by something like 50 does not solve the issue either and seems to lead back to inconsistent rotation speeds between different framerates.

I simply wrote this line, and tried it both before and after the adjust for framerate line, with and without this parameter. Is this what you wanted me to write, and if so, don't you get some crazy boost on your end ?
C#:
lookVector /= Time.deltaTime;
 
Forgot to say that the division doesn't even solve the rotation speed issue with the controller. However, I just did another test with my mouse at a very low DPI, and it actually looks like it doesn't have any issue at all and doesn't need any tweaking ! I really don't get it - if someone with a controller could please run a test on their end, it would help see things more clearly !
 
lookVector /= Time.deltaTime;
Off topic, but I advise that you never use division for a look vector, or much else really, the reason being that it'll cause errors when dividing by 0. You can always achieve division without that chance of error by using multiplication instead - for example, 1/10 = 1 * 0.1. You could then just increase this value to get a more desirable result, or multiply it by some factor.
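Something along these lines, just as a rough sketch of what I mean :
C#:
// Same effect as dividing by Time.deltaTime, but through a guarded reciprocal,
// so a zero delta time can never blow the value up:
var dt = Time.deltaTime;
var invDt = dt > 0f ? 1f / dt : 0f;
lookVector *= invDt;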
Really wish I could help more, but I don't seem to get a noticeable framerate difference, only a mouse vs controller difference.
 
Please make it clear for me : when using your controller, does the rotation speed remain consistent whether the framerate is uncapped or capped to a low number like 30 or 20 ? If not, do the adjust for framerate bool or division/multiplication by delta time change anything ?

I just remembered I have another controller somewhere - I'll go grab it in a few days and run the same test to check whether this issue only occurs with my Xbox One controller.
 
If not, do the adjust for framerate bool or division/multiplication by delta time change anything ?
At the moment I am not able to test this, sorry - I am tied up with work - but I will indeed run this test for you as soon as I get a chance and give a more complete answer :)
 
I think the best route is for me to be able to reproduce it so I can look into it further. Are you able to reproduce the problem with just a mouse?
 