Mouse input sensitivity changes with framerate change

I am wondering if you have any insight into a problem I am facing regarding mouse sensitivity.
I changed the input from FixedUpdate to the normal Update loop because a 60 fps update is noticeable when playing at 240 fps; a few players mentioned the mouse sensitivity felt like it was running at 60 fps. After changing the update frequency, I am noticing a big difference in in-game sensitivity when switching the framerate between 60, 144, and 240 fps. The input is set to raw, and since the mouse is polled each frame and the cursor update is the change in position from one Update to the next, I am not sure why this is happening. Any ideas?
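For reference, here is a minimal sketch of the pattern I would expect to be framerate-independent (RawLookSketch and LookSensitivity are placeholder names, not the actual UCC fields). Unity's raw mouse axes already report the movement since the last frame, so the per-frame deltas should sum to the same total rotation at any framerate:

C#:
using UnityEngine;

// Minimal framerate-independence sketch (hypothetical names, not the UCC API).
// "Mouse X"/"Mouse Y" raw axes return the delta since the last rendered frame,
// so they are NOT multiplied by Time.deltaTime; summing them over one second
// yields the same total rotation regardless of how many frames ran.
public class RawLookSketch : MonoBehaviour
{
    public Vector2 LookSensitivity = new Vector2(2f, 2f);
    private float m_Yaw;
    private float m_Pitch;

    private void Update()
    {
        // Delta since the last frame: already framerate-independent by construction.
        m_Yaw   += Input.GetAxisRaw("Mouse X") * LookSensitivity.x;
        m_Pitch -= Input.GetAxisRaw("Mouse Y") * LookSensitivity.y;
        m_Pitch  = Mathf.Clamp(m_Pitch, -89f, 89f);
        transform.rotation = Quaternion.Euler(m_Pitch, m_Yaw, 0f);
    }
}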
 
I might be missing something, because to me it looks like the raw look vector mode is a straight CurrentLookVector = RawLookVector assignment, and I removed the time-dependent pieces:

C#:
/// <summary>
/// Updates the look smoothing buffer to the current look vector.
/// </summary>
private void UpdateLookVector()
{
    if (!m_HasFocus) { // || m_UpdateFrame == Time.frameCount) {
        return;
    }
    // m_UpdateFrame = Time.frameCount;

    // Raw mouse axes already report the movement since the last frame.
    m_RawLookVector.x = GetAxisRaw(m_HorizontalLookInputName);
    m_RawLookVector.y = GetAxisRaw(m_VerticalLookInputName);

    // Apply per-axis sensitivity and pass the result straight through.
    m_RawLookVector.x = m_RawLookVector.x * m_LookSensitivity.x;
    m_RawLookVector.y = m_RawLookVector.y * m_LookSensitivity.y;
    m_CurrentLookVector = m_RawLookVector;
}

*** Just saw that the camera controller handler is pulling the value on a fixed update, doh.
Solved, thanks.
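For anyone who hits the same thing, here is a rough sketch of the mismatch and the workaround, using hypothetical names (LookSource and ApplyLook are not the actual UCC classes). If FixedUpdate reads the per-frame mouse delta directly, it only sees roughly one out of every four frames at 240 fps with a 60 Hz fixed step, so sensitivity effectively scales with framerate. Accumulating the delta every Update and draining it in FixedUpdate keeps the total intact:

C#:
using UnityEngine;

// Sketch of the bug and the fix (hypothetical names, not the actual UCC classes).
public class LookSource : MonoBehaviour
{
    private Vector2 m_Accumulated;

    private void Update()
    {
        // Sample every rendered frame so no mouse delta is lost.
        m_Accumulated.x += Input.GetAxisRaw("Mouse X");
        m_Accumulated.y += Input.GetAxisRaw("Mouse Y");
    }

    private void FixedUpdate()
    {
        // BUG: calling GetAxisRaw here only sees the current frame's delta;
        // at 240 fps with a 60 Hz fixed step, roughly 3 of every 4 samples
        // are dropped, so sensitivity falls as framerate rises.
        // FIX: consume the accumulated delta and clear it.
        Vector2 look = m_Accumulated;
        m_Accumulated = Vector2.zero;
        ApplyLook(look);
    }

    private void ApplyLook(Vector2 look)
    {
        // Rotate the camera/character here.
    }
}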
 
It wasn't as easy as I had expected, but I am also using a very old UCC build, so I'm not sure this applies to the current release. I changed the fixed timestep duration, tweaked all of the physics, and scaled some of the animation controller timings of the deterministic objects so everything adjusts properly.
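In case it helps anyone else, the knob I mean is Unity's fixed timestep; a quick sketch, with the 144 Hz value purely illustrative:

C#:
using UnityEngine;

// Example of raising the fixed update rate (the value is illustrative).
public class FixedStepConfig : MonoBehaviour
{
    private void Awake()
    {
        // Shorten the fixed timestep so FixedUpdate runs at ~144 Hz.
        Time.fixedDeltaTime = 1f / 144f;
        // Any hand-tuned per-tick quantities (impulses, animation step
        // timings, deterministic object schedules) assume the old tick
        // length and need to be rescaled by (oldStep / newStep).
    }
}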
 
The latest version includes this change. If you are using the Update loop you should set Application.targetFrameRate.
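For example, a minimal sketch (the 240 cap is illustrative, and note that targetFrameRate is ignored unless vSync is off):

C#:
using UnityEngine;

public class FramerateConfig : MonoBehaviour
{
    private void Awake()
    {
        // targetFrameRate has no effect while vSync is enabled.
        QualitySettings.vSyncCount = 0;
        Application.targetFrameRate = 240; // illustrative cap
    }
}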
 