I’m not exceptionally skilled at first-person shooters, but I’m very particular about making sure that my look sensitivity in first-person games is what I’m used to, even in a non-competitive context. That is to say, it should feel the same between games. To me, this seems like something easy to solve, but every game I’ve ever played, no matter the budget, does the same thing - it gives you a slider with an arbitrary unit on it.
Preface
Mouse sensitivity is measured in DPI: dots per inch, where a “dot” refers to the smallest unit of raw sensor movement. If your OS leaves the sensitivity alone, then DPI effectively means pixels per inch.
If you want the most precision possible, you should not adjust your mouse sensitivity in your OS settings - you should configure it in the mouse software. If you agree with me or you don’t care, feel free to skip the rest of this section:
Mouse software
When you adjust your mouse sensitivity in the mouse software, you’re adjusting the sensitivity of the sensor. So if you increase it by 25%, then the mouse will report 1 dot at 80% of the distance of the previous sensitivity (1 / 1.25 = 0.8). All is well and good here.
OS Sensitivity
In contrast, if you set your OS sensitivity to 125%, your OS tries to fake this increase: if the mouse reports 4 dots in one poll, it’s adjusted to 5. If the mouse reports 8, it’s adjusted to 10. This works fine for these nicely divisible values; however, for movements that don’t scale to a whole number of pixels, the OS uses subpixel precision by storing the remainder in an internal accumulator. For simplicity, let’s use a scenario where the mouse is moving at a steady pace of 1 dot per poll at 125% OS sensitivity:
| poll # | Pointer movement | Accumulator after movement |
|---|---|---|
| 1 | 1 | 0.25 |
| 2 | 1 | 0.50 |
| 3 | 1 | 0.75 |
| 4 | 2 | 0.00 |
| 5 | 1 | 0.25 |
| 6 | 1 | 0.50 |
And so on. Note that there are separate accumulators for the X and Y axes, but for this example, let’s assume the mouse is moving along only one axis.
In the first poll, the mouse reports 1 dot, which gets converted to 1.25 pixels, so the cursor moves 1 pixel and the remaining 0.25 is stored in the accumulator. This process repeats, storing an additional 0.25 in the accumulator until poll #4, where adding the 1.25 movement to 0.75 leads to a movement of 2 pixels. So every 4th poll, the mouse moves 2 pixels instead of one, adding jitter to our steady 1 dot per poll speed.
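The accumulator behavior above can be sketched in a few lines. This is a simplified model of the idea, not any OS’s actual implementation - the names are my own, and it assumes positive movement along a single axis:

```python
def simulate_polls(dots_per_poll, os_sensitivity, polls):
    """Yield (pixels moved, accumulator) per poll, carrying the
    subpixel remainder forward like the OS accumulator above."""
    accumulator = 0.0
    for _ in range(polls):
        scaled = dots_per_poll * os_sensitivity + accumulator
        moved = int(scaled)               # whole pixels the cursor moves
        accumulator = scaled - moved      # subpixel remainder carried over
        yield moved, accumulator

# 1 dot per poll at 125% OS sensitivity, as in the table:
for poll, (moved, acc) in enumerate(simulate_polls(1, 1.25, 6), start=1):
    print(f"poll {poll}: moved {moved} px, accumulator = {acc:.2f}")
```

Running this reproduces the table: every 4th poll moves 2 pixels instead of 1, which is the jitter described above.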
You can test this yourself, if you’ve got the hand and eyes for it. Set your OS mouse sensitivity to some value a little over 1x and then move the mouse along an axis, pixel by pixel. You’ll notice that at some point, it will jump 2 pixels instead of 1. If you move the mouse back, it jumps 2 pixels again, but only 1 pixel in the area immediately around it.
The precision loss is mostly negligible at values close to 1:1 and gets worse the higher the OS sensitivity is set. For example, if your OS sensitivity is 3x the mouse sensitivity, the cursor can never move fewer than 3 pixels at a time.
This is why it’s common for competitive players to recommend modifying mouse sensitivity only in mouse software and leaving OS sensitivity at 1:1 (as well as disabling acceleration, or “Enhance pointer precision” on Windows, but that’s a different story).
Granted, you don’t lose much precision when the OS lowers the mouse sensitivity rather than raising it, but the point is: for the rest of this post, we’ll assume a 1:1 OS sensitivity (or otherwise Raw Input being enabled in the game).
The Problem
The naive approach to achieve consistent look sensitivity between first-person games is aiming for the same degrees of rotation per dot. You might measure how much your mouse needs to travel to turn 360 degrees, and you might notice that despite this, it still doesn’t feel the same between games. This is due to a difference in FOV between games.
I created a tool in Unity many years ago to test and transfer look sensitivities to other games. The page has a pretty good explanation:
Though this tool has the option to measure how far one’s mouse travels in a 360 degree turn, it can also measure the distance to look from one edge of the player’s field of view to the other (from an object on one edge of the screen to an object on the other). Consistent sensitivity using this reference is typically preferred over consistency in degrees because your reticle travels the same amount relative to objects around it.
For example, if you’re looking down the scope of a sniper rifle, you would not want to turn the same number of degrees per dot that you would when looking around normally. You would want the visible world to move across the screen by the same amount. Measuring the distance your mouse travels when looking across your field of view creates this perceived consistency at all FOVs, so you do not need to maintain a consistent FOV between games.
To be fair, matching sensitivities in terms of degrees of rotation is fine so long as the FOV is consistent between games. Another factor is the window size, but I’ll assume most people play in full screen.
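To make the relationship concrete, here’s a sketch of the conversion this implies. The function name and the numbers are my own; it relies on the fact that looking from one screen edge to the other is a camera rotation by the horizontal FOV in degrees:

```python
def convert_360_distance(dots_per_360_a, fov_a, fov_b):
    """Given the mouse travel (in dots) for a 360° turn in game A at
    horizontal FOV fov_a, return the dots per 360° to aim for in game B
    at fov_b, so that crossing the FOV takes the same physical travel."""
    dots_per_fov = dots_per_360_a * fov_a / 360.0  # dots to sweep game A's FOV
    return dots_per_fov * 360.0 / fov_b            # same sweep, expressed for game B

# e.g. 10,000 dots per 360° at 90° FOV matches 7,500 dots per 360° at 120° FOV
```

Note that the wider-FOV game ends up turning faster in degrees per dot - which is exactly what keeps the edge-to-edge mouse travel constant.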
Anyway, the tool was more of a learning project for me - you really just need a piece of paper.
The Solution
On my desk there’s an old envelope with 2 markings on it a couple inches apart. The distance between these markings is the distance my mouse should travel to look from one side of my FOV to the other. So whenever I start a new first-person game, I do the following:
1. Align 2 objects exactly at both edges of my screen
2. Look at the object to the left, with my crosshair aligned vertically at the horizon
3. Pause the game
4. Put the paper under my mouse with the right side of the mouse aligned with the first marking
5. Unpause the game and move my mouse to the right until I’m looking at the object to the right
6. See where the mouse landed. If the right side of the mouse is to the left of the marking, decrease the game’s look sensitivity. If it’s to the right of it, increase it. Then start again from step 2 until the mouse ends up aligned with the second marking.
I drew the markings themselves with the same process, except instead of the crosshair, I moved my desktop cursor from one side of the screen to the other - I wanted my look sensitivity to feel like my pointer speed. Another benefit of this method is that it ensures consistent look sensitivity even between mice from different brands, whose software may report different physical speeds for the same DPI value, since you’re basing everything on physical mouse travel distance.
Ideally, developers would use something like “Dots per FOV°” as the unit. If we measure look sensitivity in the dots needed to turn the camera by the number of degrees your horizontal FOV spans, then we have just one number to remember. Want to look around at about the same speed as your pointer? Set it to your horizontal resolution (again, assuming full screen and a 1:1 OS sensitivity). The only caveat is that higher numbers would mean slower turning, which might be counterintuitive, but I guess you could just use the reciprocal or something.
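As a sketch of what a game could do internally with such a unit - the function name and the numbers are my own illustration, not any engine’s API:

```python
def degrees_per_dot(dots_per_fov, horizontal_fov_degrees):
    """Turn a 'Dots per FOV°' setting into camera rotation per mouse dot:
    dots_per_fov is the mouse travel (in dots) for a rotation spanning
    the horizontal FOV."""
    return horizontal_fov_degrees / dots_per_fov

# At 90° FOV on a 1920-px-wide screen (1:1 OS sensitivity), setting the
# slider to 1920 yields 90 / 1920 = 0.046875° of rotation per dot, so
# sweeping the FOV takes the same mouse travel as the cursor crossing
# the screen.
```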
Realistically, though, no game developer is going to start doing this, so if you want consistent look sensitivity, just use the paper method.
Thanks for reading.