Mouse Sensitivity and Settings in Apex Legends
Some questions come up all the time on the Apex Legends Discord: «What sensitivity should I use?», «What DPI should I be playing at?» and «My aim sucks, what should I do?». This post is going to answer them.
What sensitivity to use?
The obvious (and pretty stupid) answer to that question is: «Just use what feels right!». But let’s take this question apart by first answering as if it came from someone who has played other games in the past, because that’s easy to answer: If you have played another first or third person shooter for extended periods of time, start out using the same sensitivity.
Well, let me specify that a little better: Use the equivalent sensitivity, which may or may not be the same sensitivity. It depends on the game’s inherent yaw value. Boring explanation in the next paragraph:
Essentially, when a game transforms the delta of your most recently detected mouse movement into a rotation of your view, it does so using a specific yaw value. In Apex Legends, for example, the default yaw is 0.022. This value gets multiplied with your sensitivity, and the result is the number of degrees your view gets rotated per mouse increment. So if, for example, you have a sensitivity of 5 and your mouse reports an increment, the game will rotate your view by 5*0.022=0.11º in the game world.
This means to convert your sensitivity from one game to Apex Legends and have it do the same amount of rotation per mouse increment, you need to multiply it by the yaw of that other game over 0.022. Ergo, the formula for that would be:
Converted sensitivity = old sensitivity * old yaw / new yaw
For example, Overwatch has a yaw of 0.0066, and as we now know Apex uses 0.022. That gives us 0.0066/0.022=0.3. So let’s say we use 6.3 sensitivity in Overwatch, which would rotate our view by 6.3*0.0066=0.04158º. To get the same rotation per increment in Apex Legends, we calculate the sensitivity we need to use via 6.3*(0.0066/0.022)=6.3*0.3=1.89. Let’s double check if a sensitivity of 1.89 rotates us the same amount of degrees in Apex Legends: 1.89*0.022=0.04158º. Cool. But before you go and play, note that even though you might have the mathematically correct equivalent sensitivity now, it might not feel the same yet. We will get to why that is in a second.
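For the lazy, the whole conversion fits into a tiny helper. A sketch (the function name is mine; the numbers are the Overwatch example from above):

```python
def convert_sensitivity(old_sens, old_yaw, new_yaw):
    """Sensitivity that rotates the view by the same angle per mouse increment."""
    return old_sens * old_yaw / new_yaw

# Overwatch (yaw 0.0066) -> Apex Legends (yaw 0.022), as in the example above
apex_sens = convert_sensitivity(6.3, 0.0066, 0.022)
print(apex_sens)  # 1.89, up to floating-point rounding
```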
So how do you find out what yaw a game uses? Well, you try to google it, measure it yourself (with Sensitivity Matcher for example), or see if it happens to be on mouse-sensitivity.com. Also here’s a bunch of values for popular games right now:
- 0.022 – any Source Engine game (CS:GO, Apex Legends) and most id-tech 2 and later games (Quake, Doom (2016), Call of Duty, CS 1.6, etc.)
- 0.0066 – Overwatch
- ~0.01007 – Paladins
- 0.5555 – Fortnite (if you are using the in-game slider)
Now that you have your mathematically correct equivalent mouse sensitivity, chances are there are too many digits behind the decimal point, and the menu in Apex Legends only allows for a measly single one. The solution is to put the value directly into your settings.cfg file. The entry you are looking for is called ‘mouse_sensitivity’.
Alright, after doing all that—okay, it was only two steps, calculating your equivalent sensitivity and putting it into your config—you might still end up realizing that your sensitivity feels off. The reason is that, while we did calculate the mathematically correct equivalent sensitivity, it might not be the “tangibly” correct relative sensitivity: you will rotate by the same number of degrees with the same mouse movements, but that doesn’t necessarily mean your crosshair moves the same distance on your screen, because that also depends on your FoV. If the FoV isn’t the same in both games, it’s not gonna feel the same. Now you could just lower your Apex FoV to 90º to get the same feeling as in CS:GO or whatever, but it turns out you don’t have to, because we can easily calculate the correct relative sensitivity the same way games calculate it when you use the zoom on a weapon, for example. All we need to do is divide the equivalent sensitivity we just calculated by the FoV of the other game over the FoV you use in Apex. You can obviously do the same calculation when you just want to change your FoV in Apex and have your mouse movements feel the same. The formula is:
Relative sensitivity = sensitivity / (old FoV / new FoV)
Oh, and this is math we are talking about, so you can just multiply with the reciprocal, which is gonna make it a lot faster to punch into your calculator, as you can ignore the brackets:
Relative sensitivity = sensitivity * new FoV / old FoV
For example, if you are using a sensitivity of 3.87 with a 90º FoV in CS:GO, but a 100º FoV in Apex, the relative sensitivity that would feel the same in Apex would be 3.87*100/90=3.87/0.9=4.3.
Great, we’ll set our sensitivity in Apex to 4.3. Let’s say three games later we realize that a 100º FoV in Apex feels funky, and we want to go to 83º so stuff on the screen looks less distant. No problem: 4.3*83/100=356.9/100=3.569.
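Both of those calculations are the same one-liner. A sketch (the helper name is mine; the numbers are the two examples above):

```python
def relative_sensitivity(sens, old_fov, new_fov):
    """Scale sensitivity so the same mouse motion moves the crosshair the
    same apparent distance on screen after an FoV change."""
    return sens * new_fov / old_fov

print(relative_sensitivity(3.87, 90, 100))  # ~4.3   (CS:GO at 90º -> Apex at 100º)
print(relative_sensitivity(4.3, 100, 83))   # ~3.569 (Apex at 100º -> 83º)
```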
But what if you think your sensitivity is wrong and your aim is bad as a result of it, or what if this is your first shooter on PC?
«My aim sucks, what should I do?»
We had the stupid answer of «Just use what feels right!» already, so let’s tell you how you get to the point where you think it feels right:
1. Go into training in Apex Legends
2. Do the duck/jump/run/slide/walk thingy in the beginning
3. Follow Bloodhound down to the boxes
4. Try to keep your crosshair on his head while doing extended strafes left and right
5. Repeat step 4 at varying distances.
Essentially do this:
- If your crosshair tends to overshoot (meaning it tends to be to the left of his head when you strafe right and vice versa), then decrease your sensitivity and repeat step 4.
- If your crosshair tends to undershoot (meaning it tends to be to the right of his head when you strafe right and left when you strafe left), increase your sensitivity and repeat step 4.
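This adjust-and-repeat drill is just a feedback search. Here’s a sketch of the idea in code; the concrete starting values and the halve-the-step-each-round schedule are my own suggestion, not anything the game does:

```python
def adjust(sens, feedback, step):
    """One round of the tuning drill above."""
    if feedback == "overshoot":      # crosshair ends up past the head: too fast
        sens -= step
    elif feedback == "undershoot":   # crosshair lags behind the head: too slow
        sens += step
    return sens, step / 2            # shrink the step each round to home in

# three rounds of the drill, starting at sensitivity 2.5 with a 0.4 step
sens, step = 2.5, 0.4
for feedback in ["overshoot", "overshoot", "undershoot"]:
    sens, step = adjust(sens, feedback, step)
print(round(sens, 2))  # 2.0
```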
Here’s a little config tweak that will make this process a lot less cumbersome in case you need to set your sensitivity with more than one decimal place of accuracy. Put this into your settings.cfg:
bind_US_standard "F5" "exec sens.cfg"
Then create a new text file called ‘sens.cfg’ in the ‘cfg’ subdirectory inside the Apex directory (default: C:\Program Files (x86)\Origin Games\Apex\cfg) and put this into it (the value is just an example):

mouse_sensitivity "2.35"
Now, when you want to change your sensitivity in a more accurate manner, you can edit this file, save it, and then press F5 in the game to set your sensitivity to the value that you set in the file.
Obviously, you shouldn’t expect to arrive at a sensitivity that gives you perfectly accurate tracking. But eventually, as you make your adjustments smaller, you should home in on what feels right and accurate enough for your skill. You should use that sensitivity.
Last but not least, there are some special cases:
- You overshoot and undershoot depending on the distance
- The required arm motion to track is uncomfortably ample for the sensitivity you arrive at
- At closer distances, you constantly run out of mouse-pad for the sensitivity you arrive at
- You have really good muscle memory at your original sensitivity but realize your new sensitivity needs to be far lower
If any of these points apply to you, throw all the nonsense you think you know about mouse acceleration (like «It’s bad for muscle memory!») out of the window and inform yourself about proper mouse acceleration. Here are two videos to get you started: https://youtu.be/KORL144_co8, https://youtu.be/PmY1OTacEzA?t=57. Ignore all the scary stuff you might hear about unsigned drivers; Povohat’s mouse accel filter is incredibly simple to install nowadays: http://mouseaccel.blogspot.com/2015/12/new-method-for-mouse-acceleration.html.
«What DPI should I be playing at?»
First off, there’s no such thing as DPI in mouse sensors. Your mouse doesn’t scan arbitrary dots on some surface. The sensor counts a (nowadays usually adjustable) number of increments per inch, so the correct term is CPI (counts per inch), not DPI (dots per inch). So what CPI should you play at? There’s no simple answer here; it depends on your mouse.
Until fairly recently (about 2-3 years ago), the vast majority of mouse sensors had a specific native CPI value and could only count at that value. Every other value you set in the driver or with a button on the mouse was achieved artificially, by doing an operation on the information from the native CPI. That is potentially a problem. It works fine when you go from a native CPI of 800 to 400, for example, because you can do that cleanly: you just throw out every second increment and it’s gonna feel perfectly consistent. Imagine walking up a staircase and skipping every second stair; you move at a constant speed. But when you set some arbitrary value, like 500, there is no clean way to do it. You’d have to keep only one increment per 1.6 counted, i.e. five out of every eight. What essentially happens is that the mouse reports several increments in a row, then throws one away, then reports several in a row, and so on. Imagine walking up a staircase of 8 stairs, but having to skip 3 of them along the way: you can’t do that at a constant, regular pace. You take 2, skip 1, take 2, skip 1, take 1, skip 1. It literally causes skipping in your mouse movement, and that’s not great.
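You can see the irregular skipping by simulating the downsampling. This sketch assumes the simple keep-a-count-whenever-the-scaled-total-advances scheme described above (the function name is mine):

```python
import math

def kept_counts(native_counts, native_cpi, target_cpi):
    """Return, for each native increment, whether it survives downsampling."""
    ratio = target_cpi / native_cpi
    kept = []
    for i in range(native_counts):
        # a count is only reported when the scaled running total advances
        kept.append(math.floor((i + 1) * ratio) > math.floor(i * ratio))
    return kept

# 800 -> 400 CPI: every second count kept, perfectly regular
print(kept_counts(8, 800, 400))
# 800 -> 500 CPI: irregular runs of kept counts with skips in between
print(kept_counts(8, 800, 500))
```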
So if your mouse isn’t terribly new (and even if it is), google ‘mouse name native DPI’ to find out whether your sensor has a single native CPI, and if so, use that. You can halve it or divide it by 4 if you want to; that’s always clean. If your mouse has a laser sensor, there’s another good reason to go lower: laser sensors tend to have some inherent (mild) acceleration that you can tune down by using a lower CPI.
If you have a fancy new sensor that doesn’t have a single native CPI, you can use whatever you want. Well, you should actually do some tests first, because at extremely high values (far above 3200) those sensors still tend to cause occasional tracking issues, but something like 3200 CPI is generally safe from that. There’s also a very good reason to use higher CPI values: it is mathematically more accurate to increase your CPI while lowering your sensitivity accordingly, because you’ll be able to do smaller, more accurate turns; there’s just more input information for the game to work with. So my actual recommendation is: use your native CPI or (if your mouse/sensor doesn’t have one) the highest available CPI (but no higher than 3200), and if your mouse has a laser sensor, cut that value in half. To adjust your sensitivity accordingly and end up with the same effective sensitivity:
New sensitivity = old sensitivity * old CPI / new CPI
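For example, quadrupling from a native 800 CPI to 3200 CPI means quartering your in-game sensitivity (the helper name is mine):

```python
def compensate_cpi(old_sens, old_cpi, new_cpi):
    """Keep the same effective sensitivity after changing CPI."""
    return old_sens * old_cpi / new_cpi

print(compensate_cpi(2.0, 800, 3200))  # 0.5: same turn speed, 4x finer input steps
```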
That’s about it, hope this answered your questions. If you have any further questions, feel free to hit me up at email@example.com, Twitter (@haschischtasche) or on Discord (vergeofapathy#6694, you can also @haschischtasche on the Apex Legends server).
I like waffles…