The actual answer is on Stack Exchange, in the comments there.
It comes down to a mix of converting the actual display resolution to a virtual (scaled) resolution, plus the use of single-precision floating point calculations.
Essentially, my understanding is that it is storing the value needed to convert your actual resolution's pixel count (2160p) to a virtual resolution pixel count (2160 / 1.75), but that gives you fractions of a virtual pixel. So instead of 1.75 it scales by 1.75182... in order to land on a whole number of virtual pixels to work with. Then, on top of that, the figure is shifted slightly from what we'd expect by floating point error.
If you take the actual resolution of 2160 pixels and divide it by the virtual resolution it is trying to use, 1233 pixels, you need a conversion factor of 1.75182... so you don't get fractions of a pixel. If you used 1.75 you'd get 1234.2857... pixels. So GNOME is storing the fraction that gives a clean conversion in pixels, accurate to about 4 decimal places.
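As a quick sketch of the arithmetic (Python; the figures are the ones from the example above, and the choice of 1233 logical pixels is taken from the answer rather than derived from GNOME's actual selection logic, which I haven't reproduced here):

```python
import struct

physical_px = 2160      # actual pixels along one dimension of the panel
nominal_scale = 1.75    # the fractional scale the user asked for

# The nominal scale does not give a whole number of virtual pixels:
print(physical_px / nominal_scale)   # 1234.2857142857142

# Working back from a whole virtual pixel count (1233 in this example)
# gives the scale that converts cleanly:
logical_px = 1233
exact_scale = physical_px / logical_px
print(exact_scale)                   # 1.7518248175182482

# Round-tripping that through a single-precision float nudges it again,
# which would explain the slightly odd figure that actually gets reported:
single = struct.unpack('f', struct.pack('f', exact_scale))[0]
print(single)                        # 1.7518248558044434
```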
Full credit to rakslice at Stack Exchange, who also goes into the details.