My OS is set to the monitor's native resolution of 1920x1080.
The monitor is connected via DVI (even though it reports HDMI).
DVI is a digital format; it sends 24-bit RGB values pixel by pixel.
There would be no room for this kind of distortion if the monitor simply displayed what the video card sent and nothing more.
However, the monitor itself processes the picture, applying brightness, contrast, gamma and other "improvements". For example, it could show everything in sepia.
(If you are interested, you can look at the manual:
https://www.lg.com/ca_en/support/manuals?csSalesCode=W2361V-PF.ACC)
One such "improvement" is Sharpness. It probably applies a sharpening mask that emphasizes gradients to make edges look crisper, and that is where the subpixel distortion comes from.
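To illustrate what I mean (this is only a guess at the kind of filter the monitor's scaler applies, not LG's actual algorithm), here is a minimal sketch of a sharpening kernel acting on a step edge. It produces dark and bright halos on either side of the edge, which on an LCD panel shows up as fringing:

```python
import numpy as np

# A soft grey-to-light step edge, one value per pixel (grayscale for simplicity).
edge = np.array([50, 50, 50, 50, 200, 200, 200, 200], dtype=float)

# A typical 1-D sharpening kernel: boost the centre pixel, subtract the neighbours.
# The exact kernel and strength the monitor uses are unknown; this is illustrative.
kernel = np.array([-0.5, 2.0, -0.5])

# Convolve and clip back to the displayable 0..255 range.
sharpened = np.clip(np.convolve(edge, kernel, mode="same"), 0, 255)

# Drop the first and last samples (zero-padding boundary artifacts).
print(sharpened[1:-1])
# [ 50.  50.   0. 255. 200. 200.]
# The original 50 -> 200 step is now surrounded by a 0 (dark halo) and a 255
# (bright halo): overshoot that was never in the source pixels.
```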
This processing might make sense for an analog signal, or perhaps for movies, but for graphic work and drawings it does not. I believe I've minimized it by setting Sharpness near the middle, to 5. A value of 1 is slightly blurry, 10 produces this subpixel distortion, and 4 adds some antialiasing.