How can the algorithm fail, at least on a linear axis?
Roughly speaking, it tries candidate step values, e.g. 1.0, 0.5, 0.2, 0.1, 0.05, 0.02, ...,
and for each step calculates the interval length in pixels.
If that length is between MinLength and MaxLength, the candidate is accepted;
otherwise it is rejected.
Usually, the first candidates are rejected because the step is too long, and the last ones
because the step is too short.
If MinLength and MaxLength are too close, it is quite possible that not a single candidate
is acceptable. For example, suppose the scaling factor is 100 pixels per unit.
Then step=1.0 gives an interval of 100 pixels,
step=0.5 gives an interval of 50 pixels, step=0.2 gives an interval of 20 pixels, etc.
If you set MinLength=30 and MaxLength=40, then every candidate will be rejected,
because the interval jumps from 50 pixels straight down to 20, skipping the [30, 40] window.
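The failure mode can be sketched in a few lines of Python. This is an illustrative reconstruction, not the library's actual code: the function name `choose_step`, the parameter names, and the exact candidate range are assumptions; only the 1/0.5/0.2 mantissa pattern and the MinLength/MaxLength acceptance test come from the description above.

```python
def choose_step(scale_px_per_unit, min_length, max_length):
    """Pick the first candidate step whose pixel interval fits the window.

    Candidates follow the 1 / 0.5 / 0.2 pattern scaled by powers of ten,
    from coarse to fine. Returns None if every candidate is rejected.
    """
    mantissas = (1.0, 0.5, 0.2)
    for exponent in range(3, -6, -1):          # 1000, 500, 200, ..., down to tiny steps
        for m in mantissas:
            step = m * 10.0 ** exponent
            interval_px = step * scale_px_per_unit
            if min_length <= interval_px <= max_length:
                return step                    # first acceptable candidate wins
    return None                                # no candidate fit the window

# Scaling factor 100 px/unit: step 1.0 -> 100 px, 0.5 -> 50 px, 0.2 -> 20 px.
# The [30, 40] window falls in the gap between 50 and 20, so nothing fits:
print(choose_step(100, 30, 40))   # None
# Widening the window to [30, 60] lets step=0.5 (50 px) through:
print(choose_step(100, 30, 60))   # 0.5
```

A wider MinLength/MaxLength window avoids the gap: since consecutive candidates shrink the interval by a factor of 2 or 2.5, any window with MaxLength / MinLength >= 2.5 is guaranteed to catch at least one candidate.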