Lazarus
Programming => Graphics => Graphics and Multimedia => BGRABitmap and LazPaint => Topic started by: VTwin on July 03, 2020, 05:36:28 am

Here is a question for circular, but any input is welcome.
This application:
https://apps.apple.com/us/app/strabotools/id1496239162#?platform=ipad
does an interesting image analysis to determine the orientations of image brightness gradients. They give no references, and I have not located any papers on this. I'd very much like to implement it, do you have any ideas on how this might be done?
In addition, I implemented some edge detection routines several years ago, Laplacian, Prewitt, Sobel, Kirsch, and DoG, in addition to your "Contour". I think I got them right, if not optimized. None of them work as well as I hoped for my needs. Do you have any interest in such edge detection algorithms?
Cheers,
VTwin
EDIT: I think I asked a similar question regarding edge detection a few years ago. Sorry if this is a repeat.

Hi VTwin
Hmm I suppose one can compute the gradient orientation by computing the delta of the value in x and y.
For example, using the following coefficients for the variation in x:
[-0.5, 0, 0.5,
-1, 0, 1,
-0.5, 0, 0.5]
and similarly for the variation in y:
[-0.5, -1, -0.5,
0, 0, 0,
0.5, 1, 0.5]
applied on the pixel values V and summed up.
You get a vector (x, y) from which you can compute an angle (using ArcTan2).
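As a minimal Pascal sketch of this idea (the `PixelValue(x, y)` function is hypothetical here; it stands for whatever returns the lightness of a pixel as a Single):

```pascal
uses Math;

// Sketch: gradient direction at pixel (x, y).
// PixelValue(x, y) is an assumed helper returning the pixel lightness.
function GradientAngle(x, y: Integer): Single;
const
  // horizontal variation (Sobel * 0.5)
  Kx: array[0..2, 0..2] of Single =
    ((-0.5, 0, 0.5), (-1, 0, 1), (-0.5, 0, 0.5));
  // vertical variation
  Ky: array[0..2, 0..2] of Single =
    ((-0.5, -1, -0.5), (0, 0, 0), (0.5, 1, 0.5));
var
  i, j: Integer;
  gx, gy, v: Single;
begin
  gx := 0; gy := 0;
  for j := 0 to 2 do
    for i := 0 to 2 do
    begin
      v := PixelValue(x + i - 1, y + j - 1);
      gx := gx + Kx[j, i] * v;
      gy := gy + Ky[j, i] * v;
    end;
  Result := ArcTan2(gy, gx); // angle in radians
end;
```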
If you have some code to propose for filters, it is welcome.
Regards

Quote from: VTwin — None of them work as well as I hoped for my needs.
Hi!
Do you know the Canny Edge Detection?
Known for good results.
https://www.hindawi.com/journals/cin/2018/3598284/
Sorry, I don't know of any Pascal implementation.
Winni

Quote from: circular — Hmm I suppose one can compute the gradient orientation by computing the delta of the value in x and y... You get a vector (x, y) from which you can compute an angle (using ArcTan2).
circular,
Thank you very much for that suggestion. I'm an amateur, my primary sources have been:
Efford, 2000. Digital Image Processing, A Practical Introduction Using Java. Pearson.
Parker, 2001. Algorithms for Image Processing and Computer Vision, 2nd Ed. Wiley.
The kernel you give is identical to the Sobel edge detection kernel (* 0.5), which makes sense. I have been playing around with it summing the convolved pixel lightness in x and y. It is somewhat promising, but not working yet. I'll keep at it and report back.
Thanks, I may do so.
Cheers
@winni, thanks. Yes, the above references discuss Canny and a few others. I have not attempted to implement it.

Indeed, that is in fact the Sobel operator *0.5. That seems quite logical to me.
You can sum in x and y but then you just get the contour level. If you want to have the direction, you need x and y separately.

Quote from: circular — You can sum in x and y but then you just get the contour level. If you want to have the direction, you need x and y separately.
Yes, that is what I am doing.
Efford covers Prewitt and Sobel gradient and magnitude in Section 7.4.1.

circular,
Is there a formal definition of "lightness"? I see in your code:
"The lightness here is defined as the subjective sensation of luminosity, where
blue is the darkest component and green the lightest"
Is there a reference? I ask because scientific image analysis software seems to use grayscale, I believe ImageJ and MATLAB do. It seems to me more information is available in color images, and lightness may be preferable.
Thanks,
VTwin

The problem I mentioned requires forming a 2x2 variance-covariance matrix from the x, y values convolved using Sobel or Prewitt kernels. The eigenvectors of that give the fabric orientation and strength.

Quote from: VTwin — Is there a formal definition of "lightness"? ... It seems to me more information is available in color images, and lightness may be preferable.
Well, the way grayscale is computed may vary. Basically, the sensation of lightness is additive, so in theory, whatever the color channels are, you can sum them up to get the total lightness. But...
- The values are generally in the sRGB colorspace, where there is a gamma value. So one needs to get the linear RGB values by applying a power of 2.2 (an approximation that works very well). That means low values are darker than their numeric value suggests. If you don't apply this correction, you get the standard HSL lightness, which is the L value of TStdHSLA. This mistake is so common that it has somewhat become a standard.
- The eyes do not have the same sensitivity depending on the color. Basically, a color is made up of the sum of all possible visible wavelengths. At the center of the visible spectrum of humans, towards green, you get the most detection by the eye receptors. For example, the color of a yellow paper is the sum of wavelengths including green, yellow, orange and red. That's most of the visible spectrum except deep blue, to which the eyes are not very sensitive. So yellow is almost as light as white. If you take this into account, you get the corrected HSL lightness, which is the L value of THSLAPixel and corresponds to the Y value in the XYZ colorspace. If you have linear RGB values, there are constants you can use as weights to compute it (you can find them in the grayscale functions of BGRABitmap). I personally called this the subjective sensation of luminosity because it corresponds to the actual light intensity detected by the eyes.
- The third thing to consider is how the eyes get used to light. As the eyes can accommodate to darkness, dark areas are not perceived as that dark. This luminosity corresponds to the L* of the L*a*b* colorspace, and is referred to as a subjective lightness. Personally I would rather specify that it includes the accommodation of the eyes. You can get this value by using the ToLabA method or by applying a power of 1/3 to the luminosity (cf. the Y value).
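A rough Pascal sketch of these three steps, assuming 8-bit sRGB input. The weights here are the common Rec. 709 luminance weights for linear RGB; BGRABitmap's grayscale functions use equivalent constants, so treat the exact values as illustrative:

```pascal
uses Math;

// Sketch: subjective lightness from 8-bit sRGB components.
function SubjectiveLightness(R, G, B: Byte): Single;
var
  rl, gl, bl, Y: Single;
begin
  // 1. undo the sRGB gamma (power 2.2 approximation) to get linear RGB
  rl := Power(R / 255, 2.2);
  gl := Power(G / 255, 2.2);
  bl := Power(B / 255, 2.2);
  // 2. weight by eye sensitivity to get the luminance Y (XYZ colorspace)
  Y := 0.2126 * rl + 0.7152 * gl + 0.0722 * bl;
  // 3. account for eye accommodation (roughly L* up to scaling)
  Result := Power(Y, 1 / 3);
end;
```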
You may wonder why we don't use XYZ colorspace then, as it is the most accurate representation of color. The problem is that XYZ is defined with an equal energy illuminant (E) that doesn't exist in real life and cannot be displayed as such because X, Y and Z are imaginary colors.

Quote from: VTwin — The problem I mentioned requires forming a 2x2 variance-covariance matrix from the x, y values convolved using Sobel or Prewitt kernels. The eigenvectors of that give the fabric orientation and strength.
Not sure why you need such a matrix. Applying ArcTan2 to the (x, y) values would give you the angle, no?

circular,
Many thanks for the detailed reply. I have a basic understanding of color spaces, but am working towards a better comprehension. That is helpful.
I see, for example, the MATLAB function rgb2gray converts RGB values to grayscale by forming a weighted sum of the R, G, and B components:
0.2989 * R + 0.5870 * G + 0.1140 * B
as a float. This appears to be the same, as a byte, as your BGRAToGrayscaleLinear, while your BGRAToGrayscale function includes a gamma correction.
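As a byte-valued Pascal sketch of that rgb2gray-style weighted sum (without the gamma correction discussed above):

```pascal
// Sketch: grayscale as a plain weighted sum of sRGB components,
// using the weights quoted from MATLAB's rgb2gray.
function GrayLuma(R, G, B: Byte): Byte;
begin
  Result := Round(0.2989 * R + 0.5870 * G + 0.1140 * B);
end;
```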
In doing image analysis, my assumption is that gamma correction and the use of word rather than byte gray levels is preferable, giving more information, so "lightness" seems to be a good choice.
Cheers

Quote from: circular — Not sure why you need such a matrix. Applying ArcTan2 to the (x, y) values would give you the angle, no?
For a single pixel the convolution gives the gradient as a vector, [gx, gy], from which you can get the magnitude, g = Sqrt(gx*gx + gy*gy), and orientation, ArcTan2(gy, gx). However, I want the magnitudes and orientations of the maximum and minimum principal values of all the vectors. If all the vectors are translated to the origin, they form a cloud whose moments are parallel and perpendicular to the "best-fit" edge gradients for the entire image.
EDIT: "convolution" above should read "correlation".
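A sketch of that computation in Pascal, assuming the gradient vectors have already been gathered into two parallel arrays. The eigenvalues of the 2x2 moment matrix give the strengths of the principal directions, and the major eigenvector gives the dominant orientation:

```pascal
uses Math;

// Sketch: principal gradient directions for a whole image from the
// 2x2 moment matrix of all gradient vectors (gx[i], gy[i]),
// taken about the origin (the vectors translated to the origin).
procedure PrincipalDirections(const gx, gy: array of Single;
  out angleMax, strengthMax, strengthMin: Single);
var
  sxx, sxy, syy, tr, det, disc: Single;
  i: Integer;
begin
  sxx := 0; sxy := 0; syy := 0;
  for i := 0 to High(gx) do
  begin
    sxx := sxx + gx[i] * gx[i];
    sxy := sxy + gx[i] * gy[i];
    syy := syy + gy[i] * gy[i];
  end;
  // eigenvalues of the symmetric matrix [[sxx, sxy], [sxy, syy]]
  tr := sxx + syy;
  det := sxx * syy - sxy * sxy;
  disc := Sqrt(Sqr(tr / 2) - det);
  strengthMax := tr / 2 + disc; // larger eigenvalue
  strengthMin := tr / 2 - disc; // smaller eigenvalue
  // orientation of the major eigenvector
  angleMax := 0.5 * ArcTan2(2 * sxy, sxx - syy);
end;
```

The ratio of the two eigenvalues then gives a measure of how strongly oriented the gradient cloud is (close to 1 means no preferred direction).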

Quote from: VTwin — In doing image analysis, my assumption is that gamma correction and the use of word rather than byte gray levels is preferable, giving more information, so "lightness" seems to be a good choice.
Indeed. For display, you might need to apply gamma compression (a power of 1/2.2) to get back to sRGB.