
ISSN: 1023-5086



Scientific and technical journal

Opticheskii Zhurnal

A full-text English translation of the journal is published by Optica Publishing Group under the title “Journal of Optical Technology”


DOI: 10.17586/1023-5086-2022-89-08-43-53

UDC: 612.821

Relation of local window size in a model of modules with estimation of the size of visual images and their segmentation

For Russian citation (Opticheskii Zhurnal):

Бондарко В.М., Данилова М.В. Связь размера локального окна в модели модулей с оценкой размера зрительных изображений и их сегментацией // Оптический журнал. 2022. Т. 89. № 8. С. 43–53.


Bondarko V.M., Danilova M.V. Relation of local window size in a model of modules with estimation of the size of visual images and their segmentation  [in Russian] // Opticheskii Zhurnal. 2022. V. 89. № 8. P. 43–53.

For citation (Journal of Optical Technology):

V. M. Bondarko and M. V. Danilova, "Relation of local window size in a model of modules with estimation of the size of visual images and their segmentation," J. Opt. Technol. 89(8), 461–468 (2022).


Subject of study. The fundamental mechanisms of visual image processing, such as size estimation and segmentation, are considered. The possibility of describing these mechanisms using the model of modules proposed by V. D. Glezer, based on electrophysiological data from studies of the receptive fields of neurons in the visual cortex, was investigated. Aim of study. This study aimed to investigate the mechanisms of size estimation and segmentation by comparing the experimental data obtained by the authors with modeling results. Methods. Psychophysical methods were used for the experimental part. Two different paradigms were used for size estimation: a modified Ebbinghaus illusion with different surroundings of the test stimuli, or a comparison of the sizes of upright and oblique crosses previously used in experiments on segmentation. During modeling, spatial-frequency filtering of images was performed in local areas of the visual field using a finite set of filters. Main results. The dependence of the size estimate on the distance between images and on their shape, and the relationship of those estimates with segmentation, was demonstrated for the first time to our knowledge. Another new result shows that images with different shapes are perceived as equal in size if the modules that optimally describe these images (i.e., those for which maximum energy is preserved when the images are filtered with a limited number of filters) are of equal sizes. The obtained results demonstrate that the model of modules can describe, to a first approximation, the mechanisms performing the estimation of image sizes and their segmentation. Correspondence between the data of neurophysiological, psychophysical, and modeling investigations was shown for the first time to our knowledge.
The conclusions were supported by comparing the experimental and modeling results with patterns in traditional Byzantine icon painting, as well as in Russian avant-garde paintings of the early 20th century, which were inspired by the former. Practical significance. Application and further enhancement of the model of modules as an artificial neural network that ensures the segmentation, size estimation, and recognition of visual objects can be of practical significance.
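The Methods passage above describes spatial-frequency filtering of local windows with a finite set of filters and a "maximum energy preserved" criterion for choosing the optimal module. The sketch below only illustrates that general idea and is not the authors' model: the names `gabor_kernel` and `preserved_energy` are hypothetical, and measuring preservation as projection energy onto the filter span is an assumption.

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Odd-sized 2D Gabor filter (cosine phase), a common model of
    visual-cortex receptive fields (cf. refs. 13 and 17)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * xr / wavelength)

def preserved_energy(patch, filters):
    """Fraction of the (mean-subtracted) patch energy lying in the
    linear span of the filter set: 1.0 means the finite filter bank
    describes the local window without loss."""
    v = patch.ravel().astype(float)
    v -= v.mean()
    basis = np.stack([(f - f.mean()).ravel() for f in filters], axis=1)
    q, _ = np.linalg.qr(basis)   # orthonormal basis of the filter span
    proj = q.T @ v               # coordinates of the projection
    return float(proj @ proj) / (float(v @ v) + 1e-12)
```

Under this reading, an "optimal" module size for an image would be the local window size at which a fixed, limited filter set retains the largest energy fraction.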


Keywords: size estimation, segmentation, spatial-frequency analysis, module model, optical illusions, Ebbinghaus illusion, painting


The research was supported by State Program 47, "Scientific and Technological Development of the Russian Federation" (2019–2030), theme No. 0134-2019-0005.

OCIS codes: 330.7326, 330.4060, 330.5510


1. W. Wundt, Die geometrisch-optischen Täuschungen (Teubner, Leipzig, 1898).
2. V. M. Bondarko, M. V. Danilova, and V. N. Chikhman, “Segmentation of visual images: experimental data and modeling,” J. Opt. Technol. 88(12), 692–699 (2021) [Opt. Zh. 88(12), 7–17 (2021)].
3. V. P. Lutsiv, “Convolutional deep-learning artificial neural networks,” J. Opt. Technol. 82(8), 499–508 (2015) [Opt. Zh. 82(8), 11–23 (2015)].
4. A. K. Tsytsulin, A. I. Bobrovskiĭ, A. V. Morozov, V. A. Pavlov, and M. A. Galeeva, “Using convolutional neural networks to automatically select small artificial space objects on optical images of a starry sky,” J. Opt. Technol. 86(10), 627–633 (2019) [Opt. Zh. 86(10), 30–38 (2019)].
5. Y. Xie, F. Zhu, and Y. Fu, “Main-secondary network for defect segmentation of textured surface images,” in IEEE Winter Conference on Applications of Computer Vision (2020), pp. 3531–3540.
6. J. Beck, “Textural segmentation, second-order statistics, and textural elements,” Biol. Cybern. 48, 125–130 (1983).
7. Yu. E. Shelepin, V. D. Glezer, V. M. Bondarko, M. B. Pavlovskaya, I. A. Vol, and Yu. P. Danilov, “Spatial vision,” in Vision Physiology, A. L. Byzov, ed. (Nauka, Moscow, 1992), pp. 528–586.
8. B. Julesz, “Experiment in the visual perception of texture,” Sci. Am. 232, 34–43 (1975).
9. B. Julesz, “Textons, the elements of texture perception, and their interactions,” Nature 290, 91–97 (1981).
10. B. Julesz, “Texton gradients: the texton theory revisited,” Biol. Cybern. 54, 245–251 (1986).
11. J. Beck, A. Sutter, and R. Ivry, “Spatial frequency channels and perceptual grouping in texture perception,” Comput. Vision Graphics Image Process. 37, 299–325 (1987).
12. A. C. Bovik, M. Clark, and W. S. Geisler, “Multichannel texture analysis using localized spatial filters,” IEEE Trans. Pattern Anal. Mach. Intell. 12, 55–73 (1990).
13. M. R. Turner, “Texture discrimination by Gabor functions,” Biol. Cybern. 55, 71–82 (1986).
14. E. Arsenault, A. Yoonessi, and C. Baker, “Higher order texture statistics impair contrast boundary segmentation,” J. Vis. 11(10):14, 1–15 (2011).
15. M. J. Prakash, S. Kezia, I. S. Prabha, and V. V. Kumar, “A new approach for texture segmentation using gray level textons,” Int. J. Signal Process. Image Process. Pattern Recognit. 6(3), 81–90 (2013).
16. V. D. Glezer, Vision and Thinking (Nauka, Leningrad, 1985).
17. N. A. Kaliteevsky, V. E. Semenov, V. D. Glezer, and V. E. Gauselman, “Algorithm of invariant image description by the use of a modified Gabor transform,” Appl. Opt. 33(23), 5256–5261 (1994).
18. V. D. Glezer, V. V. Yakovlev, and V. E. Gauselman, “Harmonic basis function for spatial coding in the cat striate cortex,” Vis. Neurosci. 3, 351–383 (1989).
19. I. A. Vol, “A spatial-frequency model of visual system hyperacuity,” Sens. Sist. 2(2), 133–138 (1988).
20. O. V. Zhukova, E. Yu. Malakhova, and Yu. E. Shelepin, “La Gioconda and the indeterminacy of smile recognition by a person and by an artificial neural network,” J. Opt. Technol. 86(11), 706–715 (2019) [Opt. Zh. 86(11), 40–50 (2019)].
21. E. Yu. Malakhova, “Information representation space in artificial and biological neural networks,” J. Opt. Technol. 87(10), 598–603 (2020) [Opt. Zh. 87(10), 50–58 (2020)].
22. M. A. Titarenko and R. O. Malashin, “Image enhancement by deep neural networks using high-level information,” J. Opt. Technol. 87(10), 604–610 (2020) [Opt. Zh. 87(10), 59–68 (2020)].
23. O. V. Zhukova and P. P. Vasil’ev, “Reconstruction of a neural network and alteration of the operators’ strategies during the recognition of facial images,” J. Opt. Technol. 87(10), 581–589 (2020) [Opt. Zh. 87(10), 25–37 (2020)].
24. A. K. Kharauzov, Yu. E. Shelepin, O. V. Tsvetkov, O. V. Zhukova, and S. V. Pronin, “Methods of masking threatening images and detecting electrophysiological indicators of their unconscious perception,” J. Opt. Technol. 87(10), 611–618 (2020) [Opt. Zh. 87(10), 69–80 (2020)].
25. Yu. E. Shelepin, A. K. Kharauzov, O. V. Zhukova, S. V. Pronin, M. S. Kupriyanov, and O. V. Tsvetkov, “Masking and detection of hidden signals in dynamic images,” J. Opt. Technol. 87(10), 624–632 (2020) [Opt. Zh. 87(10), 89–102 (2020)].
26. O. Ronneberger, P. Fischer, and T. Brox, “U-net: convolutional networks for biomedical image segmentation,” Lect. Notes Comput. Sci. 9351, 234–241 (2015).
27. R. O. Malashin, “Principle of least action in dynamically configured image analysis systems,” J. Opt. Technol. 86(11), 678–685 (2019) [Opt. Zh. 86(11), 5–13 (2019)].
28. H. Ebbinghaus, Grundzüge der Psychologie, Volume II, Part I (Veit, Leipzig, 1908).
29. W. H. Ehrenstein and J. Hamada, “Structural factors of size contrast in the Ebbinghaus illusion,” in New Horizons in the Study of Gestalt Perception, S. Sumi and K. Noguchi, eds. (Keio University, 1996), pp. 158–169.
30. V. M. Bondarko and L. A. Semenov, “Size estimates in Ebbinghaus illusion in adults and children of different age,” Hum. Physiol. 30(1), 24–30 (2004).
31. H. R. Wilson and D. J. Gelb, “Modified line element theory for spatial frequency and width discrimination,” J. Opt. Soc. Am. A 1, 124–131 (1984).
32. M. V. Danilova and V. M. Bondarko, “Foveal contour interactions and crowding effects at the resolution limit of the visual system,” J. Vis. 7(2):25, 1–18 (2007).
33. V. V. Bychkov, “Icon and Russian avant-garde of the early 20th century,” in Book of Non-classical Aesthetics, O. B. Korevishche, ed. (IF RAN, Moscow, 1998), pp. 58–75.
34. C. C. Douglas, Swans of Other Worlds: Kazimir Malevich and the Origins of Suprematism 1908–1915 (UMI Research Press, 1980).
35. D. Sarab’yanov and A. Shatskikh, Kazimir Malevich: Painting, Theory (Iskusstvo, Moscow, 1993).
36. Y. A. Griber and A. G. Egorov, “The Vitebsk project by Kazimir Malevich: a case-study of urban life modernization,” Mediterr. J. Soc. Sci. 6(4), 309–314 (2015).
37. R. Vidrih, “Iconisation at work. Malevich’s Black Square, the modern icon at Tate Modern,” IKON 9, 343–354 (2016).
38. J. S. Girgus, S. Coren, and M. Agdern, “The interrelationship between the Ebbinghaus and Delboeuf illusions,” J. Exp. Psychol. 95(2), 453–455 (1972).
39. D. J. Weintraub and M. K. Schneck, “Fragments of Delboeuf and Ebbinghaus illusions: contour context explorations of misjudged circle size,” Percept. Psychophys. 40(3), 147–158 (1986).
40. A. Bulatov and A. Bertulis, “Distortions of length perception,” Biol. Cybern. 80, 185–193 (1999).
41. A. Bulatov, A. Bertulis, and L. Mickiene, “Geometrical illusions: study and modelling,” Biol. Cybern. 77, 395–406 (1997).
42. P. J. Burt, “Fast filter transforms for image processing,” Comput. Graphics Image Process. 16, 20–51 (1981).