Abstract:
An algorithm for the enhancement of digital images is described. The algorithm is based on an analogy between a macroscopic system composed of many particles and a digital image composed of many pixels. Under this analogy the intensity in a digital image is assumed to fluctuate, and the algorithm exploits the requirement that these fluctuations decrease to a minimum, so that an enhanced image may be regarded as an image in an equilibrium state; this yields a quantitative criterion for stopping the enhancement process. The result may serve as the starting point of a computer-aided vision system, the next step being image segmentation, i.e., the identification of the various patterns forming the image, which is described in a forthcoming paper. (C) 1996 American Institute of Physics.
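The abstract does not give the algorithm's equations, but the idea of an equilibrium-based stopping criterion can be sketched in a minimal, hypothetical form: iteratively relax each pixel toward its neighbourhood mean and stop once a global fluctuation measure ceases to decrease. The function names (`fluctuation`, `enhance`), the choice of a 4-neighbour diffusion step, and the tolerance parameter are all illustrative assumptions, not the authors' actual method.

```python
import numpy as np

def fluctuation(img):
    """Mean squared deviation of each pixel from its 4-neighbour average,
    used here as a global measure of intensity fluctuations (an assumed
    stand-in for the paper's fluctuation quantity)."""
    avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    return np.mean((img - avg) ** 2)

def enhance(img, tol=1e-4, max_iter=500):
    """Iteratively relax the image toward an 'equilibrium' state.

    Each step nudges every pixel toward its neighbourhood mean (a
    diffusion step with periodic boundaries via np.roll); iteration
    stops once the fluctuation measure decreases by less than `tol`,
    illustrating a quantitative stopping criterion."""
    img = img.astype(float)
    prev = fluctuation(img)
    for _ in range(max_iter):
        avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
               + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
        img = 0.5 * (img + avg)      # relaxation (smoothing) step
        cur = fluctuation(img)
        if prev - cur < tol:         # fluctuations have stopped falling:
            break                    # treat the image as "in equilibrium"
        prev = cur
    return img
```

Each smoothing pass reduces the fluctuation measure, so the loop terminates either at `max_iter` or, typically much earlier, when successive passes no longer change the statistic appreciably.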