Abstract:
To meet the needs of intelligent construction in coal mines, large numbers of digital image acquisition devices have gradually been deployed underground, where they play an important role in the safe and efficient production of coal mines. However, the quality of the collected images is difficult to guarantee due to factors such as non-uniform illumination caused by the irregular arrangement of artificial light sources underground and noise introduced by coal dust and other suspended particles. To address this issue, an image restoration method in the HSV color space is proposed. The method converts the image to the HSV color space and extracts the value, hue, and saturation components separately: an improved Multi-Scale Retinex algorithm repairs and balances the value component; frequency-domain noise analysis combined with a Butterworth filter repairs the hue component; and a correlation-based adaptive saturation correction adjusts the saturation component. Finally, the restored image is transformed from the HSV space back to the RGB space, completing the restoration. In terms of restoration quality, the proposed method shows significant improvements over the classical Multi-Scale Retinex (MSR) algorithm and the Multi-Scale Retinex with Color Restoration (MSRCR) algorithm, preserving the colors and edges of the image while improving its visual quality. An experimental comparison of images processed by the different algorithms in terms of standard deviation, mean gradient, and information entropy shows that the proposed restoration algorithm improves on the MSR algorithm by 24.24%, 48.38%, and 1.43% respectively, and on the MSRCR algorithm by 8.68%, 39.88%, and 1.35% respectively. The experimental results indicate that the proposed method can effectively enhance the quality of images and video obtained from underground monitoring, providing high-quality image data to support safe mine production and intelligent decision-making.
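
To make the pipeline concrete, the following is a minimal sketch of the described HSV-space processing chain, not the paper's implementation. The Gaussian scales, Butterworth cutoff `d0` and order `n`, and the saturation gain heuristic are all illustrative assumptions; in particular, the correlation-based adaptive saturation correction is stood in for by a simple brightness-ratio scaling, and the hue filter treats hue as a linear quantity, ignoring its circular wrap-around.

```python
# Sketch of the HSV-space restoration pipeline: MSR on value,
# Butterworth low-pass on hue, heuristic saturation correction.
# All parameter values below are assumptions, not the paper's settings.
import cv2
import numpy as np

def msr(v, sigmas=(15, 80, 250)):
    """Classical multi-scale Retinex on the value channel (log domain)."""
    v = v.astype(np.float64) + 1.0          # avoid log(0)
    log_v = np.log(v)
    out = np.zeros_like(v)
    for s in sigmas:                          # equal weights per scale
        blur = cv2.GaussianBlur(v, (0, 0), s)
        out += (log_v - np.log(blur)) / len(sigmas)
    out = (out - out.min()) / (out.max() - out.min() + 1e-9)
    return (out * 255).astype(np.uint8)

def butterworth_lowpass(h, d0=30.0, n=2):
    """Frequency-domain Butterworth low-pass on the hue channel.
    Simplification: hue is treated as linear; wrap-around is ignored."""
    rows, cols = h.shape
    u = np.arange(rows) - rows / 2
    v = np.arange(cols) - cols / 2
    D = np.sqrt(u[:, None] ** 2 + v[None, :] ** 2)
    H = 1.0 / (1.0 + (D / d0) ** (2 * n))    # Butterworth transfer function
    F = np.fft.fftshift(np.fft.fft2(h.astype(np.float64)))
    filtered = np.real(np.fft.ifft2(np.fft.ifftshift(F * H)))
    return np.clip(filtered, 0, 179).astype(np.uint8)  # OpenCV hue range

def restore(bgr):
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h, s, v = cv2.split(hsv)
    v2 = msr(v)                      # repair and balance the value component
    h2 = butterworth_lowpass(h)      # denoise the hue in the frequency domain
    # Stand-in for the correlation-based adaptive correction: scale
    # saturation by the square root of the brightness enhancement ratio.
    gain = (v2.astype(np.float64) + 1.0) / (v.astype(np.float64) + 1.0)
    s2 = np.clip(s * np.sqrt(gain), 0, 255).astype(np.uint8)
    return cv2.cvtColor(cv2.merge([h2, s2, v2]), cv2.COLOR_HSV2BGR)
```

The design point this sketch captures is the one the abstract emphasizes: by enhancing only the value component and lightly filtering hue and saturation, the method brightens and denoises the image without the color distortion that whole-image RGB enhancement tends to introduce.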