Image and event fusion method based on wavelet and dynamic complementary filtering[J]. Chinese Journal of Engineering. DOI: 10.13374/j.issn2095-9389.2024.01.23.004

Image and event fusion method based on wavelet and dynamic complementary filtering

This paper addresses the fusion of data from event cameras and conventional frame cameras, proposing a new fusion method to improve image quality under high dynamic range, high-speed moving objects, and complex lighting conditions. As a new type of vision sensor, the event camera offers high temporal resolution and low power consumption, but its data contain substantial noise and suffer from feature loss. Conventional frame cameras, by contrast, excel in spatial resolution, yet their ability to handle high-speed motion and high-dynamic-range scenes is limited. To address these challenges, this paper proposes an image and event data fusion method that combines the discrete wavelet transform with dynamic-gain complementary filtering. The method first computes the information entropy of the input image frame to assess its exposure, then uses the discrete wavelet transform to extract the high- and low-frequency components of the event stream data and the image frame data, and finally fuses image and events through a dynamic-gain complementary filter. Experimental results on the HDR Hybrid Event-Frame dataset show that the fusion method effectively improves image quality, even under high dynamic range and complex lighting conditions. Compared with several classic high-quality image reconstruction methods on the same dataset, it achieves an MSE of 0.0199, an SSIM of 0.90, and a Q-score of 6.07.
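The pipeline summarized above (entropy-based exposure assessment, wavelet decomposition of frame and events, dynamic-gain complementary filtering) can be sketched roughly as follows. This is a minimal single-level Haar illustration under stated assumptions, not the paper's implementation: the entropy-to-gain mapping and the detail-band weight `beta` are assumptions introduced for illustration.

```python
import numpy as np

def entropy(img, bins=256):
    # Shannon entropy of a grayscale image in [0, 1]; low entropy
    # suggests under-/over-exposure, high entropy a well-exposed frame.
    hist, _ = np.histogram(img, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def haar_dwt2(x):
    # One-level 2-D Haar transform: returns (LL, (LH, HL, HH)).
    a = (x[0::2, :] + x[1::2, :]) / 2.0   # row averages
    d = (x[0::2, :] - x[1::2, :]) / 2.0   # row differences
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def haar_idwt2(LL, bands):
    # Exact inverse of haar_dwt2.
    LH, HL, HH = bands
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    x = np.empty((a.shape[0] * 2, a.shape[1]))
    x[0::2, :], x[1::2, :] = a + d, a - d
    return x

def fuse(frame, event_img, k=0.2):
    # Dynamic-gain complementary filter in the wavelet domain:
    # low frequencies come mostly from the image frame, high
    # frequencies mostly from the (edge-rich) event image. The gain
    # alpha shrinks when frame entropy indicates poor exposure;
    # this linear mapping is an assumption, not the paper's formula.
    alpha = np.clip(entropy(frame) / 8.0, 0.0, 1.0)  # 8 = log2(256)
    fLL, (fLH, fHL, fHH) = haar_dwt2(frame)
    eLL, (eLH, eHL, eHH) = haar_dwt2(event_img)
    LL = alpha * fLL + (1.0 - alpha) * eLL
    beta = k + (1.0 - k) * (1.0 - alpha)  # event weight for detail bands
    bands = tuple((1.0 - beta) * f + beta * e
                  for f, e in ((fLH, eLH), (fHL, eHL), (fHH, eHH)))
    return haar_idwt2(LL, bands)
```

A badly exposed frame (low entropy) pushes `alpha` toward 0, so the fused output leans on the event data for both the low-frequency base and the detail bands; a well-exposed frame does the reverse.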