Salient Object Detection via Global and Local Cues

Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Authors Na Tong, Huchuan Lu, Ying Zhang, Xiang Ruan
Journal/Conference Name Pattern Recognition
Paper Abstract Previous saliency detection algorithms either focus directly on low-level features or use a set of sample images with manually labeled ground truth to train a high-level learning model. In this paper, we propose a novel coding-based saliency measure that explores both global and local cues for saliency computation. First, we construct a bottom-up saliency map by considering global contrast information via low-level features. Second, using a locality-constrained linear coding algorithm, a top-down saliency map is formulated based on the reconstruction error. To better exploit the local and global information, we integrate the bottom-up and top-down maps into the final saliency map. Extensive experimental results on three large benchmark datasets demonstrate that the proposed approach outperforms 22 state-of-the-art methods in terms of three popular evaluation measures, i.e., the Precision-Recall curve, Area Under ROC Curve, and F-measure. Furthermore, the proposed coding-based method can be easily applied in other methods for significant improvement.

Highlights
- We present a coding-based algorithm for salient object detection.
- Integration of local and global cues makes the saliency maps more accurate and intact.
- Bottom-up maps provide foreground and background codebooks for the following steps.
- Fusion of FC- and BC-based results makes the saliency results more uniform and robust.
- Our coding-based method can be easily applied in other methods for improvement.
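The abstract's pipeline (locality-constrained linear coding over a codebook, reconstruction error as a top-down saliency cue, then fusion with a bottom-up map) can be sketched roughly as below. This is an illustrative NumPy sketch, not the authors' MATLAB release: the LLC solver follows the standard approximated analytical solution of locality-constrained linear coding, and the product-based fusion rule and the `beta` regularizer are assumptions for the example, not details taken from the paper.

```python
import numpy as np

def llc_code(x, B, beta=1e-4):
    """Approximate locality-constrained linear coding of x over codebook B.

    x : (D,) feature vector; B : (M, D) codebook rows.
    Returns a coding vector c with sum(c) == 1.
    """
    z = B - x                                  # shift bases to the data point
    C = z @ z.T                                # local covariance, shape (M, M)
    C += beta * np.trace(C) * np.eye(len(B))   # regularize for numerical stability
    c = np.linalg.solve(C, np.ones(len(B)))
    return c / c.sum()                         # enforce the sum-to-one constraint

def reconstruction_error(x, B, beta=1e-4):
    """Top-down saliency cue: error of reconstructing x from codebook B."""
    c = llc_code(x, B, beta)
    return float(np.sum((x - c @ B) ** 2))

def fuse_maps(bottom_up, top_down):
    """Integrate the two maps; simple product fusion is an assumption here."""
    def norm01(m):
        m = m - m.min()
        return m / (m.max() + 1e-12)
    return norm01(norm01(bottom_up) * norm01(top_down))
```

A feature reconstructed well from a background codebook gets low error (low saliency), while a poorly reconstructed one is flagged as salient; running the same measure against a foreground codebook and fusing the two results is what the highlights refer to as FC- and BC-based fusion.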
Date of publication 2015
Code Programming Language MATLAB

Copyright Researcher 2021