TY - JOUR
T1 - Object proposal with kernelized partial ranking
AU - Wang, Jing
AU - Shen, Jie
AU - Li, Ping
N1 - Publisher Copyright:
© 2017 Elsevier Ltd
PY - 2017/9
Y1 - 2017/9
N2 - Object proposals are an ensemble of bounding boxes with a high potential to contain objects. To determine a small set of proposals with high recall, a common scheme extracts multiple features and then applies a ranking algorithm, which, however, incurs two major challenges: 1) the ranking model often imposes pairwise constraints between all proposals, which prevents efficient training and testing; 2) linear kernels are utilized due to the computational and memory bottleneck of training a kernelized model. In this paper, we remedy these two issues by proposing a kernelized partial ranking model. In particular, we demonstrate that i) our partial ranking model reduces the number of constraints from O(n²) to O(nk), where n is the number of all potential proposals for an image, while we are only interested in the top-k proposals that have the largest overlap with the ground truth; ii) our model permits non-linear kernels, which are often superior to linear classifiers in terms of accuracy. To mitigate the computational and memory issues, we introduce a consistent weighted sampling (CWS) paradigm that approximates the non-linear kernel and facilitates efficient learning. In fact, as we will show, training a linear CWS model amounts to learning a kernelized model. Extensive experiments demonstrate that, equipped with the non-linear kernel and the partial ranking algorithm, the recall at top-k proposals can be substantially improved.
KW - Consistent weighted sampling
KW - Object proposal
KW - Partial ranking
UR - http://www.scopus.com/inward/record.url?scp=85019357468&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019357468&partnerID=8YFLogxK
U2 - 10.1016/j.patcog.2017.03.022
DO - 10.1016/j.patcog.2017.03.022
M3 - Article
AN - SCOPUS:85019357468
SN - 0031-3203
VL - 69
SP - 299
EP - 309
JO - Pattern Recognition
JF - Pattern Recognition
ER -