Commit 44c04129 authored by Ross Girshick's avatar Ross Girshick

TEST: BINARY -> SVM

parent bfeb811c
 EXP_DIR: svm
 TRAIN:
   # don't use flipped examples when training SVMs for two reasons:
   # 1) R-CNN didn't
   # 2) I've tried and it doesn't help, yet makes SVM training take 2x longer
   USE_FLIPPED: False
 TEST:
-  BINARY: True
+  SVM: True
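A YAML override like the config snippet above is merged into the project's default config before training or testing. As an illustrative sketch (not the repo's code), the merge walks the parsed override recursively, rejecting keys that don't already exist in the defaults, so a typo in the YAML fails loudly; key names below mirror the snippet, the helper name is hypothetical:

```python
# Minimal sketch of merging a parsed YAML override into a default config.
# `merge_into` is a hypothetical helper, not the project's API.

def merge_into(override, defaults):
    """Recursively merge `override` into `defaults` in place.

    Every key in the override must already exist in the defaults,
    so unknown or misspelled config options raise an error.
    """
    for key, value in override.items():
        if key not in defaults:
            raise KeyError('unknown config key: {}'.format(key))
        if isinstance(value, dict):
            merge_into(value, defaults[key])
        else:
            defaults[key] = value

# assumed default values, for illustration only
defaults = {
    'EXP_DIR': 'default',
    'TRAIN': {'USE_FLIPPED': True},
    'TEST': {'SVM': False, 'BBOX_REG': True},
}
# the svm.yml contents, as they would look after YAML parsing
override = {
    'EXP_DIR': 'svm',
    'TRAIN': {'USE_FLIPPED': False},
    'TEST': {'SVM': True},
}
merge_into(override, defaults)
```

After the merge, only the keys named in the override change; untouched defaults such as `TEST.BBOX_REG` keep their values.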
@@ -97,9 +97,9 @@ __C.TEST.MAX_SIZE = 1000
 # IoU >= this threshold)
 __C.TEST.NMS = 0.3
-# Experimental: use binary logistic regression scores instead of K-way softmax
-# scores when testing
-__C.TEST.BINARY = False
+# Experimental: treat the (K+1) units in the cls_score layer as linear
+# predictors (trained, eg, with one-vs-rest SVMs).
+__C.TEST.SVM = False
 # Test using bounding-box regressors
 __C.TEST.BBOX_REG = True
@@ -136,11 +136,10 @@ def im_detect(net, im, boxes):
     net.blobs['rois'].reshape(*(blobs['rois'].shape))
     blobs_out = net.forward(data=blobs['data'].astype(np.float32, copy=False),
                             rois=blobs['rois'].astype(np.float32, copy=False))
-    if cfg.TEST.BINARY:
-        # simulate binary logistic regression
+    if cfg.TEST.SVM:
+        # use the raw scores before softmax under the assumption they
+        # were trained as linear SVMs
         scores = net.blobs['cls_score'].data
-        # Return scores as fg - bg
-        scores = scores - scores[:, 0][:, np.newaxis]
     else:
         # use softmax estimated probabilities
         scores = blobs_out['cls_prob']
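The scoring modes differ only in what is done with the (K+1) `cls_score` outputs per RoI: SVM mode keeps the raw linear-predictor values, softmax mode normalizes them into probabilities, and the removed BINARY mode rescored each class relative to the background column. A NumPy sketch with made-up scores (illustrative, not the repo's code):

```python
import numpy as np

# Raw cls_score outputs: one row per RoI, K+1 columns, column 0 = background.
# Values here are made up for illustration.
cls_score = np.array([[2.0, -1.0, 0.5],
                      [0.1,  3.0, -2.0]])

# SVM mode: use the raw linear-predictor outputs directly
svm_scores = cls_score

# softmax mode: normalize each row into class probabilities
# (subtract the row max first for numerical stability)
exp = np.exp(cls_score - cls_score.max(axis=1, keepdims=True))
softmax_scores = exp / exp.sum(axis=1, keepdims=True)

# the removed BINARY mode: rescore everything relative to background,
# so the background column becomes 0 and positive values mean "more
# foreground-like than background"
binary_scores = cls_score - cls_score[:, 0][:, np.newaxis]
```

Because softmax is monotonic within a row, the per-RoI argmax is the same for the raw and softmax scores; what changes is the score scale, which matters when thresholds were calibrated for SVM outputs rather than probabilities.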
@@ -265,10 +265,12 @@ def parse_args():
 if __name__ == '__main__':
     # Must turn this off to prevent issues when digging into the net blobs to
-    # pull out features
+    # pull out features (tricky!)
     cfg.DEDUP_BOXES = 0
-    cfg.TEST.BINARY = True
+    # Must turn this on because we use the test im_detect() method to harvest
+    # hard negatives
+    cfg.TEST.SVM = True
     args = parse_args()
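The comment above explains why `TEST.SVM` is forced on: SVM training reuses `im_detect()` to score boxes and harvest hard negatives, which requires the raw linear scores rather than softmax probabilities. A hypothetical sketch of that selection step (the helper name, threshold, and data are assumptions, not the repo's code): for one class, keep negative boxes whose SVM score falls inside the margin, i.e. above some threshold near -1.

```python
import numpy as np

def hard_negatives(scores, is_positive, thresh=-1.0):
    """Return indices of negative boxes scoring above `thresh`.

    Hypothetical helper: negatives that the current SVM still scores
    inside the margin are the "hard" ones worth retraining on.
    """
    neg = ~is_positive
    return np.where(neg & (scores > thresh))[0]

# made-up SVM scores for five boxes of one class
scores = np.array([0.8, -2.5, -0.3, 1.2, -1.7])
is_positive = np.array([True, False, False, True, False])
idx = hard_negatives(scores, is_positive)
# only box 2 is a negative scoring above the margin threshold
```

Easy negatives (far below the threshold) are skipped, which keeps the retraining cache small; this mirrors the classic R-CNN hard-negative mining loop in spirit, though the exact threshold and bookkeeping here are illustrative.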