machine learning - Select optimized parameters for libsvm - linear kernel
I use the RBF kernel in libsvm for machine learning, and I am looking to explore other kernels for my dataset.
There are multiple parameters to optimize for each specific kernel. For the RBF kernel, the C and g parameters are used in a grid search to select the optimal combination of cost and gamma.
-d degree : set degree in kernel function (default 3)
-g gamma : set gamma in kernel function (default 1/num_features)
-r coef0 : set coef0 in kernel function (default 0)
-c cost : set the parameter C of C-SVC, epsilon-SVR, and nu-SVR (default 1)
-n nu : set the parameter nu of nu-SVC, one-class SVM, and nu-SVR (default 0.5)
-p epsilon : set the epsilon in the loss function of epsilon-SVR (default 0.1)
-m cachesize : set cache memory size in MB (default 100)
-e epsilon : set tolerance of termination criterion (default 0.001)
-h shrinking : whether to use the shrinking heuristics, 0 or 1 (default 1)
-b probability_estimates : whether to train a SVC or SVR model for probability estimates, 0 or 1 (default 0)
-wi weight : set the parameter C of class i to weight*C, for C-SVC (default 1)
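For example, a single 5-fold cross-validation run combining a few of these options might look like the line below (the cost and gamma values and the file name train.scale are illustrative placeholders, not recommendations):

    svm-train -c 8 -g 0.125 -v 5 train.scale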
I want to know the relevant parameters for each individual kernel, since there are multiple parameters to select, e.g. C and g for the RBF kernel. Please also provide a grid size and range for the parameters, e.g. 10^-3 to 10^11 for C and 10^3 to 10^-13 for g.
My Perl grid generator for the RBF kernel:
    for ( $i = -3; $i <= 11; $i += 1 ) {
        for ( $j = 3; $j >= -13; $j += -1 ) {
            $a = 2**$i;
            $b = 2**$j;
            $output = "svm-train -c $a -g $b -v 5 $ARGV[0]";
            print "$output >& ${ARGV[0]}_${a}_${b}.out \n";
        }
    }
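Assuming the script is saved as, say, grid.pl (a hypothetical name), one way to use it is to write the generated commands to a file and run them with bash:

    perl grid.pl train.scale > run_grid.sh
    bash run_grid.sh

The >& redirection sends both stdout and stderr of each svm-train run to its own .out file, so the cross-validation accuracies can be compared afterwards.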
libsvm supports 4 kernels: linear, poly, RBF, and sigmoid (which is not a valid kernel).
- linear: no parameters
- poly: gamma (>0, float), coef0 (float), degree (>1, int)
- rbf: gamma (>0, float)
- sigmoid: gamma (>0, float), coef0 (float)
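As an illustration of how the per-kernel parameters above map onto the grid-generator approach from the question, here is a minimal sketch for the poly kernel (-t 1), looping over cost, gamma, and degree; coef0 is left at its default of 0, and the exponent ranges are placeholders to be tuned per dataset:

    for ( $i = -5; $i <= 15; $i += 2 ) {        # cost  = 2**$i
        for ( $j = -15; $j <= 3; $j += 2 ) {    # gamma = 2**$j
            for ( $d = 2; $d <= 5; $d += 1 ) {  # degree
                $a = 2**$i;
                $b = 2**$j;
                $output = "svm-train -t 1 -c $a -g $b -d $d -v 5 $ARGV[0]";
                print "$output >& ${ARGV[0]}_${a}_${b}_${d}.out \n";
            }
        }
    }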
You cannot provide general parameter grids, as they are data dependent.
C is an SVM parameter, not a kernel parameter, and it always needs to be fitted. The remaining options are not kernel specific, so you should not worry about them.
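That said, a common starting point (suggested in the libsvm practical guide) is a coarse exponential grid such as C = 2^-5, 2^-3, ..., 2^15 and gamma = 2^-15, 2^-13, ..., 2^3, refined afterwards around the best value found. For the linear kernel in the question's title, only C needs to be searched; a minimal sketch in the same style as the generator above:

    for ( $i = -5; $i <= 15; $i += 2 ) {
        $a = 2**$i;
        $output = "svm-train -t 0 -c $a -v 5 $ARGV[0]";
        print "$output >& ${ARGV[0]}_${a}.out \n";
    }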