
Talk:Support vector machine

From Wikipedia, the free encyclopedia

This is the talk page for discussing improvements to the Support vector machine article.
This is not a forum for general discussion about the article's subject.


There is not enough information about solving SVMs. There is only the mention that it's a quadratic optimization problem. A discussion of solving and a brief example would be very helpful. 216.145.54.158 21:59, 4 August 2006 (UTC)
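For what it's worth, here is a minimal sketch of what such an example might look like: the (linear, soft-margin) dual solved as a quadratic program with a generic solver. The toy data, the value of C, and the use of scipy are illustrative assumptions, not anything from the article:

    import numpy as np
    from scipy.optimize import minimize

    # Toy linearly separable data, two points per class (illustrative).
    X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
    y = np.array([1.0, 1.0, -1.0, -1.0])
    C = 10.0  # soft-margin penalty (assumed value)

    # Dual problem: maximize sum(a) - 0.5 * a^T Q a,  Q_ij = y_i y_j <x_i, x_j>,
    # subject to 0 <= a_i <= C and sum_i a_i y_i = 0.
    Yx = y[:, None] * X
    Q = Yx @ Yx.T

    def neg_dual(a):
        return 0.5 * a @ Q @ a - a.sum()

    res = minimize(neg_dual, np.zeros(len(y)), method='SLSQP',
                   bounds=[(0.0, C)] * len(y),
                   constraints=[{'type': 'eq', 'fun': lambda a: a @ y}])
    alpha = res.x

    # Recover the primal solution: w = sum_i a_i y_i x_i, and b from a
    # support vector (alpha_i > 0), where y_i (w . x_i + b) = 1.
    w = (alpha * y) @ X
    i = int(np.argmax(alpha))
    b = y[i] - w @ X[i]
    print("w =", w, "b =", b)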


Hey - I don't think that kernel PCA should redirect here. It's sort of like redirecting "dogs" to "animals". It probably needs its own article. --137.215.9.20 09:07, 26 July 2006 (UTC)

Indeed, kernel PCA should not redirect here. KPCA and SVM have only the kernel trick in common: one is a feature extractor, the other is a classifier. KPCA is in fact closer to spectral clustering than it is to the SVM. I'll try to write something about KPCA some time soon.

Hello! I have a comment about the first sentence, which says "A SVM is a <blank>", where <blank> has lately been either "statistical classification model" or "supervised learning method". I'm in favor of the former because it situates SVM in a very large group of related methods from both conventional statistics and machine learning. "Supervised learning" is deficient on two counts: supervised learning covers regression as well as classification, and it suggests only a link to machine learning and not conventional statistics. So I'd like to hear what other people have to say. Happy editing, Wile E. Heresiarch 02:35, 29 Apr 2004 (UTC)

Yes, indeed, classification is more specific than supervised learning. However, there are both classification and regression forms of a support vector machine. The latter is often called Support Vector Regression (SVR). I added a discussion of SVR to this article, although it is difficult for laypeople to understand. Anyway, I think that supervised learning is a more accurate description. The article supervised learning is much less stubby than classification, too.
To me, machine learning is statistics, so I don't have a preference for "supervised learning" over "classification" on that basis. -- hike395 04:46, 29 Apr 2004 (UTC)

Hmm. Not quite the way I'd put it. However, I haven't got my references on me at the moment, so I can't come up with a precise description. -- 213.253.39.90



Non-linear? and origin of machine term?

Do SVMs have to be non-linear? I thought they could be either linear or non-linear. -- Oliver PEREIRA 13:18 Jan 26, 2003 (UTC)

No. See if my new edit makes you happier. By the way, does anyone know why they're called "machines"? Nobody seems to call older machine learning techniques (e.g. the perceptron, feed-forward networks, etc.) "machines". --Ryguasu 00:13 Apr 2, 2003 (UTC)

According to Vapnik (The Nature of Statistical Learning Theory, p. 133), they are non-linear: "The Support Vector (SV) machine implements the following idea: it maps the input vectors x into a high-dimensional feature space Z through some nonlinear mapping, chosen a priori. In this space, an optimal separating hyperplane is constructed." Strictly speaking, a so-called linear SVM is an optimal hyperplane (it has support vectors, but is not a support vector machine), although many authors ignore this distinction. --knl 15:57, 28 Aug 2004 (UTC)
The mapping of the feature vectors into the feature space (possibly one of higher dimension) only makes sense if it is nonlinear. Were it linear, no additional information could be gained by this step: any separating hyperplane in the image space would pull back to a separating hyperplane in the input space, so the feature vectors would stand in the same linear relations as before. Choosing a linear mapping would therefore make no sense and would only add computational expense. Hence, any but the most basic support vector machines, which are in fact called linear support vector machines, are nonlinear at a very fundamental level. —The preceding unsigned comment was added by 206.106.168.10 (talk) 01:58, 12 March 2007 (UTC).
I learned that the reasoning behind the name support vector machine is the following: the support vectors, that is, the samples lying on the margin hyperplanes, are used to solve for the maximum-margin hyperplane between the positive and negative examples. This maximum-margin hyperplane (characterized by w and b) is the machine, which returns 1 or -1 (or 0 if the test point/vector lies exactly on the hyperplane) when given test points/vectors. So "machine" in this case means "something that returns 1 (true) or -1 (false)" for the purposes of classification. Not sure how to find a reference for this, though. --Rajah 16:27, 20 February 2007 (UTC)
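As a minimal sketch of that reading (assuming a trained w and b; the names are illustrative), the "machine" is just the sign of an affine function:

    import numpy as np

    def classify(x, w, b):
        # Returns 1 or -1 depending on the side of the hyperplane
        # (0 exactly on the decision boundary).
        return int(np.sign(np.dot(w, x) + b))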
Another useful interpretation of the name is the analogy in Burges' paper "A Tutorial on Support Vector Machines for Pattern Recognition". If you treat the decision surface in feature space as a rigid sheet, and suppose each support vector exerts a force alpha (the value of its associated dual variable) perpendicularly against it (say, by a rod from the decision sheet to the support vector in feature space), then at optimality the associated forces and torques all cancel out. So the margin is "supported" by the forces exerted by the "support vectors" and subsequently lies at a position of mechanical stability. This is a nice justification for the name support vector, and, being rather mechanistic, it also makes machine seem less of a stretch. Svm slave 12:25, 24 February 2007 (UTC)
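In the usual notation (a sketch of Burges' equilibrium conditions; here \hat{w} = w/\|w\| and the s_i are the support vectors in feature space), both conditions follow from the standard dual constraint and the expansion of w:

    F_i = \alpha_i y_i \hat{w}, \qquad
    \sum_i F_i = \hat{w} \sum_i \alpha_i y_i = 0, \qquad
    \sum_i s_i \times F_i = \Big( \sum_i \alpha_i y_i s_i \Big) \times \hat{w} = w \times \hat{w} = 0.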

Fast and lightweight? As compared to what? (I'm sitting in front of a machine that has spent 3 days on a linearly separable dataset that C4.5 takes a few minutes to chomp through.) User:Iwnbap


Don't forget Sequential Minimal Optimization (SMO) [1]


As someone who doesn't know SVM very well yet, I say this article could really use a picture. The article is technically correct, but not enlightening to a newbie. Someone drew me a picture with two classes (+ and -), some separating planes (i.e., a 2-d example) and explained that many cases not near the planes are often thrown away. I thought it helped a lot. If I knew SVM well enough, I would upload and label the picture. Maybe I will if I learn more. dfrankow 03:54, 3 March 2006 (UTC)

I hope the picture I added makes it clearer. AnAj 19:23, 15 June 2006 (UTC)
I cannot see the picture in the article. Is the link wrong? —The preceding unsigned comment was added by 84.216.75.78 (talk • contribs).
Apparently there was something wrong with the thumbnail cache. I purged it and now it shows correctly on all thumbnail sizes at least for me. AnAj 09:04, 13 August 2006 (UTC)

How to add a Loss-Matrix to SVM

Maybe somebody can point out how to incorporate a loss matrix into the SVM framework. I've been looking for this information in various places, and I think it would add some great value to the article.

Need more practically USEFUL info.

The article is too academically minded. There should be a paragraph on real-world applications; for example, SVMs and logistic regression are being used nowadays to filter e-mail spam. How and why? Thanks in advance! —The preceding unsigned comment was added by 195.70.32.136 (talk) 15:54, 18 January 2007 (UTC).

Hmm, I think it would be nice to have some real-world examples where SVM beats every competitor (should be easy to find). Maybe a good idea would be to create some images with the different cases that arise on real-world data (e.g. linearly separable, linearly separable with soft margin, separable with RBF kernel, and separable with RBF kernel and soft margin...) to clarify why there are different extensions of the original linear formulation. --Cyc 12:25, 5 February 2007 (UTC)

I'm not going to excise comments from a talk page, and I agree that examples are useful. But the comment above is totally wrong.... The law of conservation of generalisation means that for every case where SVMs are better at classification than another method, there is a corresponding case where they are worse. It's a fundamental principle of machine learning (however counter-intuitive it seems). Outside the world of the theoretical, there are many applications where SVMs are not the best approach (and I use SVMs in industry). It all depends on the data and the application. I'd agree it's worth including the RBF kernel, as in my experience it's probably used more frequently in the real world than linear SVMs, since data is rarely linearly separable in n dimensions. A brief explanation and a link to Kernel methods is probably enough. Maybe I'll do it myself if I get time.... 212.84.127.134 09:13, 28 February 2007 (UTC)
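To make the kernel point concrete, here is a minimal sketch of a kernelized decision function with an RBF kernel; the function names, the gamma value, and the calling convention are illustrative assumptions, not any particular library's API:

    import numpy as np

    def rbf_kernel(x, z, gamma=0.5):
        # k(x, z) = exp(-gamma * ||x - z||^2); gamma = 0.5 is an assumed value
        return np.exp(-gamma * np.sum((x - z) ** 2))

    def decision(x, support_vectors, alphas, labels, b, kernel=rbf_kernel):
        # f(x) = sign( sum_i alpha_i * y_i * k(s_i, x) + b ); replacing the
        # kernel with a plain dot product recovers the linear SVM.
        s = sum(a * yi * kernel(sv, x)
                for a, yi, sv in zip(alphas, labels, support_vectors))
        return np.sign(s + b)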

This should tie into least-squares

SVMs and SVRs are a simple application of least squares, which has been around a lot longer than either of these two. People from many disciplines have never heard of machine learning or SVMs/SVRs, but they have heard of least squares. Even the non-linear adjustment is standard in least-squares polynomial fitting. Standard least squares is just SVR, and SVM does a similar trick while trying to fit a hyperplane BETWEEN the data rather than fitting a line ON the data.

I'm not sure this is strictly true (though admittedly, while SVMs are my specialty, I know relatively little about advanced least-squares algorithms). I agree that least-squares SVMs (LS-SVRs and LS-SVMs, as per Suykens' book) implement a regularised least-squares technique, but standard SVRs (and SVM classifiers) implement a cost function which is linear, not quadratic (specifically, SVRs usually use Vapnik's epsilon-insensitive cost). Actually, it might be good to cover least-squares SVMs and maybe mention (briefly; the details might be a bit excessive for now) very general extensions like Smola, Scholkopf and Muller's paper "General Cost Functions for Support Vector Regression". Svm slave 12:34, 24 February 2007 (UTC)
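To make the contrast concrete, a small sketch of the two cost functions being compared (the epsilon value is an illustrative assumption):

    import numpy as np

    def squared_loss(y_true, y_pred):
        # Least-squares / LS-SVM cost: quadratic in the residual.
        return (y_true - y_pred) ** 2

    def epsilon_insensitive_loss(y_true, y_pred, epsilon=0.1):
        # Standard SVR cost (Vapnik): zero inside the epsilon tube,
        # growing linearly, not quadratically, outside it.
        return np.maximum(0.0, np.abs(y_true - y_pred) - epsilon)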

Secure Virtual Machine (SVM) Technology?

Is there a wiki article for this topic? I typed in SVM and got here. I also typed in all the other obvious things, and I think this term does not have a wiki article! TonyMartin 19:36, 06 March 2007 (UTC)

As far as I know, there is no such article on the English-language Wikipedia. There is, however, a short article on the German-language Wikipedia. Feel free to add Secure Virtual Machine to the list of requested articles for computer science, computing, and the Internet. (Or even better, start that article yourself if you feel knowledgeable about this topic.) — Tobias Bergemann 08:37, 7 March 2007 (UTC)

Addendum: Your signature appears to be broken, as the wiki link leads to a user page for a non-existent user "TonyMartin" instead of User:Tonymartin234567. — Tobias Bergemann 08:40, 7 March 2007 (UTC)

The claim "they simultaneously minimize the empirical classification error and maximize the geometric margin" is imprecise, to say the least. More like: maximize the margin, hoping that the error is minimized.
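For reference, the standard soft-margin formulation makes the trade-off explicit: a single objective balances the margin term against the empirical error term via the parameter C, so neither is minimized in isolation:

    \min_{w,\,b,\,\xi} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
    \quad \text{subject to} \quad
    y_i (w \cdot x_i + b) \ge 1 - \xi_i, \qquad \xi_i \ge 0.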
