Support Vector Machines

(lightning talk)

(LPW '07) (john melesky)

Presupposing:

The problem: find a line that separates these two categories of thing

For humans, this is easy.

For mathematicians (or rather, for computers), it's actually not too hard.
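
Concretely, "a line" is just a weighted sum plus an offset, and which category a point lands in is the sign of that sum. A toy sketch in Perl (the weights here are made up purely for illustration):

    use strict;
    use warnings;

    # A line in the plane: w1*x + w2*y + b = 0.
    # Points where the sum is positive get one label, negative the other.
    my @w = (1.0, -1.0);   # made-up weights
    my $b = 0.5;           # made-up offset

    sub classify {
        my ($x, $y) = @_;
        my $score = $w[0] * $x + $w[1] * $y + $b;
        return $score >= 0 ? 1 : -1;
    }

    print classify(3, 1), "\n";   # falls on one side of the line
    print classify(1, 4), "\n";   # falls on the other side

Training is just finding values of @w and $b that put the two categories on opposite sides.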

There are two problems, though.

Problem, the first:

There isn't just one separating line; there are endless candidates, all perfect on the data we've seen, and they're not all equally good choices.

Problem, the second:

Some data can't be separated by any straight line at all.

Conveniently, Support Vector Machines address both of the problems I've identified.

Solution, the first:

Pick the line with the widest margin, i.e. the one that keeps the greatest possible distance from both categories. Only the handful of points sitting right on the edge of that margin matter; those are the support vectors, and they alone define the line.
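
In symbols, "widest margin" has a standard formulation (using the usual notation: w is the line's normal vector, b its offset, and each label y_i is +1 or -1):

    \min_{w,\,b} \; \tfrac{1}{2} \lVert w \rVert^2
    \quad \text{subject to} \quad
    y_i \, (w \cdot x_i + b) \ge 1 \quad \text{for every training point } (x_i, y_i)

The margin works out to 2/||w||, so maximizing it is the same as minimizing ||w||, which is why it ends up written as a minimization.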

A joke:

Q: How many mathematicians does it take to change a lightbulb?


A: One, who hands it to 127 Londoners, thus reducing it to an earlier joke.

A question:

Q: How do mathematicians categorize non-linearly-separable data?


A: Munge the data until it's linearly separable, thus reducing it to an earlier slide.


Seriously. The munging is done using what are known as "kernel methods".

Kernel Methods
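
The classic toy illustration of the munging, in Perl (the numbers are made up): one class sits in the middle of the other on a line, so no single cut-off separates them, but squaring each value turns it into a one-threshold problem.

    use strict;
    use warnings;

    # Class "inner" sits near zero, class "outer" far from zero:
    # no single threshold on x separates them.
    my @inner = (-1, -0.5, 0, 0.5, 1);
    my @outer = (-4, -3, 3, 4);

    # Munge each point into a new feature, x**2 ...
    my $munge = sub { my ($x) = @_; return $x ** 2 };

    # ... and now a plain threshold (4, say) does the job.
    for my $x (@inner, @outer) {
        my $class = $munge->($x) < 4 ? 'inner' : 'outer';
        printf "x = %4.1f   x**2 = %4.1f   %s\n", $x, $munge->($x), $class;
    }

Real kernel methods do this implicitly: a kernel function compares points as if they'd been mapped into the fancier space, without ever computing the new features directly.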

Kernel Methods + Support Vectors = Support Vector Machines

In Perl:

Algorithm::SVM - bindings to libsvm

(Also wrapped by AI::Categorizer)
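
Roughly, using Algorithm::SVM looks something like this (a sketch based on the module's documented interface; check its POD for the exact parameter names before relying on it):

    use strict;
    use warnings;
    use Algorithm::SVM;
    use Algorithm::SVM::DataSet;

    # Each training example is a numeric vector with a label.
    my @training = (
        Algorithm::SVM::DataSet->new(Label =>  1, Data => [1.0, 2.0]),
        Algorithm::SVM::DataSet->new(Label =>  1, Data => [1.5, 1.8]),
        Algorithm::SVM::DataSet->new(Label => -1, Data => [8.0, 8.0]),
        Algorithm::SVM::DataSet->new(Label => -1, Data => [9.0, 7.5]),
    );

    # 'radial' is the RBF kernel; 'linear' and 'polynomial' are other options.
    my $svm = Algorithm::SVM->new(Type => 'C-SVC', Kernel => 'radial');
    $svm->train(@training);

    # Ask for the label of an unseen point.
    my $test = Algorithm::SVM::DataSet->new(Label => 0, Data => [1.2, 1.9]);
    print "predicted: ", $svm->predict($test), "\n";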