Episode 551

Kernel method
Tue, 2018-Nov-06 01:11 UTC
Length - 2:49


Welcome to random Wiki of the Day where we read the summary of a random Wikipedia page every day.

The random article for Tuesday, 6 November 2018 is Kernel method.

In machine learning, kernel methods are a class of algorithms for pattern analysis, whose best-known member is the support vector machine (SVM). The general task of pattern analysis is to find and study general types of relations (for example clusters, rankings, principal components, correlations, classifications) in datasets. For many algorithms that solve these tasks, the data in raw representation have to be explicitly transformed into feature vector representations via a user-specified feature map; in contrast, kernel methods require only a user-specified kernel, i.e., a similarity function over pairs of data points in raw representation.

Kernel methods owe their name to the use of kernel functions, which enable them to operate in a high-dimensional, implicit feature space without ever computing the coordinates of the data in that space, but rather by simply computing the inner products between the images of all pairs of data in the feature space. This operation is often computationally cheaper than the explicit computation of the coordinates. This approach is called the "kernel trick". Kernel functions have been introduced for sequence data, graphs, text, images, as well as vectors.
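The kernel trick described above can be made concrete with a small sketch (not part of the episode, just an illustration): for the degree-2 polynomial kernel k(x, y) = (x · y)², the explicit feature map on R² is φ(x) = (x₁², √2·x₁x₂, x₂²), and the kernel computes the inner product ⟨φ(x), φ(y)⟩ without ever forming φ.

```python
import math

def phi(v):
    # Explicit feature map into R^3 for the degree-2 polynomial kernel.
    x1, x2 = v
    return (x1 * x1, math.sqrt(2) * x1 * x2, x2 * x2)

def kernel(x, y):
    # Kernel evaluation: the same inner product, computed directly in R^2
    # without constructing the feature-space coordinates.
    return (x[0] * y[0] + x[1] * y[1]) ** 2

x, y = (1.0, 2.0), (3.0, 4.0)

explicit = sum(a * b for a, b in zip(phi(x), phi(y)))  # inner product of images
implicit = kernel(x, y)                                # kernel trick

# Both routes give 121.0: (1*3 + 2*4)^2 = 11^2.
assert abs(explicit - implicit) < 1e-9
```

For a degree-d kernel on n-dimensional inputs the explicit feature space has roughly n^d coordinates, so evaluating the kernel directly is the computational saving the paragraph above refers to.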

Algorithms capable of operating with kernels include the kernel perceptron, support vector machines (SVM), Gaussian processes, principal components analysis (PCA), canonical correlation analysis, ridge regression, spectral clustering, linear adaptive filters and many others. Any linear model can be turned into a non-linear model by applying the kernel trick to the model: replacing its features (predictors) with a kernel function.
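As a sketch of kernelizing a linear model (this example is not from the episode), the kernel perceptron replaces the perceptron's dot product with a kernel evaluation. With a degree-2 polynomial kernel k(x, y) = (1 + x · y)², it can learn XOR, which no linear model separates in the raw input space:

```python
# Kernel perceptron sketch: the model is a set of per-example coefficients
# alpha, and both training and prediction touch the data only through the
# kernel -- the implicit feature space is never materialized.

def kernel(x, y):
    # Inhomogeneous degree-2 polynomial kernel on R^2.
    return (1 + x[0] * y[0] + x[1] * y[1]) ** 2

def decision(alpha, X, Y, x):
    # f(x) = sum_j alpha_j * y_j * k(x_j, x)
    return sum(a * yj * kernel(xj, x) for a, xj, yj in zip(alpha, X, Y))

def train(X, Y, epochs=10):
    alpha = [0] * len(X)
    for _ in range(epochs):
        for i, (x, y) in enumerate(zip(X, Y)):
            if y * decision(alpha, X, Y, x) <= 0:  # mistake on example i
                alpha[i] += 1                      # strengthen its influence
    return alpha

X = [(0, 0), (0, 1), (1, 0), (1, 1)]
Y = [-1, 1, 1, -1]  # XOR labels: not linearly separable in R^2
alpha = train(X, Y)
preds = [1 if decision(alpha, X, Y, x) > 0 else -1 for x in X]
# preds == Y: all four XOR points are classified correctly.
```

This is the pattern the paragraph describes: the linear perceptron's feature vector is swapped for kernel evaluations against the training points, turning a linear classifier into a non-linear one.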

Most kernel algorithms are based on convex optimization or eigenproblems and are statistically well-founded. Typically, their statistical properties are analyzed using statistical learning theory (for example, using Rademacher complexity).

This recording reflects the Wikipedia text as of 01:11 UTC on Tuesday, 6 November 2018.

For the full current version of the article, go to http://en.wikipedia.org/wiki/Kernel_method.

This podcast is produced by Abulsme Productions based on Wikipedia content and is released under a Creative Commons Attribution-ShareAlike License.

Visit wikioftheday.com for our archives, sister podcasts, and swag. Please subscribe to never miss an episode. You can also follow @WotDpod on Twitter.

Abulsme Productions produces the current events podcast Curmudgeon's Corner as well. Check it out in your podcast player of choice.

This has been Ivy. Thank you for listening to random Wiki of the Day.

Feedback welcome at feedback@wikioftheday.com.


