
Feature Extraction: Foundations and Applications. Softcover reprint of the original 1st ed. 2006 [Paperback]

Edited by Isabelle Guyon, Steve Gunn, Masoud Nikravesh and Lotfi A. Zadeh
  • Format: Paperback / softback, 778 pages, height x width: 235x155 mm, weight: 1623 g, XXIV, 778 p.
  • Series: Studies in Fuzziness and Soft Computing 207
  • Publication date: 30-Apr-2017
  • Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
  • ISBN-10: 366251771X
  • ISBN-13: 9783662517710
  • Paperback
  • Price: 280.32 €*
  • * the price is final, i.e. no further discounts apply
  • Regular price: 329.79 €
  • You save 15%
  • Delivery from the publisher takes approximately 2-4 weeks
Everyone loves a good competition. As I write this, two billion fans are eagerly anticipating the 2006 World Cup. Meanwhile, a fan base that is somewhat smaller (but presumably includes you, dear reader) is equally eager to read all about the results of the NIPS 2003 Feature Selection Challenge, contained herein. Fans of Radford Neal and Jianguo Zhang (or of Bayesian neural networks and Dirichlet diffusion trees) are gloating "I told you so" and looking for proof that their win was not a fluke. But the matter is by no means settled, and fans of SVMs are shouting "wait 'til next year!" You know this book is a bit more edgy than your standard academic treatise as soon as you see the dedication: "To our friends and foes."

Competition breeds improvement. Fifty years ago, the champion in 100m butterfly swimming was 22 percent slower than today's champion; the women's marathon champion from just 30 years ago was 26 percent slower. Who knows how much better our machine learning algorithms would be today if Turing in 1950 had proposed an effective competition rather than his elusive Test?

But what makes an effective competition? The field of Speech Recognition has had NIST-run competitions since 1988; error rates have been reduced by a factor of three or more, but the field has not yet had the impact expected of it. Information Retrieval has had its TREC competition since 1992; progress has been steady and refugees from the competition have played important roles in the hundred-billion-dollar search industry. Robotics has had the DARPA Grand Challenge for only two years, but in that time we have seen the results go from complete failure to resounding success (although it may have helped that the second year's course was somewhat easier than the first's).
  • An Introduction to Feature Extraction
  • Feature Extraction Fundamentals
  • Learning Machines
  • Assessment Methods
  • Filter Methods
  • Search Strategies
  • Embedded Methods
  • Information-Theoretic Methods
  • Ensemble Learning
  • Fuzzy Neural Networks
  • Feature Selection Challenge
  • Design and Analysis of the NIPS 2003 Challenge
  • High Dimensional Classification with Bayesian Neural Networks and Dirichlet Diffusion Trees
  • Ensembles of Regularized Least Squares Classifiers for High-Dimensional Problems
  • Combining SVMs with Various Feature Selection Strategies
  • Feature Selection with Transductive Support Vector Machines
  • Variable Selection using Correlation and Single Variable Classifier Methods: Applications
  • Tree-Based Ensembles with Dynamic Soft Feature Selection
  • Sparse, Flexible and Efficient Modeling using L1 Regularization
  • Margin Based Feature Selection and Infogain with Standard Classifiers
  • Bayesian Support Vector Machines for Feature Ranking and Selection
  • Nonlinear Feature Selection with the Potential Support Vector Machine
  • Combining a Filter Method with SVMs
  • Feature Selection via Sensitivity Analysis with Direct Kernel PLS
  • Information Gain, Correlation and Support Vector Machines
  • Mining for Complex Models Comprising Feature Selection and Classification
  • Combining Information-Based Supervised and Unsupervised Feature Selection
  • An Enhanced Selective Naïve Bayes Method with Optimal Discretization
  • An Input Variable Importance Definition based on Empirical Data Probability Distribution
  • New Perspectives in Feature Extraction
  • Spectral Dimensionality Reduction
  • Constructing Orthogonal Latent Features for Arbitrary Loss
  • Large Margin Principles for Feature Selection
  • Feature Extraction for Classification of Proteomic Mass Spectra: A Comparative Study
  • Sequence Motifs: Highly Predictive Features of Protein Function