
E-book: Advances in Independent Component Analysis

Edited by
  • Format - PDF+DRM
  • Price: 196,98 €*
  • * The price is final, i.e. no further discounts apply
  • This e-book is intended for personal use only. E-books cannot be returned.

DRM restrictions

  • Copying (copy/paste):

    not allowed

  • Printing:

    not allowed

  • Usage:

    Digital rights management (DRM)
    The publisher has issued this e-book in encrypted form, which means that you must install special software in order to read it. You must also create an Adobe ID. More information here. The e-book can be read by 1 user and downloaded to up to 6 devices (all authorised with the same Adobe ID).

    Required software
    To read on a mobile device (phone or tablet), you need to install this free app: PocketBook Reader (iOS / Android)

    To read on a PC or Mac, you need to install Adobe Digital Editions (this is a free application designed specifically for reading e-books; it should not be confused with Adobe Reader, which is probably already installed on your computer).

    This e-book cannot be read on an Amazon Kindle.

Independent Component Analysis (ICA) is a fast-developing area of intense research interest. Following on from Self-Organising Neural Networks: Independent Component Analysis and Blind Signal Separation, this book reviews the significant developments of the past year. It covers topics such as the use of hidden Markov methods, the independence assumption, and topographic ICA, and includes tutorial chapters on Bayesian and variational approaches. It also presents the latest approaches to ICA problems, including the first investigations into certain "hard problems". Comprising contributions from the most respected and innovative researchers in the field, this volume will be of interest to students and researchers in computer science and electrical engineering; research and development personnel in statistical modelling and data analysis; bioinformatics practitioners; and physicists and chemists requiring novel data analysis methods.
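
For readers unfamiliar with the field, the sketch below illustrates the basic blind source separation task that ICA addresses: recovering statistically independent source signals from observed linear mixtures. It is not taken from the book; it is a minimal example assuming the NumPy and scikit-learn libraries, and all signal names are illustrative.

    # Minimal blind source separation sketch with FastICA (illustrative only).
    import numpy as np
    from sklearn.decomposition import FastICA

    t = np.linspace(0, 8, 2000)

    # Two independent, non-Gaussian sources: a sinusoid and a sawtooth.
    s1 = np.sin(2 * np.pi * t)
    s2 = 2 * (t % 1.0) - 1.0
    S = np.c_[s1, s2]

    # Observations are an unknown linear mixture of the sources.
    A = np.array([[1.0, 0.5],
                  [0.4, 1.0]])      # mixing matrix (unknown in practice)
    X = S @ A.T                     # shape (n_samples, n_mixtures)

    # FastICA estimates the unmixing transform from the mixtures alone.
    ica = FastICA(n_components=2, random_state=0)
    S_est = ica.fit_transform(X)    # recovered sources, up to order and scale

    # Recovered components should correlate strongly with the true sources.
    corr = np.abs(np.corrcoef(S_est.T, S.T))[:2, 2:]
    print(np.round(corr, 2))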

Other information

Springer Book Archives
Contributors xv
Foreword xix

Part I: Temporal ICA Models

Hidden Markov Independent Component Analysis (William D. Penny, Richard M. Everson, Stephen J. Roberts) 3
    Introduction 3; Hidden Markov Models 3; Independent Component Analysis 6; Generalised Exponential Sources 6; Generalised Autoregressive Sources 7; Hidden Markov ICA 8; Generalised Exponential Sources 9; Generalised Autoregressive Sources 10; Practical Issues 10; Initialisation 10; Learning 10; Model Order Selection 12; Results 12; Multiple Sinewave Sources 12; Same Sources, Different Mixing 14; Same Mixing, Different Sources 16; EEG Data 16; Conclusion 19; Acknowledgements 20; Appendix 20

Particle Filters for Non-Stationary ICA (Richard M. Everson, Stephen J. Roberts) 23
    Introduction 23; Stationary ICA 23; Non-Stationary Independent Component Analysis 25; Source Model 27; Particle Filters 28; Source Recovery 29; Illustration of Non-Stationary ICA 30; Smoothing 33; Temporal Correlations 36; Conclusion 38; Acknowledgement 38; Appendix: Laplace's Approximation for the Likelihood 39

Part II: The Validity of the Independence Assumption

The Independence Assumption: Analyzing the Independence of the Components by Topography (Aapo Hyvarinen, Patrik O. Hoyer, Mika Inki) 45
    Introduction 45; Background: Independent Subspace Analysis 47; Topographic ICA Model 49; Dependence and Topography 49; Defining Topographic ICA 50; The Generative Model 51; Basic Properties of the Topographic ICA Model 52; Learning Rule 53; Comparison with Other Topographic Mappings 54; Experiments 55; Experiments in Feature Extraction of Image Data 55; Experiments in Feature Extraction of Audio Data 57; Experiments with Magnetoencephalographic Recordings 58; Conclusion 59

The Independence Assumption: Dependent Component Analysis (Allan Kardec Barros) 63
    Introduction 63; Blind Source Separation by DCA 64; The "Cyclone" Algorithm 65; Experimental Results 67; Higher-Order Cyclostationary Signal Separation 68; Conclusion 68; Appendix: Proof of ACF Property 3 70

Part III: Ensemble Learning and Applications

Ensemble Learning (Harri Lappalainen, James W. Miskin) 75
    Introduction 75; Posterior Averages in Action 76; Approximations of Posterior PDF 78; Ensemble Learning 79; Model Selection in Ensemble Learning 81; Connection to Coding 82; EM and MAP 83; Construction of Probabilistic Models 83; Priors and Hyperpriors 85; Examples 86; Fixed Form Q 86; Free Form Q 88; Conclusion 91; References 92

Bayesian Non-Linear Independent Component Analysis by Multi-Layer Perceptrons (Harri Lappalainen, Antti Honkela) 93
    Introduction 93; Choosing Among Competing Explanations 95; Non-Linear Factor Analysis 97; Definition of the Model 97; Cost Function 99; Update Rules 102; Non-Linear Independent Factor Analysis 106; Experiments 107; Learning Scheme 107; Helix 108; Non-Linear Artificial Data 109; Process Data 115; Comparison with Existing Methods 116; SOM and GTM 116; Auto-Associative MLPs 117; Generative Learning with MLPs 118; Conclusion 118; Validity of the Approximations 118; Initial Inversion by Auxiliary MLP 119; Future Directions 120; Acknowledgements 120

Ensemble Learning for Blind Image Separation and Deconvolution (James Miskin, David J. C. MacKay) 123
    Introduction 123; Separation of Images 124; Learning the Ensemble 126; Learning the Model 129; Example 129; Parts-Based Image Decomposition 132; Deconvolution of Images 134; Conclusion 140; Acknowledgements 141; References 141

Part IV: Data Analysis and Applications

Multi-Class Independent Component Analysis (MUCICA) for Rank-Deficient Distributions (Francesco Palmieri, Alessandra Budillon) 145
    Introduction 145; The Rank-Deficient One-Class Problem 146; Method I: Three Blocks 148; Method II: Two Blocks 149; Method III: One Block 150; The Rank-Deficient Multi-Class Problem 151; Simulations 154; Conclusion 158; References 159

Blind Separation of Noisy Image Mixtures (Lars Kai Hansen) 161
    Introduction 161; The Likelihood 162; Estimation of Sources in the Case of Known Parameters 163; Joint Estimation of Sources, Mixing Matrix, and Noise Level 164; Simulation Example 166; Generalization and the Bias-Variance Dilemma 167; Application to Neuroimaging 170; Conclusion 175; Acknowledgments 178; Appendix: The Generalized Boltzmann Learning Rule 179

Searching for Independence in Electromagnetic Brain Waves (Ricardo Vigario, Jaakko Sarela, Erkki Oja) 183
    Introduction 183; Independent Component Analysis 184; The Model 184; The FastICA Algorithm 184; Electro- and Magnetoencephalography 186; The Analysis of the Linear ICA Model 188; The Analysis of EEG and MEG Data 189; Artifact Identification and Removal from EEG/MEG 189; Analysis of Multimodal Evoked Fields 191; Segmenting Auditory Evoked Fields 193; Conclusion 194

ICA on Noisy Data: A Factor Analysis Approach (Shiro Ikeda) 201
    Introduction 201; Factor Analysis and ICA 202; Factor Analysis 202; Factor Analysis in Preprocessing 204; ICA as Determining the Rotation Matrix 204; Experiment with Synthesized Data 205; MEG Data Analysis 208; Experiment with Phantom Data 209; Experiment with Real Brain Data 211; Conclusion 213; Acknowledgments 215

Analysis of Optical Imaging Data Using Weak Models and ICA (John Porrill, James V. Stone, Jason Berwick, John Mayhew, Peter Coffey) 217
    Introduction 217; Linear Component Analysis 218; Singular Value Decomposition 219; SVD Applied to OI Data Set 220; Independent Component Analysis 221; Minimisation Routines 223; Application of SICA to OI Data 223; The Weak Causal Model 225; Weak Causal Model Applied to the OI Data Set 226; Some Remarks on Significance Testing 227; The Weak Periodic Model 227; Regularised Weak Models 228; Regularised Weak Causal Model Applied to OI Data 229; Image Goodness and Multiple Models 230; A Last Look at the OI Data Set 231; Conclusion 232; References 233

Independent Components in Text (Thomas Kolenda, Lars Kai Hansen, Sigurdur Sigurdsson) 235
    Introduction 235; Vector Space Representations 235; Latent Semantic Indexing 237; Independent Component Analysis 238; Noisy Separation of Linear Mixtures 239; Learning ICA Text Representations on the LSI Space 242; Document Classification Based on Independent Components 243; Keywords from Context Vectors 244; Generalisation and the Bias-Variance Dilemma 244; Examples 246; MED Data Set 248; CRAN Data Set 249; Conclusion 251

Seeking Independence Using Biologically-Inspired ANNs (Pei Ling Lai, Darryl Charles, Colin Fyfe) 257
    Introduction 257; The Negative Feedback Network 258; Independence in Unions of Sources 259; Factor Analysis 261; Minimal Overcomplete Bases 261; Canonical Correlation Analysis 264; Extracting Multiple Correlations 266; Using Minimum Correlations to Extract Independent Sources 267; Experiments 268; ε-Insensitive Hebbian Learning 269; Is this a Hebbian Rule? 270; Extraction of Sinusoids 271; Noise Reduction 273; Conclusion 275; References 275

Index 277