[1] R. Kannan, M. Ishteva, B. Drake, and H. Park. Bounded matrix low rank approximation. In G. R. Naik, editor, Non-negative Matrix Factorization Techniques, Signals and Communication Technology, pages 89-118. Springer Berlin Heidelberg, 2016. [ bib | DOI | http ]
[2] I. Markovsky. System identification in the behavioral setting: A structured low-rank approximation approach. In E. Vincent, A. Yeredor, Z. Koldovský, and P. Tichavský, editors, Latent Variable Analysis and Signal Separation, volume 9237 of Lecture Notes in Computer Science, pages 235-242. Springer, 2015. [ bib | pdf ]
System identification is a fast-growing research area that encompasses a broad range of problems and solution methods. It is desirable to have a unifying setting and a few common principles that are sufficient to understand the existing identification methods. The behavioral approach to systems and control, put forward in the mid-1980s, is such a unifying setting. Until recently, however, the behavioral approach lacked supporting numerical solution methods. In the last 10 years, the structured low-rank approximation setting has been used to fill this gap. In this paper, we summarize recent progress on methods for system identification in the behavioral setting and pose some open problems. First, we show that errors-in-variables and output error system identification problems are equivalent to Hankel structured low-rank approximation. Then, we outline three generic solution approaches: 1) methods based on local optimization, 2) methods based on convex relaxations, and 3) subspace methods. A specific example of a subspace identification method, data-driven impulse response computation, is presented in full detail. In order to achieve the desired unification, the classical ARMAX identification problem should also be formulated as a structured low-rank approximation problem. This is an outstanding open problem.

Keywords: system identification; errors-in-variables modeling; behavioral approach; Hankel matrix; low-rank approximation; impulse response estimation; ARMAX identification.
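The abstract of [2] hinges on the link between identification and Hankel structured low-rank approximation. The sketch below is only a rough illustration of that setting, not the paper's algorithm: it builds a Hankel matrix from a noisy scalar trajectory and computes an unstructured rank-r approximation by truncated SVD (the data, lag L, and rank r are assumptions made up for the example, and the Hankel structure of the approximation is not enforced).

```python
import numpy as np

def hankel_matrix(w, L):
    """L x (T - L + 1) Hankel matrix built from the time series w."""
    T = len(w)
    return np.array([w[i:i + T - L + 1] for i in range(L)])

# Illustrative data: noisy output of a 2nd-order autonomous system,
# so the noise-free Hankel matrix would have rank 2.
rng = np.random.default_rng(0)
t = np.arange(100)
y = 0.9**t * np.cos(0.5 * t) + 0.01 * rng.standard_normal(t.size)

H = hankel_matrix(y, L=10)

# Unstructured rank-2 approximation by truncated SVD; structure-enforcing
# methods (e.g., local optimization over Hankel matrices) refine this step.
U, s, Vt = np.linalg.svd(H, full_matrices=False)
r = 2
H_lowrank = U[:, :r] @ np.diag(s[:r]) @ Vt[:r, :]

print("leading singular values:", np.round(s[:4], 3))
print("rank-2 approximation error:", np.linalg.norm(H - H_lowrank))
```

The rapid drop after the second singular value is what a rank-revealing step in a subspace method would exploit; the paper's data-driven impulse response computation goes further and works with the structured matrix directly.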
[3] M. Ishteva. Tensors and latent variable models. In E. Vincent, A. Yeredor, Z. Koldovský, and P. Tichavský, editors, Latent Variable Analysis and Signal Separation, volume 9237 of Lecture Notes in Computer Science, pages 49-55. Springer International Publishing, 2015. [ bib | DOI | http ]
Keywords: Latent variable models; Tensor; Low rank
[4] P. Dreesen, T. Goossens, M. Ishteva, L. De Lathauwer, and J. Schoukens. Block-decoupling multivariate polynomials using the tensor block-term decomposition. In E. Vincent, A. Yeredor, Z. Koldovský, and P. Tichavský, editors, Latent Variable Analysis and Signal Separation, volume 9237 of Lecture Notes in Computer Science, pages 14-21. Springer International Publishing, 2015. [ bib | DOI | http ]
Keywords: Multivariate polynomials; Multilinear algebra; Tensor decomposition; Block-term decomposition; Waring decomposition
[5] I. Markovsky. Rank constrained optimization problems in computer vision. In J. Suykens, M. Signoretto, and A. Argyriou, editors, Regularization, Optimization, Kernels, and Support Vector Machines, Machine Learning & Pattern Recognition Series, chapter 13, pages 293-312. Chapman & Hall/CRC, 2014. [ bib | pdf ]
[6] I. Markovsky and K. Usevich. Nonlinearly structured low-rank approximation. In Y. R. Fu, editor, Low-Rank and Sparse Modeling for Visual Analysis, pages 1-22. Springer, 2014. [ bib | DOI | pdf ]
Polynomially structured low-rank approximation problems occur in algebraic curve fitting, conic section fitting, subspace clustering (generalized principal component analysis), and nonlinear and parameter-varying system identification. The maximum likelihood estimation principle applied to these nonlinear models leads to nonconvex optimization problems and yields inconsistent estimators in the errors-in-variables (measurement errors) setting. We propose a computationally cheap and statistically consistent estimator based on a bias correction procedure, called adjusted least-squares estimation. The method is successfully used for conic section fitting and was recently generalized to algebraic curve fitting. The contribution of this book chapter is the application of the polynomially structured low-rank approximation problem and, in particular, the adjusted least-squares method to subspace clustering and nonlinear and parameter-varying system identification. The input-output notion of a dynamical model, classical in system identification, is replaced by the behavioral definition of a model as a set, represented by implicit nonlinear difference equations.

Keywords: structured low-rank approximation; conic section fitting; subspace clustering; nonlinear system identification.
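As a rough illustration of the setting in chapter [6], and not of the adjusted least-squares method itself, the sketch below fits a conic section by ordinary algebraic least squares on the degree-2 Veronese lifting. The data, noise level, and variable names are assumptions made up for the example; the bias-correction step that makes the estimator consistent under measurement noise is deliberately omitted.

```python
import numpy as np

def conic_lift(x, y):
    """Veronese (degree-2) lifting: each row is [x^2, x*y, y^2, x, y, 1]."""
    return np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])

# Illustrative data: points near the ellipse x^2/4 + y^2 = 1 with small noise.
rng = np.random.default_rng(1)
phi = np.linspace(0, 2 * np.pi, 50, endpoint=False)
x = 2 * np.cos(phi) + 0.01 * rng.standard_normal(phi.size)
y = np.sin(phi) + 0.01 * rng.standard_normal(phi.size)

# Algebraic least-squares fit: the conic coefficients span the (numerical)
# null space of the lifted data matrix, i.e., its last right singular vector.
# The adjusted least-squares estimator of the chapter would first correct
# this matrix for the noise-induced bias before extracting theta.
D = conic_lift(x, y)
_, _, Vt = np.linalg.svd(D, full_matrices=False)
theta = Vt[-1]  # coefficients of a*x^2 + b*x*y + c*y^2 + d*x + e*y + f = 0
print("estimated conic coefficients:", np.round(theta / np.abs(theta).max(), 3))
```

For this low-noise example the printed coefficients are close to the true ones, proportional to [0.25, 0, 1, 0, 0, -1]; at higher noise levels the uncorrected fit becomes visibly biased, which is the case the chapter's bias correction addresses.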
[7] I. Markovsky. Algorithms and literate programs for weighted low-rank approximation with missing data. Volume 3, chapter 12, pages 255-273. Springer, 2011. [ bib | DOI | pdf | software ]

This file was generated by bibtex2html 1.97.