Lukas Gonon
Former Member
Title: Dr.
Last Name: Gonon
First Name: Lukas
Now showing 1 - 5 of 5
- Publication: Reservoir Computing Universality With Stochastic Inputs (2019)
  Type: journal article
  Journal: IEEE Transactions on Neural Networks and Learning Systems
  Volume: Forthcoming
  Abstract: The universal approximation properties with respect to Lp-type criteria of three important families of reservoir computers with stochastic discrete-time semi-infinite inputs are shown. First, it is proved that linear reservoir systems with either polynomial or neural network readout maps are universal. More importantly, it is proved that the same property holds for two families with linear readouts, namely trigonometric state-affine systems and echo state networks, which are the most widely used reservoir systems in applications. The linearity of the readouts is a key feature in supervised machine learning applications: it guarantees that these systems can be used in high-dimensional situations and in the presence of large datasets. The Lp criteria used in this paper allow the formulation of universality results that do not necessarily impose almost sure uniform boundedness on the inputs or the fading memory property on the filter that needs to be approximated.
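The echo state network setup described in this abstract, a randomly generated recurrent system whose only trained component is a linear readout, can be illustrated with a minimal NumPy sketch. This is not code from the paper; the dimensions, the tanh activation, the spectral-radius scaling, and the toy target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes; not taken from the paper.
n_in, n_res, T = 1, 100, 500

# Randomly generated, untrained reservoir (an echo state network).
W = rng.uniform(-1.0, 1.0, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # scale spectral radius below 1
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))

def reservoir_states(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t in range(len(u)):
        x = np.tanh(W @ x + W_in @ u[t])
        states[t] = x
    return states

# Toy target: a simple fading-memory functional of the input.
u = rng.standard_normal((T, n_in))
y = 0.5 * u[:, 0] + 0.3 * np.roll(u[:, 0], 1)

X = reservoir_states(u)

# Linear readout fitted by ridge regression -- the only trained component.
lam = 1e-6
w_out = np.linalg.solve(X.T @ X + lam * np.eye(n_res), X.T @ y)
y_hat = X @ w_out
```

Approximating a different filter would reuse the same random reservoir and only retrain `w_out`, which is what makes the linear-readout families attractive for large datasets.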
- Publication: Approximation bounds for random neural networks and reservoir systems (Institute of Mathematical Statistics, 2023-02)
  Journal: The Annals of Applied Probability
  Volume: 33
  Issue: 1
- Publication: Expressive Power of Randomized Signature (2021)
  Authors: Cuchiero, Christa; Teichmann, Josef
  Volume: NeurIPS 2021 Workshop DLDE
  Abstract: We consider the question of whether the time evolution of controlled differential equations on general state spaces can be approximated arbitrarily well by (regularized) regressions on features generated through randomly chosen dynamical systems of moderately high dimension. On the one hand this is motivated by paradigms of reservoir computing, on the other hand by ideas from rough path theory and compressed sensing. Appropriately interpreted, this yields provable approximation and generalization results for generic dynamical systems, which are usually approximated by recurrent or LSTM networks, via regressions on the states of random, otherwise untrained dynamical systems. The results have important implications for transfer learning and the energy efficiency of training.
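The feature-generation step in this abstract, evolving a randomly chosen controlled system driven by the increments of an input path, can be sketched as follows. This is an assumed illustration, not the authors' construction: the Euler scheme, the tanh vector fields, and all sizes are hypothetical choices.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative sizes: d is the control dimension, k the feature dimension.
d, k, T = 2, 50, 200

# One fixed random vector field per control coordinate, plus one for time.
A = rng.standard_normal((d + 1, k, k)) / np.sqrt(k)
b = rng.standard_normal((d + 1, k))

def randomized_signature(u, dt=0.01):
    """Euler scheme for a randomly chosen controlled system driven by u:
    Z <- Z + sum_i tanh(A_i Z + b_i) * du_i, with du_0 = dt (drift term)."""
    Z = np.ones(k)
    for t in range(len(u) - 1):
        du = np.concatenate(([dt], u[t + 1] - u[t]))
        Z = Z + sum(np.tanh(A[i] @ Z + b[i]) * du[i] for i in range(d + 1))
    return Z

# Features of one sample path; a regularized linear regression on such
# features would then be trained against the target functional.
path = np.cumsum(0.05 * rng.standard_normal((T, d)), axis=0)
feats = randomized_signature(path)
```

The random system itself stays untrained; only the regression on its terminal (or running) states is fitted, mirroring the reservoir-computing paradigm the abstract invokes.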
- Publication: Discrete-time signatures and randomness in reservoir computing
  Authors: Cuchiero, Christa; Grigoryeva, Lyudmila; Teichmann, Josef
  Type: forthcoming
  Abstract: A new explanation of the geometric nature of the reservoir computing phenomenon is presented. Reservoir computing is understood in the literature as the possibility of approximating input/output systems with randomly chosen recurrent neural systems and a trained linear readout layer. Light is shed on this phenomenon by constructing what are called strongly universal reservoir systems as random projections of a family of state-space systems that generate Volterra series expansions. This procedure yields a state-affine reservoir system with randomly generated coefficients in a dimension that is logarithmically reduced with respect to the original system. This reservoir system is able to approximate any element of the class of fading memory filters just by training a different linear readout for each filter. Explicit expressions for the probability distributions needed in the generation of the projected reservoir system are stated, and bounds for the committed approximation error are provided.
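The logarithmic dimension reduction via random projections mentioned in this abstract rests on the same geometric fact as Johnson-Lindenstrauss embeddings: a random linear map into a much lower dimension approximately preserves distances between states. A minimal sketch of that mechanism, with purely illustrative dimensions and a Gaussian projection chosen for the example (the paper states its own, explicit distributions):

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative dimensions: a large state space projected down to a
# dimension of order log(N).
N, n = 4096, 64

V = rng.standard_normal((n, N)) / np.sqrt(n)   # random projection matrix

# Distances between high-dimensional states are approximately preserved,
# which is what lets a logarithmically smaller reservoir do the work.
x = rng.standard_normal(N)
z = rng.standard_normal(N)
ratio = np.linalg.norm(V @ x - V @ z) / np.linalg.norm(x - z)
```

Here `ratio` concentrates near 1 as `n` grows, so the projected state-affine system retains the separating power of the original Volterra-generating system at a fraction of the dimension.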
- Publication
  Type: working paper