Reservoir Computing meets Recurrent Kernels and Structured Transforms

[Submitted on 12 Jun 2020 (v1), last revised 21 Oct 2020 (this version, v2)]

Abstract: Reservoir Computing is a class of simple yet efficient Recurrent Neural
Networks where internal weights are fixed at random and only a linear output
layer is trained. In the large size limit, such random neural networks have a
deep connection with kernel methods. Our contributions are threefold: a) We
rigorously establish the recurrent kernel limit of Reservoir Computing and
prove its convergence. b) We test our models on chaotic time series prediction,
a classic but challenging benchmark in Reservoir Computing, and show how the
Recurrent Kernel is competitive and computationally efficient when the number
of data points remains moderate. c) When the number of samples is too large, we
leverage the success of structured Random Features for kernel approximation by
introducing Structured Reservoir Computing. The two proposed methods, Recurrent
Kernel and Structured Reservoir Computing, turn out to be much faster and more
memory-efficient than conventional Reservoir Computing.
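The following sketch makes the setup described in the abstract concrete. It is a minimal NumPy/SciPy illustration of the general technique, not the authors' implementation: the reservoir size, weight scalings, the three-factor Hadamard/sign-flip product standing in for a structured transform, and the sine-wave prediction task are all assumptions made for this example, and a practical version would use a fast Walsh-Hadamard transform rather than the dense Hadamard matrix built here.

import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(0)

# Illustrative sizes (not from the paper).
n_res = 512  # reservoir size; a power of 2 so a Hadamard matrix exists

# Conventional Reservoir Computing: dense i.i.d. Gaussian internal
# weights, fixed at random and never trained.
W_res = rng.normal(0.0, 1.0 / np.sqrt(n_res), (n_res, n_res))
W_in = rng.normal(0.0, 1.0, (n_res, 1))

# Structured stand-in: Hadamard transforms interleaved with random sign
# flips. Only the sign vectors are random parameters; a fast
# Walsh-Hadamard transform would avoid storing H densely.
H = hadamard(n_res).astype(float) / np.sqrt(n_res)
signs = [rng.choice([-1.0, 1.0], n_res) for _ in range(3)]

def dense_matvec(x):
    return W_res @ x

def structured_matvec(x):
    for d in signs:
        x = H @ (d * x)  # O(n log n) per factor with a fast transform
    return x

def run_reservoir(inputs, matvec):
    """Drive the reservoir with an input sequence and collect its states."""
    x = np.zeros(n_res)
    states = np.empty((len(inputs), n_res))
    for t, u in enumerate(inputs):
        x = np.tanh(matvec(x) + (W_in * u).ravel())
        states[t] = x
    return states

def train_readout(states, targets, reg=1e-6):
    """Ridge regression: the linear readout is the only trained part."""
    A = states.T @ states + reg * np.eye(states.shape[1])
    return np.linalg.solve(A, states.T @ targets)

# Toy one-step-ahead prediction task (a stand-in for a chaotic benchmark).
series = np.sin(np.linspace(0, 40 * np.pi, 2000))
for name, matvec in [("dense", dense_matvec), ("structured", structured_matvec)]:
    states = run_reservoir(series[:-1], matvec)
    w_out = train_readout(states, series[1:])
    mse = np.mean((states @ w_out - series[1:]) ** 2)
    print(f"{name} reservoir, train MSE: {mse:.2e}")

In the structured variant, the only stored random parameters are the three sign vectors, which is where the memory savings over a dense n-by-n weight matrix come from. The recurrent kernel limit established in the paper corresponds to letting the reservoir size grow without bound, at which point inner products of reservoir states concentrate around a deterministic kernel that can be evaluated directly.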

Submission history

From: Jonathan Dong

[v1]
Fri, 12 Jun 2020 16:46:34 UTC (5,339 KB)

[v2]
Wed, 21 Oct 2020 16:11:25 UTC (5,341 KB)
