A liquid state machine (LSM) is a type of reservoir computer that uses a spiking neural network. An LSM consists of a large collection of units (called nodes, or neurons). Each node receives time-varying input from external sources (the inputs) as well as from other nodes. Nodes are randomly connected to each other. The recurrent nature of the connections turns the time-varying input into a spatio-temporal pattern of activations in the network nodes. These spatio-temporal patterns of activation are read out by linear discriminant units.
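As an illustration, here is a minimal sketch of such a network in Python. The leaky integrate-and-fire dynamics, network size, sparsity, and weight scales below are assumptions chosen for illustration, not a canonical LSM architecture:

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 100, 200               # number of nodes, number of time steps
    dt, tau = 1.0, 20.0           # time step and membrane time constant (ms)
    v_thresh, v_reset = 1.0, 0.0  # spike threshold and reset potential

    # Random, sparse recurrent connectivity and random input weights
    # (the 10% connection density and weight scales are assumptions).
    W = rng.normal(0.0, 0.5, (N, N)) * (rng.random((N, N)) < 0.1)
    w_in = rng.normal(0.0, 1.0, N)

    u = np.sin(0.1 * np.arange(T))  # a toy time-varying input stream
    v = np.zeros(N)                 # membrane potentials
    spikes = np.zeros((T, N))       # recorded spatio-temporal spike pattern

    for t in range(T):
        prev = spikes[t - 1] if t > 0 else np.zeros(N)
        # Leaky integration of recurrent activity plus external input.
        v += (dt / tau) * (-v + W @ prev + w_in * u[t])
        fired = v >= v_thresh
        v[fired] = v_reset          # reset the nodes that spiked
        spikes[t] = fired.astype(float)

The array spikes now holds the spatio-temporal pattern of activations that the read-out units operate on.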
The soup of recurrently connected nodes will end up computing a large variety of nonlinear functions on the input. Given a large enough variety of such nonlinear functions, it is theoretically possible to obtain linear combinations (using the read-out units) that perform whatever mathematical operation is needed for a given task, such as speech recognition or computer vision.
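Continuing the sketch above (reusing spikes, u, T, and N), the read-out can be a simple linear map fitted to the liquid state. Here the spike trains are low-pass filtered into a continuous state and a linear read-out is fitted by ridge regression; the delayed-recall task, filter constant, and regularization strength are assumptions, and regression stands in for the linear discriminant units mentioned above:

    # Low-pass filter the spike trains into a continuous liquid state.
    x = np.zeros((T, N))
    for t in range(1, T):
        x[t] = 0.9 * x[t - 1] + spikes[t]

    target = np.roll(u, 5)  # toy task: reproduce the input delayed 5 steps
    lam = 1e-2              # ridge regularization strength (assumed value)
    w_out = np.linalg.solve(x.T @ x + lam * np.eye(N), x.T @ target)
    y = x @ w_out           # linear read-out of the liquid state

Only w_out is trained; the random recurrent weights W are left untouched, which is the defining trait of reservoir computing.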
The word liquid in the name comes from the analogy drawn to dropping a stone into a still body of water or other liquid. The falling stone will generate ripples in the liquid. The input (motion of the falling stone) has been converted into a spatio-temporal pattern of liquid displacement (ripples).
LSMs have been put forward as a way to explain the operation of brains. LSMs are argued to be an improvement over the theory of artificial neural networks because:
- circuits are not hard-coded to perform a specific task;
- continuous-time inputs are handled "naturally";
- computations on various time scales can be done using the same network;
- the same network can perform multiple computations.
Criticisms of LSMs as used in computational neuroscience are that:
- LSMs do not actually explain how the brain functions; at best they can replicate some parts of its functionality;
- there is no guaranteed way to dissect a working network and figure out what computations are being performed;
- there is very little control over the process.
If a reservoir has fading memory and input separability, then, with the help of a readout, it can be proven that the liquid state machine is a universal function approximator, using the Stone–Weierstrass theorem.[1]
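In symbols, a hedged sketch of the statement, with notation assumed here (modeled on the general form of the result in [1]): write the liquid as a filter L^M mapping the input stream u to a state, and the read-out as a memoryless map f^M,

\[
  x^M(t) = \bigl(L^M u\bigr)(t), \qquad y(t) = f^M\bigl(x^M(t)\bigr).
\]

The claim is that if the family of liquids separates input streams (input separability) and each liquid has fading memory, then for every time-invariant fading-memory filter F and every tolerance there is a machine M = (L^M, f^M) with

\[
  \sup_t \,\Bigl| (Fu)(t) - f^M\bigl((L^M u)(t)\bigr) \Bigr| < \varepsilon
\]

for all admissible inputs u, the density of such machines following from a Stone–Weierstrass argument.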
[1] Maass, Wolfgang; Markram, Henry (2004). "On the computational power of circuits of spiking neurons". Journal of Computer and System Sciences. 69 (4): 593–616.