Time-recurrent networks are considered. Synaptic plasticity is
defined by a simple Hebb rule. It is well known that this
Hebbian mechanism can support learning and memory.
We show that this plasticity is a versatile computational
instrument. In particular, the synaptic matrix can store
various kinds of information, both dynamic and static. For
example, the network can perform Fourier and wavelet
transforms and calculate probability distributions of unknown
parameters. Such networks can analyse and identify dynamics,
calculate likelihoods, perform autoregression, and so on. They
can even solve more sophisticated problems, for example the
decoding of fractal images.
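As a minimal illustration of the Hebbian storage mechanism mentioned above (a sketch, not the paper's actual network: the learning rate `eta`, the pattern size, and the one-step sign recall are assumptions for the example), the outer-product Hebb rule lets a synaptic matrix memorise a static pattern that is then recovered by the network dynamics:

```python
import numpy as np

def hebbian_update(W, x, y, eta=0.1):
    # Simple Hebb rule: a synapse strengthens in proportion to the
    # correlation of presynaptic (x) and postsynaptic (y) activity.
    return W + eta * np.outer(y, x)

# Store one binary pattern autoassociatively (y = x); sizes are illustrative.
rng = np.random.default_rng(0)
pattern = np.sign(rng.standard_normal(8))
W = np.zeros((8, 8))
for _ in range(10):
    W = hebbian_update(W, pattern, pattern)

# The stored pattern is a fixed direction of W: one step of the
# sign dynamics recovers it exactly.
recalled = np.sign(W @ pattern)
```

Here `W` converges to a scaled rank-one matrix proportional to the pattern's outer product, so the pattern is an eigenvector of `W` and recall is exact; richer (dynamic) information corresponds to more structured synaptic matrices.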