Submitted by twovests in programming

  1. deep learning neural networks are 'universal function approximators'.

  2. this means that, given enough input/output samples from a function and a big enough network, a neural network can approximate that function.

  3. maybe we can use this to encode a number of things?

e.g. 3D manifolds (models with a defined inside/outside) can be approximated: input is a point (x, y, z), output is whether that point is inside the model.
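a rough sketch of that idea with a tiny numpy net (the sphere, the network sizes, and the learning rate are all made-up choices, not anything canonical):

```python
import numpy as np

# hypothetical sketch: learn the occupancy function of a unit sphere,
# f(x, y, z) -> probability that the point is inside the "model"
rng = np.random.default_rng(0)

# training samples: random points in [-1.5, 1.5]^3, labeled inside/outside
pts = rng.uniform(-1.5, 1.5, size=(2000, 3))
labels = (np.linalg.norm(pts, axis=1) < 1.0).astype(float)

# tiny one-hidden-layer network, trained with plain gradient descent
W1 = rng.normal(0.0, 0.5, (3, 32)); b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.5, (32, 1)); b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2))), h

lr = 0.5
for _ in range(2000):
    p, h = forward(pts)
    err = p - labels[:, None]                  # gradient of BCE wrt the logit
    gW2 = h.T @ err / len(pts); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * (1.0 - h ** 2)         # backprop through tanh
    gW1 = pts.T @ dh / len(pts); gb1 = dh.mean(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# query the learned "model": the origin is inside, a far corner is not
inside = forward(np.array([[0.0, 0.0, 0.0]]))[0]
outside = forward(np.array([[1.4, 1.4, 1.4]]))[0]
```

the nice part is that once trained, the weights *are* the model, and you can query occupancy at any resolution you like.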

e.g. music: given a time (t), output the amplitude of the sample at that time.
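for the music version, the training set is just (time, amplitude) pairs. a sketch of what that data would look like (the sample rate and the stand-in 440 Hz tone are assumptions, not anything from a real song):

```python
import numpy as np

# hypothetical sketch: treat a clip of audio as a function f(t) = amplitude,
# and build the (input, output) pairs a network would be trained on
sr = 8000                                   # assumed sample rate (Hz)
t = np.arange(sr) / sr                      # one second of timestamps
amp = 0.5 * np.sin(2 * np.pi * 440 * t)     # stand-in "song": a 440 Hz tone

# each training example: input t_i, target amp_i
pairs = np.stack([t, amp], axis=1)
```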

you can also use supplementary data. given a time (t) and the amplitudes of the previous 5 samples, a neural network might be able to better predict the next sample. (feeding its own predictions back in could also make it diverge over time, giving a wonderfully distorting song).
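a sketch of that previous-5-samples setup, using the cheapest possible stand-in for a network (a linear predictor fit by least squares; the signal and all the sizes are made up):

```python
import numpy as np

# hypothetical sketch: learn "next sample from the previous 5 samples",
# then feed the model's own output back in to generate audio
rng = np.random.default_rng(0)
sr = 8000
t = np.arange(sr) / sr
amp = 0.5 * np.sin(2 * np.pi * 440 * t)     # stand-in song: a 440 Hz tone
amp = amp + 0.001 * rng.normal(size=sr)     # a little noise, like real audio

# windowed training pairs: (5 previous samples) -> next sample
k = 5
X = np.stack([amp[i:i + k] for i in range(len(amp) - k)])
y = amp[k:]

# stand-in for the network: a linear predictor via least squares
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# autoregressive rollout: seed with real samples, then use our own
# predictions. small errors compound here, which is where the
# distortion/divergence the post mentions would come from
buf = list(amp[:k])
for _ in range(1000):
    buf.append(float(np.dot(buf[-k:], w)))
generated = np.array(buf[k:])
```

swapping the least-squares predictor for an actual network keeps the same data pipeline; only the fit step changes.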

i might work on this later today instead of my school work if anybody is interested


Comments


voxpoplar wrote

I mean, this is basically just what human memory is, right?


twovests OP wrote

Maybe sort of?

There are probably some cool implications of studying artificial neural networks for real-life neural networks (ANNs are loosely based on biological NNs, after all), but I definitely don't know enough about either to say.
