# Using the Neural Net

[from Shannon Williams (960121.14:00)]

Sorry it has taken a week for me to respond. I was hoping to respond
yesterday, but our VAX was down so I could not get on-line. If you respond
to this post, and I do not answer today, then I probably will not answer
until next Saturday.

About 'Using Neural Nets' and 'prediction':

···

--------------------------------------------------------------------------
Rather than attempt to discuss your understanding of what I might
understand, let's start over. I will attempt only to explain to you a
method of generating memories using neural networks. I will not describe
the ramifications of implementing this method.

I called this memory generation 'prediction'. To me it looks like
prediction. But look at it for yourself, and perhaps you will name it
something else.

For right now, concentrate on seeing how the neural network can be used to
generate memories. Wipe from your mind any other use of the network. Once
you see how it can be used to generate memories, you will be able to answer
for yourself most of the questions that you demanded of me.

Below is part of the diagram that Bill Powers (960110.1730) drew to explain
how he visualizes using neural networks. I assume that if (output) is
unequal to R2 then the 'neural net' modifies its connections until (output)
does equal R2.

    input---> neural net --------> (output)
                 /|\                  |
                  |                   |
                  C <-----------------
                 /|\
                  |
                  R2
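The loop in the diagram can be sketched in code. This is my own minimal construction, not something from the original post: a single linear unit stands in for the 'neural net', the comparator C measures the difference between (output) and R2, and the delta rule plays the role of "modifies its connections until (output) does equal R2". The input pattern `x` and reference `r2` are hypothetical values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_until_match(x, r2, lr=0.1, tol=1e-6, max_steps=10_000):
    """Adjust weights w until the net's output for input x equals r2."""
    w = rng.normal(size=x.shape)
    for _ in range(max_steps):
        output = w @ x            # neural net: output = w . x
        error = r2 - output       # comparator C: how far output is from R2
        if abs(error) < tol:
            break
        w += lr * error * x       # modify the connections to shrink the error
    return w

x = np.array([1.0, 0.5, -0.2])    # hypothetical input pattern
r2 = 0.7                          # hypothetical reference value R2
w = train_until_match(x, r2)
print(abs(w @ x - r2) < 1e-5)     # prints True: output now equals R2
```

Nothing here depends on the unit being linear; any trainable network that reduces the C signal to zero fits the diagram.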

The change that I would make to this diagram is simply to define a
relationship between R2 and input such that:

    input---> neural net --------> (output)
     /|\         /|\                  |
      |           |                   |
    delay         C <-----------------
      |          /|\
      ----<-------|
                  |
                  R2

So now:

input = input(t=0)
R2 = input(t>0)

In other words, if you were saying the alphabet then:

if input = 'A' then R2 = 'B'.

The point is that the ultimate input in this diagram is R2, not 'input'.
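The delayed loop can also be sketched in code. Again this is my own hedged construction, not the post's: letters are one-hot vectors, a single linear layer is trained so that output(t) = input(t+1) (that is, R2), and then the delay feeds each output back in as the next input, so the net replays the memorized sequence on its own.

```python
import numpy as np

letters = "ABCDE"
n = len(letters)

def one_hot(i):
    """Represent letter i as a one-hot vector."""
    v = np.zeros(n)
    v[i] = 1.0
    return v

# Training pairs: input = letter(t), R2 = letter(t+1), e.g. 'A' -> 'B'.
W = np.zeros((n, n))
for i in range(n - 1):
    x, r2 = one_hot(i), one_hot(i + 1)
    # With one-hot inputs, a single delta-rule step stores each pair exactly.
    W += np.outer(r2 - W @ x, x)

# Recall: start at 'A'; the delay feeds each output back as the next input.
state = one_hot(0)
recalled = ["A"]
for _ in range(n - 1):
    state = W @ state              # the net generates the next letter
    recalled.append(letters[int(state.argmax())])

print("".join(recalled))           # prints "ABCDE"
```

Once trained, the only external input needed is the starting letter; everything after that is generated from R2 through the delay, which is why the sequence behaves like a memory.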

What are your questions so far? Do you see how memory is created?

-Shannon