The weight change each learning rule prescribes for the four combinations of pre-synaptic input and post-synaptic output (these values follow from the outer products in hpallo.m below):

pre-synaptic   post-synaptic   Hebb   post-synaptic   pre-synaptic   Hopfield
   input          output       rule        rule           rule      covariance
     1               1           1          1               1            1
     1               0           0          0              -1           -1
     0               1           0         -1               0           -1
     0               0           0          0               0            1
% hpallo.m, makes all of the
% Hebb (HB), post-synaptic (PO),
% pre-synaptic (PR), and Hopfield
% covariance (CM) matrices using
% patterns in matrix P where P[m,n]
% has m patterns of n unit states;
% connectivity matrices are [n,n];
% outer-product method is used
%
[m,n] = size(P);
HB = zeros(n);
PO = zeros(n);
PR = zeros(n);
CM = zeros(n);
for k=1:m,
   hbdw = (P(k,:))' * (P(k,:));
   podw = (P(k,:))' * ((2*P(k,:))-1);
   prdw = ((2*P(k,:))-1)' * (P(k,:));
   cvdw = ((2*P(k,:))-1)' * ((2*P(k,:))-1);
   HB = HB + hbdw;
   PO = PO + podw;
   PR = PR + prdw;
   CM = CM + cvdw;
end,
MSK = (ones(n) - eye(n));
HB = HB .* MSK;
PO = PO .* MSK;
PR = PR .* MSK;
CM = CM .* MSK;
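For readers working outside MATLAB, the same outer-product construction can be sketched in Python with NumPy. The function name hpallo and the four matrix names are carried over from the script above; everything else is an illustrative translation, not part of the original tutorial.

```python
import numpy as np

def hpallo(P):
    """Build Hebb (HB), post-synaptic (PO), pre-synaptic (PR), and
    Hopfield covariance (CM) connection matrices from the 0/1 pattern
    matrix P (one pattern per row), zeroing the diagonal."""
    m, n = P.shape
    HB = np.zeros((n, n))
    PO = np.zeros((n, n))
    PR = np.zeros((n, n))
    CM = np.zeros((n, n))
    for p in P:                      # p is one stored pattern (0/1)
        b = 2 * p - 1                # the same pattern in -1/+1 form
        HB += np.outer(p, p)         # Hebb: post * pre
        PO += np.outer(p, b)         # post-synaptic: post * (2*pre - 1)
        PR += np.outer(b, p)         # pre-synaptic: (2*post - 1) * pre
        CM += np.outer(b, b)         # Hopfield covariance rule
    mask = 1 - np.eye(n)             # no self-connections
    return HB * mask, PO * mask, PR * mask, CM * mask
```

Rows index the post-synaptic unit and columns the pre-synaptic unit, matching the MATLAB convention in which the state update is M*v.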
P = [ 1 1 1 1 0 0 0 0 0 0;
      0 0 0 0 0 0 1 1 1 1]

P =
  1  1  1  1  0  0  0  0  0  0
  0  0  0  0  0  0  1  1  1  1

hpallo
HB   % the Hebbian matrix is all positive

HB =
  0  1  1  1  0  0  0  0  0  0
  1  0  1  1  0  0  0  0  0  0
  1  1  0  1  0  0  0  0  0  0
  1  1  1  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  1  1  1
  0  0  0  0  0  0  1  0  1  1
  0  0  0  0  0  0  1  1  0  1
  0  0  0  0  0  0  1  1  1  0

PO   % the post-synaptic matrix is positive and negative

PO =
  0  1  1  1 -1 -1 -1 -1 -1 -1
  1  0  1  1 -1 -1 -1 -1 -1 -1
  1  1  0  1 -1 -1 -1 -1 -1 -1
  1  1  1  0 -1 -1 -1 -1 -1 -1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
 -1 -1 -1 -1 -1 -1  0  1  1  1
 -1 -1 -1 -1 -1 -1  1  0  1  1
 -1 -1 -1 -1 -1 -1  1  1  0  1
 -1 -1 -1 -1 -1 -1  1  1  1  0

PR   % the pre-synaptic matrix is also positive/negative

PR =
  0  1  1  1  0  0 -1 -1 -1 -1
  1  0  1  1  0  0 -1 -1 -1 -1
  1  1  0  1  0  0 -1 -1 -1 -1
  1  1  1  0  0  0 -1 -1 -1 -1
 -1 -1 -1 -1  0  0 -1 -1 -1 -1
 -1 -1 -1 -1  0  0 -1 -1 -1 -1
 -1 -1 -1 -1  0  0  0  1  1  1
 -1 -1 -1 -1  0  0  1  0  1  1
 -1 -1 -1 -1  0  0  1  1  0  1
 -1 -1 -1 -1  0  0  1  1  1  0

CM   % the Hopfield matrix is positive/negative at double the strength of the others

CM =
  0  2  2  2  0  0 -2 -2 -2 -2
  2  0  2  2  0  0 -2 -2 -2 -2
  2  2  0  2  0  0 -2 -2 -2 -2
  2  2  2  0  0  0 -2 -2 -2 -2
  0  0  0  0  0  2  0  0  0  0
  0  0  0  0  2  0  0  0  0  0
 -2 -2 -2 -2  0  0  0  2  2  2
 -2 -2 -2 -2  0  0  2  0  2  2
 -2 -2 -2 -2  0  0  2  2  0  2
 -2 -2 -2 -2  0  0  2  2  2  0
function OUT = hpsy(v,M,itn)
% this function computes synchronous
% iterations through autoassociative
% networks where v is the state vector,
% M is the connectivity matrix, itn is
% the desired number of iterations and
% OUT is a matrix to hold v through time
%
[m,n] = size(M);
OUT = zeros(m,itn);
for i=1:itn,
   v = M*v;
   v = v>0;
   OUT(:,i) = v;
end;
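An equivalent of hpsy in Python/NumPy, as a sketch; the hard threshold at zero matches the MATLAB code above:

```python
import numpy as np

def hpsy(v, M, itn):
    """Synchronous update: every unit recomputes its state from M*v
    at the same time; column i of OUT holds the state after iteration i."""
    OUT = np.zeros((M.shape[0], itn))
    v = np.asarray(v, dtype=float)
    for i in range(itn):
        v = (M @ v > 0).astype(float)  # threshold the net input at zero
        OUT[:, i] = v
    return OUT
```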
in1 = P(1,:)';
in2 = P(2,:)';

OHB = hpsy(in1,HB,10)

OHB =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPO = hpsy(in1,PO,10)

OPO =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPR = hpsy(in1,PR,10)

OPR =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OCM = hpsy(in1,CM,10)

OCM =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
v = [1 1 0 0 0 0 0 0 0 0]';

OHB = hpsy(v,HB,10)

OHB =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPO = hpsy(v,PO,10)

OPO =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPR = hpsy(v,PR,10)

OPR =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OCM = hpsy(v,CM,10)

OCM =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

All the networks show pattern completion.
v = [0.1 0 0 0 0 0 0 0 0 0]';

OHB = hpsy(v,HB,10)

OHB =
  0  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPO = hpsy(v,PO,10)

OPO =
  0  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPR = hpsy(v,PR,10)

OPR =
  0  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OCM = hpsy(v,CM,10)

OCM =
  0  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
v = [1 1 1 1 0 0 0 0 0 1]';

OHB = hpsy(v,HB,10)

OHB =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  1  1  1  1  1  1  1  1  1

OPO = hpsy(v,PO,10)

OPO =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OPR = hpsy(v,PR,10)

OPR =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OCM = hpsy(v,CM,10)

OCM =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

With this noisy cue (pattern one plus one extra active unit), the Hebb network recalls both stored patterns merged together, while the other three networks suppress the noise and recall pattern one alone.
P = [1 0 1 0 1 0 1 0 1 0;
     1 1 1 1 1 0 0 0 0 0];
hpallo

v = [1 1 1 1 1 0 0 0 0 0]';

OHB = hpsy(v,HB,10)   % pattern two is not stable in the Hebb network

OHB =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0

OPO = hpsy(v,PO,10)   % nor is it stable in the post-synaptic network

OPO =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0

OPR = hpsy(v,PR,10)   % the pre-synaptic network is stable

OPR =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0

OCM = hpsy(v,CM,10)   % the Hopfield network is also stable

OCM =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
In this example, the patterns overlap, making them difficult for some nets to learn to differentiate.
The Hebb network fails to differentiate because it lacks inhibition.
The post-synaptic network can't differentiate because its rule produces no weight change in cases where the pre-synaptic element is "on" but the post-synaptic element should be "off."
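The four rules can be compared directly on that troublesome case. This small Python sketch (the function name dw is illustrative, not from the tutorial) computes the weight change each rule prescribes for a given 0/1 pre- and post-synaptic state:

```python
def dw(pre, post):
    """Weight change of each learning rule for 0/1 pre- and
    post-synaptic states, matching the outer products in hpallo.m."""
    return {'Hebb':          post * pre,
            'post-synaptic': post * (2 * pre - 1),
            'pre-synaptic':  (2 * post - 1) * pre,
            'Hopfield':      (2 * post - 1) * (2 * pre - 1)}

# the troublesome case: pre-synaptic "on" while post-synaptic should be "off"
changes = dw(1, 0)
```

Here dw(1, 0) gives 0 for both the Hebb and post-synaptic rules but -1 for the pre-synaptic and Hopfield rules, which is exactly why only the latter two can learn to suppress overlapping patterns.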
P = [1 0 1 0 1 0 1 0 1 0;
     1 0 0 1 1 1 1 0 0 1];
hpallo

v = [0 0 1 0 1 0 1 0 1 0]';
OCM = hpsy(v,CM,10)

OCM =
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0

v = [0 0 0 0 1 0 1 0 1 0]';
OCM = hpsy(v,CM,10)

OCM =
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  0  1  0  1  0  1  0  1  0
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  1  0  1  0  1  0  1  0  1
  0  0  0  0  0  0  0  0  0  0

With the weaker cue the network falls into an oscillation: units 3 and 9 flip between 0 and 1 on alternate iterations because all of the units update at once.
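The oscillation is a generic hazard of synchronous updating, not something specific to this covariance matrix. A minimal two-unit Python sketch (illustrative, not from the tutorial) shows it:

```python
import numpy as np

M = np.array([[0., 1.],
              [1., 0.]])       # two mutually excitatory units, no self-connections
v = np.array([1., 0.])         # start with only unit 1 active

# under synchronous updating both units change at once, so the
# activity swaps back and forth forever instead of settling
states = []
for _ in range(4):
    v = (M @ v > 0).astype(float)
    states.append(v.copy())
```

Each step hands unit 1's activity to unit 2 and vice versa, so the state alternates between [0, 1] and [1, 0]; asynchronous updating breaks this symmetry.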
function OUT = hpcas(v,M,itn)
% this function computes asynchronous
% iterations of autoassociative networks,
% where v is the state vector, M is the
% connectivity matrix, itn is the number
% of iterations and OUT is a matrix
% to hold v over time; some units will
% update before others have had a turn;
% the state is saved every 10th iteration
%
[m,n] = size(M);
OUT = zeros(m,itn/10);
for i=1:itn,
   [sdrv rindx] = max(rand(1,n));
   v(rindx) = M(rindx,:)*v;
   v = v>0;
   if rem(i,10)==0,
      OUT(:,i/10) = v;
   end,
end;
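A Python/NumPy counterpart of the asynchronous routine, as a sketch; the rng argument is added for reproducibility and is not in the MATLAB original:

```python
import numpy as np

def hpcas(v, M, itn, rng=None):
    """Asynchronous update: one randomly chosen unit recomputes its
    state per iteration; the state is recorded every 10th iteration."""
    rng = np.random.default_rng() if rng is None else rng
    n = M.shape[0]
    v = np.asarray(v, dtype=float).copy()
    OUT = np.zeros((n, itn // 10))
    for i in range(1, itn + 1):
        j = rng.integers(n)              # pick one unit at random
        v[j] = float(M[j, :] @ v > 0)    # update only that unit
        if i % 10 == 0:
            OUT[:, i // 10 - 1] = v
    return OUT
```

One difference to note: the MATLAB script thresholds the whole vector on every pass (v = v>0), while this sketch thresholds only the unit that was updated; for 0/1 state vectors the two behave the same after the first few updates.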
OCM = hpcas(v,CM,100)

OCM =
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
v = [0.4 0.3 0.5 0.3 0.4 0.2 0.1 0.1 0.2 0.1]';   % weak and noisy pattern 1

OCM = hpcas(v,CM,100)

OCM =
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  1  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
Sometimes the pattern is correctly recalled, as here; on other runs the same cue can generate the wrong output pattern:
OCM = hpcas(v,CM,100)

OCM =
  0  0  0  0  0  0  0  0  0  0
  1  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  0  0  0  0  0  0  0  0  0  0
  1  0  0  0  0  0  0  0  0  0
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
  1  1  1  1  1  1  1  1  1  1
function RP = hprnpt(m,n,prob)
% function RP = hprnpt(m,n,prob)
% this function will generate a set of
% m random 0-1 patterns n states long,
% with a probability prob that any
% state will be a 1
%
RP = rand(m,n);
RP = RP < prob;
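The random-pattern generator translates directly; again a sketch, with an rng argument added for reproducibility:

```python
import numpy as np

def hprnpt(m, n, prob, rng=None):
    """Generate m random 0/1 patterns of length n; each state is 1
    with probability prob."""
    rng = np.random.default_rng() if rng is None else rng
    return (rng.random((m, n)) < prob).astype(float)
```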
We also need a way to verify the network's output for a particular pattern:
% script hpcmvr.m
% this script will verify learning
% in the Hopfield covariance network;
% it will initialize the network to
% 0.5 times the idx row of P, update
% 100 times and print out the idx row
% of P next to the final state of
% the network
%
vh = P(idx,:)';
v = vh*(0.5);
O = hpcas(v,CM,100);
[vh O(:,10)]
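The verification step can be sketched in Python as a self-contained function; the asynchronous update loop is inlined, the names mirror the script above, and the signature is illustrative:

```python
import numpy as np

def hpcmvr(P, CM, idx, itn=100, rng=None):
    """Cue the network with 0.5 times pattern idx, run itn asynchronous
    updates, and return the stored pattern beside the final state."""
    rng = np.random.default_rng() if rng is None else rng
    v = 0.5 * np.asarray(P[idx], dtype=float)
    for _ in range(itn):
        j = rng.integers(CM.shape[0])
        v[j] = float(CM[j, :] @ v > 0)
    return np.column_stack([P[idx], v])  # stored pattern | final state
```

A run is judged correct when the two columns of the returned array match.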
P = hprnpt(2,30,0.5)
hpallo
idx = 1;   % choose pattern 1
hpcmvr     % check recall

ans =
  1  1
  1  1
  1  1
  1  1
  0  0
  1  1
  0  0
  1  1
  0  0
  0  0
  1  1
  1  1
  0  0
  0  0
  0  0
  1  1
  1  1
  1  1
  0  0
  0  0
  0  0
  1  1
  1  1
  0  0
  1  1
  0  0
  1  1
  1  1
  1  1
  0  0

Two patterns are easily stored in 30 units. Now, test with 10 patterns:
P = hprnpt(10,30,0.5);
hpallo
idx = 1;
hpcmvr

ans =
  1  1
  0  0
  0  1
  1  1
  0  1
  1  0
  1  1
  0  0
  1  0
  1  1
  1  1
  0  0
  0  0
  0  1
  0  0
  1  1
  1  1
  1  1
  0  1
  0  1
  1  0
  0  0
  0  0
  0  0
  1  1
  1  0
  1  0
  0  1
  0  1
  1  1

30-unit Hopfield nets begin to show incorrect recalls with about 5 patterns, so the capacity of Hopfield nets is around 0.15 patterns per unit.
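The capacity estimate can be probed systematically. The sketch below (illustrative Python, assuming the same covariance rule and asynchronous updating as above; the function names are not from the tutorial) stores random patterns and checks whether a half-strength cue of pattern 1 is recovered exactly:

```python
import numpy as np

def recall_ok(n_units, n_patterns, rng, updates=600):
    """Store random 0/1 patterns in a Hopfield covariance matrix,
    cue with 0.5 * pattern 1, update asynchronously, and report
    whether pattern 1 is recovered exactly."""
    P = (rng.random((n_patterns, n_units)) < 0.5).astype(float)
    B = 2 * P - 1                            # patterns in -1/+1 form
    CM = (B.T @ B) * (1 - np.eye(n_units))   # covariance matrix, no self-connections
    v = 0.5 * P[0].copy()
    for _ in range(updates):
        j = rng.integers(n_units)
        v[j] = float(CM[j] @ v > 0)
    return bool(np.array_equal(v, P[0]))

# estimate the recall rate at a given loading, e.g. 10 patterns in 30 units
rng = np.random.default_rng(1)
rate = np.mean([recall_ok(30, 10, rng) for _ in range(20)])
```

Sweeping n_patterns upward from 2 and watching where the recall rate collapses reproduces the kind of capacity estimate quoted above.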
Problem 11
With 3 patterns, the Hopfield net can recall the pattern reliably:
P = hprnpt(3,30,0.5);
hpallo
idx = 1;
hpcmvr

ans =
  1  1
  0  0
  1  1
  0  0
  0  0
  1  1
  1  1
  0  0
  1  1
  1  1
  0  0
  0  0
  1  1
  0  0
  0  0
  1  1
  1  1
  1  1
  1  1
  0  0
  1  1
  0  0
  0  0
  1  1
  0  0
  1  1
  1  1
  0  0
  1  1
  1  1

The following script will lesion the connectivity matrix with a chosen probability prob:
% hplscm.m
% this script will lesion the
% Hopfield covariance matrix CM
% by changing each element to zero
% with probability prob
%
[m,n] = size(CM);
mask = rand(m) > prob;
CM = CM .* mask;
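The lesioning step in Python, as a sketch; it is written as a function rather than a script, with an rng argument added for reproducibility:

```python
import numpy as np

def lesion(CM, prob, rng=None):
    """Return a copy of CM with each connection independently zeroed
    with probability prob, as in hplscm.m."""
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(CM.shape) > prob   # keep a weight with probability 1 - prob
    return CM * mask
```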
prob = 0.5;
hplscm
idx = 1;
hpcmvr

ans =
  1  1
  0  1
  1  0
  0  0
  0  0
  1  1
  1  0
  0  0
  1  1
  1  1
  0  1
  0  1
  1  1
  0  1
  0  0
  1  1
  1  1
  1  1
  1  0
  0  1
  1  0
  0  1
  0  0
  1  1
  0  0
  1  0
  1  1
  0  0
  1  1
  1  1

This pattern was not correctly recalled. Network performance degrades gracefully as more and more weights are eliminated; it becomes noticeably affected after about 40-60% of the connections are lesioned.
Distributed representations are robust!
Thomas J. Anastasio tstasio@uiuc.edu