# Thread Subject: Bengali character recognition

**Subject:** Bengali character recognition
**From:** asdf
**Date:** 27 Jul, 2012 10:31:12
**Message:** 1 of 2

Hi,

We are doing a Bengali character recognition project. There are 69 character patterns, and we have formed 16x16 skeleton images of all characters. So our input is P = 256x69 matrix, and the target is T = 69x69 matrix (from `eye(69)`). Here is our neural network code:

```matlab
S1 = 69;
[R,Q]  = size(input_pattern);
[S2,Q] = size(target);
P = input_pattern;               % 256x69 matrix
T = target;                      % target = eye(69)

net = newff(P, T, [S1 S2], {'logsig' 'logsig'}, 'traingdx');
net = init(net);
net.performFcn          = 'sse';
net.trainParam.goal     = 0.000001;
net.trainParam.epochs   = 700;
net.trainParam.max_fail = 374;
net = train(net, P, T);

[row, col] = size(P);
for i = 1:10
    P   = P + randn(row, col);
    net = train(net, P, T);     % training with noise-corrupted inputs
end

% Examine with the training data to check whether the network is trained
for i = 1:col                   % was "1:aa" -- aa was never defined
    A2     = sim(net, P(:,i));
    A2     = compet(A2);
    answer = find(A2 == 1)
end
```

After executing this code we find that it does not even work on the training data. If it cannot recognize the training data, how will it recognize unseen data? I have tried different numbers of hidden neurons. Please help.
**Subject:** Bengali character recognition
**From:** Greg Heath
**Date:** 11 Aug, 2012 22:28:08
**Message:** 2 of 2

"asdf" wrote:

> Hi, we are doing a Bengali character recognition project. There are 69 character patterns. We have formed 16x16 skeleton images of all characters, so our input P = 256x69 matrix and target T = 69x69 matrix (by eye(69)). Now here is our neural network code...
>
> S1 = 69;

Why?

```matlab
[I, N] = size(P)          % [256 69]
[O, N] = size(T)          % [ 69 69]
classindex = vec2ind(T)   % 1:69
```

For an I-H-O net, the number of unknown weights is

```matlab
Nw = (I+1)*H + (H+1)*O    % = 257*69 + 70*69 = 22563
```

but the number of training equations is only

```matlab
Neq = N*O                 % = 69*69 = 4761 ~ Nw/4.7
```

Therefore you need one or more of:

1. Fewer hidden nodes
2. More data (adding random noise has merit)
3. Validation stopping
4. Regularization (`trainbr`)

> [R,Q] = size(input_pattern);
> [S2,Q] = size(target);
> P = input_pattern; % 256x69 matrix
> T = target; % target = eye(69);
>
> net = newff(P,T,[S1 S2],{'logsig' 'logsig'},'traingdx');

No. S2 is obtained automatically from T:

```matlab
net = newff(P, T, S1, {'tansig' 'softmax'}, 'trainscg');  % best for classification
```

> net = init(net);

Unnecessary. `newff` weight initialization is automatic.

> net.performFcn = 'sse';
> net.trainParam.goal = 0.000001;
> net.trainParam.epochs = 700;
> net.trainParam.max_fail = 374;

No. The validation-stop default is 6. Why not delete the last four statements and rely on the defaults?
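Greg's counting argument (unknown weights vs. training equations) can be checked with simple arithmetic. A minimal sketch in Python, using the dimensions from the thread (I = 256 inputs, O = 69 outputs, N = 69 training examples):

```python
# Capacity check for an I-H-O feedforward net, per Greg's formulas.
I, O, N = 256, 69, 69

def n_weights(H):
    # (I+1)*H input-to-hidden weights (including bias) plus
    # (H+1)*O hidden-to-output weights (including bias).
    return (I + 1) * H + (H + 1) * O

n_eq = N * O                     # number of training equations

print(n_weights(69))             # 22563 unknowns with H = 69
print(n_eq)                      # 4761  equations (~4.7x fewer)

# Largest H for which the net is not over-parameterized (Nw <= Neq):
h_max = max(h for h in range(1, N) if n_weights(h) <= n_eq)
print(h_max)                     # 14
```

So with only 69 training columns, anything beyond roughly 14 hidden nodes leaves the training problem underdetermined, which is why Greg recommends fewer hidden nodes, more data, validation stopping, or regularization.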
> net = train(net, P, T);

```matlab
[net, tr] = train(net, P, T);  % tr contains the training record

% For all data (train/val/test):
Y       = net(P);
classes = vec2ind(Y);
numerr  = sum(classes ~= classindex);  % was numel(...), which always returns N
Pcterr0 = 100*numerr/N
```

For separate train/val/test results, use `tr`.

> [row col] = size(P);
> for i = 1:10
>     P = P + randn(row, col);
>     net = train(net, P, T); % training with errored value

```matlab
    [net, tr] = train(net, P, T);  % need tr
    % Need to calculate the error rate Pcterr(i) for each loop
```

> end
> % examine with the training data to check network is trained or not

No. See `classes` and `vec2ind` above.

Hope this helps.

Greg
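Greg's `vec2ind`-based error count is just a column-wise argmax over the network outputs. A hedged NumPy sketch of the same computation, with a made-up output matrix `Y` standing in for the trained network's response (the true class is artificially made to win each column, so the error count comes out zero):

```python
import numpy as np

# Toy stand-in for the O x N network output: each column holds one
# sample's class scores. MATLAB's vec2ind is a 1-based column argmax.
rng = np.random.default_rng(0)
N, O = 69, 69
classindex = np.arange(1, N + 1)           # true classes 1..69 (T = eye(69))

Y = rng.random((O, N))
Y[classindex - 1, np.arange(N)] += 1.0     # force the true class to win

classes = Y.argmax(axis=0) + 1             # equivalent of vec2ind(Y)
numerr  = np.sum(classes != classindex)    # count mismatches, not numel(...)
pcterr  = 100 * numerr / N
print(numerr, pcterr)                      # 0 0.0
```

The key point Greg is making: compare the predicted class indices against `classindex` and *count* the mismatches (`sum` in MATLAB, `np.sum` here); `numel` of the comparison would always return N regardless of accuracy.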