java.lang.Object
  +--edu.wlu.cs.levy.SNARLI.BPLayer
BPLayer is a class supporting creation and training of layered neural networks through back-propagation. Sigma-Pi connections and back-prop-through-time are also supported. Networks are built implicitly by connecting layers. Public methods use double-precision floating point arrays; internal computation uses the package JLinAlg.
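Since networks are built implicitly by connecting layers, a typical program creates layers by size, wires them with connect() on the downstream layer, attaches input and target patterns, and trains. A minimal sketch for a 2-2-1 feed-forward network learning XOR, assuming the SNARLI jar is on the classpath; the layer sizes, seeds, learning rate, momentum, and epoch count are illustrative choices, not values prescribed by the library:

```java
import edu.wlu.cs.levy.SNARLI.BPLayer;

public class XorSketch {
    public static void main(String[] args) throws Exception {
        // Build a 2-2-1 network: layers are created by size, then wired
        // with connect() on the downstream layer.
        BPLayer input  = new BPLayer(2);
        BPLayer hidden = new BPLayer(2);
        BPLayer output = new BPLayer(1);
        hidden.connect(input);
        output.connect(hidden);

        // Attach input patterns and targets (XOR truth table).
        input.attach(new double[][] {{0,0},{0,1},{1,0},{1,1}});
        output.attach(new double[][] {{0},{1},{1},{0}});

        // Initialize weights, then train in batch mode:
        // 10000 epochs, eta = 0.5, mu = 0.9, report every 1000 epochs.
        hidden.randomize(1L);
        output.randomize(2L);
        output.batch(10000, 0.5, 0.9, 1000);

        // Run the trained network over the attached input patterns.
        double[][] out = output.test();
        for (double[] row : out) System.out.println(row[0]);
        System.out.println("RMS error: " + output.getRMSError());
    }
}
```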
Constructor Summary

BPLayer(int siz)
    Creates a neural network layer.
Method Summary

double actdev(double xact)
    Applies the first derivative of the layer's activation function to the activation on a unit (not the net input).
double actfun(double xnet)
    Applies the layer's activation function to a net input.
double[] activate(edu.wlu.cs.levy.SNARLI.BPLayer src, double[] clmp)
    Activates this layer by clamping activations on another layer and running a forward pass.
void attach(double[][] pattern)
    Attaches a pattern to the layer as input or target.
void batch(double eta, double mu)
    Runs one step of back-propagation in batch mode.
void batch(int nep, double eta, double mu)
    Same as batch(nep, eta, mu, report), but with an error report every second.
void batch(int nep, double eta, double mu, int report)
    Trains all layers in this layer's network, using back-propagation in batch mode.
void bpttPattern()
    Steps through one pattern for Back-Prop-Through-Time.
void bpttPattern(int n)
    Steps through one pattern for Back-Prop-Through-Time.
void bpttResetEta()
    Resets current weight- and bias-changes for Back-Prop-Through-Time.
void bpttUpdate(double eta, double mu, int npat)
    Updates the weights and bias on the layer using the Delta Rule, for Back-Prop-Through-Time.
void connect(edu.wlu.cs.levy.SNARLI.BPLayer from)
    Makes a normal (full, Sigma) connection to this layer from another layer.
void connect(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2)
    Makes a Sigma-Pi connection to this layer from two other layers.
void delay(edu.wlu.cs.levy.SNARLI.BPLayer from)
    Same as delay(from, weight), with weight = 1.0.
void delay(edu.wlu.cs.levy.SNARLI.BPLayer from, double weight)
    Makes a one-to-one time-delay connection to this layer from the specified layer, using the specified connection strength.
double dontCare()
    Returns the out-of-bounds value for the don't-care condition.
double[] getBias()
    Returns the bias on this layer.
double getMaxError()
    Returns the error of maximum magnitude on the layer over all training patterns.
double getRMSError()
    Returns the Root-Mean-Squared error on the layer over all training patterns.
double[][] getSquaredErrors()
    Returns the squared errors on the layer over all training patterns.
double[][] getWeights(edu.wlu.cs.levy.SNARLI.BPLayer from)
    Returns the weights on this layer from another layer (Sigma connection), as a 2D array.
double[][][] getWeights(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2)
    Returns the weights on this layer from two other layers (Sigma-Pi connection), as an array of 2D arrays.
void online(int nep, double eta, double mu)
    Same as online(nep, eta, mu, report), but with an error report every second.
void online(int nep, double eta, double mu, int report)
    Trains all layers in this layer's network, using back-propagation in on-line mode.
void randomize()
    Same as randomize(seed), with an arbitrary seed.
void randomize(long seed)
    Randomizes the biases and weights on this layer.
void randomize(java.util.Random rand)
    Randomizes the biases and weights on this layer to values normally distributed around zero.
static void reportValue(int iter, int maxit, int report, double value, java.io.PrintStream stream)
    Reports a value in a friendly way.
void resetMu()
    Resets previous weight- and bias-changes.
void setBias(double[] v)
    Sets the bias on this layer to the values in a vector.
void setWeights(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2, double[][][] w)
    Sets the weights on this layer from two others (Sigma-Pi connection) to the values in an array of 2D arrays.
void setWeights(edu.wlu.cs.levy.SNARLI.BPLayer from, double[][] w)
    Sets the weights on this layer from another layer (Sigma connection) to the values in a 2D array.
double[][] test()
    Tests a (trained) network layer.
double[][] test(int n, double a)
    Tests a (recurrent) network layer without any input.

Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Detail |
public BPLayer(int siz)
siz - number of units in the layer
Method Detail
public double actfun(double xnet)
xnet - net input to activation function
public double actdev(double xact)
xact - activation on unit
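The contract above, actfun taking the net input while actdev takes the unit's activation, matches the usual back-propagation shortcut for the logistic sigmoid, whose derivative can be written in terms of its own output: sigma'(x) = sigma(x) * (1 - sigma(x)). A sketch under the assumption that the layer uses the logistic activation (the actual activation function SNARLI uses is not stated on this page):

```java
public class SigmoidSketch {
    // Activation function applied to the net input.
    public static double actfun(double xnet) {
        return 1.0 / (1.0 + Math.exp(-xnet));
    }

    // Derivative expressed in terms of the activation y = actfun(xnet),
    // so the forward-pass output can be reused during back-propagation.
    public static double actdev(double xact) {
        return xact * (1.0 - xact);
    }

    public static void main(String[] args) {
        double y = actfun(0.0);          // 0.5
        System.out.println(y);
        System.out.println(actdev(y));   // 0.25, i.e. sigma'(0)
    }
}
```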
public double dontCare()
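dontCare() returns a sentinel value that can be placed in a target pattern so that a unit's error is ignored at that position. A sketch of the idea; the sentinel value and the exact skipping rule here are assumptions for illustration, not taken from this page (a real program would use the value returned by BPLayer.dontCare()):

```java
public class DontCareSketch {
    // Hypothetical out-of-bounds sentinel; the real value comes from
    // BPLayer.dontCare().
    public static final double DONT_CARE = -1e10;

    // Sum of squared errors over one pattern, skipping don't-care targets.
    public static double squaredError(double[] target, double[] actual) {
        double sum = 0.0;
        for (int i = 0; i < target.length; i++) {
            if (target[i] == DONT_CARE) continue;  // unit ignored
            double diff = target[i] - actual[i];
            sum += diff * diff;
        }
        return sum;
    }

    public static void main(String[] args) {
        double[] target = {1.0, DONT_CARE};
        double[] actual = {0.5, 0.9};
        // Only the first unit contributes: (1.0 - 0.5)^2 = 0.25.
        System.out.println(squaredError(target, actual));
    }
}
```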
public void randomize(long seed)
seed - seed for random number generator
public void randomize()
public void randomize(java.util.Random rand)
public void setWeights(edu.wlu.cs.levy.SNARLI.BPLayer from, double[][] w) throws edu.wlu.cs.levy.SNARLI.BPConnectException, edu.wlu.cs.levy.SNARLI.BPSizeException
from - the layer connected from
w - the matrix of weights
BPConnectException - if there is no such connection
BPSizeException - if array size mismatches layer sizes
public void setWeights(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2, double[][][] w) throws edu.wlu.cs.levy.SNARLI.BPConnectException, edu.wlu.cs.levy.SNARLI.BPSizeException
from1 - layer connected from
from2 - layer connected from
w - array of weights
BPConnectException - if there is no such connection
BPSizeException - if array size mismatches layer sizes
public void setBias(double[] v) throws edu.wlu.cs.levy.SNARLI.BPSizeException
v - the vector of bias values
BPSizeException - if vector size doesn't equal layer size
public double[][] getWeights(edu.wlu.cs.levy.SNARLI.BPLayer from) throws edu.wlu.cs.levy.SNARLI.BPConnectException, edu.wlu.cs.levy.SNARLI.BPInitException
from - the layer connected from
BPConnectException - if there is no such connection
BPInitException - if some network weights are uninitialized
public double[][][] getWeights(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2) throws edu.wlu.cs.levy.SNARLI.BPConnectException, edu.wlu.cs.levy.SNARLI.BPInitException
from1 - layer connected from
from2 - layer connected from
BPConnectException - if there is no such connection
BPInitException - if some network weights are uninitialized
public double[] getBias() throws edu.wlu.cs.levy.SNARLI.BPInitException
BPInitException - if the bias is uninitialized
public double getRMSError() throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
BPInitException - if the weights are uninitialized
BPSizeException - if number of input and output patterns differs
public double getMaxError() throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
BPInitException - if some network weights are uninitialized
BPSizeException - if number of input and output patterns differs
public double[][] getSquaredErrors() throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
BPInitException - if some network weights are uninitialized
BPSizeException - if number of input and output patterns differs
public void connect(edu.wlu.cs.levy.SNARLI.BPLayer from)
from - layer to connect from
public void connect(edu.wlu.cs.levy.SNARLI.BPLayer from1, edu.wlu.cs.levy.SNARLI.BPLayer from2)
from1 - layer to connect from
from2 - layer to connect from
public void attach(double[][] pattern) throws edu.wlu.cs.levy.SNARLI.BPSizeException
pattern - pattern to attach
BPSizeException - on width/size mismatch
public void batch(int nep, double eta, double mu, int report) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
nep - number of epochs for training
eta - learning rate
mu - momentum
report - number of generations between error reports
BPInitException - if some network weights are uninitialized
BPSizeException - if lengths of patterns differ
public void batch(int nep, double eta, double mu) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
nep - number of epochs for training
eta - learning rate
mu - momentum
BPInitException - if weights have not been initialized
BPSizeException - if number of input and output patterns differs
public void batch(double eta, double mu) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
eta - learning rate
mu - momentum
BPInitException - if some network weights haven't been initialized
BPSizeException - if number of input and output patterns differs
public void online(int nep, double eta, double mu, int report) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
nep - number of passes through data
eta - learning rate
mu - momentum
report - number of generations between error reports
BPInitException - if some network weights are uninitialized
BPSizeException - if lengths of patterns differ
public void online(int nep, double eta, double mu) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
nep - number of epochs for training
eta - learning rate
mu - momentum
BPInitException - if weights have not been initialized
BPSizeException - if number of input and output patterns differs
public double[][] test() throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
BPInitException - if some network weights are uninitialized
BPSizeException - if number of input and output patterns differs
public double[][] test(int n, double a) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
n - number of steps to run
a - initial activation value
BPInitException - if some network weights are uninitialized
BPSizeException - if number of input and output patterns differs
public void delay(edu.wlu.cs.levy.SNARLI.BPLayer from, double weight) throws edu.wlu.cs.levy.SNARLI.BPSizeException
from - layer to delay from
weight - connection strength
BPSizeException - if layers have different sizes
public void delay(edu.wlu.cs.levy.SNARLI.BPLayer from)
from - layer to delay from
public void resetMu()
public void bpttResetEta()
public void bpttUpdate(double eta, double mu, int npat) throws edu.wlu.cs.levy.SNARLI.BPMomentumException
eta - learning rate
mu - momentum
npat - total number of patterns
BPMomentumException - if no momentum has been set on the layer
public double[] activate(edu.wlu.cs.levy.SNARLI.BPLayer src, double[] clmp)
src - "source" layer to clamp
clmp - vector of clamping values
BPConnectException - if there is no path between the layers
BPInitException - if some network weights are uninitialized
public void bpttPattern(int n) throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException, java.lang.IllegalArgumentException
n - number of ticks from end of pattern
BPInitException - if some network weights are uninitialized
BPSizeException - if lengths of patterns differ
java.lang.IllegalArgumentException - if n < 0 or n >= current pattern length
public void bpttPattern() throws edu.wlu.cs.levy.SNARLI.BPInitException, edu.wlu.cs.levy.SNARLI.BPSizeException
BPInitException - if some network weights are uninitialized
BPSizeException - if lengths of patterns differ
public static void reportValue(int iter, int maxit, int report, double value, java.io.PrintStream stream)
iter - iteration number
maxit - maximum number of iterations
report - reporting interval
value - value to report
stream - print stream that reports value
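The training methods above all take a learning rate eta and a momentum mu, and bpttUpdate is documented as applying the Delta Rule. The conventional delta rule with momentum, which these parameters suggest (the exact update SNARLI applies is not shown on this page), is delta_w(t) = -eta * dE/dw + mu * delta_w(t-1). A sketch:

```java
public class DeltaRuleSketch {
    // One weight-change update: new delta = -eta * gradient + mu * previous
    // delta. The caller adds the returned delta to the weight.
    public static double delta(double eta, double mu, double grad, double prevDelta) {
        return -eta * grad + mu * prevDelta;
    }

    public static void main(String[] args) {
        double w = 0.0, prev = 0.0;
        double eta = 0.5, mu = 0.9;
        // Two steps along a constant gradient of 1.0: momentum compounds
        // the step (-0.5 first, then -0.95).
        for (int t = 0; t < 2; t++) {
            prev = delta(eta, mu, 1.0, prev);
            w += prev;
        }
        System.out.println(w);  // -1.45
    }
}
```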