Peter M. Williams
In many applications the network response ought to be invariant under a certain group of symmetries of the input space. More generally, the output vector should transform covariantly with the input vector. Given a geometric structure on the input space, an arbitrary network, and a correspondence between input and output symmetries, it is possible to generate an associated network exhibiting exactly the invariance or covariance properties required. Enforcing symmetries at the training stage necessarily has a computational cost. It is shown that, for suitable error and transfer functions, this cost can be reduced by indirect training of the original network on an extended training set. Training and exploitation of the networks described can be costly if large numbers of symmetries are involved. This is less significant in cases where the number of symmetries is relatively small, or where applications are bound by training rather than execution time. Moreover, evaluation of the network function for a given input vector is readily amenable to parallel computation.
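The construction of an invariant network from an arbitrary one can be illustrated with a minimal sketch. Everything here is an illustrative assumption rather than the paper's own construction: the toy "network" `net`, its made-up weights, and the choice of symmetry group (the four rotations of a 2x2 input grid, i.e. the cyclic group C4). Averaging the base network over the orbit of the input under the group yields a function that is exactly invariant under every group element.

```python
import math

# Toy "network": an arbitrary fixed nonlinear map on 4-dimensional inputs
# (stand-in for any trained network; the weights are made up).
WEIGHTS = [0.5, -1.2, 0.3, 2.0]

def net(x):
    return math.tanh(sum(w * xi for w, xi in zip(WEIGHTS, x)))

# Symmetry group: the four rotations of a 2x2 grid, acting on flattened
# inputs as index permutations (C4; an illustrative choice of group).
ROT = [1, 3, 0, 2]  # one quarter-turn of the flattened 2x2 grid

def apply(perm, x):
    return [x[i] for i in perm]

def orbit(x):
    # The four images of x under the rotation group.
    xs, g = [], x
    for _ in range(4):
        xs.append(g)
        g = apply(ROT, g)
    return xs

def net_invariant(x):
    # Group-average the base network over the orbit of x: rotating x
    # merely permutes the terms of the sum, so the result is exactly
    # invariant under every rotation in the group.
    return sum(net(g) for g in orbit(x)) / 4
```

Evaluating the base network once per group element makes the cost grow with the group's size, in line with the remark above; the four evaluations inside `net_invariant` are independent and so can be carried out in parallel.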
This paper is not available online