An Introduction to Mixed Order Hyper Networks
Kevin Swingler
University of Stirling
16 January 2019
14:15 - 15:15
Room 3.03
Earl Mountbatten Building
Abstract
This talk introduces Mixed Order Hyper Networks (MOHNs), a machine learning
technique that can be used for regression, as a generative network, or as a
surrogate fitness function in optimisation tasks. MOHNs can be considered a
type of high-order neural network with similarities to both MLPs and
Hopfield networks, but with qualities that both of those architectures lack.
This work concentrates on binary MOHNs, which map vectors of binary
variables onto a real-valued output.
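As a sketch of this general form (the notation is an assumption for
illustration, not taken from the talk itself), a MOHN's output can be
written as a weighted sum of products over subsets of the inputs:

    f(x) = \sum_{j} w_j \prod_{i \in S_j} x_i, \qquad x_i \in \{-1, +1\},

where each S_j is a subset of the input indices, w_j is that subset's
weight, and the size of S_j gives the order of the term.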
There are convex cost functions available for training MOHNs, meaning that
there are no local minima to trap the training algorithm.
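Because a model of the form above is linear in its weights, squared-error
training is a convex linear least squares problem. The following Python
sketch illustrates this under the assumptions already stated; the function
name mohn_features and the order-2 cutoff are illustrative choices, not the
talk's exact algorithm:

    import numpy as np
    from itertools import combinations

    # Illustrative sketch only: the output is linear in the weights w_j,
    # so squared-error training is convex, with a single global optimum.
    def mohn_features(X, max_order):
        """Products of input subsets up to max_order, plus a bias column."""
        n = X.shape[1]
        cols = [np.ones(len(X))]                     # order-0 (bias) term
        for k in range(1, max_order + 1):
            for subset in combinations(range(n), k):
                cols.append(np.prod(X[:, subset], axis=1))
        return np.column_stack(cols)

    rng = np.random.default_rng(0)
    X = rng.choice([-1, 1], size=(200, 6))           # binary inputs in {-1,+1}
    y = X[:, 0] * X[:, 1] + 0.5 * X[:, 2]            # toy target function

    Phi = mohn_features(X, max_order=2)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)      # convex least squares fit
    print("max |residual|:", np.abs(Phi @ w - y).max())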
Binary MOHNs represent a basis set, meaning that any binary function can be
modelled to arbitrary accuracy. The connectivity structure of a learned
MOHN is, to some extent, human readable, allowing insights into the
underlying function. This interpretability also makes the MOHN architecture
an interesting choice as a surrogate fitness function for metaheuristic
optimisation tasks.
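As a hedged sketch of the surrogate idea (again illustrative, not the
procedure presented in the talk), the fitted weights give a cheap stand-in
for an expensive fitness function that a simple hill climber can search,
reusing mohn_features and w from the sketch above:

    # Hill-climb on the fitted surrogate instead of the true fitness.
    def surrogate(x, w, max_order):
        return (mohn_features(x[None, :], max_order) @ w).item()

    def hill_climb(w, n, max_order, steps=200, seed=1):
        rng = np.random.default_rng(seed)
        x = rng.choice([-1, 1], size=n)
        for _ in range(steps):
            candidate = x.copy()
            candidate[rng.integers(n)] *= -1         # flip one bit
            if surrogate(candidate, w, max_order) > surrogate(x, w, max_order):
                x = candidate
        return x

    best = hill_climb(w, n=6, max_order=2)           # w from the fit above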
Experiments have shown that several standard benchmark combinatorial
fitness functions can be modelled in far fewer evaluations than other
search heuristics require to find an optimal solution. When sampling from a
noise-free fitness function, the number of fitness evaluations required to
build a model equals the number of parameters in the model, allowing a
lower bound on the number of required evaluations to be defined for a given
function.
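For a concrete (assumed) instance of this lower bound: a MOHN over n binary
variables that includes every term up to order k has sum_{j=0..k} C(n, j)
weights, so for n = 10 and k = 2 a noise-free model would need 56
evaluations:

    from math import comb

    # Parameter count = noise-free evaluations needed, per the claim above.
    n, k = 10, 2
    n_params = sum(comb(n, j) for j in range(k + 1))
    print(n_params)                                  # 1 + 10 + 45 = 56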
This talk will describe the structure and learning algorithms of MOHNs and
then give some examples of their use as surrogate fitness function models.
Host: Phil Bartie