
Using context to shift from line to point attractors in a bidirectional associative memory

Ms. Kinsey Church
School of Psychology, University of Ottawa, Canada
Matt Ross
University of Ottawa, Canada
Sylvain Chartier
University of Ottawa, Canada

One challenge for artificial neural networks is stabilizing on a desired response within a previously learned series of responses. This process is akin to going from a line attractor to a point attractor. Since a single pattern can lead to multiple outcomes, the network faces a one-to-many problem. Context, information given by the environment, is proposed as a way to differentiate between stimuli associated with themselves (point attractors) and those associated with the next pattern in the series (line attractors). To test this with multi-step pattern time series, a Bidirectional Associative Memory (BAM) is used with alphanumeric stimuli as inputs. These stimuli are arranged in three different series of increasing difficulty, where letters represent the stimuli and numbers represent the context: one long time series, two time series of different lengths, and three independent time series. Each of these time series has its own identifying numeric context. To determine which letter the BAM needs to converge on, the desired response in the specified context is compared with the output at each iteration during recall. When the desired response is reached, the context is changed, causing the network to switch attractors and therefore allowing the BAM to correctly stabilize on the desired output. This provides an effective solution to the one-to-many problem and allows the BAM to stabilize on the desired response, regardless of the length of the series or the level of correlation between stimuli. This could represent how the most effective behaviour is selected from a series of behaviours to solve a given problem.
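The control scheme described above can be illustrated with a toy associative memory. The sketch below is not the authors' BAM (their model uses its own learning and transmission rules): it assumes a simple Hebbian outer-product memory, mutually orthogonal bipolar patterns (rows of a Hadamard matrix) standing in for the alphanumeric stimuli, and context injected by element-wise binding. All names and parameters are invented for illustration. It shows the key idea: the same stimulus maps to the next pattern in the series under one context (line attractor) and to itself under another (point attractor), and a supervisory loop switches context once the desired response appears.

```python
import numpy as np

# Build an 8x8 Sylvester Hadamard matrix: its rows are mutually orthogonal
# bipolar (+1/-1) patterns, so outer-product learning recalls them exactly.
H = np.array([[1]])
for _ in range(3):
    H = np.block([[H, H], [H, -H]])

# Hypothetical stimuli and contexts (not the authors' actual patterns):
A, B, C = H[1], H[2], H[4]        # the learned series A -> B -> C
ctx_next, ctx_self = H[0], H[7]   # "advance" vs. "stabilize" context

# Hetero-associations for the weight matrix. Context is bound to each
# stimulus by element-wise multiplication, so (A, ctx_next) -> B encodes
# the line attractor (move along the series) and (A, ctx_self) -> A
# encodes the point attractor (stay put).
pairs = [(A * ctx_next, B), (B * ctx_next, C), (C * ctx_next, A),
         (A * ctx_self, A), (B * ctx_self, B), (C * ctx_self, C)]

W = sum(np.outer(y, x) for x, y in pairs)  # Hebbian outer-product learning

def recall(stimulus, ctx):
    """One feed-forward recall step under the given context."""
    return np.sign(W @ (stimulus * ctx))

# Supervisory loop from the abstract: iterate under ctx_next until the
# output matches the desired response, then switch context so the network
# falls into the point attractor and stabilizes there.
state, desired = A, C
trajectory = []
for _ in range(6):
    ctx = ctx_self if np.array_equal(state, desired) else ctx_next
    state = recall(state, ctx)
    trajectory.append(state)
```

Starting from A with C as the desired response, the network steps A, B, C under the "advance" context, then the context switch holds it at C on every subsequent iteration, mirroring the line-to-point attractor shift in the abstract.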



recurrent neural networks


Cognitive Modeling
