
## Complex Sine Wave Example

In this example, we generate a signal from the summation of five sinusoidal signals with randomly chosen amplitudes, periods, and phases. We then apply a simple supervised classifier model to make recursive and non-recursive predictions into the future.

For those who have joined the Knowm Development Community, the source code for the experiment is available under `SignalPredictionAppKtRam`.

We will go over the relevant parts of this app in the following tutorial.

## Signal Generation

This example uses a number of superimposed sine waves as its signal. The `ComplexSineSignalGenerator` produces a sine wave from the superposition of a number of random sub-sine waves.

The wave can be generated and queried as follows.
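The original code listing is not reproduced here, but the idea can be sketched briefly. Below is a minimal Python stand-in for `ComplexSineSignalGenerator` (the real class lives in the Knowm codebase and is written in Java; the amplitude, period, and phase ranges used here are illustrative assumptions):

```python
import math
import random

# Hypothetical sketch, NOT the Knowm implementation: a signal built from
# the superposition of several randomly parameterized sub-sine waves.
class ComplexSineSignal:
    def __init__(self, num_waves=5, seed=42):
        rng = random.Random(seed)
        # Randomly chosen (amplitude, period, phase) for each sub-wave.
        self.waves = [(rng.uniform(0.5, 2.0),          # amplitude
                       rng.uniform(20.0, 200.0),        # period in time steps
                       rng.uniform(0.0, 2 * math.pi))   # phase
                      for _ in range(num_waves)]

    def value(self, t):
        # S(t) is the sum of the sub-sine waves at time step t.
        return sum(a * math.sin(2 * math.pi * t / p + phi)
                   for a, p, phi in self.waves)

signal = ComplexSineSignal()
samples = [signal.value(t) for t in range(1000)]
```

Because the generator is seeded, the same parameters reproduce the same signal, which is convenient for repeatable experiments.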

## Spiked Feature Vector Encoding

The `SignalPredictionAppKtRam` uses a `Float_A2D_Buffered` encoder to convert the real-valued signal values $S(t)$ into spiked feature vectors.

First, these values are converted into a sparse encoding $F(S(t))$ using the spatially adaptive encoder `A2D_Encoder`.

Then spiked features are buffered to form a feature set vector:

$\{ F(S(t - N)), F(S(t - N+1)), F(S(t - N+2)), \ldots , F(S(t-1)) \}$
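The buffering step itself is straightforward: keep the last $N$ spike encodings in a sliding window. A minimal Python sketch (the buffer length `N = 8` is an assumption for illustration, not the app's actual setting):

```python
from collections import deque

N = 8  # buffer length (assumed for illustration)
buffer = deque(maxlen=N)  # deque drops the oldest encoding automatically

def buffered_feature_vector(spikes):
    """Append the latest spike encoding F(S(t)) and return the buffered
    feature set {F(S(t-N)), ..., F(S(t-1))} as a list of tuples."""
    buffer.append(tuple(spikes))
    return list(buffer)
```

Once `N` encodings have accumulated, every call returns a full feature set vector covering the last `N` time steps.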

The adaptive encoder is a simple recursive method for producing a spike encoding. It can be conveniently realized through strictly anti-Hebbian learning on a binary decision tree with an AHaH node at each tree node, starting from the root node and proceeding to a leaf node. In this example, the input $S(t)$ is summed with a bias $b$ at each node, $y = S(t) + b$. Depending on the sign of the result $y$, it is routed in one direction or the other toward the leaf nodes. The bias is then updated according to anti-Hebbian learning.

If we then assign a unique integer to each node in the decision tree, the path taken from the root to the leaf becomes the spike encoding. This process is an adaptive analog-to-digital conversion. It generates an adaptive binning of the signal's values, one with finer precision in regions of high density. You can learn more about this encoder type in our article on the A2D Encoder.
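The routing-and-update procedure above can be sketched in a few lines. This is a simplified Python illustration of the idea, not the Knowm `A2D_Encoder` itself; the tree layout, learning rate, and exact anti-Hebbian rule are assumptions:

```python
class A2DEncoderSketch:
    """Illustrative spatially adaptive A2D encoder: a binary tree with one
    adaptive bias per node, stored heap-style (index 1 is the root).
    A depth of n gives 2^(n-1) leaf bins."""

    def __init__(self, depth=6, lr=0.01):
        self.depth = depth
        self.lr = lr
        self.bias = [0.0] * (2 ** depth)  # biases for all tree nodes

    def encode(self, s):
        spikes = []  # the path of visited node ids IS the spike encoding
        node = 1
        for _ in range(self.depth - 1):
            y = s + self.bias[node]
            # Anti-Hebbian update (assumed form): push the bias against
            # the decision so each node splits its inputs evenly over time.
            self.bias[node] -= self.lr * (1.0 if y >= 0 else -1.0)
            node = 2 * node + (1 if y >= 0 else 0)  # route by sign of y
            spikes.append(node)
        return spikes
```

Repeatedly encoding samples drawn from the signal shifts the node biases toward the local medians of the data routed through them, which is what produces the density-adaptive binning described above.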

## Classification and Learning

We use a kT-RAM based `LinearClassifier` to make predictions for future signal values.

A number of AHaH nodes compose the linear classifier, one for each bin generated by our adaptive A2D spike encoder. Each AHaH node's label is thus tied to the adaptive binning created by the A2D encoder, which changes as it adapts to our distribution. We could also use a simple fixed discretization of the input space if we used a `Float_MaxMin` encoder instead.

Each AHaH node is trained on the spiked feature vectors, and outputs are evaluated by taking the label associated with the AHaH node with the highest activation.
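The evaluation step is a winner-take-all over per-label linear sums. Here is a hedged Python sketch of that structure; it stands in for the kT-RAM `LinearClassifier` and uses a simple perceptron-style update, not the actual AHaH learning rule:

```python
from collections import defaultdict

class LinearClassifierSketch:
    """Illustrative stand-in for a spike-based linear classifier:
    one linear unit ("AHaH node") per output label, with weights
    keyed by spike id. The update rule below is an assumption."""

    def __init__(self):
        # label -> {spike id -> weight}
        self.weights = defaultdict(lambda: defaultdict(float))

    def activations(self, spikes):
        # Each node's activation is the sum of its weights over the
        # active spikes in the feature vector.
        return {label: sum(w[s] for s in spikes)
                for label, w in self.weights.items()}

    def predict(self, spikes):
        # Output the label of the node with the highest activation.
        acts = self.activations(spikes)
        return max(acts, key=acts.get) if acts else None

    def learn(self, spikes, true_label, lr=0.1):
        # Perceptron-style supervised update (assumed, not the AHaH rule):
        # reinforce the true label, punish an incorrect winner.
        pred = self.predict(spikes)
        for s in spikes:
            self.weights[true_label][s] += lr
            if pred is not None and pred != true_label:
                self.weights[pred][s] -= lr
```

A few training passes over distinct spike patterns are enough for the winner-take-all readout to separate them.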

In `SignalPredictionAppKtRam.nonRecursivePredict()` we use the previous true signal values to predict the next value. We apply only a single pass over the data.

In `SignalPredictionAppKtRam.recursivePredict()` we use the predicted values from the previous time steps to predict the next value. Again, we apply only a single pass over the data.
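The only difference between the two modes is what gets fed into the feature buffer: the true value, or the model's own prediction. A generic single-pass loop capturing both modes might look like this (a hypothetical helper, not the Knowm API; `encode`, `predict_label`, and `label_to_value` are caller-supplied stand-ins for the encoder and classifier):

```python
from collections import deque

def run_prediction(true_values, encode, predict_label, label_to_value,
                   n_buffer, recursive_after=None):
    """Single-pass prediction loop (illustrative sketch).
    Non-recursive mode always buffers the true value; recursive mode
    (after `recursive_after` steps) feeds predictions back instead."""
    buf = deque(maxlen=n_buffer)
    preds = []
    for t, s_true in enumerate(true_values):
        if len(buf) == n_buffer:
            # Predict the next value from the buffered feature set.
            preds.append(label_to_value(predict_label(list(buf))))
        else:
            preds.append(None)  # buffer not yet full
        feed = s_true
        if (recursive_after is not None and t >= recursive_after
                and preds[-1] is not None):
            feed = preds[-1]  # recursive: feed the prediction back
        buf.append(encode(feed))
    return preds
```

Passing `recursive_after=None` gives the non-recursive behavior; setting it to, say, the last 300 steps of a run reproduces the recursive-prediction phase described below.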

## Results

The signal was generated from the summation of five sinusoidal signals with randomly chosen amplitudes, periods, and phases. Our linear classifier is simulated on kT-RAM with 8-bit `BYTE` core precision, and we set our A2D encoder to a depth of $n = 6$, giving us a spatial resolution of $2^{n-1} = 32$ bins and hence 32 unique labels.

## Recursive Prediction

The experiment ran for a total of 10,000 time steps. During the last 300 time steps, recursive prediction was used. The following results were generated.

Figure 1: Recursive Prediction

## Non-Recursive Prediction

We run the experiment for a total of 10,000 time steps, allowing our classifier to use the previous true signal values for prediction. Below are two separately generated signals and their approximations.

Figure 2: Example 1: Sine Wave Non-Recursive Prediction

Figure 3: Example 2: Sine Wave Non-Recursive Prediction

We record the error for both using an exponential moving average and generate the following plots.
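The error curves in the plots are smoothed with an exponential moving average. A short sketch of that computation (the smoothing factor `alpha` here is an assumption; the app's actual value is not given):

```python
def ema_error(errors, alpha=0.01):
    """Exponential moving average of a per-step error series:
    m_t = alpha * e_t + (1 - alpha) * m_{t-1}, seeded with the first error."""
    ema, m = [], None
    for e in errors:
        m = e if m is None else alpha * e + (1 - alpha) * m
        ema.append(m)
    return ema
```

Smaller `alpha` values produce smoother curves at the cost of responding more slowly to changes in the underlying error.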

Figure 4: Example 1: Sine Wave Non-Recursive Prediction Error

Figure 5: Example 2: Sine Wave Non-Recursive Prediction Error

Initially, the undertrained classifier and A2D encoder are not well suited to the problem, and we see a high error. Over time, the error decreases as the linear classifier learns and the A2D encoder adapts.

Increasing the length of our simulation from 10,000 steps to 100,000 produces a further, nonlinear decrease in error.

Figure 6: Example 1: Temporal increase in precision

Figure 7: Example 1: Temporal increase in precision

## Conclusion

We outline a simple model for sine wave signal prediction from randomly generated data. A spatially adaptive encoder is used to bin the real-valued signals and create sparse spike encodings. These are then buffered and used to train an AHaH-based linear classifier for prediction. Two types of predictions are made: 1) recursive, where predictions are used to generate the next signal prediction, and 2) non-recursive, where true signal values are used as buffered features for prediction.

The results in Figures 1-5 display the accuracy of these models by graphing both the predicted signal and the actual signal. Large spikes away from the actual signal can be seen. These reflect an incorrect classification by an AHaH node far from the correct location of the signal. The number of these errors decreases with additional training (Figures 6 and 7).

Prev: Signal Prediction
Next: Reinforcement Learning
