Movement Extrapolation w/ TensorFlow (2)

Training a TensorFlow model to extend short sequences of movement.


In my previous post, I created a TouchDesigner workflow that uses a neural network to extend short sequences of movement. I also realized just how much variation needs to be represented in the training data for this to work as flexibly as I'd eventually like. For every motion I want the network to be able to reproduce, I'd need to record every possible variation of that motion, including variations in center of motion, orientation, speed, scale, etc. That's a LOT of training data that would take me a ton of time to compile.

I realized that one partial solution to this would be to train the neural network not on position and orientation data, but on velocity and angular velocity. In other words, not the values themselves per frame, but rather how much they've changed since the previous frame. This would mean that any motion that involves staying still would look exactly the same: its frames would all be (0,0,0,0,0,0), regardless of where the object is sitting or which way it's facing. An object spinning in place clockwise would always produce frames that look like (0,0,0,0,1,0), etc.
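To make the idea concrete, here's a quick NumPy sketch (my own illustration, separate from the TouchDesigner network) showing how per-frame deltas erase absolute position: two recordings of the same stillness at different locations produce identical delta frames.

# delta_demo.py (hypothetical illustration)
import numpy as np

# Two recordings of an object sitting still, at different (x, y, z) positions.
still_at_origin = np.tile([0.0, 0.0, 0.0], (5, 1))
still_elsewhere = np.tile([2.0, -1.0, 0.5], (5, 1))

# Per-frame delta: frame[t] - frame[t-1].
deltas_a = np.diff(still_at_origin, axis=0)
deltas_b = np.diff(still_elsewhere, axis=0)

print(np.array_equal(deltas_a, deltas_b))  # True: both are all zeros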

Training on Velocity Data

To get a CHOP whose channels represent the velocity of the tracker values, I'm taking the OpenVR data, delaying it by one frame, and subtracting the delayed copy from the live data to get the per-frame difference.

For angular velocity, I'm using an Angle CHOP to convert the rotation values into 2-vector representations. This is the only representation of rotation I found whose values don't occasionally jump, which would cause spikes in the difference values that might throw the neural network off.
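To see why this matters, here's a small sketch (assuming the 2-vector form is the usual sine/cosine pair per angle): a rotation that wraps from 359° back to 0° produces a huge spike in the raw frame-to-frame difference, while its sine/cosine pair steps smoothly across the wrap.

# wraparound_demo.py (hypothetical illustration)
import numpy as np

angles_deg = np.array([358.0, 359.0, 0.0, 1.0])  # a steady 1-degree-per-frame spin

print(np.diff(angles_deg))  # [1., -359., 1.] <- the spike in the raw differences

# The 2-vector (sin/cos) representation changes smoothly across the wrap.
rad = np.radians(angles_deg)
two_vec = np.stack([np.sin(rad), np.cos(rad)], axis=1)
print(np.diff(two_vec, axis=0))  # three small, nearly identical steps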

Just as before, I'm outputting the difference values to a 121-frame Trail CHOP (120 frames of history plus the frame to predict), which sends its contents over UDP to the Python capture script.

In Python, the only change that needs to be made is to the initial shape of the empty accumulator arrays, to reflect the increased number of datapoints per frame:

# capture.py
import numpy as np

try:
    # Load any previously captured training data from disk.
    histories = np.load("./train/histories.npy")
    frames = np.load("./train/frames.npy")
    print(f"Saved data with {histories.shape[0]} samples loaded.")
except FileNotFoundError:
    print("No saved data found. Creating new dataset.")
    # 9 datapoints per frame: 3 positional deltas plus 6 values from
    # the 2-vector rotation representation (2 per rotation axis).
    histories = np.zeros((0, 120, 9))  # 120-frame input histories
    frames = np.zeros((0, 9))          # the target frame that follows each history
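For context, appending a new sample from an incoming trail might look something like the sketch below. This assumes the received data has already been parsed into a (121, 9) NumPy array called trail; the actual UDP parsing is unchanged from the previous post.

# Hypothetical append step: split the 121-frame trail into a
# 120-frame input history and the single frame that follows it.
history, target = trail[:120], trail[120]
histories = np.append(histories, history[np.newaxis], axis=0)
frames = np.append(frames, target[np.newaxis], axis=0)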