Create Sparse Matrix Based Liquid State Machine
Tags: sparse matrix, liquid state machine, tensorly, torch, efficient data processing, complex datasets, brain neurons
This text explains the process of creating a sparse matrix-based liquid state machine using the tensorly and torch libraries. It focuses on efficient data processing for complex datasets such as simulations of brain neurons.
Use tensorly to create a random sparse tensor and a sparse identity (eye) tensor with ease. Note that tensorly's sparse support is NumPy-only and requires the sparse package.
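As a sketch of what this looks like with the pydata/sparse package directly (the package that backs tensorly's sparse support; the size and density below are arbitrary illustration values):

import sparse  # the pydata/sparse package (pip install sparse)

n = 10_000
random_weights = sparse.random((n, n), density=0.001)  # ~0.1% of entries are non-zero
identity = sparse.eye(n)                                # sparse identity matrix
print(random_weights.nnz, identity.nnz)                 # counts of stored non-zeros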
scipy, pytorch, tensorflow, and jax all support basic sparse matrix or tensor construction, but more advanced sparse operations are not well supported.
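For instance, a minimal sketch of basic construction in scipy (arbitrary toy values):

import numpy as np
from scipy import sparse as sp

rows = np.array([0, 1, 2])
cols = np.array([2, 0, 1])
vals = np.array([1.0, 2.0, 3.0])
mat = sp.coo_matrix((vals, (rows, cols)), shape=(3, 3))  # basic COO construction
print(mat.toarray())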
Use eye to create the bias and input matrices and to extract node values. Use a random sparse tensor to initialize the weight matrix. Use repeated matrix multiplication of the weight matrix with the node state to perform propagation, as sketched below.
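Putting these steps together, here is a minimal sketch of one propagation step of such a liquid state machine in torch. The sizes, the names (n_nodes, W, W_in, state), the tanh nonlinearity, and the dense handling of input weights and readout are illustrative assumptions, not something fixed by the notes above.

import torch

n_nodes, n_inputs = 10_000, 100

# random sparse recurrent weight matrix (the "liquid"); duplicate indices are summed by coalesce()
nnz = 50_000
idx = torch.randint(0, n_nodes, (2, nnz))
W = torch.sparse_coo_tensor(idx, torch.randn(nnz) * 0.1, (n_nodes, n_nodes)).coalesce()

# dense input weights and bias, kept simple for illustration
W_in = torch.randn(n_nodes, n_inputs) * 0.1
bias = torch.zeros(n_nodes, 1)

state = torch.zeros(n_nodes, 1)
u = torch.randn(n_inputs, 1)  # one input sample

# one propagation step: sparse matmul of the weight matrix with the current state
state = torch.tanh(torch.sparse.mm(W, state) + W_in @ u + bias)

# extract node values from a subset of reservoir nodes
readout_nodes = torch.arange(0, n_nodes, 100)
readout = state[readout_nodes]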
The human brain has roughly 86 billion neurons, each with thousands of synapses.
import torch

large_number = 1_000_000

# indices of the diagonal entries (row i, column i)
index_arr = torch.arange(large_number).unsqueeze(0).repeat(2, 1)
val_arr = torch.ones(large_number)

# sparse identity matrix of shape (large_number, large_number)
sparse_eye = torch.sparse_coo_tensor(index_arr, val_arr, (large_number, large_number))
# sparse_eye = sparse_eye.to('cuda')
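As a quick usage sketch (x is a hypothetical dense column vector of node states, not a name from the original code), the sparse identity reproduces the state under sparse matrix multiplication, and a few of its rows form a selection matrix that extracts individual node values:

x = torch.randn(large_number, 1)                          # hypothetical node-state vector
assert torch.allclose(torch.sparse.mm(sparse_eye, x), x)  # identity behaves as expected

keep = torch.tensor([0, 10, 500])                         # hypothetical node indices to extract
sel_idx = torch.stack([torch.arange(len(keep)), keep])
selector = torch.sparse_coo_tensor(sel_idx, torch.ones(len(keep)), (len(keep), large_number))
node_values = torch.sparse.mm(selector, x)                # shape: (3, 1)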
Alternatively:
import torch
import tensorly.contrib.sparse as tsl_sp

large_number = 1_000_000

# sparse identity built with tensorly's (NumPy-backed) sparse backend
numpy_eye = tsl_sp.eye(large_number)

# convert the resulting COO array into a torch sparse tensor
torch_eye = torch.sparse_coo_tensor(numpy_eye.coords, numpy_eye.data, numpy_eye.shape)
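The conversion in the last line is a deliberate design choice: pydata/sparse arrays are NumPy-backed and CPU-only, so turning the identity into a torch sparse tensor is what makes it possible to move it to the GPU (as hinted by the commented .to('cuda') line earlier) and to mix it into torch-side sparse matrix multiplications.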