SpikyPanda reimplements neural network architectures in TypeScript using a graph-based approach. Every neuron is a node, every synapse is a link — an explicit, traversable graph that you can inspect, serialize, and manipulate.
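As a sketch of what that looks like in practice (illustrative only: `Node`, `Link`, and `Graph` here are hypothetical names, not SpikyPanda's actual exports):

```ts
// Illustrative sketch only: types and names are hypothetical,
// not SpikyPanda's actual API.
interface Node { id: string; bias: number; activation: number; }
interface Link { from: string; to: string; weight: number; }

class Graph {
  nodes = new Map<string, Node>();
  links: Link[] = [];

  addNode(id: string): Node {
    const node = { id, bias: 0, activation: 0 };
    this.nodes.set(id, node);
    return node;
  }

  connect(from: string, to: string, weight = Math.random() - 0.5): void {
    this.links.push({ from, to, weight });
  }

  // The whole topology is plain data, so serialization is trivial.
  toJSON(): string {
    return JSON.stringify({ nodes: [...this.nodes.values()], links: this.links });
  }
}

// A 2-1 network: two input neurons feeding one output neuron.
const g = new Graph();
g.addNode("in0"); g.addNode("in1"); g.addNode("out");
g.connect("in0", "out");
g.connect("in1", "out");
```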
The goal is to render these graphs in 3D for visualization and spatial interaction, ultimately moving toward Spiking Neural Networks with biologically inspired learning rules.
Nodes represent neurons, links represent synapses. A bag model holds runtime state: activations, gradients, and biases live alongside the topology, making the network fully introspectable.
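One way to picture the bag model (a hedged sketch; the field and type names are invented for illustration): topology and runtime state live side by side, so anything that can walk the graph can also read the state.

```ts
// Hypothetical sketch of a "bag" holding runtime state per node and link.
// Field names are illustrative, not SpikyPanda's real types.
interface NodeBag { activation: number; gradient: number; bias: number; }
interface LinkBag { weight: number; gradient: number; }

interface Network {
  nodeState: Map<string, NodeBag>;  // keyed by node id
  linkState: Map<string, LinkBag>;  // keyed by "from->to"
}

// Introspection is just a map lookup: no private tensors to dig through.
function inspect(net: Network, nodeId: string): NodeBag | undefined {
  return net.nodeState.get(nodeId);
}
```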
Supports backpropagation for CNNs and MLPs with the Adam optimizer, as well as neuroevolution for the bestioles ecosystem simulation. All training operates directly on the graph structure.
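For reference, the per-parameter Adam update is standard; a minimal sketch of the step applied to each weight (the graph-based implementation may organize this differently):

```ts
// Standard Adam update for a single parameter. This sketches what the
// optimizer does per weight; the real graph-based implementation may differ.
interface AdamState { m: number; v: number; t: number; }

function adamStep(
  theta: number, grad: number, s: AdamState,
  lr = 1e-3, b1 = 0.9, b2 = 0.999, eps = 1e-8,
): number {
  s.t += 1;
  s.m = b1 * s.m + (1 - b1) * grad;           // first-moment EMA
  s.v = b2 * s.v + (1 - b2) * grad * grad;    // second-moment EMA
  const mHat = s.m / (1 - Math.pow(b1, s.t)); // bias correction
  const vHat = s.v / (1 - Math.pow(b2, s.t));
  return theta - (lr * mHat) / (Math.sqrt(vHat) + eps);
}
```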
Leaky Integrate-and-Fire (LIF) neurons, Spike-Timing-Dependent Plasticity (STDP) learning, and temporal dynamics. Moving from rate-coded to event-driven computation with biologically plausible learning.
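For intuition, a discrete-time LIF update is only a few lines (a textbook sketch, not necessarily SpikyPanda's exact formulation):

```ts
// Discrete-time leaky integrate-and-fire neuron: a textbook sketch.
interface LIF { v: number; vRest: number; vThresh: number; vReset: number; tau: number; }

function lifStep(n: LIF, inputCurrent: number, dt: number): boolean {
  // Leak toward the resting potential while integrating input current.
  n.v += (dt / n.tau) * (-(n.v - n.vRest) + inputCurrent);
  if (n.v >= n.vThresh) {
    n.v = n.vReset; // fire and reset
    return true;    // spike emitted
  }
  return false;
}
```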
Predictive AI versus reactive threshold versus no control, on a sealed lunar habitat CO2 loop. Pick a scrubber state (oversized, normal, degraded), pick a controller, and compare runs head to head. The same tiny 401-parameter world model runs in the browser and on an ESP32. Open demo →
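The reactive baseline in a comparison like this can be as simple as hysteresis around a CO2 setpoint (an illustrative sketch; the thresholds are invented, not the demo's calibrated values):

```ts
// Illustrative reactive threshold controller with hysteresis.
// Setpoints are made up, not the demo's actual values.
function reactiveScrubber(ppmCO2: number, scrubberOn: boolean): boolean {
  const ON_ABOVE = 2500;  // switch the scrubber on above this
  const OFF_BELOW = 1500; // switch it off once below this
  if (ppmCO2 > ON_ABOVE) return true;
  if (ppmCO2 < OFF_BELOW) return false;
  return scrubberOn; // inside the band: keep the current state
}
```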
The same CO2 controller as sample 1, plus three astronauts with live heart rate, SpO2, respiratory rate, cognitive alertness, real-time ECG, and a work-efficiency readout. Shows how a controller's decisions play out on the people inside, including the "minimal scrubber" failure mode where only one crew member can survive. Open demo →
Train a convolutional neural network on handwritten digits. Choose from fast, balanced, and accuracy presets, then evaluate on the test set with a visual prediction grid. Open demo →
Compress 6-channel synthetic LiDAR grids with a convolutional autoencoder. Visualize per-channel reconstructions, latent vectors, and training loss curves. Open demo →
Visualize neural network graphs in 3D with BabylonJS. Neurons as tetrahedrons, synapses as colored lines. Train XOR and watch the network learn live. Open demo →
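In BabylonJS terms the mapping is direct; a minimal sketch assuming `@babylonjs/core` and a canvas element (the demo's actual scene setup is richer):

```ts
import {
  Engine, Scene, ArcRotateCamera, HemisphericLight,
  MeshBuilder, Vector3, Color3,
} from "@babylonjs/core";

// Minimal sketch: one tetrahedron per neuron, one colored line per synapse.
const canvas = document.getElementById("view") as HTMLCanvasElement;
const engine = new Engine(canvas);
const scene = new Scene(engine);
new ArcRotateCamera("cam", Math.PI / 4, Math.PI / 3, 10, Vector3.Zero(), scene)
  .attachControl(canvas, true);
new HemisphericLight("light", new Vector3(0, 1, 0), scene);

function drawNeuron(pos: Vector3): void {
  // Polyhedron type 0 is a tetrahedron in BabylonJS.
  const mesh = MeshBuilder.CreatePolyhedron("neuron", { type: 0, size: 0.3 }, scene);
  mesh.position = pos;
}

function drawSynapse(a: Vector3, b: Vector3, weight: number): void {
  const line = MeshBuilder.CreateLines("synapse", { points: [a, b] }, scene);
  // Color by weight sign: green excitatory, red inhibitory.
  line.color = weight >= 0 ? Color3.Green() : Color3.Red();
}

drawNeuron(new Vector3(-2, 0, 0));
drawNeuron(new Vector3(2, 0, 0));
drawSynapse(new Vector3(-2, 0, 0), new Vector3(2, 0, 0), 0.5);
engine.runRenderLoop(() => scene.render());
```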
Classify electric motor faults using LSTM/GRU recurrent networks on triaxial vibration sequences. Compare cell types, visualize signals, and analyze confusion matrices. Open demo →
Detect broken rotor bars in 3-phase induction motors from stator current signatures (MCSA). The same LSTM/GRU pipeline as Motor Vibration, trained on the UFU Broken Rotor Bar dataset with 5 rotor states (healthy plus one to four broken bars). Open demo →
Tier 1 listener: detect drift in the stator current envelope without identifying the fault. Zero-parameter statistical baseline with online calibration and EMA scoring. Simulates the ESP32 wake-up trigger for the classification pipeline. Open demo →
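The EMA-scoring idea behind such a listener looks roughly like this (a sketch with invented constants; the demo's calibration details may differ):

```ts
// Sketch of an EMA-based drift score over an envelope statistic.
// Constants are illustrative, not the demo's calibrated values.
class DriftListener {
  private mean = 0;
  private variance = 1;
  constructor(private alpha = 0.01) {}

  // Online calibration: track the envelope's mean and variance with EMAs.
  update(envelope: number): number {
    const d = envelope - this.mean;
    this.mean += this.alpha * d;
    this.variance += this.alpha * (d * d - this.variance);
    // Score: distance from the running mean in (EMA) standard deviations.
    return Math.abs(d) / Math.sqrt(this.variance + 1e-12);
  }
}

// Wake the classifier when the score stays high, e.g.:
// if (listener.update(envelope) > 4) { /* trigger classification pipeline */ }
const listener = new DriftListener();
```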
Discriminate air, acetone, and HCl with a 5-sensor metal-oxide e-nose array. Reproduces the DotVision MOS chamber experiment with a single-layer SpikyPanda GRU that beats the published 5-LSTM baseline by 5.8 points (95.5% vs 89.7%) while running at 0.20 ms per window. Open demo →
Estimate depth from synthetic stereo pairs using a dual-branch CNN with cross-synapses. Visualize disparity maps, compare against ground truth, and explore stereo matching. Open demo →
Real-time keyword detection from microphone audio using a tiny Conv1D ONNX model. MFCC feature extraction via SpMFCC, live waveform visualization, and 12-class inference in the browser. Open demo →
Visual graph editor for building and connecting computation nodes. Drag, zoom, pan, and wire up typed ports with live bezier connections. Open demo →
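Live wire rendering in node editors typically boils down to a cubic bezier between two port positions; a generic sketch in SVG path syntax (not necessarily this editor's exact code):

```ts
// Sketch: a horizontal cubic bezier path between two ports, as commonly
// used for node-editor wires. Returns an SVG path string.
function wirePath(x1: number, y1: number, x2: number, y2: number): string {
  const dx = Math.max(Math.abs(x2 - x1) / 2, 40); // control-point offset
  return `M ${x1} ${y1} C ${x1 + dx} ${y1}, ${x2 - dx} ${y2}, ${x2} ${y2}`;
}
```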
Add the core runtime to your project: