* inclusion of episodic long-term memory in the SNN, reaching every n tokens behind the context window (my own idea...);
* implementations in C and as a C# port, without torch/tensorflow (SpikeGPT is in Python with torch);
* several types of 'attention', training modes, and memory modes;
* training/learning without backpropagation;
* CPU-friendly in the sense that while it's still kind of slow (unfortunately), at least a GPU isn't mandatory.
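The episodic-memory idea above could look something like the sketch below: every n tokens, snapshot a summary of the hidden state into a long ring buffer, and later recall the most similar stored episode. Everything here (the names `em_step`/`em_recall`, the dot-product recall, the buffer sizes) is an illustrative assumption of mine, not the actual implementation.

```c
/* Hedged sketch: episodic long-term memory snapshotted every EVERY
   tokens into a ring buffer of SLOTS episodes.  All constants and
   the dot-product recall rule are illustrative assumptions. */
#include <string.h>

#define DIM   8      /* toy hidden-state width                  */
#define EVERY 4      /* snapshot interval in tokens             */
#define SLOTS 256    /* episodes kept "behind" the ctx window   */

typedef struct {
    float mem[SLOTS][DIM];
    int   count;     /* episodes written so far */
} EpisodicMemory;

/* Call once per generated token; stores a snapshot every EVERY tokens. */
static void em_step(EpisodicMemory *em, int token_idx, const float *state)
{
    if (token_idx % EVERY != 0) return;
    memcpy(em->mem[em->count % SLOTS], state, sizeof(float) * DIM);
    em->count++;
}

/* Recall the stored episode most similar (by dot product) to `state`. */
static const float *em_recall(const EpisodicMemory *em, const float *state)
{
    int n = em->count < SLOTS ? em->count : SLOTS;
    const float *best = NULL;
    float best_dot = -1e30f;
    for (int i = 0; i < n; i++) {
        float dot = 0.0f;
        for (int d = 0; d < DIM; d++) dot += em->mem[i][d] * state[d];
        if (dot > best_dot) { best_dot = dot; best = em->mem[i]; }
    }
    return best;   /* NULL if nothing stored yet */
}
```

Because the buffer is fixed-size and recall is a single linear scan, extending memory thousands of tokens behind costs O(SLOTS * DIM) per lookup regardless of how old the episodes are, which matches the "no decrease in speed" claim.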
Here are screenshots of both the C# Windows Forms implementation and the C/Cygwin port... plus two random screenshots of Claude Sonnet 4.6 and Gemini Pro 3.1 commenting on the program:
https://imgur.com/a/SAQqKmm
Why is the text generated from a seed still far from perfect? Two reasons: a very small corpus, and the C# version reaches less than 100% accuracy.
However, the big nice surprise: grammar and semantics both seem to be learned. Coupled with my idea of episodic long-term memory, the effective context outside the tiny 'ctx' window can easily be extended to thousands of tokens behind without a decrease in speed, which could make it a practical program. Generation is also very fast.
future work:
* BPE: right now it's just a word tokenizer... not good for code;
* did I say "code"? It may actually be a total failure for coding... or maybe not: completely untested;
* the program actually has two versions; the other one deviates noticeably from this one and has a C and even an F# port, but the F# port just doesn't work... it always produces complete gibberish (major bug);
* never tested on an actual neuromorphic CPU, just good ol' universal Intel laptop ones;
* a Python port should be possible;
* finally, the big test: a large text corpus (megabytes) and accuracy over 95% <- the ultimate test.