Python: Building a homemade Long Short-Term Memory with FSMs

  • Thread starter: Trollfaz
  • Tags: Homemade
AI Thread Summary
The discussion centers on building a Long Short-Term Memory (LSTM) algorithm from scratch, emphasizing that LSTMs, as a form of Recurrent Neural Network, retain memory of past inputs. The author proposes using Finite State Machines (FSMs) as a foundational model, explaining that an FSM transitions between states based on its inputs and thereby retains some memory. The concept involves creating a network of many FSMs, where each FSM's output is weighted and the weighted outputs are summed to form the overall system output. The weights are adjusted during training by gradient descent to minimize prediction error. A respondent draws a parallel between this setup and standard neural-network terminology, noting that the 'gates' and 'neurons' of neural networks play much the same role as the FSMs here, while acknowledging limited expertise and a purely personal curiosity in the subject.
Trollfaz (Messages: 143, Reaction score: 14)
I am doing a project to build a Long Short-Term Memory (LSTM) algorithm from scratch. LSTMs are capable of retaining memory of past inputs and carrying it forward to future operations because they are a form of Recurrent Neural Network, which lets them process a series of inputs such as sound and text.

One possible way I can think of to do this is with Finite State Machines (FSMs). In the simplest model, the FSM at any point in time is in some state ##s \in S##. After reading an input at time t, the state of the node transitions from ##s_{t-1}## to ##s_t## via a function ##s_t = f_{in}(s_{t-1}, x_t)## for a valid input ##x_t \in X##. The node then produces an output ##o_t = f_{out}(s_t)## and remains in the new state for the next iteration. In this way it retains some memory, or information, about past inputs. A minimal Python sketch of such a node is shown below.
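For concreteness, here is a minimal sketch of that FSM node in Python; the class and function names are just illustrative, not from any library.

```python
# A minimal sketch of the FSM node described above: it holds a state s_t,
# updates it with f_in on each input x_t, and emits o_t = f_out(s_t).
class FSM:
    def __init__(self, f_in, f_out, initial_state):
        self.f_in = f_in            # transition function f_in(s_{t-1}, x_t) -> s_t
        self.f_out = f_out          # output function f_out(s_t) -> o_t
        self.state = initial_state  # current state s_t

    def step(self, x_t):
        self.state = self.f_in(self.state, x_t)  # transition on the input
        return self.f_out(self.state)            # emit o_t, keep the new state

# Example: a two-state machine that remembers whether it has ever seen a 1.
seen_one = FSM(f_in=lambda s, x: s or x == 1,
               f_out=lambda s: 1.0 if s else 0.0,
               initial_state=False)
print([seen_one.step(x) for x in [0, 0, 1, 0]])  # [0.0, 0.0, 1.0, 1.0]
```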

Now, for complex modelling tasks such as text, does a large number of FSMs build a good LSTM model?
 
Last edited by a moderator:
I shall now elaborate on how the network of FSMs works. Let the system contain N FSMs for some large N, say ##10^4##. Each FSM's output is assigned a weight and multiplied by it. Hence the aggregate output of the system is
$$\sum_{i=1}^N w_i o_{i,t}= \textbf{w}^T\textbf{o}_t$$
where ##\textbf{w}## and ##\textbf{o}_t## are the vectors of assigned weights and of the nodes' outputs at time t, respectively. The weights are adjustable when we train the algorithm and are initially set to small random values. During training, we minimize the loss ##L=\sum (\text{predicted}-\text{actual})^2## by gradient descent with respect to the weights.
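As a rough Python sketch of that aggregation and of one gradient-descent step, assuming NumPy and a list `fsms` of FSM objects like the one sketched earlier (the names `system_output` and `sgd_step` are just illustrative):

```python
import numpy as np

def system_output(fsms, x_t, w):
    # Step every FSM on the input x_t and collect their outputs o_{i,t}.
    o_t = np.array([m.step(x_t) for m in fsms])
    return w @ o_t, o_t                          # aggregate prediction w^T o_t

def sgd_step(w, o_t, predicted, actual, lr=1e-3):
    # Squared-error loss L = (predicted - actual)^2 for one sample.
    # Since predicted = w^T o_t, the gradient is dL/dw = 2 (predicted - actual) o_t.
    grad = 2.0 * (predicted - actual) * o_t
    return w - lr * grad                         # gradient-descent update of the weights

N = 10_000
rng = np.random.default_rng(0)
w = rng.normal(scale=0.01, size=N)               # small random initial weights

# One training iteration over an (x_t, y_t) pair would then look like:
#   predicted, o_t = system_output(fsms, x_t, w)
#   w = sgd_step(w, o_t, predicted, y_t)
```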
 
Sounds like what is (or was, see below) normally done, with the terminology of 'gates' or 'neurons' being replaced by the words 'finite state machine'. 'Neural network' is another common term that seems to apply to the same general approach.

Disclaimer: I'm not an expert by any means! I've only dabbled in the field out of curiosity, and that was many years ago.

Cheers,
Tom
 
Last edited: