Day 5 - Computing with Dynamics
05 May 2023, Morning Session
Valerio Mante
- Experiment:
- 2 inputs: features of the stimulus
How did these line attractors come to exist in the first place? (in neuronal state space)
- At any given time point, is only one of the line attractors activated?
- What makes the trajectory start near one line attractor vs. the other?
Dean
Time dynamics
- Chaos: minor changes in the initial state lead to exponentially diverging trajectories (see the simulation sketch below).
- learning rule:
- even when changing the initial conditions, we would like the network to reproduce the same trajectory
- RNN
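
A minimal sketch of the chaos point above (not from the session; the network size, gain, and time constants are illustrative choices): a random rate RNN in the chaotic regime (gain g > 1), where two trajectories started from nearly identical initial states diverge exponentially.

```python
# Minimal sketch (illustrative parameters): exponential divergence of two nearby
# trajectories in a random rate RNN, tau * dx/dt = -x + g * J @ tanh(x).
import numpy as np

rng = np.random.default_rng(0)
N, g, tau, dt, T = 200, 1.5, 1.0, 0.01, 50.0     # g > 1 -> chaotic regime

J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))   # random connectivity

def simulate(x0, steps):
    """Euler integration of the rate dynamics from initial state x0."""
    x = x0.copy()
    traj = np.empty((steps, N))
    for t in range(steps):
        x += dt / tau * (-x + g * J @ np.tanh(x))
        traj[t] = x
    return traj

steps = int(T / dt)
x0 = rng.normal(size=N)
eps = 1e-6 * rng.normal(size=N)        # tiny perturbation of the initial state

d = np.linalg.norm(simulate(x0, steps) - simulate(x0 + eps, steps), axis=1)
print(d[::500])   # distance grows roughly exponentially before saturating
```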
Yulia
- The neurons are doing some computation with the aim of generating movement:
- These neurons do not act merely as an open-loop system or algorithm with an input and an output; they are rather closed-loop systems: the output is also fed back as input.
- Example: when grasping an object, the process typically takes seconds, but my neurons spike on a much faster time scale. How does this work? The solution is to let the neurons form attractors; that is how their activity persists over a longer time scale until I complete my goal. We need attractor states to bridge the different behavioral timescales.
- Condition of satisfaction (COS): I get feedback when I have reached my goal, so I know I am done with the action, and the intention can be inhibited.
- The neurons responsible for that intention then get inhibited; a global inhibitory signal comes onto this intention layer (a sketch of this persist-then-terminate dynamic follows below).
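
A minimal sketch of the attractor-bridging-timescales idea (illustrative parameters, not Yulia's implementation): a self-excitatory population holds an "intention" as persistent activity for seconds, far beyond its 20 ms time constant, until a COS-like global inhibition terminates it.

```python
# Minimal sketch (illustrative parameters): self-excitation keeps an "intention"
# active far beyond the synaptic time constant, until a condition-of-satisfaction
# (COS) signal provides global inhibition and shuts it off.
import numpy as np

tau = 0.02          # synaptic/neuronal time constant: 20 ms
dt = 0.001
T = 3.0             # behavioral timescale: seconds
w_self = 1.2        # recurrent self-excitation -> attractor (persistent) regime
steps = int(T / dt)

r = 0.0
rates = []
for t in range(steps):
    time = t * dt
    cue = 1.0 if 0.1 < time < 0.2 else 0.0     # brief go cue starts the intention
    cos = 5.0 if time > 2.0 else 0.0           # COS: global inhibition once goal is reached
    inp = w_self * r + cue - cos
    r += dt / tau * (-r + np.tanh(max(inp, 0.0)))   # rectified-tanh rate dynamics
    rates.append(r)

rates = np.array(rates)
print(rates[int(0.5 / dt)], rates[int(1.5 / dt)], rates[int(2.5 / dt)])
# activity stays high for ~2 s (about 100x tau) and is terminated by the COS signal
```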
- Loihi: a digital neuromorphic processor
- 120 cores; each core has an arithmetic unit for computation and memory to hold the neuronal states
- ~8000 neurons per core
- cores run in parallel
- spikes can be graded (i.e., carry a value)
- How to create attractors on it:
- WTA (winner-take-all): a connectivity matrix with strong weights near the diagonal, connecting each neuron strongly to its neighbors and more weakly to farther ones (see the sketch after this list)
–> Is Yulia's main point that using attractors (WTA) is important for solving this task (motor planning, control, termination), as opposed to conventional ML with feedforward networks? If yes, why should we care about biological neurons and spikes? Is timing only relevant when transitioning between states?
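
A minimal sketch of the WTA connectivity described above, in plain NumPy rather than the Loihi/Lava API (kernel width, weights, and network size are illustrative assumptions): strong excitation near the diagonal that decays with distance, plus uniform global inhibition, so a bump of activity forms around the strongest input.

```python
# Minimal sketch (illustrative, plain NumPy -- not the Loihi/Lava API): a soft
# winner-take-all connectivity matrix with strong excitation near the diagonal,
# decaying with distance, plus uniform global inhibition.
import numpy as np

N = 64                  # neurons arranged on a ring
sigma = 3.0             # width of the local excitatory kernel
w_exc = 1.0             # peak excitatory weight
w_inh = 0.3             # uniform global inhibition

idx = np.arange(N)
d = np.abs(idx[:, None] - idx[None, :])
dist = np.minimum(d, N - d)                                 # ring (circular) distance
W = w_exc * np.exp(-dist**2 / (2 * sigma**2)) - w_inh       # band-diagonal minus global inhibition

# Simple rate dynamics: the WTA settles into a bump around the strongest input.
rng = np.random.default_rng(1)
r = np.zeros(N)
inp = 0.1 * rng.random(N)
inp[20] = 0.5                      # one location gets a stronger cue
for _ in range(2000):
    r += 0.05 * (-r + np.tanh(np.maximum(W @ r + inp, 0.0)))   # rectified-tanh Euler steps
print(np.argmax(r))                # the activity bump settles on the cued neuron (index 20)
```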
Evening Session
- Also used in decoding olfactory signals
- Reservoir computing:
- in hardware: the problem of limited bit precision, as opposed to 32-bit floating-point precision (see the sketch below)
- Giacomo: couldn't get the long-timescale dynamics, i.e., a memory that lasts beyond the time constants of the synapses
- Dean: we need to add noise to each neuron
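
A minimal sketch of the bit-precision issue in a reservoir (echo-state-style network; the sizes, spectral radius, and 4-bit uniform quantization are illustrative assumptions): run the same reservoir with full-precision and with quantized recurrent weights, and measure how the state trajectories drift apart.

```python
# Minimal sketch (illustrative): an echo-state reservoir driven by an input signal,
# once with full-precision recurrent weights and once with the same weights quantized
# to 4 bits, to show how limited precision perturbs the reservoir states.
import numpy as np

rng = np.random.default_rng(2)
N, T = 100, 500
W = rng.normal(0.0, 1.0, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))      # scale spectral radius to 0.9
w_in = rng.normal(0.0, 1.0, size=N)
u = np.sin(0.1 * np.arange(T))                        # simple input signal

def quantize(M, bits=4):
    """Uniformly quantize weights to a given number of bits (symmetric range)."""
    levels = 2 ** (bits - 1) - 1
    step = np.max(np.abs(M)) / levels
    return np.round(M / step) * step

def run(Wrec):
    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        x = np.tanh(Wrec @ x + w_in * u[t])           # standard (leak-free) ESN update
        states[t] = x
    return states

drift = np.linalg.norm(run(W) - run(quantize(W, bits=4)), axis=1)
print(drift[::100])    # state mismatch introduced by the low-precision weights
```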
- it is not clear how the dynamics emerge from the weight matrices: the relationship between structure and function in rate-based RNNs
- Time dynamics become interesting when interfacing sensing with control (feedback loop, closing the loop). You would try to interface two rhythms (spatio-temporal patterns).