
Implement randomized initialization for neurons #352

Closed
heplesser opened this issue May 20, 2016 · 13 comments
Assignees: heplesser, jakobj
Labels:
  I: No breaking change (previously written code will work as before, no one should note anything changing, aside the fix)
  S: Normal (handle this with default priority)
  T: Enhancement (new functionality, model or documentation)
  ZC: Kernel (DO NOT USE THIS LABEL)
  ZP: In progess (DO NOT USE THIS LABEL)

Comments

@heplesser
Contributor

With NewConnect, we support randomised initialisation of synapses. Provide the same type of support for neuron initialisation, so that one could write:

/iaf_neuron 20 << /V_m << /distribution /uniform /low -75. /high -55. >> >> Create

This was previously trac854.
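To make the intent of the SLI call above concrete, here is a minimal Python sketch of the requested behaviour (the function and dictionary keys are hypothetical stand-ins mirroring the /distribution, /low, /high entries, not NEST API):

```python
import random

def create_neurons(n, v_m_spec):
    """Create n neuron parameter dicts, drawing V_m once per neuron.

    v_m_spec mirrors the SLI dictionary above, e.g.
    {"distribution": "uniform", "low": -75.0, "high": -55.0}.
    """
    if v_m_spec["distribution"] != "uniform":
        raise ValueError("only 'uniform' is sketched here")
    return [
        {"V_m": random.uniform(v_m_spec["low"], v_m_spec["high"])}
        for _ in range(n)
    ]

# 20 neurons, each with its own membrane potential drawn from U(-75, -55)
neurons = create_neurons(20, {"distribution": "uniform", "low": -75.0, "high": -55.0})
```

The key point is that the draw happens once per created neuron, not once per Create call.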

@jakobj
Contributor

jakobj commented May 20, 2016

If no one minds, I'd give this a try. @heplesser if that's OK please assign the ticket to me.

@heplesser heplesser assigned heplesser and jakobj and unassigned heplesser May 21, 2016
@heplesser heplesser added the T: Enhancement New functionality, model or documentation label May 21, 2016
@heplesser
Contributor Author

@jakobj Great that you want to work on this! All the "mechanics" should be in place from the Connect framework, but need to be applied to neuron initialization. The ConnParameter class represents parameters and can be used for neuron as well as for synapse parameters. We should probably rename it; it really represents parameter initializers (fixed scalar values, random values, arrays).

Take a look at the ConnBuilder constructor for how parameter objects are created from the connection specification. It should be possible to do this in a quite similar way for neuron parameters. One thing that might be slightly more complicated is how to know which parameters cannot be set in neuron models.

It may be a good idea to create a few test cases first, especially for initialization using random numbers or arrays in case of parallel building.

@jakobj
Contributor

jakobj commented Jun 2, 2016

thanks @heplesser, i agree that we can reuse and should rename ConnParameter. from what i've gathered so far, SetDefaults is called whenever Create is called. i am not sure yet where i can see this in the code; it seems to be some sli dictionary magic. ;) the issue then is: if defaults are changed to initialize all nodes during one Create call with the specified parameters, at which point should we place the randomization? i can surely randomize the default value, but then all nodes created in the same call will have the same (random) value, which is not really what we want. one approach would be to check for the existence of a distribution in the dictionary, store it somewhere else, create all neurons, then randomize the corresponding values. i am not sure about the angle of attack here.
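The pitfall described above, in a minimal Python sketch (all names hypothetical): randomizing the model default once gives every node in the call the same value, whereas a per-node draw is what is wanted.

```python
import random

spec = {"distribution": "uniform", "low": -75.0, "high": -55.0}

# Wrong: draw once and use the result as the model default for all nodes.
default_v = random.uniform(spec["low"], spec["high"])
wrong = [default_v for _ in range(5)]   # all five nodes share one value

# Right: draw anew for each node at creation time.
right = [random.uniform(spec["low"], spec["high"]) for _ in range(5)]
```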

@heplesser
Contributor Author

@jakobj The default parameter values are set and reset in Create_l_i_D at the SLI level.

We might want to move this down to the C++-level and proceed as follows:

  1. Fixed parameters are set as defaults in the model prototype (and reset later).
  2. All other parameters (either randomized or given as array) are handled as for synapses and values assigned when a neuron is created.

With this solution, we will not create any overhead for creating neurons with fixed parameters only and we need only a single pass through the neurons that are created.
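The two-step scheme above could look roughly like this in Python (a sketch under the assumption that distributed parameters are marked by a nested dict containing a distribution key; all function names are hypothetical, only the uniform case is shown):

```python
import random

def split_params(params):
    """Separate fixed scalar parameters from distribution specifications."""
    fixed, distributed = {}, {}
    for key, val in params.items():
        if isinstance(val, dict) and "distribution" in val:
            distributed[key] = val
        else:
            fixed[key] = val
    return fixed, distributed

def create(n, params):
    """Single pass: fixed values act as shared defaults,
    distributed values are drawn once per created node."""
    fixed, distributed = split_params(params)
    nodes = []
    for _ in range(n):
        node = dict(fixed)  # step 1: fixed parameters as defaults
        for key, spec in distributed.items():
            # step 2: per-node draw, as for synapses
            node[key] = random.uniform(spec["low"], spec["high"])
        nodes.append(node)
    return nodes
```

With only fixed parameters, the inner loop is empty, so there is no extra overhead in that case, matching the point above.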

@jakobj
Contributor

jakobj commented Jun 18, 2016

thanks for the pointer @heplesser. i now moved the current functionality entirely to the C++ level. this works quite nicely. i'll look into random distributions now. you can see the changes in my branch: https://github.com/jakobj/nest-simulator/tree/fix352_random_initialization_neurons

@heplesser
Contributor Author

@jakobj Looks very good! I cannot comment directly in your code, so I will do it by link.

  • < 1 neuron error: should say "strictly positive", since "positive" includes 0.
  • Invalid parameter error: add the name of the offending parameter, so the user does not need to guess.
  • /CompareDicts:
    • This function could be so useful that you may want to add it to the general SLI library.
    • The function should not use assert internally, but return a bool; best rename it DictsEqual and let it return true if all entries are equal.
    • I think, as written, the function will pass even if d2 contains keys that are not in d1.
    • Should it be recursive, in case we have dicts containing dicts?
  • The individual tests should be completely independent of each other and in this form:
{
  ResetKernel
  % setup
  % create
  % check neurons have correct params
  % check defaults untouched --- this can probably be put in a fct, since it is the same for all tests
  % at this point, `true` or `false` should be on the stack
}
assert_or_die

@jakobj
Contributor

jakobj commented Jun 19, 2016

Thanks @heplesser for the comments. I implemented DictsEqual by two calls to SubsetQ. That seems to work fine.
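The two-subset-check idea translates directly (Python sketch, names hypothetical): two dicts are equal iff each is a subset of the other, and making the subset test recursive also answers the nested-dict question raised above.

```python
def subset_q(d1, d2):
    """True if every key/value pair of d1 also appears in d2,
    recursing into nested dictionaries."""
    for key, val in d1.items():
        if key not in d2:
            return False
        if isinstance(val, dict) and isinstance(d2[key], dict):
            if not subset_q(val, d2[key]):
                return False
        elif d2[key] != val:
            return False
    return True

def dicts_equal(d1, d2):
    # equality as mutual subset inclusion: catches extra keys on either side
    return subset_q(d1, d2) and subset_q(d2, d1)
```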

@jakobj
Contributor

jakobj commented Jun 23, 2016

@heplesser would it be an option to adapt SetStatus to accept parameter distributions as arguments? this way we would provide the user with the option of randomizing parameters whenever they feel like it and we could use it in the Create command. this requires two passes (one for creation, one for parameter randomization) through all created nodes during a single Create call, but it might keep the code simpler. [edit]I have also rewritten history to keep it a bit cleaner. [/edit]

@heplesser
Contributor Author

@jakobj This sounds like an interesting idea. I suggest we discuss it in the next developer VC.
@jougs What do you think?

@jakobj
Contributor

jakobj commented Jun 23, 2016

@heplesser another issue i stumbled upon: we can't rename ConnParameter to Parameter since this collides with a class Parameter in topology. the alternative ParameterInitializer sounds a bit strange to me. any ideas?

@heplesser
Contributor Author

heplesser commented Jun 24, 2016

@jakobj I would suggest doing it the other way around: Rename Parameter in topology to TopologyParameter, so that Parameter will be free for use in NEST generally. I just created PR #408 to rename in topology.

@heplesser
Contributor Author

@jakobj I have recently opened #488 for a general approach to the parameterization of nodes and connections. We should consider the two issues together.

@heplesser heplesser added the ZC: Kernel, I: No breaking change, ZP: In progess and S: Normal labels Nov 17, 2016
@jougs
Contributor

jougs commented Feb 7, 2018

I'm closing this in favor of compneuronmbu#4.

@jougs jougs closed this as completed Feb 7, 2018