MCP Brain-IT: Multiscale modeling using the MUSIC tool
To understand the brain, we need to study multiscale phenomena and to detail how phenomena at one level of organization are affected by ongoing brain activity at other levels. Computational modeling and simulation provide an important approach in these efforts.
Most computational neuroscience software applications, such as NEST, NEURON, and STEPS, are specialized for a given scale. The software MOOSE is one exception, bridging two scales, the biochemical and the electrical (Bhalla, 2011). An important goal of this project is to develop a general framework that supports, during runtime, interoperability between models implemented in different simulators or at different scales. To accomplish this we are building a framework based on a multi-simulation concept, the MUlti-SImulation Coordinator (MUSIC). MUSIC is an API and software library that allows large-scale neuronal network applications to exchange data within a parallel computer during runtime (Djurfeldt et al., 2010). In this collaborative project between SeRC core and applied groups, we extend MUSIC in several ways and benchmark its use in real applications. One important aspect is to evaluate the numerical methods used when the different applications communicate with each other via MUSIC.
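The numerical questions that arise when two simulators exchange data only at discrete communication points can be illustrated with a minimal sketch. The example below is not the MUSIC API; all names are hypothetical, and the model (two coupled linear relaxations, each integrated by forward Euler at its own time step, with values exchanged once per communication window and held constant in between) is chosen only to show how the choice of communication interval and coupling scheme enters the numerics.

```python
def cosimulate(h_a, h_b, h_comm, t_end):
    """Hypothetical multi-rate co-simulation of dx/dt = -x + y (sim A)
    and dy/dt = -y + x (sim B). Each side integrates with forward Euler
    at its own step (h_a, h_b) and only sees the other side's state at
    multiples of h_comm (zero-order hold between exchanges)."""
    x, y = 1.0, 0.0
    n_a = round(h_comm / h_a)          # sub-steps of sim A per window
    n_b = round(h_comm / h_b)          # sub-steps of sim B per window
    n_windows = round(t_end / h_comm)
    for _ in range(n_windows):
        # values exchanged at this communication point
        x_held, y_held = x, y
        # sim A advances across the window against the held value of y
        for _ in range(n_a):
            x += h_a * (-x + y_held)
        # sim B advances against the held value of x
        for _ in range(n_b):
            y += h_b * (-y + x_held)
    return x, y
```

For this symmetric model both variables relax toward the mean 0.5, and the quantity x + y is nearly conserved; how much it drifts depends on the mismatch between the two integrators and on the length of the communication window, which is exactly the kind of trade-off a coordinator such as MUSIC must expose to its users.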
Bhalla, U.S. Multiscale interactions between chemical and electrical signaling in LTP induction, LTP reversal and dendritic excitability. Neural Netw. 24, 943–949 (2011).
Djurfeldt, M. et al. Run-time interoperability between neuronal network simulators based on the MUSIC framework. Neuroinformatics 8, 43–60 (2010).