3D Histology, Interactive High-resolution Volume Rendering for Pathology
Histology, the microscopic analysis of tissue specimens from the human body, is a cornerstone of medical research. Digitization of the microscopy images opens up exciting possibilities for visualization and image analysis.
A Cut Finite Element Method for Incompressible Two-Phase Navier-Stokes Flows
A Cut Finite Element Method (CutFEM) for the time-dependent Navier-Stokes equations involving two immiscible incompressible fluids is developed. The numerical method accurately approximates a discontinuous pressure and a weakly discontinuous velocity field across evolving interfaces, without requiring the mesh to be fitted to the domain or regularizing the discontinuities.
A key role predicted for fast synaptic plasticity in working memory in the cortex
We tested whether a cortical spiking neural network model with fast Hebbian synaptic plasticity could learn a multi-item WM task (word list learning). The model could indeed reproduce human cognitive phenomena, while being simultaneously compatible with experimental data on structure, connectivity, and neurophysiology of the underlying cortical tissue.
Accelerated sampling of conformational transitions in DNA
Molecular dynamics is a powerful technique to study the behavior of molecules, and in particular bio-molecules such as proteins and DNA. However, the time scales of biologically important events are often much longer than what can be reached with simulations.
AI-centered visual analytics of histology
This project aims to merge AI techniques with visual exploration.
Analytic Imaging Diagnostics Arena (AIDA) funded
AIDA is a cross-disciplinary and cross-sectoral collaboration aiming for large-scale usefulness from Artificial Intelligence (AI) in healthcare.
Are NVIDIA Tensor cores good for HPC?
The NVIDIA Volta GPU microarchitecture introduces a specialized unit, called the Tensor Core, that performs one matrix-multiply-and-accumulate operation on 4×4 matrices per clock cycle.
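To make the operation concrete, each Tensor Core computes a fused D = A×B + C, with half-precision inputs and single-precision accumulation. The NumPy sketch below mimics that numerically (a conceptual illustration only; on the GPU the unit is accessed through, e.g., CUDA's WMMA API, which operates on larger tiles):

```python
import numpy as np

def tensor_core_mma(A, B, C):
    """Conceptual model of one Tensor Core operation: D = A @ B + C,
    with FP16 inputs and FP32 accumulation (a NumPy sketch, not CUDA)."""
    A16 = A.astype(np.float16)   # inputs are stored in half precision
    B16 = B.astype(np.float16)
    # products are accumulated in single precision
    return A16.astype(np.float32) @ B16.astype(np.float32) + C.astype(np.float32)

A = np.ones((4, 4)); B = np.ones((4, 4)); C = np.zeros((4, 4))
D = tensor_core_mma(A, B, C)   # each entry is a sum of four 1*1 products
```

The mixed-precision design is the reason the question in the title is non-trivial for HPC: the speedup comes at the cost of FP16 input precision.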
Atmospheric circulation in a much warmer Earth, simulations using alternative warming scenarios for the Eocene
It is possible to study this period with climate models, but to obtain a reasonable global match between modelled surface temperatures and proxy reconstructions, extremely high atmospheric CO2 concentrations or a reduction in global cloud albedo is needed. In this work, these two methods are examined.
Automatic workflow, data collection, and development of open-data infrastructure
For DCMD activities in data exploration and visualization, we need to address the generation, collection, storage, and organization of data via research on automatic workflows, data collection for materials data, and development of an open-data infrastructure. All activities lead to the insights and software necessary to perform complex simulations within materials physics and molecular chemistry.
Binding sites for luminescent amyloid biomarkers from non-biased molecular dynamics simulations
A visual analytics environment, VIA-MD (visual interactive analysis of molecular dynamics), tailored for large-scale spatio-temporal molecular dynamics simulation data has been developed. A key concept of the environment is to link interactive 3D exploration of geometry with statistical analysis.
Bioinformatics Highlight: Assessing protein mass spectrometry data using Percolator – how to weed out valuable information from the noise
Mass spectrometry (MS) is currently the most effective way to analyze proteins on a large scale, and hence one of the most important tools for answering such questions. However, difficult challenges remain in analysing the wealth of data that MS-based experiments produce.
Bioinformatics Highlight: Protein structure prediction — state-of-the-art methods proven in contests
Several scientists in the Bioinformatics community study and develop methods for protein structure prediction.
Bioinformatics: Computational methods to assess and remedy mapping bias in allele-specific expression and genomic cis-element analysis
Assessing allele-specific expression (ASE) and allele-specific cis-element (ASC) binding or modification from massively parallel sequencing read data is a straightforward way to home in on transcriptional and cis-regulatory variation at the level of single individuals.
Bioinformatics: Computational optimization of mass spectrometry-based proteomics experiments
Mass spectrometry (MS)-based proteomics is currently the most efficient method for large-scale analysis of protein content in biological mixtures. …
Bioinformatics: Deep learning in protein structure predictions
Protein structure prediction is fundamental for our understanding of molecular functions in cells. Central to this are the prediction of contacts between interacting residues and the evaluation of the quality of protein models. We are developing novel deep learning approaches for both of these problems.
Bioinformatics: Development of automated protein family classification using hidden Markov models for functional characterization of proteins
There is a great need to subdivide large protein families into smaller, homogeneous subfamilies, corresponding to functional entities. Hidden Markov models constitute a powerful technique for such subclassification.
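As a sketch of the technique, a hidden Markov model assigns a likelihood to a sequence via the forward algorithm, so sequences can be scored against competing subfamily models. All parameters below are hypothetical toy values, not a real protein profile:

```python
import numpy as np

# Toy 2-state HMM (hypothetical parameters, not a trained protein profile):
# scoring sequences with the forward algorithm, as profile-HMM tools do.
start = np.array([0.6, 0.4])                 # initial state distribution
trans = np.array([[0.9, 0.1],                # transition probabilities
                  [0.2, 0.8]])
alphabet = {"A": 0, "G": 1, "V": 2}          # tiny residue alphabet
emit = np.array([[0.7, 0.2, 0.1],            # state 0 favours alanine
                 [0.1, 0.3, 0.6]])           # state 1 favours valine

def forward_log_likelihood(seq):
    """log P(sequence | model) via the forward recursion."""
    alpha = start * emit[:, alphabet[seq[0]]]
    for ch in seq[1:]:
        alpha = (alpha @ trans) * emit[:, alphabet[ch]]
    return np.log(alpha.sum())

# A sequence matching the model's preferences scores higher:
score_match = forward_log_likelihood("AAGA")
score_other = forward_log_likelihood("VVGV")
```

In practice each subfamily gets its own profile HMM, and a new sequence is assigned to the subfamily whose model gives it the highest likelihood.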
Bioinformatics: High throughput prediction of disease caused by SNPs
Many diseases are caused by single nucleotide polymorphisms (SNPs) that cause an amino acid in a protein to be mutated. However, most SNPs do not cause disease. Today, SNPs are readily detected in large-scale studies of individual genetic variation. Here, we want to develop methods for analysing the molecular consequences of these SNPs and thereby aid in identifying which SNPs are most likely to be disease-causing.
Bioinformatics: Improved scaffolding for genome assembly
A crucial part of genomics is the ability to accurately assemble reads (often short) into larger pieces, so-called contigs. This is still considered difficult, and recent results indicate that there is no single assembler that is good for all data sets. An important step in genome assembly is scaffolding, in which information about read-pairing is used to connect contigs into larger units.
Bioinformatics: Improving the power of statistical tests for genetic interactions
This project aims to improve statistical tests for genetic interactions without main effects. These types of interactions, coined epistatic interactions, are hypothesized to account for most of the phenotypic variation, and are therefore important to study.
Bioinformatics: Inferring functional coupling of proteins using next gen sequencing data
Next-generation sequencing experiments have a very high throughput, producing gigabytes of raw data and posing a major challenge for analysis and processing, but in turn can produce accurate and abundant data. The main goal of the project has been to utilize next-generation sequencing data to infer functional coupling in protein interaction networks. The next-gen data type used in this project was RNA-Seq.
Bioinformatics: Integrating physical interaction networks in the analysis of Complex Diseases
Traditional genome-wide association (GWA) studies have been largely unsuccessful for complex diseases, such as cardiovascular disease. They often fail to replicate their results, and the DNA variants that have been found have only a small effect on the disease. It is widely believed that this is due to non-additive genetic interactions, where the disease depends on the variants at two or more loci. Considering the vast number of possible genetic interactions, the traditional GWA methodology is not viable; as a consequence, most studies ignore genetic interactions.
Bioinformatics: Study of Cardiovascular Diseases
Integrating physical interaction networks in the analysis of Complex Diseases
Improving the power of statistical tests for genetic interactions
Bioinformatics: Using machine learning potentials in conformational sampling and selection of protein structures
Modeling of protein structure is a central challenge in structural bioinformatics, and holds the promise not only to identify classes of structure, but also to provide detailed information about the specific structure and biological function of molecules. This is critically important to guide and understand experimental studies: It enables prediction of binding, simulation, and design for a huge set of proteins whose structures have not yet been determined experimentally (or cannot be obtained), and it is a central part of contemporary drug development.
Bioinformatics: Using predictions to improve predictions of membrane proteins
Machine learning methods have a long history within bioinformatics. Sequence based machine learning methods can roughly be divided into two classes, local and structural. The local methods are trained on information from a fixed length sequence window surrounding a residue. …
Bioinformatics: VMCMC: a graphical and statistical analysis tool for Markov chain Monte Carlo traces
Analysing the output from Markov chain Monte Carlo (MCMC) computations is a crucial step in many scientific investigations today, and in particular in evolutionary studies where MCMC has proved to be a strong and flexible framework. VMCMC is a convenient tool with two aims: to make phylogenetic MCMC trace analysis of multiple experiments convenient, especially in a HPC environment, and to simplify visualisation of single traces.
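A minimal example of the kind of trace statistic such a tool reports is Geweke's convergence diagnostic, which compares the mean of the early part of a chain with the mean of the late part; |z| well above 2 suggests the chain has not converged. This is a simplified sketch that treats the two segments as independent samples, ignoring autocorrelation:

```python
import numpy as np

def geweke_z(trace, first=0.1, last=0.5):
    """Geweke convergence diagnostic for an MCMC trace: z-score of the
    difference between the early-segment and late-segment means.
    (Simplified: within-segment autocorrelation is ignored here.)"""
    a = trace[: int(first * len(trace))]
    b = trace[-int(last * len(trace)):]
    se2 = a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(se2)

rng = np.random.default_rng(1)
trace = rng.normal(size=5000)   # synthetic stand-in for a posterior trace
z = geweke_z(trace)             # near zero for a well-mixed stationary chain
```

For a stationary, well-mixed chain the statistic is approximately standard normal, which is what makes it convenient to plot and threshold across many traces at once.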
Brain network architecture and dynamics of short- and long-term memory
In this project we intend to study cortical network phenomena accompanying brain plasticity effects relevant to short- and long-term memory processes. The overarching aim is to enhance e-science approaches for studying brain networks developed at KTH and KI, and inject the corresponding informatics workflows into the environments at SUBIC. PH plans to advance existing spiking and non-spiking large-scale neural network models to simulate memory phenomena in close collaboration with AL.
Brain-IT Highlight: D1R-Golf signaling modules segregate into compartments
The development of a large signaling model that takes into consideration the existence of at least two D1R-Golf signaling compartments explains the data pattern. …
Brain-IT Highlight: Workflows for the estimation of model parameters
When modeling subcellular signaling pathways, experimental data are integrated into a precise and structured framework from which it is possible to make predictions that could be tested experimentally. …
Brain-like approach to Machine Learning
The main aim of this project is to advance the development of hierarchical brain-like network architectures for holistic pattern discovery, drawing on computational insights into neural information processing in the brain in the context of sensory perception, multi-modal sensory fusion, sequence learning and memory association, among others.
Breakthrough in Big Data: 16X performance gains for Hadoop, delivering over 1.2 million operations per second
At USENIX FAST 2017, researchers from RISE SICS and KTH, in collaboration with Spotify and Oracle, presented a next-generation distribution of the Apache Hadoop File System, called HopsFS, that delivers a quantum leap in both cluster size and throughput compared to Hadoop clusters.
Cancer screening – natural history, prediction and microsimulation
We will continue work on natural history modelling for cervical, breast and prostate cancer. Methods include HPC-intensive calibrations of simulation likelihoods using Bayesian methods and optimisation procedures for expensive or imprecise objective functions (Laure, Jauhiainen at AstraZeneca, Uncertainty Quantification with SeRC-Brain-IT). We will investigate a computational framework for storage and analysis of micro-simulation experiments for calibration and prediction (Laure, Dowling).
Canonical quantum observables for molecular systems approximated by ab initio molecular dynamics
It is known that ab initio molecular dynamics based on the electron ground state eigenvalue can be used to approximate quantum observables in the canonical ensemble when the temperature is low compared to the first electron eigenvalue gap.
In this project, we aim to develop machine learning tools for causal discovery and causal inference, along with tools for visualizing large causal structures to a human to increase the interpretability of the structure.
Characterization of molecular motion in Cryo-EM image reconstruction
In collaboration with the Laboratory of Molecular Biology (Cambridge, UK), the SeRC Molecular Simulation community has continued their work on new reconstruction algorithms for cryo-electron microscopy.
Climate: Computational issues with interaction between the core and parameterized small-scale processes in climate models
Building, e.g., the atmospheric part of a global climate model is done in two major steps. First the core is formulated, i.e. the equation system is defined and numerical methods and a calculation grid for the resolved flow are chosen. Then, standard idealized tests are performed to assess the performance of the core. …
Climate: Direct numerical simulation of cloud turbulence and its interaction with cloud drops
The role played by clouds is fundamental for the atmosphere and the water cycle of the earth. Our knowledge of cloud dynamics is still so poor that it remains a major source of uncertainty in climate predictions and in atmospheric circulation models.
Climate: Ensemble single-column model system
The goal is to develop a user-friendly single-column model of the global climate model EC-Earth that can be run simultaneously at numerous locations through a web interface.
Climate: Treating multiple sea-ice categories in EC-EARTH
EC-EARTH combines the atmosphere model IFS with the ocean model NEMO, with the sea-ice model LIM3 being part of NEMO. An outstanding feature of LIM3, compared to LIM2, the sea-ice model in the previous EC-EARTH version, is the ability to represent multiple sea-ice thickness categories. …
Data exploration and visualization
Method development is needed into data exploration and visualization of the generated materials data. This is a new and exciting field of multidisciplinary research where data science meets computational materials physics. Specific activities in this group include:
Data-driven brain modeling highlights the importance of the activation patterns of single inhibitory interneurons in the brain
Dendritic plateau potentials generated by the activation of clustered excitatory inputs play a crucial role in neuronal computation and are involved in sensory perception, learning, and memory.
In climate science, we will use datasets collected from existing external projects as well as public datasets to build prediction models that out-perform existing analytical and simulation models.
In this project, we will develop deep learning methods using biological data. In particular, we will address the protein structure prediction problem, which involves predicting the structure from the amino acid sequence, predicting interactions with other proteins and peptides, evaluating model quality, and predicting amino acid contacts.
In this project, we propose to develop and use generic Deep Learning techniques that are able to model physical (simulated and/or measured) dynamics.
Development of novel modeling techniques
A need for high-quality, large-volume materials data requires basic research into the development of novel modeling techniques. This work concerns method development with increased accuracy and efficiency, including dynamical mean-field theory (DMFT), spin dynamics, time-dependent response theory (TDRSP), and molecular dynamics (MD).
Direct Numerical Simulations of cloud droplet – turbulence interactions
The possible enhancement of droplet growth by turbulence is investigated. The first step in this project is to include droplet microphysics, condensation and collection processes, in a DNS code.
e-Spect: In silico spectroscopy of complex molecular systems
Theoretical simulations are essential for the microscopic understanding of spectroscopic data that enable the design of biomarkers and materials. Modeling these complex systems requires a combination of molecular dynamics (MD) and quantum mechanics/molecular mechanics (QM/MM) approaches.
Electronic Highlight: Development of the Dalton program
Two powerful molecular electronic structure programs, Dalton and lsDalton, provide an extensive functionality for the calculation of molecular properties. …
Electronic Highlight: Hyperfine Splitting and Room-Temperature Ferromagnetism of Ni at Multimegabar Pressure
Observed magnetic hyperfine splitting confirms the ferromagnetic state of Ni up to 260 GPa, the highest pressure where magnetism in any material has been observed so far. …
Electronic Highlight: Machine Learning Energies of 2 Million Elpasolite (ABC2D6) Crystals
Elpasolite is a common crystal structure. We have developed and trained a machine learning model to predict formation energies of all 2M pristine ABC2D6 elpasolite crystals that can be made up from main-group elements (up to bismuth). …
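As a sketch of the underlying idea (with a synthetic one-dimensional descriptor and target, not elpasolite data), kernel ridge regression learns a smooth map from a crystal descriptor to an energy:

```python
import numpy as np

# Kernel ridge regression toy: learn descriptor -> "formation energy".
# Descriptor and target are synthetic stand-ins, not elpasolite data.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 1))                        # toy descriptor
y = np.sin(4 * X[:, 0]) + 0.01 * rng.standard_normal(200)   # toy "energy"

def gaussian_kernel(A, B, sigma=0.2):
    """Gaussian (RBF) kernel matrix between two sets of descriptors."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

lam = 1e-3                                   # ridge regularization
K = gaussian_kernel(X, X)
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)   # KRR weights

def predict(Xnew):
    return gaussian_kernel(Xnew, X) @ alpha

Xtest = rng.uniform(0, 1, size=(50, 1))
max_err = np.abs(predict(Xtest) - np.sin(4 * Xtest[:, 0])).max()
```

The appeal for materials screening is that once trained, predictions cost a kernel evaluation per candidate instead of a full electronic-structure calculation, which is what makes scanning 2M crystals feasible.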
Electronic Highlight: On the Influence of Water on the Electronic Structure of Firefly Oxyluciferin Anions
Combining molecular dynamics simulations and time-dependent density functional theory calculations indicates that the preferred binding site for the water molecule is the phenolate oxygen of the anion. …
Electronic: Ab initio simulation of strongly correlated materials: methods and applications
During the last several decades, the ab initio approach to simulating materials has proven very fruitful. This approach is based on density functional theory (DFT) and has allowed theoretical modeling of a wide range of properties of solid-state materials without the use of any adjustable parameters.
Electronic: Atomistic spin dynamics
The project has the overall objective to study magnetization dynamics on atomic scale using a combination of electronic structure methods and atomistic simulations. Of particular interest are materials aimed for future applications in information technology, like memory devices using spin-transfer torque and nanomagnets for storage. …
Electronic: Exploitation of heterogeneous processor-coprocessor (CPU/CCPU) environments for high level ab initio electronic structure calculations
This project aims to boost performance of quantum chemistry programs by developing a library for evaluation of various algorithms on Intel Xeon Phi coprocessors, and in this way exploit computational resources provided by heterogeneous processor/coprocessors (CPU/CCPU) systems.
Electronic: Method development for calculation of transport and magnetic properties
Computing spin and charge transport properties from first principles is an increasingly important area of research aiming at finding solutions to the world’s increasing demand for clean energy and energy-efficient electronics. …
Electronic: Method development for multiscale computations
By using multi-scale modeling one goes through several characteristic length and time scales in which different physical models of varying levels of theoretical sophistication are applied and tied together in such a way that information is passed between the levels. …
Electronic: Method development in studies of charge transport in organic materials
Ab initio and semi-classical calculations are performed to investigate the electronic structure, molecular structure and charge transport in organic crystals, polymers, and bulk-heterojunction systems.
Electronic: Morphology and electro-optical properties of materials in solar cell
The interest in conjugated systems, whose particular photophysical characteristics emanate from a delocalized cloud of π electrons, comes among other things from the possibility of incorporating them as active components in a wide range of optoelectronic devices, such as organic light-emitting diodes, solar cells and photovoltaics, field-effect transistors, and (bio)chemical sensors. …
Electronic: Multilayer parallelization of DALTON quantum chemistry program
The main objective of this project is to enhance the performance of the DALTON quantum chemistry program on modern high performance computer systems with multicore CPUs, and enable efficient computations of electronic structure and molecular properties in massively parallel runs (beyond 500 CPUs).
Electronic: Multiscale Modeling Methods for Biomolecules
The main objective of this project is to develop novel hybrid quantum mechanics/molecular mechanics (QM/MM) methods for computation of various spectroscopic properties of the biomolecules.
Electronic: Nucleation, stability and light-absorption of pollutant particles in the atmosphere and its climate implications
The impact of aerosols on climate can be categorized into direct and indirect effects. The direct effect includes scattering and absorption of radiation due to the presence of, e.g., fossil-fuel black carbon, while the indirect effect arises from modification of cloud properties such as albedo (reflectivity) induced by salts, inorganic acids and organic compounds that occur in the atmosphere. …
Electronic: QMMM in the complex TDDFT representation
Since the development of the first organic solar cell in 1983, with a power conversion efficiency below 1%, this area, also called plastic electronics, has been growing by leaps and bounds.
Electronic: Spin molecular electronics
A very intriguing prospect is to generalize the thermoelectric concept to spin voltages and spin currents. This leads to the so-called spin Seebeck effect, discovered in 2008. The fact that thermoelectric effects are spin dependent has been known for a very long time but research in the area has been impeded due to experimental difficulties. …
Electronic: Thermoelectric materials
Thermoelectric effects refer to the conversion of a temperature gradient to electric energy or vice versa. At the atomic scale, an applied temperature gradient causes charge carriers in the material to diffuse from the hot side to the cold side. As a result, a potential difference builds up. …
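In formula form, the open-circuit voltage over a temperature difference ΔT is V = S·ΔT, where S is the Seebeck coefficient, and materials are compared via the dimensionless figure of merit ZT = S²σT/κ. A quick numerical illustration with roughly Bi2Te3-like values (illustrative, not measured data):

```python
# Back-of-the-envelope thermoelectric relations:
#   open-circuit voltage:  V  = S * dT
#   figure of merit:       ZT = S^2 * sigma * T / kappa
S = 200e-6        # Seebeck coefficient [V/K]   (illustrative value)
sigma = 1.0e5     # electrical conductivity [S/m]
kappa = 1.5       # thermal conductivity [W/(m K)]
T = 300.0         # mean operating temperature [K]
dT = 50.0         # temperature difference across the leg [K]

V = S * dT                      # 10 mV open-circuit voltage
ZT = S**2 * sigma * T / kappa   # dimensionless figure of merit
```

The S² dependence in ZT is why even modest improvements in the Seebeck coefficient, without degrading the conductivities, matter so much for candidate materials.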
Electronic: Visualization tool for electronic structure calculations
Visualization tools are important not only for interpreting and presenting data, but also for analyzing problems and choosing a strategy when approaching a complex problem. Together with Anders Ynnerman (Professor) at LiU, we aim to develop an electronic structure visualization table, in line with the virtual autopsy table developed by Ynnerman.
Enhanced skyrmion stability due to exchange frustration
We show that energy barriers and critical fields of skyrmion collapse as well as skyrmion lifetimes are drastically enhanced due to frustrated exchange and that antiskyrmions are metastable.
A better understanding of the processes in atmospheric boundary layers (ABL) and efficient simulations are required, e.g., to improve climate predictions and simulations of wind parks.
Extended spin model in atomistic simulations of alloys
The proposed model strives to find the right compromise between accuracy and computational feasibility for accurate modeling, even for complex magnetic alloys and compounds.
Feature based exploration of large-scale turbulent flow simulations
This project will enable in-situ detection and tracking of flow structures, which will be used as focus regions to build a multi-resolution description of the data. Interactive visualization methods from volume and flow visualization will be adapted to the new multi-resolution scheme.
FLOW and NA: Quadrature rules for boundary integral methods applied to Stokes flow
This project focuses on the design of accurate quadrature rules for functions with isolated singularities. …
FLOW Highlight: Fully resolved simulations of fibers with a significant size in turbulent flows
The interaction between fibers and carrier fluid is modeled through an external boundary force method. …
FLOW Highlight: Natural Laminar Flow
Direct numerical simulations of flow over wings for understanding the physics of modern methods of transition control. …
FLOW Highlight: Recurring intense bursts of turbulence in rotating channel flow simulations
Carrying out extensive simulations and observing for the first time intense recurrent bursts of turbulence in rotating channel flow. Rotating channel flow is therefore suggested as a prototype for future studies. …
FLOW Highlight: Simulations of turbulent boundary layers at high Reynolds numbers
The study of simplified canonical flows allows for deducing important properties of the physics. Therefore, a number of canonical flow cases have emerged as standard model problems to study wall-bounded turbulence. …
FLOW Highlight: TRITOS: Transitions and Turbulence Of complex Suspensions
Investigating the mechanisms by which the system microstructure determines the macroscopic flow properties provides valuable insight into the nature of flowing suspensions, and also leads to new ways of modelling and controlling it.
FLOW Highlight: Universal Scaling Laws for Dense Particle Suspensions in Turbulent Wall-Bounded Flows
We examine by means of large-scale interface-resolved numerical simulations the macroscopic behavior of dense suspensions of neutrally-buoyant spheres in turbulent plane channel flow. …
FLOW: Adjoint-based linear and nonlinear optimisation in wall-bounded shear flow
FLOW: An accurate integral equation method for simulating multi-phase Stokes flow
We have developed a numerical method based on an integral equation formulation for simulating drops in viscous fluids in the plane.
FLOW: Arctic Sea ice in warm climates
FLOW: ASTRID – ”A STudy of Rotation In Developing boundary-layer flows”: Direct numerical simulations
The main aim of the work is to increase the knowledge of the flow over a rotating disk, also called the von Kármán flow.
FLOW: Bifurcation and stability analysis of a jet in crossflow
This project builds on the work of Refs.  and , where the Direct Numerical Simulation (DNS) of a jet in crossflow was studied in detail and a stability analysis was performed. …
FLOW: Constructing non-reflecting boundaries using multiple Penalty terms
For any difference method, the boundary conditions must be implemented so that the problem is stable, which can be done by adding a single penalty term at the first and/or last grid point for the one-dimensional problem. …
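A minimal sketch of the penalty idea for 1D advection with a second-order summation-by-parts (SBP) operator: the inflow condition is imposed weakly through a single penalty (SAT) term at the first grid point, and choosing the penalty strength tau >= a/2 yields an energy estimate and hence stability. This is illustrative code under those standard SBP-SAT assumptions, not the project's actual scheme:

```python
import numpy as np

# u_t + a u_x = 0 on [0,1], inflow data u(0,t) = g(t) = 0.
n = 101; h = 1.0 / (n - 1); a = 1.0; tau = a   # tau >= a/2 for stability

# Diagonal SBP norm H and second-order first-derivative operator D
Hdiag = h * np.ones(n); Hdiag[0] = Hdiag[-1] = h / 2
D = np.zeros((n, n))
D[0, :2] = [-1 / h, 1 / h]                     # one-sided at boundaries
D[-1, -2:] = [-1 / h, 1 / h]
for i in range(1, n - 1):                      # central in the interior
    D[i, i - 1], D[i, i + 1] = -1 / (2 * h), 1 / (2 * h)

def rhs(u, g=0.0):
    du = -a * (D @ u)
    du[0] -= tau / Hdiag[0] * (u[0] - g)       # the single SAT penalty term
    return du

x = np.linspace(0, 1, n)
u = np.exp(-200 * (x - 0.5) ** 2)              # pulse advecting to the right
E0 = np.sum(Hdiag * u**2)                      # discrete energy ||u||_H^2
dt = 0.2 * h
for _ in range(500):                           # classical RK4 to t = 1
    k1 = rhs(u); k2 = rhs(u + dt / 2 * k1)
    k3 = rhs(u + dt / 2 * k2); k4 = rhs(u + dt * k3)
    u = u + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
E1 = np.sum(Hdiag * u**2)                      # energy is non-increasing
```

The point of the weak imposition is exactly the energy estimate: the penalty adds a dissipative term at the boundary instead of overwriting the solution value, which is what makes the stability proof carry over to the discrete scheme.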
FLOW: Development of subgrid models and application of large-eddy simulations
FLOW: DNS of high-Reynolds number turbulent pipe flow
Fully developed incompressible turbulent flow through a smooth pipe is a canonical problem in fluid mechanics. …
FLOW: Large-scale simulations of turbulent flows, including transition and separation
FLOW: Methods for Lagrangian particles in complex geometries
Fluid flows comprise a wide variety of complex phenomena which, among other things, depend on the geometrical configuration of the problem and the flow phases, as well as on the Reynolds number of the flow (Re).
FLOW: Numerical experiments in a virtual wind tunnel
Being at the forefront of computational science in the coming years means effectively utilizing millions of processors for simulations of physical phenomena. Such capabilities will allow accurate large scale numerical simulations to take a role analogous to that of physical experiments. This paradigm shift has tremendous implications for many areas, including computation of aeronautical flows.
FLOW: Numerical methods for fluid-structure interaction aero-acoustics
FLOW: Numerical simulation of generation of sound in separated internal flows
FLOW: Numerical simulation of swimming micro-organisms in suspensions
FLOW: Numerical treatment of surfactants in multiphase flow
In this project we have developed a numerical method for two-phase flow with insoluble surfactants and contact line dynamics in two dimensions. …
FLOW: Parallel adaptive FEM
FLOW: Particle-laden transitional flows; small and finite-size particles
FLOW: Rotating channel flow at high Reynolds number
FLOW: Simulation and modelling of turbulent combustion
FLOW: Simulation of free-boundary problems and phase change
FLOW: Simulations of quasi-geostrophic turbulence
Two codes have been developed and implemented for use on massively parallel supercomputers to simulate two-dimensional and quasi-geostrophic turbulence. …
FLOW: Simulations of turbulent boundary layers with passive scalars
The study of turbulent boundary layers with passive scalar transport has been an important research topic for the last few decades, since such a problem involves two fundamental aspects: the development of turbulence in a thin region adjacent to a solid wall, and the transport and diffusion of passive species by turbulent motion. …
FLOW: Spectral-element simulations of turbulent separation
Fluid flows are complicated and nonlinear, which calls for accurate numerical treatment. …
FLOW: Stratified wall-bounded flows
FLOW: Studies of transition in Couette flow
Plane Couette flow, the flow between two parallel walls moving in opposite directions, is the simplest canonical example of the effect of shear on a viscous fluid. …
FLOW: Tools for Visualisations with Simson
FLOW: Wall-bounded turbulence: Re-visit using numerical experiments
In our everyday lives, we are constantly surrounded by fluids, be it gaseous air or liquid water.
FLOW: WALLPART – Inertial Particles in developing wall turbulence
Traditionally, direct numerical simulations (DNS) of turbulent flows are performed in simplified computational domains that are characterised by periodic boundary conditions in all three directions.
GROMACS & RELION
Molecular simulation has evolved into a standard technique, employed in virtually all high-impact publications on, e.g., new protein structures. The main bottleneck for scaling in GROMACS is the 3D FFT used in particle-mesh Ewald (PME) electrostatics. Since PME is very fast and used by MD codes worldwide, it is worth investigating whether the communication overhead can be lowered. This will be done in collaboration with PDSE (see the 3D-FFT sub-project). For extreme scaling, we will also investigate the fast multipole method (FMM), since it has better scaling complexity. Energy conservation has long been a problem for FMM; this has now been solved in collaboration with the numerical analysis community, and we will integrate the ExaFMM code of Rio Yokota (Tokyo Tech) into GROMACS.
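The reciprocal-space step of PME amounts to solving a periodic Poisson equation on a mesh with a 3D FFT, and it is this FFT's all-to-all communication that limits scaling. A minimal sketch of that step (real PME additionally spreads point charges onto the mesh with B-splines and adds a short-range real-space sum):

```python
import numpy as np

def poisson_fft(rho, L=1.0):
    """Solve lap(phi) = -rho on a periodic box via a 3D FFT: the
    reciprocal-space step of PME-style electrostatics. Minimal sketch;
    real PME also handles charge spreading and the real-space sum."""
    n = rho.shape[0]
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)     # angular wavenumbers
    kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[0, 0, 0] = 1.0                              # avoid division by zero
    phi_hat = np.fft.fftn(rho) / k2                # phi_hat = rho_hat / k^2
    phi_hat[0, 0, 0] = 0.0                         # fix the free constant
    return np.real(np.fft.ifftn(phi_hat))

# Spectral check: rho = cos(2*pi*x/L)  =>  phi = rho / (2*pi/L)^2
n, L = 16, 1.0
x = np.arange(n) * L / n
rho = np.cos(2 * np.pi * x / L)[:, None, None] * np.ones((1, n, n))
phi = poisson_fft(rho, L)
```

Each forward/inverse 3D FFT requires transposing the distributed grid between all ranks, which is why reducing this communication (or replacing PME by FMM at extreme scale) is the focus of the sub-project.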
Improved model quality assessment using deep neural networks
Protein structure modeling is crucial for a detailed understanding of the biological function at the molecular level (Wallner & Elofsson, 2003; Wang et al., 2009). In 2003 we developed the first single-model quality estimation program…
Integrated analysis of cardiac function
Recent developments within image-based simulation and imaging of cardiac function can potentially advance diagnosis and treatment optimization of a range of cardiac diseases.
Iterative eigenvalue algorithms in unbounded domains
In this project, we take an approach where the domain is treated as an infinite domain, leading to a nonlinear eigenvalue problem. New approaches are developed from a numerical linear algebra perspective, e.g. using Arnoldi's method and modern iterative methods for linear systems, as well as from the perspective of numerical methods for partial differential equations.
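As an illustration of the linear algebra machinery involved, below is a plain Arnoldi iteration on a standard eigenvalue problem; the nonlinear eigenvalue problems in the project require extensions of this idea, so this is only the basic building block, with a synthetic test matrix:

```python
import numpy as np

def arnoldi(A, b, m):
    """m steps of the Arnoldi iteration: builds an orthonormal Krylov
    basis Q and a small Hessenberg matrix H with A @ Q[:, :m] = Q @ H.
    Eigenvalues of H[:m, :m] (Ritz values) approximate outer eigenvalues."""
    n = len(b)
    Q = np.zeros((n, m + 1)); H = np.zeros((m + 1, m))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(m):
        w = A @ Q[:, j]
        for i in range(j + 1):                 # modified Gram-Schmidt
            H[i, j] = Q[:, i] @ w
            w = w - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(w)
        Q[:, j + 1] = w / H[j + 1, j]
    return Q, H

# Synthetic matrix with one well-separated eigenvalue near 150
rng = np.random.default_rng(0)
A = np.diag(np.concatenate([np.arange(1.0, 100.0), [150.0]]))
A += 1e-3 * rng.standard_normal((100, 100))
Q, H = arnoldi(A, rng.standard_normal(100), 25)
ritz = np.linalg.eigvals(H[:25, :25])
lam_max = max(ritz.real)                       # converges to ~150
```

Only matrix-vector products with A are needed, which is what makes Krylov methods attractive when A comes from a discretized (possibly unbounded-domain) operator that is never formed explicitly.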
Large scale neural network models of the basal ganglia – an in silico tool for understanding the healthy and diseased system
We built a BG network model consisting of 80 000 spiking neurons (Lindahl and Hellgren Kotaleski, 2017) and used it to better understand how network parameters contribute to function as well as network dynamics, and how functionality can be recovered in the disease state.
Within the scope of the DataScience MCP, we aim to work on a radical software platform architecture able to close the gap between recording data and acting on it.
MCP Brain-IT: ’Embedding’ – Investigate activation of receptor-induced cascades in the context of an active neuronal network
MCP Brain-IT: Comparative analysis (data/models) of the dynamics of networks
MCP Brain-IT: Computational modeling of intracellular signaling cascades in the basal ganglia
The striatum is the input layer of the basal ganglia, a set of ancient brain nuclei involved in the control of movements, action selection, reinforcement learning, and in pathologies such as drug addiction and Parkinson’s disease. The striatum receives several chemical signals, not only through projections from the cortex (glutamate) and midbrain (dopamine) but also from striatal interneurons like TAN (acetylcholine) and nNOS (NO).
MCP Brain-IT: Implementing models in hardware
MCP Brain-IT: Multiscale modeling using the MUSIC tool
To understand the brain, there is a need to study multiscale phenomena and to detail out how phenomena at one level of organization are affected by the ongoing brain activity at other levels. Computational modeling and simulations provide an important approach in these attempts. …
MCP Brain-IT: Parameter estimation, uncertainty analysis and global sensitivity analysis of intracellular neuronal models
Within this project we develop a workflow combining uncertainty analysis with global sensitivity analysis and apply this on intracellular models of synapses.
MCP eCPC: eCPC WP1: Platform and infrastructure development
This project will develop an extensible platform capable of integrating predictive models, data, and simulations developed in eCPC. …
MCP eCPC: eCPC WP2: Managing sensitive data in distributed environments
We will develop a secure and scalable infrastructure to store and analyze sensitive and large scale data in the domain of life science, with a focus on providing resources for the eCPC project. …
MCP eCPC: eCPC WP3: Cancer microsimulation
This project will develop a microsimulation engine with the initial endpoints breast and prostate cancer. …
MCP eCPC: eCPC WP4: Cancer systems biology
This project will develop models for eCPC based on a systems biology approach. …
MCP eCPC: eCPC WP5: Prediction and screening models
This project will develop predictive and screening models for breast and prostate cancer on several levels. …
Medical image analysis and deep learning, with applications to prostate biopsies and mammograms
The ability to digitise large quantities of medical images together with recent progress in the area of deep learning and stochastic modelling of highly structured systems offers an opportunity to change and improve diagnostic procedures for screening.
Molecular Highlight: Accelerating cryo electron microscopy refinement
Over the past years cryo electron microscopy has become a dominant technique for determining the structure of large biomolecules. …
Molecular Highlight: Accurate predictions of GPCR-drug complexes
SeRC researchers have developed methods that enable accurate prediction of receptor-drug interactions at the atomic level. …
Molecular Highlight: The world’s fastest software for Molecular Dynamics on CPUs & GPUs
The increase in computing power allows part of (bio)molecular experimental work to be replaced by simulations. However, designing software that leverages the full power of modern (super)computers is becoming ever more challenging, as hardware continuously becomes more parallel and heterogeneous. …
Molecular Highlight: Unraveling the strokes of ion transporters in computers
As highlighted by the 2013 Nobel Prize for chemistry, life science and biomolecular modeling are some of the most important applications for molecular simulation. By simulating the motions of membrane protein transporters in close connection with experiments, we have been able to explain fundamental mechanisms of nerve signal propagation both inside and between cells. …
Molecular models of the skin from cemovis data and simulations of permeation
In a collaboration between SeRC researchers at SU, KTH, and KI as well as ERCO Pharma AB, we have developed new computational methods to fit molecular data to cryo-EM image data of vitreous sections (CEMOVIS) of skin.
Molecular: Algorithms for molecular dynamics on heterogeneous architectures
With molecular dynamics, molecular processes can be followed in atomistic detail, something which is difficult or impossible to achieve with experimental techniques. But obtaining this information comes at a high computational cost. The equations of motion of hundreds of thousands of atoms need to be integrated for billions of time steps, which can mean months of simulation time. Gains in computational efficiency are therefore highly beneficial for the molecular simulation community, not only within SeRC, but worldwide. …
Molecular: Computational studies of membrane protein-ligand interactions
The goal of our research is to improve atomic-level understanding of receptor-ligand interactions using computational models. Using methods such as molecular dynamics simulations, molecular docking, and homology modeling we model how small molecules interact with proteins and thereby modulate their function. …
Molecular: Distributed computing simulation
Compute resources have grown vastly in recent years, yet they are often underutilized, while many problems could put them to good use. As an example from the field of molecular dynamics, adaptive sampling approaches such as Markov state modeling or free energy calculations consist of many short (100–1000) simulations to gather statistics, followed by iterations of adaptive sampling that guide the simulations in the next iteration. Although the workflow is simple to define, these types of problems can easily utilize thousands of cores, generate massive amounts of data, and require something more than a queue. …
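The loop described above — many short runs, pooled statistics, then re-seeding — can be sketched in a few lines. This is a deliberately toy illustration (a 1-D random walk stands in for an MD engine, and count-based re-seeding is just one possible strategy); all names and parameters are hypothetical:

```python
import random
from collections import Counter

def short_simulation(seed_state, n_steps=100):
    """Toy stand-in for one short MD run: a 1-D random walk."""
    state, traj = seed_state, []
    for _ in range(n_steps):
        state += random.choice((-1, 1))
        traj.append(state)
    return traj

def adaptive_sampling(n_rounds=5, sims_per_round=50):
    counts = Counter()
    seeds = [0] * sims_per_round          # first round: all runs start at state 0
    for _ in range(n_rounds):
        for s in seeds:                   # the "many short simulations" phase
            counts.update(short_simulation(s))
        # re-seed the next round from the least-visited states so far,
        # pushing exploration toward poorly sampled regions
        rarest = [state for state, _ in counts.most_common()][::-1]
        seeds = (rarest * sims_per_round)[:sims_per_round]
    return counts

counts = adaptive_sampling()
print(len(counts), "distinct states visited")
```

In a production workflow each `short_simulation` call would be an independent job on a cluster, which is exactly why such problems need workflow tooling beyond a plain batch queue.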
Molecular: Efficient free energy calculations for biomolecular applications
Free energy calculations are an important task in molecular simulations, since the free energy is probably the most obvious property that can also be obtained directly in the lab, without several intermediate steps of interpreting raw data. Typical examples include the solubility of small compounds in water, the binding energy of a drug to a protein, or how conformational transitions in macromolecules are affected by changes in the environment or ligand binding. …
Molecular: Hierarchical Multiscale Modeling with a new class of force fields
We propose a novel methodology to produce a new class of force fields (FF) for predictive molecular simulations and computer modeling, based on large-scale ab initio computer simulations (with no empirical input), combined with mining of structural databases such as the CSD and PDB. The FFs are effective potentials, obtained with an inverse Newtonian scheme developed by us, called Inverse Monte Carlo (IMC). …
Molecular: Markov state models for simulation
Molecular dynamics simulation has evolved from a severely limited esoteric method into a cornerstone of many fields, in particular structural biology, where it is now just as established as NMR or X-ray crystallography. To achieve high performance, the simulations are typically run on massively parallel computers using domain decomposition of the calculations, and for large enough systems (hundreds of millions of particles) this can scale to very large machines. …
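As a minimal illustration of the Markov state model idea (standard count-based estimation on a toy trajectory, not this project's actual pipeline), a transition matrix can be estimated from a discretized trajectory like this:

```python
from collections import defaultdict

def transition_matrix(traj, lag=1):
    """Row-stochastic transition matrix estimated from a discrete
    state trajectory at a given lag time (simple count-based MSM)."""
    counts = defaultdict(lambda: defaultdict(int))
    states = sorted(set(traj))
    for a, b in zip(traj, traj[lag:]):   # count observed transitions a -> b
        counts[a][b] += 1
    T = {}
    for s in states:
        tot = sum(counts[s].values()) or 1
        T[s] = {t: counts[s][t] / tot for t in states}
    return states, T

# toy two-state trajectory: the system mostly stays put, occasionally hops
traj = [0] * 10 + [1] * 5 + [0] * 10 + [1] * 5
states, T = transition_matrix(traj)
print(T[0][0], T[0][1])   # 0.9 0.1
```

Kinetic quantities such as relaxation time scales then follow from the eigenvalues of this matrix; real MSM workflows estimate it from many short simulations rather than one long one.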
Molecular: Molecular processes at the three-phase contact line
This project is about gaining fundamental understanding of the processes governing the dynamic wetting behavior of droplets on solid surfaces. The dynamics of wetting of a solid by a liquid plays a crucial role in many processes, both in nature and in industry. Despite significant efforts in this field, the exact mechanism of contact line advancement is still not clear. Volume-wise, most processes can be described by continuum models (e.g. Navier-Stokes). …
Molecular: New state of the art hardware technologies and computational techniques to boost molecular simulations
Recent hardware developments have led to an extensive availability of multi-core CPUs and dedicated accelerator processors such as graphical processing units (GPUs). Using these devices, massive parallelism can be implemented at appreciably low cost, producing an acceleration in computational performance that can be very high for several applications. …
Molecular: Novel approaches for the evaluation of long range electrostatic interactions in MD
The calculation of electrostatic interactions is by far the most time-consuming part of molecular computer simulations, due to their long-ranged nature. In the last few years, some promising algorithms to treat long-range interactions based on non-uniform FFTs have been proposed. …
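The splitting that underlies Ewald-type treatments of long-range electrostatics (the standard textbook form, not specific to this project) separates the Coulomb kernel into a short-range part summed directly in real space and a smooth long-range part evaluated in Fourier space, which is where (non-uniform) FFTs enter:

```latex
\frac{1}{r}
  \;=\; \underbrace{\frac{\operatorname{erfc}(\xi r)}{r}}_{\text{short range: direct sum in real space}}
  \;+\; \underbrace{\frac{\operatorname{erf}(\xi r)}{r}}_{\text{long range: smooth, evaluated in Fourier space}}
```

The splitting parameter \(\xi\) trades work between the two sums; the smooth part decays fast in Fourier space, which is what makes FFT-based evaluation efficient.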
Molecular: Parallel sampling of functional motions in proteins
Molecular dynamics can reveal molecular motion in atomistic detail, but this comes at a high computational cost. Billions of time steps of femtoseconds are required to reach biologically relevant time scales of microseconds. For particular scientific problems smarter sampling could accelerate sampling by orders of magnitude, thereby enabling new problems to be tackled. …
Molecular: Spectrally accurate fast Ewald summation
The goal of this project is to develop spectrally accurate and fast Ewald summation methods – suitable for parallel computations – for various problems in molecular dynamics and fluid mechanics.
Molecular: Systematic coarse grained modelling of DNA and proteins using the Newton Inversion method
The development of coarse-grained (CG) models of large biological molecules for efficient and accurate large scale molecular simulations is rapidly gaining increasing interest. Such simplified models allow studies across length and time scales not amenable at a detailed atomistic resolution. …
Multi-scale simulations of synaptic plasticity
In this project, models of subcellular signalling cascades important for synaptic plasticity (see e.g. Nair et al, 2015) will be further developed and then challenged in co-simulations. We will explore how to extract phenomenological and simplified activity-dependent synaptic plasticity rules and neuromodulatory effects by considering multi-scale models of a neural network.
NA Highlight: Multi-scale methods for wave propagation
A new type of multi-scale numerical method for simulation of wave problems has been developed and analyzed. …
NA Highlight: Numerical methods for molecular dynamics
Langevin dynamics has been derived from the more fundamental Ehrenfest dynamics. …
NA Highlight: Spectrally accurate fast Ewald summation
A new highly accurate method for the evaluation of forces resulting from electrostatic interactions has been developed. …
NA Highlight: Structured iterative methods for waveguide eigenvalue problems
In the field of electromagnetics, certain types of wave propagation can be modeled with waveguides, and can be analyzed by discretizing an associated partial differential equation. …
NA: Reconstruction methods for very noisy Electron Tomography data
Development of numerical methods for inverse problems in structural biology. …
We will focus on two aspects in further developing the Nek5000 code within SESSI. First, we will port Nek5000 to accelerators using OpenACC and CUDA. We will continue the effort of programming Nek5000 to use accelerators for batched small matrix-matrix multiplication, which is the main computational kernel affecting Nek5000 performance. This work will also include optimization, with the possibility of using CUDA in combination with OpenACC, and improving the efficiency of data movement between host and GPU memories in the GS operator code. This work will be done in collaboration with the EC-funded exascale EPiGRAM-HS project, which is led by PDC, and with researchers at Argonne National Laboratory. Second, we will consider new formulations of the compute- and communication-intensive kernels of Nek5000, including the main communication library gslib. We continue our work on integrating one-sided communication primitives into this kernel via UPC, a PGAS programming system that takes advantage of modern network hardware support for efficient one-sided communication. This builds on the expertise of Niclas Jansson, who developed an initial proof of concept of such new software. Features of new languages will also be used to overlap computation and communication by re-organising the flow of the communication.
Non-conformal simulations in the high-order code Nek5000
Our continuing work on extending the open-source code Nek5000, based on the high-order spectral element method, has now reached a level of maturity such that we can – for the first time – perform simulations of turbulence in complex geometries.
Numerical Analysis: A fast summation method for fiber simulations
A numerical method for large scale simulations of fibers immersed in a Stokesian fluid has been developed. …
Numerical Analysis: Adaptive methods for subsurface flow
This project focuses on adaptive finite element approximations and a posteriori error estimates for PDE with random data. One would like to model subsurface fluid flow and transport to understand, test, and verify predictions for a variety of applications, including: the propagation of contaminants and pollutants in groundwater, carbon sequestration in deep saline aquifers, and flow in composite biological materials. …
Numerical Analysis: Adaptive multilevel Monte Carlo simulation
Stochastic differential equations (SDEs) are non-deterministic models of natural phenomena with uncertainty, such as micro-scale particle dynamics and the evolution of financial assets. …
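A minimal, non-adaptive multilevel Monte Carlo (MLMC) sketch for a toy SDE — geometric Brownian motion discretized with Euler-Maruyama — shows the telescoping structure that adaptive MLMC methods build on. Parameters and the fixed per-level sample counts are illustrative; a real (adaptive) MLMC method chooses the sample counts per level from estimated variances:

```python
import math, random

def mlmc_level(l, n_paths, T=1.0, mu=0.05, sigma=0.2, s0=1.0, m=2):
    """Monte Carlo estimate of E[P_l - P_{l-1}] for P = S_T, with the
    fine path (m**l steps) and coarse path (m**(l-1) steps) driven by
    the SAME Brownian increments so the correction has small variance."""
    nf = m ** l
    dt_f = T / nf
    total = 0.0
    for _ in range(n_paths):
        sf = sc = s0
        dw_coarse = 0.0
        for step in range(nf):
            dw = random.gauss(0.0, math.sqrt(dt_f))
            sf += mu * sf * dt_f + sigma * sf * dw      # fine Euler step
            dw_coarse += dw
            if l > 0 and (step + 1) % m == 0:           # coarse step uses summed increments
                sc += mu * sc * (m * dt_f) + sigma * sc * dw_coarse
                dw_coarse = 0.0
        total += sf - (sc if l > 0 else 0.0)
    return total / n_paths

# telescoping sum: E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}]
random.seed(42)
estimate = sum(mlmc_level(l, n_paths=2000) for l in range(4))
print("MLMC estimate of E[S_T]:", estimate)   # exact value is exp(mu*T) ~ 1.051
```

Most of the work goes into the cheap coarse levels; the expensive fine levels only need enough samples to resolve the small corrections, which is the source of the MLMC cost savings.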
Numerical Analysis: An accurate integral equation method for simulating multi-phase Stokes flow
We have developed a numerical method based on an integral equation formulation for simulating drops in viscous fluids in the plane. …
Numerical Analysis: Calibration in mathematical finance
The calibration problem is posed as an optimization problem under PDE constraints, and solved using e.g. techniques from optimal control. …
Numerical Analysis: Coarse graining of stochastic differential equations with application to wireless channel modeling
Stochastic modeling is an essential tool for studying statistical properties of wireless channels. In Multipath Fading Channel (MFC) models, the signal reception is modeled by a sum of wave path contributions, and Clarke’s model is an important example which has been widely accepted in many wireless applications. …
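Clarke's model in its standard sum-of-sinusoids form (a textbook formulation with illustrative parameters, not this project's coarse-grained version) can be sampled as follows — each path contributes a plane wave with a random angle of arrival and phase:

```python
import math, random

def clarke_channel(n_paths=64, f_d=100.0):
    """One realization of Clarke's multipath fading model: n_paths plane
    waves with uniformly random angles of arrival and phases; f_d is the
    maximum Doppler shift in Hz. Returns the complex gain g(t)."""
    params = [(random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi))
              for _ in range(n_paths)]
    def g(t):
        re = im = 0.0
        for alpha, phi in params:
            # Doppler shift of each path depends on its angle of arrival
            theta = 2 * math.pi * f_d * t * math.cos(alpha) + phi
            re += math.cos(theta)
            im += math.sin(theta)
        return complex(re, im) / math.sqrt(n_paths)   # unit average power
    return g

# the envelope |g| is approximately Rayleigh-distributed, and the mean
# power E[|g|^2] equals 1 by construction
random.seed(0)
powers = [abs(clarke_channel()(0.01)) ** 2 for _ in range(1000)]
mean_power = sum(powers) / len(powers)
print("mean power ~", mean_power)
```

Coarse graining, as in the project, replaces this microscopic sum of paths by a lower-dimensional stochastic process with matching statistics.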
Numerical Analysis: Computational electromagnetics in complex environments
The Yee scheme is still the main workhorse for computational electromagnetics in industry. …
Numerical Analysis: Computational modeling of the mammalian cell
The project aims to develop realistic and computationally effective models of cellular metabolism. …
Numerical Analysis: Cut finite element methods
We develop finite element methods for PDEs where the domain is allowed to cut through a fixed background mesh in an arbitrary fashion, without losing accuracy and without problems with ill-conditioned linear systems. …
Numerical Analysis: Fast Ewald summation for Stokesian particle suspensions
We have developed a numerical method for suspensions of spheroids of arbitrary aspect ratio which sediment under gravity. …
Numerical Analysis: Fast interface tracking
A multiresolution description of interfaces are used to develop a fast algorithm for their propagation. …
Numerical Analysis: Fast methods for high frequency wave propagation problems
We design methods for wave propagation, whose cost grows slowly with the frequency for a fixed tolerance. …
Numerical Analysis: FEniCS-HPC
High Performance Adaptive Finite Element Methods for Turbulent Flow and Multiphysics with Applications to Aerodynamics, Aeroacoustics, Biomedicine and Geophysics. …
Numerical Analysis: Gaussian beam approximations
Development and analysis of Gaussian beam methods for high frequency waves. …
Numerical Analysis: Iterative methods for nonlinear eigenvalue problems
This project involves the derivation and study of algorithms for new types of nonlinear eigenvalue problems. …
Numerical Analysis: Multiscale methods in fluid dynamics
Multiscale problems appear in several areas of fluid dynamics. …
Numerical Analysis: Numerical methods for molecular dynamics
The goal of this project is to develop numerical methods for deterministic and stochastic molecular dynamics simulations that include accurate error estimates. …
Numerical Analysis: Numerical treatment of surfactants in multiphase flow
In this project we have developed a numerical method for two-phase flow with insoluble surfactants and contact line dynamics in two dimensions. …
Numerical Analysis: Quadratic hedging
The classic financial theory that started with the arbitrage free pricing theory of Black, Scholes and Merton (1973) deals with markets that are complete in the sense that every financial contract in the market can be perfectly (“almost surely”) replicated by a dynamic portfolio of other contracts.
Numerical Analysis: Radial basis function methods for partial differential equations
In this project radial basis functions are used to discretize and solve PDEs. …
Numerical Analysis: Spectrally accurate fast Ewald summation
The goal of this project is to develop spectrally accurate and fast Ewald summation methods – suitable for parallel computations – for various problems in molecular dynamics and fluid mechanics. …
We aim to develop new and better numerical algorithms for certain problems stemming from data science and machine learning, using state-of-the-art numerical linear algebra techniques and software to solve the corresponding computationally demanding core problems.
Open Space: A tool for space research and communication
OpenSpace is an international open source software development project, with its seat in Norrköping. It is designed to visualize data in astronomy-related research and development.
OpenSpace research results in best paper award at IEEE vis conference 2017: Globe browsing: Contextualized spatio-temporal planetary surface visualization
Results of planetary mapping are often shared openly for use in scientific research and mission planning. In its raw format, however, the data is not accessible to non-experts due to the difficulty in grasping the context and the intricate acquisition process. The OpenSpace software enables interactive contextualization of geospatial surface data of celestial bodies for use in science communication.
Organic materials for the future
Atomistic MD simulations of polymer- and cellulose-based materials are performed to investigate the impact of ionic liquid on the morphology of the systems. We will focus on the dynamics of the capacitive charging, which will provide information on the atomistic-scale mechanism of doping/dedoping behind the intrinsic capacitance in the presence of ionic liquid.
Origin of DNA-Induced Circular Dichroism in a Minor-Groove Binder
With a combination of molecular dynamics simulation, CD response calculations, and experiments on an AT-sequence, we show that the ICD originates from an intricate interplay between the chiral imprint of DNA, off-resonant excitonic coupling to nucleobases, charge-transfer, and resonant excitonic coupling between DAPIs.
PDSE/3D-FFT on Emerging Architectures
One of the most promising techniques to accelerate FFTs and other computational kernels, is the design and programming of specialized logics in reconfigurable hardware (FPGA). By configuring hardware logics so that each core performs small 3D Discrete Fourier Transform, it is possible to specialize hardware logics for fast computation of 3D FFTs.
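The composition described above — building a large 3D FFT out of many small transforms — rests on the separability of the multi-dimensional DFT. A pure-Python sketch (naive O(n²) 1-D DFTs standing in for the specialized hardware units; sizes illustrative) checks the row-column decomposition against a direct triple-sum DFT:

```python
import cmath, random

def dft(vec):
    """Naive 1-D DFT (O(n^2)); stands in for one small transform unit."""
    n = len(vec)
    return [sum(v * cmath.exp(-2j * cmath.pi * k * j / n)
                for j, v in enumerate(vec))
            for k in range(n)]

def dft3(a):
    """3-D DFT of a nested list a[i][j][k], composed from 1-D DFTs
    applied along each axis in turn (row-column decomposition)."""
    n = len(a)
    a = [[dft(a[i][j]) for j in range(n)] for i in range(n)]       # axis 2
    b = [[[0j] * n for _ in range(n)] for _ in range(n)]           # axis 1
    for i in range(n):
        for k in range(n):
            col = dft([a[i][j][k] for j in range(n)])
            for j in range(n):
                b[i][j][k] = col[j]
    c = [[[0j] * n for _ in range(n)] for _ in range(n)]           # axis 0
    for j in range(n):
        for k in range(n):
            col = dft([b[i][j][k] for i in range(n)])
            for i in range(n):
                c[i][j][k] = col[i]
    return c

def dft3_direct(a):
    """Reference: the 3-D DFT written as one triple sum."""
    n = len(a)
    return [[[sum(a[x][y][z] * cmath.exp(-2j * cmath.pi * (u*x + v*y + w*z) / n)
                  for x in range(n) for y in range(n) for z in range(n))
              for w in range(n)] for v in range(n)] for u in range(n)]

random.seed(1)
n = 3
cube = [[[complex(random.random(), random.random()) for _ in range(n)]
         for _ in range(n)] for _ in range(n)]
err = max(abs(x - y)
          for p, q in zip(dft3(cube), dft3_direct(cube))
          for r, s in zip(p, q)
          for x, y in zip(r, s))
print("max deviation:", err)
```

On an FPGA the per-axis passes become streams through dedicated transform blocks, which is why specializing the hardware for small fixed-size DFTs pays off.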
Promiscuous and selective calmodulin
Calmodulin is a ubiquitous calcium sensor that confers calcium sensitivity to many cellular partners. How calmodulin can bind a large number of targets while retaining some selectivity, a phenomenon called promiscuous selectivity, is a fascinating question that we address with molecular dynamics simulations conducted with GROMACS.
Protein structure prediction — state-of-the-art methods proven in contests
SeRC faculty Björn Wallner and Arne Elofsson are experts on protein structure predictions and have developed several tools for prediction and assessment of protein structure models. Their web service Pcons.net is a popular and important tool…
PSDE Highlight: An atlas of combinatorial transcriptional regulation in mice and man
Availability of large TF combinatorial networks in both humans and mice will provide many opportunities to study gene regulation, tissue differentiation, and mammalian evolution. …
PSDE Highlight: Decentralized Graph Partitioning for Social Networks and Classification Systems
SeRC researchers have developed a novel decentralized method for partitioning of graphs, with applications in areas such as social networks and classification systems, that was awarded with best paper in the IEEE International Conference on Self-Adaptive and Self-Organizing Systems, 2013.
PSDE Highlight: Enabling efficient and future-proof HPC applications: High-level component-based programming frameworks for heterogeneous parallel systems
Recent disruptive changes in computer hardware (in particular, the transition to multi-/manycore and heterogeneous architectures) have led to a crisis on the application software side: efficient programming for modern parallel and heterogeneous systems has become more tedious, error-prone and hardware specific than ever. In particular, this holds for GPU-based systems that are increasingly popular in high performance computing, with GPU architecture evolving quickly – but which application writer has the time to rewrite and/or re-optimize his/her code for each new hardware generation? …
PSDE Highlight: Metabolite profiling identifies a key role for glycine in rapid cancer cell proliferation
Glycine consumption and expression of the mitochondrial glycine biosynthetic pathway seem to be strongly correlated with rates of growth across cancer cells. …
PSDE Highlight: Object-Oriented Modeling and Simulation with Modelica
The modeling language Modelica is bringing about a revolution in the area of simulating complex cyber-physical systems for e.g. robotics, aircrafts, satellites or power plants. …
PSDE Highlight: Pediatric systems medicine: evaluating needs and opportunities using congenital heart block as a case study
Executing a systems medicine programme in pediatrics creates potential for collaboration between clinicians and families who are keen to prevent and predict diseases and nurture wellness in the families’ children. …
PSDE Highlight: Scalable Performance Monitoring
Understanding how parallel applications behave is crucial for using HPC resources efficiently. In particular, exascale systems will be composed of heterogeneous architectures with multiple levels of concurrency and energy constraints. In such complex scenarios, performance monitoring and runtime systems will play a major role in obtaining good application performance and scalability. SeRC researchers have developed techniques for online access to performance data and efficient data formats for performance data.
PSDE Highlight: World Record Hadoop Performance
Led by Dr Jim Dowling, in 2016, SeRC researchers from the PSDE community announced world-record performance for the Hadoop platform, with their next-generation distribution of Apache Hadoop File System, HopsFS. …
PSDE: Big Data Tools for Social Science
Social science has in recent decades become a data science. The use of more powerful statistical tools has enabled researchers to improve our understanding of human interactions through inferring causality in observational studies and finding hidden patterns, general relationships and unknown correlations between individuals and groups of individuals. …
PSDE: Biobanking and biomedical research integration toward a global sharing of biobank resources
Biobanks are biospecimen repositories that collect, process and store biomaterials and generate the associated data and information. …
For the past thirty years, the need for ever greater supercomputer performance has driven the development of many computing technologies which have subsequently been exploited in the mass market. Delivering an exaflop (or a million million million calculations per second) by the end of this decade is the challenge that the supercomputing community worldwide has set itself. …
PSDE: E-Science in Medicine – Prediction, Prognosis, and Understanding of Complex Diseases
We develop and apply integrative experimental and computational tools to identify and enable quantitative understanding of molecular mechanisms of disease. …
The ultimate goal of EGI-InSPIRE is to provide European scientists and their international partners with a sustainable, reliable e-Infrastructure that can support their needs for large-scale data analysis. This is essential in order to solve the big questions facing science today, and in the decades to come. …
The ENCORE project objective is to achieve a breakthrough on the usability, reliability, code portability, and performance scalability of multicore architectures. Design complexity and power density implications stopped the trend towards faster single-core processors. …
PSDE: Integration of data from Swedish Biobanks and Quality Registries
The project aims at developing standards, ontologies, and tools for simplifying data integration from biobanks and quality registries. …
PSDE: Large scale data management using iRODS
This project will investigate the use of iRODS as a framework for data management. …
PSDE: Ontology engineering for management and integration of data
This project aims to develop methods and tools to improve and integrate ontologies for use in management and integration of data. …
PSDE: OpCoReS – Optimized Component Runtime System
OpCoReS focuses on the use of task-oriented programming models, high-level equation-based object-oriented textual/graphical programming models and efficient compilation of such models to task-oriented and data-parallel code, the exploitation of multi-level parallelism in application development as well as during runtime and associated performance monitoring and analysis. …
OPENPROD is an ITEA2 European project that will provide an open, whole-product model-driven rapid systems development, modeling, and simulation environment integrating …
The emergence of highly parallel, heterogeneous, many-core processors poses major challenges to the European software industry. It is imperative that future many-core architectures can be fully exploited without starting from scratch with each new design. In particular, there is an urgent need for techniques for efficient, productive and portable programming of heterogeneous many-core systems. …
The mission of the PRACE RI is to enable high impact European scientific discovery and engineering research and development across all disciplines to enhance European competitiveness for the benefit of society. The PRACE RI seeks to realize this mission through world class computing and data management resources and services open to all European public research through a peer review process. …
The ScalaLife project intends to build a cross-disciplinary Competence Centre for life science software that should evolve to a “one-stop-shop” for users and developers of Life Science software alike. …
Cloud computing has the potential to transform scientific exploration, discovery and results by empowering research and SME communities in new ways. VENUS-C (Virtual Multidisciplinary EnviroNments USing Cloud Infrastructures) is a pioneering project for the European Commission’s 7th Framework Programme that draws its strength from a joint co-operation between industry and scientific user communities. …
PSDE: Vision Cloud
The goal of the VISION project is to advance the competitiveness of the EU economy by introducing a powerful ICT infrastructure for reliable and effective delivery of data-intensive storage services, facilitating the convergence of ICT, media and telecommunications. …
Representation of Arctic moist intrusions in global coupled climate models
Events with warm and moist air entering the Arctic have been shown to have a substantial effect on the surface temperature climate in winter. Here, the coupled global climate models participating in the Coupled Model Intercomparison project Phase 5 (CMIP5) are evaluated with respect to these events.
Sessi MCP: A Profiling Framework for the Microsecond Range
Scientific applications of Molecular Dynamics usually require strong scaling. For efficient MD codes, such as GROMACS, this leads to sub-millisecond iteration times, where important tasks take tens of microseconds. …
Sessi MCP: Algorithms for molecular dynamics on heterogeneous architectures
With molecular dynamics, molecular processes can be followed in atomistic detail, something which is difficult or impossible to achieve with experimental techniques. But obtaining this information comes at a high computational cost. …
Sessi MCP: An optimized FFT library for 3D real-valued small-size data
The use of Fast Fourier Transform (FFT) is ubiquitous in computational science. Its usage varies from solving Partial Differential Equations, to calculating convolutions, to performing spectral analysis. …
Sessi MCP: Assembly level vectorization of compute intensive kernels in Nek5000
Nek5000 is a computational fluid dynamics solver for the Navier-Stokes equations based on the spectral element method. This method is a compromise between high accuracy and geometric flexibility. However, this flexibility has a price. …
Sessi MCP: Code optimization for on-node performance: SIMD and LIBXSMM for small dense matrix-matrix multiplications, streaming stores for optimizing cache to memory operations
In the Nek5000 code, the tensor-product-based operator evaluation can be implemented as small dense matrix-matrix multiplications. …
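The pattern can be made concrete with a small pure-Python sketch (a toy 2-point element with a hypothetical differentiation matrix; Nek5000 itself uses highly optimized mxm/LIBXSMM kernels for these products):

```python
def matmul(A, B):
    """Small dense matrix-matrix product C = A @ B, written out explicitly."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def deriv_r(D, u):
    """Derivative along the first tensor direction of one spectral element.
    Flattening u[i][j][k] into an n x n^2 matrix turns the whole operation
    into a single small dense matmul D @ U -- the kernel pattern above."""
    n = len(D)
    U = [[u[i][j][k] for j in range(n) for k in range(n)] for i in range(n)]
    DU = matmul(D, U)
    return [[[DU[i][j * n + k] for k in range(n)] for j in range(n)]
            for i in range(n)]

# toy check: a 2-point element on nodes x0=0, x1=1 with the exact
# differentiation matrix for linear polynomials, D = [[-1, 1], [-1, 1]];
# for u(x, y, z) = x the derivative is 1 at every node
D = [[-1.0, 1.0], [-1.0, 1.0]]
u = [[[float(i) for _ in range(2)] for _ in range(2)] for i in range(2)]
du = deriv_r(D, u)
print(du)   # all entries 1.0
```

Because each element yields the same small matrix shapes, the operator evaluation over the whole mesh becomes a batch of identical small matmuls, which is what SIMD and LIBXSMM optimizations target.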
Sessi MCP: Investigation of Communication Kernel in Nek5000
An important advantage of the spectral element method is its meshing flexibility, which comes from the spatial decomposition of the simulation domain into a set of non-overlapping sub-domains. …
Sessi MCP: Low-Overhead Thread-Parallelization Library
Over the past decade OpenMP has become the most popular threading model for HPC applications. …
Sessi MCP: OpenACC for Nek5000
Nek5000 is an open-source code for the simulation of incompressible flows. Nek5000 is widely used in a broad range of applications, including the study of thermal hydraulics in nuclear reactor cores, the modeling of ocean currents and the simulation of combustion in mechanical engines. …
Sessi MCP: Overcoming I/O limitations on exascale architectures
With larger systems and application scales, I/O is increasingly becoming a bottleneck. This is particularly true when global system states need to be saved regularly for checkpointing and analysis purposes, as in computational fluid dynamics applications. …
Sessi MCP: Parallel Implementation of the Setup of an Algebraic Multigrid Solver
Numerical modelling of various physical phenomena is based on solving large systems of linear equations arising from the discretisation of PDEs. …
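As a toy illustration of what a multigrid "setup" produces, the sketch below builds a geometric two-grid hierarchy for a 1D Poisson problem (not the parallel algebraic setup of the project): the setup phase constructs the interpolation P, the restriction R, and the Galerkin coarse operator R A P, which the solve phase then uses in a two-grid cycle. All sizes and parameters are illustrative.

```python
import numpy as np

def poisson1d(n):
    """1D Poisson matrix (Dirichlet BCs) on n interior points."""
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def interpolation(n_coarse):
    """Linear interpolation from n_coarse coarse to 2*n_coarse+1 fine points."""
    P = np.zeros((2 * n_coarse + 1, n_coarse))
    for j in range(n_coarse):
        i = 2 * j + 1                # fine index of coarse point j
        P[i, j] = 1.0
        P[i - 1, j] += 0.5
        P[i + 1, j] += 0.5
    return P

# --- setup phase: build the grid hierarchy ---
n = 63
A = poisson1d(n)
P = interpolation(31)
R = 0.5 * P.T                        # restriction operator
A_c = R @ A @ P                      # Galerkin coarse operator (triple product)

# --- solve phase: two-grid cycle with weighted-Jacobi smoothing ---
def two_grid(A, A_c, P, R, b, x, nu=3, omega=2.0 / 3.0):
    for _ in range(nu):              # pre-smoothing
        x = x + omega * (b - A @ x) / np.diag(A)
    x = x + P @ np.linalg.solve(A_c, R @ (b - A @ x))   # coarse correction
    for _ in range(nu):              # post-smoothing
        x = x + omega * (b - A @ x) / np.diag(A)
    return x

b = np.ones(n)
x = np.zeros(n)
r0 = np.linalg.norm(b)
for _ in range(10):
    x = two_grid(A, A_c, P, R, b, x)
assert np.linalg.norm(b - A @ x) < 1e-8 * r0   # mesh-independent fast convergence
```

In algebraic multigrid the same triple products R A P are formed, but P is derived from the matrix entries rather than from grid geometry; parallelising exactly this setup step is the subject of the project.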
Sessi MCP: Refactoring of Nek5000
Nek5000 is a scalable open-source code for CFD modelling with a long development history starting in the 1970s. It is written in FORTRAN77 and C, making extensive use of legacy Fortran features such as implicit data typing, common blocks, and equivalence statements. …
Sessi MCP: Runtime Profiling and Automatisation of Projections
One of the important methods used by Nek5000 to speed up the solution of the linear problem Ax=b with iterative solvers is residual projection, in which solutions (x) and right-hand sides (b) from previous time steps are used to build a projection space for the current solution/right-hand-side pair. …
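The idea can be sketched as follows, assuming a symmetric positive definite matrix A and using dense NumPy linear algebra rather than Nek5000's implementation: the previous solutions span a projection space, and a Galerkin projection onto that space yields an initial guess whose residual is already small, so the iterative solver needs fewer iterations.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 50, 5

# SPD system matrix, fixed across time steps (as in a pressure solve).
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

# Solutions from m previous time steps span the projection space.
X = np.column_stack([np.linalg.solve(A, rng.standard_normal(n)) for _ in range(m)])

# New right-hand side: close to the previous ones, plus a small perturbation,
# mimicking the slow change of the solution from one time step to the next.
b = A @ (X @ rng.standard_normal(m)) + 0.01 * rng.standard_normal(n)

# Galerkin projection: x0 = X a with (X^T A X) a = X^T b minimizes the
# A-norm error over span(X); the iterative solver then starts from x0.
a = np.linalg.solve(X.T @ A @ X, X.T @ b)
x0 = X @ a

# The projected initial guess leaves a much smaller residual than x = 0.
assert np.linalg.norm(b - A @ x0) < 0.1 * np.linalg.norm(b)
```

In practice the basis is kept A-orthonormal and updated incrementally, so forming and solving the small m x m projected system is cheap compared with even a single iteration on the full problem.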
Simulation of turbulent wings at various Reynolds numbers
Continuing our efforts to understand the effect of pressure gradient and curvature on turbulent wings, we have now constructed a database of temporally and spatially resolved turbulence around wings at four Reynolds numbers, ranging from the relatively low Re = 100,000 up to the computationally demanding Re = 1,000,000.
Software development for exploration and design of complex molecular systems
The dominating software for quantum molecular simulations is an American commercial product (Gaussian). Together with PDC, we will develop a full-fledged DFT program that offers all the standard capabilities, as well as non-standard functionality developed in the Scandinavian Dalton program community, and that provides state-of-the-art scaling on contemporary and future HPC hardware platforms based on Intel, ARM, and Power CPUs as well as NVIDIA GPUs.
The aim of this project is to develop stochastic simulation methods in Machine Learning along with a corresponding rigorous efficiency analysis.
Theoretical Characterization of Point Defects in Silicon Carbide and Other Materials
Silicon carbide (SiC) is a large bandgap semiconductor that is in focus for its potential for applications in quantum information processing. It appears possible to engineer defects in SiC with optical and spin properties that are suitable as single photon sources, and states with long enough lifetime to act as qubit memory.
Topology optimization for natural convection flows
Together with colleagues from Umeå University, we implemented support in Nek5000 for high-order accurate topology optimization, which is useful for flows driven by natural convection.
Translational bioinformatics: Statistical learning for patient stratification
Translational bioinformatics was introduced into eCPC during 2014, when formal collaborations with eSSENCE and the SciLifeLab Clinical Diagnostics facility in Uppsala were also initiated.
Trial design for prediction
Experimental design is an under-appreciated aspect of data science, where better design leads to more efficient parameter estimation and possibilities to address causality. We will contribute to two new studies:
- the STHLM3-MRI study to assess the combination of the S3M test with magnetic resonance imaging (MRI), and
- the ProBio randomised treatment trial for men with metastatic prostate cancer.
Uncertainty quantification and sensitivity analysis in brain modelling
Uncertainty quantification and sensitivity analysis are important aspects of computational modelling, due to the need to assess the validity and precision of model predictions.
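A minimal Monte Carlo sketch of both ideas, using an invented toy response function and hypothetical parameter names (g_Na, g_K) rather than any actual brain model: uncertain inputs are sampled, the spread of the model output quantifies prediction uncertainty, and a crude first-order sensitivity index attributes that spread to individual inputs.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(g_Na, g_K):
    """Toy stand-in for a model output; the formula is invented for illustration."""
    return g_Na**2 + 0.1 * g_K

# Sample the uncertain input parameters.
N = 100_000
g_Na = rng.normal(1.0, 0.1, N)
g_K = rng.normal(1.0, 0.1, N)
y = model(g_Na, g_K)

# Uncertainty quantification: the distribution of the prediction.
mean, var = y.mean(), y.var()

# Crude first-order sensitivity: output variance when only one input
# varies and the other is frozen at its mean, relative to the total.
S_Na = model(g_Na, 1.0).var() / var
S_K = model(1.0, g_K).var() / var

assert S_Na > 10 * S_K   # the output is far more sensitive to g_Na
```

Variance-based indices like these indicate which parameters dominate the prediction uncertainty and therefore which ones most need tighter experimental constraints.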
Unifying Memory and Storage with MPI Windows
Computing nodes of next-generation supercomputers will include different memory and storage technologies. We propose a novel use of MPI windows to hide the heterogeneity of the memory and storage subsystems by providing a single common programming interface for data movement across these layers.
Variational approximations in the medical sciences
In this new sub-project we will develop machine learning models and tools for Gaussian variational approximations (GVAs) and apply those models to health applications.
Visualization Highlight: Cover story: Interactive Visualization of 3D Scanned Mummies at Public Venues
By combining visualization techniques with interactive multi-touch tables and intuitive user interfaces, visitors to museums and science centers can conduct self-guided tours of large volumetric image data. …
Visualization Highlight: ERC Starting Grant: HEART4FLOW
The objective of the HEART4FLOW project is to develop the next generation of methods for the non-invasive quantitative assessment of cardiac diseases and therapies. …
Visualization Highlight: KAW Research Grant: Seeing Organ Function
More recent medical imaging modalities support acquisition of patient-specific functional data embedded in a high-resolution spatial context. …
Visualization Highlight: Vinnova Framework Grant: Digital Pathology
Within pathology there is an urgent need for substantially increased efficiency in parallel with continued improvements in quality of care. …
Visualization: A Signal Processing Approach to Direct Volume Rendering
In this project we explore the application of state-of-the-art signal processing techniques to volumetric visualization, in order to extract additional information about the content of the dataset. …
Visualization: Assimilation of experimental and computational fluid dynamics
Traditionally, imaging has focused on (time-resolved) acquisition of morphological information. Using MRI, blood flow velocities can be measured in the cardiovascular system. Restrictions in measurement times have limited these scans to 2D, but recent advances in software and hardware allow for time-resolved 3D approaches (4D flow MRI). Analysis of this wealth of data is currently time-consuming and user-dependent.
Visualization: Cardiovascular blood flow analysis
This project analyses blood flow data from direct measurements (4D flow MRI). A clustering approach has been developed based on 2D coherence maps placed in the in- and outflow areas, which summarize the flow behavior of the 3D volume. …
Visualization: Clinically Applied Multivariate Volume Rendering
This project targets an area of medical imaging with great potential for health care benefits, but also constituting a great eScience challenge. The objective of this project is to develop novel medical visualization methods for multivariate Direct Volume Rendering (DVR) that enable new levels of clinical usefulness. …
Visualization: Efficient Methods for Volumetric Illumination
Visualization: FlowZoom: Feature-based, multi-resolution in-situ sampling of very large turbulence simulations
Turbulence simulations have become so large that their sheer size is an obstacle to even saving the data in full spatio-temporal resolution. …
Visualization: Interactive 3D Histology Visualization for Pathology
Histology, the medical analysis at the microscopic level of tissue specimens from the human body, is a cornerstone in healthcare and medical research. …
Visualization: Lagrangian Analysis of Intracardiac Blood Flow using 4D Flow MRI
Cardiovascular disease is a serious health threat in developed countries, and remains the number one cause of death. …
Visualization: Methods for High-Quality Illumination in Interactive Volume Rendering
In order to improve the visual quality and usefulness of direct volume rendering, we need to utilize more advanced illumination techniques. These improve depth perception and make it easier for the user to understand the data being visualized.
Visualization: Multi-scale visualization of binding processes related to conformational changes of proteins for Alzheimer’s disease research
Conformational changes of proteins occur as an effect of external stimuli and can lead to pathogenic states. For example, proteins can aggregate into β-sheet-rich fibrillar assemblies, known as amyloid fibrils, which cause Alzheimer's disease. …
Visualization: OpenSpace – A tool for Space Research and Communication
OpenSpace is a new open-source interactive data visualization software package designed to visualize the entire known universe and portray our ongoing efforts to investigate the cosmos. …
Visualization: Organic solar cells design
Organic solar cells and other photovoltaic devices play an important role in many future technologies. …
Visualization: POrtable Diary Data Collector
Activity diaries are a powerful data source for studying the time use of individuals and entire populations and for creating awareness of individuals’ daily activity patterns. This project is concerned with the development and testing of a portable activity diary data collection tool, the PODD, implemented on a modern smart phone. …
Visualization: Semi-Automatic Visualization and Quantification of Intra-Cardiac 4D Flow MRI Data for Large Patient Studies
Besides the long acquisition times, application of 4D flow is hindered by the complexity of the analysis of the comprehensive data sets. …
Visualization: Supporting collaborative view transformations of simulation-based data
Aims: 1. Create an experimental interactive infrastructure that supports direct simultaneous multi-user view transformations of pre-simulated visual structures in support of collaborative visual exploration in large visualization environments. 2. Evaluate the interactive infrastructures proposed in Aim 1 with domain experts and relevant data and tasks. 3. Produce a permanent support infrastructure that is integrated into real work practices of domain experts and which gives rise to long-lasting positive efficiency effects. …
Visualization: Uncertainty Visualization
This project targets the area of uncertainty analysis in medical imaging. While there is a steady and fast pace of technological advances in diagnostic radiology in general, the area of uncertainty analysis is underdeveloped despite clear needs. …
Visualization: Visualization of MR Diffusion Data
Visualization: Volumetric Finite Element Visualization
The finite element (FE) method is an important computational technique in science, engineering and medicine for solving partial differential and integral equations. …
Workshop on Arctic airmass transformations
A workshop organized by Gunilla Svensson (SU) and Felix Pithan (Alfred Wegener Institute, Germany) was held at MISU on 6–9 November 2017. The theme of the workshop was how to improve the understanding of air mass transformations in the Arctic through observational and modelling strategies.