Commit 1ed9d6e8 authored by Matthieu Dorier's avatar Matthieu Dorier

updating README

parent 18c7f095
Keras models in the context of the CANDLE research workflows.
It is based on the [Mochi](
components developed at Argonne National Laboratory.
## Overview of FlameStore
FlameStore is an in-memory distributed storage service meant to
store and keep track of Keras models (i.e. deep neural networks).
These models are composed of an _architecture_ (which we call _metadata_)
that can be represented in JSON format, and a set of _layers_,
which are (potentially large) Numpy arrays.
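To illustrate this split, the following standard-library sketch treats a model as JSON-serializable metadata plus separately stored arrays. The field names and schema here are illustrative assumptions, not FlameStore's actual format:

```python
import json

# Hypothetical metadata describing a two-layer network; this JSON schema
# is illustrative only, not FlameStore's actual format.
metadata = {
    "name": "example-model",
    "layers": [
        {"name": "dense_1", "units": 64, "activation": "relu"},
        {"name": "dense_2", "units": 10, "activation": "softmax"},
    ],
}

# The architecture is small and serializes to JSON...
doc = json.dumps(metadata)

# ...while the layer weights are large numeric arrays, stored separately.
# (Represented here as nested lists to keep the sketch dependency-free;
# in practice these would be Numpy arrays.)
weights = {
    "dense_1": [[0.0] * 64],   # placeholder weight matrix
    "dense_2": [[0.0] * 10],
}

restored = json.loads(doc)
```

The point of the split is that the cheap JSON metadata can live on the Master while the heavy arrays are distributed across Workers.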
FlameStore is composed of a _Master_ process (also called _Manager_)
and some _Worker_ processes. The Master stores metadata and makes decisions
about where to store new models and whether models should be
persisted or discarded. The Workers offer storage space for layers.
## Installing
FlameStore itself is written purely in Python. However, it depends
on the Mochi components and their Python wrappers. It also depends
on Keras and on the Python HDF5 package. The best way to get all
of these dependencies is to use [spack](
Once you have spack installed and setup, clone the `sds-repo`
repository and add it for spack to use:

    git clone
    cd sds-repo
    spack repo add .

You are now ready to install FlameStore by doing:

    spack install flamestore
Note that two heavy dependencies of FlameStore are Python and Boost.
If you have them installed on your platform already, you can follow
[this tutorial](
to tell spack about them. Make sure that Boost has been compiled with
Boost.Python and Boost.Numpy support, otherwise FlameStore will not work.
Once spack has finished installing flamestore and all its dependencies,
you can load them into your environment by calling the following:

    source <(spack module loads -m tcl --dependencies flamestore)
You can check that the installation worked by typing `import flamestore`
in a Python interpreter. Alternatively, if you type

    flamestore -h

you should get the help message of the `flamestore` program.
## Using FlameStore
### FlameStore Workspace
To use FlameStore, we first need to create a Workspace, that is,
a folder containing input data, Keras models, and configuration
files. To create a workspace, simply type the following:

    flamestore create --name myworkspace
This will create a _myworkspace_ directory containing some
subdirectories and a `config.json` configuration file.
You can also use the `--path` parameter to specify in which
directory the workspace should be created, and `--override`
to indicate that, should a workspace already exist, it will
be deleted first.
The workspace contains two folders.
* _input_: this folder will contain your input data.
* _models_: this folder will contain persisted Keras models
in the form of HDF5 files.
We will see in the following how to use these folders.
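For illustration, the layout described above (an _input_ and a _models_ folder next to a `config.json`) can be reproduced with the standard library. This is only a sketch of the directory structure; the actual configuration keys written by `flamestore create` are not documented here and the one below is an assumption:

```python
import json
import tempfile
from pathlib import Path

# Recreate the workspace layout described above in a temporary directory.
root = Path(tempfile.mkdtemp()) / "myworkspace"
(root / "input").mkdir(parents=True)    # will contain input data
(root / "models").mkdir()               # will contain persisted HDF5 models
# Minimal placeholder configuration; the real keys are assumptions.
(root / "config.json").write_text(json.dumps({"name": "myworkspace"}))

print(sorted(p.name for p in root.iterdir()))
# → ['config.json', 'input', 'models']
```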
### Starting up FlameStore
FlameStore can be started in three ways.
* **Standalone mode:** processes (master and workers) are started
independently. When starting, workers attach to the master, which
can start using them. This mode is useful to start FlameStore
independently of the application(s) that use it and to enable elasticity
(workers can be added and removed).
* **MPI mode:** all processes are started at the same time as a single
MPI program. This mode is also useful to start FlameStore independently
of the application(s) that use it. However, while it is easier to start
this way, the service can no longer be elastic.
* **Embedded mode:** a Python application can deploy FlameStore
processes by itself by importing the flamestore package and by creating
instances of the FlameStoreMaster and FlameStoreWorker classes.
This is useful if the lifetime of the service is tied to the application
that uses it.
#### Standalone mode
#### MPI mode
#### Embedded mode
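Until this section is filled in, here is a hedged sketch of embedded mode. The class names `FlameStoreMaster` and `FlameStoreWorker` come from the overview above; their constructor arguments are guesses and will differ from the real API:

```python
# Embedded-mode sketch. The class names come from the overview above;
# the constructor arguments shown are hypothetical.
try:
    from flamestore import FlameStoreMaster, FlameStoreWorker
except ImportError:
    FlameStoreMaster = FlameStoreWorker = None  # flamestore not installed

if FlameStoreMaster is not None:
    # The application owns the service's lifetime: it creates the master,
    # attaches workers to it, then uses the service directly.
    master = FlameStoreMaster(workspace="myworkspace")   # hypothetical args
    workers = [FlameStoreWorker(master=master) for _ in range(2)]
```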
### Accessing FlameStore from an application
The following example shows how to create a Workspace and store/load
Keras models to/from it. A more complete example can be found in the
_test_ directory of the source, which trains a 7-layer CNN model on
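Pending that example, a minimal sketch of client-side usage follows. The `Workspace` class and its `store`/`load` methods are hypothetical stand-ins for whatever API the _test_ directory demonstrates:

```python
# Client-side sketch; the Workspace class and its store/load methods
# are hypothetical, not FlameStore's confirmed API.
try:
    from flamestore import Workspace      # hypothetical entry point
    from keras.models import Sequential
except ImportError:
    Workspace = None                      # dependencies not installed

if Workspace is not None:
    model = Sequential()                  # build/train a Keras model here
    ws = Workspace("myworkspace")         # open the workspace created above
    ws.store("my-model", model)           # persist architecture + layers
    restored = ws.load("my-model")        # retrieve the model later
```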