The pyalma package provides a framework to plan, run, pause and continue file ba
## Example
Everything starts with grouping the files of interest into one or more [batches](./docs/batch.md). Let's say we have a batch file `example.batch`. It could look as follows.
###### `example.batch`
Now that we have specified everything, we can start executing our experiment.
>>> dispatcher.start()
```
The line `dispatcher.start()` starts the concurrent, non-blocking execution of our experiment. This means the dispatcher stays responsive, and we can pause or stop the execution at any time.
```
>>> dispatcher.stop()
```
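Under the hood, this start/stop behavior amounts to running the work loop in a background thread that checks a stop flag between files. A minimal plain-Python sketch of that pattern (the `Dispatcher` class and its method bodies here are illustrative, not the actual pyalma implementation):

```python
import threading


class Dispatcher:
    """Sketch of a non-blocking dispatcher (not the pyalma API)."""

    def __init__(self, files):
        self.pending = list(files)        # files run(...) still has to visit
        self._stop = threading.Event()    # set when stop() is requested
        self._thread = None

    def start(self):
        # Launch the work loop in a background thread; the call returns
        # immediately, so the calling session stays responsive.
        self._stop.clear()
        self._thread = threading.Thread(target=self._work)
        self._thread.start()

    def stop(self):
        # Ask the worker to halt after the current file, then wait for it.
        self._stop.set()
        self._thread.join()

    def _work(self):
        # Process one file at a time, checking the stop flag in between.
        while self.pending and not self._stop.is_set():
            path = self.pending.pop(0)
            self.run(path)

    def run(self, path):
        pass  # placeholder for the per-file work


d = Dispatcher(["a.nc", "b.nc"])  # hypothetical file names
d.start()   # returns immediately; processing happens in the background
d.stop()    # signal the worker and wait for it to halt
```

Because `start()` only launches the thread, the session stays interactive, and `stop()` lets the worker finish the file it is currently processing before halting.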
During the execution, the `dispatcher` continuously keeps track of which files it still needs to call `run(...)` on and how many iterations are left. It does so by saving the current state of the execution to a file. When loading an experiment (`alma.experiment.load(...)`), the framework first looks for such a save file; if one exists, the execution will pick up at the point where we called `dispatcher.stop()`. To pick up the experiment we can perform:
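The save-file mechanism can be sketched in a few lines of plain Python. The file name and JSON layout below are assumptions chosen for illustration, not pyalma's actual on-disk format:

```python
import json
import os

STATE_FILE = "example.state.json"  # hypothetical save-file name


def save_state(pending_files, iterations_left):
    # Persist what remains to do, so a later load can resume from here.
    with open(STATE_FILE, "w") as fh:
        json.dump({"pending": pending_files, "iterations": iterations_left}, fh)


def load_state(all_files, iterations):
    # Prefer an existing save file; otherwise start the experiment fresh.
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as fh:
            state = json.load(fh)
        return state["pending"], state["iterations"]
    return list(all_files), iterations


# After stopping mid-run, only "b.nc" and 3 iterations remain:
save_state(["b.nc"], 3)
pending, iterations = load_state(["a.nc", "b.nc"], 10)
# pending == ["b.nc"], iterations == 3: execution resumes where it stopped
```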