Commit ba805224 authored by Wenqi Li

added config file for mr_ct_regression

parent cc0ecf2b
@@ -8,3 +8,4 @@ This page lists NiftyNet networks pre-trained for specific tasks. Information ab
| [ultrasound_simulator_gan_model_zoo](./ultrasound_simulator_gan_model_zoo.md) | Generate simulated ultrasound images at specified poses |
| [highres3dnet_brain_parcellation_model_zoo](./highres3dnet_brain_parcellation_model_zoo.md) | Brain parcellation from T1 MR images |
| [anisotropic_nets_brats_challenge_model_zoo](./anisotropic_nets_brats_challenge_model_zoo.md) | Brain tumor segmentation with anisotropic nets |
| [mr_ct_regression_model_zoo](./mr_ct_regression_model_zoo.md) | Estimating CT from MR using an adaptive sampling strategy |
@@ -7,8 +7,16 @@ url = https://www.dropbox.com/s/lv31fultntgw66t/mr_ct_regression_model_zoo_data.
action = expand
destination = data
[code]
local_id = mr_ct_regression
url = https://www.dropbox.com/s/ll2kr19suwlof6g/mr_ct_regression_model_zoo_config.tar.gz?dl=1
action = expand
destination = extensions
[error_maps]
local_id = mr_ct_regression
url = https://www.dropbox.com/s/9hp5qglttswecmg/mr_ct_regression_model_zoo_initial_error_maps.tar.gz?dl=1
action = expand
destination = models
# Training a regression network with weighted sampling of image windows
This page describes how to set up and use a weighted sampler for image regression.
Reference:
Berger et al., "An Adaptive Sampling Scheme to Efficiently Train Fully Convolutional Networks for Semantic Segmentation",
[https://arxiv.org/abs/1709.02764](https://arxiv.org/abs/1709.02764)
## Downloading model zoo files
The training data, demo configuration, and initial error maps can be downloaded with the command
```bash
net_download mr_ct_regression_model_zoo
```
(Replace `net_download` with `python net_download.py` if you cloned the NiftyNet repository.)
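If the download succeeds, the archives are unpacked into the destinations listed in the download config above (``data``, ``extensions`` and ``models``). The following sketch checks that the expected folders and the demo configuration file are in place; it assumes the default NiftyNet home folder ``~/niftynet``, which may differ if the global NiftyNet configuration was changed.
```python
from pathlib import Path

# Assumed locations, derived from the download config above; the NiftyNet home
# folder defaults to ~/niftynet but can be changed in the global configuration.
niftynet_home = Path.home() / 'niftynet'
expected = [
    niftynet_home / 'data' / 'mr_ct_regression',                             # training images
    niftynet_home / 'extensions' / 'mr_ct_regression' / 'net_isampler.ini',  # demo config
    niftynet_home / 'models' / 'mr_ct_regression',                           # initial error maps
]
for path in expected:
    print('{}: {}'.format(path, 'found' if path.exists() else 'missing'))
```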
## Initial training
Command line parameters ``--starting_iter 0 --max_iter 100`` train the network from scratch for the first 100 iterations.
```bash
python net_run.py train \
-a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
-c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
--starting_iter 0 --max_iter 100
```
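After this step, TensorFlow checkpoint files for iteration 100 should appear under the model directory configured in ``net_isampler.ini``. A small sketch to list them, assuming the conventional checkpoint naming (``model.ckpt-100.*``) and a model directory of ``~/niftynet/models/mr_ct_regression``:
```python
import glob
import os

# model_dir is assumed to be ~/niftynet/models/mr_ct_regression (set in net_isampler.ini);
# the search is recursive because the exact sub-folder layout may differ.
model_dir = os.path.expanduser('~/niftynet/models/mr_ct_regression')
pattern = os.path.join(model_dir, '**', 'model.ckpt-100*')
for checkpoint_file in sorted(glob.glob(pattern, recursive=True)):
    print(checkpoint_file)
```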
## Generating error maps
Command line parameters ``--spatial_window_size 240,240,1 --batch_size 4``
set the inference window size and batch size; adjust them for efficiency.
With the parameter ``--error_map True``,
the errors (elementwise squared differences between prediction and target) will be written to
``~/niftynet/models/mr_ct_regression/error_maps``.
```bash
python net_run.py inference \
-a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
-c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
--inference_iter 100 --spatial_window_size 240,240,1 --batch_size 4 --error_map True
```
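The error maps themselves are ordinary NIfTI volumes. The sketch below is purely illustrative (placeholder file names, not part of the demo) and shows how an elementwise squared-difference map between a prediction and its reference CT can be computed:
```python
import nibabel as nib
import numpy as np

# Placeholder file names -- replace with an actual prediction/reference pair.
prediction = nib.load('predicted_ct.nii.gz')
reference = nib.load('reference_ct.nii.gz')

# elementwise squared difference, saved in the prediction's voxel space
error = np.square(prediction.get_fdata() - reference.get_fdata())
nib.save(nib.Nifti1Image(error, prediction.affine), 'error_map.nii.gz')
```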
## Continuing training by sampling according to the error maps
The command line parameter ``--starting_iter -1``
indicates that training resumes from the most recently saved checkpoint (here, iteration 100).
```bash
python net_run.py train \
-a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
-c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
--starting_iter -1 --max_iter 200
```
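Conceptually, the weighted sampler draws training windows with probability proportional to the stored error values, so regions that are currently poorly regressed are visited more often. A minimal numpy sketch of this idea (not NiftyNet's actual implementation, and using a random array in place of a real error map):
```python
import numpy as np

rng = np.random.default_rng(0)
error_map = rng.random((240, 240, 96))        # stand-in for a saved error map volume

# normalise the error values into a sampling distribution over voxels
probabilities = error_map.ravel() / error_map.sum()

# draw a batch of window-centre indices; high-error voxels are picked more often
centre_indices = rng.choice(error_map.size, size=4, p=probabilities)
centres = np.stack(np.unravel_index(centre_indices, error_map.shape), axis=-1)
print(centres)                                 # (batch_size, 3) voxel coordinates
```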
## Putting the steps together
The following script (run from the git-cloned source code) alternates between error map generation
and training with the new sampling weights:
```bash
# download model zoo files, then run the initial 100 training iterations
python net_download.py mr_ct_regression_model_zoo
python net_run.py train \
  -a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
  -c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
  --starting_iter 0 --max_iter 100

# alternate between error map generation and resumed training,
# extending max_iter by 100 iterations per round up to 5000
for max_iter in $(seq 200 100 5000)
do
  # generate error maps from the latest checkpoint
  python net_run.py inference \
    -a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
    -c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
    --inference_iter -1 --spatial_window_size 240,240,1 --batch_size 4 --error_map True
  # resume training with window sampling weighted by the new error maps
  python net_run.py train \
    -a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
    -c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
    --starting_iter -1 --max_iter $max_iter
done
```
This script runs training for 5000 iterations in total,
with new sampling weights generated every 100 iterations.
To view the training/validation curves with TensorBoard:
```bash
tensorboard --logdir ~/niftynet/models/mr_ct_regression/logs
```
## Generating regression output
Finally, regression maps on the test set can be generated by
running inference without the ``--error_map True`` parameter:
```bash
python net_run.py inference \
-a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
-c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
--inference_iter -1 --spatial_window_size 240,240,1 --batch_size 4
```
To generate results on the training, validation, and test sets, set
``--dataset_split_file nofile`` to override the splitting file at
``~/niftynet/models/mr_ct_regression/dataset_split_file.txt``:
```bash
python net_run.py inference \
-a niftynet.contrib.regression_weighted_sampler.isample_regression.ISampleRegression \
-c ~/niftynet/extensions/mr_ct_regression/net_isampler.ini \
--inference_iter -1 --spatial_window_size 240,240,1 --batch_size 4 --dataset_split_file nofile
```
The output can be found at ``~/niftynet/models/isampler_output/``.
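To get a quick quantitative impression of the results, the outputs can be compared against the reference CT volumes, for example with the mean absolute error. The sketch below uses hypothetical file names; adapt the paths to the files actually written to the output folder:
```python
import os

import nibabel as nib
import numpy as np

# Hypothetical file names -- point these at an actual output volume and its reference CT.
output_path = os.path.expanduser('~/niftynet/models/isampler_output/example_output.nii.gz')
reference_path = 'example_reference_ct.nii.gz'

prediction = nib.load(output_path).get_fdata()
reference = nib.load(reference_path).get_fdata()
print('mean absolute error: {:.3f}'.format(np.mean(np.abs(prediction - reference))))
```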
[config]
version = 1.0
[code]
local_id = mr_ct_regression
url = https://www.dropbox.com/s/ll2kr19suwlof6g/mr_ct_regression_model_zoo_config.tar.gz?dl=1
action = expand
destination = extensions