author	taesung89 <taesung89@gmail.com>	2017-11-05 01:03:55 +0900
committer	GitHub <noreply@github.com>	2017-11-05 01:03:55 +0900
commit	b546d99d32d377287f7cfa9130aac1a3b1c980c2 (patch)
tree	f52c509f8a3765f03492e03ecf5272e1387892c5
parent	276a568218c0d5331d8abb87d141a5c0f4bf6b3f (diff)
Update README.md
-rw-r--r--	README.md	4
1 file changed, 2 insertions(+), 2 deletions(-)
diff --git a/README.md b/README.md
index ad371cb..87e40ea 100644
--- a/README.md
+++ b/README.md
@@ -123,13 +123,13 @@ The test results will be saved to a html file here: `./results/facades_pix2pix/l
More example scripts can be found at `scripts` directory.
### Apply a pre-trained model (CycleGAN)
-If you would like to apply a pre-trained model to a collection of input photos (without image pairs), please use `--dataset_mode single` and `--model test` options. Here is a script to apply a model to facade label maps (stored in the directory `facades/testB`).
+If you would like to apply a pre-trained model to a collection of input photos (without image pairs), please use `--dataset_mode single` and `--model test` options. Here is a script to apply a model to Facade label maps (stored in the directory `facades/testB`).
``` bash
#!./scripts/test_single.sh
python test.py --dataroot ./datasets/facades/testA/ --name {my_trained_model_name} --model test --dataset_mode single
```
-You might have to specify `--which_model_netG` to match the generator architecture of the trained model.
+You might want to specify `--which_model_netG` to match the generator architecture of the trained model.
Note: We currently don't have pretrained models using PyTorch. This is in part because the models trained using Torch and PyTorch produce slightly different results, although we were not able to decide which result is better. If you would like to generate the same results that appeared in our paper, we recommend using the pretrained models in the Torch codebase.