Deep exemplar-based colorization

SIGGRAPH 2018


Reviews

Information

  • Paper topic: Images
  • Software type: Code
  • Able to run a replicability test: True
  • Replicability score: 5
  • Software language: C/C++, Python
  • License: MIT
  • Build mechanism: IDE Project (VS,..)
  • Dependencies: zlib / Caffe / protobuf / protoc / boost / Glog / HDF5 / LMDB / Gflags / LevelDB / Snappy / CUDA / cuDNN / OpenCV / ATLAS / PyTorch
  • Documentation score {0,1,2}: 1
  • Reviewer: Nicolas Mellado <nmellado0@gmail.com>
  • Time spent for the test (build->first run, timeout at 100min): 10min

Source code information

Comments

As reviewer 1, I could not build the provided Visual Studio solution (see the details of my attempt below).
However, the code ships with precompiled demos that I did not find at first glance.
Before running the demo, I had to install the following dependencies (tested with Python 3.6):
>pip install image opencv-python scikit-image
Then, to install PyTorch, I followed the instructions at https://pytorch.org/get-started/locally/; in my case this was
>pip install torch===1.4.0 torchvision===0.5.0 -f https://download.pytorch.org/whl/torch_stable.html
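A quick way to confirm the installs above succeeded is to check that each package can be imported. This is a hypothetical sanity check, not part of the repository; note that some pip package names differ from their import names (opencv-python imports as cv2, scikit-image as skimage, and the image/Pillow package as PIL):

```python
import importlib.util

def missing_modules(modules):
    """Return the subset of module names that cannot be imported."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

# Import names for the demo's pip dependencies
demo_deps = ["PIL", "cv2", "skimage", "torch", "torchvision"]
print("missing:", missing_modules(demo_deps))
```

If the printed list is empty, the Python side of the demo should be ready to run.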

Then you need to download the models and data and place them as described here: https://github.com/msracver/Deep-Exemplar-based-Colorization#download-models

Then simply run the script demo\run.bat. It should output two colorized images in demo\examples\res\.

Compilation attempt:
=================
I could, however, compile Caffe and several of its dependencies using CK (https://codereef.ai/portal/c/032630d041b4fd8a:7d5e081bda47dcbc/), namely OpenCV, Boost, gflags, glog, HDF5, LMDB, and protobuf. I had to install a precompiled version of 


---- Review 2 -----
Could not test on Windows because Caffe is unavailable for VS2017. I spent several hours trying to compile Caffe with VS2017, but stopped after compiling the 13th dependency, ATLAS, which builds Unix-style under Cygwin. Pre-built binaries *may* exist for VS2015. Caffe has not been maintained since 2019.
