Single-photon 3D imaging with deep sensor fusion




  • Paper topic: Images
  • Software type: Code
  • Able to run a replicability test: False
  • Replicability score: 1
  • Software language: Python, Matlab / Mathematica / ..
  • License: unspecified
  • Build mechanism: Other script, Not applicable (python, Matlab..)
  • Dependencies: matlab / torch / scikit_image / scipy / tqdm / numpy / skimage / tensorboardX / torchvision / opencv / libann
  • Documentation score {0,1,2}: 1
  • Reviewer: David Coeurjolly <>
  • Time spent for the test (build->first run, timeout at 100min): 100min

Source code information


We tried and failed to use the provided code three different times on Linux and macOS, and could not generate the training data.

Training data generation requires many steps to convert the NYU dataset into simulated SPAD measurements. After downloading (see the provided script download_nyu_dataset.bash) and inflating the NYUV2 dataset, I took the following actions:
 1. I compiled OpenCV 2.4 and libann, and modified simulated_data/intrinsic_texture/mex/compile.m with the installation paths of these libraries. The compilation of the MEX files ran smoothly.
 2. I ran simulated_data/nyu_utils/compile.m. Since I did not have libfreenect at hand, I commented out the first two lines of the script.
 3. In order to run the simulated_data/ConvertRGBD.m script, I had to move the OpenCV libraries into simulated_data so that MATLAB could dynamically link against them (there is probably a better way to set up the linking, but I could not find it).
MATLAB then finds the libraries, but triggers the following error:
ERROR: MATLAB:unexpectedCPPexception
Unexpected Standard exception from MEX file.
What() is:/mathworks/devel/bat/B3p1/build/3p/sources/OpenCV/modules/core/src/matrix.cpp:1319: error: (-213) Unknown/unsupported array type in function getMat_
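Step 3 above worked around the linking problem by copying the OpenCV shared libraries into simulated_data/. A less intrusive alternative, sketched below, is to point the dynamic linker at the OpenCV install directory before launching MATLAB. This is untested in this review, and /opt/opencv-2.4/lib is a hypothetical install prefix.

```shell
#!/bin/sh
# Sketch: expose the OpenCV 2.4 shared libraries to the dynamic linker
# instead of copying them next to the MEX files.
# /opt/opencv-2.4/lib is an assumed install prefix -- adjust to yours.
OPENCV_LIB=/opt/opencv-2.4/lib
export LD_LIBRARY_PATH="$OPENCV_LIB:$LD_LIBRARY_PATH"

# On macOS, the equivalent variable is DYLD_LIBRARY_PATH:
# export DYLD_LIBRARY_PATH="$OPENCV_LIB:$DYLD_LIBRARY_PATH"

# Then launch MATLAB from this same shell so it inherits the variable, e.g.:
# matlab -batch "run('simulated_data/ConvertRGBD.m')"

echo "$LD_LIBRARY_PATH"
```

Whether this avoids the `getMat_` crash is unknown; it only addresses the library-lookup step, not the MEX runtime error itself.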

Considering the time required to download the data and compile the dependencies, I stopped after 10 hours of experiments.

If you want to contribute another review, please follow these instructions.

Please consider cutting, pasting, and editing the raw JSON data attached to this paper.