Commit Graph

6809 Commits

Author SHA1 Message Date
Davis King c68af9dcbf Merge branch 'OranjeeGeneral-master' 2017-06-26 21:08:55 -04:00
Davis King 31bcddd5e4 Cleaned up documentation for conv_. Also removed unnecessary tensor
reallocation and copying inside conv_'s backward pass.  Doing this
required adding an add_to_output boolean option to the methods of
tensor_conv.
2017-06-26 21:06:59 -04:00
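The add_to_output option controls whether the convolution routine overwrites its output tensor or accumulates into it, which is what lets the backward pass avoid the temporary tensor and copy. A minimal sketch of that semantics (illustrative only, not dlib's actual tensor_conv interface):

    #include <cstddef>
    #include <vector>

    // Illustrative only: shows the add_to_output semantics, not dlib's tensor_conv API.
    void store_result(std::vector<float>& output,
                      const std::vector<float>& result,
                      bool add_to_output)
    {
        for (std::size_t i = 0; i < output.size(); ++i)
        {
            if (add_to_output)
                output[i] += result[i];  // accumulate into the existing gradient buffer
            else
                output[i] = result[i];   // overwrite the previous contents
        }
    }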
Davis King b3d5dbd3b3 Fixed typo in comment. 2017-06-26 21:01:47 -04:00
Davis King f9bb4f472b Merge branch 'master' of git://github.com/OranjeeGeneral/dlib into OranjeeGeneral-master 2017-06-26 14:59:03 -04:00
OranjeeGeneral ecb7095e4a Refactored the interface to reduce complexity: the conv and convt layers' forward passes now have to call setup explicitly, and there is only one ()-operator. 2017-06-22 17:55:20 +01:00
Davis King 48c68f218e Fixed mex class code printing 2017-06-21 17:06:48 -04:00
Davis King c5847374f4 Added operator= for simd8f so assignment from float compiles. 2017-06-21 09:10:34 -04:00
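A minimal example of what that new operator= enables (assuming dlib/simd.h is the header that pulls in simd8f):

    #include <dlib/simd.h>

    int main()
    {
        dlib::simd8f v;          // 8-lane single-precision SIMD wrapper
        v = 2.5f;                // broadcast-assign from a plain float; this is what the new operator= allows
        dlib::simd8f w = v + v;  // arithmetic works as before
        (void)w;
        return 0;
    }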
Davis King cbd187fb61 merged 2017-06-19 20:55:51 -04:00
Davis King 39be45ada2 Made it so pressing e in imglab toggles between views of the image where the
histogram is equalized or unmodified.  This way, if you are looking at
particularly dark or badly contrasted images, you can toggle this mode and maybe
get a better view of what you are labeling.
2017-06-19 20:54:45 -04:00
Davis King ba72c2f95c Updated code to work with new random_cropper interface. 2017-06-18 08:11:54 -04:00
Davis King 17b48b97bb Changed the random_cropper's interface so that instead of talking in terms of
min and max object height, it's now min and max object size.  This way, if you
have objects that are short and wide (i.e. objects where the relevant dimension
is width rather than height) you will get sensible behavior out of the random
cropper.
2017-06-17 12:34:26 -04:00
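For code that used the old height-based setters, a usage sketch of the size-based interface (the call forms below follow dlib's current random_cropper setters and are assumptions with respect to this exact revision):

    #include <dlib/image_transforms/random_cropper.h>

    int main()
    {
        dlib::random_cropper cropper;
        cropper.set_chip_dims(200, 200);     // rows, cols of the produced crops
        // Size-based limits replace the old min/max object *height* settings.
        // The argument meanings here are assumptions made for illustration.
        cropper.set_min_object_size(40, 40);
        cropper.set_max_object_size(0.7);
        return 0;
    }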
Joachim 3a91295ea7 Merge branch 'master' of https://github.com/davisking/dlib 2017-06-13 13:33:42 +01:00
Grigoris 75f6658223 Minor typo corrections in shape predictor trainer. (#633) 2017-06-07 06:51:19 -04:00
Plumtus 362bec1099 Reinitialize averagers when saved sync file was reloaded. (#629) 2017-06-06 14:19:23 -04:00
Davis King d2b80bfe6f Switched order of things in if statement so cmake hopefully won't give weird errors. 2017-06-06 05:47:00 -04:00
Davis King 74fbca45ca Changed the converter so that, rather than producing one python file with
everything in it, it now makes a python file as before plus an additional binary
file with all the weights in it.  This way, if you are working with a network
with a very large number of weights, you won't end up with a crazy large python
script.
2017-06-04 10:06:44 -04:00
Davis King f9eab48813 updated docs 2017-06-02 22:51:59 -04:00
Davis King 3ee460da93 Added set_rect_area() 2017-06-02 22:50:01 -04:00
Davis King 5bc1792d4f Added a .fill() member to curand_generator that can create random 32-bit
integers.
2017-05-31 16:08:00 -04:00
Davis King 2ee1036299 merged 2017-05-30 17:18:49 -04:00
Davis King e8e064e534 Added --compiler-flags to setup.py so you can pass options directly to gcc. 2017-05-30 17:18:04 -04:00
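For example, an invocation along the lines of  python setup.py install --compiler-flags "-O3"  should forward -O3 straight to the compiler (the surrounding arguments here are illustrative).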
Evgeniy Fominov 9ed2ba9e5a Possible CLang fix for Neon-based SIMD4i (#612) 2017-05-30 05:40:11 -04:00
Davis King 0ef3b736fd Relaxed the default non-max suppression parameters used by the mmod_options
object so that users of the deep learning MMOD tool don't get spurious errors
about impossibly labeled objects during training.
2017-05-29 20:06:37 -04:00
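These defaults live in the mmod_options object and can still be overridden after construction; a usage sketch (the constructor arguments and the overlaps_nms member follow dlib's documented mmod_options interface and are assumptions with respect to this exact revision):

    #include <dlib/dnn.h>
    #include <vector>

    int main()
    {
        // Training boxes, one vector per image; a single 50x50 box keeps the sketch self-contained.
        std::vector<std::vector<dlib::mmod_rect>> boxes(1);
        boxes[0].push_back(dlib::mmod_rect(dlib::rectangle(0, 0, 49, 49)));

        dlib::mmod_options options(boxes, 40, 40);   // target window size / minimum size, in pixels

        // The relaxed non-max suppression defaults can be tightened again if desired:
        options.overlaps_nms = dlib::test_box_overlap(0.4, 1.0);
        return 0;
    }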
Davis King 88383a848b Made the converter handle caffe's odd pooling layer output size calculations. 2017-05-28 11:07:02 -04:00
Davis King 724cb5006f Work around a bug in visual studio 2015. 2017-05-27 12:56:24 -04:00
Davis King df19361c8f Made calling clean() on network objects also call clean() on any layer detail
objects that provide a .clean() method.
2017-05-27 11:59:32 -04:00
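The behavior amounts to invoking clean() only on layer detail types that actually define it. A hypothetical sketch of that kind of compile-time dispatch (not dlib's actual implementation):

    // Preferred overload: participates in overload resolution only when T has a callable clean().
    template <typename T>
    auto clean_if_available(T& obj, int) -> decltype(obj.clean(), void())
    {
        obj.clean();  // the layer's detail object releases whatever clean() is meant to release
    }

    // Fallback overload: chosen when no clean() method exists; does nothing.
    template <typename T>
    void clean_if_available(T&, long)
    {
    }

    template <typename T>
    void clean_layer_details(T& details)
    {
        clean_if_available(details, 0);  // 0 is an int, so the clean()-calling overload wins when it is valid
    }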
Davis King 115e8b6dfa updated docs 2017-05-26 17:24:44 -04:00
Davis King e1f2bb3859 Added visit_layers_until_tag() 2017-05-26 17:23:58 -04:00
Davis King 44387e396d merged 2017-05-24 07:25:51 -04:00
Davis King a88f1bd8f2 Made the converter add zero padding layers when needed by Eltwise to replicate
the behavior of dlib's add_prev layers.
2017-05-24 07:24:12 -04:00
Davis King 984b694962 Made error message slightly better. 2017-05-22 19:11:00 -04:00
Davis King cbda2b9e33 Changed caffe converter to require the user to specify the input tensor size
when the converter is run.
2017-05-22 19:06:55 -04:00
Juha Reunanen 97ff8cb2a9 Noticed compiler warning C4800: 'double': forcing value to bool 'true' or 'false' (performance warning) (#538)
- suggests changing the test to what was perhaps the original intention
2017-05-22 08:58:27 -04:00
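The warning fires when a double is implicitly forced to bool; the usual fix is an explicit comparison. A generic illustration of the pattern (not the exact dlib test that was changed):

    int main()
    {
        double score = 0.25;
        // bool hit = score;       // implicit double -> bool conversion: triggers C4800 on MSVC
        bool hit = (score != 0);   // explicit comparison states the intended test and avoids the warning
        return hit ? 0 : 1;
    }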
Davis King 523e020e18 updated docs 2017-05-20 09:40:48 -04:00
emptyVoid 465e95424d Move shape_predictor_trainer to a separate header (#599)
Moved shape_predictor_trainer to its own header in order to support the
use of shape_predictor with DLIB_ISO_CPP_ONLY defined (i.e. without
threading API wrappers).
2017-05-20 09:37:40 -04:00
Davis King 10d3f93333 Dlib and caffe actually do use max pooling layers with padding in the same way.
So I just removed the error check that was preventing the conversion from
proceeding in that case.  I also added more useful output messages about
setting input tensor dimensions.
2017-05-19 20:16:40 -04:00
Davis King e28768eafa Fixed uninitialized variable in test that caused unreliable test results. 2017-05-17 16:43:32 -04:00
Davis King 537864f6b7 Cleanup and also fixed a cmake error when building dlib outside a project. 2017-05-16 21:14:09 -04:00
Davis King 5515d9cf82 merged 2017-05-16 20:51:44 -04:00
Davis King 4c4376722d yet more print statements 2017-05-16 20:50:34 -04:00
tschw f154fa76d7 Fix installed CMake targets when DLIB_ISO_CPP_ONLY (#595) (#597) 2017-05-16 14:47:53 -04:00
Davis King cb6777422f A test to see more info about what's going wrong on appveyor. 2017-05-16 09:55:34 -04:00
Davis King 23332d6852 More logging messages 2017-05-15 07:12:18 -04:00
Davis King 334ba38ec6 A minor change to hopefully reduce inane warnings from visual studio. 2017-05-14 20:07:16 -04:00
Davis King fa94cdfa96 Removed references to old smart pointers from the docs. 2017-05-14 19:59:37 -04:00
Davis King 65aad55748 removed cruft 2017-05-14 19:55:01 -04:00
elelel b57b8b20aa Migrate from dlib::scoped_ptr to std::unique_ptr (#593)
* Convert unique_ptr

* Fix passing unique_ptr as copy by value

* Remove scoped_ptr implementation

* Fix missed files

* Move bool cast into tester macro

* Reexport scoped_ptr alias from sockets
2017-05-14 19:52:34 -04:00
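The migration is mostly a drop-in rename in user code, with the caveat that std::unique_ptr is move-only and cannot be passed by value as a copy. A minimal sketch (the widget type and consume function are illustrative):

    #include <memory>
    #include <utility>

    struct widget { int value = 0; };

    // unique_ptr is move-only, so transferring ownership into a function must be explicit.
    void consume(std::unique_ptr<widget> w) { (void)w; }

    int main()
    {
        // Previously: dlib::scoped_ptr<widget> w(new widget);
        std::unique_ptr<widget> w(new widget);  // drop-in replacement in most cases
        w->value = 42;
        consume(std::move(w));                  // the "fix passing unique_ptr as copy by value" case
        return 0;
    }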
Davis King 31f02b00eb merged 2017-05-14 19:40:32 -04:00
Davis King 77b051bef0 Don't use parallel builds since they make appveyor run out of RAM. Also
simplified the script a little.
2017-05-14 19:40:10 -04:00
elelel ca60cf8aea Change type traits from inherited to explicit (#591) 2017-05-13 18:05:52 -04:00