Commit Graph

456 Commits

Author SHA1 Message Date
Davis King bdbc8e418b Renamed something to avoid a name clash with the standard library. 2016-07-22 16:22:57 -04:00
Evgeniy Fominov bbeac285d1 Shape predictor trainer optimizations (#126)
* Shape predictor trainer optimizations

* Fixed a performance leak in single-thread mode & added VS2010 support
2016-07-22 09:11:13 -04:00
Davis King 5e550a261e Added some more comments 2016-06-25 18:31:21 -04:00
Davis King 15f4081cdf fixed compiler warning 2016-06-25 13:03:12 -04:00
Davis King a9343acc51 Changed code so the validation listing file doesn't have to be in the imagenet
root folder.
2016-06-25 12:31:59 -04:00
Davis King efcdc871e4 fixed compiler warnings 2016-06-25 11:17:07 -04:00
Davis King f88f784a4e Minor formatting cleanup 2016-06-25 09:47:36 -04:00
Davis King 2469352e95 fixed typo 2016-06-25 09:42:22 -04:00
Davis King fcf7ab6daa Updated examples to refer to the correct file names. 2016-06-25 09:40:11 -04:00
Davis King a76b642a4e renamed examples:
rename: examples/dnn_mnist_advanced_ex.cpp => examples/dnn_introduction2_ex.cpp
rename: examples/dnn_mnist_ex.cpp => examples/dnn_introduction_ex.cpp
2016-06-25 09:34:53 -04:00
Davis King 541ce716b9 Added the program that made the resnet model. 2016-06-25 09:26:51 -04:00
Davis King 1123eaa134 Changed the message that CMake displays when OpenCV isn't found so users don't get confused. 2016-06-24 01:28:52 -04:00
Davis King 87493f4971 Added some comments 2016-06-22 22:30:45 -04:00
Davis King f453b03f39 Added an example showing how to classify imagenet images. 2016-06-22 22:26:48 -04:00
Fm cc38772715 #pragma warning moved to dnn.h 2016-06-22 18:09:26 +03:00
Fm 2e741703ef removed a stray empty line 2016-06-22 17:54:28 +03:00
Fm 9930d3279e removed comment from net printing 2016-06-22 17:53:37 +03:00
Fm f3b0159ef1 #pragma warning for C4503 and /bigobj 2016-06-22 17:51:06 +03:00
Fm 63c2465f32 Added compiler flags so VS compiles the DNN samples without warnings 2016-06-22 17:22:43 +03:00
Davis King 1c01eaec1d updated example comments 2016-06-11 11:54:44 -04:00
Davis King 6e0f13ba06 minor cleanup 2016-05-30 13:14:04 -04:00
Davis King 53e9c15811 Clarified some parts of the example. 2016-05-30 08:50:28 -04:00
Fm d32bcdfa3d Changed concat syntax to concat1, concat2, ...; made dtest more readable. 2016-05-27 09:56:00 +03:00
Fm 2f7d3578d2 Added layer access and printing examples to inception sample 2016-05-26 19:40:10 +03:00
Fm 1f0318e222 depth_group replaced with concat layer 2016-05-26 17:43:54 +03:00
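
The three concat commits above are the basis of dlib's inception-style blocks. A minimal sketch of the resulting usage, assuming the concat/inception convenience templates added alongside this work (branch widths and filter sizes here are illustrative):

```cpp
#include <dlib/dnn.h>
using namespace dlib;

// Two parallel convolution branches; the concat layer joins their outputs
// along the channel dimension.
template <typename SUBNET> using branch_1x1 = relu<con<10,1,1,1,1,SUBNET>>;
template <typename SUBNET> using branch_3x3 = relu<con<10,3,3,1,1,SUBNET>>;

// inception2 tags each branch internally and concatenates them;
// inception3, inception4, ... handle more branches the same way.
template <typename SUBNET> using incept = inception2<branch_1x1, branch_3x3, SUBNET>;

using net_type = loss_multiclass_log<fc<10, incept<input<matrix<unsigned char>>>>>;
```
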
Fm 93e786db6c Merge branch 'master' of https://github.com/davisking/dlib into dnn_group_layer 2016-05-26 17:15:56 +03:00
Davis King b9332698fe updated example 2016-05-23 22:01:47 -04:00
Davis King 5e70b7a2c6 Cleaned up code a little and made the example use a better version of the architecture. 2016-05-22 13:17:10 -04:00
Davis King 0cd76f899b Added an error message if a camera isn't available. 2016-05-18 22:22:56 -04:00
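
This presumably refers to the webcam example's OpenCV capture check; a minimal sketch of that pattern (the device index 0 is an assumption):

```cpp
#include <opencv2/highgui/highgui.hpp>
#include <iostream>

int main()
{
    cv::VideoCapture cap(0);   // open the default camera
    if (!cap.isOpened())
    {
        std::cerr << "Unable to connect to camera" << std::endl;
        return 1;
    }
}
```
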
Fm 28c4a48281 Grouping layer added 2016-05-17 13:07:04 +03:00
Davis King ee2a0070db Added comment to show how to deserialize a network. 2016-05-15 14:52:33 -04:00
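
A minimal sketch of the documented pattern, using dlib's stream-style deserialize (the network definition and filename here are placeholders):

```cpp
#include <dlib/dnn.h>
using namespace dlib;

using net_type = loss_multiclass_log<fc<10, input<matrix<unsigned char>>>>;

int main()
{
    net_type net;
    deserialize("mnist_network.dat") >> net;  // restore trained weights
}
```
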
Davis King ba0f7c5c53 Added a function to dnn_trainer that lets you query the "steps without
progress" estimate.  I also renamed the get/set functions for the shrink amount
to have a consistent name and use the word "factor" instead of "amount".
2016-05-15 14:48:06 -04:00
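
A hedged sketch of the trainer API after this commit; the exact spellings of get_steps_without_progress and the "...factor" setter are assumed from the commit message:

```cpp
#include <dlib/dnn.h>
#include <iostream>
using namespace dlib;

using net_type = loss_multiclass_log<fc<10, input<matrix<unsigned char>>>>;

int main()
{
    net_type net;
    dnn_trainer<net_type> trainer(net);
    // Renamed setter: "factor" rather than "amount".
    trainer.set_learning_rate_shrink_factor(0.1);
    // New query: the trainer's current "steps without progress" estimate.
    std::cout << trainer.get_steps_without_progress() << std::endl;
}
```
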
Davis King 13cc545da3 clarified comments. 2016-05-15 14:31:06 -04:00
Davis King 66166c674d Changed the solver interface to take the learning rate and the layer details
object as an input.  This allows the solvers to exhibit a more complex behavior
that depends on the specific layer.  It also removes the learning rate from the
solver's parameter set and pushes it entirely into the core training code.
This also removes the need for the separate "step size" which previously was
multiplied with the output of the solvers.

Most of the code is still the same, and in the core and trainer the step_size
variables have just been renamed to learning_rate.  The dnn_trainer's relevant
member functions have also been renamed.

The examples have been updated to reflect these API changes.  I also cleaned up
the resnet definition and added better downsampling.
2016-05-14 20:30:45 -04:00
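
A sketch of what a solver looks like under the interface described above: the trainer hands each solver the current learning rate and the layer object on every update. This toy SGD is illustrative, not dlib's shipped sgd solver (which also takes weight decay and momentum):

```cpp
#include <dlib/dnn.h>
using namespace dlib;

class toy_sgd
{
public:
    template <typename layer_type>
    const tensor& operator() (
        float learning_rate,
        const layer_type& /*l*/,       // layer details; enables per-layer behavior
        const tensor& params_grad
    )
    {
        // step = -learning_rate * gradient; the trainer adds the returned
        // tensor to the layer's parameters.
        step.copy_size(params_grad);
        step = -learning_rate * mat(params_grad);
        return step;
    }

private:
    resizable_tensor step;
};
```
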
Davis King 1e70c721a4 Made example use the "everything" version of avg pooling. 2016-05-07 14:30:42 -04:00
Davis King 4a7633056c Fixed avg pooling filter sizes to avoid errors with the new rules about
strides that aren't equal to one.
2016-05-04 21:40:29 -04:00
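
The two avg pooling commits above pair naturally: avg_pool_everything pools over the entire input plane, so there are no filter size or stride parameters to keep consistent with the new stride rules. A minimal sketch (the surrounding layers are illustrative):

```cpp
#include <dlib/dnn.h>
using namespace dlib;

// Global average pooling head: avg_pool_everything takes no filter size or
// stride template arguments.
using net_type = loss_multiclass_log<
    fc<10,
    avg_pool_everything<
    relu<con<16,3,3,1,1,
    input<matrix<unsigned char>>>>>>>;
```
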
Davis King 1f0705ae92 clarified example 2016-04-28 19:41:27 -04:00
Davis King d31723ff45 Fixed typo in example 2016-04-19 06:44:31 -04:00
Davis King b16cc99e8f Added comments about using multiple GPUs 2016-04-18 22:48:07 -04:00
Davis King 603d474352 - Renamed network_type::num_layers to network_type::num_computational_layers.
- Made layer() recurse into repeat objects so that the index given to layer()
  does what you would expect.
- Added an operator<< for network objects that prints the network architecture.
2016-04-16 10:50:15 -04:00
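
A short sketch exercising all three changes (the network definition itself is illustrative):

```cpp
#include <dlib/dnn.h>
#include <iostream>
using namespace dlib;

using net_type = loss_multiclass_log<fc<10, relu<fc<84, input<matrix<unsigned char>>>>>>;

int main()
{
    net_type net;
    std::cout << net;  // new operator<< prints the network architecture
    std::cout << "computational layers: "
              << net_type::num_computational_layers << "\n";  // renamed constant
    auto& l = layer<1>(net);  // indices now recurse into repeat<> objects
    (void)l;
}
```
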
Davis King 61591b13e2 Seeded the random number generator with the clock, since that's generally a
good thing to do for this kind of training.
2016-04-11 23:11:18 -04:00
Davis King 02c27ff916 fixed formatting 2016-04-11 23:06:32 -04:00
Davis King 423cd85594 renamed a file:
rename: examples/dnn_mnist_resnet_ex.cpp => examples/dnn_mnist_advanced_ex.cpp
2016-04-11 22:57:11 -04:00
Davis King 902a2beeaf Fleshed out these examples more. 2016-04-11 22:55:49 -04:00
Davis King 02b844ea5c Fixed grammar and clarified a few things. 2016-04-11 21:18:14 -04:00
Davis King 7d7c932f29 Added a narrative to this example. 2016-04-10 17:30:45 -04:00
Davis King 67a81c1c51 Made examples work with new fc<> template. 2016-04-10 12:11:19 -04:00
Davis King f9cb3150d0 Upgraded to cuDNN v5. Also changed the affine_ layer so it isn't templated
but instead automatically selects the right mode.  The serialization format
for bn_ layers has also changed, but the code will still be able to deserialize older
2016-04-10 10:52:40 -04:00
Davis King fe168596a2 Moved most of the layer parameters from runtime variables set in constructors
to template arguments.  This way, the type of a network specifies the entire
network architecture and most of the time the user doesn't even need to do
anything with layer constructors.
2016-04-08 23:12:53 -04:00
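
The practical upshot, sketched with the LeNet-style definition from the introduction example: every filter count, size, and stride lives in the type, so a default-constructed object is the complete network:

```cpp
#include <dlib/dnn.h>
using namespace dlib;

// The type is the architecture: con<16,5,5,1,1,...> means 16 filters of
// size 5x5 with stride 1x1, and so on.
using net_type = loss_multiclass_log<
    fc<10,
    relu<fc<84,
    max_pool<2,2,2,2, relu<con<16,5,5,1,1,
    max_pool<2,2,2,2, relu<con<6,5,5,1,1,
    input<matrix<unsigned char>>
    >>>>>>>>>>;

net_type net;  // no constructor arguments needed
```
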
Davis King 030f5a0a76 A bit more cleanup 2016-03-27 10:50:52 -04:00