included in the edge graph. If it isn't, then the output labels from
chinese_whispers() would be missing faces in this degenerate case. This fixes
a bug where chinese_whispers(), when called from Python, would sometimes
return a labels array that didn't include labels for all the inputs.
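For context, here is a minimal C++ sketch of the kind of clustering call this
affects (the Python binding wraps the same chinese_whispers() routine); the
descriptor type and the 0.6 threshold are only illustrative. The self-edge
added in the inner loop is what keeps every input in the edge graph:

    #include <dlib/clustering.h>
    #include <dlib/matrix.h>
    #include <vector>
    using namespace dlib;

    // Cluster a set of descriptor vectors with chinese_whispers().
    std::vector<unsigned long> cluster_descriptors(
        const std::vector<matrix<double,0,1>>& descriptors)
    {
        std::vector<sample_pair> edges;
        for (size_t i = 0; i < descriptors.size(); ++i)
        {
            // Starting j at i adds a self-edge for every item, so each item
            // appears in the edge graph and is guaranteed to get a label.
            for (size_t j = i; j < descriptors.size(); ++j)
            {
                if (i == j || length(descriptors[i] - descriptors[j]) < 0.6)
                    edges.push_back(sample_pair(i, j));
            }
        }
        std::vector<unsigned long> labels;
        chinese_whispers(edges, labels);  // labels.size() == descriptors.size()
        return labels;
    }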
dimensions in the same format as the mmod_options object (i.e. two lengths
measured in pixels). This should make defining random cropping strategies that
are consistent with MMOD settings much more straightforward, since you can
just take the mmod_options settings, give them to the random_cropper, and it
will do the right thing.
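A rough sketch of the intended usage, assuming the training boxes are already
loaded; the 70x30 pixel object size and the 200x200 chip size are only
illustrative values:

    #include <dlib/dnn.h>
    #include <dlib/image_transforms.h>
    #include <vector>
    using namespace dlib;

    void configure_cropper(const std::vector<std::vector<mmod_rect>>& training_boxes)
    {
        // Detection windows for objects at least 70 pixels on the long side
        // and 30 pixels on the short side.
        mmod_options options(training_boxes, 70, 30);

        random_cropper cropper;
        cropper.set_chip_dims(200, 200);
        // Hand the cropper the same two pixel lengths given to mmod_options
        // so the crops it produces are consistent with the MMOD settings.
        cropper.set_min_object_size(70, 30);
    }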
* Use banded Cholesky factorization if possible (a usage sketch follows this list)
Computation cost drops from n*n*n to n*n*b, where b is the band size
* Tidy up whitespace
* Remove typo
* Escape from banded matrix detection correctly
* Use LAPACK banded Cholesky factorisation where possible
* Add banded chol tests
* Add test for banded chol in column major layout
* Use row major layout for banded chol - more efficient as we will pass to LAPACK
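As a usage sketch: callers keep calling chol() as before, and banded symmetric
positive definite inputs like the tridiagonal matrix below should be picked up
by the banded-matrix detection and routed to the cheaper LAPACK routine when
LAPACK is available. The matrix values here are only illustrative:

    #include <dlib/matrix.h>
    #include <iostream>
    using namespace dlib;

    int main()
    {
        // A small symmetric positive definite tridiagonal (banded) matrix.
        const long n = 6;
        matrix<double> A = zeros_matrix<double>(n, n);
        for (long i = 0; i < n; ++i)
        {
            A(i, i) = 4;
            if (i + 1 < n)
            {
                A(i, i + 1) = 1;
                A(i + 1, i) = 1;
            }
        }

        // chol() returns the lower triangular L with L*trans(L) == A.
        matrix<double> L = chol(A);
        std::cout << "max reconstruction error: "
                  << max(abs(L * trans(L) - A)) << std::endl;
    }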
early iterations, since the model might produce a huge number of false alarms
while the detector is still bad. Processing all these detections can make
training run slowly until the model is good enough to avoid excessive numbers
of false alarms. This change puts a tighter limit on the number of false
alarms processed during those early iterations and avoids the slowdown.
*before* allocating new memory. It used to be the other way around, which
caused momentary spikes in memory usage. In some cases this could push you
over the total memory available, which is obviously less than ideal behavior.
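The pattern itself is generic; here is a minimal sketch (not dlib's actual
code) of releasing the old block before allocating its replacement, so peak
usage is max(old, new) rather than old + new:

    #include <cstddef>
    #include <memory>

    struct buffer
    {
        void resize(std::size_t new_size)
        {
            data.reset();                                // free the old block first
            data = std::make_unique<float[]>(new_size);  // then allocate the new one
            size = new_size;
        }
        std::unique_ptr<float[]> data;
        std::size_t size = 0;
    };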
* Add a parameter to get_net() that allows calling it without a forced flush to disk (see the discussion in #869)
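A minimal sketch, assuming the parameter is the force_flush_to_disk flag
discussed in #869; the network type here is only illustrative:

    #include <dlib/dnn.h>
    using namespace dlib;

    // Illustrative network type; any dlib network works the same way here.
    using net_type = loss_multiclass_log<fc<10, input<matrix<float>>>>;

    void inspect(dnn_trainer<net_type>& trainer)
    {
        // Read the current network without forcing the trainer to sync its
        // state to disk first.
        const net_type& net = trainer.get_net(force_flush_to_disk::no);
        (void)net;  // e.g. run a validation pass on `net` here
    }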
* A blindfolded attempt to fix compile error on the CI server
get_face_chip_details() when used with the 68-point landmark model about a
month ago. This change reverts it to the previous behavior. The change
was very minor, so it shouldn't matter either way. But being consistent is
important, so I'm changing it back.