Fixed capitalisation after colons

Signed-off-by: Jim Martens <github@2martens.de>
Jim Martens 2019-09-24 15:40:35 +02:00
parent cb92f63775
commit bec9a6e82c
1 changed file with 3 additions and 3 deletions

@@ -44,7 +44,7 @@ class of any given input. In this thesis, I will work with both.
\begin{figure}
\centering
\includegraphics[scale=1.0]{open-set}
-\caption{Open set problem: The test set contains classes that
+\caption{Open set problem: the test set contains classes that
were not present during training time.
Icons in this image have been taken from the COCO data set
website (\url{https://cocodataset.org/\#explore}) and were
@@ -103,7 +103,7 @@ representation of the input and has to find a decompression
that reconstructs the input as accurately as possible. During
training, these auto-encoders learn to reproduce a certain group
of object classes. The actual novelty detection takes place
-during testing: Given an image, and the output and loss of the
+during testing: given an image, and the output and loss of the
auto-encoder, a novelty score is calculated. For some novelty
detection approaches, the reconstruction loss is exactly the novelty
score; others consider more factors. A low novelty
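
The passage above describes reconstruction-based novelty detection. As a minimal sketch of that idea (not the code used in the thesis; PyTorch, the layer sizes and the mean-squared-error score are assumptions made here purely for illustration):

import torch
import torch.nn as nn


class AutoEncoder(nn.Module):
    # Small fully connected auto-encoder; the layer sizes are placeholders,
    # not the architecture referenced in the text.
    def __init__(self, input_dim=784, latent_dim=32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(input_dim, 128), nn.ReLU(),
            nn.Linear(128, latent_dim))
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(),
            nn.Linear(128, input_dim))

    def forward(self, x):
        return self.decoder(self.encoder(x))


def novelty_score(model, images):
    # Reconstruction error as novelty score: classes seen during training
    # reconstruct well (low score), unseen classes reconstruct poorly (high score).
    model.eval()
    with torch.no_grad():
        flat = images.view(images.size(0), -1)
        reconstruction = model(flat)
        return ((reconstruction - flat) ** 2).mean(dim=1)

A threshold on this score would then separate known from novel inputs; how that threshold is chosen differs between approaches.
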
@@ -259,7 +259,7 @@ can be gained with a technique named Monte Carlo Batch Normalisation (MCBN).
Consequently, this technique can be applied to any network that utilises
standard batch normalisation.
Li et al.~\cite{Li2019} investigated the problem of poor performance
-when combining dropout and batch normalisation: Dropout shifts the variance
+when combining dropout and batch normalisation: dropout shifts the variance
of a neural unit when switching from train to test, whereas batch normalisation
does not. This inconsistency leads to a variance shift which
can have a larger or smaller impact depending on the network used.
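
As a rough illustration of how an MCBN-style prediction could look (a sketch under the assumption of a PyTorch model; the number of passes and the way the test inputs share batch statistics with a random training mini-batch are simplifications, not the exact procedure from the cited work):

import torch


def mcbn_predict(model, test_inputs, train_loader, num_passes=20):
    # Evaluation mode for the whole network, then only the batch
    # normalisation layers go back to training mode so that every forward
    # pass normalises with the statistics of a fresh training mini-batch.
    model.eval()
    for module in model.modules():
        if isinstance(module, (torch.nn.BatchNorm1d,
                               torch.nn.BatchNorm2d,
                               torch.nn.BatchNorm3d)):
            module.train()

    predictions = []
    with torch.no_grad():
        batches = iter(train_loader)
        for _ in range(num_passes):
            try:
                train_batch, _ = next(batches)
            except StopIteration:
                batches = iter(train_loader)
                train_batch, _ = next(batches)
            # Approximation: the test inputs are appended to the training
            # mini-batch and therefore normalised with its batch statistics.
            joint = torch.cat([train_batch, test_inputs], dim=0)
            predictions.append(model(joint)[train_batch.size(0):])

    stacked = torch.stack(predictions)
    # Mean over the passes is the prediction, the variance an uncertainty estimate.
    return stacked.mean(dim=0), stacked.var(dim=0)

Only the batch normalisation layers are switched back to training mode here, so any dropout layers stay deterministic; mean and variance over the stochastic passes give the prediction and an uncertainty estimate.
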