Made background section about dropout sampling more generalised

Signed-off-by: Jim Martens <github@2martens.de>
This commit is contained in:
2019-10-01 13:30:30 +02:00
parent dd654bb5b5
commit 96f9a696f5

@@ -384,7 +384,7 @@ the \gls{entropy} \(H(\mathbf{q}) = - \sum_i q_i \cdot \log q_i\).
Miller et al.~\cite{Miller2018} apply the dropout sampling to
object detection. In that case \(\mathbf{W}\) represents the
-learned weights of a detection network like \gls{SSD}~\cite{Liu2016}.
+learned weights of a detection network, for example \gls{SSD}~\cite{Liu2016}.
Every forward pass uses a different network
\(\widetilde{\mathbf{W}}\) which is approximately sampled from
\(p(\mathbf{W}|\mathbf{T})\). Each forward pass in object
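
The dropout-sampling idea in the hunk above can be sketched in a few lines of NumPy. This is a hedged illustration, not Miller et al.'s actual detection pipeline: the toy weights `W`, the input `x`, the dropout rate, and the single-layer classifier are all hypothetical stand-ins for a real network such as \gls{SSD}. Each stochastic forward pass applies a fresh dropout mask, which plays the role of an approximate sample \(\widetilde{\mathbf{W}}\) from \(p(\mathbf{W}|\mathbf{T})\); averaging the softmax outputs gives \(\mathbf{q}\), whose entropy is the uncertainty measure from the preceding paragraph.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy "network": 4 input features, 3 classes.
W = rng.normal(size=(4, 3))
x = rng.normal(size=4)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def stochastic_forward(x, W, p_drop=0.5):
    # Sampling a dropout mask at test time simulates drawing a
    # different network W~ approximately from p(W | T).
    mask = rng.random(W.shape[0]) >= p_drop
    # Rescale by the keep probability (inverted dropout).
    return softmax((x * mask) @ W / (1.0 - p_drop))

# Average the class probabilities over several sampled networks.
q = np.mean([stochastic_forward(x, W) for _ in range(50)], axis=0)

# Predictive entropy H(q) = -sum_i q_i * log(q_i).
H = -np.sum(q * np.log(q))
```

Higher entropy `H` indicates the sampled networks disagree, i.e. the prediction is uncertain; `H` is bounded by \(\log 3\) for three classes.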