Week 45, Recurrent Neural Networks
Contents
Plan for week 45
Material for the lab sessions, additional ways to present classification results and other practicalities
Searching for Optimal Regularization Parameters \( \lambda \)
Grid Search
Randomized Grid Search
Wisconsin Cancer Data
Using the correlation matrix
Discussing the correlation data
Other ways of presenting a classification problem
Combinations of classification results
Positive and negative prediction values
Other quantities
\( F_1 \) score
ROC curve
Cumulative gain curve
Other measures in classification studies: Cancer Data again
Material for Lecture Thursday November 9
Recurrent neural networks (RNNs): Overarching view
A simple example
RNNs
Basic layout
We need to specify the initial activity state of all the hidden and output units
We can specify inputs in several ways
We can specify targets in several ways
Backpropagation through time
The backward pass is linear
The problem of exploding or vanishing gradients
Four effective ways to learn an RNN
Long Short Term Memory (LSTM)
Implementing a memory cell in a neural network
An extrapolation example
Formatting the Data
Predicting New Points With A Trained Recurrent Neural Network
Other Things to Try
Other Types of Recurrent Neural Networks
Other quantities

The false discovery rate (FDR) and the false omission rate (FOR) are the complements of the positive and negative predictive values:

$$ \mathrm{FDR} = \frac{\mathrm{FP}}{\mathrm{FP} + \mathrm{TP}} = 1 - \mathrm{PPV} $$

$$ \mathrm{FOR} = \frac{\mathrm{FN}}{\mathrm{FN} + \mathrm{TN}} = 1 - \mathrm{NPV} $$
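As a minimal sketch of how these quantities follow from a binary confusion matrix, the snippet below computes FDR and FOR with scikit-learn and checks the identities FDR = 1 - PPV and FOR = 1 - NPV. The labels `y_true` and `y_pred` are made-up illustrative data, not from the lecture material.

```python
import numpy as np
from sklearn.metrics import confusion_matrix

# Made-up binary labels and predictions for illustration only.
y_true = np.array([0, 0, 1, 1, 1, 0, 1, 0])
y_pred = np.array([0, 1, 1, 1, 0, 0, 1, 1])

# confusion_matrix returns [[TN, FP], [FN, TP]] for binary labels 0/1.
tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()

fdr = fp / (fp + tp)   # false discovery rate
for_ = fn / (fn + tn)  # false omission rate ("for" is a keyword, hence for_)
ppv = tp / (tp + fp)   # positive predictive value (precision)
npv = tn / (tn + fn)   # negative predictive value

# The identities from the formulas above hold exactly.
assert np.isclose(fdr, 1 - ppv)
assert np.isclose(for_, 1 - npv)
print(f"FDR = {fdr:.3f}, FOR = {for_:.3f}")
```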