Chapter 2 Deep Learning and Neural Networks
There isn’t much code in this chapter; it’s mostly concepts and mathematical review. Still, we meet our first model, and a few functions are implemented to play around with.
Definitely go check out https://playground.tensorflow.org/ and play around - it’s a great learning tool!
2.1 Fully Connected Layers
2.1.1 Box on Page 48
An exercise to build a network and see what is inside!
library(keras)
library(tensorflow)
model <- keras_model_sequential() %>%
  layer_dense(units = 5, input_shape = 4) %>%
  layer_dense(units = 5) %>%
  layer_dense(units = 3)
Now, what’s in the box?
## Model: "sequential_7"
## ______________________________________________________________________________________________
## Layer (type) Output Shape Param #
## ==============================================================================================
## dense_17 (Dense) (None, 5) 25
## ______________________________________________________________________________________________
## dense_18 (Dense) (None, 5) 30
## ______________________________________________________________________________________________
## dense_19 (Dense) (None, 3) 18
## ==============================================================================================
## Total params: 73
## Trainable params: 73
## Non-trainable params: 0
## ______________________________________________________________________________________________
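Where do those parameter counts come from? A dense layer has one weight per input–unit pair plus one bias per unit, i.e. inputs × units + units. A quick sanity check against the summary above (the helper name `dense_params` is my own):

```r
# Parameters in a dense layer: inputs * units weights, plus units biases.
dense_params <- function(inputs, units) inputs * units + units

dense_params(4, 5)  # first layer:  4 inputs -> 5 units = 25
dense_params(5, 5)  # second layer: 5 -> 5 = 30
dense_params(5, 3)  # output layer: 5 -> 3 = 18
```

These match the 25, 30, and 18 reported by the summary, totalling 73.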
2.2 2.3.6 The Relu Function
Just a quick implementation: ReLU returns 0 for negative inputs and passes positive inputs through unchanged.
relu <- function(x) max(0, x)
relu(-5)
## [1] 0
relu(3)
## [1] 3
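A scalar `max(0, x)` only handles one value at a time. To apply ReLU elementwise to a whole vector, R’s `pmax` does the job (the name `relu_vec` is my own):

```r
# Vectorized ReLU: pmax compares each element of x against 0.
relu_vec <- function(x) pmax(0, x)

relu_vec(c(-2, 0, 3))
## [1] 0 0 3
```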
2.3 2.3.7 Leaky Relu
I couldn’t resist improving it a bit here, exposing the leak slope as an argument:
leaky_relu <- function(x, leak = 0.01){
  if(x < 0) return(leak*x)
  return(x)
}
#for example
leaky_relu(-5)
## [1] -0.05
leaky_relu(3)
## [1] 3
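One caveat: the `if()` branch only works on scalars (recent R versions raise an error when the condition is a vector of length > 1). `ifelse` handles whole vectors; a vectorized sketch (the name `leaky_relu_vec` is my own):

```r
# Vectorized leaky ReLU: ifelse picks leak*x where x < 0, and x elsewhere.
leaky_relu_vec <- function(x, leak = 0.01) ifelse(x < 0, leak * x, x)

leaky_relu_vec(c(-5, 0, 3))
## [1] -0.05  0.00  3.00
```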