COEN 140 - Lab 8

Machine Learning and Data Mining

  Convolutional Neural Network

Build a convolutional neural network for the recognition task with the Fashion-MNIST data set. Use the 'sparse_categorical_crossentropy' loss function, use 'adam' as the optimizer, and train your model for 5 epochs.

Adopt the following convolutional neural network structure:

  1. Input layer
  2. 2D convolutional layer: filter size 3×3, depth = 32, padding = 'same', strides = (1,1), ReLU activation function
  3. 2×2 max-pooling layer, strides = (2,2), padding = 'same'
  4. 2D convolutional layer: filter size 3×3, depth = 64, padding = 'same', strides = (1,1), ReLU activation function
  5. 2×2 max-pooling layer, strides = (2,2), padding = 'same'
  6. 2D convolutional layer: filter size 3×3, depth = 64, padding = 'same', strides = (1,1), ReLU activation function
  7. Flattening layer
  8. Fully-connected layer: 64 nodes, ReLU activation function
  9. (output) Fully-connected layer: 10 nodes, softmax activation function

Include the following in the report (70%):

  1. Give the recognition accuracy rate and show the confusion matrix, both for the test set.
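The specified network can be sketched in Keras roughly as follows. The 28×28×1 grayscale input shape is an assumption based on the standard Fashion-MNIST format, and the function names (`build_model`, `run_lab`) are illustrative, not part of the assignment:

```python
import numpy as np
import tensorflow as tf

def build_model():
    """Layers 1-9 exactly as listed in the assignment structure."""
    return tf.keras.Sequential([
        tf.keras.Input(shape=(28, 28, 1)),                      # 1. input layer (assumed shape)
        tf.keras.layers.Conv2D(32, (3, 3), strides=(1, 1),
                               padding='same',
                               activation='relu'),              # 2. conv 3x3, depth 32
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2),
                                     padding='same'),           # 3. 2x2 max pool
        tf.keras.layers.Conv2D(64, (3, 3), strides=(1, 1),
                               padding='same',
                               activation='relu'),              # 4. conv 3x3, depth 64
        tf.keras.layers.MaxPooling2D((2, 2), strides=(2, 2),
                                     padding='same'),           # 5. 2x2 max pool
        tf.keras.layers.Conv2D(64, (3, 3), strides=(1, 1),
                               padding='same',
                               activation='relu'),              # 6. conv 3x3, depth 64
        tf.keras.layers.Flatten(),                              # 7. flattening layer
        tf.keras.layers.Dense(64, activation='relu'),           # 8. FC, 64 nodes
        tf.keras.layers.Dense(10, activation='softmax'),        # 9. FC output, 10 nodes
    ])

def run_lab(epochs=5):
    """Train on Fashion-MNIST, then report test accuracy and the confusion matrix."""
    from sklearn.metrics import confusion_matrix
    (x_tr, y_tr), (x_te, y_te) = tf.keras.datasets.fashion_mnist.load_data()
    x_tr = x_tr[..., np.newaxis] / 255.0    # scale pixels, add channel axis
    x_te = x_te[..., np.newaxis] / 255.0
    model = build_model()
    model.compile(optimizer='adam',
                  loss='sparse_categorical_crossentropy',
                  metrics=['accuracy'])
    model.summary()                         # compare with the manual parameter counts
    model.fit(x_tr, y_tr, epochs=epochs)
    _, acc = model.evaluate(x_te, y_te)
    preds = model.predict(x_te).argmax(axis=1)
    print('test accuracy:', acc)
    print(confusion_matrix(y_te, preds))    # 10x10 matrix for the test set
```

Calling `run_lab()` downloads the data, trains for five epochs, and prints the test accuracy and confusion matrix required by the report.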


  2. For each layer of the network:

  • Manually calculate the number of parameters of that layer, i.e., the number of weight elements (including the bias terms). Show your work. Verify whether your results match those given by model.summary().
  • Write out the output dimension of that layer.
  • Manually calculate the number of multiplications required to generate that layer. The weights have bias terms, but you only need to count the multiplications used to compute the weighted sums; you do not need to count the multiplications inside the softmax function. Show your work.
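As a sanity check on the hand calculations, the per-layer counts implied by the structure above can be computed directly. This sketch assumes the 28×28×1 Fashion-MNIST input; with padding = 'same' and stride 1 a convolution preserves the spatial size, and 2×2 pooling with stride 2 ceil-halves it:

```python
import math

def conv2d(h, w, d_in, d_out, k=3):
    """3x3 conv, 'same' padding, stride 1: output stays h x w, depth becomes d_out."""
    params = k * k * d_in * d_out + d_out      # weights + biases
    mults = h * w * d_out * (k * k * d_in)     # one k*k*d_in dot product per output element
    return (h, w, d_out), params, mults

def maxpool(h, w, d):
    """2x2 max pool, stride 2, 'same' padding: spatial dims ceil-halved, no weights."""
    return (math.ceil(h / 2), math.ceil(w / 2), d), 0, 0

def dense(n_in, n_out):
    params = n_in * n_out + n_out              # weights + biases
    mults = n_in * n_out                       # one multiply per weight (biases are additions)
    return (n_out,), params, mults

report = []
shape = (28, 28, 1)                            # assumed Fashion-MNIST input
for name, fn in [('conv1 3x3, depth 32', lambda s: conv2d(*s, 32)),
                 ('maxpool1 2x2',        lambda s: maxpool(*s)),
                 ('conv2 3x3, depth 64', lambda s: conv2d(*s, 64)),
                 ('maxpool2 2x2',        lambda s: maxpool(*s)),
                 ('conv3 3x3, depth 64', lambda s: conv2d(*s, 64)),
                 ('flatten',             lambda s: ((s[0] * s[1] * s[2],), 0, 0)),
                 ('dense 64',            lambda s: dense(s[0], 64)),
                 ('dense 10 (softmax)',  lambda s: dense(s[0], 10))]:
    shape, params, mults = fn(shape)
    report.append((name, shape, params, mults))

total_params = sum(r[2] for r in report)
total_mults = sum(r[3] for r in report)
for name, shape, params, mults in report:
    print(f'{name:22s} out={shape}  params={params}  mults={mults}')
print('total params:', total_params)           # should match model.summary()
print('total mults :', total_mults)
```

For example, the first convolution has 3·3·1·32 + 32 = 320 parameters and 28·28·32·(3·3·1) = 225,792 multiplications, since each of the 28×28×32 output elements is a weighted sum over a 3×3×1 window.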


  3. Compare the recognition accuracy rate on the test set, the total number of parameters, and the total number of multiplications of this CNN to those of the neural network from Lab 7. Analyze your findings and explain why you obtain different (or similar) results.