• Deep Learning Course 2, Week 1: Practical aspects of Deep Learning quiz compilation


    Practical aspects of Deep Learning

    1. If you have 10,000,000 examples, how would you split the train/dev/test set?
    • 33% train. 33% dev. 33% test
    • 60% train. 20% dev. 20% test
    • 98% train. 1% dev. 1% test
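To see why the 98/1/1 split is the right answer at this scale, here is a minimal NumPy sketch (the `split_dataset` helper and its fractions are illustrative, not part of the course code). Even with only 1% each, dev and test sets drawn from 10,000,000 examples would still contain 100,000 examples, which is plenty to evaluate a model:

```python
import numpy as np

def split_dataset(X, train_frac=0.98, dev_frac=0.01, seed=0):
    """Shuffle, then split examples into train/dev/test by fraction."""
    m = X.shape[0]
    idx = np.random.default_rng(seed).permutation(m)
    n_train = int(m * train_frac)
    n_dev = int(m * dev_frac)
    return (X[idx[:n_train]],
            X[idx[n_train:n_train + n_dev]],
            X[idx[n_train + n_dev:]])

# With 10,000 examples, 98/1/1 gives 9800 / 100 / 100.
X = np.arange(10000).reshape(10000, 1)
X_train, X_dev, X_test = split_dataset(X)
print(X_train.shape, X_dev.shape, X_test.shape)  # (9800, 1) (100, 1) (100, 1)
```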
2. When designing a neural network to detect whether a house cat is present in a picture, 500,000 pictures of cats taken by their owners are used to make the training, dev, and test sets. To increase the size of the test set, it is decided that 10,000 new images of cats taken from security cameras will be added to the test set. Which of the following is true?
    • This will increase the bias of the model so the new images shouldn’t be used.
    • This will be harmful to the project since now dev and test sets have different distributions.
    • This will reduce the bias of the model and help improve it.
3. If your neural network model seems to have high variance, which of the following would be promising things to try?
    • Make the Neural Network deeper
    • Get more training data
    • Add regularization
    • Get more test data
    • Increase the number of units in each hidden layer
4. You are working on an automated check-out kiosk for a supermarket, and are building a classifier for apples, bananas and oranges. Suppose your classifier obtains a training set error of 0.5%, and a dev set error of 7%. Which of the following are promising things to try to improve your classifier? (Check all that apply.)
    • Increase the regularization parameter lambda
    • Decrease the regularization parameter lambda
    • Get more training data
    • Use a bigger neural network
5. In every case it is a good practice to use dropout when training a deep neural network because it can help to prevent overfitting. True/False?
    • True
    • False
6. The regularization hyperparameter must be set to zero during testing to avoid getting random results. True/False?
    • True
    • False
7. With the inverted dropout technique, at test time:
    • You apply dropout (randomly eliminating units) but keep the 1/keep_prob factor in the calculations used in training.
    • You do not apply dropout (do not randomly eliminate units), but keep the 1/keep_prob factor in the calculations used in training.
    • You apply dropout (randomly eliminating units) and do not keep the 1/keep_prob factor in the calculations used in training.
    • You do not apply dropout (do not randomly eliminate units) and do not keep the 1/keep_prob factor in the calculations used in training.
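The distinction the options above are probing can be sketched in a few lines of NumPy (the variable names are illustrative, not from the course assignments). During training, inverted dropout both drops units and divides by `keep_prob` so the expected activation is unchanged; at test time, neither is done:

```python
import numpy as np

rng = np.random.default_rng(0)
keep_prob = 0.8

# Hidden-layer activations (illustrative random values).
a = rng.standard_normal((3, 4))

# --- Training time: apply dropout AND the 1/keep_prob factor ---
mask = rng.random(a.shape) < keep_prob   # keep each unit with prob keep_prob
a_train = (a * mask) / keep_prob         # scale up so E[a_train] equals a

# --- Test time: no dropout and no 1/keep_prob factor ---
a_test = a
```

Because the scaling already happened at training time, the test-time forward pass needs no correction at all, which is exactly what makes the inverted variant convenient.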
8. Increasing the parameter keep_prob from (say) 0.5 to 0.6 will likely cause the following: (Check the two that apply)
    • Increasing the regularization effect
    • Reducing the regularization effect
    • Causing the neural network to end up with a higher training set error
    • Causing the neural network to end up with a lower training set error
9. Which of the following actions increase the regularization of a model? (Check all that apply)
    • Decrease the value of the hyperparameter lambda.
    • Decrease the value of keep_prob in dropout.
      Correct. When decreasing the keep_prob value, the probability that a node gets discarded during training is higher, thus increasing the regularization effect.
    • Increase the value of the hyperparameter lambda.
      Correct. When increasing the hyperparameter lambda, we increase the effect of the L_2 penalization.
    • Increase the value of keep_prob in dropout.
    • Use Xavier initialization.
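The role of lambda in the L2 penalty can be made concrete with a small sketch (the helper `l2_regularized_cost` and the example weights are assumptions for illustration). The regularized cost is J = cross-entropy + (lambda / 2m) * sum of squared Frobenius norms of the weight matrices, so a larger lambda adds a larger penalty:

```python
import numpy as np

def l2_regularized_cost(cross_entropy_cost, weights, lambd, m):
    # J = cross-entropy + (lambd / (2m)) * sum_l ||W^[l]||_F^2
    l2_term = sum(np.sum(np.square(W)) for W in weights)
    return cross_entropy_cost + (lambd / (2 * m)) * l2_term

W1 = np.ones((4, 3))   # squared Frobenius norm = 12
W2 = np.ones((1, 4))   # squared Frobenius norm = 4
cost_small = l2_regularized_cost(0.3, [W1, W2], lambd=0.1, m=100)
cost_large = l2_regularized_cost(0.3, [W1, W2], lambd=1.0, m=100)
print(cost_small, cost_large)  # 0.308 0.38
```

The cross-entropy part is unchanged; only the penalty grows with lambda, which is what pushes the optimizer toward smaller weights.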
    10. Which of the following is the correct expression to normalize the input x?
    • x = (x − μ) / σ
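The normalization x = (x − μ) / σ is straightforward to sketch in NumPy (the function name `normalize_inputs` is an assumption, not from the course code). μ and σ are computed per feature over the training set and then reused for dev/test data:

```python
import numpy as np

def normalize_inputs(X):
    # x = (x - mu) / sigma, with mu and sigma computed per feature
    mu = X.mean(axis=0)
    sigma = X.std(axis=0)
    return (X - mu) / sigma, mu, sigma

X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
X_norm, mu, sigma = normalize_inputs(X)
# After normalization every feature has mean 0 and standard deviation 1.
```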
  • Original article: https://blog.csdn.net/l8947943/article/details/126070647