Which Activations and Weight Initializations to Choose in Neural Networks

  1. ReLU or ELU?
  2. Sigmoid or Tanh?
  3. Set all weights to 0?
  4. Set all weights to random values?
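The activations named in questions 1 and 2 can be compared side by side. Below is a minimal NumPy sketch (the function names and test values are illustrative, not from the original): ReLU and ELU are the modern defaults for hidden layers, while sigmoid and tanh saturate for large inputs, which can make gradients vanish in deep networks.

```python
import numpy as np

def relu(x):
    # ReLU: zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # ELU: smooth negative branch saturating toward -alpha instead of a hard zero,
    # which keeps a small gradient alive for negative inputs
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    # Sigmoid: squashes to (0, 1); saturates for large |x|
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Tanh: squashes to (-1, 1); zero-centered, unlike sigmoid
    return np.tanh(x)

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))     # [0. 0. 2.]
print(elu(x))
print(sigmoid(x))
print(tanh(x))
```

Note how ReLU hard-clips negatives to zero while ELU lets them decay smoothly; that difference is the usual reason to prefer ELU when "dead" ReLU units become a problem.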
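Questions 3 and 4 have a concrete answer: all-zero weights make every unit in a layer compute the same output and receive the same gradient, so the units never differentiate (the symmetry problem), whereas random initialization breaks that symmetry. A minimal NumPy sketch, assuming a single ReLU layer and He-style scaling (both choices are illustrative, not from the original):

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 3
x = rng.normal(size=n_in)

# All-zero init: every hidden unit computes the identical output,
# so every unit would also receive the identical gradient --
# the symmetry between units is never broken.
W_zero = np.zeros((n_hidden, n_in))
h_zero = np.maximum(0.0, W_zero @ x)
print(h_zero)  # [0. 0. 0.]

# Random init (He-style scaling, suited to ReLU): units start out
# different, so their gradients differ and they can specialize.
W_rand = rng.normal(size=(n_hidden, n_in)) * np.sqrt(2.0 / n_in)
h_rand = np.maximum(0.0, W_rand @ x)
print(h_rand)
```

The scale of the random values matters too: the `sqrt(2 / n_in)` factor keeps activation variance roughly constant across ReLU layers, which is why plain "random" is usually refined to a scheme like He or Xavier initialization.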