duvenaud a day ago

This is simply wrong. Backprop has the same asymptotic time complexity as the forward pass.
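
A quick tally of multiply-adds for a toy fully connected net shows the usual argument (the widths below are arbitrary placeholders):

  sizes = [784, 512, 512, 10]   # hypothetical layer widths

  # Forward: one matrix-vector product per layer, n_in * n_out multiply-adds.
  fwd = sum(a * b for a, b in zip(sizes, sizes[1:]))

  # Backward: per layer, one product for the weight gradient and one for the
  # input gradient, each the same shape as the forward product.
  bwd = 2 * fwd

  print(fwd, bwd)  # backward is a ~2x constant factor: same big-O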

  • bobmarleybiceps a day ago

    I think they're misusing "forward propagation" and "backward propagation" to basically mean "post-training inference" and "training". They seem to be assuming n iterations of the backward pass, which is why it comes out larger...
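
    Under that reading the gap is just the step count, e.g. (T and the per-pass costs here are made up):

      fwd = 1.0            # cost of one forward pass, arbitrary units
      bwd = 2.0 * fwd      # backward costs a constant factor more
      T = 10_000           # hypothetical number of training steps

      inference = fwd               # their "forward propagation"
      training = T * (fwd + bwd)    # their "backward propagation"
      # training / inference ~ 3*T: the extra factor is T, not backprop itself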

    • vrighter 21 hours ago

      n iterations would just be a constant factor, which asymptotic complexity drops anyway

constantcrying 16 hours ago

"If we once again assume that there are the same number of neurons in each layer, and that the number of layers equal the number of neurons in each layer we find:"

These are terrible assumptions.

Why not compute the runtime from the actual layer sizes (summing the products of adjacent widths)? Then the comparison would also make more sense.
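
Something like this, say (the widths are placeholders for whatever the real architecture uses):

  widths = [784, 300, 100, 10]   # placeholder layer sizes
  actual = sum(a * b for a, b in zip(widths, widths[1:]))

  n, L = max(widths), len(widths) - 1
  assumed = L * n * n            # the article's uniform-width estimate

  print(actual, assumed)  # the uniform assumption badly overshoots here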