Other articles

  1. Best things of 2015

    Here are a few things I appreciated in 2015:

    arXiv papers:

    • Batch Normalization: a simple conceptual idea beautifully executed. We've known for years that neural networks respond better to input data that follow certain distributions. Since the output of each layer is the input to the next, it's ...

    read more
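
    The idea the teaser gestures at can be sketched in a few lines: normalize each feature over the batch, then let learned scale and shift parameters restore representational power. This is a minimal NumPy sketch of the batch-norm forward pass, not code from the post itself; the `gamma`/`beta` names and the `eps` value are conventional assumptions.

    ```python
    import numpy as np

    def batch_norm(x, gamma, beta, eps=1e-5):
        # x: (batch, features). Normalize each feature across the batch,
        # then apply the learned scale (gamma) and shift (beta).
        mean = x.mean(axis=0)
        var = x.var(axis=0)
        x_hat = (x - mean) / np.sqrt(var + eps)
        return gamma * x_hat + beta

    # A badly-scaled input becomes zero-mean, unit-variance per feature.
    x = np.random.randn(64, 10) * 5.0 + 3.0
    y = batch_norm(x, gamma=np.ones(10), beta=np.zeros(10))
    ```

    In a real network `gamma` and `beta` are trained by backprop, and running averages of `mean` and `var` are kept for use at inference time.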
  2. Overfitting, Regularization, and Hyperparameters

    (This lays the groundwork for the next post, which was getting too long to be effective.)


    One of the goals of machine learning is generalizability. A model that only works on the exact data it was trained on is effectively useless. Let's say you're tasked with ...

    read more
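
    The failure mode the teaser describes, a model that memorizes its training data instead of generalizing, is easy to demonstrate. This is a hypothetical illustration, not code from the post: fitting polynomials of increasing degree to a few noisy samples of a sine curve, then comparing error on the training points against error on held-out points.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Ten noisy training samples of sin(2*pi*x), and a clean held-out set.
    x_train = np.linspace(0.0, 1.0, 10)
    y_train = np.sin(2 * np.pi * x_train) + rng.normal(0.0, 0.2, size=10)
    x_test = np.linspace(0.0, 1.0, 100)
    y_test = np.sin(2 * np.pi * x_test)

    for degree in (1, 3, 9):
        coeffs = np.polyfit(x_train, y_train, degree)
        train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
        test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
        print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
    ```

    Training error only goes down as the model gets more flexible (degree 9 nearly interpolates the ten points), but that says nothing about performance on data the model has not seen, which is the gap regularization and hyperparameter tuning try to close.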

Page 1 / 2 »