# NumPy refresher¶

Here are some quick guides to NumPy:

## Matrix conventions for machine learning¶

Rows are horizontal and columns are vertical. Every row is an example. Therefore, an `inputs` matrix of shape (10, 5) holds 10 examples, each of dimension 5. If this were the input of a neural network, then the weights from the input to the first hidden layer would form a matrix of shape (5, #hid).
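As a quick sketch of this convention (the sizes here, 10 examples, 5 features, and 3 hidden units, are illustrative, not taken from any particular network):

```python
import numpy as np

rng = np.random.default_rng(0)
inputs = rng.standard_normal((10, 5))   # 10 examples, one per row, each of dimension 5
weights = rng.standard_normal((5, 3))   # input-to-hidden weights: (input dim, #hid)

# Multiplying the inputs by the weights maps every example to the hidden layer at once.
hidden = inputs @ weights
print(hidden.shape)                     # (10, 3): one hidden vector per example
```

Because every row is an example, a single matrix product transforms the whole batch.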

Consider this array:

```
>>> numpy.asarray([[1., 2], [3, 4], [5, 6]])
array([[ 1.,  2.],
       [ 3.,  4.],
       [ 5.,  6.]])
>>> numpy.asarray([[1., 2], [3, 4], [5, 6]]).shape
(3, 2)
```

This is a 3x2 matrix, i.e. there are 3 rows and 2 columns.

To access the entry in the 3rd row (row #2) and the 1st column (column #0):

```
>>> numpy.asarray([[1., 2], [3, 4], [5, 6]])[2, 0]
5.0
```

To remember this, keep in mind that we read left-to-right, top-to-bottom, so each contiguous group of entries is a row. That is, there are 3 rows and 2 columns, and the first index selects the row.
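A quick way to check this is to index with a single subscript, which selects a whole row, while a slice in the first position selects a column (the array below is the same 3x2 example as above):

```python
import numpy as np

a = np.asarray([[1., 2], [3, 4], [5, 6]])

print(a[1])      # the 2nd row: [3. 4.]
print(a[:, 0])   # the 1st column: [1. 3. 5.]
```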

## Broadcasting¶

NumPy broadcasts arrays of different shapes during arithmetic operations, stretching the smaller operand across the larger one so the shapes match:

```
>>> a = numpy.asarray([1.0, 2.0, 3.0])
>>> b = 2.0
>>> a * b
array([ 2.,  4.,  6.])
```

The smaller array `b` (actually a scalar here, which works like a 0-d array) is in this case broadcast to the same size as `a` during the multiplication. This trick is often useful in simplifying how expressions are written. More detail about broadcasting can be found in the NumPy user guide.
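Broadcasting is not limited to scalars. A rough sketch (the arrays here are illustrative): a 1-d array whose length matches the last axis of a 2-d array is stretched across every row:

```python
import numpy as np

a = np.asarray([[1., 2.], [3., 4.], [5., 6.]])  # shape (3, 2)
b = np.asarray([10., 100.])                      # shape (2,)

# b is treated as if it were copied into each of the 3 rows of a:
print(a * b)
# [[ 10. 200.]
#  [ 30. 400.]
#  [ 50. 600.]]
```

No actual copy of `b` is made; broadcasting is purely a shape rule, so it is memory-efficient as well as concise.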