# Numpy Sum Axis Intuition

I’ve always thought that axis 0 is row-wise, and axis 1 is column-wise.

```
row-wise (axis 0) --->  [[ 0  1]
                         [ 0  5]]
                            ⭡
                  column-wise (axis 1)
```

However, what numpy.sum gives me is the exact opposite of what I thought it would be.

```
>>> np.sum([[0, 1], [0, 5]], axis=0)
array([0, 6])
>>> np.sum([[0, 1], [0, 5]], axis=1)
array([1, 5])
```

So what’s going on here? Am I the only one who is wondering this?

The way to understand what "axis" means in `numpy.sum` is that it *collapses* the specified axis. When axis 0 (the rows) is collapsed, the array is reduced to a single row; in other words, the sum runs column-wise.
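To make the "collapse" reading concrete, here is the same 2-d example again, with the shapes spelled out (a minimal sketch in plain numpy):

```python
import numpy as np

a = np.array([[0, 1], [0, 5]])   # shape (2, 2)

# axis=0 collapses the rows: the two rows are summed element-wise,
# leaving one value per column.
print(np.sum(a, axis=0))  # [0 6]

# axis=1 collapses the columns: each row is summed down to one value.
print(np.sum(a, axis=1))  # [1 5]
```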

Why did numpy choose to act this way?

This convention may seem confusing for 2-d arrays, but for 3-d, 4-d, and general n-d arrays it is a much more straightforward way to define an axis.
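One way to see why the "collapse" rule scales: for any n-d array, summing along axis `k` simply deletes the k-th entry of the shape tuple. A quick sketch (the shape `(2, 3, 4)` is just an arbitrary example):

```python
import numpy as np

# Summing along axis k removes dimension k from the shape;
# everything else stays in place.
x = np.arange(24).reshape(2, 3, 4)   # shape (2, 3, 4)

print(np.sum(x, axis=0).shape)  # (3, 4)
print(np.sum(x, axis=1).shape)  # (2, 4)
print(np.sum(x, axis=2).shape)  # (2, 3)
```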

```
# Let's experiment with a 3-d array.

In : x = np.array([[[1,2],[3,4]],[[1,2],[3,4]]])
In : x
Out: array([[[1, 2],
        [3, 4]],

       [[1, 2],
        [3, 4]]])
In : x.shape
Out: (2, 2, 2)
In : x[0]        # axis-0
Out: array([[1, 2],
       [3, 4]])
In : x[1]        # still axis-0
Out: array([[1, 2],
       [3, 4]])
In : x[0][0]     # axis-1
Out: array([1, 2])
In : x[0][0][0]  # axis-2
Out: 1
In : np.sum(x, axis=0)  # Notice that it eliminated the specified axis.
Out: array([[2, 4],
       [6, 8]])
In : np.sum(x, axis=1)
Out: array([[4, 6],
       [4, 6]])
In : np.sum(x, axis=2)
Out: array([[3, 7],
       [3, 7]])
```
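A related option (not used in the session above) is `keepdims=True`, which keeps the collapsed axis around as a size-1 dimension instead of deleting it. That makes it easy to see exactly which axis was summed away:

```python
import numpy as np

x = np.array([[[1, 2], [3, 4]],
              [[1, 2], [3, 4]]])   # shape (2, 2, 2)

# With keepdims=True, the summed axis survives with length 1,
# so the shape tells you which axis was collapsed.
print(np.sum(x, axis=0, keepdims=True).shape)  # (1, 2, 2)
print(np.sum(x, axis=1, keepdims=True).shape)  # (2, 1, 2)
print(np.sum(x, axis=2, keepdims=True).shape)  # (2, 2, 1)
```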

The same logic applies in TensorFlow.

```
t1 = [[1, 2, 3], [4, 5, 6]]
t2 = [[7, 8, 9], [10, 11, 12]]
tf.concat([t1, t2], 0)  # [[1, 2, 3], [4, 5, 6], [7, 8, 9], [10, 11, 12]]
tf.concat([t1, t2], 1)  # [[1, 2, 3, 7, 8, 9], [4, 5, 6, 10, 11, 12]]

# tensor t3 with shape [2, 3]
# tensor t4 with shape [2, 3]
tf.shape(tf.concat([t3, t4], 0))  # [4, 3]
tf.shape(tf.concat([t3, t4], 1))  # [2, 6]
```
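If TensorFlow isn't handy, the same behavior can be checked with numpy's `concatenate`, which follows the identical axis convention. Note the contrast with `sum`: concatenation *grows* the chosen axis rather than collapsing it, but the axis numbering means the same thing.

```python
import numpy as np

t1 = np.array([[1, 2, 3], [4, 5, 6]])      # shape (2, 3)
t2 = np.array([[7, 8, 9], [10, 11, 12]])   # shape (2, 3)

# axis=0 stacks the rows: (2, 3) + (2, 3) -> (4, 3)
print(np.concatenate([t1, t2], axis=0).shape)  # (4, 3)

# axis=1 glues the columns: (2, 3) + (2, 3) -> (2, 6)
print(np.concatenate([t1, t2], axis=1).shape)  # (2, 6)
```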

If you like my post, please clap and subscribe. It gives me the motivation to write more. :)