Here is how to take the average of RGB colors, something that is not as straightforward as just adding the RGB values together and dividing them by the number of pixels.
I am writing this article because the current Google Search results are really terrible and overcomplicate things with code examples, when all you need to know is a bit of theory.
Solution upfront: How to average RGB colors
You first square each color value, then average the squares, and then take the square root. Here is an example with 0 and 255, the lowest and highest value a channel can have.
0 * 0 = 0
255 * 255 = 65025
0 + 65025 = 65025
65025 / 2 = 32512.5
square root of 32512.5 ≈ 180.31
The average of 0 and 255 is NOT 127.5, but about 180!
And that is a huge difference, because if you just divided by 2, your average would be much darker!
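The steps above can be sketched in a few lines of Python. `average_rgb` is a hypothetical helper name; it applies the square / average / square-root method to each channel of two colors:

```python
import math

def average_rgb(c1, c2):
    """Average two RGB colors channel by channel:
    square each value, average the squares, take the square root."""
    return tuple(
        round(math.sqrt((a * a + b * b) / 2))
        for a, b in zip(c1, c2)
    )

# Averaging pure black and pure white:
print(average_rgb((0, 0, 0), (255, 255, 255)))  # -> (180, 180, 180)
```

Note that naive averaging would give (128, 128, 128), a visibly darker gray.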
Why this is the case
The reason why you cannot just average two RGB colors by dividing by the number of pixels is that, in the early days of computers, storage space was limited, so you had to find ways to save it! The human eye is also much better at distinguishing dark shades than bright ones. Hence they got the genius idea of storing the square root of the color values: this way they could compress a big number like 65025 down to a much smaller number like 255 and save space!
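A tiny sketch of that compression idea, using the article's own numbers (real image formats use a slightly more complicated gamma curve, but the square root is a good approximation):

```python
import math

# A big linear brightness value gets squeezed into a single byte (0..255)
linear = 65025
stored = round(math.sqrt(linear))  # the compact value that fits in one byte
recovered = stored * stored        # squaring gets the linear value back

print(stored)     # -> 255
print(recovered)  # -> 65025
```

This is also why you must square the stored values before doing any math on them, as in the averaging example above.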
Here you can read more about colors and calculating with them:
Here is also a good explanation by MinutePhysics, which shows it with more diagrams and pictures: