How far does an electromagnetic wave travel?

Disclaimer: I am not a physicist nor strong in mathematics, just interested in how things work. I will always try to point out what I do not understand. This is based on a) my interest in physics and b) my work in product-market strategy at Kontakt.io (Bluetooth LE for IoT and beacons).

The background to this is that I want to understand how one can estimate distance between a broadcaster and a receiver of a signal. Professionally, I am interested in how that works for Bluetooth but to understand that I hope to understand it in general.

Electromagnetic waves

All the technologies we use (Bluetooth, WiFi) are electromagnetic waves. From Wikipedia I learn:

“In physics, electromagnetic radiation (EM radiation or EMR) refers to the waves (or their quanta, photons) of the electromagnetic field, propagating (radiating) through space carrying electromagnetic radiant energy.”

There are many elements here which I do not understand (nearly all); the important thing is that Bluetooth, GSM, and WiFi are all electromagnetic waves.

Side note / thought experiment: if a wave travels through non-space (whatever that is), it should not lose any energy. That assumes that non-space is a system, that electromagnetic waves can travel through it, and that the law of conservation of energy applies. As in: where should the energy of an electromagnetic wave go when there is nowhere to go because there is nothing?

Free space formula

Again, from Wikipedia, it appears that free-space path loss (FSPL) is what I am looking for. Free-space path loss is defined by the IEEE but presumably based on physics:

FSPL = (4πd / λ)² = (4πdf / c)²

where:
lambda is the signal wavelength (in metres),
f is the signal frequency (in hertz),

d is the distance from the transmitter (in metres),
c is the speed of light in a vacuum, 2.99792458 × 10^8 metres per second.
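To make the formula concrete, here is a small sketch in Python that computes FSPL in decibels (the dB form is just 20 × log10 of the ratio above). The 2.4 GHz frequency and 10-metre distance are my own example values for Bluetooth, not part of any specification:

```python
import math

def fspl_db(distance_m, frequency_hz):
    """Free-space path loss in dB: 20 * log10(4 * pi * d * f / c)."""
    c = 2.99792458e8  # speed of light in a vacuum, m/s
    return 20 * math.log10(4 * math.pi * distance_m * frequency_hz / c)

# Bluetooth LE broadcasts around 2.4 GHz; at 10 metres:
print(round(fspl_db(10, 2.4e9), 1))  # about 60.1 dB
```

Note that this is the idealized free-space value; walls, people, and reflections add further loss on top of it.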

How do we get from the first formula to the second? In the upper term we multiply by f (frequency); in the lower term we replace lambda with c (speed of light). That only works if lambda = c / f, i.e. wavelength × frequency = speed of light (c = λf).

That is true for light waves according to the first Google search result (HubbleSite), and it applies here because Bluetooth, like light, is an electromagnetic wave.
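A quick numerical check of that substitution, using 2.4 GHz as my example frequency: the wavelength comes out of c = λf, and both forms of the FSPL ratio give the same number:

```python
import math

c = 2.99792458e8   # speed of light in a vacuum, m/s
f = 2.4e9          # example Bluetooth frequency, Hz
lam = c / f        # wavelength from c = lambda * f
print(round(lam * 100, 1))  # about 12.5 cm

d = 10.0  # example distance in metres
# The two forms of the FSPL ratio agree once lambda = c / f:
form_wavelength = (4 * math.pi * d / lam) ** 2
form_frequency = (4 * math.pi * d * f / c) ** 2
print(math.isclose(form_wavelength, form_frequency))  # True
```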

So what?

From the formula it appears that, for a given distance, the only relevant variable for practical purposes is the frequency.

Thus, if the initial power of the electromagnetic wave is constant, the only things that matter are the distance and the frequency. At least according to the IEEE definition.
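This is also the direction I care about professionally: going from a measured loss back to a distance. A minimal sketch, assuming pure free space (real RSSI-based estimates also involve transmit power, antenna gains, and multipath, none of which this models), simply inverts the FSPL formula:

```python
import math

def distance_from_fspl(loss_db, frequency_hz):
    """Invert FSPL: d = c / (4 * pi * f) * 10**(loss_db / 20)."""
    c = 2.99792458e8  # speed of light in a vacuum, m/s
    return c / (4 * math.pi * frequency_hz) * 10 ** (loss_db / 20)

# A 2.4 GHz signal that has lost ~60 dB corresponds to roughly 10 m:
print(round(distance_from_fspl(60.05, 2.4e9), 1))  # about 10.0
```

In practice this is only an upper-bound sanity check: indoor obstacles make the measured loss larger, so the true distance is usually shorter than the free-space estimate suggests.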

Sources: