James Hope, a 17-year-old male from Gloucester, UK, asks on June 4, 2004: My dad was saying that radio waves from TV transmitter stations can burn you severely if you are near the transmitter. However, according to E = hf, a radio-wave photon will have very little energy, so how can it burn you? Especially as the energy of the photon stays the same however far it travels, and the waves don't burn us a long way from the transmitter.
"Radio photons" is not the way to look this question. Photons are indeed packets of energy, but rather small, and more the domain of solid state electronics engineers (who make LEDs = Light (ie photon) Emitting Diodes) and physicists.
Broadcast systems such as television and radio transmitters produce very large amounts of radio-wave power, called RF power for "Radio Frequency". The idea is to cover as much area as possible with the waves and reach as many users as possible. To get the power levels of the radio waves so high, very large voltages at radio frequencies are produced at the antenna. The antennas are normally fenced off or high up on a tower so that they cannot be accessed by the public. If a person does get too close, for example by scaling the tower or fence, it is likely they will be electrocuted: the high voltage causes an arc to leap from the antenna to the intruder, forming a current path to the ground through the intruder's body. The arc carries sufficient electrical current to severely burn and even kill a person. There have been a few cases where this has happened, and most are pretty gruesome. One intruder was electrocuted at an AM radio station, and witnesses claimed they could hear the station's audio (along with the normal arcing sound) while the arc held and the fellow was being electrocuted. It seems that bodies have electro-acoustic properties in radio-frequency arcs!
However, the RF waves will not burn you at locations that can be accessed by the public, which usually means at least a few metres away from the antenna. There are internationally agreed standards for the amount of radio-wave power that may reach humans. In television and radio, high power levels are permissible because the broadcast frequency is relatively low. For cell-phone towers, the broadcast power levels must be much lower because the frequency is much higher. As you get further away from the antenna, the power level drops very quickly, following the inverse square law. That is, compared with the level at 1 metre, at 2 metres the power is 4 times less, at 3 metres 9 times less, at 4 metres 16 times less, and so on.
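The inverse-square falloff described above can be sketched in a few lines. The 1-metre reference distance is just an assumed baseline for comparison:

```python
# Power density relative to an assumed reference distance of 1 metre.
# The inverse square law: intensity falls off as 1 / r^2.
def relative_power(distance_m, reference_m=1.0):
    """Fraction of the reference-distance power density remaining at distance_m."""
    return (reference_m / distance_m) ** 2

for d in (1, 2, 3, 4):
    print(f"{d} m: {relative_power(d):.4f} of the power at 1 m")
# 2 m -> 1/4, 3 m -> 1/9, 4 m -> 1/16, matching the figures above
```

This is why standing even a modest distance from an antenna reduces exposure so dramatically.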