Near and Far Field Attenuation for an electrically small antenna.

The above graphic is from this article about near field effects.

Far field RF link budgets all have one thing in common: they assume a signal weakens as the square of distance. Double the distance and you lose 6 dB. This makes sense when you consider that the intensity of a radiated electromagnetic field drops as the square of distance: 3 dB is half the power, and 6 dB is half again, or 1/4 of the power.
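As a quick arithmetic check (a minimal Python sketch, nothing more), inverse-square spreading works out to almost exactly 6 dB per doubling of distance and 20 dB per decade:

```python
import math

# Inverse-square spreading: power density falls as 1/d^2.
print(f"Per doubling of distance: {10 * math.log10(2 ** 2):.2f} dB")   # ~6.02 dB
print(f"Per decade of distance:   {10 * math.log10(10 ** 2):.2f} dB")  # 20.00 dB
```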

Path loss is based on distance and wavelength…

Pathloss(dB) = 20 * log10(4 * pi * distance / wavelength)
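Here is a minimal Python sketch of that formula; the function name is mine, and the 14 MHz / 1000 m numbers simply preview the example worked later in this post:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20 * log10(4 * pi * distance / wavelength)."""
    wavelength_m = C / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength_m)

# Example: 14 MHz over 1000 meters.
print(f"{free_space_path_loss_db(1000, 14e6):.1f} dB")  # roughly 55 dB
```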

The FCC suggests BPL signals weaken at 40 dB per decade in their attempt to justify BPL in the HF bands. It is likely they are referring to the region within about 1/2 wavelength of the emitter. Additionally, this increased attenuation assumes an electrically small antenna. Power lines are certainly not electrically small antennas, but let's assume they are for now.

The above graphic suggests that while attenuation is greater in the near field, the signal starts from a much stronger level and never dips below the 20 dB/decade line.

It would seem, then, that we can still use the 20 dB/decade line for our calculations, knowing that the promotion of 40 dB/decade does not help the FCC's cause. The 40 dB/decade slope simply does not continue past a 1/2 wavelength distance, if we are to believe the author of the article referenced above.
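To make that picture concrete, here is a hedged Python sketch of the behavior described above. The 40 dB/decade near-field slope, the lambda/2 crossover, and the choice of the lambda/2 point as the 0 dB reference are assumptions taken from the discussion here, not measured data:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def relative_level_db(distance_m: float, freq_hz: float) -> float:
    """Signal level relative to the level at d = lambda/2 (0 dB there).
    Hypothetical model of the referenced graphic: 40 dB/decade inside
    lambda/2, 20 dB/decade beyond it."""
    half_wl = (C / freq_hz) / 2
    if distance_m < half_wl:
        return 40 * math.log10(half_wl / distance_m)   # near field, steeper slope
    return -20 * math.log10(distance_m / half_wl)      # far field

def far_field_line_db(distance_m: float, freq_hz: float) -> float:
    """The 20 dB/decade line extended all the way in, same 0 dB reference."""
    half_wl = (C / freq_hz) / 2
    return -20 * math.log10(distance_m / half_wl)

freq = 14e6  # 14 MHz, so lambda/2 is roughly 10.7 m
for d in (1, 3, 10, 30, 100, 1000):
    print(f"{d:5d} m: model {relative_level_db(d, freq):7.1f} dB, "
          f"20 dB/decade line {far_field_line_db(d, freq):7.1f} dB")
```

Inside lambda/2 the modeled level is always above the extended 20 dB/decade line, and the two coincide beyond it, which is exactly the "starts stronger, falls faster, never dips below" behavior claimed above.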

It should also be pointed out that an RF budget performed with even 1/10 of the Part 15 power limit still places a massive S9 signal into any nearby receiver.

An example RF budget assumes 10 mW transmitted into the power lines, which of course radiate much of that power because they are antennas (another fact of physics the promoters of BPL hope the public does not learn). Using 14 MHz and 1000 meters, the path loss is about 55 dB. Given the power starts at around +10 dBm, and accounting for some cable and coupling losses, we wind up with about -50 dBm at the receiver. Shortwave and amateur HF receivers have sensitivities far lower than -100 dBm, suggesting any BPL deployment using HF frequencies renders the HF bands useless. -50 dBm is at least an S9 on your meter and probably much more.
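A short Python version of that budget, with the 5 dB of cable and coupling loss being my assumption to reproduce the roughly -50 dBm figure above:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def free_space_path_loss_db(distance_m: float, freq_hz: float) -> float:
    """Same 20 * log10(4 * pi * distance / wavelength) helper as sketched earlier."""
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / C)

p_tx_dbm = 10.0                                       # 10 mW coupled onto the line
path_loss_db = free_space_path_loss_db(1000, 14e6)    # about 55 dB at 14 MHz, 1000 m
misc_losses_db = 5.0                                  # assumed cable/coupling loss

p_rx_dbm = p_tx_dbm - path_loss_db - misc_losses_db
print(f"Path loss:      {path_loss_db:.1f} dB")       # ~55.4 dB
print(f"Received level: {p_rx_dbm:.1f} dBm")          # roughly -50 dBm
```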

I suspect the FCC is attempting to cloud the issue as much as they can in order to justify BPL at the cost of HF spectrum users.

I have been doing RF link budgets quite successfully for years and would love to hear the technical arguments defending the 40 dB per decade (or 12 dB per doubling) signal loss they suggest makes BPL live within the Part 15 emission limits. I argue the near field effects actually show more signal, which then converges on the 20 dB/decade line, where problems still clearly exist.
