The most suitable previous topic was closed having reached 100…
Re the attenuation due to using “the wrong polarisation”: I was “reflecting”* on this situation this morning and worked out a few basic calculations.
The attenuation arises because the wave front intersects the antenna at an angle, so only a fraction of the incident energy is available to it. Well, what fraction?
A model for the situation is an antenna inclined at an angle, call it theta, to the polarisation plane of the incident wave. The antenna will see a fraction sin(theta) of the available voltage, and this sets the current induced in the antenna by the incident wave.
At 45 degrees, sin(45°) ≈ 0.7, so about 0.7 of the available voltage and current from the wave flows in the antenna elements. Converting this voltage ratio to db gives 20 log(0.7) ≈ -3 db (equivalently, half the power: 10 log(0.5) = -3 db), which is intuitively correct, as tilting the antenna by 45 degrees should produce the same loss for both polarisations.
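As a quick sanity check, here is a minimal Python sketch of that calculation (the function name `polarisation_loss_db` is my own):

```python
import math

def polarisation_loss_db(theta_deg):
    # The voltage induced in the antenna scales as sin(theta), where theta is
    # the angle between the antenna and the wave's polarisation plane, so the
    # db figure is 20 * log10 of that voltage fraction.
    return 20 * math.log10(math.sin(math.radians(theta_deg)))

print(round(polarisation_loss_db(45), 1))  # -3.0 db for a 45 degree tilt
```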
So my next thought was to ask how much signal degradation should result from a serious mismatch, say 89 degrees (almost completely the wrong polarisation). In this case the angle of incidence is 1 degree.
Going through the same calculation, 20 log(sin(1°)) ≈ -35 db. This is a lot of loss.
And an intermediate result for a 10 degree incidence: -15 db.
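Those spot figures can be reproduced with a short loop; a sketch assuming the same sin(theta) voltage model as above:

```python
import math

# Loss in db for several angles of incidence, using the sin(theta) voltage model.
for theta in (1, 10, 45):
    loss = 20 * math.log10(math.sin(math.radians(theta)))
    print(f"{theta:2d} degrees: {loss:.0f} db")  # 1 -> -35, 10 -> -15, 45 -> -3
```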
Reversing the process to find the tilt that gives a 10 db loss: about 18 degrees. And while that polarisation takes the 10 db loss, the other polarisation is only 0.4 db off the ideal, which is essentially unmeasurable by someone on a hill.
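Reversing the formula means solving 20 log(sin(theta)) = -10 for theta; a sketch of both directions (the helper name `tilt_for_loss_db` is mine):

```python
import math

def tilt_for_loss_db(loss_db):
    # Invert 20*log10(sin(theta)) = loss_db to recover the tilt angle in degrees.
    return math.degrees(math.asin(10 ** (loss_db / 20)))

theta = tilt_for_loss_db(-10)
print(round(theta))  # about 18 degrees for a 10 db loss

# At an 18 degree tilt the other polarisation sees cos(18 degrees) of the voltage:
other = 20 * math.log10(math.cos(math.radians(18)))
print(round(other, 1))  # about -0.4 db
```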
By calculating the loss on each polarisation for a given antenna tilt, you could accept a certain amount of loss on one polarisation (say for FM contacts with locals) while preserving almost all of the antenna gain for dx contacts on ssb.
This works for a normally “vertical” antenna being used to work horizontally polarised stations, with say an 18 degree tilt being enough to work those stations with only 10 db loss. At the same time the vertical antenna would still work the FM locals with only 0.4 db of loss.
Conversely, a beam that is needed to work ssb stations at 400 km can be tilted by 18 degrees, causing only 0.4 db of loss, and accepting that the locals on FM (who are always strong anyway) will be 10 db weaker than they would be if you changed the antenna to vertical polarisation.
Finally, regarding the 35 db of loss from the 89 degree offset: it is quite difficult to set an antenna that accurately. At a 2 degree error the loss drops to about 29 db, and at 3 degrees to about 26 db.
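The sensitivity to small setup errors near the 90 degree offset falls straight out of the same formula:

```python
import math

# Loss when the antenna is off the exact cross-polarised position by a few degrees.
for err in (1, 2, 3):
    loss = 20 * math.log10(math.sin(math.radians(err)))
    print(f"{err} degree error: {loss:.0f} db")  # 1 -> -35, 2 -> -29, 3 -> -26
```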
So it is entirely feasible that, due to errors in antenna setup, the measured cross-polarisation rejection will sit well below the theoretical maximum, which is infinite only at an exact 90 degree offset.
Submitted to my learned friends for consideration.
Footnotes: * couldn’t resist** this one…
** see *