N.N. Olyunin, V.V. Sazonov, A.G. Vinogradov
Ionospheric polarization effects are analyzed in the geometrical-optics approximation: the Faraday effect and the Cotton-Mouton effect. These effects rotate the polarization plane and deform the polarization ellipse of a radar signal.
The Faraday effect is traditionally considered the main cause of signal depolarization, and in most cases its magnitude exceeds that of the Cotton-Mouton effect. The original scattering matrix of a target can be recovered from fully polarimetric radar data distorted by the Faraday effect, but only if the depolarization caused by the Cotton-Mouton effect is negligible.
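As an illustration of why recovery is possible when only the Faraday effect acts, the distortion can be modeled as a pure rotation of the measurement basis. The sketch below assumes Freeman's one-way rotation model for monostatic radar, M = R(Ω) S R(Ω), with the rotation angle Ω taken as known; the matrix values are hypothetical and not from the paper.

```python
import numpy as np

def faraday_rotation_matrix(omega_rad):
    """2x2 rotation matrix for a one-way Faraday rotation by omega_rad.

    Assumed monostatic model (Freeman): measured matrix M = R S R,
    where S is the target's true scattering matrix.
    """
    c, s = np.cos(omega_rad), np.sin(omega_rad)
    return np.array([[c, s], [-s, c]])

def distort(S, omega_rad):
    """Apply the two-way Faraday distortion to a scattering matrix S."""
    R = faraday_rotation_matrix(omega_rad)
    return R @ S @ R

def restore(M, omega_rad):
    """Undo the distortion: R(-omega) M R(-omega) = S, since R(-w)R(w) = I."""
    R_inv = faraday_rotation_matrix(-omega_rad)
    return R_inv @ M @ R_inv

# Hypothetical scattering matrix (complex HH/HV/VH/VV amplitudes)
S_true = np.array([[1.0 + 0.2j, 0.1], [0.1, 0.8 - 0.1j]])
S_back = restore(distort(S_true, 0.3), 0.3)  # round trip recovers S_true
```

The rotation is real and orthogonal, so inversion is exact; if a Cotton-Mouton contribution (an ellipticity change, not a rotation) were present, this simple de-rotation would no longer recover S.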
At sufficiently high frequencies the quasi-isotropic approximation may be used for the ionosphere. In this approximation the anisotropy of the medium is treated as a small perturbation, and the evolution of the polarization ellipse along a ray is described by a differential equation. This equation has been solved numerically for different values of the parameters characterizing signal propagation. The characteristic dependence of the depolarization magnitude on range, ray direction, and frequency is presented, and conclusions are drawn about the magnitudes of the Faraday and Cotton-Mouton effects for different values of these parameters.
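To give a sense of the kind of along-ray integration involved, the sketch below numerically evaluates the classical Faraday rotation angle, Ω = e³/(2 ε₀ m² c ω²) ∫ Nₑ B∥ ds, for a vertical ray. This is only the leading (Faraday) term, not the paper's full polarization-evolution equation; the Chapman-layer profile and the field value B∥ = 40 μT are assumed illustrative numbers.

```python
import math

# Physical constants (SI units)
E_CHARGE = 1.602176634e-19   # elementary charge, C
E_MASS = 9.1093837015e-31    # electron mass, kg
EPS0 = 8.8541878128e-12      # vacuum permittivity, F/m
C_LIGHT = 2.99792458e8       # speed of light, m/s

def chapman_density(h_m, n_max=1e12, h_max=300e3, scale=60e3):
    """Illustrative Chapman-layer electron density (el/m^3).

    n_max, h_max and scale are assumed values, not taken from the paper.
    """
    z = (h_m - h_max) / scale
    return n_max * math.exp(0.5 * (1.0 - z - math.exp(-z)))

def faraday_rotation(freq_hz, b_parallel_t=4e-5, h_top=1000e3, n_steps=2000):
    """Faraday rotation angle (rad) along a vertical ray up to h_top,
    using the trapezoidal rule for the integral of N_e * B_parallel ds."""
    omega = 2.0 * math.pi * freq_hz
    coeff = E_CHARGE**3 / (2.0 * EPS0 * E_MASS**2 * C_LIGHT * omega**2)
    ds = h_top / n_steps
    integral = 0.0
    for i in range(n_steps + 1):
        w = 0.5 if i in (0, n_steps) else 1.0  # trapezoid end-point weights
        integral += w * chapman_density(i * ds) * b_parallel_t * ds
    return coeff * integral
```

The 1/ω² factor makes the characteristic frequency dependence explicit: doubling the frequency reduces the rotation angle by a factor of four, which is why the effect matters most at lower radar bands.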