The radio waves from 'radio stars' may suffer irregular phase changes in passing through the terrestrial ionosphere, so that when they reach the earth's surface they produce a disturbance in which both amplitude and phase vary over the ground. In this paper it is assumed that the wave emerges from the ionosphere with amplitude constant but with phase varying across its wave-front, and deductions are made about the diffraction pattern produced at the ground. It is shown how, from a knowledge of the way in which phase and amplitude vary at the ground, it is possible to deduce the average magnitude of the phase deviations produced by the ionosphere and their lateral extent. It is also shown how an investigation of the diffraction patterns produced by different wave-lengths may lead to an estimate of the distance of the effective diffracting screen from the plane of observation. Experiments to determine the diffraction pattern formed at the ground by waves emitted from a radio star are described in outline, and an application of the theory to some of the observations indicates that the ionospheric irregularities have a lateral extent of the order of 5 km. and are sufficient to cause a phase deviation of 1 to 2 radians for a wave-length of $6\cdot 7$ m.
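The model summarized above (constant amplitude, randomly varying phase at the bottom of the ionosphere, with diffraction converting the phase variations into amplitude variations at the ground) can be illustrated numerically. The sketch below builds a one-dimensional random phase screen and propagates the field to the ground with the paraxial angular-spectrum method. The screen-to-ground distance, grid dimensions, and random-screen construction are illustrative assumptions, not values taken from the paper's observations; only the wavelength (6.7 m), the irregularity scale (~5 km), and the rms phase (1 to 2 radians) echo figures quoted in the abstract.

```python
import numpy as np

# Illustrative sketch (not the paper's method): a wave leaves the
# ionosphere with unit amplitude but randomly varying phase; Fresnel
# diffraction over the screen-to-ground distance then produces
# amplitude variations in the pattern observed at the ground.

rng = np.random.default_rng(0)

N = 4096            # grid points (assumed)
dx = 100.0          # grid spacing, m (assumed)
lam = 6.7           # wavelength, m (from the abstract)
z = 400e3           # assumed screen-to-ground distance, m
a = 5e3             # lateral scale of irregularities, m (~abstract's 5 km)
phi_rms = 1.5       # rms phase deviation, rad (abstract: 1 to 2 radians)

f = np.fft.fftfreq(N, d=dx)           # spatial frequencies, cycles/m

# Random phase screen with a Gaussian correlation of scale a:
# filter white noise with a Gaussian low-pass in the spectral domain.
white = rng.standard_normal(N)
filt = np.exp(-(np.pi * a * f) ** 2)
phi = np.fft.ifft(np.fft.fft(white) * filt).real
phi *= phi_rms / phi.std()            # normalise to the chosen rms phase

# Field just below the screen: constant amplitude, varying phase.
U0 = np.exp(1j * phi)

# Paraxial (Fresnel) transfer function; |H| = 1, so energy is conserved.
H = np.exp(-1j * np.pi * lam * z * f ** 2)
Ug = np.fft.ifft(np.fft.fft(U0) * H)  # field at the ground

amp = np.abs(Ug)
print("rms phase at screen     :", round(phi.std(), 3))
print("rms amplitude at ground :", round(amp.std(), 3))
```

Because the transfer function has unit modulus, the mean intensity at the ground stays exactly 1; the pure phase modulation at the screen nevertheless develops measurable amplitude fluctuations after propagation, which is the effect the paper exploits to infer the screen's properties from ground observations.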