It is well known that optical signals propagating through the atmosphere are subject to random fluctuations in phase and amplitude. These fluctuations are caused by random temperature distributions in the atmosphere, which manifest themselves as random changes in the index of refraction along the propagation path. We introduce a simulation method for modeling atmospheric turbulence effects, based on a split-step approach that numerically solves the parabolic wave equation. Atmospheric turbulence effects are modeled by a sequence of phase screens. These phase screens are generated on a numerical grid of finite size, each corresponding to a thin slab of atmospheric turbulence along the path.
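The split-step scheme described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes FFT-based phase screens with a Kolmogorov phase spectrum (using the conventional constant 0.023 and a Fried parameter r0), paraxial angular-spectrum propagation between screens, and illustrative function names and parameter values chosen here for the example.

```python
import numpy as np

def ft_phase_screen(n, delta, r0, rng):
    """FFT-based Kolmogorov phase screen on an n x n grid (piston removed).

    n: grid points per side; delta: grid spacing [m]; r0: Fried parameter [m].
    """
    df = 1.0 / (n * delta)                      # frequency-grid spacing [1/m]
    fx = np.fft.fftfreq(n, d=delta)
    fxx, fyy = np.meshgrid(fx, fx, indexing="ij")
    f = np.hypot(fxx, fyy)
    f[0, 0] = np.inf                            # suppress the zero-frequency (piston) term
    psd_phi = 0.023 * r0 ** (-5.0 / 3.0) * f ** (-11.0 / 3.0)  # Kolmogorov phase PSD
    # Complex Gaussian spectrum shaped by the PSD; an inverse FFT yields the screen.
    cn = (rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))) \
         * np.sqrt(psd_phi) * df
    return np.real(np.fft.ifft2(cn)) * n ** 2   # undo NumPy's 1/n^2 normalization

def fresnel_propagate(u, wavelength, delta, dz):
    """Paraxial (Fresnel) angular-spectrum propagation of field u over distance dz."""
    n = u.shape[0]
    fx = np.fft.fftfreq(n, d=delta)
    fxx, fyy = np.meshgrid(fx, fx, indexing="ij")
    H = np.exp(-1j * np.pi * wavelength * dz * (fxx ** 2 + fyy ** 2))
    return np.fft.ifft2(np.fft.fft2(u) * H)

def split_step(u, wavelength, delta, dz, screens):
    """Alternate vacuum propagation steps with phase-screen multiplications."""
    for phz in screens:
        u = fresnel_propagate(u, wavelength, delta, dz)
        u = u * np.exp(1j * phz)                # thin turbulent slab as a pure phase factor
    return u

# Example: a Gaussian beam propagated through five phase screens (illustrative values).
rng = np.random.default_rng(0)
n, delta = 256, 2e-3                            # 256 x 256 grid, 2 mm spacing
wavelength, dz, r0 = 1.55e-6, 200.0, 0.05      # [m], [m], [m]
x = (np.arange(n) - n // 2) * delta
xx, yy = np.meshgrid(x, x, indexing="ij")
u0 = np.exp(-(xx ** 2 + yy ** 2) / (2 * 0.05 ** 2)).astype(complex)
screens = [ft_phase_screen(n, delta, r0, rng) for _ in range(5)]
u1 = split_step(u0, wavelength, delta, dz, screens)
```

Because each phase screen and the paraxial transfer function are pure phase factors, the split-step operator is unitary on the grid: the total power of the field is conserved at every step, which provides a convenient numerical sanity check.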