Title: Efficient Rendering of Ocular Wavefront Aberrations using Tiled Point‐Spread Function Splatting
Authors: Csoba, István; Kunkli, Roland
Editors: Benes, Bedrich and Hauser, Helwig
Date: 2021-10-08
Year: 2021
ISSN: 1467-8659
DOI: 10.1111/cgf.14267 (https://doi.org/10.1111/cgf.14267)
Handle: https://diglib.eg.org:443/handle/10.1111/cgf14267
Pages: 182-199
Keywords: human vision simulation; depth of field; ocular wavefront aberrations; point‐spread functions

Abstract: Visual aberrations are the imperfections in human vision, which play an important role in our everyday lives. Existing algorithms to simulate such conditions are either not suited for low‐latency workloads or limit the kinds of supported aberrations. In this paper, we present a new simulation method that supports arbitrary visual aberrations and runs at interactive, near real‐time performance on commodity hardware. Furthermore, our method only requires a single set of on‐axis phase aberration coefficients as input and handles the dynamic change of pupil size and focus distance at runtime. We first describe a custom parametric eye model and parameter estimation method to find the physical properties of the simulated eye. Next, we describe our parameter sampling strategy, which we use with the estimated eye model to establish a coarse point‐spread function (PSF) grid. We also propose a GPU‐based interpolation scheme for the kernel grid, which we use at runtime to obtain the final vision simulation by extending an existing tile‐based convolution approach. We showcase the capabilities of our eye estimation and rendering processes using several different eye conditions and provide the corresponding performance metrics to demonstrate the applicability of our method for interactive environments.
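
The abstract centers on building point‐spread functions from phase aberration coefficients. As background, the sketch below illustrates one conventional way such a PSF can be derived from a wavefront error map via Fraunhofer diffraction of the pupil function; it is not the paper's GPU pipeline, and the function name, grid resolution, wavelength, and defocus coefficient are assumptions chosen only for demonstration.

```python
# Minimal sketch: PSF from a wavefront aberration map (not the authors' method).
import numpy as np

def psf_from_wavefront(wavefront_um, pupil_mask, wavelength_um=0.55, pad=4):
    """Return a normalized PSF for a wavefront error map (micrometres)
    sampled over the pupil, given a binary pupil mask."""
    n = pupil_mask.shape[0]
    # Complex pupil function: amplitude (mask) times phase from the wavefront error.
    phase = 2.0 * np.pi * wavefront_um / wavelength_um
    pupil = pupil_mask * np.exp(1j * phase)
    # Zero-pad to refine the sampling of the far-field (image-plane) pattern.
    padded = np.zeros((pad * n, pad * n), dtype=complex)
    padded[:n, :n] = pupil
    # Far-field intensity = squared magnitude of the Fourier transform.
    field = np.fft.fftshift(np.fft.fft2(padded))
    psf = np.abs(field) ** 2
    return psf / psf.sum()

# Example: circular pupil with a small defocus term (Zernike Z_2^0); the
# 0.3 um coefficient is an arbitrary assumption for illustration.
n = 128
y, x = np.mgrid[-1:1:n * 1j, -1:1:n * 1j]
r2 = x * x + y * y
mask = (r2 <= 1.0).astype(float)
defocus_um = 0.3 * np.sqrt(3.0) * (2.0 * r2 - 1.0)
psf = psf_from_wavefront(defocus_um * mask, mask)
print(psf.shape, psf.max())
```

In the paper, such kernels are precomputed only at a coarse set of parameter samples (pupil size, focus distance, etc.) and interpolated on the GPU at runtime; the sketch covers only the single-PSF step.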