Authors: Lochner, Joshua; Gain, James; Perche, Simon; Peytavie, Adrien; Galin, Eric; Guérin, Eric
Editors: Chaine, Raphaëlle; Deng, Zhigang; Kim, Min H.
Date: 2023-10-09
Year: 2023
ISSN: 1467-8659
DOI: https://doi.org/10.1111/cgf.14941
URI: https://diglib.eg.org:443/handle/10.1111/cgf14941

Abstract: Generating heightfield terrains is a necessary precursor to the depiction of computer-generated natural scenes in a variety of applications. Authoring such terrains is made challenging by the need for interactive feedback, effective user control, and perceptually realistic output encompassing a range of landforms. We address these challenges by developing a terrain-authoring framework underpinned by an adaptation of diffusion models for conditional image synthesis, trained on real-world elevation data. This framework supports automated cleaning of the training set; authoring control through style selection and feature sketches; the ability to import and freely edit pre-existing terrains; and resolution amplification up to the limits of the source data. Our framework improves on previous machine-learning approaches by expanding landform variety beyond mountainous terrain to encompass cliffs, canyons, and plains; providing a better balance between terseness and specificity in user control; and improving the fidelity of global terrain structure and perceptual realism. This is demonstrated through drainage simulations and a user study testing perceived realism for different classes of terrain. The full source code, Blender add-on, and pretrained models are available.

Title: Interactive Authoring of Terrain using Diffusion Models
DOI: 10.1111/cgf.14941
Pages: 13 pages
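
For readers wanting a concrete picture of the conditional diffusion synthesis the abstract refers to, the following is a minimal illustrative sketch of sketch-conditioned DDPM sampling for a heightfield. It is not the paper's implementation: the denoiser, the channel-concatenation conditioning, and all names (SketchConditionedDenoiser, sample_heightfield) are assumptions made for illustration.

```python
# Illustrative sketch only; a real system would use a trained U-Net denoiser on DEM tiles.
import torch


class SketchConditionedDenoiser(torch.nn.Module):
    """Toy epsilon-predictor; stands in for a trained conditional denoising network."""

    def __init__(self, channels: int = 1):
        super().__init__()
        # Noisy heightfield and sketch condition are concatenated on the channel axis.
        self.net = torch.nn.Conv2d(channels * 2, channels, kernel_size=3, padding=1)

    def forward(self, x_t: torch.Tensor, t: torch.Tensor, sketch: torch.Tensor) -> torch.Tensor:
        # The timestep t is ignored in this toy stand-in; a real model embeds it.
        return self.net(torch.cat([x_t, sketch], dim=1))


@torch.no_grad()
def sample_heightfield(model, sketch, steps: int = 50):
    """Standard DDPM ancestral sampling, conditioned on a user feature sketch."""
    betas = torch.linspace(1e-4, 0.02, steps)
    alphas = 1.0 - betas
    alpha_bars = torch.cumprod(alphas, dim=0)

    x = torch.randn_like(sketch)  # start from pure Gaussian noise
    for i in reversed(range(steps)):
        t = torch.full((sketch.shape[0],), i, dtype=torch.long)
        eps = model(x, t, sketch)  # predicted noise, given the sketch condition
        # Posterior mean of x_{t-1} given x_t (standard DDPM update).
        coef = betas[i] / torch.sqrt(1.0 - alpha_bars[i])
        mean = (x - coef * eps) / torch.sqrt(alphas[i])
        noise = torch.randn_like(x) if i > 0 else torch.zeros_like(x)
        x = mean + torch.sqrt(betas[i]) * noise
    return x  # synthesized heightfield in normalized elevation units


if __name__ == "__main__":
    model = SketchConditionedDenoiser()
    sketch = torch.zeros(1, 1, 64, 64)  # blank sketch; real input would encode ridge/valley strokes
    terrain = sample_heightfield(model, sketch)
    print(terrain.shape)  # torch.Size([1, 1, 64, 64])
```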