The growing field of eye-tracking enables many researchers to investigate human (subconscious) behavior unobtrusively, naturally, and non-invasively. For this, investigators require a highly natural, immersive, yet controllable environment. Alongside mobile eye-tracking in the wild, virtual reality is currently becoming the state of the art for such experiments, as it combines the strengths of several eye-tracking modalities. In addition to simulated environments, omnidirectional video footage is widely used. 360° cameras capture these videos at resolutions of up to 16K. The videos can then be replayed on virtual reality headsets to study human behavior in a realistic yet highly controlled environment. However, the pipeline from stitched video to eye-tracking experiment results depends on costly proprietary or self-developed software that lacks standardization, leading to recurrent reimplementation. This paper describes an open-source software for stimulus organization and presentation that enables researchers to organize their stimuli in a standardized way and to conduct eye-tracking studies in virtual reality with a few clicks, without requiring programming knowledge or technical expertise. The code is available at https://bitbucket.org/benediktwhosp/zing