Many events happen too fast for us to observe them well with the naked eye. The temporal and spatial limitations of visual perception are well known and determine what we can actually see. In recent years, sensors and camera systems have become available that surpass the limitations of human perception. In this paper, we investigate how augmented reality can be used to create a system that allows altering the speed at which we perceive the world around us. We contribute an experimental exploration of how visual slow-motion can be implemented to amplify human perception. We outline the research challenges and describe a conceptual architecture for manipulating temporal perception. Using augmented reality glasses, we created a proof-of-concept implementation and conducted a study, for which we report qualitative and quantitative results. We show how providing visual information from the environment at different speeds benefits the user. We also highlight the new approaches required to design interfaces that deal with decoupling the perception of the real world.

AHs ’20, March 16–17, 2020, Kaiserslautern, Germany. © 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM. https://doi.org/10.1145/3384657.3384659