Visually-guided behavior and optogenetically-induced learning in head-fixed flies exploring a virtual landscape


Disclaimer: The code links provided for this paper are external. Science Nest bears no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).

Please contact us if any link here is broken.

Authors Hannah Haberkern, Melanie A. Basnak, Biafra Ahanonu, David Schauder, Jeremy D. Cohen, Mark Bolstad, Christopher Bruns, Vivek Jayaraman
Journal/Conference Name Current Biology
Paper Category
Paper Abstract Studying the intertwined roles of sensation, experience, and directed action in navigation has been facilitated by the development of virtual reality (VR) environments for head-fixed animals, allowing for quantitative measurements of behavior in well-controlled conditions. VR has long featured in studies of Drosophila melanogaster, but these experiments have typically allowed the fly to change only its heading in a visual scene and not its position. Here we explore how flies move in two dimensions (2D) using a visual VR environment that more closely captures an animal’s experience during free behavior. We show that flies’ 2D interaction with landmarks cannot be automatically derived from their orienting behavior under simpler one-dimensional (1D) conditions. Using novel paradigms, we then demonstrate that flies in 2D VR adapt their behavior in response to optogenetically delivered appetitive and aversive stimuli. Much like free-walking flies after encounters with food, head-fixed flies exploring a 2D VR respond to optogenetic activation of sugar-sensing neurons by initiating a local search, which appears not to rely on visual landmarks. Visual landmarks can, however, help flies to avoid areas in VR where they experience an aversive, optogenetically generated heat stimulus. By coupling aversive virtual heat to the flies’ presence near visual landmarks of specific shapes, we elicit selective learned avoidance of those landmarks. Thus, we demonstrate that head-fixed flies adaptively navigate in 2D virtual environments, but their reliance on visual landmarks is context dependent. These behavioral paradigms set the stage for interrogation of the fly brain circuitry underlying flexible navigation in complex multisensory environments.
Date of publication 2019
Code Programming Language Jupyter Notebook
Comment
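
The central methodological distinction in the abstract is between 1D VR, where the fly's turning only rotates the visual scene, and 2D VR, where forward walking also translates the fly's virtual position. The sketch below (in Python, since the linked code is a Jupyter Notebook) is a minimal illustration of that pose-update difference, not the authors' implementation; the function names and the treadmill-derived velocity readings are hypothetical.

```python
import numpy as np

def update_pose_1d(heading, d_heading):
    """1D VR: the fly's turning only rotates the visual scene;
    its virtual position never changes."""
    return (heading + d_heading) % (2 * np.pi)

def update_pose_2d(x, y, heading, d_heading, forward_speed, dt):
    """2D VR: turning updates the heading, and forward walking also
    translates the fly's virtual (x, y) position along that heading."""
    heading = (heading + d_heading) % (2 * np.pi)
    x += forward_speed * np.cos(heading) * dt
    y += forward_speed * np.sin(heading) * dt
    return x, y, heading

# Integrate a short trajectory from made-up treadmill readings
# (per-frame turn increments in radians, forward speeds in mm/s).
x, y, heading = 0.0, 0.0, 0.0
for d_heading, v in [(0.10, 2.0), (-0.05, 1.5), (0.00, 2.5)]:
    x, y, heading = update_pose_2d(x, y, heading, d_heading, v, dt=0.02)
print(f"virtual position: ({x:.3f}, {y:.3f}) mm, heading: {heading:.3f} rad")
```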

Copyright Researcher 2021