Abstract: Synthesizing perceivable artificial neural inputs independent of typical sensory channels remains a
fundamental challenge in the development of next-generation brain-machine interfaces. Establishing a
minimally invasive, wirelessly operated, and miniaturized platform with long-term stability is crucial for
creating a clinically meaningful interface capable of mediating artificial perceptual feedback. In this study,
we demonstrate a miniaturized, fully implantable wireless transcranial optogenetic encoder designed to
generate artificial perceptions through digitized optogenetic manipulation of large cortical ensembles.
This platform enables the spatiotemporal orchestration of large-scale cortical activity for remote
perception genesis via real-time wireless communication and control, with device performance optimized
by simulation-guided methods that address light and heat propagation during operation. Cue
discrimination during operant learning demonstrates the wireless genesis of artificial percepts sensed by
mice; analyses of discrimination performance based on spatial distance across large cortical networks
and on sequential order reveal principles that adhere to general perceptual rules. These conceptual
and technical advancements expand our understanding of artificial neural syntax and its perception by
the brain, guiding the evolution of next-generation brain-machine communication.