11/09/2020

Monstre Matin Illusion: 10th Award of the 12th Illusion Contest in Japan

Eiji Watanabe (National Institute for Basic Biology) and Lana Sinapayen (Sony CSL / ELSI) won 10th place at the 12th Visual Illusion and Auditory Illusion Contest for the Monstre Matin Illusion. The Monstre Matin Illusion is a color afterimage induced by the glare effect. Concentrate on the cross. Enjoy it!


Further Demo & Citation:

Watanabe, Eiji; Sinapayen, Lana (2020): Monstre Matin Illusion. https://doi.org/10.6084/m9.figshare.13125518

3/16/2020

Motion illusions arise from light source ambiguity

Lana Sinapayen and Eiji Watanabe,
Motion illusions arise from light source ambiguity.
The Journal of Brief Ideas, https://doi.org/10.5281/zenodo.3709097.
Published: 13 March (2020)

Motion illusions such as the rotating snakes illusion or the rotating cube illusion remain unexplained. Recently, Watanabe et al. showed that this category of illusions might be related to predictive coding. Unlike existing research focusing on 2D patterns, we propose that all motion illusions arise from the visual system's predictions resolving 3D light-and-shadow patterns perceived as ambiguous. All motion illusions can be understood as mimicking the shadow pattern caused by the relative motion of a light source and an object. For example, the rotating snakes illusion in Fig. 4c can be understood as a 3D black-and-white wavy object seen from above, with grey shadows projected by a light source moving clockwise so as to stay perpendicular to each wave crest (attachment: the same illusion recreated with a real object). Without the light source being explicitly visible, the pattern could be caused by (1) the light moving relative to a static object or (2) the object moving in the opposite direction relative to a static light source. Without cues, in a statically lit environment, the visual system resolves the ambiguity by assuming the most likely scenario: the object must be moving, not the light.
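The ambiguity can be stated formally: under Lambertian shading, intensity depends only on the relative angle between the surface normal and the light direction, so a light rotating over a static surface and a surface counter-rotating under a static light produce identical images. A minimal numpy sketch of this equivalence (a 1D caricature, not the actual stimulus):

```python
import numpy as np

def lambertian(normal_angle, light_angle):
    """Lambertian shading: intensity depends only on the RELATIVE angle
    between the surface normal and the light direction."""
    return np.maximum(0.0, np.cos(light_angle - normal_angle))

# Surface normal angles of a wavy 1D profile (radians).
normals = 0.5 * np.cos(np.linspace(0, 2 * np.pi, 100))

# Scenario 1: the light rotates by +delta, the object is static.
# Scenario 2: the object rotates by -delta, the light is static.
delta = 0.3
moving_light = lambertian(normals, delta)
moving_object = lambertian(normals - delta, 0.0)

# The two shading patterns are identical, so the retinal image alone
# cannot disambiguate which one moved.
assert np.allclose(moving_light, moving_object)
```

Since both scenarios yield the same image at every instant, only a prior (e.g. "lights are usually static") can break the tie.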

12/02/2019

Monstre Benham Illusion: 2nd Award of the 11th Illusion Contest in Japan

Eiji Watanabe (National Institute for Basic Biology) and Lana Sinapayen (Sony CSL / ELSI) won 2nd place at the 11th Visual Illusion and Auditory Illusion Contest for the Monstre Benham Illusion.

Lana Sinapayen and Eiji Watanabe @Kobe Univ., 2019
The Benham top is a black-and-white top that, when spun, causes people to perceive illusory colors such as red or blue. These illusory colors were first discovered by the German physicist Gustav Fechner, and the top was built by the English journalist Charles Benham. We have implemented a version of the Benham top that, instead of spinning, works like a flip book.
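As a rough illustration of the flip-book idea (a hypothetical sketch, not the actual contest stimulus), a Benham-style disk can be generated frame by frame and cycled through instead of being physically spun:

```python
import numpy as np

def benham_frame(size=200, angle=0.0):
    """One frame of a flip-book Benham top: half the disk is solid black,
    the other half carries thin black arcs on white, rotated by `angle`."""
    y, x = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    r = np.hypot(x, y)
    theta = (np.arctan2(y, x) - angle) % (2 * np.pi)
    frame = np.ones((size, size))            # white background
    frame[(r <= 1) & (theta < np.pi)] = 0.0  # solid black half-disk
    # Arcs at three radii on the white half, each spanning one sector.
    for i, radius in enumerate([0.3, 0.55, 0.8]):
        arc = (np.abs(r - radius) < 0.03) \
              & (theta >= np.pi + i * np.pi / 4) \
              & (theta < np.pi + (i + 1) * np.pi / 4)
        frame[arc] = 0.0
    return frame

# Instead of physically spinning, cycle through pre-rotated frames.
frames = [benham_frame(angle=a)
          for a in np.linspace(0, 2 * np.pi, 24, endpoint=False)]
```

Playing `frames` in a loop at a suitable frame rate reproduces the alternation of black and white that drives the illusory colors, without any physical rotation.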

Twitter:
https://twitter.com/sina_lana/status/1201114897779576832?s=20

Demo:
https://www.sonycsl.co.jp/news/9164/

Citation:
DOI: https://doi.org/10.6084/m9.figshare.10046534
Eiji Watanabe and Lana Sinapayen, Monstre Benham Illusion, 2019

10/29/2019

11th Visual Illusion and Auditory Illusion Contest in Japan

Dr. Lana Sinapayen (Sony CSL / ELSI) and Prof. Eiji Watanabe (National Institute for Basic Biology) have been creating new visual illusions based on their joint work. Four of these illusions were submitted to the 11th Visual Illusion and Auditory Illusion Contest in Japan.

Please enjoy the animations.

https://www.sonycsl.co.jp/news/9164/


6/23/2019

Predictive Coding Deep Neural Networks

Predictive Coding Deep Neural Networks in Chainer are here:

https://github.com/eijwat/Predictive_Coding_DNN_S
(without TensorBoard; faster version)

or

https://github.com/eijwat/Predictive_Coding_DNN_TB
(with TensorBoard)


Please refer to:
Watanabe E, Kitaoka A, Sakamoto K, Yasugi M and Tanaka K (2018)
Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction.
Front. Psychol. 9:345. doi: 10.3389/fpsyg.2018.00345
https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00345

3/04/2019

New collaborative work with the Yamamoto lab.

We used retro-reflection to perform three-dimensional imaging in water and applied it to medaka behavior analysis.

Erina Abe, Masaki Yasugi, Hideaki Takeuchi, Eiji Watanabe, Yasuhiro Kamei, and Hirotsugu Yamamoto, Development of omnidirectional aerial display with aerial imaging by retro-reflection (AIRR) for behavioral biology experiments. Optical Review https://doi.org/10.1007/s10043-019-00502-w (2019)

4/13/2018

Ink blots Illusion

Ink blots on a window blind appear to expand.

Citation:
DOI: https://doi.org/10.6084/m9.figshare.6137582
Eiji Watanabe, Ink blots Illusion, 2018

You can see many variations of the Ink Blots Illusion at https://doi.org/10.6084/m9.figshare.6137582.

3/20/2018

Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction

Deep neural networks (DNNs), which have been developed with reference to the network structures and the operational algorithms of the brain, have achieved notable success in a broad range of fields, including computer vision, in which they have produced results comparable to, and in some cases superior to, human experts. In recent years, DNNs have also been expected to be useful as a tool for studies of the brain.

Recently, a research team led by Associate Professor Eiji Watanabe of the National Institute for Basic Biology successfully reproduced illusory motion with DNNs trained for prediction.

The DNNs are based on predictive coding theory (Figure 1), which assumes that the internal models of the brain predict the visual world at all times and that errors between the prediction and the actual sensory input further refine the internal models. If the theory substantially reproduces the visual information processing of the brain, then the DNNs can be expected to represent the human visual perception of motion.
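The core loop of predictive coding can be caricatured in a few lines of numpy (a toy linear sketch, not the convolutional PredNet used in the paper): an internal state generates a prediction of the sensory input, and the prediction error refines that state.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy predictive coding: an internal state r generates a prediction W @ r
# of the sensory input x; the prediction error refines r.
W, _ = np.linalg.qr(rng.normal(size=(8, 3)))  # fixed generative weights
x = W @ np.array([1.0, -0.5, 2.0])            # input with a true latent cause
r = np.zeros(3)                               # internal model's estimate

errors = []
for _ in range(200):
    prediction = W @ r
    error = x - prediction        # prediction error signal
    r += 0.05 * W.T @ error       # error-driven refinement of the model
    errors.append(np.linalg.norm(error))

# The error shrinks as the internal model learns to predict the input.
assert errors[-1] < 0.01 * errors[0]
```

The same principle, applied to video frames with learned weights, is what lets the trained network anticipate motion.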

In this research, the DNNs were trained with natural scene videos of motion from the point of view of the viewer (Figure 2), and the motion prediction ability of the obtained computer model was verified using a rotating propeller in unlearned videos and the “Rotating Snake Illusion” (Figure 3). The computer model accurately predicted the magnitude and direction of motion of the rotating propeller in the unlearned videos. Surprisingly, it also represented the rotational motion for illusion images that were not moving physically, much like human visual perception (Figure 4). While the trained network accurately reproduced the direction of illusory rotation, it did not detect motion components in negative control pictures wherein people do not perceive illusory motion.
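The kind of motion read-out used in this verification can be illustrated with a toy example (a 1D cross-correlation stand-in, not the optical-flow algorithm used in the paper): the displacement between two consecutive frames is recovered from the peak of their circular cross-correlation.

```python
import numpy as np

rng = np.random.default_rng(1)

def estimate_shift(frame_a, frame_b):
    """Estimate the displacement between two 1D frames from the peak of
    their circular cross-correlation (computed via FFT)."""
    corr = np.fft.ifft(np.fft.fft(frame_a) * np.conj(np.fft.fft(frame_b))).real
    shift = int(np.argmax(corr))
    n = len(frame_a)
    return shift if shift <= n // 2 else shift - n  # wrap to a signed shift

signal = rng.normal(size=128)
moved = np.roll(signal, 5)            # the "next frame": signal shifted by 5
print(estimate_shift(moved, signal))  # → 5
```

Applied between a pair of consecutive predicted images, such a read-out yields both the direction and the magnitude of the motion the network expects.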

This research supports the exciting idea that the mechanism assumed by the predictive coding theory is a basis of motion illusion generation. Using sensory illusions as indicators of human perception, deep neural networks are expected to contribute significantly to the development of brain research.

These research results were published in Frontiers in Psychology on March 15. This research was conducted as a collaborative research project by the National Institute for Basic Biology, SOKENDAI (the Graduate University for Advanced Studies), Ritsumeikan University, the National Institute for Physiological Sciences, and Sakura Research Office.

Figure 1: A schematic diagram of PredNet (a modification of Figure 1 in Lotter et al. 2016, arXiv:1605.08104), illustrating the information flow within a single layer. Vertical arrows represent connections with other layers. Each layer consists of “Representation” neurons, which output a layer-specific “Prediction” at each time step; this prediction is subtracted from the “Target” to produce an error, which is then propagated laterally and vertically through the network. External data or a lower-layer error signal is input to the “Target”. In each layer, the input information is not processed directly; instead, the prediction error signal is processed.
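The error computation described in the caption can be sketched as follows (a simplified caricature of a PredNet error unit; the real network operates on convolutional feature maps): the positive and negative parts of the target-minus-prediction difference are rectified separately and concatenated.

```python
import numpy as np

def prediction_error(target, prediction):
    """PredNet-style error unit: the positive and negative halves of the
    difference are rectified separately and concatenated."""
    diff = target - prediction
    return np.concatenate([np.maximum(diff, 0), np.maximum(-diff, 0)])

target = np.array([1.0, 0.2, -0.5])
prediction = np.array([0.5, 0.2, 0.5])
error = prediction_error(target, prediction)

# A perfect prediction yields a zero error signal: nothing is passed on.
assert np.all(prediction_error(target, target) == 0)
```

When predictions are accurate, the error signal vanishes, so only the unpredicted part of the input propagates through the network.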

Figure 2: Training videos. Models were trained using videos from the First-Person Social Interactions Dataset (Fathi et al., 2012, CVPR 2012), which contains day-long videos of eight subjects spending their day at Disney World Resort in Orlando, Florida. The cameras were mounted on a cap worn by the subjects.

Figure 3: Akiyoshi Kitaoka’s rotating snake illusion (left panel). People perceive clockwise or counter-clockwise motion depending on the colour alignment. Negative controls (non-illusions), for which people perceive no motion, are presented in the right panel. To experience stronger illusory motion perception, please refer to “Akiyoshi’s illusion pages”, http://www.ritsumei.ac.jp/~akitaoka/index-e.html.

Figure 4: The DNNs detected optical flow vectors in the illusion. Optical flow vectors detected between a pair of consecutive predictive images of the illusion. Red bars denote the direction and magnitude of vectors, yellow dots denote the start points of the vectors. The left is a single ring of the rotating snake illusion, and the right is a negative control image. 

##############################
Frontiers in Psychology (2018) Volume 9, Article 345.
“Illusory Motion Reproduced by Deep Neural Networks Trained for Prediction” Eiji Watanabe, Akiyoshi Kitaoka, Kiwako Sakamoto, Masaki Yasugi and Kenta Tanaka
DOI: 10.3389/fpsyg.2018.00345
https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00345/

Frontiers Featured NEWS (2018) April 26
https://blog.frontiersin.org/2018/04/26/artificial-intelligence-tricked-by-optical-illusion-just-like-humans/
##############################


Additional notes:
This paper verifies the delta-model hypothesis proposed in Watanabe et al. (2010). The delta model is an extended theory of predictive coding (see below for details).

Refer to:
Watanabe, E., Matsunaga, W., and Kitaoka, A. (2010). Motion signals deflect relative positions of moving objects. Vision Research 50, 2381–2390. [PubMed]

Related articles:
1) Motion signals deflect relative positions of moving objects
2) Delta model



11/15/2017

Discovery of dynamic seasonal changes in color perception ~The small fish "medaka" shows large differences in color perception in summer and winter~

Medaka fish in summer and winter (from the NIBB website)
In many areas, the environment fluctuates greatly with the seasons, and animals living in those areas must adapt to the changing environment. A research group from the National Institute for Basic Biology and Nagoya University in Japan found that the color perception of medaka, a small fish inhabiting rice fields and streams, varies greatly with the seasons. This was a collaborative research project with the Yoshimura Laboratory. For details, please refer to NIBB's website.

The article “Dynamic plasticity in phototransduction regulates seasonal changes in color perception” was published in Nature Communications.
http://dx.doi.org/10.1038/s41467-017-00432-8

Authors:

Tsuyoshi Shimmura, Tomoya Nakayama, Ai Shinomiya, Shoji Fukamachi, Masaki Yasugi, Eiji Watanabe, Takayuki Shimo, Takumi Senga, Toshiya Nishimura, Minoru Tanaka, Yasuhiro Kamei, Kiyoshi Naruse, Takashi Yoshimura