Dream Decoding Video

We just posted a video from our research on decoding dreams from human brain activity. The video shows the decoded contents during sleep, followed by the subject's report describing what they saw. Please enjoy!

https://www.youtube.com/watch?v=inaH_i_TjV4&feature=youtu.be

Time series of visual dream contents decoded from brain activity in higher visual cortex during sleep (two dream samples). The movie displays superimposed stimulus images from the 18 semantic categories used for decoder training, with each image's contrast modulated by the decoder output (a continuous “score”) for that category at each time point. The images for each category were randomly picked from a stimulus image set collected from web databases. The tag cloud shows the category names, with font sizes that also vary with the output scores. The subject's verbal report upon awakening is shown at the end. The movie is not meant to be a reconstruction of the shapes or colors seen in the dreams. Rather, the images indicate what kinds of objects or scenes (semantic categories) were likely present in the dreams, or what kinds of images would produce similar brain activity in higher visual cortex, a brain area known to respond selectively to visual semantic categories.
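The contrast modulation described above can be sketched in a few lines. This is only an illustrative reading of the caption, not the actual code used to make the movie: `score` stands in for the decoder's continuous output for one category at one time point, and scaling an image around its mean is one simple way to modulate contrast.

```python
import numpy as np

def modulate_contrast(image, score):
    """Scale an image's contrast around its mean intensity.

    `score` is a hypothetical decoder output in [0, 1]: 0 fades the
    image to uniform gray, 1 leaves it at full contrast.
    """
    mean = image.mean()
    return mean + score * (image - mean)

# Toy grayscale "stimulus image" with values in [0, 1].
img = np.array([[0.0, 1.0],
                [0.2, 0.8]])

faded = modulate_contrast(img, 0.0)  # uniform gray at the image mean
full = modulate_contrast(img, 1.0)   # unchanged original image
```

In the movie, one such modulated image per category would be superimposed at each time point, so categories with high decoder scores dominate the frame.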

BrainLiner v1.11

Last week we released version 1.11 of BrainLiner.jp. This release includes a complete redesign of the site and a new HTML5 data previewer.

BrainLiner v1.06

We are happy to announce the release of v1.06 of BrainLiner.jp, our lab’s web portal for sharing neurophysiological data, with an emphasis on facilitating brain-machine interface (BMI) research. This release was in development for several months and includes the following key updates:

  • “Experiments” are now called “Projects”
  • References related to your experiment, such as published journal articles, can now be added
  • A field for contact information that will be made public has been added
  • Channels inside Neuroshare data files can now be edited to include meta-information, such as whether they are data channels or stimulus labels
  • For data files with fewer than 200 channels, we compute correlation matrices and principal-component variance graphs and display them on a new “View Statistics” page associated with each data file
  • By default, up to 20 channels are displayed in the Flash-based data previewer
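For readers curious what the “View Statistics” numbers represent, here is a minimal sketch of how a channel-by-channel correlation matrix and principal-component variances could be computed from a (samples × channels) recording. This is an assumption-laden illustration, not BrainLiner's actual pipeline; the function name and array layout are hypothetical.

```python
import numpy as np

def channel_statistics(data):
    """Compute toy per-file statistics for a (samples x channels) array.

    Returns the channel correlation matrix and the fraction of variance
    explained by each principal component, in descending order.
    (Illustrative sketch only; not the BrainLiner implementation.)
    """
    # Channel-by-channel Pearson correlation matrix.
    corr = np.corrcoef(data, rowvar=False)
    # Principal-component variances = eigenvalues of the covariance matrix.
    centered = data - data.mean(axis=0)
    cov = centered.T @ centered / (len(data) - 1)
    eigvals = np.linalg.eigvalsh(cov)[::-1]  # descending order
    explained = eigvals / eigvals.sum()
    return corr, explained

rng = np.random.default_rng(0)
recording = rng.standard_normal((200, 5))  # 200 samples, 5 channels
corr, explained = channel_statistics(recording)
```

A 200-channel cutoff keeps such computations cheap: the correlation matrix grows quadratically with the number of channels.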

Please give us your feedback and new ideas!

ATR Open House 2011

This year, the ATR Open House will be held on November 11 and 12. More details will be posted later, but here is a rough schedule:

Nov. 11 (Fri)
 1st day of the Open House 10:00-17:00

Nov. 12 (Sat)
 2nd day of the Open House 10:00-15:00
ATR 25th Anniversary Celebration 13:00-19:00