Haibao Wang, Jun Kai Ho, Fan L. Cheng, Shuntaro C. Aoki, Yusuke Muraki,
Misato Tanaka, Jong-Yun Park & Yukiyasu Kamitani
To begin, clone the repository to your local machine using `git clone` with this project's URL:
```
git clone https://github.com/KamitaniLab/InterSiteNeuralCodeConversion.git
```

Step 1: Navigate to the base directory and create the Conda environment:
```
conda env create -f env.yaml
```

Step 2: Activate the environment:
```
conda activate NCC
```

To use this project, you'll need to download and organize the required data.
Alternatively, you can use the following commands to download specific data (the data will be automatically extracted and organized into the designated directory):
```
# In the "data" directory:

# To download the training fMRI data:
python download.py fmri_training

# Or to download the test fMRI data:
python download.py fmri_test

# To download the DNN features of training images:
python download.py stimulus_feature
```

To use this project, you'll also need to download the required pre-trained decoders from Figshare with the following command:
```
python download.py pre-trained-decoders
```

If you prefer to train the decoders yourself (approximately 2 days per subject), detailed instructions and scripts are available in the feature-decoding directory.
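The actual decoder training is defined by the scripts in the feature-decoding directory. Purely as an illustration of the underlying idea — a linear map from fMRI activity to DNN features — here is a minimal ridge-regression sketch on synthetic data (all variable names, dimensions, and the ridge formulation are assumptions of this toy example, not the repository's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for fMRI activity (samples x voxels) and DNN features
# (samples x units); the real data comes from download.py.
X_train = rng.standard_normal((200, 50))
true_w = rng.standard_normal((50, 8))                  # hidden toy mapping
Y_train = X_train @ true_w + 0.1 * rng.standard_normal((200, 8))

# Closed-form ridge regression: W = (X'X + alpha*I)^-1 X'Y.
alpha = 1.0
W = np.linalg.solve(X_train.T @ X_train + alpha * np.eye(50),
                    X_train.T @ Y_train)

# "Decode" features for new activity patterns.
X_test = rng.standard_normal((10, 50))
Y_pred = X_test @ W
print(Y_pred.shape)  # (10, 8)
```

One decoder of this form would be fit per DNN layer and subject, which is why training from scratch takes days per subject.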
To train the neural code converters using content loss for subject pairs, navigate to the NCC_content_loss directory and run:
```
python NCC_train.py --cuda
```

- Note: Use the `--cuda` flag when running on a GPU server. Omit `--cuda` if training on a CPU server.
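The training script defines the actual objective. As a loose conceptual sketch of a content-style loss — not the paper's implementation — a linear converter can be trained so that DNN features decoded from the converted activity match features decoded from the source activity, which is what removes the need for shared stimuli. The fixed linear "decoders" `D_src`/`D_tgt`, all dimensions, and the plain gradient descent below are assumptions of this toy example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical; real data has thousands of voxels/units).
n_samples, n_src_vox, n_tgt_vox, n_feat = 200, 30, 40, 10

# Synthetic source-subject activity and fixed linear "decoders" mapping
# each subject's brain activity to DNN features (sketch assumptions).
X_src = rng.standard_normal((n_samples, n_src_vox))
D_src = 0.1 * rng.standard_normal((n_src_vox, n_feat))  # source decoder
D_tgt = 0.1 * rng.standard_normal((n_tgt_vox, n_feat))  # target decoder

# Content-loss idea: features decoded from the converted activity
# (X_src @ W @ D_tgt) should match features decoded from the source
# activity itself (X_src @ D_src).
F_src = X_src @ D_src

W = np.zeros((n_src_vox, n_tgt_vox))  # linear converter to be learned
lr = 1e-2
losses = []
for _ in range(300):
    resid = X_src @ W @ D_tgt - F_src             # feature-space error
    losses.append(float(np.mean(resid ** 2)))
    grad = X_src.T @ resid @ D_tgt.T / n_samples  # (scaled) MSE gradient
    W -= lr * grad

print(losses[-1] < losses[0])  # True: the content loss decreases
```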
Training one subject pair usually takes about 15 hours due to the large computational requirements. You can also download the pre-trained converters from Figshare with the following command:
```
python download.py pre-trained-converters
```

To train the neural code converters using brain loss for subject pairs, navigate to the NCC_brain_loss directory and run:
```
python ncc_train.py
```

To decode DNN features from converted brain activities (approximately 80 mins per subject pair), use the following commands in the corresponding directory:
- For content loss-based converters:

  ```
  python NCC_test.py --cuda
  ```

- For brain loss-based converters:

  ```
  python ncc_test.py
  ```
To reconstruct images from the decoded features:
- Navigate to the `reconstruction` directory.
- Follow the provided README and reconstruction demo for detailed instructions on setting up the environment and usage.
- Modify the directory of the decoded features in the script as needed to reconstruct images.
The quantitative evaluations are presented in terms of conversion accuracy, decoding accuracy, and identification accuracy.
To calculate raw correlations for conversion accuracy, navigate to the conversion_accuracy directory and run:
- For content loss-based converters:

  ```
  # pattern correlation
  python fmri_pattern_corr_content_loss.py

  # profile correlation
  python fmri_profile_corr_content_loss.py
  ```

- For brain loss-based converters:

  ```
  # pattern correlation
  python fmri_pattern_corr_brain_loss.py

  # profile correlation
  python fmri_profile_corr_brain_loss.py
  ```
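Here, pattern correlation presumably refers to the Pearson correlation between converted and measured activity computed across voxels for each stimulus, while profile correlation is computed across stimuli for each voxel. A minimal numpy sketch of both (function name and toy data are hypothetical; the scripts above compute the actual values):

```python
import numpy as np

def pattern_and_profile_corr(converted, measured):
    """Correlate converted vs. measured fMRI (samples x voxels).

    Pattern correlation: per sample, across voxels.
    Profile correlation: per voxel, across samples.
    """
    def rowwise_corr(a, b):
        a = a - a.mean(axis=1, keepdims=True)
        b = b - b.mean(axis=1, keepdims=True)
        num = (a * b).sum(axis=1)
        den = np.sqrt((a ** 2).sum(axis=1) * (b ** 2).sum(axis=1))
        return num / den

    pattern = rowwise_corr(converted, measured)      # one r per sample
    profile = rowwise_corr(converted.T, measured.T)  # one r per voxel
    return pattern, profile

rng = np.random.default_rng(0)
measured = rng.standard_normal((50, 100))            # 50 stimuli, 100 voxels
converted = measured + 0.5 * rng.standard_normal((50, 100))
pattern, profile = pattern_and_profile_corr(converted, measured)
print(pattern.shape, profile.shape)  # (50,) (100,)
```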
To obtain the normalized correlations and plot Figures 2E and 2F with the provided results, use the following command:
```
python plot_figure.py
```

To calculate decoding accuracy for decoded features, first download the ground truth features of the stimulus images in the data directory using:
```
python download.py test_image-true_features
```

Then, navigate to the decoding_accuracy directory and run:
```
python featdec_eval.py
```

To plot Figures 3B and 3C with the provided results, use the following command:
```
python plot_figure.py
```

To quantitatively evaluate the reconstructed images, please request and download the ground truth stimulus images using this link (they cannot be redistributed due to licensing restrictions). Organize the downloaded images in the following directory structure: `data/test_image/source`.
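Identification accuracy is commonly computed pairwise: each decoded (or reconstructed) item is compared, by Pearson correlation, with its own ground truth and with every other item's ground truth, and the accuracy is the fraction of such pairwise comparisons the correct item wins. A minimal sketch under that assumed definition (function name and toy data are hypothetical; the evaluation scripts compute the actual values):

```python
import numpy as np

def pairwise_identification(decoded, true):
    """Pairwise identification accuracy over (samples x features) arrays."""
    n = len(decoded)
    # Row-normalize so the dot product gives Pearson correlation.
    d = decoded - decoded.mean(axis=1, keepdims=True)
    t = true - true.mean(axis=1, keepdims=True)
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    t /= np.linalg.norm(t, axis=1, keepdims=True)
    corr = d @ t.T                       # corr[i, j]: decoded i vs. true j
    diag = np.diag(corr)
    # For each i, count distractors j with corr[i, i] > corr[i, j]
    # (the diagonal compared with itself is False, so it is excluded).
    wins = (diag[:, None] > corr).sum(axis=1)
    return wins.mean() / (n - 1)

rng = np.random.default_rng(0)
true = rng.standard_normal((30, 64))
decoded = true + 0.8 * rng.standard_normal((30, 64))
acc = pairwise_identification(decoded, true)
print(0.0 <= acc <= 1.0)  # True
```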
Then, navigate to the identification_accuracy directory and run:
```
python recon_image_eval.py
python recon_image_eval_dnn.py
```

To plot Figure 3F with the provided results, use the following command:
```
python plot_figure.py
```

Wang, H., Ho, J. K., Cheng, F. L., Aoki, S. C., Muraki, Y., Tanaka, M., Park, J.-Y., & Kamitani, Y. (2025). Inter-individual and inter-site neural code conversion without shared stimuli. Nature Computational Science, 5, 534–546. https://doi.org/10.1038/s43588-025-00826-5