
FPGA-based acceleration of Convolutional Neural Networks. The project is developed in Verilog for the Altera DE5-Net platform.


frankhoff/FPGA_Based_CNN

 
 



  • Open DE5Net_Conv_Accelerator/DE5Net_Conv_Accelerator.qsf with Quartus v16.0.0 on Computer 1

    • Generate the Qsys system mem_system
    • Perform a full compilation (a command-line sketch of this flow is shown after this list)
    • Use Computer 1 to program the FPGA device installed in the host computer (Computer 2) via the USB-Blaster cable. NOTE: this will cause the host computer's kernel to reset, and it will automatically reboot, so save all your work before proceeding.
    • Reboot the host computer to train the PCIe link with the FPGA board.
  • Install the PCI Express Linux driver on the host PC (Computer 2)

    • Program the FPGA and reboot, as described above, before performing the installation
    • Installation instructions are provided in the README
  • Refer to the Using the User Application PDF for instructions on data transfer.
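The bullets above describe the Quartus GUI flow. As a rough command-line equivalent on Computer 1, here is a minimal sketch, assuming the Quartus Prime 16.0 tools are on the PATH and the commands are run from the DE5Net_Conv_Accelerator/ directory; the mem_system.qsys file name, the output_files/ .sof location, and the USB-Blaster cable name are assumptions that depend on your project settings and setup:

    :: Generate the Qsys system mem_system
    qsys-generate mem_system.qsys --synthesis=VERILOG

    :: Perform a full compilation of the Quartus project
    quartus_sh --flow compile DE5Net_Conv_Accelerator

    :: Program the FPGA installed in the host computer over the USB-Blaster cable
    quartus_pgm -c USB-Blaster -m jtag -o "p;output_files/DE5Net_Conv_Accelerator.sof"

After programming, reboot the host computer as described above so that the PCIe link is trained.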

This project is an FPGA-based implementation of the first convolutional layer of AlexNet. The accelerator is developed in Verilog.

Contact

This project is currently being developed by Mr. Sachin Kumawat. If you have questions or need further information, please contact him. Email: [email protected]

Citing

Our research group is working on the acceleration of deep CNNs. If this project helped your research, please cite our latest conference paper:

Mohammad Motamedi, Philipp Gysel, Venkatesh Akella and Soheil Ghiasi, "Design Space Exploration of FPGA-Based Deep Convolutional Neural Networks," IEEE/ACM Asia and South Pacific Design Automation Conference (ASP-DAC), January 2016.

