Symbol1/ChatPPT
Fast Matrix Multiplication That Does Not Slow Down

Abstract

In the world of Pokemon, trainers often rely on their trusty Pokemon to battle against other trainers and their Pokemon. However, when a trainer wants to defeat a particularly powerful opponent, they may need to combine the strengths of multiple Pokemon to overcome their enemy. This is similar to how distributed matrix multiplication (DMM) works. In DMM, multiple machines work together to quickly calculate the product of two matrices. But just like a Pokemon trainer, it's important to have a backup plan in case one of the machines fails or loses connection. That's why we propose a hybrid algorithm that combines DMM with fast matrix multiplication (FMM). This combination allows us to take the best of both worlds, providing both speed and reliability. By analyzing the tensor product code structure of the hybrid algorithm, we can prove that it has a positive error exponent, meaning it can withstand errors and continue to function effectively. So just like a trainer would carefully select the right Pokemon for the job, we've chosen the perfect combination of DMM and FMM to quickly and reliably calculate matrix products.
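The paper's actual hybrid construction is not reproduced here, but the "backup plan" idea in the abstract can be illustrated with a classic checksum-coded matrix product (a rough ABFT-style sketch of my own, assuming exact integer arithmetic; the encoding and function names below are illustrative, not from the talk): appending a row-sum row to A and a column-sum column to B makes the product carry checksums, so a corrupted entry from a faulty machine is detected.

```python
# Illustrative sketch (not the paper's construction): checksum coding
# for a matrix product. Augmenting A with a checksum row and B with a
# checksum column makes A @ B carry row and column checksums, so a
# single corrupted entry in the product can be detected.

def matmul(A, B):
    # Plain triple-loop product on lists of lists.
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def encode(A, B):
    # Append a column-sums row to A and a row-sums column to B.
    A_enc = A + [[sum(col) for col in zip(*A)]]
    B_enc = [row + [sum(row)] for row in B]
    return A_enc, B_enc

def check(C_enc):
    # Verify every row checksum and every column checksum of the
    # encoded product; returns False if any entry was corrupted.
    n, p = len(C_enc) - 1, len(C_enc[0]) - 1
    row_ok = all(sum(C_enc[i][:p]) == C_enc[i][p] for i in range(n + 1))
    col_ok = all(sum(C_enc[i][j] for i in range(n)) == C_enc[n][j]
                 for j in range(p + 1))
    return row_ok and col_ok

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_enc = matmul(*encode(A, B))
assert check(C_enc)       # clean product passes the checks
C_enc[0][0] += 1          # simulate a faulty worker's output
assert not check(C_enc)   # the corruption is detected
```

The point of the hybrid algorithm is to get this kind of protection without giving up the savings of fast matrix multiplication; the tensor-product code structure mentioned in the abstract is what makes the two compatible.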

Context

This is a talk I gave at Duke University on December 12, 2022. At the time, ChatGPT had just become a thing, so I formatted the slides as if I were asking ChatGPT to prepare the talk. The abstract above was also generated using ChatGPT.

The talk is based on my joint work with Duursma: Parity-Checked Strassen Algorithm.
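The fast-matrix-multiplication ingredient is Strassen's classic scheme, which multiplies 2-by-2 blocks with 7 multiplications instead of 8. A minimal sketch on scalars (this is the standard textbook algorithm, not the parity-checked variant from the paper):

```python
# Strassen's 2x2 scheme: 7 multiplications instead of 8. Applied
# recursively on matrix blocks, this yields O(n^2.81) arithmetic;
# shown here on scalar entries for brevity.

def strassen_2x2(A, B):
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    m1 = (a11 + a22) * (b11 + b22)
    m2 = (a21 + a22) * b11
    m3 = a11 * (b12 - b22)
    m4 = a22 * (b21 - b11)
    m5 = (a11 + a12) * b22
    m6 = (a21 - a11) * (b11 + b12)
    m7 = (a12 - a22) * (b21 + b22)
    return [[m1 + m4 - m5 + m7, m3 + m5],
            [m2 + m4, m1 - m2 + m3 + m6]]

# Matches the ordinary product:
assert strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```

The paper's contribution, roughly, is to interleave parity checks with these 7 products so the recursion tolerates faulty machines.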
