If this actually did lead to faster matrix multiplication, then essentially anything that can be done on a GPU would benefit. That definitely could include games and physics models, along with a bunch of other applications (and yes, also AI stuff).
I’m sure the paper’s authors know all of that, but somewhere along the line the article just became "faster and better AI".