Reworkin 11 months ago
"Brain floating point" is a cool name. I guess the brain's synaptic precision could go way lower, though — as low as roughly 26 distinct synapse weights:

> A particularly interesting research direction puts these three trends together, with a system running on large-scale ML accelerator hardware, with a goal of being able to train a model that can perform thousands or millions of tasks in a single model. Such a model might be made up of many different compo …
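
A rough back-of-the-envelope comparison of the two precisions mentioned above (the ~26 synaptic levels figure is the comment's assumption; the bfloat16 layout — 1 sign, 8 exponent, 7 stored mantissa bits — is the standard format):

```python
import math

# ~26 distinguishable synaptic strength levels (assumption from the comment above)
synapse_levels = 26
synapse_bits = math.log2(synapse_levels)  # information per synapse, in bits

# bfloat16 significand: 7 stored mantissa bits + 1 implicit leading bit
bf16_significand_bits = 7 + 1

print(f"synapse: ~{synapse_bits:.2f} bits  vs  bf16 significand: {bf16_significand_bits} bits")
```

So a synapse at ~26 levels carries about 4.7 bits, noticeably below even bfloat16's 8-bit significand — consistent with the comment's point that "brain precision" could go lower still.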