
MIT finds smaller neural networks that are easier to train

To train most neural networks, engineers feed them massive datasets, but that can take days and require expensive GPUs. Researchers from MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) found that within those trained networks are smaller subnetworks that can make equally accurate predictions. CSAIL’s so-called ‘lottery-ticket hypothesis’ is based on the idea that […]
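The basic loop behind the lottery-ticket idea is to train the full network, prune away the smallest-magnitude weights, rewind the surviving weights to their original initial values, and retrain the resulting sparse subnetwork. The sketch below illustrates one round of that loop on a toy PyTorch model; the model size, random data, and 20% keep-ratio are illustrative assumptions, not the CSAIL researchers’ actual setup.

```python
# A minimal sketch of the magnitude-pruning-and-rewind idea behind the
# lottery-ticket hypothesis. Toy model and random data are assumptions
# for illustration only.
import copy
import torch
import torch.nn as nn

torch.manual_seed(0)
X, y = torch.randn(256, 20), torch.randn(256, 1)

model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 1))
init_state = copy.deepcopy(model.state_dict())   # remember the original init

def train(net, masks=None, steps=200):
    opt = torch.optim.SGD(net.parameters(), lr=0.05)
    for _ in range(steps):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(X), y)
        loss.backward()
        opt.step()
        if masks is not None:                    # keep pruned weights at zero
            with torch.no_grad():
                for name, p in net.named_parameters():
                    if name in masks:
                        p.mul_(masks[name])
    return loss.item()

# 1. Train the full, over-parameterized network.
dense_loss = train(model)

# 2. Prune: keep only the largest-magnitude 20% of each weight matrix.
masks = {}
for name, p in model.named_parameters():
    if p.dim() > 1:                              # prune weight matrices only
        threshold = p.abs().flatten().quantile(0.8)
        masks[name] = (p.abs() >= threshold).float()

# 3. Rewind the surviving weights to their initialization (the "winning
#    ticket") and retrain the sparse subnetwork.
model.load_state_dict(init_state)
with torch.no_grad():
    for name, p in model.named_parameters():
        if name in masks:
            p.mul_(masks[name])

sparse_loss = train(model, masks=masks)
print(f"dense loss: {dense_loss:.4f}  sparse loss: {sparse_loss:.4f}")
```

In this sketch the subnetwork keeps only a fifth of the original weights yet is trained with the same procedure as the dense model, which is the comparison the hypothesis is concerned with.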