Abstract: This paper presents a method for adding multiple tasks to a single deep neural network while avoiding catastrophic forgetting. Inspired by network pruning techniques, we exploit redundancies ...
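The pruning-based idea the abstract describes can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's actual procedure: it shows magnitude-based pruning, where the smallest-magnitude weights are masked out, freeing those parameters to be retrained for a subsequent task while the kept weights preserve the current task.

```python
import numpy as np

def prune_mask(weights, keep_frac=0.5):
    """Magnitude-based pruning sketch (illustrative; the paper's exact
    procedure may differ). Returns a boolean mask: True = weight kept
    for the current task, False = weight freed for future tasks."""
    flat = np.abs(weights).ravel()
    k = max(1, int(len(flat) * keep_frac))
    threshold = np.partition(flat, -k)[-k]  # k-th largest magnitude
    return np.abs(weights) >= threshold

# Hypothetical 2x2 weight matrix: the two small entries get freed.
w = np.array([[0.9, -0.05], [0.02, -0.7]])
mask = prune_mask(w, keep_frac=0.5)
# mask.tolist() -> [[True, False], [False, True]]
# False entries can be overwritten when training the next task.
```

Iterating this prune-and-retrain cycle per task is, at a high level, how redundant capacity in one network can be packed with multiple tasks without overwriting the weights earlier tasks depend on.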