Try label smoothing as described in section 7.5.1 of the Deep Learning Book:
We can assume that for some small constant eps, the training-set label y is correct with probability 1 - eps, and otherwise any of the other possible labels may be correct.
Label smoothing regularizes a model based on a softmax with k output values by replacing the hard 0 and 1 classification targets with targets of eps / (k - 1) and 1 - eps, respectively.
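As a minimal sketch (using NumPy, which I'm assuming here), the smoothed targets can be built directly from one-hot labels: the correct class gets 1 - eps and each of the other k - 1 classes gets eps / (k - 1):

```python
import numpy as np

def smooth_labels(y_onehot, eps=0.1):
    """Label smoothing per Deep Learning Book sec. 7.5.1:
    correct class -> 1 - eps, each wrong class -> eps / (k - 1)."""
    k = y_onehot.shape[-1]
    return y_onehot * (1.0 - eps) + (1.0 - y_onehot) * eps / (k - 1)

hard = np.array([[0.0, 0.0, 1.0]])   # one-hot target, k = 3
soft = smooth_labels(hard, eps=0.1)
# correct class becomes 0.9, the others 0.05 each; each row still sums to 1
```

You would then train against `soft` with the usual cross-entropy loss instead of the hard one-hot targets.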
Check out my question about implementing label smoothing in Pandas.
Alternatively, if you know for sure that some regions are negative, others are positive, and some are undefined, you can introduce a third "undefined" class. I have worked with datasets that contained an undefined class for patterns that could belong to any of the available classes.