There is a lot of pornographic content on YouTube. We want to filter it out of results from the YouTube search API, but the only way to do this seems to be safeSearch=strict.
One problem is that strict mode also filters out songs containing explicit language. We are not opposed to adult language; we only want to filter out pornography.
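For reference, safeSearch is a parameter of the search.list endpoint in Data API v3. A minimal sketch of building such a request (the API key and query are placeholders):

```python
import urllib.parse

def build_search_url(query, safe_search="strict", api_key="YOUR_API_KEY"):
    """Build a YouTube Data API v3 search.list URL.

    safeSearch accepts "none", "moderate", or "strict";
    api_key here is a placeholder, not a real key.
    """
    params = {
        "part": "snippet",
        "type": "video",
        "q": query,
        "safeSearch": safe_search,
        "key": api_key,
    }
    return ("https://www.googleapis.com/youtube/v3/search?"
            + urllib.parse.urlencode(params))

url = build_search_url("some song title", safe_search="strict")
```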
The frustrating part is that if I use safeSearch=moderate or safeSearch=none, the response data from YouTube says absolutely nothing about whether a particular video is adult content. In most cases a rating element (media:rating in the old GData API) simply does not exist.
So how does YouTube know how to filter this media when I use strict mode, but not how to tell me that a video is adult content when I use moderate mode?
To restate: we want to filter out pornography, but not obscene language. How can this be done when the YouTube Data API does not report this information reliably?
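One partial signal that does exist in Data API v3 (this is my assumption about a workaround, not something guaranteed to catch all pornography): a videos.list call with part=contentDetails returns contentRating.ytRating set to "ytAgeRestricted" for age-restricted videos. When the field is absent, the API tells you nothing, which is exactly the problem described above. A sketch of post-filtering a videos.list response this way:

```python
def split_by_age_restriction(videos_response):
    """Split a videos.list response (part=contentDetails) into
    (allowed_ids, restricted_ids).

    "ytAgeRestricted" is the only value ytRating takes; an absent
    field means the API gives no information either way.
    """
    allowed, restricted = [], []
    for item in videos_response.get("items", []):
        rating = item.get("contentDetails", {}).get("contentRating", {})
        if rating.get("ytRating") == "ytAgeRestricted":
            restricted.append(item["id"])
        else:
            allowed.append(item["id"])
    return allowed, restricted

# Example shape of a videos.list response (ids are made up):
sample = {
    "items": [
        {"id": "abc123",
         "contentDetails": {"contentRating": {"ytRating": "ytAgeRestricted"}}},
        {"id": "def456",
         "contentDetails": {"contentRating": {}}},
    ]
}
# split_by_age_restriction(sample) → (["def456"], ["abc123"])
```

Note that "age-restricted" covers more than pornography (e.g. violence), so this over-filters in a different way than safeSearch=strict does; it only narrows the problem.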