I am learning neural networks and I want to write a function cross_entropy in Python, where the cross entropy is defined as

CE = -\frac{1}{N} \sum_{i=1}^{N} \sum_{j=1}^{k} t_{i,j} \log(p_{i,j})

where N is the number of samples, k is the number of classes, log is the natural logarithm, t_{i,j} is 1 if sample i is in class j and 0 otherwise, and p_{i,j} is the predicted probability that sample i is in class j. To avoid numerical problems with the logarithm, clip the predictions to the range [10^{-12}, 1 - 10^{-12}].
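To make sure I understand the formula, here is a literal, loop-based transcription of the double sum (just my own sketch; cross_entropy_loops is a throwaway name, not part of the task):

import numpy as np

def cross_entropy_loops(predictions, targets, epsilon=1e-12):
    # clip to [epsilon, 1 - epsilon] to avoid log(0), as the description asks
    p = np.clip(predictions, epsilon, 1. - epsilon)
    N, k = p.shape
    total = 0.0
    for i in range(N):        # outer sum over the N samples
        for j in range(k):    # inner sum over the k classes
            total += targets[i, j] * np.log(p[i, j])
    return -total / N         # the leading -(1/N) factor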
Following the above description, I wrote the code below: it clips the predictions to the range [epsilon, 1 - epsilon] and then computes the cross entropy from the formula above.
import numpy as np

def cross_entropy(predictions, targets, epsilon=1e-12):
    """
    Computes cross entropy between targets (encoded as one-hot vectors)
    and predictions.

    Input:   predictions (N, k) ndarray
             targets     (N, k) ndarray
    Returns: scalar
    """
    predictions = np.clip(predictions, epsilon, 1. - epsilon)
    ce = -np.mean(np.log(predictions) * targets)
    return ce
The following code is used to check whether the function cross_entropy is correct:
predictions = np.array([[0.25, 0.25, 0.25, 0.25],
                        [0.01, 0.01, 0.01, 0.96]])
targets = np.array([[0, 0, 0, 1],
                    [0, 0, 0, 1]])
ans = 0.71355817782  # correct answer for this test case
x = cross_entropy(predictions, targets)
print(np.isclose(x, ans))
The output of the above code is False, which means my code for cross_entropy is not correct. So I printed the result of cross_entropy(predictions, targets): it gives 0.178389544455, while the correct result should be ans = 0.71355817782. Could anybody help me check what is wrong with my code?
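For reference, here is my own hand check of where the expected answer seems to come from: since the targets are one-hot, only the true-class probabilities (0.25 and 0.96) contribute, averaged over the N = 2 samples.

import numpy as np

# hand check: -(1/N) * (log p for the true class of each sample)
expected = -(np.log(0.25) + np.log(0.96)) / 2
print(expected)  # 0.7135581778200728, i.e. the given ans = 0.71355817782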