I am trying to implement a minimal neural network in TensorFlow using its default MNIST dataset.
```python
from __future__ import print_function
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets('MNIST_data', one_hot=True)

def compute_accuracy(v_xs, v_ys):
    global prediction
    y_pre = sess.run(prediction, feed_dict={xs: v_xs, keep_prob: 1})
    correct_prediction = tf.equal(tf.argmax(y_pre, 1), tf.argmax(v_ys, 1))
    accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
    result = sess.run(accuracy, feed_dict={xs: v_xs, ys: v_ys, keep_prob: 1})
    return result

def weight_variable(shape):
    initial = tf.truncated_normal(shape, stddev=0.1)
    return tf.Variable(initial)

def bias_variable(shape):
    initial = tf.constant(0.1, shape=shape)
    return tf.Variable(initial)

def conv2d(x, W):
```
When the Python script runs, it aborts with this message:

```
terminate called after throwing an instance of 'std::bad_alloc'
  what():  std::bad_alloc
```
I was able to determine that this happens when I call the `compute_accuracy` function, or more generally whenever I load all of the `mnist.test` images and labels at once. Any suggestions on what can be done, given that I want to use this dataset? Otherwise I could work with other images instead.
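Since the crash appears when the entire test set (10,000 images plus all intermediate activations) is fed through the graph at once, one common workaround is to evaluate accuracy in mini-batches and aggregate the per-batch results, so only one batch is resident in memory at a time. Below is a framework-agnostic sketch of that idea using NumPy only; `predict_fn` is a hypothetical stand-in for the `sess.run(prediction, feed_dict={...})` call in the code above, not something from the original post.

```python
import numpy as np

def batched_accuracy(predict_fn, images, labels, batch_size=500):
    """Compute classification accuracy over a dataset in chunks.

    predict_fn maps a batch of images to class scores (a stand-in for
    sess.run(prediction, feed_dict=...) in the question's code).
    Only one batch of predictions lives in memory at a time.
    """
    correct = 0
    for start in range(0, len(images), batch_size):
        xb = images[start:start + batch_size]
        yb = labels[start:start + batch_size]
        scores = predict_fn(xb)
        # Count matches between predicted and one-hot true classes.
        correct += np.sum(np.argmax(scores, axis=1) == np.argmax(yb, axis=1))
    return correct / len(images)
```

With the graph from the question, `predict_fn` could be `lambda xb: sess.run(prediction, feed_dict={xs: xb, keep_prob: 1})`. Note also that `compute_accuracy` as written builds new `tf.equal`/`tf.reduce_mean` nodes on every call, growing the graph each time it is invoked; doing the comparison in NumPy, as above, avoids that as well.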