I work with Keras 2.0.0, and I would like to train a deep model with a huge number of parameters on a GPU. Since my data is large, I have to use ImageDataGenerator. To be honest, I want to abuse ImageDataGenerator in the sense that I do not want to perform any augmentations. I just want to put my training images into batches (and rescale them) so that I can feed them to model.fit_generator.
I adapted the code from here and made some small changes according to my data (i.e. changing the binary classification to a categorical one, but this does not matter for the problem discussed here). I have 15,000 training images, and the only "augmentation" I want to perform is rescaling the pixel values to [0, 1] via train_datagen = ImageDataGenerator(rescale=1./255). After creating my "train_generator":
train_generator = train_datagen.flow_from_directory(
    train_data_dir,
    target_size=(img_width, img_height),
    batch_size=batch_size,
    class_mode='categorical',
    shuffle=True,
    seed=1337,
    save_to_dir=save_data_dir)
I fit the model using model.fit_generator().
I set the number of epochs: epochs = 1
And the batch size: batch_size = 60
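For completeness, the call looks roughly like this (a simplified sketch; nb_train_samples is just my name for the 15,000 training images, and steps_per_epoch is how I derive the number of batches per epoch):

nb_train_samples = 15000
steps_per_epoch = nb_train_samples // batch_size  # 15000 // 60 = 250 batches per epoch

model.fit_generator(
    train_generator,
    steps_per_epoch=steps_per_epoch,
    epochs=epochs)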
What I expect to see in the directory where my augmented (i.e. rescaled) images are saved: 15,000 rescaled images per epoch, i.e. with one epoch: 15,000 rescaled images. But, mysteriously, there are 15,250 images.
Is there a reason for this number of images?
Can I control how many augmented images are saved?
Similar question:
fit_generator (Stack Overflow: Keras - How are batches and epochs used in fit_generator()?)