I ran into a similar problem with Keras 1.2.1 on the tensorflow-gpu backend.
I found out that this was because Windows 10 had problems with the encoding of the forward slash character.
Using the Lambda layer caused the to_json()
call to fail, but switching to batch normalization works just fine.
from keras.models import Sequential
from keras.layers import BatchNormalization, Lambda

model = Sequential()
# The Lambda-based normalization layer that broke to_json():
# model.add(Lambda(lambda x: x / 255. - .5, input_shape=INPUT_DIMENSIONS))
# Replacement: let BatchNormalization scale the inputs instead.
model.add(BatchNormalization(input_shape=INPUT_DIMENSIONS, axis=1))
# . . . (remaining layers)

# POST PROCESSING, SAVE MODEL TO DISK
with open('model.json', 'w') as json_file:
    json_file.write(model.to_json())
Not an ideal solution, but hopefully it works for those looking at it in the future.
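As a quick sanity check, here is a minimal sketch (assuming the model.json file written above) that reloads the architecture with model_from_json to confirm the serialization round-trips once the Lambda layer is gone:

from keras.models import model_from_json

# Reload the architecture that was just written to disk; if the Lambda
# layer were still present, this is where deserialization would break.
with open('model.json') as json_file:
    reloaded_model = model_from_json(json_file.read())
reloaded_model.summary()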