Error "There is no such file or directory" after sending the training task

I am doing:

gcloud beta ml jobs submit training ${JOB_NAME} --config config.yaml

and after about 5 minutes the job fails with this error:

Traceback (most recent call last):
  File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
    "__main__", fname, loader, pkg_name)
  File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
    exec code in run_globals
  File "/root/.local/lib/python2.7/site-packages/trainer/task.py", line 232, in <module>
    tf.app.run()
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/python/platform/app.py", line 30, in run
    sys.exit(main(sys.argv[:1] + flags_passthrough))
  File "/root/.local/lib/python2.7/site-packages/trainer/task.py", line 228, in main
    run_training()
  File "/root/.local/lib/python2.7/site-packages/trainer/task.py", line 129, in run_training
    data_sets = input_data.read_data_sets(FLAGS.train_dir, FLAGS.fake_data)
  File "/usr/local/lib/python2.7/dist-packages/tensorflow/contrib/learn/python/learn/datasets/mnist.py", line 212, in read_data_sets
    with open(local_file, 'rb') as f:
IOError: [Errno 2] No such file or directory: 'gs://my-bucket/mnist/train/train-images.gz'

The strange thing is that, as far as I can tell, the file does exist at that URL.

2 answers

This error usually indicates that you are using a multi-regional GCS bucket for your output. To avoid this error, use a regional GCS bucket instead. Regional buckets provide stronger consistency guarantees, which prevents these errors.

For more information on properly configuring GCS buckets for Cloud ML, see the Cloud ML Docs.
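
If you want to check whether an existing bucket is regional, here is a minimal sketch, assuming the google-cloud-storage client library is installed and that my-bucket is the bucket from the error message:

from google.cloud import storage

# Print the bucket's location: a regional bucket reports a single region
# such as 'US-CENTRAL1', while a multi-regional bucket reports e.g. 'US'.
client = storage.Client()
bucket = client.get_bucket('my-bucket')
print(bucket.location)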


IO , GCS gs://. :

from tensorflow.python.lib.io import file_io

# Open the training file directly from GCS and hand the stream to the model.
first_data_file = args.train_files[0]
file_stream = file_io.FileIO(first_data_file, mode='r')

# run experiment
model.run_experiment(file_stream)
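
As an aside of my own (not part of the original answer): if you only need the file contents in memory rather than an open stream, file_io also has a one-shot helper:

from tensorflow.python.lib.io import file_io

# Read an entire GCS object into a string in one call.
contents = file_io.read_file_to_string('gs://my-bucket/mnist/train/train-images.gz')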

Some libraries cannot read from a gs:// path at all. In that case, copy the file from GCS to a local path first and use the local copy, for example:

# presentation_mplstyle_path is a string holding the gs:// URL of the style file
with file_io.FileIO(presentation_mplstyle_path, mode='r') as input_f:
    with file_io.FileIO('presentation.mplstyle', mode='w+') as output_f:
        output_f.write(input_f.read())

mpl.pyplot.style.use(['./presentation.mplstyle'])

The same trick works in the other direction: to get a locally written file into a gs:// bucket, copy it to the job directory on GCS:

# report_name is a local file; job_dir is a gs:// path (e.g. the --job-dir of the training job)
with file_io.FileIO(report_name, mode='r') as input_f:
    with file_io.FileIO(job_dir + '/' + report_name, mode='w+') as output_f:
        output_f.write(input_f.read())
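
Applied to the question above, a sketch of the same idea (my own suggestion, not part of the original answer; the copy_from_gcs helper and the /tmp/mnist directory are made up for illustration): copy the data files from the gs:// bucket to a local directory, then point read_data_sets at that directory, since mnist.py opens the files with the plain built-in open():

import os
from tensorflow.python.lib.io import file_io

def copy_from_gcs(gcs_path, local_dir):
    # Copy one GCS object to local_dir and return the local path.
    local_path = os.path.join(local_dir, os.path.basename(gcs_path))
    with file_io.FileIO(gcs_path, mode='rb') as input_f:
        with file_io.FileIO(local_path, mode='wb') as output_f:
            output_f.write(input_f.read())
    return local_path

local_dir = '/tmp/mnist'
if not os.path.exists(local_dir):
    os.makedirs(local_dir)

# Copy the file from the error message; the other data files would be copied the same way.
copy_from_gcs('gs://my-bucket/mnist/train/train-images.gz', local_dir)

# Read from the local directory instead of the gs:// path.
data_sets = input_data.read_data_sets(local_dir, FLAGS.fake_data)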

Hope this helps.


Source: https://habr.com/ru/post/1656254/

