As the title implies, I am running Jupyter inside a Docker container, and I get an OSError from Python in the scikit-learn / numpy code at the following line:
```python
pickler.file_handle.write(chunk.tostring('C'))
```
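For context, I believe an explicit `joblib.dump` of a large array exercises the same chunked write path; this is only a minimal sketch, and the array size and output path are placeholders, not my actual workload:

```python
# Hypothetical minimal reproduction: dumping a large NumPy array with joblib
# goes through the same pickler write path that raises the OSError for me.
import numpy as np
import joblib

big_array = np.random.rand(10_000, 10_000)       # ~760 MiB of float64 data (placeholder size)
joblib.dump(big_array, "/tmp/big_array.joblib")  # chunked writes via file_handle.write(...)
```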
I have done some troubleshooting, and in most of the cases people report, the cause is that their hard drive or RAM has actually run out of space, which is not the case for me, AFAIK.
This is what my df looks like:
```
Filesystem                    1K-blocks       Used   Available Use% Mounted on
udev                           16419976          0    16419976   0% /dev
tmpfs                           3288208      26320     3261888   1% /run
/dev/sdb7                     125996884   72177548    47395992  61% /
tmpfs                          16441036     238972    16202064   2% /dev/shm
tmpfs                              5120          4        5116   1% /run/lock
tmpfs                          16441036          0    16441036   0% /sys/fs/cgroup
/dev/sdb2                         98304      32651       65653  34% /boot/efi
tmpfs                           3288208         68     3288140   1% /run/user/1000
//192.168.1.173/ppo-server3 16864389368 5382399064 11481990304  32% /mnt/ppo-server3
```
And here is what my free output looks like:
```
              total        used        free      shared  buff/cache   available
Mem:       32882072     7808928    14265280      219224    10807864    24357276
Swap:        976892      684392      292500
```
Am I looking at the right df and free outputs? Both commands were run from a bash shell inside the container.
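As a cross-check, I could also query free space from Python inside the container with the standard library's `shutil.disk_usage`; the paths below are only guesses at where the pickler's write might land:

```python
# Hypothetical sanity check from inside the container: report free space at a
# few mount points the write could be hitting (paths are placeholders).
import shutil

for path in ("/", "/dev/shm", "/tmp"):
    usage = shutil.disk_usage(path)
    print(f"{path}: {usage.free / 2**30:.1f} GiB free of {usage.total / 2**30:.1f} GiB")
```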