At work, we have an application for playing back 2K (2048 × 1556 px) OpenEXR film sequences. It works well, but as soon as a sequence exceeds 3 GB (which happens quite often), it has to unload old frames from memory, even though all of the machines have 8-16 GB of RAM (addressable by the kernel via Linux bigmem/PAE support).
Frames must be cached in memory for real-time playback. The OS is a several-year-old 32-bit Fedora distro (it cannot be upgraded to 64-bit in the foreseeable future), so each process is limited to a 3 GB address space.
So, basically: is it possible to cache more than 3 GB of data in memory? My first idea was to distribute the data across several processes, but I have no idea whether that is feasible.
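To illustrate the kind of thing I was imagining (this is only a rough, untested sketch, not our actual player code): the cache could live outside the player's own address space, for example in named POSIX shared-memory objects backed by tmpfs, filled either by helper processes or by the player itself. The total cache is then limited by physical RAM rather than by the 3 GB per-process address space, because the player only maps the handful of frames it currently needs. The frame size, object names, and functions below are made up for the example.

```c
/* Sketch: keep decoded frames in POSIX shared memory (tmpfs, i.e. RAM),
 * one object per frame, so the total cache can exceed 3 GB even though
 * no single process ever maps more than a few frames at once.
 * Link with -lrt on older glibc. Names and sizes are illustrative only. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

#define FRAME_BYTES (2048u * 1556u * 4u * 2u)  /* 2K RGBA, 16-bit channels (assumed) */

/* Store one decoded frame in a shared-memory object named after its index. */
static int cache_store_frame(int index, const void *pixels)
{
    char name[64];
    snprintf(name, sizeof name, "/exr_cache_%06d", index);

    int fd = shm_open(name, O_CREAT | O_RDWR, 0600);
    if (fd < 0) return -1;
    if (ftruncate(fd, FRAME_BYTES) < 0) { close(fd); return -1; }

    void *dst = mmap(NULL, FRAME_BYTES, PROT_WRITE, MAP_SHARED, fd, 0);
    close(fd);
    if (dst == MAP_FAILED) return -1;

    memcpy(dst, pixels, FRAME_BYTES);
    /* After munmap the frame stays resident in RAM (in the tmpfs object),
     * but no longer consumes this process's address space. */
    munmap(dst, FRAME_BYTES);
    return 0;
}

/* Map a cached frame read-only for playback; caller munmap()s it when done. */
static const void *cache_map_frame(int index)
{
    char name[64];
    snprintf(name, sizeof name, "/exr_cache_%06d", index);

    int fd = shm_open(name, O_RDONLY, 0);
    if (fd < 0) return NULL;

    void *src = mmap(NULL, FRAME_BYTES, PROT_READ, MAP_SHARED, fd, 0);
    close(fd);
    return src == MAP_FAILED ? NULL : src;
}
```

(I assume /dev/shm would need to be mounted large enough, since tmpfs defaults to half of RAM, and that the kernel is a PAE/bigmem build so it can actually use the 8-16 GB.) Is something along these lines workable, or is there a better-established way to do it?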