Page 200 of the fourth edition of Robert Sedgewick's book Algorithms says: “For example, if you have 1 GB of memory on your computer (1 billion bytes), you cannot store more than 32 million int values.”
This confused me, because my own calculation gives a different number: 1,000,000,000 bytes / 4 bytes per int = 250 million int values.
How did the author get 32 million?
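To make my arithmetic concrete, here is the same calculation as a short Java sketch (Java is the book's language); the class name IntCapacity is just illustrative:

```java
public class IntCapacity {
    public static void main(String[] args) {
        long memoryBytes = 1_000_000_000L;  // 1 GB taken as 1 billion bytes, as in the book
        long bytesPerInt = Integer.BYTES;   // a Java int occupies 4 bytes
        // 1,000,000,000 / 4 = 250,000,000 -- not 32 million
        System.out.println(memoryBytes / bytesPerInt + " int values");
    }
}
```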