Unique UUIDs from multiple processes on the same machine

To tag the data we create, I am considering using UUIDs. Security is not a concern, so I was going to use version 1 (timestamp plus MAC address). My only worry is that each user can create multiple data files at once from different processes, each with multiple threads. Even assuming the Python uuid library is thread-safe (although it doesn't look like it is), that still leaves the problem of multiple processes. I am considering suffixing the UUID with a dash and a process number.
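The suffix idea described above could be sketched roughly like this (a hypothetical helper, not a recommendation; `tagged_uuid1` is a made-up name):

```python
import os
import uuid

def tagged_uuid1() -> str:
    """Append the process ID to a version-1 UUID, so that two
    processes on the same machine generating a uuid1 in the same
    clock tick still produce distinct tags."""
    return f"{uuid.uuid1()}-{os.getpid()}"

print(tagged_uuid1())  # e.g. "9f0c...-...-...-...-0242ac110002-12345"
```

Note that the result is no longer a valid UUID, only a UUID-prefixed string, which matters if any downstream tooling expects to parse it.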

Since our group has little experience with UUIDs, are there any pitfalls I need to keep in mind? How is the multi-process case normally handled?

1 answer

Just use uuid4 for completely random UUIDs. With 122 random bits per identifier, there is no need to worry about collisions, and no coordination between threads or processes is required.
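A minimal sketch of the answer's suggestion: each thread or process calls `uuid.uuid4()` independently, with no shared state and no suffix scheme needed.

```python
import uuid

# uuid4 is built on os.urandom, so every call is independent:
# different threads and processes need no coordination at all.
ids = {uuid.uuid4() for _ in range(100_000)}

# With 122 random bits, duplicates within any realistic data set
# are astronomically unlikely.
assert len(ids) == 100_000
print(next(iter(ids)))
```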

Edit, in response to the comment: in my experience, redundant data sooner or later leads to inconsistencies. There is a reason why avoiding redundancy is dogma in relational database design.

Therefore, do not use the UUID as a "backup copy" of the actual "source machine" and "timestamp" data. Either use it as an opaque unique identifier that carries no other information, or do not use it at all.
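In concrete terms, that advice means storing the machine and timestamp as fields in their own right, next to an opaque ID. A hypothetical record layout (field names are illustrative, not from the original question):

```python
import socket
import time
import uuid

# The UUID is an opaque key; provenance lives in explicit fields
# instead of being decoded out of a version-1 UUID later.
record = {
    "id": str(uuid.uuid4()),       # opaque, carries no information
    "host": socket.gethostname(),  # source machine, stored directly
    "created_at": time.time(),     # timestamp, stored directly
}
print(record)
```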


Source: https://habr.com/ru/post/1724317/
