How does SQL determine the length of a character in varchar?

After reading the documentation, I understand that a VARCHAR (character varying) value is stored with a one- or two-byte length prefix that records how long the value is. I also understand that in a VARCHAR each character can occupy a different number of bytes, depending on the character itself.

So my question is:

How does the DBMS determine the length of each character after the value has been saved?

Meaning: suppose I save a string 4 characters long, where the first character is 1 byte, the second 2 bytes, the third 3 bytes and the fourth 4 bytes. How does the database know how many bytes each character occupies, so that it can read the string back correctly?

I hope the question is clear. Sorry for any English mistakes I made. Thanks.


In UTF-8 the first byte of a character tells you how many bytes the character occupies (a single-byte character carries only 7 bits of actual data).

If the first bit is 0, it is a 1-byte character (that is, one of the 128 ASCII characters).

If the first bit is 1, the number of leading 1 bits tells you how many bytes the character occupies.

Bytes | Byte 1   | Byte 2   | Byte 3   | Byte 4
1     | 0xxxxxxx |          |          |
2     | 110xxxxx | 10xxxxxx |          |
3     | 1110xxxx | 10xxxxxx | 10xxxxxx |
4     | 11110xxx | 10xxxxxx | 10xxxxxx | 10xxxxxx

https://en.wikipedia.org/wiki/UTF-8
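
To make the rule concrete, here is a minimal Python sketch (purely illustrative, not how any particular DBMS is implemented; the helper name utf8_char_len is made up) that reads each character's length from its first byte, using a string whose characters take 1, 2, 3 and 4 bytes, as in the question:

    def utf8_char_len(first_byte: int) -> int:
        # How many bytes does the UTF-8 character starting with this byte occupy?
        if first_byte < 0b10000000:    # 0xxxxxxx: one of the 128 ASCII characters
            return 1
        if first_byte >= 0b11110000:   # 11110xxx: start of a 4-byte character
            return 4
        if first_byte >= 0b11100000:   # 1110xxxx: start of a 3-byte character
            return 3
        if first_byte >= 0b11000000:   # 110xxxxx: start of a 2-byte character
            return 2
        raise ValueError("continuation byte (10xxxxxx), not the start of a character")

    data = "aé€😀".encode("utf-8")     # characters of 1, 2, 3 and 4 bytes
    i = 0
    while i < len(data):
        n = utf8_char_len(data[i])
        print(data[i:i + n].decode("utf-8"), "takes", n, "byte(s)")
        i += n

The key point is that continuation bytes always start with the bit pattern 10, so they can never be mistaken for the start of a character.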


This behaviour is specific to variable-length encodings such as UTF-8, where different characters occupy different numbers of bytes. Not every encoding works that way: UTF-32, for example, uses a fixed four bytes for every character, so there is nothing to determine.

UTF-8 is popular because it stays compact for text that consists mostly of ASCII characters. Single-byte encodings such as Latin1 always use 8 bits per character, but they can only represent a very limited set of characters.

The DBMS knows which encoding a column uses, so string functions such as LENGTH() can take it into account; depending on the DBMS, counting characters rather than bytes means the engine has to walk through and decode the stored bytes.

The declared size matters too. VARCHAR(255) restricts the length of the value, and whether that limit is counted in characters or in bytes depends on the DBMS, so the storage actually needed per value can vary. Some systems, such as Postgres, also offer text types that accept very large values without you declaring a maximum length at all.
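
As a small illustration of the character-count versus byte-count distinction (Python here only stands in for what the server does internally; in MySQL, for example, CHAR_LENGTH() counts characters while LENGTH() counts bytes):

    s = "naïve €"                # 7 characters, mixing 1-, 2- and 3-byte ones
    stored = s.encode("utf-8")   # the bytes a UTF-8 column would actually hold

    print(len(s))                # 7  characters (what CHAR_LENGTH() reports in MySQL)
    print(len(stored))           # 10 bytes      (what LENGTH() reports in MySQL)

This is also why, in DBMSs where the limit is counted in characters, a VARCHAR(255) column may need far more than 255 bytes of storage: with an encoding where a character can take up to 4 bytes, the worst case is 4 * 255 bytes plus the length prefix.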


Source: https://habr.com/ru/post/1687828/

