If you are going to define them at all, why not define them as
#define SIZEOF_INT sizeof(int)
etc.? That way they would stay correct even if someone used them when compiling for a different architecture.
I found the answer a minute after asking. These macros are sometimes used in #if conditionals and similar preprocessor expressions, where sizeof cannot be evaluated: the preprocessor runs before the compiler knows anything about types, so the macros must expand to literal integer constants.
For example:
#if SIZEOF_LONG_LONG_INT == 8
/* stuff */
#endif
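As a minimal sketch of why the literal constant matters: below, SIZEOF_LONG is hard-coded for illustration (in practice it would come from an autoconf-generated config.h), and my_int64 is a hypothetical typedef chosen at preprocessing time.

#include <stdio.h>

/* Pretend this line came from config.h generated by configure.
   It has to be a literal integer, because the preprocessor
   cannot evaluate sizeof(long). */
#define SIZEOF_LONG 8

/* #if SIZEOF_LONG == sizeof(long)   <-- typically a preprocessor
   error: "sizeof" is not valid in a #if expression. */

#if SIZEOF_LONG == 8
typedef long my_int64;        /* long is 64-bit on this platform */
#else
typedef long long my_int64;   /* fall back to long long */
#endif

int main(void) {
    my_int64 x = 1;
    printf("sizeof(my_int64) = %zu\n", sizeof x);
    return 0;
}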
Source: https://habr.com/ru/post/1017328/