In general, if you just say "Unix Time" or "Unix Timestamp", it is assumed you are talking in terms of seconds. However, recognize that the POSIX specification does not actually use those terms. Instead, it specifically says "Seconds Since the Epoch", which is defined in Section 4.16 and used throughout the specification. Calling it "Unix Time" is just a colloquialism.
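As a minimal illustration (assuming a JavaScript environment, since that is what the question concerns), a seconds-based Unix timestamp can be derived from the millisecond value JavaScript provides:

```js
// Date.now() returns milliseconds since the Unix epoch (1970-01-01T00:00:00Z).
// Dividing by 1000 and truncating gives the seconds-based value that
// POSIX calls "Seconds Since the Epoch".
const unixTimeSeconds = Math.floor(Date.now() / 1000);
console.log(unixTimeSeconds); // e.g. 1700000000 (illustrative value)
```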
The ECMAScript specification (as of version 8.0, in section 20.3.1.1) does not give the timestamp any terminology other than "milliseconds since 01 January, 1970 UTC".
Therefore, in referring to the timestamps used by JavaScript (and elsewhere), you might call it "Milliseconds Since the (Unix) Epoch", "Unix Time in Milliseconds", or "Unix Timestamp in Milliseconds". There is no more concise term that is universally recognized, either as a colloquialism or as a standard.
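As a sketch of how such a millisecond timestamp typically shows up in JavaScript (the methods below are standard ECMAScript; the printed value is only illustrative):

```js
// Both of these yield the number of milliseconds since the Unix epoch (UTC).
const ms1 = Date.now();
const ms2 = new Date().getTime();

// Converting back: a Date constructed from that number represents the same instant.
console.log(new Date(ms1).toISOString()); // e.g. "2023-11-14T22:13:20.000Z"
```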
A few other points on this subject:
The text of the W3Schools link you gave makes a critical error in that it does not indicate UTC. Since the Unix epoch is defined in terms of UTC, all timestamps derived from it are UTC-based.
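To illustrate the UTC basis, here is a quick check you can run in any JavaScript console; the output does not depend on the machine's local time zone:

```js
// The epoch itself is defined in UTC: a timestamp of 0 is exactly
// 1970-01-01T00:00:00.000Z, regardless of the local time zone.
console.log(new Date(0).toISOString()); // "1970-01-01T00:00:00.000Z"
console.log(Date.UTC(1970, 0, 1));      // 0
```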
Some might refer to these timestamps as "Epoch Time". However, this is meaningless; please avoid that terminology, as it implies that the time itself is an epoch. Only the value 0 could be considered an epoch. One might say "Time Since the Epoch", but even then the question arises: "which epoch?" While the Unix epoch is common, there are several other epochs used in computing.