Others have shown how to print the printable ASCII characters. You can also print the remaining ASCII characters, although they are control characters with a system-dependent effect (often no visible effect at all). To build a string containing all ASCII characters, you can do this:
var s = ''; for (var i = 0; i <= 127; i++) s += String.fromCharCode(i);
Unicode is much more complicated, because the Unicode coding space from 0 to 0x10FFFF contains a large number of unassigned code points, as well as code points designated as noncharacters. There are also private-use code points, which may denote characters by "private agreement" but have no generally accepted meaning. Moreover, many Unicode characters are combining characters, i.e. designed to combine with the preceding character (for example, turning "a" into "â"), so you cannot meaningfully print them on their own. There is no simple way in JavaScript to determine the class of a code point given as an integer; traditionally you would need to read the UnicodeData.txt file, parse it, and use the information there to classify code points.
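As an alternative to parsing UnicodeData.txt yourself, modern JavaScript engines (ES2018 and later) support Unicode property escapes in regular expressions, which can classify a code point without any external data file. The helper function names below are illustrative, not part of any standard API:

```javascript
// Classify a code point using Unicode property escapes (requires the /u flag).
function isUnassigned(cp) {
  // General_Category "Cn" marks unassigned code points.
  return /\p{Cn}/u.test(String.fromCodePoint(cp));
}

function isCombining(cp) {
  // General_Category "M" (Mark) covers combining characters.
  return /\p{M}/u.test(String.fromCodePoint(cp));
}

console.log(isCombining(0x0301)); // COMBINING ACUTE ACCENT -> true
console.log(isCombining(0x0041)); // LATIN CAPITAL LETTER A -> false
```

The same mechanism supports other general categories (e.g. \p{Co} for private-use code points), so a small set of such tests can replace a hand-rolled UnicodeData.txt parser for many purposes.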
Finally, there is the programming problem that JavaScript's concept of a character corresponds to a 16-bit code unit (not a code point), so any Unicode code point above 0xFFFF must be represented using two code units (a so-called surrogate pair). If you use JavaScript in the context of an HTML document and want to print characters as HTML content, the easiest way is to use numeric character references such as &#x10400; (which denotes the Unicode character at code point 10400 hexadecimal) and assign the string to the element's innerHTML property.
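A minimal sketch of the surrogate-pair issue, with U+10400 as the example (variable names are illustrative): String.fromCodePoint and codePointAt, available since ES2015, let you work with full code points even though the string itself stores two 16-bit code units:

```javascript
// U+10400 (DESERET CAPITAL LETTER LONG I) lies above 0xFFFF,
// so it occupies two 16-bit code units in a JavaScript string.
const s = String.fromCodePoint(0x10400);

console.log(s.length);                       // 2 (code units, not characters)
console.log(s.codePointAt(0).toString(16));  // "10400" (the full code point)

// The same character written as an explicit surrogate pair:
const t = '\uD801\uDC00';
console.log(s === t);                        // true
```

This is why naive iteration over a string by index can split a character in half; iterating with for...of, which walks code points rather than code units, avoids the problem.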
If you need to write out ranges of Unicode characters, you may want to look at the Full Unicode Input utility I wrote recently. Its source code illustrates some ways of working with Unicode characters in JavaScript.