Extra Â character added in .js script

Using jQuery and the formatCurrency extension, as shown below:

 $(function() {
   $(".currency").blur(function() {
     formatCurrencyInput(this);
   });
 });

 function formatCurrencyInput(input) {
   if ($(input).val() != '') {
     if (isNumeric($(input).val())) {
       $(input).formatCurrency(null, { groupDigits: false, roundToDecimalPlace: 2, symbol: '£' });
       $(input).css('border-color', '');
     } else {
       $(input).css('border-color', '#FFCCCC');
     }
   }
 }

All my text inputs with a class of "currency" are converted, for example, from 45 to £45.00.

The strange thing is that it works if the formatCurrencyInput function is defined inline in the page. If I move the function to a .js file, it returns Â£45.00 (note the Â symbol). I assume this is a character-encoding problem, but how do I fix it?

+4
5 answers

If you use any non-ASCII character in a JavaScript file, you must ensure that the server sends the appropriate HTTP headers for it, identifying the character encoding. Other methods, such as the charset attribute on the script element used to link to the file, or relying on the browser guessing the encoding (for example, from the referencing HTML document), are less reliable.

But if there is only one non-ASCII character, and it is in a string literal, the easiest way is to escape it. The pound sign "£", U+00A3, can be written inside a string literal as \xA3 or as \u00A3.
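A quick Node-runnable check of the two escape forms (the variable name poundSign is illustrative):

```javascript
// Both escape forms denote U+00A3, so the .js file stays pure ASCII
// and its character encoding no longer matters for this character.
var poundSign = '\u00A3';          // same string as a literal '£'
console.log(poundSign === '\xA3'); // true
```

In the asker's call this would mean passing symbol: '\u00A3' to formatCurrency instead of a literal '£'.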

+4

Try:

 <!DOCTYPE html>
 <html>
 <head>
   <meta charset="utf-8" />
 </head>
 ...

You can also try changing £ to &pound;.

+4

Unexpected Â and Ã characters in front of other characters almost always mean that:

  • The text was saved encoded as UTF-8
  • That text was then read as if it were encoded in (what is usually called) Latin-1

This is because most non-ASCII characters you come across (at least in text written in languages whose spelling derives from the Latin alphabet, like the Western European languages) belong to the Latin-1 Supplement Unicode block, which includes £ and most accented or modified versions of English letters, such as é and æ. UTF-8 encodes every character in this block as a two-byte sequence starting with byte C2 or C3, and those bytes decode to Â and Ã respectively in Latin-1.

Quick demo in Python shell:

 >>> b'\xc2'.decode('latin1')
 'Â'
 >>> '£'.encode('utf8')
 b'\xc2\xa3'
 >>> '£'.encode('utf8').decode('latin1')
 'Â£'
 >>> 'æ'.encode('utf8')
 b'\xc3\xa6'
 >>> 'æ'.encode('utf8').decode('latin1')
 'Ã¦'

If these unexpected characters appear in your JavaScript, what has likely gone wrong is that you saved your JavaScript file as UTF-8, but the browser believes it is encoded in Latin-1, so the wrong characters appear after decoding.

To fix this, make sure that you correctly specify the encoding of your JavaScript file using an HTTP header, for example:

 content-type: text/javascript; charset=utf-8 
+1

Just add charset="UTF-8" to the script tag. Example:

 <script src="myJSFile.js" charset="UTF-8"></script> 
0
 var poundSign = '\xA3';
 var dollars = "OK, gonna exchange $" + dollarsToExchange + " for " + poundSign + guineas;
-1

Source: https://habr.com/ru/post/1445626/

