I assume that your actual problem is that characters outside the ISO-8859-1 range are displayed as mojibake. Is that right? I can't think of another reason for this question. If so, then read on:
First, if you are still using legacy JSP instead of its successor Facelets, you need to set the page encoding to UTF-8. Put this at the top of each JSP:
<%@page pageEncoding="UTF-8" %>
or configure it globally in web.xml:
<jsp-config>
    <jsp-property-group>
        <url-pattern>*.jsp</url-pattern>
        <page-encoding>UTF-8</page-encoding>
    </jsp-property-group>
</jsp-config>
If you use Facelets, you don't need to do anything; it already uses UTF-8 as the response encoding by default.
Second, if you are serving messages from .properties files, you need to know that these files are by default read using ISO-8859-1. See also the java.util.Properties javadoc:
... the input/output stream is encoded in ISO 8859-1 character encoding. Characters that cannot be directly represented in this encoding can be written using Unicode escapes; only a single 'u' character is allowed in an escape sequence. The native2ascii tool can be used to convert property files to and from other character encodings.
So you need to write characters outside the ISO-8859-1 range as Unicode escape sequences. For instance, instead of
some.dutch.text = Één van de wijken van Curaçao heet Saliña.
you need to write
some.dutch.text = \u00c9\u00e9n van de wijken van Cura\u00e7ao heet Sali\u00f1a.
This can be automated using the native2ascii tool.
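For example, an invocation could look like the one below (the file names are just placeholders; the -encoding flag tells the tool which encoding the source file was saved in):

native2ascii -encoding UTF-8 text_utf8.properties text.properties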
As a completely different alternative, you can provide JSF with a custom ResourceBundle implementation backed by a ResourceBundle.Control which reads the files using UTF-8. This is described in more detail in this answer and in this blog. Note that this only works for validation messages which you supply yourself, e.g. via requiredMessage, not for overriding the default JSF validation messages. For that (i.e. when you need this for <message-bundle> files), you really need the native2ascii tool.
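Just as a minimal sketch of what such a Control could look like (the class name UTF8Control is made up here, Java 7+ is assumed, and the format/reload parameters are ignored for brevity; it overrides newBundle() to read the .properties file through a UTF-8 reader instead of the default ISO-8859-1 stream):

import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.Locale;
import java.util.PropertyResourceBundle;
import java.util.ResourceBundle;

public class UTF8Control extends ResourceBundle.Control {

    @Override
    public ResourceBundle newBundle(String baseName, Locale locale, String format,
            ClassLoader loader, boolean reload) throws IOException {
        // Resolve the resource name the same way the default implementation does.
        String resourceName = toResourceName(toBundleName(baseName, locale), "properties");

        try (InputStream stream = loader.getResourceAsStream(resourceName)) {
            if (stream == null) {
                return null; // Let the lookup fall through to the next candidate locale.
            }

            // Read the .properties file as UTF-8 instead of the default ISO-8859-1.
            return new PropertyResourceBundle(new InputStreamReader(stream, StandardCharsets.UTF_8));
        }
    }
}

You could then load a bundle with ResourceBundle.getBundle("text", locale, new UTF8Control()), or, as described in the linked answer and blog, wrap it in a ResourceBundle subclass whose fully qualified class name you register in faces-config.xml.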