Consider the following program.
import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.Charset;

public class HelloWorld {
    public static void main(String[] args) {
        System.out.println(Charset.defaultCharset());

        // Three copies of U+0905 (DEVANAGARI LETTER A)
        char[] array = new char[3];
        array[0] = '\u0905';
        array[1] = '\u0905';
        array[2] = '\u0905';

        CharBuffer charBuffer = CharBuffer.wrap(array);
        Charset utf8 = Charset.forName("UTF-8");
        ByteBuffer encoded = utf8.encode(charBuffer);
        System.out.println(new String(encoded.array()));
    }
}
When I run this from a terminal with
java HelloWorld
I get correctly encoded, correctly formatted text. The default encoding reported was MacRoman.
Now, when I run the same code from Eclipse, I see the wrong text printed on the console.

When I change the file's encoding option in Eclipse to UTF-8, the correct text is printed in Eclipse as well.
Why is this happening? Ideally, the file encoding setting should not influence this code, because I specify UTF-8 explicitly here.
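To narrow down which step I mean, here is a reduced sketch of the same round trip (the class name EncodingSteps is just for illustration). As far as I can tell, UTF-8 only appears in the encode call; the later steps rely on defaults, and I am unsure whether that is where the Eclipse setting comes in:

import java.nio.ByteBuffer;
import java.nio.CharBuffer;
import java.nio.charset.Charset;

public class EncodingSteps {
    public static void main(String[] args) {
        CharBuffer charBuffer = CharBuffer.wrap(new char[] { '\u0905', '\u0905', '\u0905' });
        Charset utf8 = Charset.forName("UTF-8");

        ByteBuffer encoded = utf8.encode(charBuffer);   // charset given explicitly: UTF-8
        String decoded = new String(encoded.array());   // no charset argument: the JVM's default charset is used
        System.out.println(decoded);                    // System.out encodes with the default charset; the console then interprets the bytes
    }
}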
I am using Java 1.6 (Sun JDK) on Mac OS X 10.7.