I have my own program written in Python that reads its input from stdin. As a simple example:
    #!python3
    import sys

    with open('foo.txt', 'w', encoding='utf8') as f:
        f.write(sys.stdin.read())
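As a side note, this is the one-liner I use to check which codec Python will actually apply to its standard input (just a diagnostic, not part of the script):

    # prints the codec Python will use to decode stdin
    python -c "import sys; print(sys.stdin.encoding)"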
I want to be able to pass a string from PowerShell to this program as its standard input. Python decodes its standard input using the encoding specified in $env:PYTHONIOENCODING, which I usually set to UTF-8 (so that I don't get any encoding errors).
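For completeness, this is roughly what "set to UTF-8" means here, done before starting any Python process in the session (a minimal sketch):

    # applies to every Python process started afterwards in this session
    $env:PYTHONIOENCODING = 'utf-8'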
But no matter what I do, the characters get mangled. I searched the web and found suggestions to change [Console]::InputEncoding / [Console]::OutputEncoding or to use chcp, but nothing works.
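For example, these are roughly the settings those suggestions boil down to (a sketch of the variants I tried; none of them changed the result for me):

    # suggestions found online, roughly:
    [Console]::InputEncoding  = [System.Text.Encoding]::UTF8
    [Console]::OutputEncoding = [System.Text.Encoding]::UTF8
    chcp 65001   # switch the console code page to UTF-8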
Here is my main test:
    PS >[Console]::OutputEncoding.EncodingName
    Unicode (UTF-8)
    PS >[Console]::InputEncoding.EncodingName
    Unicode (UTF-8)
    PS >$env:PYTHONIOENCODING
    utf-8
    PS >python -c "print('\N{Euro sign}')" | python -c "import sys; print(sys.stdin.read())"
    Β΄ββ?
    PS >chcp 1252
    Active code page: 1252
    PS >python -c "print('\N{Euro sign}')" | python -c "import sys; print(sys.stdin.read())"
    ?
    PS >chcp 65001
    Active code page: 65001
    PS >python -c "print('\N{Euro sign}')" | python -c "import sys; print(sys.stdin.read())"
    ?
How can I fix this problem?
I can't even explain what is happening here. Basically, I want the test ( python -c "print('\N{Euro sign}')" | python -c "import sys; print(sys.stdin.read())" ) to print a Euro sign, and I want to understand why it doesn't, not just get it working by trial and error :-) (Because then I can carry that knowledge over to my real script, which needs to build pipelines of Python programs that don't break when they hit Unicode characters.)
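One diagnostic that might help explain it: looking at the raw bytes the downstream process actually receives instead of the decoded text (same pipeline as above, but reading sys.stdin.buffer):

    # dump the undecoded bytes arriving on stdin of the second process
    python -c "print('\N{Euro sign}')" | python -c "import sys; print(sys.stdin.buffer.read())"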