Your supplied SQL is part of your problem. Strings can't carry 0x00, or at least the string libraries don't account for anything after the null character, going by my extremely vague memories of C. What I can prove is that if you attach a data viewer, you will see that somewhere between the OLE DB Source and the value 0x00 actually landing in the data flow, it gets converted to an empty string. I dropped the following script task between source and destination:
    int charvalue = -1;
    char[] rep = Row.AsciiNULL.ToCharArray();
    if (rep.Length > 0)
    {
        charvalue = Convert.ToInt32(rep[0]);
    }
    Row.Information = string.Format("Length {0} 0x{1:X}", Row.AsciiNULL.Length, charvalue);
0xFFFFFFFF is -1 represented in hex. Using 0 as the sentinel value didn't make sense, since 0 is exactly the value we care about.
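To make the two outcomes concrete, here is a small standalone console sketch (my illustration, not part of the package) that runs the same probe logic against an empty string and against a string that genuinely contains the NUL character:

    using System;

    class NullProbe
    {
        // Same probe logic as the script task above, lifted into a helper.
        static string Describe(string s)
        {
            int charvalue = -1;
            char[] rep = s.ToCharArray();
            if (rep.Length > 0)
            {
                charvalue = Convert.ToInt32(rep[0]);
            }
            return string.Format("Length {0} 0x{1:X}", s.Length, charvalue);
        }

        static void Main()
        {
            // What the data viewer scenario shows: the NUL was converted away upstream.
            Console.WriteLine(Describe(""));    // Length 0 0xFFFFFFFF
            // What we wanted to see if the 0x00 had survived.
            Console.WriteLine(Describe("\0"));  // Length 1 0x0
        }
    }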

How to preserve the 0x00 value?
The string/wstring data types are not going to work here, so in your original query you just need to leave it as

    SELECT (0x00) AS AsciiNULL
You will most likely need to force the metadata in your source to refresh once you remove the cast to a character type. The metadata should now report DT_BYTES with a length of 1, and using a script similar to the one above, the reported length is now 1 and the value is 0. We have binary data flowing through the data flow; problem solved!
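That "script similar to the one above" needs one tweak: with the metadata now DT_BYTES, the script component surfaces Row.AsciiNULL as a byte array rather than a string. A minimal sketch of the adjusted probe, assuming the designer-default method and buffer names and the same Information output column from earlier:

    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // AsciiNULL is now DT_BYTES(1), exposed to the script as byte[].
        int charvalue = -1;
        byte[] rep = Row.AsciiNULL;
        if (rep.Length > 0)
        {
            charvalue = rep[0];
        }

        // Reports "Length 1 0x0": the zero byte made it through the data flow.
        Row.Information = string.Format("Length {0} 0x{1:X}", rep.Length, charvalue);
    }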

    Error: Data conversion failed. The data conversion for column "AsciiNULL" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page."
Perhaps the celebration was premature (story of my life), as the Flat File Connection Manager does not know how to deal with this binary column. It would be nice if it would just write the byte out, but I could not get it to take the data as-is.
I thought I could finesse the data type mapping by defining this column as binary in the Flat File Connection Manager.

That seemed closer to the answer, but it still fails with that error.
Script task
Swiss-army knife time. You can do anything with a script task; in this case, I will have to handle the output formatting myself, since the connection managers are of no use here.
    using System;
    using System.Data;
    using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
    using Microsoft.SqlServer.Dts.Runtime.Wrapper;

    [Microsoft.SqlServer.Dts.Pipeline.SSISScriptComponentEntryPointAttribute]
    public class ScriptMain : UserComponent
    {
        string fileName;
        System.IO.StreamWriter writer;

        public override void PreExecute()
        {
            base.PreExecute();
The part you really care about is where the code writes the null value. If you wanted to carry a DT_BYTES column through all of your transformations so it could ultimately be written to a file, you would need something like writer.Write((char)Row.AsciiNULL[0]); but frankly, there is no need to bother. You already know that every time the ProcessInputRow method fires you need to add a 0x00 to the output, so just use writer.Write((char)0);
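The component snippet above cuts off inside PreExecute, which is where the StreamWriter would be opened against the destination file. Here is a hedged sketch of the remaining overrides, assuming writer was opened there; the per-row write is the single line the paragraph above settles on:

    // Assumes PreExecute finished by opening `writer` against the target file.
    public override void Input0_ProcessInputRow(Input0Buffer Row)
    {
        // Every row needs a 0x00, so emit it directly instead of carrying
        // the DT_BYTES column all the way to this point.
        writer.Write((char)0);
    }

    public override void PostExecute()
    {
        base.PostExecute();
        // Flush the buffered output and release the file handle.
        writer.Flush();
        writer.Close();
    }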
This also gives your data flow a performance benefit (at least relative to pushing the zero byte through the data flow). The way the engine handles binary data and LOB types (varchar(max)/nvarchar(max)/varbinary(max)) is that it writes that data out to files and passes a handle to it along the data flow, rather than keeping it in memory like the "normal" data types. Writing files is orders of magnitude slower than memory, so avoid LOB types if performance matters in your packages.
Edit
There was a follow-up question which reported that the above wrote out additional characters. The takeaway, it seems, is that I should have used writer.Write((byte)0). YMMV.
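For what it's worth, a variation that takes the text encoder out of the picture entirely (my sketch, not the original code) is to write the raw byte through the StreamWriter's underlying stream:

    // Flush any buffered text first so the bytes land in order, then write
    // a single raw 0x00 directly to the underlying stream.
    writer.Flush();
    writer.BaseStream.WriteByte(0);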