For a table with FAT rows, i.e. many potentially large columns:
create table t (s1 varchar2(4000), ..., sN varchar2(4000))
I know how to extract these columns using plain static defines, i.e.
std::vector<char> buf1(4000, '\0');
OCIDefineByPos(..., 1, &buf1.front(), sb4(buf1.size()),
               SQLT_CHR, &ind1, &rlen1, 0, OCI_DEFAULT);
foreach row {
    std::string actual1(buf1.begin(), buf1.begin() + rlen1);
}
The problem with this approach is that it requires knowing a priori the max size of every column (describing the select list can also tell me that, but it is extra work), and it forces you to preallocate many large buffers even though the data actually contained in each cell is in practice much smaller.
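For reference, the describe route I mean is roughly this (a sketch, not my exact code; stmtp/errhp stand for my statement and error handles):

ub4 ncols = 0;
OCIAttrGet(stmtp, OCI_HTYPE_STMT, &ncols, 0, OCI_ATTR_PARAM_COUNT, errhp);
for (ub4 i = 1; i <= ncols; ++i) {
    OCIParam* col = 0;
    OCIParamGet(stmtp, OCI_HTYPE_STMT, errhp, (void**)&col, i);
    ub2 size = 0;  // declared byte size, e.g. 4000 for a varchar2(4000)
    OCIAttrGet(col, OCI_DTYPE_PARAM, &size, 0, OCI_ATTR_DATA_SIZE, errhp);
    OCIDescriptorFree(col, OCI_DTYPE_PARAM);
    // ...and I would still have to allocate "size" bytes per column up front.
}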
I then tried piecewise fetching instead, replacing OCI_DEFAULT with OCI_DYNAMIC_FETCH and registering a callback via OCIDefineDynamic. The callback is invoked with OCI_FIRST_PIECE so that I can provide a buffer dynamically, but the buffer provided is supposed to be large enough, and OCI does not tell me the actual size of the VARCHAR2 value at that point. I expected that either I could grow the buffer as much as necessary, or I could provide a buffer that is too short and be called back with OCI_NEXT_PIECE, so that I could copy the value piece by piece.
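Here is roughly what my attempt looks like (again a sketch, not my literal code; stmtp/errhp are my handles and ColCtx is my own context struct):

struct ColCtx {
    std::vector<char> buf;  // the buffer I hand to OCI
    ub4 alen;               // in: room offered for this piece; out: bytes written
    sb2 ind;
    ub2 rcode;
};

static sb4 fetchCb(void* octxp, OCIDefine*, ub4 /*iter*/,
                   void** bufpp, ub4** alenpp, ub1* piecep,
                   void** indpp, ub2** rcodep)
{
    ColCtx* c = static_cast<ColCtx*>(octxp);
    if (*piecep == OCI_FIRST_PIECE)
        c->buf.resize(512);        // initial guess; how big is "big enough"?
    c->alen = ub4(c->buf.size());
    *bufpp  = &c->buf.front();
    *alenpp = &c->alen;
    *indpp  = &c->ind;
    *rcodep = &c->rcode;
    return OCI_CONTINUE;           // expected OCI to come back with OCI_NEXT_PIECE
}

// Setup: define with a NULL buffer and OCI_DYNAMIC_FETCH, then register:
OCIDefine* defn1 = 0;
ColCtx ctx1;
OCIDefineByPos(stmtp, &defn1, errhp, 1, 0, SB4MAXVAL,  // valuep unused with a
               SQLT_CHR, 0, 0, 0, OCI_DYNAMIC_FETCH);  // callback, as I understand it
OCIDefineDynamic(defn1, errhp, &ctx1, fetchCb);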
Instead, I now systematically get ORA-01406: fetched column value was truncated.
Can someone provide an example of piecewise fetching into dynamically allocated buffers, please? TIA, --DD