Is it possible to dynamically size the buffer while retrieving VARCHAR2 column values?

For a table with fat rows, i.e. many potentially large VARCHAR2 columns:

create table t (s1 varchar2(4000), ..., sN varchar2(4000))

I know how to extract these columns using direct binds, i.e.

  std::vector<char> buf1(4000, '\0');
  OCIDefineByPos(..., 1, &buf1.front(), sb4(buf1.size()),
                 SQLT_CHR, &ind1, &rlen1, 0, OCI_DEFAULT);
  foreach row {
    std::string actual1(buf1.begin(), buf1.begin() + rlen1);
  }

The problem with this approach is that it requires knowing a priori the maximum size of every column (describing the statement can also tell me that, but it is extra work), and it forces you to preallocate many large buffers even when the actual data in each cell is much smaller.

I tried piecewise fetching instead, replacing OCI_DEFAULT with OCI_DYNAMIC_FETCH and registering a callback with OCIDefineDynamic. The callback is invoked with OCI_FIRST_PIECE so I can supply a buffer dynamically, but the buffer supplied must still be large enough, and OCI does not report the actual size of the VARCHAR2 value. I had expected that either I could grow the buffer to exactly the size needed, or that OCI would accept a too-short buffer and call me back with OCI_NEXT_PIECE so I could copy the value piece by piece.

Now I systematically get ORA-01406: fetched column value was truncated.

Can someone provide an example of fetching into dynamically allocated buffers, please? TIA, --DD

1 answer

I believe you can do this by fetching as SQLT_VST instead of SQLT_CHR and relying on OCIString. Memory allocation is handled automatically inside OCI, and you effectively get a pointer to it. You can then obtain the actual value's size with OCIStringSize(), malloc() your own buffer and copy it, or just use it as a regular char* pointer via OCIStringPtr().


Source: https://habr.com/ru/post/1386743/
