Simple workaround. Export all the data first. Then create a column that holds the exact URL and curl arguments needed to upload the image for each document (in Excel there will be one row per document). A simple curl statement to upload an image:
curl -v -X PUT "http://username:password@serverip:portno/dbname/doc_id/image.JPG?rev=revisionid" --data-binary @image.JPG -H "Content-Type: image/jpeg"
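Note that each PUT needs the document's current revision ID. If the export does not already include it, one way to look a revision up, sketched here with the same placeholder credentials and doc_id as above, is to GET the document and pull out its _rev field:

# Prints something like "_rev":"1-abc123" (placeholder values)
curl -s "http://username:password@serverip:portno/dbname/doc_id" | grep -o '"_rev":"[^"]*"'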
After creating the individual curl lines, which is easy in Excel, copy the entire column into a text file, and put the text file in the same directory as the images to be uploaded, since the --data-binary @image.JPG paths are resolved relative to where the script runs.
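Each line should contain everything that follows -X PUT in the curl command above: the URL, the --data-binary flag, and the Content-Type header. Because the script below relies on plain word splitting, none of the fields may contain spaces or shell quotes. A hypothetical example line (all values made up):

http://admin:secret@127.0.0.1:5984/mydb/doc_001/photo_001.JPG?rev=1-abc123 --data-binary @photo_001.JPG -H Content-Type:image/jpeg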
Then create a bash script that reads every line and keeps PUTting images to the CouchDB server:

#!/bin/bash
# Read one set of curl arguments per line and upload that attachment.
# $line is deliberately unquoted so the shell splits it into separate
# arguments for curl (this breaks if any field contains spaces).
while read line
do
    curl -v -X PUT $line
done < test.txt
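If relying on word splitting feels fragile, a slightly more robust sketch (assuming a hypothetical tab-separated file urls.tsv with the target URL in the first column and the local image filename in the second) quotes each field explicitly:

#!/bin/bash
# Hypothetical alternative: urls.tsv holds one "URL<TAB>filename" pair per line.
while IFS=$'\t' read -r url file
do
    curl -v -X PUT "$url" --data-binary "@$file" -H "Content-Type: image/jpeg"
done < urls.tsv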
In my case, all of these lines are in the test.txt file.
This method worked perfectly for me: the images were about 60-65 kilobytes each, across 380 documents. I am not sure how it will behave with large files.