Using the curl or wget command line to download files

I apologize if this question has been asked before, or if it is simple.

I am trying to download a file from an HTTP site to my Unix machine using the command line. I signed up for this site with a username and password.

Let's say I have this link (not a working link) http://www.abcd.org/portal/ABCPortal/private/DataDownload.action?downloadFile=&workspace.id=4180&datasetId=76999

If I paste this link into a browser, a window opens asking whether I want to save the zip file it refers to (say, xyz.zip). These files are ~1 GB in size.

I want to download this zip file to my Unix machine using the command line. I tried wget and curl with the above URL (with username and password), but I get an HTML form rather than a zip file. Is there any way to download the zip file that this kind of URL points to? I do not know anything about the directory structure on the machine where the files are located.
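The commands I tried looked roughly like this (myuser/mypassword are placeholders; curl's -u and wget's --user/--password send HTTP basic-auth credentials):

curl -u myuser:mypassword -o xyz.zip "http://www.abcd.org/portal/ABCPortal/private/DataDownload.action?downloadFile=&workspace.id=4180&datasetId=76999"

wget --user=myuser --password=mypassword -O xyz.zip "http://www.abcd.org/portal/ABCPortal/private/DataDownload.action?downloadFile=&workspace.id=4180&datasetId=76999"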

Thank you for your help,

2 answers

I think you did not pass the Accept-Encoding header. Browsers send it by default, but with CLI tools you have to set these headers yourself.

For example, with curl (-v enables verbose output):

curl -v "http://www.abcd.org/portal/ABCPortal/private/DataDownload.action?downloadFile=&workspace.id=4180&datasetId=76999" -H "Accept-Encoding: gzip" > /tmp/yourZippedFile.gz

If that does not work, run the command with -v and compare the HTTP headers with the ones your browser sends.
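Also, since you get an HTML form back, the site may be redirecting you to a form-based login page, which basic auth will not get past. In that case you have to log in once, store the session cookie, and reuse it for the download. A rough sketch, assuming a hypothetical login form at /portal/login with fields named username and password (inspect the real form in your browser; the URL and field names will differ):

# 1. Submit the login form and save the session cookie:
curl -c cookies.txt -d "username=myuser&password=mypassword" "http://www.abcd.org/portal/login"

# 2. Reuse the cookie (and follow redirects with -L) to download the file:
curl -b cookies.txt -L -o xyz.zip "http://www.abcd.org/portal/ABCPortal/private/DataDownload.action?downloadFile=&workspace.id=4180&datasetId=76999"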


FYI, you may also need to send a User-Agent header:

curl -H "Accept-Encoding: gzip, deflate" -H "User-Agent: Mozilla/5.0 (Windows NT 5.1)" www.google.com > test3.gz

Without the User-Agent header, the response may come back uncompressed:

curl -H "Accept-Encoding: gzip, deflate" www.google.com > test

