It is definitely possible, but it is a bit of a hack. I have been doing this myself for quite some time using wget. The trick is to make the server think that the request is coming from a browser, and for that you need a few things:
- Download link (actual file link)
- Referrer link (webpage with download button)
- Zippyshare Session ID (found in cookies)
All three can be taken from your browser: the download link is the target of the download button, the referrer is the URL of the file page itself, and the JSESSIONID is in the site's cookies (visible in the browser's developer tools).
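If you would rather not dig through the browser, here is a minimal sketch for grabbing a fresh JSESSIONID from the command line, assuming the server sets that cookie on the first request to the file page (the referrer URL):
# A sketch, not the original method: --spider fetches only the response,
# --server-response prints the headers (to stderr), and grep pulls out the
# JSESSIONID value from the Set-Cookie header, if the server sends one.
wget --spider --server-response '<referrer>' 2>&1 | grep -o 'JSESSIONID=[^;]*'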
Now open your terminal and use the following command (replacing the necessary elements):
wget <download_link> \
  --referer='<referrer>' \
  --cookies=off --header "Cookie: JSESSIONID=<session_id>" \
  --user-agent='Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'
Example:
wget http://www16.zippyshare.com/d/29887835/8895183/hello.txt \
  --referer='http://www16.zippyshare.com/v/29887835/file.html' \
  --cookies=off --header "Cookie: JSESSIONID=26458C0893BF69F88EB5743D74FE0F8C" \
  --user-agent='Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'
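If curl is more convenient for you, a roughly equivalent invocation (same URLs and values as the wget example above; curl just spells the options slightly differently) would be:
# -O saves the file under its remote name (hello.txt here);
# --referer, --cookie and --user-agent set the same headers as the wget command.
curl -O 'http://www16.zippyshare.com/d/29887835/8895183/hello.txt' \
  --referer 'http://www16.zippyshare.com/v/29887835/file.html' \
  --cookie 'JSESSIONID=26458C0893BF69F88EB5743D74FE0F8C' \
  --user-agent 'Mozilla/5.0 (Windows NT 6.0) Gecko/20100101 Firefox/14.0.1'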
Original answer: How to use wget to download from hosting sites?
Note: in the command the option is spelled --referer (with one "r"), not --referrer.