You can use wget to download content that requires HTTP cookies. I will use StackOverflow.com as an example. The steps are as follows:
1) Get the wget command-line tool. On Mac and Linux it is usually already installed; on Windows you can get it from the GnuWin32 project or from one of the many other ports (Cygwin, MinGW/MSYS, etc.).
2) Next, obtain an authenticated cookie by logging into the website in your preferred browser. In Internet Explorer, you can export it via File Menu > Import and Export > Export Cookies. In Firefox, I used the Cookie Exporter extension to export cookies to a text file. For Chrome, there should be similar extensions.
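Whichever browser you use, wget expects the cookies in the Netscape cookie-file format, which is what these exporters typically produce: one cookie per line, with tab-separated fields for domain, include-subdomains flag, path, secure flag, expiry timestamp, name, and value. A cookies.txt entry would look roughly like this (the cookie name and value below are made up for illustration):

```
# Netscape HTTP Cookie File
.stackoverflow.com	TRUE	/	FALSE	1735689600	usr	t=abc123
```

If wget ignores your file, check that it starts with the `# Netscape HTTP Cookie File` header line and uses real tab characters between fields.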
Obviously, you only need to repeat this step once the cookies have expired!
3) Once you have the exported cookies file, we can use wget to fetch a web page, passing it this cookie file. This can, of course, be done from within MATLAB using the SYSTEM function:
%# fetch page and save it to disk
url = 'http://stackoverflow.com/';
cmd = ['wget --cookies=on --load-cookies=./cookies.txt ' url];
system(cmd, '-echo');

%# process page: I am simply viewing it using the embedded browser
web( ['file:///' strrep(fullfile(pwd,'index.html'),'\','/')] )
Parsing the downloaded page is another topic that I will not go into here. Once you have extracted the data you are after, you can interact with Excel spreadsheets using the XLSREAD and XLSWRITE functions.
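As a minimal sketch of the spreadsheet side (the data here is made up; in practice it would come from your page-parsing step):

```matlab
%# hypothetical extracted data: a cell array with a header row
data = {'question', 'votes'; 'How do I use wget with cookies?', 42};

%# write it to an Excel file
xlswrite('results.xls', data);

%# later, read the numeric part back
votes = xlsread('results.xls');
```

On systems without Excel installed, XLSWRITE falls back to writing a CSV-style file and issues a warning, so you may prefer CSVWRITE/CSVREAD there.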
4) Finally, you can wrap all of this in a function and execute it at regular intervals using the TIMER function.
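A minimal sketch of the scheduling step, assuming you have wrapped the wget call and the parsing in a function called fetchPage (a hypothetical name):

```matlab
%# run fetchPage once an hour, at a fixed rate
t = timer('ExecutionMode','fixedRate', 'Period',3600, ...
          'TimerFcn',@(obj,evt) fetchPage());
start(t);

%# when you are done, clean up the timer object:
%#   stop(t); delete(t);
```

Remember to DELETE timer objects when you no longer need them; otherwise they linger in memory even after being stopped.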