I can’t figure out what’s going on, so I decided to ask you guys. In PHP, I grab the UTC timestamp with this code:
date_default_timezone_set("UTC");
echo time();
This, for example, will give me 1331065202
Then I have this Java code to get the UTC timestamp:
long timestamp = System.currentTimeMillis() / 1000;
This, for example, will give me 1331093502
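In case it helps, here is a minimal self-contained version of what I run on the Java side (the class name UtcTimestamp and the extra formatted output are just for this example); it prints the raw epoch seconds plus the same instant rendered as a UTC date, so I can compare the two servers' clocks by eye:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class UtcTimestamp {
    public static void main(String[] args) {
        // Unix timestamps count seconds since 1970-01-01 00:00:00 UTC,
        // so the value itself does not depend on any time zone setting.
        long timestamp = System.currentTimeMillis() / 1000;
        System.out.println("Epoch seconds: " + timestamp);

        // Format the same instant in UTC to make this server's clock
        // easy to compare against the PHP server's output.
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy-MM-dd HH:mm:ss");
        fmt.setTimeZone(TimeZone.getTimeZone("UTC"));
        System.out.println(fmt.format(new Date(timestamp * 1000L)) + " UTC");
    }
}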
Why are the two timestamps so different? Shouldn't they both be in UTC, or am I doing something wrong? I'm hosted on a VPS and these scripts run on two different servers, so it could be something on the server side; if so, what can I do?