Java timestamp and PHP timestamp giving two different times

Well, I can’t understand what’s going on, so I decided to ask you guys. In PHP, I grab the UTC timestamp using this code:

date_default_timezone_set("UTC");
echo time();

This, for example, will give me 1331065202

Then I have this Java code to get the UTC timestamp:

 long timestamp = System.currentTimeMillis() / 1000; 

This, for example, will give me 1331093502

Why are the two times different? Shouldn't they both be in UTC, or am I doing something wrong? I am hosted on a VPS and the scripts are on two different servers, so it could be something on the server side; if so, what can I do?

+4
3 answers

Given that the two values are very different (and the difference is not even a whole number of hours), I would say that the clock on one of the machines is wrong. (I assume you took the two timestamps at around the same time.)

These timestamps are:

  • PHP: Tue Mar 06 20:20:02 GMT 2012
  • Java: Wed Mar 07 04:11:42 GMT 2012

Given that it was not 4:11 AM GMT on March 7th at the time, it seems that the clock on the Java machine is simply not set correctly.
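In other words, a timezone mixup would show up as a whole (or half) number of hours, but 1331093502 - 1331065202 = 28300 seconds, roughly 7.86 hours. A minimal sketch to reproduce the conversion and the gap, using java.time (Java 8+, so newer than the question; this is my illustration, not code from either server):

    import java.time.Instant;

    public class SkewCheck {
        public static void main(String[] args) {
            long phpTs = 1331065202L;   // value reported by the PHP server
            long javaTs = 1331093502L;  // value reported by the Java server

            // Instant interprets these as seconds since the Unix epoch, in UTC
            System.out.println("PHP : " + Instant.ofEpochSecond(phpTs));  // 2012-03-06T20:20:02Z
            System.out.println("Java: " + Instant.ofEpochSecond(javaTs)); // 2012-03-07T04:11:42Z

            long diff = javaTs - phpTs; // 28300 seconds
            System.out.printf("Skew: %d s (~%.2f h)%n", diff, diff / 3600.0);
        }
    }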

If this is a real VPS that you have full control over, you should look into using NTP or something similar to keep the server clocks correct.
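If you want to measure the skew from inside the JVM before fixing it, a sketch along these lines works; it assumes the Apache Commons Net library (org.apache.commons.net) is on the classpath, and pool.ntp.org is just an example server:

    import java.net.InetAddress;
    import org.apache.commons.net.ntp.NTPUDPClient;
    import org.apache.commons.net.ntp.TimeInfo;

    public class ClockOffset {
        public static void main(String[] args) throws Exception {
            NTPUDPClient client = new NTPUDPClient();
            client.setDefaultTimeout(5000); // give up after 5 seconds
            try {
                // Ask a public NTP pool server for the current time
                TimeInfo info = client.getTime(InetAddress.getByName("pool.ntp.org"));
                info.computeDetails(); // calculates clock offset and round-trip delay
                System.out.println("Local clock offset: " + info.getOffset() + " ms");
            } finally {
                client.close();
            }
        }
    }

A large offset here (tens of minutes, as in the question) confirms that the clock itself is wrong rather than anything in the PHP or Java code.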

+4

As people said above, use NTP. If your VPS is under your control and runs Debian/Ubuntu, the following shell command will install it:

  sudo apt-get install ntp 

It will start ntpd after installation, but if you want to be sure the daemon is running:

  sudo /etc/init.d/ntp restart 
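
To verify that the daemon has actually synchronized (my addition, not part of the original answer), query its peer list:

  ntpq -p 

The peer marked with an asterisk is the server ntpd is currently synchronized to.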

Hope this helps.

+2

"these scripts are on two different servers."

There is your hint: your two servers have different clocks.

If you need your Java and PHP applications to stay synchronized, consider having both servers use the Network Time Protocol (NTP).

0

Source: https://habr.com/ru/post/1400095/

