I am experimenting with R to analyze some measurement data. I have a .csv file containing over 2 million lines of measurements. Here is an excerpt:
2014-10-22 21:07:03+00:00,7432442.0
2014-10-22 21:07:21+00:00,7432443.0
2014-10-22 21:07:39+00:00,7432444.0
2014-10-22 21:07:57+00:00,7432445.0
2014-10-22 21:08:15+00:00,7432446.0
2014-10-22 21:08:33+00:00,7432447.0
2014-10-22 21:08:52+00:00,7432448.0
2014-10-22 21:09:10+00:00,7432449.0
2014-10-22 21:09:28+00:00,7432450.0
After reading in the file, I want to convert the timestamp column correctly using as.POSIXct(). For small files this works fine, but for large files it does not.
I made an example by reading a large file, copying a small part of it into a temp variable, and then applying as.POSIXct() to the timestamp column. I included a screenshot of the result. As you can see, applying it to the temp variable correctly converts the hours, minutes, and seconds. However, when applying it to the entire file, only the date is kept (and it also takes a long time, more than 2 minutes).
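One thing that often helps here is giving as.POSIXct() an explicit format string, so it does not have to guess the format for every element. A minimal sketch (the inline sample data stands in for the real file; the column names "timestamp" and "value" are my assumption):

```r
# Read a small sample of the data; with the real file you would pass the
# file name instead of text =. Column names here are assumed.
df <- read.csv(text = "2014-10-22 21:07:03+00:00,7432442.0
2014-10-22 21:07:21+00:00,7432443.0
2014-10-22 21:07:39+00:00,7432444.0",
               header = FALSE,
               col.names = c("timestamp", "value"),
               stringsAsFactors = FALSE)

# An explicit format avoids per-element format guessing (which is slow on
# millions of rows) and a silent fallback to date-only parsing.
# strptime() ignores the trailing "+00:00", so the time zone is set by hand.
df$timestamp <- as.POSIXct(df$timestamp,
                           format = "%Y-%m-%d %H:%M:%S",
                           tz = "UTC")
```

After this, the column should be a POSIXct vector with the full time of day preserved, not just the date.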

What is going wrong here, and how can I fix it?
Edit
I tried this on Windows 7 with R 3.1.3, with the same result. The same happens on Ubuntu 14.01 with R 3.0.2. After updating R on Windows (to 3.2.0), the problem remains.