Part of your question asks about "the best way to plot this data."
In that spirit, you seem to have two problems. First, you plan to plot more than 35,000 points along the x-axis, which, as some of the comments point out, will produce overlapping pixels on anything but an extremely large, high-resolution monitor. Second, and more importantly IMO, you are trying to plot 69 time series (stations) on the same chart. In this kind of situation, a heat map may be better suited.
library(data.table)
library(ggplot2)
library(reshape2)     # for melt(...)
library(RColorBrewer) # for brewer.pal(...)

url <- "http://dl.dropboxusercontent.com/s/bxioonfzqa4np6y/timeSeries.txt"
dt  <- fread(url)                                  # fast import of the full table
dt[, Year := year(as.Date(date))]                  # extract the year from each date
dt.melt <- melt(dt[, -1, with = F], id = "Year", variable.name = "Station")
dt.agg  <- dt.melt[, list(y = sum(value)), by = list(Year, Station)]  # annual totals per station
dt.agg[, Station := factor(Station, levels = rev(levels(Station)))]   # reverse order so stations read top-down
ggplot(dt.agg, aes(x = Year, y = Station)) +
  geom_tile(aes(fill = y)) +
  scale_fill_gradientn("Annual\nPrecip. [mm]",
                       colours = rev(brewer.pal(9, "Spectral"))) +
  scale_x_continuous(expand = c(0, 0)) +
  coord_fixed()

Note the use of data.table. Your dataset is fairly large (because of all the columns; 35,000 rows is not that big). In this situation, data.table will speed up processing significantly, especially fread(...), which is much faster than the text-import functions in base R.
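As a rough illustration of the fread(...) speedup, a minimal sketch you can run after downloading the file locally (the file name "timeSeries.txt" and delimiter are assumptions; timings depend on your machine):

library(data.table)
# time the data.table import (fread auto-detects the separator)
system.time(dt <- fread("timeSeries.txt"))
# time the equivalent base R import (adjust sep/header to match your file)
system.time(df <- read.delim("timeSeries.txt"))

On files of this shape, the base R call is typically the slower of the two by a wide margin.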