Deep learning methods such as recurrent neural networks (RNNs) and long short-term memory (LSTM) have recently attracted considerable attention in many fields, including computer vision, natural language processing, and finance. Long short-term memory is a special type of recurrent neural network capable of predicting future values of sequential data by taking past information into account. In this paper, the architectures of several long short-term memory networks are presented, together with a description of how they are used in sequence prediction. The models are evaluated on a benchmark time series dataset. The bidirectional architecture is shown to obtain better results than the single and stacked architectures in experiments across both time series data categories and forecasting horizons. All three architectures perform well on the macro and demographic categories, achieving average mean absolute percentage errors below 18%. The long short-term memory models also outperform most of the baseline models.
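To make the two key ingredients of the abstract concrete, the sketch below shows (a) a single LSTM cell step in NumPy, illustrating how the cell state carries past information forward, and (b) the mean absolute percentage error (MAPE) metric used for evaluation. This is a minimal illustration, not the paper's implementation; all weight shapes, the gate ordering, and the function names are assumptions made for this example.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One LSTM time step (gate ordering i, f, g, o is an assumption)."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev + b       # pre-activations for all four gates, shape (4*H,)
    i = sigmoid(z[0:H])              # input gate: how much new information enters
    f = sigmoid(z[H:2*H])            # forget gate: how much past cell state is kept
    g = np.tanh(z[2*H:3*H])         # candidate cell state
    o = sigmoid(z[3*H:4*H])          # output gate
    c = f * c_prev + i * g           # cell state blends past memory with new input
    h = o * np.tanh(c)               # hidden state exposed to the next layer
    return h, c

def mape(y_true, y_pred):
    """Mean absolute percentage error, in percent (assumes no zero targets)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0)

# Toy run: feed a short sequence through the cell step by step.
rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(scale=0.1, size=(4 * n_hid, n_in))
U = rng.normal(scale=0.1, size=(4 * n_hid, n_hid))
b = np.zeros(4 * n_hid)
h, c = np.zeros(n_hid), np.zeros(n_hid)
for t in range(5):
    h, c = lstm_step(rng.normal(size=n_in), h, c, W, U, b)
```

Stacked and bidirectional variants build on this same step: a stacked network feeds each layer's hidden states to the next layer, while a bidirectional network runs one pass forward and one backward over the sequence and combines the two hidden states.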