Tuesday, September 4, 2012

Time Series Analysis and Forecasting. Programming Approach - thoughts

"Certain things are impossible... 
Until an ignoramus appears, who is not aware of that".

Time Series - a sequence of data points, typically measured at successive time instants spaced at uniform intervals.

There are quite a lot of things that fit this definition. For example, air temperature changes throughout the day (let's say, measured hourly), or the distance from the Earth to the Moon (which changes slightly throughout the lunar month). Even which political party holds the presidential chair after an election (which depends on the "history" of the previous president, etc.). We could go on with the list of examples until the server's storage is full. As you may see, the examples above have a cyclic nature, but so does almost everything related to time series (within certain deviations, of course).

It is in the nature of mankind to want to know the future (although sometimes it is better not to know). Attempts are constantly being made to predict - or, to use a more politically correct term, forecast - where a certain series will go in the future. The best example may be shamans predicting rain or drought. These days there are complex (and not so complex) algorithms for forecasting time series (e.g. noise reduction in digital signal processing). But the loudest and most scandalous argument is the one about stock market analysis and forecasting. Many of you may have heard of William Gann - some say genius, some say charlatan. I personally tend to take the first side, although there may be facts that I am not aware of.

Mr. Gann died almost 60 years ago - quite a long period of time. Imagine how many time series forecasting (read: stock market forecasting) techniques have been born since, and how many have vanished. Since the advent of chaos theory, more and more people tend to say that "stock market forecasting is impossible due to the market's fractal nature". Which makes sense if you look at the problem from chaos theory's perspective. However, do not forget that chaos theory is accepted as the theory that fits the situation best, not as one that fully explains it. In my perception, this tiny difference leaves a tiny space for hope ;-)

Well, we've had enough science so far. Let us get to practice. Let me try to simplify things as much as possible and demonstrate a simple, yet effective approach from a developer's point of view.


From a software perspective, not much is needed for successful forecasts - an expert system. Smart people use various software packages and programming languages targeted at expert system development, but being an ignoramus (as I decided to be for this article), I decided to use what I have and what I know - the C language, GCC, and the Geany text editor as an IDE.


There are several (graphical) ways to represent stock/forex market data. The best known one is candlesticks - a sequence of simple graphic figures, each of which represents the variation of the price over a certain period of time (open, high, low and close values). We, however, are not going to consider any of them, simply because we do not need to. Instead, we are going to concentrate on the raw row of numbers for a given period (let's say one year) measured hourly, which gives us a sequence of more than 8,000 items (we only pay attention to one value - either open, high, low or close).

If you try to plot this sequence (e.g. in Excel) you will get a curvy line. Take another look at it and you will notice that there are similar segments (within certain deviations, of course) - just like a set of similar images, which brings up one of the best approaches to image recognition: artificial neural networks (especially perceptrons). There is nothing new in using ANNs for stock/forex market analysis - there are tons of commercial software products that provide the end user with various indicators telling him/her whether to buy, sell or hold the current position - but I personally have not seen many attempts to actually make long-term forecasts (e.g. 24 hours ahead for an hourly measured sequence). There is also a lot of uncertainty as to what data should be used as the ANN's input and how much data should be fed in each time. Unfortunately, no one has the exact answer to this question - it is just trial and error. The same applies to the number of hidden neurons in the ANN.
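To make the "similar segments" idea concrete, here is a minimal C sketch of one common way to slice such a raw sequence into supervised training pairs - a sliding window of past values as the network's inputs, the next value as the target. The window size of 8 is an assumption matching the 8-input topology discussed later; the author's actual preprocessing is not shown in the article.

```c
#include <stddef.h>

#define WINDOW 8  /* 8 consecutive values feed the network's 8 inputs */

/* Slice a raw price series into (input window, next value) training pairs.
 * Pair i uses series[i..i+WINDOW-1] as inputs and series[i+WINDOW] as the
 * target. Returns the number of pairs written. */
size_t make_pairs(const double *series, size_t n,
                  double inputs[][WINDOW], double *targets, size_t max_pairs)
{
    size_t count = 0;
    for (size_t i = 0; i + WINDOW < n && count < max_pairs; ++i) {
        for (size_t j = 0; j < WINDOW; ++j)
            inputs[count][j] = series[i + j];
        targets[count] = series[i + WINDOW];
        ++count;
    }
    return count;
}
```

An hourly series of one year (8,000+ values) would yield roughly as many overlapping pairs, which is what gives the network enough "similar images" to learn from.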

Another big question is how the data should be preprocessed - prepared for the ANN. Some use complex algorithms (the Fourier transform, for example), others tend to use simpler ones. The idea is that the data should be in the range of 0.0 to 1.0 and should be as varied as possible. But remember - if you feed the ANN garbage, you get garbage in response. This means you have to carefully select your algorithm for data preprocessing (normalization). I tend to use a custom normalization algorithm, which is quite simple. Sorry to disappoint you, but I am not going to give it here for now, as it is still not completely defined (although it already produces good results).
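The author's custom normalization scheme is not published, so purely as a generic baseline, here is the common min-max normalization that maps a series into the 0.0 to 1.0 range mentioned above. This is not the article's algorithm - just the simplest starting point one could swap out later.

```c
#include <stddef.h>

/* Generic min-max normalization into [0, 1]. Keep lo and hi around so a
 * forecast can be mapped back to price space: price = lo + value * (hi - lo). */
void normalize(const double *in, double *out, size_t n)
{
    double lo = in[0], hi = in[0];
    for (size_t i = 1; i < n; ++i) {
        if (in[i] < lo) lo = in[i];
        if (in[i] > hi) hi = in[i];
    }
    double range = hi - lo;
    for (size_t i = 0; i < n; ++i)
        out[i] = (range > 0.0) ? (in[i] - lo) / range : 0.5;
}
```

One known weakness of this baseline is exactly the "garbage in, garbage out" point above: a single outlier compresses the rest of the series toward one end of the range, which is presumably why custom schemes are worth the effort.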

The bottom line for this paragraph: data preprocessing is not just very important - it is the MOST important part.


My programming solution for this problem is quite simple - a console program that reads the input (the whole sequence of price values for the specified period), trains an artificial neural network (in my case the topology was 8x24x1 - 8 inputs, 24 hidden neurons and one output neuron), and then produces a long-term forecast (at least 7 entries into the future), where each step of the forecast is made using the previously generated values.

The ANN is a simple multilayer perceptron with 8 inputs, 24 hidden neurons and 1 output neuron. Basically, we do not perform many of the calculations ourselves, if any at all. An ANN is a perfect implementation of a learning paradigm, able to find hidden dependencies and rules. Therefore, if you ask me - there is no better solution than utilizing ANNs for time series forecasting.


So, I implemented an ANN (in C this time, not in Assembly) and got the dataset (EUR/USD price values for every hour of the past year). The next move was to give it a try and test it at run time. I decided to do that during the weekend, as I was not sure how much time would be required to train the network. Surprisingly, I got an acceptably low error after only about 30,000 epochs (several minutes). The following picture shows what I got:

EUR/USD forecast

Test set - data not included in the ANN training process. Used as a pattern for error calculation.
Test forecast - forecast on data from the past, which was not included in the training set.
Real forecast - forecast of the future values. This was done on Saturday at least 24 hours before the opening of the next trading session.
Real data - real values obtained Monday early morning after the new trading session began.

As you can see, such a simple system was even able to forecast the gap between the two sessions.

P.S. Although this article gives no complete implementation and describes no particularly exotic programming technique, it goes to show that every problem has a (not necessarily complicated) solution. Most of the time, the most important thing is to look at a problem from another angle.


  1. Hi Alexey,
    predicting the gap between trading sessions is an easy task, since it's periodic: the Sydney market opens first, the US markets close last. How far were you able to forecast the results, and for which time slice? Consider also that most of the game is played with stop-losses, and the volatility of a single candle can be huge, especially on the EUR/USD. It basically means that the ANN is just predicting a value that will at some point lie between the low and the high, and in highly volatile markets it's easy to guess a right value. I don't want to say that you're not doing a good job, only that it is REALLY difficult to assess the predictions of an ANN unless you try them in the field. The prediction as shown seems good, but when do you open your position and when do you close it? If you don't take volatility into account, you risk losing a lot of money.

    1. Hi,

      even being quite ignorant in this field, I can tell that sensing a gap between sessions is not that hard - one just needs to take a closer look at the graph and trust one's intuition. However, I was not sure how easy it would be for an ANN. Apparently, it is not that hard for it either.

      You are completely right about the problem with volatility. This forecast only refers to the close value of each hour, and who knows what happens during that hour. I have tried this system with a forex practice account (meaning no real money involved, and I know that things might be different if I dealt with real money :-) ) and lost a bit of the investment due to exactly this problem. However, I was able to recover from the losses and even gain some profit once I started making forecasts for the high, low and close values. The first two gave me a good indication of the expected volatility (making it easier to estimate the entry and exit points), and the close value gave me a clue about the direction.

      No doubt - there's still work to be done. At least some optimization to the ANN (using CUDA for example) is needed.

  2. Hi again Alexey,
    remember that an ANN with three layers is a universal approximator, which means that anything with periodic behavior is easily recognized by the network. If you can accept a piece of advice from me: you might be able to get better results (though I was never able to get anything even close to really good) by using a second hidden layer, which accounts for the incredible noise added by the HFT transactions. Also, adding the time of the day, especially on hourly candles, will help the ANN spot the activity of the big traders (when they go to lunch, when they take breaks and so on). Still, you won't be able to predict those things called Noe - when the banks push a lot of money, or when new data is released. This strongly influences the market AND it also ruins the training of your ANN, because those events are totally unpredictable. Indeed, you will notice that the ANN performs better on VERY short time frames (minutes, seconds or even ticks), and you could couple them with a Markov chain. On very short time frames I was able to use my model successfully; the model also worked on very long time frames (1 candle per day), but I didn't use it with real money in the latter case - for very long time frames you usually need a lot of money to be used as a buffer, since the deviations are veeery wide. For shorter time frames I was unable to remove the noise or get really reliable predictions. Check my picture (http://i49.tinypic.com/vakow.png), blue being the real series, red the predicted. The prediction is currently really accurate, BUT it wouldn't yield any money if you traded on that model, because the volatility of the market makes the prediction way too inaccurate. Seems good to the eye, but really it's not... Unfortunately ;ppp

    1. Hi,

      ignorant does not mean arrogant - advice is always welcome. As far as I know, adding another hidden layer is (almost) the same as increasing the number of neurons in a single hidden layer. I may be wrong, though, and I am not saying it is not worth trying.

      As to the time of the day - I tried that earlier without any significant benefit to the system. In my opinion, it simply adds noise to the data. I guess I see things a bit differently - the graph (meaning the data) already contains all the needed information, thus adding another parameter or two means adding noise to me. I cannot fully agree with you on things like Noe. When such things happen, there is always a clue in the graph, although it may not be visible to the naked eye. One of the reasons (in addition to volatility) for either losing or missing profit, for me, was a lack of trust in the forecast. It happened to me several times during all the noise around the situation with Spain. For example, the forecast showed some weird move of the price which seemed totally insane; then (after an hour or so) new data was released and the price followed the forecast. I do not mean "look, my system is so perfect" - it is still not ;-) - but just to show that the graph indeed contains all the information needed for a good forecast.

      But back to ANNs. I've seen somewhere on the internet (I remember neither the site nor the query that led me there) someone mention the use of two ANNs - one making the forecast, the other making corrections to the forecast. I have not tried it myself yet.

    2. Hi Alexey,
      as for my case, I've seen benefits from adding the time of the day and even the day of the year (normalized, of course) - the former on short time frames, the latter on very long predictions (years-wise). Adding another layer, at least for me, helped the network be more responsive to non-linearities in high-frequency data and required fewer neurons overall; adding more layers didn't help further. This might mean that the number of features to extract can be approximated with at most two layers - that's a nice discovery :). I agree with you that all the needed information is in the graph, though it's heavily corrupted by noise; the training set is noisy too, and unfortunately that doesn't help (otherwise we'd all be rich ;p). Unlike you, I was unable to predict any Noe: the network was able to forecast that "something" was happening after some hours with the market almost stationary, but it wasn't able to predict better than a coin flip whether the event would go up or down and for how long. It was, though, able to "correctly" predict the retrace. I have done experiments with more than one network too, in several different combinations and with varying degrees of success/total failure ;p. In my experience, as I was saying before, the best results came on either very long or very short time frames. Signal processing is an absolute must in order to provide the ANN with a good training set (if you simply feed the network candles, it won't be able to understand that a retrace is in progress rather than a new trend, because you would need a huge input layer to give it memory); recurrent networks worked better for me, and one good test was a combination of three recurrent networks, each one confirming or denying the results of the other two - a sort of majority report.
      Trading on very short time frames is definitely possible, because you can lose or gain as little as $5 per trade; I was unable to do that on the long term because it required too much money for me. If you don't mind, I have a question: how many candles did you feed the network, how many candles was your net predicting, and for how many candles was the prediction "accurate"?
      Thank you.

    3. The training and validation sets were built using 1 candle per hour for the past year. Accuracy varies - 6 to 12 candles (again, 1 candle per hour).

  3. How did you decide to use 24 hidden neurons? That should give you about 200 weights to fit, or 1 for every 40 data points. I would worry about the potential for overfitting, but if it works out of sample then that's the main thing.

    1. I tried several topologies and this one turned out to be the best (judging by the results). You are right about overfitting - the danger exists - but in general it works at least 90% of the time.

      It might be a good idea to use more inputs and more neurons in the hidden layer, but you have to take hardware limitations into account, due to which a larger network would be quite useless - basically because of the time it would take to train it.

  4. Someone asked me for my Twitter in Spanish, so two things:

    1. here's my twitter account https://twitter.com/alexey_lyashko
    2. Unfortunately, I do not speak Spanish...


