
Re: None

Saturday, 07/27/2002 5:10:27 PM


Post# of 1453
Optimisation on AIM

Optimisation of parameters over a period of past share prices makes sense; there is no disagreement on that. Suppose now that we have a pattern- or trend-recognition subroutine and get a warning that the latest share price falls outside the pattern by a significant degree. This signal indicates that the pattern may be changing.
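
For concreteness, here is a minimal sketch in Python of such a warning signal. The 20-day window, the 2.5-sigma threshold and the function name are my own illustrative assumptions, not anything defined by AIM:

    # Sketch of a trend-deviation warning: flag the latest price when it
    # deviates from the recent window by more than K standard deviations.
    from statistics import mean, stdev

    WINDOW = 20   # look-back length (illustrative assumption)
    K = 2.5       # deviation threshold in sigmas (illustrative assumption)

    def out_of_pattern(prices):
        """Return True if the newest price falls outside the recent pattern."""
        if len(prices) < WINDOW + 1:
            return False                     # not enough history to judge
        recent = prices[-WINDOW - 1:-1]      # window excluding the newest price
        m, s = mean(recent), stdev(recent)
        return s > 0 and abs(prices[-1] - m) > K * s

Any pattern-recognition subroutine could be substituted here; the point is only that it emits a yes/no warning for the latest price.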

I see two options for interpreting this:

1) The pattern (trend) is changing;
2) The pattern (trend) is not changing.

The choice between these must come from other signals if you want to choose between them. I see at least two options as a possible course of action:

A) You act on the buy/sell advice as you always do. You decide to optimise your AIM, as the out-of-trend price may be part of a trend change. The new AIM parameters will then include the latest price.

B) You act on the buy/sell advice as usual but decide that the latest price is a fluke, and you do not optimise.

With the next entries of new prices it will become evident in which direction the price trend is going, and it will be possible to identify whether the price under case A) or B) was a fluke. If the price under A) was not a fluke, the optimisation was effective. If the price under A) was a fluke, the optimisation was not effective and case B) was the better option. OK, those are the breaks; either way could have occurred.
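
As an illustration of that hindsight judgement, a small Python sketch: once a few new prices have arrived, the flagged price is called a fluke if the later prices sit closer to the old level than to the flagged price. The 5-point reference window is an arbitrary choice of mine:

    from statistics import mean

    def was_fluke(prices_before, flagged_price, prices_after):
        """Hindsight check: did the series return to its old level (fluke),
        or did it stay near the flagged price (trend change)?"""
        base = mean(prices_before[-5:])   # level just before the flag
        follow = mean(prices_after)       # level established afterwards
        return abs(follow - base) < abs(follow - flagged_price)

For example, was_fluke([10, 10, 10, 10, 10], 14, [10.1, 9.9, 10.0]) returns True: the price snapped back, so case B) would have been the better option.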

The question arises: if you are going to optimise again, you may by then know that the previous optimisation was not effective (the price in case A) was a fluke), and if you want to do the optimisation now for other reasons, you are stuck with stock data that includes a fluke price. In order to prevent this from affecting the optimisation, you would want to adjust the fluky price to conform to the trend that was present at the time.

There is of course nothing wrong with such a procedure, but it requires that all the data points considered fluky in relation to the trend are identified and adjusted, so that the latest optimisation is done with adjusted stock prices. This would, in my opinion, give much better parameter settings as long as the trend continues.
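
A sketch of such an adjustment step, assuming for illustration that the underlying trend is a straight line fitted through the non-flagged points (any other trend model could be substituted):

    # Replace flagged ("fluky") prices with values taken from a straight-line
    # trend fitted through the non-flagged points, so that a later
    # optimisation run sees an adjusted price series.
    def adjust_flagged(prices, flagged):
        """prices: list of floats; flagged: set of indices judged flukes."""
        xs = [i for i in range(len(prices)) if i not in flagged]
        ys = [prices[i] for i in xs]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
                 / sum((x - mx) ** 2 for x in xs))
        intercept = my - slope * mx
        return [intercept + slope * i if i in flagged else p
                for i, p in enumerate(prices)]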

Has anyone thought about such a procedure for optimising the optimisation procedure?

I am addressing this from the technical perspective of data interpretation. It is not uncommon to remove data points from a test-case data set if it is believed that these points are out of character, and this is quite legitimate. For stock data the argument is no different.

The problem with stock data is that it is not always possible to recognise a fluke from an individual point in isolation, so the questionable points have to be identified at the time they occur. This means flagging questionable data so that these points can later be traced and modified as required when the optimisation is done.
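
One way to do that flagging at entry time, sketched in Python; the class and field names are hypothetical:

    # Record questionable prices as they occur, so they can be traced and
    # adjusted later when the optimisation is done.
    from dataclasses import dataclass, field

    @dataclass
    class PriceLog:
        prices: list = field(default_factory=list)
        flags: dict = field(default_factory=dict)   # index -> reason

        def add(self, price, suspicious=False, reason=""):
            self.prices.append(price)
            if suspicious:
                self.flags[len(self.prices) - 1] = reason

    log = PriceLog()
    log.add(10.10)
    log.add(14.20, suspicious=True, reason="outside 20-day pattern")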

Obviously this requires more work, but as long as it improves the yield for well-behaved stocks it could be worth the trouble.

As part of this procedure I would think of using only the stock-history period that shows the characteristics of the recent trend, and rejecting strongly deviating data that does not support the latest trend.
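
A sketch of that selection, treating the recent trend as a level band for simplicity (the 10-day reference and the 2-sigma cut are my own assumptions): walk backwards from the latest prices and stop where an older price deviates too strongly from the recent level.

    from statistics import mean, stdev

    def recent_trend_window(prices, ref=10, k=2.0):
        """Keep only the most recent stretch of history consistent with
        the latest trend; older, strongly deviating data is rejected."""
        if len(prices) <= ref:
            return prices[:]
        base = prices[-ref:]
        m, s = mean(base), stdev(base)
        start = len(prices) - ref
        while start > 0 and abs(prices[start - 1] - m) <= k * s:
            start -= 1
        return prices[start:]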

Are there any problematic aspects in what I propose?


Conrad

Conrad Winkelman
What is Vortex AIMing? Look for my Vortex Discussion Forum:
http://investorshub.advfn.com/boards/board.asp?board_id=1341
