TS oddities




Hi All

I've been doing some work developing a backtesting system in C# - so I can run more sophisticated tests/analysis than TS is capable of.

I've been comparing numbers between the two systems - to identify bugs in my code & check I've transitioned my strategies correctly.

The system is basically finished; however, through this process I've discovered a bunch of oddities in TS that I thought I'd share, since I know many on this list still depend on TS on a daily basis.

Note that these notes are based on TS2k (with all service packs). While I'm sure later versions of TS have improved in some areas, I would definitely recommend being aware of these problems, particularly if you're doing more sophisticated/complex work.

And yes, I'm aware this isn't a fully comprehensive list of TS issues :)

Hope this helps some of you. This list has been incredible over the years (Gary, Bob, Jim, etc etc), so I'd like to pay a little of that back.

Kind Regards

Simon



-----

The at$ command is basically a GTC order, except that you have to re-enter it daily, like a GFD.

---
A 'Stop' signal isn't a simple stop - it's a stop-and-reverse.

eg, if we're long 6, and then have a sell 4 stop, we end up net short 4 (having closed out the 6), not just long 2, as you'd expect.
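
If you're modelling this yourself, the difference looks like this in C# (a sketch with helper names of my own invention - not TS code):

using System;

// Sketch of the two interpretations. Convention: positive = long, negative = short.

// TS behaviour: the stop reverses the position - the order size becomes the new short.
static int SellStopTsStyle(int position, int size) => -size;

// A plain stop would simply reduce the existing long by the order size.
static int SellStopPlain(int position, int size) => position - size;

Console.WriteLine(SellStopTsStyle(6, 4)); // -4: net short 4
Console.WriteLine(SellStopPlain(6, 4));   //  2: still long 2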

---

Given a bar's OHLC, TS decides the internal order of price movement as follows:

the shortest O->L or O->H distance gets covered first, regardless of overall market direction.

eg, OHLC = 660.15, 662.25, 658.15, 658.50 (S&P500, 19/Sep/1984)

O->H = 2.1, O->L = 2, therefore TS trades internally in this order:

660.15 (O) -> 658.15 (L) -> 662.25 (H) -> 658.50 (C).

(ie, a total market distance travelled of 2 + 4.1 + 3.75 = 9.85)

however, if you look at this bar, it's a downward bar (C < O), which means it's MUCH more likely that the market hit the high first, then reversed, came down, hit the low, and then closed lower, ie:

660.15 (O) -> 662.25 (H) -> 658.15 (L) -> 658.50 (C).

(ie, a total market distance travelled of 2.1 + 4.1 + 0.35 = 6.55)
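
The two heuristics side by side, in C# (my own sketch of the rules above, not TS internals):

using System;

// TS's rule: whichever of O->H / O->L is the shorter distance gets visited first.
static double[] TsPath(double o, double h, double l, double c)
    => (h - o) < (o - l) ? new[] { o, h, l, c } : new[] { o, l, h, c };

// The more plausible rule: let the bar's direction decide - a down bar (C < O)
// most likely went O->H->L->C, an up bar O->L->H->C.
static double[] DirectionPath(double o, double h, double l, double c)
    => c < o ? new[] { o, h, l, c } : new[] { o, l, h, c };

// S&P500, 19/Sep/1984: O->H = 2.1 vs O->L = 2.0, so TS visits the low first.
Console.WriteLine(string.Join(" -> ", TsPath(660.15, 662.25, 658.15, 658.50)));
// 660.15 -> 658.15 -> 662.25 -> 658.5   (9.85 points travelled)
Console.WriteLine(string.Join(" -> ", DirectionPath(660.15, 662.25, 658.15, 658.50)));
// 660.15 -> 662.25 -> 658.15 -> 658.5   (6.55 points travelled)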

---

If you have a stop and the day's open gaps through it, TS fills it AT the open.

Eg, S&P Sell Stop @ 755.3958 (internally it's common for stops to be placed at odd positions - this would just be rounded to 755.30). If the market opens at 755.05, TS fills the order at 755.05.
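
In other words (a C# sketch of the fill rule, naming mine):

using System;

// Fill price for a sell stop under the behaviour described above.
static double? SellStopFill(double stop, double open, double low)
{
    if (open <= stop) return open; // open gapped through the stop: filled AT the open
    if (low <= stop) return stop;  // traded through intraday: filled at the stop price
    return null;                   // stop never touched: no fill
}

Console.WriteLine(SellStopFill(755.30, open: 755.05, low: 750.00)); // 755.05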

---

TS is unpredictable when doing basic maths.

eg, consider this range of 14 Closes (3rd->22nd Aug, 1983, S&P500)

664.45
661.55
662.85
660.65
661.55
662.7
662.7
663.5
664.5
664.25
665.75
663.75
664.95
664.45

if we're doing a moving average of length 13, then the last two results (the window ending in 664.95, and the next day's window ending in 664.45) should be identical, since the value dropping off the window (664.45) equals the value being added.

the correct 13 bar average of these numbers is: 663.319230769231
ie, 8623.15 divided by 13.

however, because this is an unusual number (lots of decimal places), TS doesn't -quite- get the same result when it does it for the 22nd Aug (ie, when the first 664.45 has dropped off, and the last 664.45 has been added on).

Which means unpredictable results when trying to compare what should be two identical numbers. It can also mean problems if you're trying to duplicate exactly what TS is doing.
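
If you're duplicating this in your own code, recomputing each window from scratch in doubles keeps the noise far below anything price-like - a sketch (my own helper, not what TS does internally):

using System;

// Recompute each window fresh in double precision rather than carrying a
// running sum. The error then sits around the 15th significant digit instead
// of the ~7th you get from 4-byte singles (though bitwise equality between
// two windows still isn't guaranteed - see the comparison note further down).
static double Average(double[] closes, int length, int endIndex)
{
    double sum = 0.0;
    for (int i = endIndex - length + 1; i <= endIndex; i++)
        sum += closes[i];
    return sum / length;
}

double[] sp = { 664.45, 661.55, 662.85, 660.65, 661.55, 662.7, 662.7,
                663.5, 664.5, 664.25, 665.75, 663.75, 664.95, 664.45 };

Console.WriteLine(Average(sp, 13, 12).ToString("F9")); // 663.319230769
Console.WriteLine(Average(sp, 13, 13).ToString("F9")); // 663.319230769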

Another example.

KC, Closes for 28/Sep/1990->17/Oct/1990:
167.55
166.95
168.05
168.65
167.7
167.45
168.25
168
165.8
166.1
167.05
167.35
166.4
167.55

should result in the last two Average(Close,13) values being identical, which means that T3 <= T3[1] (T3 here holding the average) should be TRUE. When the numbers are dumped out, they appear identical - namely 167.331 (167.330769230769) - however TS believes that the second occurrence is actually larger, meaning that code depending on this comparison doesn't run when it should.

This has major consequences for systems that depend on relative comparisons.
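
The defensive workaround (in C# here; the same idea is expressible in EasyLanguage) is to never compare such values exactly, but with a tolerance scaled to their magnitude. A sketch - the threshold is my judgment call, not any TS constant, so tune it to the error you expect:

using System;

// Tolerance-based comparison for values that *should* be equal but were
// computed along different arithmetic paths.
static bool LessOrAlmostEqual(double a, double b)
{
    // Scale the tolerance to the operands' magnitude so the same helper works
    // for a ~167 coffee average and a ~15,000 back-adjusted index.
    double tol = 1e-6 * Math.Max(1.0, Math.Max(Math.Abs(a), Math.Abs(b)));
    return a <= b + tol;
}

// The KC case above: both sides print as 167.330769230769, but a raw <= fails in TS.
Console.WriteLine(LessOrAlmostEqual(167.330769230769, 167.330769230769)); // True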

---

When outputting numbers, the NumToStr command does not keep track of decimal places for any number >= 100,000.
Eg, 100000.50 will output as 100000.00.

Additionally, numbers greater than 10,000 (ie, 5 digits to the left of the decimal) are rounded to ONE decimal place.
Eg, 12345.25 will be output as 12345.30.

Internally they are stored correctly, but attempts to output and/or verify against numbers this big will fail.
---

For some reason, certain values are arbitrarily rounded.

This is more-or-less random. I suspect it's an artifact of numbers internally using floats/singles (4 bytes) rather than doubles (8 bytes).

The problem is that it results in a number like a sell at 108.015 being rounded down, while a sell at 103.015 (same market) is rounded up.

ie, it's essentially impossible to predict. This reverberates through all the internal maths. Orders may or may not get hit if they're near the high/low for the day. Multiplications between large numbers lose accuracy fast, etc.

A reasonable rule of thumb is that you only have about 6 accurate significant digits. So, if the market is trading above 10,000, anything below the first decimal place (ie, 0.1) is rubbish. It also means that if you do maths mixing a large number with a small one, the large one will swamp the accuracy of the smaller.
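
You can see the effect directly from C# by round-tripping prices through a 4-byte float - assuming, as suspected above, that TS stores prices as IEEE singles:

using System;

// Round-trip prices through a 4-byte float and print at full double precision
// to expose what a single can actually represent.
foreach (double price in new[] { 108.015, 103.015, 660.15, 100000.50 })
{
    float stored = (float)price;   // the 4-byte representation
    Console.WriteLine($"{price} -> {(double)stored:R}");
}
// 108.015  -> 108.01499938964844   (error lands around the 3rd decimal)
// 103.015  -> 103.01499938964844
// 660.15   -> 660.1500244140625
// 100000.5 -> 100000.5             (exactly representable - which is the trap:
//                                   some values survive, some don't)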

---

Setting max bars back to a large number doesn't ensure that the internal calculations are done before that point - it just doesn't START them any earlier.

Eg, setting SP to MaxBarsBack 100, starting the data at 1/1/1985: mid year, at 12/Jun/1985 (ie, 113 bars in), it still hasn't finished calculating Average(Close,13)[15] - yet unfortunately it DOES start using these results. So, we get entries (buys/sells) occurring before the underlying calculations are settled, regardless of MaxBarsBack.

ie, rather than using the "max bars back" period to calculate and settle all the calculations (thereby starting max bars back in with correct signals), TS starts at max bars back in with calculations that won't blow up (eg, trying to calc average(close,15) on the 12th bar), but that aren't correct yet, because the indicators haven't settled. So, if we reference average(close,15)[3], we'll get 0 until 3 bars past the max bars back setting.
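
When duplicating this, the fix is to run the calculations from bar 0 and gate trading on both constraints. A sketch, using the numbers from the example above:

using System;

// TS starts the chain's calculations AT max bars back, so a reference like
// Average(Close,13)[15] isn't real until 15 bars later - yet it trades anyway.
// A correct engine calculates from bar 0 and only trades once both the
// max-bars-back offset AND the full lookback chain are satisfied.
int chainDepth = 13 + 15;    // bars before Average(Close,13)[15] holds a real value
int maxBarsBack = 100;

int tsSettlesAt = maxBarsBack + 15;                        // 115: 0s returned before this
int safeFirstTradeBar = Math.Max(chainDepth, maxBarsBack); // 100: everything settled

Console.WriteLine($"TS trades from bar {maxBarsBack} but settles at bar {tsSettlesAt}");
Console.WriteLine($"Safe first trade bar: {safeFirstTradeBar}");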

---

TS will corrupt negative prices. Additionally, it will randomly adjust prices to values that aren't possible.

eg, on a market that can only move in 0.25 increments, a value of -1 might become -0.97.

of course, this also throws out any calculations done with these prices.

---

When the minimum price move doesn't divide evenly into the actual market prices (this happens often with back-adjusted data):

eg, min price move is 5/100s, but prices are 107.82, 108.03, etc

then TS calculates indicators based on the actual prices (107.82, etc), but estimates entry points based on the min price move. Eg, above, if a buy order was at 107.82, it would be filled at 107.80 or 107.85, even though the (back-adjusted) price never went there (and couldn't).

This is an awkward case - what's the correct thing to do here? Enter at the numerically unusual but genuine back-adjusted prices, or at the correct granularity (at prices that never actually traded)?
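
For what it's worth, my inclination is to punt and make it an explicit configuration choice - a sketch, names my own:

using System;

// The two defensible behaviours, surfaced as an explicit switch.
static double RoundToTick(double price, double tick)
    => Math.Round(price / tick) * tick;

static double OrderPrice(double rawPrice, double tick, bool snapToGrid)
    // snapToGrid = true:  trade at valid-granularity prices that may never have printed
    // snapToGrid = false: trade at the odd-looking but genuine back-adjusted prices
    => snapToGrid ? RoundToTick(rawPrice, tick) : rawPrice;

Console.WriteLine(OrderPrice(107.82, 0.05, snapToGrid: true).ToString("F2"));  // 107.80
Console.WriteLine(OrderPrice(107.82, 0.05, snapToGrid: false).ToString("F2")); // 107.82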

---

Rounding occurs in odd places resulting in unpredictable behaviour, particularly with back adjusted data.

Eg, IFS (MIB) has a minimum price move of 5.

On the 15th & 16th Jan 1996, both days have a back-adjusted high of 15048 (which is not a multiple of 5). If a long order is placed at the high of the 15th, the order price is rounded to 15050, which TS then decides is outside the range of the 16th, and therefore doesn't get hit. Since the intention is that the system go long at the high, if the next day has the same high then, regardless of the price scaling, the order should be hit.
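
The behaviour I'd argue for: keep the rounded price for reporting, but test the fill against the raw unrounded level. A sketch:

using System;

// Test the fill against the raw (unrounded) level, so "buy at yesterday's high"
// still triggers when today prints the same back-adjusted high, even though
// that high sits off the 5-point tick grid.
static bool BuyStopTouched(double rawStop, double high) => high >= rawStop;

double yesterdaysHigh = 15048;  // 15/Jan/1996 high - not a multiple of 5
double todaysHigh     = 15048;  // 16/Jan/1996 high

Console.WriteLine(BuyStopTouched(yesterdaysHigh, todaysHigh)); // True
// TS instead rounds the order to 15050 first, then finds 15048 < 15050: no fill.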

---