- The number of stocks contributing to new highs continues to deteriorate, and we haven't seen any improvement despite the gradual upside in the stock indices.
Stocks above the long-term average:
Swings to extremes in breadth and sentiment have been a good contrarian market indicator in recent years. Every time the index of stocks above or below their 200-day average reached an extreme of 90-100 percent, a correction in the S&P 500 was close.
Instances where the index started falling and diverging from the S&P 500 (indicating that internal strength was weakening) were signals that at least a hefty correction would follow. The chart below illustrates these divergences: prolonged divergences were followed by more severe downside moves, while minor divergences were followed by milder corrections.
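For anyone who wants to experiment with the reading described above, here is a toy sketch of "share of stocks above their own long-term average". The universe, prices, and short window are made up purely for illustration; with real data you would feed in daily closes for all index members and use a 200-day window.

```python
# Toy sketch (made-up prices): percent of stocks whose latest close is
# above their own simple moving average -- the "stocks above the
# long-term average" breadth reading discussed above.

def sma(closes, window=200):
    """Simple moving average over the last `window` closes."""
    tail = closes[-window:]
    return sum(tail) / len(tail)

def pct_above_long_term_avg(close_by_stock, window=200):
    """Percent of stocks trading above their own moving average."""
    above = sum(
        1 for closes in close_by_stock.values()
        if closes[-1] > sma(closes, window)
    )
    return 100.0 * above / len(close_by_stock)

# Hypothetical three-stock universe, shortened window for illustration.
universe = {
    "AAA": [10, 11, 12, 13],  # rising: above its average
    "BBB": [20, 18, 16, 14],  # falling: below its average
    "CCC": [30, 30, 30, 31],  # flat with a late uptick: just above
}
print(round(pct_above_long_term_avg(universe, window=4), 1))  # 66.7
```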
- The ratio remains biased to the downside, failing to confirm rallies in the SPX.
Subscribe to my trading portal: http://thefxchannel.com/
My best regards,
Latest trading ideas: https://www.youtube.com/channel/UC02s7uiUQl55HkhHZDM6eUw/videos
Free Telegram Channel: https://t.me/technician_forex_trades
These are intended to be overall secular indicators that give me general signals about underlying RELATIVE strength or weakness (averages, not absolutes).
1) The number of stocks above their 200-day average.
Your first hint: "what if the 200 day is sharply declining but price has just popped up above it by 5 cents after a surprise earnings beat?"
This is an exception, and I don't count on exceptions when reading an indicator. Exceptions that may or may not happen are not a trend, and even when they do happen they will be very limited in number.
Your second hint, the sideways market: "..you essentially have built a big, very slow oscillator that incorrectly shows 'overbought'....etc"
Totally incorrect. As you can see, the S5TH index reaches overbought and oversold conditions only in strongly trending periods, which negates your assumption.
Point 1: "Given that TV doesnt have parenthesis, what is really happening is HIGN/(spx*100). this means the fraction is not being normalized to a percentile as intended. Flaw #1."
Example: the current reading for HIGN is 58, so 58/2107 x 100 = 2.75% (which is the reading on the chart).
Point 2: There are instances where the index reached above 100% because the number of stocks making new highs was higher than the SPX value. What I am trying to say is that it is all about relativity; it illustrates general trends.
I disagree with your conclusions (again, of course, respectfully :D )
The heart of my point still stands --
You are taking a static numerator, dividing it by a growing denominator, and citing the output as weakening breadth. The price of the index is not, in and of itself, related to breadth: the index can be high while breadth is great, and the index can be low while breadth is poor. This non-apples-to-apples comparison means the data is not normalized. What happens when the index is at 800? Is breadth then great? What happens when the index is at 10,000 in fifty years? The lowest breadth ever? At this point I would say this is not my opinion but simply a mathematical reality: a capped numerator over a growing denominator yields an ever-smaller output. Apples/oranges = not normalized.
Anyway, what you do with that info is up to you, but as I said, these models can be used very effectively and I like to see their continued development.
It's obvious that the chart above disagrees with what you are suggesting. If you look at the dates near March 2009, when we had a trough in the SPX, you can clearly see that the ratio was at its lowest levels as well. That is because it is all relative: the number of stocks making new highs was declining as much as the SPX was.
I understand your point here, and I agree to a certain extent. However, for the time being it is not a concern for me, as I see things from a relative perspective.
That said, adding some sort of multiplier to HIGN might make this more accurate.
Technician's analysis is routinely excellent, so nothing new there, but I find fault with the conclusions presented in this chart. I'm not really arguing the point itself, just how he got there. Anyway, if it gives you something to think about, then that is all that is relevant.
Starting my critiques from the bottom up, since it's easier:
1.) Count of S&P 500 stocks above the 200-day average: this isn't a bad proxy per se, but it doesn't account for the context of price relative to the 200-day. What if the 200-day is sharply declining, but price has just popped 5 cents above it after a surprise earnings beat? Surely that is a different context and reality than a strong stock in a multi-year uptrend that has been above its sharply rising 200-day for the last 13 months.
Or maybe the 200-day is completely flat, with zero slope.
Suppose that in a particular market environment (like the exact one we're in now) the indices have been range-bound for going on six months. This implies that some parts of the range will be above the 200-day and others below, but neither is indicative of a major trend. Most stocks in the S&P 500, in aggregate, follow the S&P 500, so you can surmise that in general the 200-day MAs of many stocks are flattening out, relatively speaking. The result is that you have essentially built a big, very slow oscillator that incorrectly shows "overbought" just because a marginal portion of range-bound stocks, still in a range mind you, are at a range high.
I'm sparing you the math on all this, but it works out that way.
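A purely synthetic sketch of that concern, for anyone who wants to see it play out: every stock here oscillates in a range around a flat long-term average, and the count above that average still swings from one extreme to the other. The 120-day cycle, 5-point amplitude, and phase offsets are all arbitrary assumptions, not real market parameters.

```python
# Synthetic illustration of the "slow oscillator" problem: range-bound
# stocks with flat averages still drive the breadth count to extremes.
import math

FLAT_AVG = 100.0   # assume each stock's long-term average is flat at 100
N_STOCKS = 50

def pct_above_flat_avg(day):
    above = 0
    for i in range(N_STOCKS):
        # Range-bound price: no trend, just a slow cycle around the MA.
        price = FLAT_AVG + 5 * math.sin(2 * math.pi * day / 120 + 0.01 * i)
        if price > FLAT_AVG:
            above += 1
    return 100.0 * above / N_STOCKS

print(pct_above_flat_avg(30))  # near the range top: reads "overbought"
print(pct_above_flat_avg(90))  # near the range bottom: reads "oversold"
```

Neither extreme reading reflects any trend; the stocks never left their range.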
>>>>>>>>>>> TLDR: A count of stocks above the 200-day by itself doesn't provide enough contextual information to know whether a high or low reading of that count is significant.
2.) The second pane, HIGN/SPX*100, doesn't make sense. I know what the goal is, but bluntly, the formula is incorrect. Given that TV doesn't have parentheses, what is really happening is HIGN/(SPX*100). This means the fraction is not being normalized to a percentage as intended. Flaw #1.
The NYSE Composite has approximately 1,800 stocks, of which, if memory serves, typically *ONLY 6-8% on average* make new highs at any given moment. So call that 7% of 1,800, roughly 125, as the long-term, 3-5 year average. This is a low percentage in general.
When it measures new highs, it means that day only. If a stock made a 52-week high yesterday and today's high is ONE PENNY lower, it is no longer a "new high". Yet anyone looking at it would still say it is at a 52-week high.
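That one-day definition can be sketched as follows; the 252-day lookback (roughly 52 trading weeks) and the toy price series are my assumptions for illustration:

```python
# Toy sketch of the one-day definition of a "new high": a stock a penny
# below yesterday's 52-week high drops out of today's count entirely.

def is_new_52wk_high(daily_highs, lookback=252):
    """True only if the latest high strictly exceeds every prior high
    in the lookback window."""
    window = daily_highs[-lookback:]
    *prior, today = window
    return today > max(prior)

highs = [95.0] * 250 + [100.00, 99.99]
print(is_new_52wk_high(highs))                     # False: a penny short
print(is_new_52wk_high([95.0] * 251 + [100.00]))   # True on the day itself
```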
>>>>>>>>>> In the last 660 trading days, ONLY 20 DAYS had more than 20% of stocks at a 52-week high; in other words, only 3% of the time. In the same 660 days, only 177 days had more than 10% at a new high (about 27% of the time).
Despite the market being up 67% over the measurement period, only a measly quarter of days had more than 10% of stocks making a new high. That stat right there shows why this is a poor measure for what Technician is trying to accomplish, which is the breadth of new highs over time. Even in the healthiest environments, only a small fraction of stocks make a literal new high on any given day.
The calc is taking a new-high count, let's call it 65, which I think is roughly today's reading, and dividing it by 2000? Why? If anything, it should divide by the exact number of stocks in the NYSE Composite to at least get a percentage reading. Or use the symbol for new S&P 500 highs and divide by 500. But given that the SPX is a market-weighted index, it (respectfully) makes no sense to divide by its last price.
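The suggested fix can be sketched like this. The member count of 1867 is the rough figure used later in this thread, not a verified number:

```python
# Sketch of the proposed normalization: divide the new-high count by the
# number of index members, not by the index's price level.

def pct_making_new_highs(new_high_count, n_members=1867):
    """New highs as a true percentage, bounded between 0 and 100."""
    return 100.0 * new_high_count / n_members

print(round(pct_making_new_highs(65), 2))  # a 65-stock day is ~3.48%
print(pct_making_new_highs(0))             # floor: 0.0
print(pct_making_new_highs(1867))          # ceiling: 100.0
```

Unlike the price-divided version, this reading has fixed bounds, so levels are comparable across years no matter where the index trades.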
Lastly, from a subjectivity standpoint, the red boxes drawn span in some cases 4-6 quarters, over which time the market rose 10-20% in some cases. Missing out on that because of a misconstrued breadth measure... well, that would be disappointing.
>>>>>>>>>>> TLDR: The formula doesn't show anything.
I know this was a long write-up, but I figured the goal of the community is to share thought processes and rationale in a helpful way, so hopefully this allows for continued development of these types of models in the future.
How did you come to this conclusion?
If I studied the same math in school as everybody else, then I still remember that multiplication and division have equal precedence and are evaluated left to right, so HIGN/SPX*100 already means (HIGN/SPX)*100 and you get the intended result.
And although TV does not accept parentheses, i.e. it is not possible to enter the equation in the form you wanted, (HIGN/SPX)*100, one can rearrange its components to 100*HIGN/SPX. It creates the same graph, as expected, because HIGN/SPX*100 = 100*HIGN/SPX.
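This is quick to check in any left-to-right language (Python here, with the HIGN = 58 and SPX = 2107 example numbers quoted earlier in the thread):

```python
# Left-to-right evaluation check for the HIGN/SPX*100 formula.
HIGN, SPX = 58, 2107

a = HIGN / SPX * 100    # parsed as (HIGN / SPX) * 100
b = 100 * HIGN / SPX    # parsed as (100 * HIGN) / SPX
c = HIGN / (SPX * 100)  # the parse the critique assumed

print(round(a, 2))  # 2.75, matching the reading cited from the chart
print(round(b, 2))  # 2.75, same graph either way
print(c)            # a tiny fraction, not what the chart shows
```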
Check: the peak at 2010-04-19 is still the same, around 45, an absolute value rather than a percentage.
As for the rest, I value the professional feedback from both of you.
Since he is dividing by the index, which is just a price subject to its own geometric growth, the divergence he cites (i.e. the readings getting smaller as the index rises) is a mathematical certainty: the denominator is literally getting bigger while the numerator stays the same.
Example: let's say, for simplicity, that this 20% figure equals 500 new highs, at the upper end of the likely spectrum, with the SPX index at 1000. His ratio: 500/(1000*100) = 0.005. Now fast forward: the market is just as bullish, another 97th-percentile reading with 500 new highs, but the index has appreciated to 2000. 500/(2000*100) = 0.0025. So the *same* breadth is punished now because the index went up? There is no relationship to glean from this figure.
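That example, made runnable. This uses the HIGN/(SPX*100) parse assumed in the critique; the point (a price-based denominator shrinking the reading) applies just as much to HIGN/SPX:

```python
# Identical breadth, halved reading, purely because the index doubled.
def ratio_as_charted(new_highs, spx):
    return new_highs / (spx * 100)

early = ratio_as_charted(500, 1000)  # 0.005
later = ratio_as_charted(500, 2000)  # 0.0025
print(early, later)
print(later == early / 2)  # same breadth, exactly half the reading
```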
That's why the math makes no sense. Aside from 0.0025 not being a unit (it's not a percentile, just an arbitrary number), the construction ensures that the higher the index goes, the smaller the reading. That in and of itself shows the reading isn't normalized, irrespective of whether it makes sense.
To answer your question specifically: in order to make something a percent, you need to normalize it around a base unit, and since you are taking the ratio of two different units, it isn't normalized. If he did HIGN/1867 (which I think is the actual number of NYSE Composite stocks, but whatever the actual count is), THEN *100, you'd have a percent. We know the lowest it can be is 0: 0/1867*100. We know the highest it can be is 100: 1867/1867*100. But taking HIGN divided by the number of oranges in my yard, times 100 (which is the functional equivalent of what's happening), doesn't normalize the numerator into a percentage-based unit.