TraderMath
Library "TraderMath"
A collection of essential trading utilities and mathematical functions used for technical analysis,
including DEMA, Fisher Transform, directional movement, and ADX calculations.
dema(source, length)
Calculates the value of the Double Exponential Moving Average (DEMA).
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The double exponentially weighted moving average of the `source`.
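For reference, the standard DEMA construction that such a function implements is 2 × EMA − EMA(EMA), which reduces the lag of a single EMA. A minimal Python sketch of that formula (an illustration, not the library's Pine source):

```python
import numpy as np

def ema(values: np.ndarray, length: int) -> np.ndarray:
    """Standard exponential moving average with alpha = 2 / (length + 1)."""
    alpha = 2.0 / (length + 1)
    out = np.empty(len(values), dtype=float)
    out[0] = values[0]
    for i in range(1, len(values)):
        out[i] = alpha * values[i] + (1 - alpha) * out[i - 1]
    return out

def dema(source: np.ndarray, length: int) -> np.ndarray:
    """DEMA = 2 * EMA(source) - EMA(EMA(source))."""
    e1 = ema(source, length)
    e2 = ema(e1, length)
    return 2 * e1 - e2
```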
roundVal(val)
Constrains a value to the range [-0.999, 0.999].
Parameters:
val (float) : (float) Value to constrain.
Returns: (float) Value limited to the range [-0.999, 0.999].
fisherTransform(length)
Computes the Fisher Transform oscillator, enhancing turning point sensitivity.
Parameters:
length (int) : (int) Lookback length used to normalize price within the high-low range.
Returns: (float) Fisher Transform value.
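Functions like this one typically follow Ehlers' original Fisher Transform recursion. A Python sketch under that assumption; the 0.33/0.67 smoothing constants, the hl2 input, and the ±0.999 clamp are taken from Ehlers' formulation, not confirmed details of this library:

```python
import numpy as np

def fisher_transform(high: np.ndarray, low: np.ndarray, length: int) -> np.ndarray:
    """Ehlers-style Fisher Transform of the bar midpoint (hl2)."""
    hl2 = (high + low) / 2.0
    fish = np.zeros(len(hl2))
    value = 0.0
    for i in range(len(hl2)):
        window = hl2[max(0, i - length + 1): i + 1]
        lo, hi = window.min(), window.max()
        rng = max(hi - lo, 1e-10)                # guard against a zero range
        raw = 2.0 * ((hl2[i] - lo) / rng - 0.5)  # normalize into [-1, 1]
        value = 0.33 * raw + 0.67 * value        # Ehlers' smoothing constants
        value = max(min(value, 0.999), -0.999)   # the roundVal clamp
        prev = fish[i - 1] if i > 0 else 0.0
        fish[i] = 0.5 * np.log((1 + value) / (1 - value)) + 0.5 * prev
    return fish
```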
dirmov(len)
Calculates the Plus and Minus Directional Movement components (DI+ and DI−).
Parameters:
len (simple int) : (int) Lookback length for directional movement.
Returns: (float[]) Array containing the Plus and Minus Directional Movement values [DI+, DI−].
adx(dilen, adxlen)
Computes the Average Directional Index (ADX) based on DI+ and DI−.
Parameters:
dilen (simple int) : (int) Lookback length for directional movement calculation.
adxlen (simple int) : (int) Smoothing length for ADX computation.
Returns: (float) Average Directional Index value (0–100).
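These two entries describe Wilder's classic directional-movement system. A Python sketch under that assumption (Wilder's RMA smoothing with alpha = 1/length, true range in the denominator); again an illustration rather than the library's exact Pine source:

```python
import numpy as np

def rma(values: np.ndarray, length: int) -> np.ndarray:
    """Wilder's smoothing (RMA): an EMA with alpha = 1 / length."""
    out = np.empty(len(values), dtype=float)
    out[0] = values[0]
    a = 1.0 / length
    for i in range(1, len(values)):
        out[i] = a * values[i] + (1 - a) * out[i - 1]
    return out

def dirmov(high, low, close, length):
    """Return (+DI, -DI) per Wilder's directional-movement definition."""
    up = np.diff(high, prepend=high[0])
    dn = -np.diff(low, prepend=low[0])
    plus_dm = np.where((up > dn) & (up > 0), up, 0.0)
    minus_dm = np.where((dn > up) & (dn > 0), dn, 0.0)
    prev_close = np.roll(close, 1)
    prev_close[0] = close[0]
    tr = np.maximum(high - low,
                    np.maximum(np.abs(high - prev_close), np.abs(low - prev_close)))
    atr = rma(tr, length)
    plus_di = 100 * rma(plus_dm, length) / atr
    minus_di = 100 * rma(minus_dm, length) / atr
    return plus_di, minus_di

def adx(high, low, close, dilen, adxlen):
    """ADX: Wilder-smoothed |DI+ - DI-| / (DI+ + DI-), scaled to 0-100."""
    plus, minus = dirmov(high, low, close, dilen)
    dx = 100 * np.abs(plus - minus) / np.maximum(plus + minus, 1e-10)
    return rma(dx, adxlen)
```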
Dynamic Equity Allocation Model
"Cash is Trash"? Not Always. Here's Why Science Beats Guesswork.
Every retail trader knows the frustration: you draw support and resistance lines, you spot patterns, you follow market gurus on social media—and still, when the next bear market hits, your portfolio bleeds red. Meanwhile, institutional investors seem to navigate market turbulence with ease, preserving capital when markets crash and participating when they rally. What's their secret?
The answer isn't insider information or access to exotic derivatives. It's systematic, scientifically validated decision-making. While most retail traders rely on subjective chart analysis and emotional reactions, professional portfolio managers use quantitative models that remove emotion from the equation and process multiple streams of market information simultaneously.
This document presents exactly such a system—not a proprietary black box available only to hedge funds, but a fully transparent, academically grounded framework that any serious investor can understand and apply. The Dynamic Equity Allocation Model (DEAM) synthesizes decades of financial research from Nobel laureates and leading academics into a practical tool for tactical asset allocation.
Stop drawing colorful lines on your chart and start thinking like a quant. This isn't about predicting where the market goes next week—it's about systematically adjusting your risk exposure based on what the data actually tells you. When valuations scream danger, when volatility spikes, when credit markets freeze, when multiple warning signals align—that's when cash isn't trash. That's when cash saves your portfolio.
The irony of "cash is trash" rhetoric is that it ignores timing. Yes, being 100% cash for decades would be disastrous. But being 100% equities through every crisis is equally foolish. The sophisticated approach is dynamic: aggressive when conditions favor risk-taking, defensive when they don't. This model shows you how to make that decision systematically, not emotionally.
Whether you're managing your own retirement portfolio or seeking to understand how institutional allocation strategies work, this comprehensive analysis provides the theoretical foundation, mathematical implementation, and practical guidance to elevate your investment approach from amateur to professional.
The choice is yours: keep hoping your chart patterns work out, or start using the same quantitative methods that professionals rely on. The tools are here. The research is cited. The methodology is explained. All you need to do is read, understand, and apply.
The Dynamic Equity Allocation Model (DEAM) is a quantitative framework for systematic allocation between equities and cash, grounded in modern portfolio theory and empirical market research. The model integrates five scientifically validated dimensions of market analysis—market regime, risk metrics, valuation, sentiment, and macroeconomic conditions—to generate dynamic allocation recommendations ranging from 0% to 100% equity exposure. This work documents the theoretical foundations, mathematical implementation, and practical application of this multi-factor approach.
1. Introduction and Theoretical Background
1.1 The Limitations of Static Portfolio Allocation
Traditional portfolio theory, as formulated by Markowitz (1952) in his seminal work "Portfolio Selection," assumes an optimal static allocation where investors distribute their wealth across asset classes according to their risk aversion. This approach rests on the assumption that returns and risks remain constant over time. However, empirical research demonstrates that this assumption does not hold in reality. Fama and French (1989) showed that expected returns vary over time and correlate with macroeconomic variables such as the spread between long-term and short-term interest rates. Campbell and Shiller (1988) demonstrated that the price-earnings ratio possesses predictive power for future stock returns, providing a foundation for dynamic allocation strategies.
The academic literature on tactical asset allocation has evolved considerably over recent decades. Ilmanen (2011) argues in "Expected Returns" that investors can improve their risk-adjusted returns by considering valuation levels, business cycles, and market sentiment. The Dynamic Equity Allocation Model presented here builds on this research tradition and operationalizes these insights into a practically applicable allocation framework.
1.2 Multi-Factor Approaches in Asset Allocation
Modern financial research has shown that different factors capture distinct aspects of market dynamics and together provide a more robust picture of market conditions than individual indicators. Ross (1976) developed the Arbitrage Pricing Theory, a model that employs multiple factors to explain security returns. Following this multi-factor philosophy, DEAM integrates five complementary analytical dimensions, each tapping different information sources and collectively enabling comprehensive market understanding.
2. Data Foundation and Data Quality
2.1 Data Sources Used
The model draws its data exclusively from publicly available market data via the TradingView platform. This transparency and accessibility is a significant advantage over proprietary models that rely on non-public data. The data foundation encompasses several categories of market information, each capturing specific aspects of market dynamics.
First, price data for the S&P 500 Index is obtained through the SPDR S&P 500 ETF (ticker: SPY). The use of a highly liquid ETF instead of the index itself is a practical choice: ETF data is available in real time and reflects actual tradability. In addition to closing prices, high, low, and volume data are captured, as these are required for calculating advanced volatility measures.
Fundamental corporate metrics are retrieved via TradingView's Financial Data API. These include earnings per share, price-to-earnings ratio, return on equity, debt-to-equity ratio, dividend yield, and share buyback yield. Cochrane (2011) emphasizes in "Presidential Address: Discount Rates" the central importance of valuation metrics for forecasting future returns, making these fundamental data a cornerstone of the model.
Volatility indicators are represented by the CBOE Volatility Index (VIX) and related metrics. The VIX, often referred to as the market's "fear gauge," measures the implied volatility of S&P 500 index options and serves as a proxy for market participants' risk perception. Whaley (2000) describes in "The Investor Fear Gauge" the construction and interpretation of the VIX and its use as a sentiment indicator.
Macroeconomic data includes yield curve information through US Treasury bonds of various maturities and credit risk premiums through the spread between high-yield bonds and risk-free government bonds. These variables capture the macroeconomic conditions and financing conditions relevant for equity valuation. Estrella and Hardouvelis (1991) showed that the shape of the yield curve has predictive power for future economic activity, justifying the inclusion of these data.
2.2 Handling Missing Data
A practical problem when working with financial data is dealing with missing or unavailable values. The model implements a fallback system where a plausible historical average value is stored for each fundamental metric. When current data is unavailable for a specific point in time, this fallback value is used. This approach ensures that the model remains functional even during temporary data outages and avoids systematic biases from missing data. The use of average values as fallback is conservative, as it generates neither overly optimistic nor pessimistic signals.
3. Component 1: Market Regime Detection
3.1 The Concept of Market Regimes
The idea that financial markets exist in different "regimes" or states that differ in their statistical properties has a long tradition in financial science. Hamilton (1989) developed regime-switching models that allow distinguishing between different market states with different return and volatility characteristics. The practical application of this theory consists of identifying the current market state and adjusting portfolio allocation accordingly.
DEAM classifies market regimes using a scoring system that considers three main dimensions: trend strength, volatility level, and drawdown depth. This multidimensional view is more robust than focusing on individual indicators, as it captures various facets of market dynamics. Classification occurs into six distinct regimes: Strong Bull, Bull Market, Neutral, Correction, Bear Market, and Crisis.
3.2 Trend Analysis Through Moving Averages
Moving averages are among the oldest and most widely used technical indicators and have also received attention in academic literature. Brock, Lakonishok, and LeBaron (1992) examined in "Simple Technical Trading Rules and the Stochastic Properties of Stock Returns" the profitability of trading rules based on moving averages and found evidence for their predictive power, although later studies questioned the robustness of these results when considering transaction costs.
The model calculates three moving averages with different time windows: a 20-day average (approximately one trading month), a 50-day average (approximately one quarter), and a 200-day average (approximately one trading year). The relationship of the current price to these averages and the relationship of the averages to each other provide information about trend strength and direction. When the price trades above all three averages and the short-term average is above the long-term, this indicates an established uptrend. The model assigns points based on these constellations, with longer-term trends weighted more heavily as they are considered more persistent.
3.3 Volatility Regimes
Volatility, understood as the standard deviation of returns, is a central concept of financial theory and serves as the primary risk measure. However, research has shown that volatility is not constant but changes over time and occurs in clusters—a phenomenon first documented by Mandelbrot (1963) and later formalized through ARCH and GARCH models (Engle, 1982; Bollerslev, 1986).
DEAM calculates volatility not only through the classic method of return standard deviation but also uses more advanced estimators such as the Parkinson estimator and the Garman-Klass estimator. These methods utilize intraday information (high and low prices) and are more efficient than simple close-to-close volatility estimators. The Parkinson estimator (Parkinson, 1980) uses the range between high and low of a trading day and is based on the recognition that this information reveals more about true volatility than just the closing price difference. The Garman-Klass estimator (Garman and Klass, 1980) extends this approach by additionally considering opening and closing prices.
The calculated volatility is annualized by multiplying it by the square root of 252 (the average number of trading days per year), enabling standardized comparability. The model compares current volatility with the VIX, the implied volatility from option prices. A low VIX (below 15) signals market comfort and increases the regime score, while a high VIX (above 35) indicates market stress and reduces the score. This interpretation follows the empirical observation that elevated volatility is typically associated with falling markets (Schwert, 1989).
3.4 Drawdown Analysis
A drawdown refers to the percentage decline from the highest point (peak) to the lowest point (trough) during a specific period. This metric is psychologically significant for investors as it represents the maximum loss experienced. Young (1991) introduced the Calmar Ratio, which relates return to maximum drawdown, underscoring the practical relevance of this metric.
The model calculates current drawdown as the percentage distance from the highest price of the last 252 trading days (one year). A drawdown below 3% is considered negligible and maximally increases the regime score. As drawdown increases, the score decreases progressively, with drawdowns above 20% classified as severe and indicating a crisis or bear market regime. These thresholds are empirically motivated by historical market cycles, in which corrections typically encompassed 5-10% drawdowns, bear markets 20-30%, and crises over 30%.
3.5 Regime Classification
Final regime classification occurs through aggregation of scores from trend (40% weight), volatility (30%), and drawdown (30%). The higher weighting of trend reflects the empirical observation that trend-following strategies have historically delivered robust results (Moskowitz, Ooi, and Pedersen, 2012). A total score above 80 signals a strong bull market with established uptrend, low volatility, and minimal losses. At a score below 10, a crisis situation exists requiring defensive positioning. The six regime categories enable a differentiated allocation strategy that not only distinguishes binarily between bullish and bearish but allows gradual gradations.
4. Component 2: Risk-Based Allocation
4.1 Volatility Targeting as Risk Management Approach
The concept of volatility targeting is based on the idea that investors should maximize not returns but risk-adjusted returns. Sharpe (1966, 1994) defined with the Sharpe Ratio the fundamental concept of return per unit of risk, measured as volatility. Volatility targeting goes a step further and adjusts portfolio allocation to achieve constant target volatility. This means that in times of low market volatility, equity allocation is increased, and in times of high volatility, it is reduced.
Moreira and Muir (2017) showed in "Volatility-Managed Portfolios" that strategies that adjust their exposure based on volatility forecasts achieve higher Sharpe Ratios than passive buy-and-hold strategies. DEAM implements this principle by defining a target portfolio volatility (default 12% annualized) and adjusting equity allocation to achieve it. The mathematical foundation is simple: if market volatility is 20% and target volatility is 12%, equity allocation should be 60% (12/20 = 0.6), with the remaining 40% held in cash with zero volatility.
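A minimal Python sketch of this rule, assuming volatilities are expressed as annualized decimals:

```python
def vol_target_allocation(market_vol: float, target_vol: float = 0.12) -> float:
    """Equity weight that scales exposure so portfolio vol is near the target."""
    if market_vol <= 0:
        return 1.0
    return min(target_vol / market_vol, 1.0)  # cap at fully invested

# Example from the text: 20% market vol, 12% target -> 60% equities
print(vol_target_allocation(0.20))  # 0.6
```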
4.2 Market Volatility Calculation
Estimating current market volatility is central to the risk-based allocation approach. The model uses several volatility estimators in parallel and selects the higher value between traditional close-to-close volatility and the Parkinson estimator. This conservative choice ensures the model does not underestimate true volatility, which could lead to excessive risk exposure.
Traditional volatility calculation uses logarithmic returns, as these have mathematically advantageous properties (additive linkage over multiple periods). The logarithmic return is calculated as ln(P_t / P_{t-1}), where P_t is the price at time t. The standard deviation of these returns over a rolling 20-trading-day window is then multiplied by √252 to obtain annualized volatility. This annualization is based on the assumption of independently identically distributed returns, which is an idealization but widely accepted in practice.
The Parkinson estimator uses additional information from the trading range (High minus Low) of each day. The formula is: σ_P = (1/√(4ln2)) × √(1/n × Σln²(H_i/L_i)) × √252, where H_i and L_i are high and low prices. Under ideal conditions, this estimator is approximately five times more efficient than the close-to-close estimator (Parkinson, 1980), as it uses more information per observation.
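Both estimators are straightforward to reproduce. The following Python sketch implements the close-to-close and Parkinson formulas as described; per the text, the model would then take the larger of the two values:

```python
import numpy as np

def close_to_close_vol(close: np.ndarray, window: int = 20) -> float:
    """Annualized stdev of log returns over the last `window` bars."""
    rets = np.diff(np.log(close))[-window:]
    return np.std(rets, ddof=1) * np.sqrt(252)

def parkinson_vol(high: np.ndarray, low: np.ndarray, window: int = 20) -> float:
    """Parkinson (1980) range-based estimator, annualized."""
    hl = np.log(high[-window:] / low[-window:]) ** 2
    return np.sqrt(hl.mean() / (4 * np.log(2))) * np.sqrt(252)
```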
4.3 Drawdown-Based Position Size Adjustment
In addition to volatility targeting, the model implements drawdown-based risk control. The logic is that deep market declines often signal further losses and therefore justify exposure reduction. This behavior corresponds with the concept of path-dependent risk tolerance: investors who have already suffered losses are typically less willing to take additional risk (Kahneman and Tversky, 1979).
The model defines a maximum portfolio drawdown as a target parameter (default 15%). Since portfolio volatility and portfolio drawdown are proportional to equity allocation (assuming cash has neither volatility nor drawdown), allocation-based control is possible. For example, if the market exhibits a 25% drawdown and target portfolio drawdown is 15%, equity allocation should be at most 60% (15/25).
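In code, the drawdown constraint reduces to a simple ratio cap (drawdowns as positive decimals):

```python
def drawdown_cap(market_dd: float, target_dd: float = 0.15) -> float:
    """Max equity weight so the portfolio drawdown stays near target_dd."""
    if market_dd <= target_dd:
        return 1.0
    return target_dd / market_dd

# Example from the text: 25% market drawdown, 15% target -> 60% cap
print(drawdown_cap(0.25))  # 0.6
```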
4.4 Dynamic Risk Adjustment
An advanced feature of DEAM is dynamic adjustment of risk-based allocation through a feedback mechanism. The model continuously estimates what actual portfolio volatility and portfolio drawdown would result at the current allocation. If risk utilization (ratio of actual to target risk) exceeds 1.0, allocation is reduced by an adjustment factor that grows exponentially with overutilization. This implements a form of dynamic feedback that avoids overexposure.
Mathematically, a risk adjustment factor r_adjust is calculated: if risk utilization u > 1, then r_adjust = exp(-0.5 × (u - 1)). This exponential function ensures that moderate overutilization is gently corrected, while strong overutilization triggers drastic reductions. The factor 0.5 in the exponent was empirically calibrated to achieve a balanced ratio between sensitivity and stability.
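A direct Python transcription of this adjustment factor:

```python
import math

def risk_adjustment(utilization: float) -> float:
    """exp(-0.5 * (u - 1)) for u > 1; no reduction while within budget."""
    return math.exp(-0.5 * (utilization - 1.0)) if utilization > 1.0 else 1.0

for u in (1.0, 1.2, 1.5, 2.0):
    print(u, round(risk_adjustment(u), 3))  # gentle at first, drastic later
```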
5. Component 3: Valuation Analysis
5.1 Theoretical Foundations of Fundamental Valuation
DEAM's valuation component is based on the fundamental premise that the intrinsic value of a security is determined by its future cash flows and that deviations between market price and intrinsic value are eventually corrected. Graham and Dodd (1934) established in "Security Analysis" the basic principles of fundamental analysis that remain relevant today. Translated into modern portfolio context, this means that markets with high valuation metrics (high price-earnings ratios) should have lower expected returns than cheaply valued markets.
Campbell and Shiller (1988) developed the Cyclically Adjusted P/E Ratio (CAPE), which smooths earnings over a full business cycle. Their empirical analysis showed that this ratio has significant predictive power for 10-year returns. Asness, Moskowitz, and Pedersen (2013) demonstrated in "Value and Momentum Everywhere" that value effects exist not only in individual stocks but also in asset classes and markets.
5.2 Equity Risk Premium as Central Valuation Metric
The Equity Risk Premium (ERP) is defined as the expected excess return of stocks over risk-free government bonds. It is the theoretical heart of valuation analysis, as it represents the compensation investors demand for bearing equity risk. Damodaran (2012) discusses in "Equity Risk Premiums: Determinants, Estimation and Implications" various methods for ERP estimation.
DEAM calculates ERP not through a single method but combines four complementary approaches with different weights. This multi-method strategy increases estimation robustness and avoids dependence on single, potentially erroneous inputs.
The first method (35% weight) uses earnings yield, calculated as the inverse of the price-earnings ratio or directly from operating earnings data, and subtracts the 10-year Treasury yield. This method follows Fed Model logic (Yardeni, 2003), although this model has theoretical weaknesses as it does not consistently treat inflation (Asness, 2003).
The second method (30% weight) extends earnings yield by share buyback yield. Share buybacks are a form of capital return to shareholders and increase value per share. Boudoukh et al. (2007) showed in "On the Importance of Measuring Payout Yield" that the sum of dividend yield and buyback yield is a better predictor of future returns than dividend yield alone.
The third method (20% weight) implements the Gordon Growth Model (Gordon, 1962), which models stock value as the sum of discounted future dividends. Under constant growth g assumption: Expected Return = Dividend Yield + g. The model estimates sustainable growth as g = ROE × (1 - Payout Ratio), where ROE is return on equity and payout ratio is the ratio of dividends to earnings. This formula follows from equity theory: unretained earnings are reinvested at ROE and generate additional earnings growth.
The fourth method (15% weight) combines total shareholder yield (Dividend + Buybacks) with implied growth derived from revenue growth. This method considers that companies with strong revenue growth should generate higher future earnings, even if current valuations do not yet fully reflect this.
The final ERP is the weighted average of these four methods. A high ERP (above 4%) signals attractive valuations and increases the valuation score to 95 out of 100 possible points. A negative ERP, where stocks have lower expected returns than bonds, results in a minimal score of 10.
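A sketch of the four-method blend in Python. All inputs are decimals (0.04 = 4%); the exact treatment of the risk-free leg in methods two through four is an assumption, since the text states it explicitly only for the first method:

```python
def equity_risk_premium(earnings_yield, buyback_yield, dividend_yield,
                        roe, payout_ratio, revenue_growth, treasury_10y):
    """Blend the four ERP estimates with the weights given in the text."""
    m1 = earnings_yield - treasury_10y                       # Fed-model style
    m2 = (earnings_yield + buyback_yield) - treasury_10y     # plus buybacks
    g = roe * (1.0 - payout_ratio)                           # sustainable growth
    m3 = (dividend_yield + g) - treasury_10y                 # Gordon growth
    m4 = (dividend_yield + buyback_yield + revenue_growth) - treasury_10y
    return 0.35 * m1 + 0.30 * m2 + 0.20 * m3 + 0.15 * m4
```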
5.3 Quality Adjustments to Valuation
Valuation metrics alone can be misleading if not interpreted in the context of company quality. A company with a low P/E may be cheap or fundamentally problematic. The model therefore implements quality adjustments based on growth, profitability, and capital structure.
Revenue growth above 10% annually adds 10 points to the valuation score, moderate growth above 5% adds 5 points. This adjustment reflects that growth has independent value (Modigliani and Miller, 1961, extended by later growth theory). Net margin above 15% signals pricing power and operational efficiency and increases the score by 5 points, while low margins below 8% indicate competitive pressure and subtract 5 points.
Return on equity (ROE) above 20% characterizes outstanding capital efficiency and increases the score by 5 points. Piotroski (2000) showed in "Value Investing: The Use of Historical Financial Statement Information" that fundamental quality signals such as high ROE can improve the performance of value strategies.
Capital structure is evaluated through the debt-to-equity ratio. A conservative ratio below 1.0 multiplies the valuation score by 1.2, while high leverage above 2.0 applies a multiplier of 0.8. This adjustment reflects that high debt constrains financial flexibility and can become problematic in crisis times (Korteweg, 2010).
6. Component 4: Sentiment Analysis
6.1 The Role of Sentiment in Financial Markets
Investor sentiment, defined as the collective psychological attitude of market participants, influences asset prices independently of fundamental data. Baker and Wurgler (2006, 2007) developed a sentiment index and showed that periods of high sentiment are followed by overvaluations that later correct. This insight justifies integrating a sentiment component into allocation decisions.
Sentiment is difficult to measure directly but can be proxied through market indicators. The VIX is the most widely used sentiment indicator, as it aggregates implied volatility from option prices. High VIX values reflect elevated uncertainty and risk aversion, while low values signal market comfort. Whaley (2009) refers to the VIX as the "Investor Fear Gauge" and documents its role as a contrarian indicator: extremely high values typically occur at market bottoms, while low values occur at tops.
6.2 VIX-Based Sentiment Assessment
DEAM uses statistical normalization of the VIX by calculating the Z-score: z = (VIX_current - VIX_average) / VIX_standard_deviation. The Z-score indicates how many standard deviations the current VIX is from the historical average. This approach is more robust than absolute thresholds, as it adapts to the average volatility level, which can vary over longer periods.
A Z-score below -1.5 (VIX is 1.5 standard deviations below average) signals exceptionally low risk perception and adds 40 points to the sentiment score, treating calm conditions as supportive of risk-taking. Contrarians would caution that when no one is afraid, everyone is already invested and further upside is limited (Zweig, 1973). Conversely, a Z-score above 1.5 (extreme fear) subtracts 40 points, reflecting market panic, even though such extremes often coincide with potential buying opportunities.
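A hedged Python sketch of this mapping; the one-year lookback and the handling of intermediate readings are assumptions, as the text specifies only the ±1.5 thresholds and the ±40-point scores:

```python
import numpy as np

def vix_sentiment_points(vix_series: np.ndarray, lookback: int = 252) -> float:
    """Z-score the current VIX against its own history, map to score points."""
    hist = vix_series[-lookback:]
    z = (vix_series[-1] - hist.mean()) / hist.std(ddof=1)
    if z < -1.5:
        return 40.0    # unusually calm markets
    if z > 1.5:
        return -40.0   # extreme fear
    return 0.0         # intermediate readings handled elsewhere in the model
```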
6.3 VIX Term Structure as Sentiment Signal
The VIX term structure provides additional sentiment information. Normally, the VIX trades in contango, meaning longer-term VIX futures have higher prices than short-term. This reflects that short-term volatility is currently known, while long-term volatility is more uncertain and carries a risk premium. The model compares the VIX with VIX9D (9-day volatility) and identifies backwardation (VIX > 1.05 × VIX9D) and steep backwardation (VIX > 1.15 × VIX9D).
Backwardation occurs when short-term implied volatility is higher than longer-term, which typically happens during market stress. Investors anticipate immediate turbulence but expect calming. Psychologically, this reflects acute fear. The model subtracts 15 points for backwardation and 30 for steep backwardation, as these constellations signal elevated risk. Simon and Wiggins (2001) analyzed the VIX futures curve and showed that backwardation is associated with market declines.
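The term-structure check translates directly into code:

```python
def term_structure_points(vix: float, vix9d: float) -> float:
    """Penalize VIX-curve backwardation as described in the text."""
    if vix > 1.15 * vix9d:
        return -30.0   # steep backwardation: acute stress
    if vix > 1.05 * vix9d:
        return -15.0   # backwardation
    return 0.0         # contango: normal conditions
```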
6.4 Safe-Haven Flows
During crisis times, investors flee from risky assets into safe havens: gold, US dollar, and Japanese yen. This "flight to quality" is a sentiment signal. The model calculates the performance of these assets relative to stocks over the last 20 trading days. When gold or the dollar strongly rise while stocks fall, this indicates elevated risk aversion.
The safe-haven component is calculated as the difference between safe-haven performance and stock performance. Positive values (safe havens outperform) subtract up to 20 points from the sentiment score, negative values (stocks outperform) add up to 10 points. The asymmetric treatment (larger deduction for risk-off than bonus for risk-on) reflects that risk-off movements are typically sharper and more informative than risk-on phases.
Baur and Lucey (2010) examined safe-haven properties of gold and showed that gold indeed exhibits negative correlation with stocks during extreme market movements, confirming its role as crisis protection.
7. Component 5: Macroeconomic Analysis
7.1 The Yield Curve as Economic Indicator
The yield curve, represented as yields of government bonds of various maturities, contains aggregated expectations about future interest rates, inflation, and economic growth. The slope of the yield curve has remarkable predictive power for recessions. Estrella and Mishkin (1998) showed that an inverted yield curve (short-term rates higher than long-term) predicts recessions with high reliability. This is because inverted curves reflect restrictive monetary policy: the central bank raises short-term rates to combat inflation, dampening economic activity.
DEAM calculates two spread measures: the 2-year-minus-10-year spread and the 3-month-minus-10-year spread. A steep, positive curve (spreads above 1.5% and 2% respectively) signals healthy growth expectations and generates the maximum yield curve score of 40 points. A flat curve (spreads near zero) reduces the score to 20 points. An inverted curve (negative spreads) is particularly alarming and results in only 10 points.
The choice of two different spreads increases analysis robustness. The 2-10 spread is most established in academic literature, while the 3M-10Y spread is often considered more sensitive, as the 3-month rate directly reflects current monetary policy (Ang, Piazzesi, and Wei, 2006).
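A simplified Python rendering of this scoring, with spreads in percentage points; how the model grades readings between the stated thresholds is an assumption here:

```python
def yield_curve_score(spread_2y10y: float, spread_3m10y: float) -> float:
    """Map the two spreads to the 10/20/40-point scheme from the text."""
    if spread_2y10y > 1.5 and spread_3m10y > 2.0:
        return 40.0    # steep curve: healthy growth expectations
    if spread_2y10y < 0 or spread_3m10y < 0:
        return 10.0    # inversion: recession warning
    return 20.0        # flat curve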
7.2 Credit Conditions and Spreads
Credit spreads—the yield difference between risky corporate bonds and safe government bonds—reflect risk perception in the credit market. Gilchrist and Zakrajšek (2012) constructed an "Excess Bond Premium" that measures the component of credit spreads not explained by fundamentals and showed this is a predictor of future economic activity and stock returns.
The model approximates credit spread by comparing the yield of high-yield bond ETFs (HYG) with investment-grade bond ETFs (LQD). A narrow spread below 200 basis points signals healthy credit conditions and risk appetite, contributing 30 points to the macro score. Very wide spreads above 1000 basis points (as during the 2008 financial crisis) signal credit crunch and generate zero points.
Additionally, the model evaluates whether "flight to quality" is occurring, identified through strong performance of Treasury bonds (TLT) with simultaneous weakness in high-yield bonds. This constellation indicates elevated risk aversion and reduces the credit conditions score.
7.3 Financial Stability at Corporate Level
While the yield curve and credit spreads reflect macroeconomic conditions, financial stability evaluates the health of companies themselves. The model uses the aggregated debt-to-equity ratio and return on equity of the S&P 500 as proxies for corporate health.
A low leverage level below 0.5 combined with high ROE above 15% signals robust corporate balance sheets and generates 20 points. This combination is particularly valuable as it represents both defensive strength (low debt means crisis resistance) and offensive strength (high ROE means earnings power). High leverage above 1.5 generates only 5 points, as it implies vulnerability to interest rate increases and recessions.
Korteweg (2010) showed in "The Net Benefits to Leverage" that optimal debt maximizes firm value, but excessive debt increases distress costs. At the aggregated market level, high debt indicates fragilities that can become problematic during stress phases.
8. Component 6: Crisis Detection
8.1 The Need for Systematic Crisis Detection
Financial crises are rare but extremely impactful events that suspend normal statistical relationships. During normal market volatility, diversified portfolios and traditional risk management approaches function, but during systemic crises, seemingly independent assets suddenly correlate strongly, and losses exceed historical expectations (Longin and Solnik, 2001). This justifies a separate crisis detection mechanism that operates independently of regular allocation components.
Reinhart and Rogoff (2009) documented in "This Time Is Different: Eight Centuries of Financial Folly" recurring patterns in financial crises: extreme volatility, massive drawdowns, credit market dysfunction, and asset price collapse. DEAM operationalizes these patterns into quantifiable crisis indicators.
8.2 Multi-Signal Crisis Identification
The model uses a counter-based approach where various stress signals are identified and aggregated. This methodology is more robust than relying on a single indicator, as true crises typically occur simultaneously across multiple dimensions. A single signal may be a false alarm, but the simultaneous presence of multiple signals increases confidence.
The first indicator is a VIX above the crisis threshold (default 40), adding one point. A VIX above 60 (as in 2008 and March 2020) adds two additional points, as such extreme values are historically very rare. This tiered approach captures the intensity of volatility.
The second indicator is market drawdown. A drawdown above 15% adds one point, as corrections of this magnitude can be potential harbingers of larger crises. A drawdown above 25% adds another point, as historical bear markets typically encompass 25-40% drawdowns.
The third indicator is credit market spreads above 500 basis points, adding one point. Such wide spreads occur only during significant credit market disruptions, as in 2008 during the Lehman crisis.
The fourth indicator identifies simultaneous losses in stocks and bonds. Normally, Treasury bonds act as a hedge against equity risk (negative correlation), but when both fall simultaneously, this indicates systemic liquidity problems or inflation/stagflation fears. The model checks whether both SPY and TLT have fallen more than 10% and 5% respectively over 5 trading days, adding two points.
The fifth indicator is a volume spike combined with negative returns. Extreme trading volumes (above twice the 20-day average) with falling prices signal panic selling. This adds one point.
A crisis situation is diagnosed when at least 3 indicators trigger, a severe crisis at 5 or more indicators. These thresholds were calibrated through historical backtesting to identify true crises (2008, 2020) without generating excessive false alarms.
8.3 Crisis-Based Allocation Override
When a crisis is detected, the system overrides the normal allocation recommendation and caps equity allocation at maximum 25%. In a severe crisis, the cap is set at 10%. This drastic defensive posture follows the empirical observation that crises typically require time to develop and that early reduction can avoid substantial losses (Faber, 2007).
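Putting the counter and the override together, a Python sketch with returns and drawdowns as decimals and the thresholds as stated in the text:

```python
def crisis_signals(vix, drawdown, credit_spread_bps,
                   spy_5d_ret, tlt_5d_ret, volume_ratio, bar_ret):
    """Count the stress signals listed above."""
    n = 0
    if vix > 40: n += 1
    if vix > 60: n += 2                                     # extreme, very rare
    if drawdown > 0.15: n += 1
    if drawdown > 0.25: n += 1
    if credit_spread_bps > 500: n += 1
    if spy_5d_ret < -0.10 and tlt_5d_ret < -0.05: n += 2    # stocks and bonds both falling
    if volume_ratio > 2.0 and bar_ret < 0: n += 1           # panic-selling volume spike
    return n

def crisis_cap(allocation, n_signals):
    """Override: cap equity at 25% in a crisis (>=3 signals), 10% if severe (>=5)."""
    if n_signals >= 5:
        return min(allocation, 0.10)
    if n_signals >= 3:
        return min(allocation, 0.25)
    return allocation
```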
This override logic implements a "safety first" principle: in situations of existential danger to the portfolio, capital preservation becomes the top priority. Roy (1952) formalized this approach in "Safety First and the Holding of Assets," arguing that investors should primarily minimize ruin probability.
9. Integration and Final Allocation Calculation
9.1 Component Weighting
The final allocation recommendation emerges through weighted aggregation of the five components. The standard weighting is: Market Regime 35%, Risk Management 25%, Valuation 20%, Sentiment 15%, Macro 5%. These weights reflect both theoretical considerations and empirical backtesting results.
The highest weighting of market regime is based on evidence that trend-following and momentum strategies have delivered robust results across various asset classes and time periods (Moskowitz, Ooi, and Pedersen, 2012). Current market momentum is highly informative for the near future, although it provides no information about long-term expectations.
The substantial weighting of risk management (25%) follows from the central importance of risk control. Wealth preservation is the foundation of long-term wealth creation, and systematic risk management is demonstrably value-creating (Moreira and Muir, 2017).
The valuation component receives 20% weight, based on the long-term mean reversion of valuation metrics. While valuation has limited short-term predictive power (bull and bear markets can begin at any valuation), the long-term relationship between valuation and returns is robustly documented (Campbell and Shiller, 1988).
Sentiment (15%) and Macro (5%) receive lower weights, as these factors are subtler and harder to measure. Sentiment is valuable as a contrarian indicator at extremes but less informative in normal ranges. Macro variables such as the yield curve have strong predictive power for recessions, but the transmission from recessions to stock market performance is complex and temporally variable.
9.2 Model Type Adjustments
DEAM allows users to choose between four model types: Conservative, Balanced, Aggressive, and Adaptive. This choice modifies the final allocation through additive adjustments.
Conservative mode subtracts 10 percentage points from allocation, resulting in consistently more cautious positioning. This is suitable for risk-averse investors or those with limited investment horizons. Aggressive mode adds 10 percentage points, suitable for risk-tolerant investors with long horizons.
Adaptive mode implements procyclical adjustment based on short-term momentum: if the market has risen more than 5% in the last 20 days, 5 percentage points are added; if it has declined more than 5%, 5 points are subtracted. This logic follows the observation that short-term momentum persists (Jegadeesh and Titman, 1993), but the moderate size of adjustment avoids excessive timing bets.
Balanced mode makes no adjustment and uses raw model output. This neutral setting is suitable for investors who wish to trust model recommendations unchanged.
9.3 Smoothing and Stability
The allocation resulting from aggregation undergoes final smoothing through a simple moving average over 3 periods. This smoothing is crucial for model practicality, as it reduces frequent trading and thus transaction costs. Without smoothing, the model could fluctuate between adjacent allocations with every small input change.
The choice of 3 periods as smoothing window is a compromise between responsiveness and stability. Longer smoothing would excessively delay signals and impede response to true regime changes. Shorter or no smoothing would allow too much noise. Empirical tests showed that 3-period smoothing offers an optimal ratio between these goals.
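The full aggregation pipeline of this section, sketched in Python; clipping to the 0-100 range and the behavior of the smoother with fewer than three observations are assumptions:

```python
import numpy as np

WEIGHTS = {"regime": 0.35, "risk": 0.25, "valuation": 0.20,
           "sentiment": 0.15, "macro": 0.05}

def raw_allocation(scores: dict) -> float:
    """Weighted average of the five 0-100 component scores -> 0-100 allocation."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

def apply_model_type(alloc: float, mode: str, ret_20d: float = 0.0) -> float:
    """Additive adjustments for the four model types."""
    if mode == "conservative":
        alloc -= 10
    elif mode == "aggressive":
        alloc += 10
    elif mode == "adaptive":
        alloc += 5 if ret_20d > 0.05 else (-5 if ret_20d < -0.05 else 0)
    return float(np.clip(alloc, 0, 100))

def smooth(alloc_history: list) -> float:
    """3-period simple moving average of the allocation series."""
    return float(np.mean(alloc_history[-3:]))
```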
10. Visualization and Interpretation
10.1 Main Output: Equity Allocation
DEAM's primary output is a time series from 0 to 100 representing the recommended percentage allocation to equities. This representation is intuitive: 100% means full investment in stocks (specifically: an S&P 500 ETF), 0% means complete cash position, and intermediate values correspond to mixed portfolios. A value of 60% means, for example: invest 60% of wealth in SPY, hold 40% in money market instruments or cash.
The time series is color-coded to enable quick visual interpretation. Green shades represent high allocations (above 80%, bullish), red shades low allocations (below 20%, bearish), and neutral colors middle allocations. The chart background is dynamically colored based on the signal, enhancing readability in different market phases.
10.2 Dashboard Metrics
A tabular dashboard presents key metrics compactly. This includes current allocation, cash allocation (complement), an aggregated signal (BULLISH/NEUTRAL/BEARISH), current market regime, VIX level, market drawdown, and crisis status.
Additionally, fundamental metrics are displayed: P/E Ratio, Equity Risk Premium, Return on Equity, Debt-to-Equity Ratio, and Total Shareholder Yield. This transparency allows users to understand model decisions and form their own assessments.
Component scores (Regime, Risk, Valuation, Sentiment, Macro) are also displayed, each normalized on a 0-100 scale. This shows which factors primarily drive the current recommendation. If, for example, the Risk score is very low (20) while other scores are moderate (50-60), this indicates that risk management considerations are pulling allocation down.
10.3 Component Breakdown (Optional)
Advanced users can display individual components as separate lines in the chart. This enables analysis of component dynamics: do all components move synchronously, or are there divergences? Divergences can be particularly informative. If, for example, the market regime is bullish (high score) but the valuation component is very negative, this signals an overbought market not fundamentally supported—a classic "bubble warning."
This feature is disabled by default to keep the chart clean but can be activated for deeper analysis.
10.4 Confidence Bands
The model optionally displays uncertainty bands around the main allocation line. These are calculated as ±1 standard deviation of allocation over a rolling 20-period window. Wide bands indicate high volatility of model recommendations, suggesting uncertain market conditions. Narrow bands indicate stable recommendations.
This visualization implements a concept of epistemic uncertainty—uncertainty about the model estimate itself, not just market volatility. In phases where various indicators send conflicting signals, the allocation recommendation becomes more volatile, manifesting in wider bands. Users can understand this as a warning to act more cautiously or consult alternative information sources.
11. Alert System
11.1 Allocation Alerts
DEAM implements an alert system that notifies users of significant events. Allocation alerts trigger when smoothed allocation crosses certain thresholds. An alert is generated when allocation reaches 80% (from below), signaling strong bullish conditions. Another alert triggers when allocation falls to 20%, indicating defensive positioning.
These thresholds are not arbitrary but correspond with boundaries between model regimes. An allocation of 80% roughly corresponds to a clear bull market regime, while 20% corresponds to a bear market regime. Alerts at these points are therefore informative about fundamental regime shifts.
11.2 Crisis Alerts
Separate alerts trigger upon detection of crisis and severe crisis. These alerts have highest priority as they signal large risks. A crisis alert should prompt investors to review their portfolio and potentially take defensive measures beyond the automatic model recommendation (e.g., hedging through put options, rebalancing to more defensive sectors).
11.3 Regime Change Alerts
An alert triggers upon change of market regime (e.g., from Neutral to Correction, or from Bull Market to Strong Bull). Regime changes are highly informative events that typically entail substantial allocation changes. These alerts enable investors to proactively respond to changes in market dynamics.
11.4 Risk Breach Alerts
A specialized alert triggers when actual portfolio risk utilization exceeds target parameters by 20%. This is a warning signal that the risk management system is reaching its limits, possibly because market volatility is rising faster than allocation can be reduced. In such situations, investors should consider manual interventions.
12. Practical Application and Limitations
12.1 Portfolio Implementation
DEAM generates a recommendation for allocation between equities (S&P 500) and cash. Implementation by an investor can take various forms. The most direct method is using an S&P 500 ETF (e.g., SPY, VOO) for equity allocation and a money market fund or savings account for cash allocation.
A rebalancing strategy is required to synchronize actual allocation with model recommendation. Two approaches are possible: (1) rule-based rebalancing at every 10% deviation between actual and target, or (2) time-based monthly rebalancing. Both have trade-offs between responsiveness and transaction costs. Empirical evidence (Jaconetti, Kinniry, and Zilbering, 2010) suggests rebalancing frequency has moderate impact on performance, and investors should optimize based on their transaction costs.
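The rule-based variant reduces to a one-line band check (weights as decimals):

```python
def needs_rebalance(actual: float, target: float, band: float = 0.10) -> bool:
    """Trigger a trade when |actual - target| exceeds the deviation band."""
    return abs(actual - target) > band

# e.g. holding 55% equities against a 70% model target -> rebalance
print(needs_rebalance(0.55, 0.70))  # True
```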
12.2 Adaptation to Individual Preferences
The model offers numerous adjustment parameters. Component weights can be modified if investors place more or less belief in certain factors. A fundamentally-oriented investor might increase valuation weight, while a technical trader might increase regime weight.
Risk target parameters (target volatility, max drawdown) should be adapted to individual risk tolerance. Younger investors with long investment horizons can choose higher target volatility (15-18%), while retirees may prefer lower volatility (8-10%). This adjustment systematically shifts average equity allocation.
Crisis thresholds can be adjusted based on preference for sensitivity versus specificity of crisis detection. Lower thresholds (e.g., VIX > 35 instead of 40) increase sensitivity (more crises are detected) but reduce specificity (more false alarms). Higher thresholds have the reverse effect.
12.3 Limitations and Disclaimers
DEAM is based on historical relationships between indicators and market performance. There is no guarantee these relationships will persist in the future. Structural changes in markets (e.g., through regulation, technology, or central bank policy) can break established patterns. This is the fundamental problem of induction in financial science (Taleb, 2007).
The model is optimized for US equities (S&P 500). Application to other markets (international stocks, bonds, commodities) would require recalibration. The indicators and thresholds are specific to the statistical properties of the US equity market.
The model cannot eliminate losses. Even with perfect crisis prediction, an investor following the model would lose money in bear markets—just less than a buy-and-hold investor. The goal is risk-adjusted performance improvement, not risk elimination.
Transaction costs are not modeled. In practice, spreads, commissions, and taxes reduce net returns. Frequent trading can cause substantial costs. Model smoothing helps minimize this, but users should consider their specific cost situation.
The model reacts to information; it does not anticipate it. During sudden shocks (e.g., 9/11, COVID-19 lockdowns), the model can only react after price movements, not before. This limitation is inherent to all reactive systems.
12.4 Relationship to Other Strategies
DEAM is a tactical asset allocation approach and should be viewed as a complement, not replacement, for strategic asset allocation. Brinson, Hood, and Beebower (1986) showed in their influential study "Determinants of Portfolio Performance" that strategic asset allocation (long-term policy allocation) explains the majority of portfolio performance, but this leaves room for tactical adjustments based on market timing.
The model can be combined with value and momentum strategies at the individual stock level. While DEAM controls overall market exposure, within-equity decisions can be optimized through stock-picking models. This separation between strategic (market exposure) and tactical (stock selection) levels follows classical portfolio theory.
The model does not replace diversification across asset classes. A complete portfolio should also include bonds, international stocks, real estate, and alternative investments. DEAM addresses only the US equity allocation decision within a broader portfolio.
13. Scientific Foundation and Evaluation
13.1 Theoretical Consistency
DEAM's components are based on established financial theory and empirical evidence. The market regime component follows from regime-switching models (Hamilton, 1989) and trend-following literature. The risk management component implements volatility targeting (Moreira and Muir, 2017) and modern portfolio theory (Markowitz, 1952). The valuation component is based on discounted cash flow theory and empirical value research (Campbell and Shiller, 1988; Fama and French, 1992). The sentiment component integrates behavioral finance (Baker and Wurgler, 2006). The macro component uses established business cycle indicators (Estrella and Mishkin, 1998).
This theoretical grounding distinguishes DEAM from purely data-mining-based approaches that identify patterns without causal theory. Theory-guided models have greater probability of functioning out-of-sample, as they are based on fundamental mechanisms, not random correlations (Lo and MacKinlay, 1990).
13.2 Empirical Validation
While this document does not present detailed backtest analysis, it should be noted that rigorous validation of a tactical asset allocation model should include several elements:
In-sample testing establishes whether the model functions at all in the data on which it was calibrated. Out-of-sample testing is crucial: the model should be tested in time periods not used for development. Walk-forward analysis, where the model is successively trained on rolling windows and tested in the next window, approximates real implementation.
Performance metrics should be risk-adjusted. Pure return consideration is misleading, as higher returns often only compensate for higher risk. Sharpe Ratio, Sortino Ratio, Calmar Ratio, and Maximum Drawdown are relevant metrics. Comparison with benchmarks (Buy-and-Hold S&P 500, 60/40 Stock/Bond portfolio) contextualizes performance.
Robustness checks test sensitivity to parameter variation. If the model only functions at specific parameter settings, this indicates overfitting. Robust models show consistent performance over a range of plausible parameters.
13.3 Comparison with Existing Literature
DEAM fits into the broader literature on tactical asset allocation. Faber (2007) presented a simple momentum-based timing system that goes long when the market is above its 10-month average, otherwise cash. This simple system avoided large drawdowns in bear markets. DEAM can be understood as a sophistication of this approach that integrates multiple information sources.
Ilmanen (2011) discusses various timing factors in "Expected Returns" and argues for multi-factor approaches. DEAM operationalizes this philosophy. Asness, Moskowitz, and Pedersen (2013) showed that value and momentum effects work across asset classes, justifying cross-asset application of regime and valuation signals.
Ang (2014) emphasizes in "Asset Management: A Systematic Approach to Factor Investing" the importance of systematic, rule-based approaches over discretionary decisions. DEAM is fully systematic and eliminates emotional biases that plague individual investors (overconfidence, hindsight bias, loss aversion).
References
Ang, A. (2014) *Asset Management: A Systematic Approach to Factor Investing*. Oxford: Oxford University Press.
Ang, A., Piazzesi, M. and Wei, M. (2006) 'What does the yield curve tell us about GDP growth?', *Journal of Econometrics*, 131(1-2), pp. 359-403.
Asness, C.S. (2003) 'Fight the Fed Model', *The Journal of Portfolio Management*, 30(1), pp. 11-24.
Asness, C.S., Moskowitz, T.J. and Pedersen, L.H. (2013) 'Value and Momentum Everywhere', *The Journal of Finance*, 68(3), pp. 929-985.
Baker, M. and Wurgler, J. (2006) 'Investor Sentiment and the Cross-Section of Stock Returns', *The Journal of Finance*, 61(4), pp. 1645-1680.
Baker, M. and Wurgler, J. (2007) 'Investor Sentiment in the Stock Market', *Journal of Economic Perspectives*, 21(2), pp. 129-152.
Baur, D.G. and Lucey, B.M. (2010) 'Is Gold a Hedge or a Safe Haven? An Analysis of Stocks, Bonds and Gold', *Financial Review*, 45(2), pp. 217-229.
Bollerslev, T. (1986) 'Generalized Autoregressive Conditional Heteroskedasticity', *Journal of Econometrics*, 31(3), pp. 307-327.
Boudoukh, J., Michaely, R., Richardson, M. and Roberts, M.R. (2007) 'On the Importance of Measuring Payout Yield: Implications for Empirical Asset Pricing', *The Journal of Finance*, 62(2), pp. 877-915.
Brinson, G.P., Hood, L.R. and Beebower, G.L. (1986) 'Determinants of Portfolio Performance', *Financial Analysts Journal*, 42(4), pp. 39-44.
Brock, W., Lakonishok, J. and LeBaron, B. (1992) 'Simple Technical Trading Rules and the Stochastic Properties of Stock Returns', *The Journal of Finance*, 47(5), pp. 1731-1764.
Campbell, J.Y. and Shiller, R.J. (1988) 'The Dividend-Price Ratio and Expectations of Future Dividends and Discount Factors', *Review of Financial Studies*, 1(3), pp. 195-228.
Cochrane, J.H. (2011) 'Presidential Address: Discount Rates', *The Journal of Finance*, 66(4), pp. 1047-1108.
Damodaran, A. (2012) *Equity Risk Premiums: Determinants, Estimation and Implications*. Working Paper, Stern School of Business.
Engle, R.F. (1982) 'Autoregressive Conditional Heteroskedasticity with Estimates of the Variance of United Kingdom Inflation', *Econometrica*, 50(4), pp. 987-1007.
Estrella, A. and Hardouvelis, G.A. (1991) 'The Term Structure as a Predictor of Real Economic Activity', *The Journal of Finance*, 46(2), pp. 555-576.
Estrella, A. and Mishkin, F.S. (1998) 'Predicting U.S. Recessions: Financial Variables as Leading Indicators', *Review of Economics and Statistics*, 80(1), pp. 45-61.
Faber, M.T. (2007) 'A Quantitative Approach to Tactical Asset Allocation', *The Journal of Wealth Management*, 9(4), pp. 69-79.
Fama, E.F. and French, K.R. (1989) 'Business Conditions and Expected Returns on Stocks and Bonds', *Journal of Financial Economics*, 25(1), pp. 23-49.
Fama, E.F. and French, K.R. (1992) 'The Cross-Section of Expected Stock Returns', *The Journal of Finance*, 47(2), pp. 427-465.
Garman, M.B. and Klass, M.J. (1980) 'On the Estimation of Security Price Volatilities from Historical Data', *Journal of Business*, 53(1), pp. 67-78.
Gilchrist, S. and Zakrajšek, E. (2012) 'Credit Spreads and Business Cycle Fluctuations', *American Economic Review*, 102(4), pp. 1692-1720.
Gordon, M.J. (1962) *The Investment, Financing, and Valuation of the Corporation*. Homewood: Irwin.
Graham, B. and Dodd, D.L. (1934) *Security Analysis*. New York: McGraw-Hill.
Hamilton, J.D. (1989) 'A New Approach to the Economic Analysis of Nonstationary Time Series and the Business Cycle', *Econometrica*, 57(2), pp. 357-384.
Ilmanen, A. (2011) *Expected Returns: An Investor's Guide to Harvesting Market Rewards*. Chichester: Wiley.
Jaconetti, C.M., Kinniry, F.M. and Zilbering, Y. (2010) 'Best Practices for Portfolio Rebalancing', *Vanguard Research Paper*.
Jegadeesh, N. and Titman, S. (1993) 'Returns to Buying Winners and Selling Losers: Implications for Stock Market Efficiency', *The Journal of Finance*, 48(1), pp. 65-91.
Kahneman, D. and Tversky, A. (1979) 'Prospect Theory: An Analysis of Decision under Risk', *Econometrica*, 47(2), pp. 263-292.
Korteweg, A. (2010) 'The Net Benefits to Leverage', *The Journal of Finance*, 65(6), pp. 2137-2170.
Lo, A.W. and MacKinlay, A.C. (1990) 'Data-Snooping Biases in Tests of Financial Asset Pricing Models', *Review of Financial Studies*, 3(3), pp. 431-467.
Longin, F. and Solnik, B. (2001) 'Extreme Correlation of International Equity Markets', *The Journal of Finance*, 56(2), pp. 649-676.
Mandelbrot, B. (1963) 'The Variation of Certain Speculative Prices', *The Journal of Business*, 36(4), pp. 394-419.
Markowitz, H. (1952) 'Portfolio Selection', *The Journal of Finance*, 7(1), pp. 77-91.
Modigliani, F. and Miller, M.H. (1961) 'Dividend Policy, Growth, and the Valuation of Shares', *The Journal of Business*, 34(4), pp. 411-433.
Moreira, A. and Muir, T. (2017) 'Volatility-Managed Portfolios', *The Journal of Finance*, 72(4), pp. 1611-1644.
Moskowitz, T.J., Ooi, Y.H. and Pedersen, L.H. (2012) 'Time Series Momentum', *Journal of Financial Economics*, 104(2), pp. 228-250.
Parkinson, M. (1980) 'The Extreme Value Method for Estimating the Variance of the Rate of Return', *Journal of Business*, 53(1), pp. 61-65.
Piotroski, J.D. (2000) 'Value Investing: The Use of Historical Financial Statement Information to Separate Winners from Losers', *Journal of Accounting Research*, 38, pp. 1-41.
Reinhart, C.M. and Rogoff, K.S. (2009) *This Time Is Different: Eight Centuries of Financial Folly*. Princeton: Princeton University Press.
Ross, S.A. (1976) 'The Arbitrage Theory of Capital Asset Pricing', *Journal of Economic Theory*, 13(3), pp. 341-360.
Roy, A.D. (1952) 'Safety First and the Holding of Assets', *Econometrica*, 20(3), pp. 431-449.
Schwert, G.W. (1989) 'Why Does Stock Market Volatility Change Over Time?', *The Journal of Finance*, 44(5), pp. 1115-1153.
Sharpe, W.F. (1966) 'Mutual Fund Performance', *The Journal of Business*, 39(1), pp. 119-138.
Sharpe, W.F. (1994) 'The Sharpe Ratio', *The Journal of Portfolio Management*, 21(1), pp. 49-58.
Simon, D.P. and Wiggins, R.A. (2001) 'S&P Futures Returns and Contrary Sentiment Indicators', *Journal of Futures Markets*, 21(5), pp. 447-462.
Taleb, N.N. (2007) *The Black Swan: The Impact of the Highly Improbable*. New York: Random House.
Whaley, R.E. (2000) 'The Investor Fear Gauge', *The Journal of Portfolio Management*, 26(3), pp. 12-17.
Whaley, R.E. (2009) 'Understanding the VIX', *The Journal of Portfolio Management*, 35(3), pp. 98-105.
Yardeni, E. (2003) 'Stock Valuation Models', *Topical Study*, 51, Yardeni Research.
Young, T.W. (1991) 'Calmar Ratio: A Smoother Tool', *Futures*, October issue.
Zweig, M.E. (1973) 'An Investor Expectations Stock Price Predictive Model Using Closed-End Fund Premiums', *The Journal of Finance*, 28(1), pp. 67-78.
BTC Power Law Valuation Bands
BTC Power Law Rainbow
A long-term valuation framework for Bitcoin based on Power Law growth — designed to help identify macro accumulation and distribution zones, aligned with long-term investor behavior.
🔍 What Is a Power Law?
A Power Law is a mathematical relationship where one quantity varies as a power of another. In this model:
Price ≈ a × (Time)^b
It captures the non-linear, exponentially slowing growth of Bitcoin over time. Rather than using linear or cyclical models, this approach aligns with how complex systems, such as networks or monetary adoption curves, often grow — rapidly at first, and then more slowly, but persistently.
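To make the model concrete, here is a minimal Pine Script v5 sketch of a power-law fair-value line with log-space bands. The coefficients a and b, the genesis-date anchor, and the half-decade band offsets are illustrative assumptions, not the script's actual fitted values:

//@version=5
indicator("Power Law Fair Value (sketch)", overlay=true)
// Hypothetical fit coefficients; real values must come from a regression on BTC history.
a = input.float(1.0e-17, "Coefficient a")
b = input.float(5.8, "Exponent b")
genesis = timestamp(2009, 1, 3)           // Bitcoin genesis block date
days = (time - genesis) / 86400000.0      // bar time is in milliseconds
fair  = a * math.pow(days, b)             // Price ≈ a × (Time)^b
upper = fair * math.pow(10, 0.5)          // +0.5 decades in log space
lower = fair * math.pow(10, -0.5)         // -0.5 decades in log space
plot(fair, "Fair value", color=color.white)
plot(upper, "Upper band", color=color.red)
plot(lower, "Lower band", color=color.blue)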
🧠 Why Power Law for BTC?
Bitcoin:
Has finite supply and increasing adoption.
Operates as a monetary network, where Metcalfe’s Law and power laws naturally emerge.
Exhibits exponential growth over logarithmic time: on a log-log chart this appears as a straight line.
This makes it uniquely well-suited for power law modeling.
🌈 How to Use the Valuation Bands
The central white line represents the modeled fair value according to the power law.
Colored bands represent deviations from the model in logarithmic space, acting as macro zones:
🔵 Lower Bands: Deep value / Accumulation zones.
🟡 Mid Bands: Fair value.
🔴 Upper Bands: Euphoria / Risk of macro tops.
📐 Smart Money Concepts (SMC) Alignment
Accumulation: Occurs when price consolidates near lower bands — often aligning with institutional positioning.
Markup: As price re-enters or ascends the bands, we often see breakout behavior and trend expansion.
Distribution: When price extends above upper bands, potential for exit liquidity creation and distribution events.
Reversion: Historically, price mean-reverts toward the model — rarely staying outside the bands for long.
This makes the model useful for:
Cycle timing
Long-term DCA strategy zones
Identifying value dislocations
Filtering short-term noise
⚠️ Disclaimer
This tool is for educational and informational purposes only. It is not financial advice. The power law model is a non-predictive, mathematical framework and does not guarantee future price movements.
Always use additional tools, risk management, and your own judgment before making trading or investment decisions.
EWMA Volatility Estimator
This script calculates EWMA Volatility (Exponentially Weighted Moving Average Volatility).
Commonly used model in financial risk management.
It estimates recent price volatility by applying more weight to the most recent returns, capturing volatility clustering while remaining responsive to fast market shifts.
The method uses a decay factor (λ) of 0.94, the standard value used in models like RiskMetrics, and converts the variance estimate into annualized volatility in percentage terms.
This is not a forecasting tool. It’s an estimator that reflects the magnitude of recent price moves in a statistically robust way.
It can be helpful for:
Understanding regime shifts in market behavior
Designing position sizing rules based on recent volatility
Filtering entries during high or low volatility phases
How It Works
Computes log returns of the closing price.
Squares the returns to get a proxy for variance.
Applies an exponential moving average to the squared returns, using an equivalent EMA period derived from λ = 0.94 (alpha = 1 - λ = 0.06, so the EMA length is roughly 2/alpha - 1 ≈ 32); a sketch follows these steps.
Converts the result to volatility by taking the square root and scaling to a percentage.
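A minimal Pine v5 sketch of that recursion (variable names and the plot are my assumptions, not the published source):

//@version=5
indicator("EWMA Volatility (sketch)")
lambda  = input.float(0.94, "Decay factor")           // RiskMetrics standard
perYear = input.int(252, "Periods per year")
r  = math.log(close / close[1])                       // log return
r2 = nz(r * r)                                        // squared return as variance proxy
var float ewVar = na
// EWMA recursion: var_t = lambda * var_(t-1) + (1 - lambda) * r_t^2
ewVar := na(ewVar) ? r2 : lambda * ewVar + (1 - lambda) * r2
annVol = math.sqrt(ewVar) * math.sqrt(perYear) * 100  // annualized volatility in %
plot(annVol, "EWMA volatility %")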
Key Characteristics
Backward-looking estimator
Reacts faster than standard rolling-window volatility
Smooths noise while still being sensitive to recent spikes
This script is educational and informational. It is not financial advice or a guarantee of performance. Always test any tool as part of a broader strategy before using it in live markets.
EMA-Based Squeeze Dynamics (Gap Momentum & EWMA Projection)
🚨 Main Utility: Early Squeeze Warning
The primary function of this indicator is to warn traders early when the market is approaching a "squeeze"—a tightening condition that often precedes significant moves or regime shifts. By visually highlighting areas of increasing tension, it helps traders anticipate potential volatility and prepare accordingly. This is intended as a statistically and psychologically grounded replacement for so-called "fib time zones," which are overly deterministic and subjective.
📌 Overview
The EMA-Based Squeeze Dynamics indicator projects future regime shifts (such as golden and death crosses) using exponential moving averages (EMAs). It employs historical interval data and current market conditions to dynamically forecast when the critical EMAs (50-period and 200-period) will reconverge, marking likely trend-change points.
This indicator leverages two core ideas:
Behavioral finance theory: Traders often collectively anticipate popular EMA crossovers, creating a self-fulfilling prophecy (normative social influence), similar to findings from Solomon Asch’s conformity experiments.
Bayesian-like updates: It utilizes historical crossover intervals as a prior, dynamically updating expectations based on evolving market data, ensuring its signals remain objectively grounded in actual market behavior.
⚙️ Technical & Mathematical Explanation
1. EMA Calculations and Regime Definitions
The indicator uses three EMAs:
Fast (9-period): Represents short-term price movement.
Medial (50-period): Indicates medium-term trend direction.
Slow (200-period): Defines long-term market sentiment.
Regime States:
Bullish: 50 EMA is above the 200 EMA.
Bearish: 50 EMA is below the 200 EMA.
A shift between these states triggers visual markers (arrows and labels) directly on the chart.
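A minimal sketch of that regime logic in Pine v5 (the marker styling is an assumption; the published script may differ):

//@version=5
indicator("EMA regime shifts (sketch)", overlay=true)
medE  = ta.ema(close, 50)
slowE = ta.ema(close, 200)
bullish = medE > slowE               // regime state
shift   = bullish != bullish[1]      // a crossover just happened
plotshape(shift and bullish, style=shape.labelup, location=location.belowbar, color=color.green, text="Golden")
plotshape(shift and not bullish, style=shape.labeldown, location=location.abovebar, color=color.red, text="Death")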
2. Gap Dynamics and Historical Intervals
At each crossover:
The indicator records the gap (distance) between the 50 and 200 EMAs.
It tracks the historical intervals between past crossovers.
An Exponentially Weighted Moving Average (EWMA) of these intervals is calculated, weighting recent intervals more heavily, dynamically updating expectations.
Important note:
After every regime shift, the projected crossover line resets its calculation. This reset is visually evident as the projection line appears to move further away after each regime change, temporarily "repelled" until the EMAs begin converging again. This ensures projections remain realistic, grounded in actual EMA convergence, and prevents overly optimistic forecasts immediately after a regime shift.
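The interval bookkeeping might look like the following sketch (the recency weight and the plot are illustrative assumptions):

//@version=5
indicator("Crossover interval EWMA (sketch)")
medE  = ta.ema(close, 50)
slowE = ta.ema(close, 200)
shift = (medE > slowE) != (medE[1] > slowE[1])
w = input.float(0.3, "Recency weight")    // higher values let recent intervals dominate
var int   lastCross    = na
var float ewmaInterval = na
if shift
    if not na(lastCross)
        interval = bar_index - lastCross  // bars since the previous crossover
        ewmaInterval := na(ewmaInterval) ? interval : w * interval + (1 - w) * ewmaInterval
    lastCross := bar_index
plot(ewmaInterval, "Expected bars between crossovers")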
3. Gap Momentum & Adaptive Scaling
The indicator measures how quickly or slowly the gap between EMAs is changing ("gap momentum") and adjusts its forecast accordingly:
If the gap narrows rapidly, a crossover becomes more imminent.
If the gap widens, the next crossover is pushed further into the future.
The "gap factor" dynamically scales the projection based on recent gap momentum, bounded between reasonable limits (0.7–1.3).
4. Squeeze Ratio & Background Color (Visual Cues)
A "squeeze ratio" is computed when market conditions indicate tightening:
In a bullish regime, if the fast EMA is below the medial EMA (price pulling back towards long-term support), the squeeze ratio increases.
In a bearish regime, if the fast EMA rises above the medial EMA (price rallying into long-term resistance), the squeeze ratio increases.
What the Background Colors Mean:
Red Background: Indicates a bullish squeeze—price is compressing downward, hinting a bullish reversal or continuation breakout may occur soon.
Green Background: Indicates a bearish squeeze—price is compressing upward, suggesting a bearish reversal or continuation breakout could soon follow.
Opacity Explanation:
The transparency (opacity) of the background indicates the intensity of the squeeze:
High Opacity (solid color): Strong squeeze, high likelihood of imminent volatility or regime shift.
Low Opacity (faint color): Mild squeeze, signaling early stages of tightening.
Thus, more vivid colors serve as urgent visual warnings that a squeeze is rapidly intensifying.
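One plausible way to express the squeeze ratio and its opacity mapping; the 2% saturation point and the transparency range are assumptions for illustration only:

//@version=5
indicator("Squeeze background (sketch)", overlay=true)
fastE = ta.ema(close, 9)
medE  = ta.ema(close, 50)
slowE = ta.ema(close, 200)
bullish = medE > slowE
// the squeeze grows as the fast EMA pulls through the medial EMA against the regime
bullSqueeze = bullish and fastE < medE ? (medE - fastE) / medE : 0.0
bearSqueeze = not bullish and fastE > medE ? (fastE - medE) / medE : 0.0
ratio = bullSqueeze + bearSqueeze
intensity = math.min(ratio / 0.02, 1.0)   // saturate at 2% separation
transp = 95 - int(60 * intensity)         // stronger squeeze -> more opaque
bgcolor(ratio > 0 ? color.new(bullish ? color.red : color.green, transp) : na)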
5. Projected Next Crossover and Pseudo Crossover Mechanism
The indicator calculates an estimated future bar when a crossover (and thus, regime shift) is expected to occur. This calculation incorporates:
Historical EWMA interval.
Current squeeze intensity.
Gap momentum.
A dynamic penalty based on divergence from baseline conditions.
The "Pseudo Crossover" Explained:
A key adaptive feature is the pseudo crossover mechanism. If price action significantly deviates from the projected crossover (for example, if price stays beyond the projected line longer than expected), the indicator acknowledges the projection was incorrect and triggers a "pseudo crossover" event. Essentially, this acts as a reset, updating historical intervals with a weighted adjustment to recalibrate future predictions. In other words, if the indicator’s initial forecast proves inaccurate, it recognizes this quickly, resets itself, and tries again—ensuring it remains responsive and adaptive to actual market conditions.
🧠 Behavioral Theory: Normative Social Influence
This indicator is rooted in behavioral finance theory, specifically leveraging normative social influence (conformity). Traders commonly watch EMA signals (especially the 50 and 200 EMA crossovers). When traders collectively anticipate these signals, they begin trading ahead of actual crossovers, effectively creating self-fulfilling prophecies—similar to Solomon Asch’s famous conformity experiments, where individuals adopted group behaviors even against direct evidence.
This behavior means genuine regime shifts (actual EMA crossovers) rarely occur until EMAs visibly reconverge due to widespread anticipatory trading activity. The indicator quantifies these dynamics by objectively measuring EMA convergence and updating projections accordingly.
📊 How to Use This Indicator
Monitor the background color and opacity as primary visual cues.
A strongly colored background (solid red/green) is an early alert that a squeeze is intensifying—prepare for potential volatility or a regime shift.
Projected crossover lines give a dynamic target bar to watch for trend reversals or confirmations.
After each regime shift, expect a reset of the projection line. The line may seem initially repelled from price action, but it will recalibrate as EMAs converge again.
Trust the pseudo crossover mechanism to automatically recalibrate the indicator if its original projection misses.
🎯 Why Choose This Indicator?
Early Warning: Visual squeeze intensity helps anticipate market breakouts.
Behaviorally Grounded: Leverages real trader psychology (conformity and anticipation).
Objective & Adaptive: Uses real-time, data-driven updates rather than static levels or subjective analysis.
Easy to Interpret: Clear visual signals (arrows, labels, colors) simplify trading decisions.
Self-correcting (Pseudo Crossovers): Quickly adjusts when initial predictions miss, maintaining accuracy over time.
Summary:
The EMA-Based Squeeze Dynamics Indicator combines behavioral insights, dynamic Bayesian-like updates, intuitive visual cues, and a self-correcting pseudo crossover feature to offer traders a reliable early warning system for market squeezes and impending regime shifts. It transparently recalibrates after each regime shift and automatically resets whenever projections prove inaccurate—ensuring you always have an adaptive, realistic forecast.
Whether you're a discretionary trader or algorithmic strategist, this indicator provides a powerful tool to navigate market volatility effectively.
Happy Trading! 📈✨
EWMA Volatility Bands
The EWMA Volatility Bands indicator combines an Exponential Moving Average (EMA) and Exponentially Weighted Moving Average (EWMA) of volatility to create dynamic upper and lower price bands. It helps traders identify trends, measure market volatility, and spot extreme conditions. Key features include:
Centerline (EMA): Tracks the trend based on a user-defined period.
Volatility Bands: Adjusted by the square root of volatility, representing potential price ranges.
Percentile Rank: Highlights extreme volatility (e.g., >99% or <1%) with shaded areas between the bands.
This tool is useful for trend-following, risk assessment, and identifying overbought/oversold conditions.
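A compact sketch of one plausible construction of these bands (the multiplier and the conversion of return volatility into price units are my assumptions):

//@version=5
indicator("EWMA Volatility Bands (sketch)", overlay=true)
len    = input.int(50, "EMA length")
lambda = input.float(0.94, "EWMA decay")
mult   = input.float(2.0, "Band multiplier")
center = ta.ema(close, len)
r = math.log(close / close[1])
var float ewVar = na
ewVar := na(ewVar) ? nz(r * r) : lambda * ewVar + (1 - lambda) * nz(r * r)
vol = math.sqrt(ewVar) * close            // return volatility expressed in price units
plot(center, "Centerline")
plot(center + mult * vol, "Upper band")
plot(center - mult * vol, "Lower band")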
Nasan Hull-smoothed envelope
The Nasan Hull-Smoothed Envelope indicator is a sophisticated overlay designed to track price movement within an adaptive "envelope." It dynamically adjusts to market volatility and trend strength, using a series of smoothing and volatility-correction techniques. Here's a detailed breakdown of its components, from the input settings to the calculated visual elements:
Inputs
look_back_length (500):
Defines the lookback period for calculating intraday volatility (IDV), smoothing it over time. A higher value means the indicator considers a longer historical range for volatility calculations.
sl (50):
Sets the smoothing length for the Hull Moving Average (HMA). The HMA smooths various lines, creating a balance between sensitivity and stability in trend signals.
mp (1.5):
Multiplier for IDV, scaling the volatility impact on the envelope. A higher multiplier widens the envelope to accommodate higher volatility, while a lower one tightens it.
p (0.625):
Weight factor that determines the balance between extremes (highest high and lowest low) and averages (sma of high and sma of low) in the high/low calculations. A higher p gives more weight to extremes, making the envelope more responsive to abrupt market changes.
Volatility Calculation (IDV)
The Intraday Volatility (IDV) metric represents the average volatility per bar as an exponentially smoothed ratio of the high-low range to the close price. This is calculated over the look_back_length period, providing a base volatility value which is then scaled by mp. The IDV enables the envelope to dynamically widen or narrow with market volatility, making it sensitive to current market conditions.
Composite High and Low Bands
The high and low bands define the upper and lower bounds of the envelope.
High Calculation
a_high:
Uses a multi-period approach to capture the highest highs over several intervals (5, 8, 13, 21, and 34 bars). Averaging these highs provides a more stable reference for the high end of the envelope, capturing both immediate and recent peak values.
b_high:
Computes the average of shorter simple moving averages (5, 8, and 13 bars) of the high prices, smoothing out fluctuations in the recent highs. This generates a balanced view of high price trends.
high_c:
Combines a_high and b_high using the weight p. This blend creates a composite high that balances between recent peaks and smoothed averages, making the upper envelope boundary adaptive to short-term price shifts.
Low Calculation
a_low and b_low:
Similar to the high calculation, these capture extreme lows and smooth low values over the same intervals. This approach creates a stable and adaptive lower bound for the envelope.
low_c:
Combines a_low and b_low using the weight p, resulting in a composite low that adjusts to price fluctuations while maintaining a stable trend line.
Volatility-Adjusted Bands
The final composite high (c_high) and composite low (c_low) bands are adjusted using IDV, which accounts for intraday volatility. When volatility is high, the bands expand; when it’s low, they contract, providing a visual representation of volatility-adjusted price bounds.
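Pulling the pieces together, a simplified Pine v5 sketch of the composite bands; the interval sets and weights follow the description above, everything else is an assumption:

//@version=5
indicator("Nasan-style envelope (sketch)", overlay=true)
lookback = input.int(500, "look_back_length")
sl = input.int(50, "Smoothing length")
mp = input.float(1.5, "IDV multiplier")
p  = input.float(0.625, "Extremes weight")
// IDV: exponentially smoothed per-bar range as a fraction of price, scaled by mp
idv = ta.ema((high - low) / close, lookback) * mp
// composite high: blend of multi-period extremes and smoothed highs
a_high = (ta.highest(high, 5) + ta.highest(high, 8) + ta.highest(high, 13) + ta.highest(high, 21) + ta.highest(high, 34)) / 5
b_high = (ta.sma(high, 5) + ta.sma(high, 8) + ta.sma(high, 13)) / 3
high_c = p * a_high + (1 - p) * b_high
a_low = (ta.lowest(low, 5) + ta.lowest(low, 8) + ta.lowest(low, 13) + ta.lowest(low, 21) + ta.lowest(low, 34)) / 5
b_low = (ta.sma(low, 5) + ta.sma(low, 8) + ta.sma(low, 13)) / 3
low_c = p * a_low + (1 - p) * b_low
// volatility adjustment: the envelope expands and contracts with IDV
c_high = high_c * (1 + idv)
c_low  = low_c * (1 - idv)
basis = ta.hma(math.avg(c_high, c_low), sl)
plot(c_high, "Upper")
plot(c_low, "Lower")
plot(basis, "Basis", color = basis > basis[1] ? color.green : color.red)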
Basis Line
The basis line is a Hull Moving Average (HMA) of the average of c_high and c_low. The HMA is known for its smoothness and responsiveness, making the basis line a central trend indicator. The color of the basis line changes:
Green when the basis line is increasing.
Red when the basis line is decreasing.
This color-coded basis line serves as a quick visual reference for trend direction.
Short-Term Trend Strength Block
This component analyzes recent price action to assess short-term bullish and bearish momentum.
Conditions (green, red, green1, red1):
These are binary conditions that categorize price movements as bullish or bearish based on the close compared to the open and the close’s relationship with the exponential moving average (EMA). This separation helps capture different types of strength (above/below EMA) and different bullish or bearish patterns.
Composite Trend Strength Values:
Each of the bullish and bearish counts (above and below the EMA) is normalized, resulting in the following values:
green_EMAup_a and red_EMAup_a for bullish and bearish strength above the EMA.
green_EMAdown_a and red_EMAdown_a for bullish and bearish strength below the EMA.
Trend Strength (t_s):
This calculated metric combines the normalized trend strengths with extra weight to conditions above the EMA, giving more relevance to trends that have momentum behind them.
Enhanced Trend Strength
avg_movement:
Calculates the average absolute price movement over the short_term_length, providing a measurement of recent price activity that scales with volatility.
enhanced_t_s:
Multiplies t_s by avg_movement, creating an enhanced trend strength value that reflects both directional strength and the magnitude of recent price movement.
min and max:
Minimum and maximum percentile thresholds, respectively, based on enhanced_t_s for controlling the color gradient in the fill area.
Fill Area
The fill area between plot_c_high and plot_c_low is color-coded based on the enhanced trend strength (enhanced_t_s):
Gradient color transitions from blue to green based on the strength level, with blue representing weaker trends and green indicating stronger trends.
This visual fill provides an at-a-glance assessment of trend strength across the envelope, with color shifts highlighting momentum shifts.
Summary
The indicator’s purpose is to offer an adaptive price envelope that reflects real-time market volatility and trend strength. Here’s what each component contributes:
Basis Line: A trend-following line in the center that adjusts color based on trend direction.
Envelope (c_high, c_low): Adapts to volatility by expanding and contracting based on IDV, giving traders a responsive view of expected price bounds.
Fill Area: A color-gradient region representing trend strength within the envelope, helping traders easily identify momentum changes.
Overall, this tool helps to identify trend direction, market volatility, and strength of price movements, allowing for more informed decisions based on visual cues around price boundaries and trend momentum.
Value at Risk [OmegaTools]
The "Value at Risk" (VaR) indicator is a powerful financial risk management tool that helps traders estimate the potential losses in a portfolio over a specified period of time, given a certain level of confidence. VaR is widely used by financial institutions, traders, and risk managers to assess the probability of portfolio losses in both normal and volatile market conditions. This TradingView script implements a comprehensive VaR calculation using several models, allowing users to visualize different risk scenarios and adjust their trading strategies accordingly.
Concept of Value at Risk
Value at Risk (VaR) is a statistical technique used to measure the likelihood of losses in a portfolio or financial asset due to market risks. In essence, it answers the question: "What is the maximum potential loss that could occur in a given portfolio over a specific time horizon, with a certain confidence level?" For instance, if a portfolio has a one-day 95% VaR of $10,000, it means that there is a 95% chance the portfolio will not lose more than $10,000 in a single day. Conversely, there is a 5% chance of losing more than $10,000. VaR is a key risk management tool for portfolio managers and traders because it quantifies potential losses in monetary terms, allowing for better-informed decision-making.
There are several ways to calculate VaR, and this indicator script incorporates three of the most commonly used models:
Historical VaR: This approach uses historical returns to estimate potential losses. It is based purely on past price data, assuming that the past distribution of returns is indicative of future risks.
Variance-Covariance VaR: This model assumes that asset returns follow a normal distribution and that the risk can be summarized using the mean and standard deviation of past returns. It is a parametric method that is widely used in financial risk management.
Exponentially Weighted Moving Average (EWMA) VaR: In this model, recent data points are given more weight than older data. This dynamic approach allows the VaR estimation to react more quickly to changes in market volatility, which is particularly useful during periods of market stress. This model uses the Exponential Weighted Moving Average Volatility Model.
How the Script Works
The script starts by offering users a set of customizable input settings. The first input allows the user to choose between two main calculation modes: "All" or "OCT" (Only Current Timeframe). In the "All" mode, the script calculates VaR using all available methodologies—Historical, Variance-Covariance, and EWMA—providing a comprehensive risk overview. The "OCT" mode narrows the calculation to the current timeframe, which can be particularly useful for intraday traders who need a more focused view of risk.
The next input is the lookback window, which defines the number of historical periods used to calculate VaR. Commonly used lookback periods include 21 days (approximately one month), 63 days (about three months), and 252 days (roughly one year), with the script supporting up to 504 days for more extended historical analysis. A longer lookback period provides a more comprehensive picture of risk but may be less responsive to recent market conditions.
The confidence level is another important setting in the script. This represents the probability that the loss will not exceed the VaR estimate. Standard confidence levels are 90%, 95%, and 99%. A higher confidence level results in a more conservative risk estimate, meaning that the calculated VaR will reflect a more extreme loss scenario.
In addition to these core settings, the script allows users to customize the visual appearance of the indicator. For example, traders can choose different colors for "Bullish" (Risk On), "Bearish" (Risk Off), and "Neutral" phases, as well as colors for highlighting "Breaks" in the data, where returns exceed the calculated VaR. These visual cues make it easy to identify periods of heightened risk at a glance.
The actual VaR calculation is broken down into several models, starting with the Historical VaR calculation. This is done by computing the logarithmic returns of the asset's closing prices and then using linear interpolation to determine the percentile corresponding to the desired confidence level. This percentile represents the potential loss in the asset over the lookback period.
Next, the script calculates Variance-Covariance VaR using the mean and standard deviation of the historical returns. The standard deviation is multiplied by a z-score corresponding to the chosen confidence level (e.g., 1.645 for 95% confidence), and the resulting value is subtracted from the mean return to arrive at the VaR estimate.
The EWMA VaR model uses the EWMA for the sigma parameter, the standard deviation, obtaining a specific dynamic in the volatility. It is particularly useful in volatile markets where recent price behavior is more indicative of future risk than older data.
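Under those definitions, the three estimates might be sketched as follows (the z-score table and λ = 0.94 are standard choices; everything else, including names, is illustrative):

//@version=5
indicator("VaR models (sketch)")
lookback = input.int(252, "Lookback")
conf     = input.float(95.0, "Confidence %")
z = conf >= 99 ? 2.326 : conf >= 95 ? 1.645 : 1.282   // one-tailed z-scores
r = math.log(close / close[1])
// 1) Historical VaR: empirical percentile of past returns
histVaR = ta.percentile_linear_interpolation(r, lookback, 100 - conf)
// 2) Variance-covariance VaR: mean minus z times standard deviation
mu    = ta.sma(r, lookback)
sigma = ta.stdev(r, lookback)
paramVaR = mu - z * sigma
// 3) EWMA VaR: same formula with an exponentially weighted sigma
lambda = 0.94
var float ewVar = na
ewVar := na(ewVar) ? nz(r * r) : lambda * ewVar + (1 - lambda) * nz(r * r)
ewmaVaR = mu - z * math.sqrt(ewVar)
plot(histVaR * 100, "Historical VaR %")
plot(paramVaR * 100, "Var-Cov VaR %")
plot(ewmaVaR * 100, "EWMA VaR %")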
For traders interested in intraday risk management, the script provides several methods to adjust VaR calculations for lower timeframes. By using intraday returns and scaling them according to the chosen timeframe, the script provides a dynamic view of risk throughout the trading day. This is especially important for short-term traders who need to manage their exposure during high-volatility periods within the same day. The script also incorporates an EWMA model for intraday data, which gives greater weight to the most recent intraday price movements.
In addition to calculating VaR, the script also attempts to detect periods where the asset's returns exceed the estimated VaR threshold, referred to as "Breaks." When the returns breach the VaR limit, the script highlights these instances on the chart, allowing traders to quickly identify periods of extreme risk. The script also calculates the average of these breaks and displays it for comparison, helping traders understand how frequently these high-risk periods occur.
The script further visualizes the risk scenario using a risk phase classification system. Depending on the level of risk, the script categorizes the market as either "Risk On," "Risk Off," or "Risk Neutral." In "Risk On" mode, the market is considered bullish, and the indicator displays a green background. In "Risk Off" mode, the market is bearish, and the background turns red. If the market is neither strongly bullish nor bearish, the background turns neutral, signaling a balanced risk environment.
Traders can customize whether they want to see this risk phase background, along with toggling the display of the various VaR models, the intraday methods, and the break signals. This flexibility allows traders to tailor the indicator to their specific needs, whether they are day traders looking for quick intraday insights or longer-term investors focused on historical risk analysis.
The "Risk On" and "Risk Off" phases calculated by this Value at Risk (VaR) script introduce a novel approach to market risk assessment, offering traders an advanced toolset to gauge market sentiment and potential risk levels dynamically. These risk phases are built on a combination of traditional VaR methodologies and proprietary logic to create a more responsive and intuitive way to manage exposure in both normal and volatile market conditions. This method of classifying market conditions into "Risk On," "Risk Off," or "Risk Neutral" is not something that has been traditionally associated with VaR, making it a groundbreaking addition to this indicator.
How the "Risk On" and "Risk Off" Phases Are Calculated
In typical VaR implementations, the focus is on calculating the potential losses at a given confidence level without providing an overall market outlook. This script, however, introduces a unique risk classification system that takes the output of various VaR models and translates it into actionable signals for traders, marking whether the market is in a Risk On, Risk Off, or Risk Neutral phase.
The Risk On and Risk Off phases are primarily determined by comparing the current returns of the asset to the average VaR calculated across several different methods, including Historical VaR, Variance-Covariance VaR, and EWMA VaR. Here's how the process works (a code sketch follows the four steps):
1. Threshold Setting and Effect Calculation: The script first computes the average VaR using the selected models. It then checks whether the current returns (expressed as a negative value to signify loss) exceed the average VaR value. If the current returns surpass the calculated VaR threshold, this indicates that the actual market risk is higher than expected, signaling a potential shift in market conditions.
2. Break Analysis: In addition to monitoring whether returns exceed the average VaR, the script counts the number of instances within the lookback period where this breach occurs. This is referred to as the "break effect." For each period in the lookback window, the script checks whether the returns surpass the calculated VaR threshold and increments a counter. The percentage of periods where this breach occurs is then calculated as the "effect" or break percentage.
3. Dual Effect Check (if "Double" Risk Scenario is selected): When the user chooses the "Double" risk scenario mode, the script performs two layers of analysis. First, it calculates the effect of returns exceeding the VaR threshold for the current timeframe. Then, it calculates the effect for the lower intraday timeframe as well. Both effects are compared to the user-defined confidence level (e.g., 95%). If both effects exceed the confidence level, the market is deemed to be in a high-risk situation, thus triggering a Risk Off phase. If both effects fall below the confidence level, the market is classified as Risk On.
4. Risk Phases Determination: The final risk phase is determined by analyzing these effects in relation to the confidence level:
- Risk On: If the calculated effect of breaks is lower than the confidence level (e.g., fewer than 5% of periods show returns exceeding the VaR threshold for a 95% confidence level), the market is considered to be in a relatively safe state, and the script signals a "Risk On" phase. This is indicative of bullish conditions where the potential for extreme loss is minimal.
- Risk Off: If the break effect exceeds the confidence level (e.g., more than 5% of periods show returns breaching the VaR threshold), the market is deemed to be in a high-risk state, and the script signals a "Risk Off" phase. This indicates bearish market conditions where the likelihood of significant losses is higher.
- Risk Neutral: If the break effect hovers near the confidence level or if there is no clear trend indicating a shift toward either extreme, the market is classified as "Risk Neutral." In this phase, neither bulls nor bears are dominant, and traders should remain cautious.
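A self-contained sketch of the break-effect logic from steps 2-4 (a single parametric VaR stands in for the averaged models; thresholds and colors are assumptions):

//@version=5
indicator("Break effect (sketch)")
lookback = input.int(252, "Lookback")
conf     = input.float(95.0, "Confidence %")
r = math.log(close / close[1])
varThreshold = ta.sma(r, lookback) - 1.645 * ta.stdev(r, lookback)  // stand-in for the averaged VaR
int breaches = 0
for i = 0 to lookback - 1
    // count periods whose return fell below the VaR threshold in force at the time
    if nz(r[i]) < nz(varThreshold[i])
        breaches += 1
effect = 100.0 * breaches / lookback
riskOff = effect > 100 - conf             // more breaks than the confidence level tolerates
bgcolor(riskOff ? color.new(color.red, 80) : color.new(color.green, 80))
plot(effect, "Break %")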
The phase color that the script uses helps visualize these risk phases. The background will turn green in Risk On conditions, red in Risk Off conditions, and gray in Risk Neutral phases, providing immediate visual feedback on market risk. In addition to this, when the "Double" risk scenario is selected, the background will only turn green or red if both the current and intraday timeframes confirm the respective risk phase. This double-checking process ensures that traders are only given a strong signal when both longer-term and short-term risks align, reducing the likelihood of false signals.
A New Way of Using Value at Risk
This innovative Risk On/Risk Off classification, based on the interaction between VaR thresholds and market returns, represents a significant departure from the traditional use of Value at Risk as a pure risk measurement tool. Typically, VaR is employed as a backward-looking measure of risk, providing a static estimate of potential losses over a given timeframe with no immediate actionable feedback on current market conditions. This script, however, dynamically interprets VaR results to create a forward-looking, real-time signal that informs traders whether they are operating in a favorable (Risk On) or unfavorable (Risk Off) environment.
By incorporating the "break effect" analysis and allowing users to view the VaR breaches as a percentage of past occurrences, the script adds a predictive element that can be used to time market entries and exits more effectively. This dual-layer risk analysis, particularly when using the "Double" scenario mode, adds further granularity by considering both current timeframe and intraday risks. Traders can therefore make more informed decisions not just based on historical risk data, but on how the market is behaving in real-time relative to those risk benchmarks.
This approach transforms the VaR indicator from a risk monitoring tool into a decision-making system that helps identify favorable trading opportunities while alerting users to potential market downturns. It provides a more holistic view of market conditions by combining both statistical risk measurement and intuitive phase-based market analysis. This level of integration between VaR methodologies and real-time signal generation has not been widely seen in the world of trading indicators, marking this script as a cutting-edge tool for risk management and market sentiment analysis.
I would like to express my sincere gratitude to @skewedzeta for his invaluable contribution to the final script. From generating fresh ideas to applying his expertise in reviewing the formula, his support has been instrumental in refining the outcome.
lib_no_delay
Library "lib_no_delay"
This library contains modifications to standard functions that return na before reaching the bar of their 'length' parameter.
That is because they do not compromise real-time speed for correct results early in the history. This is fine for live trading on short timeframes, but it kills applications on Monthly/Weekly timeframes when an instrument, as is common in crypto, does not have extensive history (why you would even trade the monthly on a meme coin is not my decision).
Also, some functions rely on source[1] (the value at the previous bar), which is not available on bar 1 and therefore cascades na values all the way to the last bar ... which in turn leads to an indicator that displays nothing and time wasted debugging it.
Anyway ... there you go, let me know if I should add more functions.
sma(source, length)
Parameters:
source (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Simple moving average of source for length bars back.
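A sketch of the no-delay idea for sma; this is my reconstruction of the approach, not the library's exact code. It simply averages over however many bars exist so far:

//@version=5
indicator("No-delay SMA (sketch)", overlay=true)
noDelaySma(float src, simple int length) =>
    n = math.min(length, bar_index + 1)   // bars actually available
    float sum = 0.0
    for i = 0 to n - 1
        sum += src[i]
    sum / n
plot(noDelaySma(close, 50), "SMA without warm-up na")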
ema(source, length)
Parameters:
source (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: (float) The exponentially weighted moving average of the source.
rma(source, length)
Parameters:
source (float) : Series of values to process.
length (simple int) : Number of bars (length).
Returns: Exponential moving average of source with alpha = 1 / length.
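For ema and rma the usual trick (again, a reconstruction) is to seed the recursion with the first value instead of na; only alpha differs between the two:

//@version=5
indicator("No-delay EMA/RMA (sketch)", overlay=true)
noDelayEwma(float src, simple float alpha) =>
    var float acc = na
    acc := na(acc) ? src : alpha * src + (1 - alpha) * acc   // seeded on bar 0
    acc
plot(noDelayEwma(close, 2.0 / (21 + 1)), "EMA(21), alpha = 2/(len+1)")
plot(noDelayEwma(close, 1.0 / 21), "RMA(21), alpha = 1/len", color=color.orange)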
atr(length)
Function atr (average true range) returns the RMA of true range. True range is max(high - low, abs(high - close[1]), abs(low - close[1])). This adapted version extends ta.atr to start without delay at the first bar and deliver usable data instead of na by averaging ta.tr(true) via a manual SMA.
Parameters:
length (simple int) : Number of bars back (length).
Returns: Average true range.
rsi(source, length)
Relative strength index. It is calculated using the ta.rma() of upward and downward changes of source over the last length bars. This adapted version extends ta.rsi to start without delay at the first bar and deliver usable data instead of na.
Parameters:
source (float) : Series of values to process.
length (simple int) : Number of bars back (length).
Returns: Relative Strength Index.
Adaptive Price Channel (log scale)
The field of technical analysis is consistently expanding, with numerous indicators used for market forecasting. Amongst them, a novel indicator dubbed the Adaptive Price Channel (log scale), inspired by LuxAlgo's renowned Nadaraya-Watson Envelope, is gaining traction for its distinctive features and versatility. Unlike its predecessor, the Adaptive Price Channel (log scale) works on a logarithmic scale, allowing it to be utilized on both smaller and larger timeframes.
1. Key Features
The Adaptive Price Channel (log scale) is built in TradingView's Pine Script language, version 5, with the primary aim of maximizing the versatility and scalability of trading indicators. It allows traders to adapt it to their preferred timeframe, making it applicable to a wide range of trading strategies.
Its bandwidth can be adjusted through the input parameters, offering traders the flexibility to manipulate the indicator according to their strategic requirements. Furthermore, it provides an option for repainting smoothing. This option enables users to control the repainting effect in which the historical output of the indicator may change over time. When disabled, the indicator provides the endpoints of the calculations, ensuring consistency in historical values.
Moreover, the Adaptive Price Channel (log scale) allows for color customization, thereby improving visibility and user-friendliness. The colors of the indicator's upward and downward directions can be changed according to the user's preference.
2. Working Mechanism
The Adaptive Price Channel (log scale) uses the logarithm of the source, which is typically the closing price of a trading instrument. It leverages a Gaussian function that exponentially decreases the further the price moves away from the mean, accounting for both positive and negative values. The bandwidth of the Gaussian function can be adjusted to adapt to different market conditions.
Additionally, the Adaptive Price Channel (log scale) features an array of 500 lines for each bar, which helps in defining the boundaries or envelope for price movements. The calculations are executed using the Nadaraya-Watson estimator, which uses kernel regression for non-parametric analysis.
The calculated values for the upper and lower bounds of the envelope are then converted back from the logarithmic scale using the exponential function. This calculation process continues for each bar until the last bar in the data set.
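A non-repainting, endpoint-only version of the estimator could be sketched like this; the bandwidth, window and band width are illustrative, and the actual script draws its envelope from an array of lines:

//@version=5
indicator("Log-scale Nadaraya-Watson (sketch)", overlay=true)
h = input.float(8.0, "Bandwidth")
n = input.int(200, "Window")
width = input.float(0.02, "Envelope half-width (log units)")
float num = 0.0
float den = 0.0
m = math.min(n, bar_index + 1)
for i = 0 to m - 1
    w = math.exp(-(i * i) / (2.0 * h * h))   // Gaussian weight decays with distance
    num += math.log(close[i]) * w
    den += w
est = math.exp(num / den)                    // kernel mean in log space, mapped back
plot(est, "NW estimate")
plot(est * math.exp(width), "Upper")
plot(est * math.exp(-width), "Lower")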
To ensure optimal performance, the Adaptive Price Channel (log scale) uses dynamic repainting. If the repainting mode is enabled, it adjusts the smoothing of the indicator for the entire historical data, making the results more accurate.
3. Visualization and Alerts
The Adaptive Price Channel (log scale) offers an array of visual aids, including labels and plots. The upper and lower bounds of the envelope are plotted, and the indicator triggers labels at points where the closing price crosses these boundaries. These labels serve as alerts for potential trading opportunities.
4. Conclusion
The Adaptive Price Channel (log scale) is an innovative and adaptable trading indicator, drawing inspiration from its predecessor but introducing unique features to increase its versatility. By providing a repainting option, it ensures consistent historical values, thereby enhancing the reliability of the indicator. Furthermore, the capability to operate on a logarithmic scale broadens its usability for different timeframes. The Adaptive Price Channel (log scale) is a powerful tool for any trader, facilitating a better understanding of market dynamics, and enabling more informed decision-making.
Parabolic SAR + EMA 200 + MACD Signals
Parabolic SAR + EMA 200 + MACD Signals Indicator, a powerful tool designed to help traders identify optimal entry points in the market.
This indicator combines three popular technical indicators - Parabolic SAR (Stop and Reverse), EMA200 (Exponential Moving Average 200) and MACD (Moving Average Convergence Divergence) - to provide clear and concise buy and sell signals based on market trends.
The MACD component of this indicator calculates the difference between two exponentially smoothed moving averages, providing insight into the trend strength of the market. The Parabolic SAR component helps identify potential price reversals, while the EMA200 acts as a key level of support and resistance, providing additional confirmation of the overall trend direction.
Whether you're a seasoned trader or just starting out, the MACD-Parabolic SAR-EMA200 Indicator is a must-have tool for anyone looking to improve their trading strategy and maximize profits in today's dynamic markets.
Buy conditions
The price should be above the EMA 200
Parabolic SAR should show an upward trend
MACD Delta should be positive
Sell conditions
The price should be below the EMA 200
Parabolic SAR should show a downward trend
MACD Delta should be negative
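Interpreting "MACD Delta" as the MACD histogram, the conditions above might translate to Pine v5 as follows (the SAR parameters are the common defaults, an assumption):

//@version=5
indicator("PSAR + EMA200 + MACD signals (sketch)", overlay=true)
ema200 = ta.ema(close, 200)
sar = ta.sar(0.02, 0.02, 0.2)                    // start, increment, maximum
[macdLine, signalLine, hist] = ta.macd(close, 12, 26, 9)
longSignal  = close > ema200 and sar < close and hist > 0
shortSignal = close < ema200 and sar > close and hist < 0
plotshape(longSignal, style=shape.triangleup, location=location.belowbar, color=color.green)
plotshape(shortSignal, style=shape.triangledown, location=location.abovebar, color=color.red)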
Simple_RSI+PA+DCA Strategy
This strategy is the result of a study to better understand the workings of functions, for loops and the use of lines to visualize price levels. The strategy is a complete rewrite of the older RSI+PA+DCA Strategy with the goal to make it dynamic and to simplify the strategy settings to the bare minimum.
In case you are not familiar with the older RSI+PA+DCA Strategy, here is a short explanation of the idea behind the strategy:
The idea behind the strategy is based on an RSI approach of buying low. A position is entered when the RSI and moving average conditions are met. The position is closed when it reaches a specified take profit percentage. As soon as the first position is opened, multiple PA (price average) layers are set up, based on a specified percentage of price drop. When the price hits a layer, another position of the same size is opened, which lowers the average cost price (the white line). If the price drops further, another position is opened, lowering the price average again. When the price starts rising again, each position is closed separately as it reaches the specified take profit. Positions can be re-opened when the price drops again, and so on. When the price rises enough to cross over the average price and reach the specified Stop level (the red line) above it, all positions are closed at once and all orders are cancelled. From that moment on the strategy waits for another price dip before opening a new position.
This is the old RSI+PA+DCA Strategy:
The reason for completely rewriting the code is to create a more automated, adaptable and dynamic system. The old version is static: because of the linear code, the number of DCA levels was fixed at a maximum of 6 layers, and adding more meant manually changing the script. The big difference in the new version is that you can specify the number of DCA layers in the strategy settings. The use of 'for loops' in the code makes this very dynamic and adaptable.
The RSI code is adapted, just like in the old version, from the RSI Strategy - Buy The Dips by Coinrule and is used for study purposes. Any other low/dip-finding indicator can be used as well.
The distances between the DCA layers are calculated exponentially in a function. In the settings you can define the exponential scale that sets the distance between the layers: the bigger the scale, the bigger the distance. This calculation is not working perfectly yet and needs more experimentation. Feel free to leave a comment if you have a better idea about this.
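For illustration, exponential layer spacing could be generated like this (the drop sizes and the drawing are assumptions; the strategy's own function may differ):

//@version=5
indicator("Exponential DCA layers (sketch)", overlay=true)
layers   = input.int(6, "DCA layers")
baseDrop = input.float(1.0, "Base drop %")
scale    = input.float(1.5, "Exponential scale")
if barstate.islast
    float level = close
    for i = 0 to layers - 1
        // each successive layer sits baseDrop * scale^i percent below the previous one
        level := level * (1 - baseDrop * math.pow(scale, i) / 100)
        line.new(bar_index - 20, level, bar_index, level, color=color.orange)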
The idea behind generating DCA layers with a 'for loop' is inspired by the Backtesting 3commas DCA Bot v2 by rouxam .
The ideas for creating a dynamic position count and for opening and closing different positions separately based on a specified take profit are taken from the Simple_Pyramiding strategy I wrote previously.
This code is the result of a study and not intended for use as a fully functioning strategy. To make the code understandable for users who are not very familiar with Pine Script (like myself), every step in the code is commented to explain what it does. Hopefully it helps.
Enjoy!
Band Pass Normalized Suite (BPNS)
Outlier-Free Normalization and Band Pass Filtering
We present a technique for normalizing and filtering a given time series, source, in order to improve its stationarity and enhance its features. The technique includes two stages: outlier-free normalization and band pass filtering.
Outlier-Free Normalization:
In order to normalize source and reduce the impact of outliers, we first smooth the time series using an exponential moving average with a smoothing factor of alpha. The smoothed time series is then normalized by subtracting the minimum value within a given lookback period, dev_lookback, and dividing the result by the range (maximum - minimum) within the same lookback period. Outliers are detected and excluded from the normalization process by identifying values that are more than outlier_level standard deviations away from the exponentially smoothed average.
Band Pass Filtering:
After normalization, the time series is passed through a band pass filter to remove low and high frequency components. The specifics of the band pass filter implementation are not provided.
Code snippet:
bes(float source = close, float alpha = 0.7) =>
    var float smoothed = na
    smoothed := na(smoothed) ? source : alpha * source + (1 - alpha) * nz(smoothed[1])

max(source, outlier_level, dev_lookback) =>
    var float max = na
    src = array.new<float>()
    // distance from the exponentially smoothed average, in standard deviations
    stdev = math.abs((source - bes(source, 0.1)) / ta.stdev(source, dev_lookback))
    // outliers are replaced by the smallest float so they can never become the maximum
    array.push(src, stdev < outlier_level ? source : -1.7976931348623157e+308)
    max := math.max(nz(max[1]), array.get(src, 0))

min(source, outlier_level, dev_lookback) =>
    var float min = na
    src = array.new<float>()
    stdev = math.abs((source - bes(source, 0.1)) / ta.stdev(source, dev_lookback))
    // outliers are replaced by the largest float so they can never become the minimum
    array.push(src, stdev < outlier_level ? source : 1.7976931348623157e+308)
    min := math.min(nz(min[1]), array.get(src, 0))

min_max(src, outlier_level, dev_lookback) =>
    (src - min(src, outlier_level, dev_lookback)) / (max(src, outlier_level, dev_lookback) - min(src, outlier_level, dev_lookback)) * 100
To apply the outlier-free normalization and band pass filter to a given time series, source, the min_max() function can be called with the desired values for outlier_level and dev_lookback as arguments. For example:
normalized_source = min_max(source, 2, 50)
This will apply the outlier-free normalization and band pass filter to source, using an outlier_level of 2 standard deviations and a lookback period of 50 data points for both the normalization and outlier detection steps. The resulting normalized and filtered time series will be stored in normalized_source.
It is important to note that the choice of values for outlier_level and dev_lookback will have a significant impact on the resulting normalized and filtered time series. These values should be chosen carefully based on the characteristics of the input time series and the desired properties of the normalized and filtered output.
In conclusion, the outlier-free normalization and band pass filtering technique presented here provides a useful tool for preprocessing time series data and improving its stationarity and feature content. The flexibility of the method, through the choice of outlier_level and dev_lookback values, allows it to be tailored to the specific characteristics of the input time series.
AMASling - All Moving Average Sling Shot
This indicator modifies the SlingShot System by Chris Moody to allow it to be based on 'any' Fast and Slow moving average pair. Open Long / Close Long / Open Short / Close Short alerts can be generated for automated bot trading based on the SlingShot strategy:
• Conservative Entry (long) = Fast MA above Slow MA, and previous bar close below Fast MA, and current price above Fast MA
• Conservative Entry (short) = Fast MA below Slow MA, and previous bar close above Fast MA, and current price below Fast MA
• Aggressive Entry = Fast MA above Slow MA, and price below Fast MA
• Aggressive Exit = Fast MA below Slow MA, and price above Fast MA
Entries and exits can also be made based on moving average crossovers, I initially put this in to make it easy to compare to a more standard strategy, but upon backtesting combining crossovers with the SlingShot appeared to produce better results on some charts.
Alerts can also be filtered to allow long deals only when the fast moving average is above the slow moving average (uptrend) and short deals only when the fast moving average is below the slow moving averages (downtrend).
If you have a strategy that can buy based on External Indicators you can use the 'Backtest Signal' which plots the values set in the 'Long / Short Signals' section.
The Fast, Slow and Signal Moving Averages can be set to any of the following (a selection sketch follows the list):
• Simple Moving Average (SMA)
• Exponential Moving Average (EMA)
• Weighted Moving Average (WMA)
• Volume-Weighted Moving Average (VWMA)
• Hull Moving Average (HMA)
• Exponentially Weighted Moving Average (RMA) (SMMA)
• Linear regression curve Moving Average (LSMA)
• Double EMA (DEMA)
• Double SMA (DSMA)
• Double WMA (DWMA)
• Double RMA (DRMA)
• Triple EMA (TEMA)
• Triple SMA (TSMA)
• Triple WMA (TWMA)
• Triple RMA (TRMA)
• Symmetrically Weighted Moving Average (SWMA) ** length does not apply **
• Arnaud Legoux Moving Average (ALMA)
• Variable Index Dynamic Average (VIDYA)
• Fractal Adaptive Moving Average (FRAMA)
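A condensed sketch of how such a selector can be wired up; it covers only a subset of the types above, with the doubled and tripled averages built from the standard DEMA/TEMA constructions:

//@version=5
indicator("MA selector (sketch)", overlay=true)
maType = input.string("EMA", "MA type", options=["SMA", "EMA", "WMA", "VWMA", "HMA", "RMA", "DEMA", "TEMA"])
len    = input.int(21, "Length")
// note: ta.* calls inside switch branches compile with a consistency warning; acceptable for a sketch
ma(float src, simple int l, simple string t) =>
    switch t
        "SMA"  => ta.sma(src, l)
        "EMA"  => ta.ema(src, l)
        "WMA"  => ta.wma(src, l)
        "VWMA" => ta.vwma(src, l)
        "HMA"  => ta.hma(src, l)
        "RMA"  => ta.rma(src, l)
        "DEMA" => 2 * ta.ema(src, l) - ta.ema(ta.ema(src, l), l)
        "TEMA" => 3 * (ta.ema(src, l) - ta.ema(ta.ema(src, l), l)) + ta.ema(ta.ema(ta.ema(src, l), l), l)
        => ta.sma(src, l)
plot(ma(close, len, maType), "Selected MA")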
'Backtest Signal' and 'Deal State' are plotted to display.none, so change the Style Settings for the chart if you need to see them for testing.
Yes I did choose the name because 'It's Amasling!'
Any Ribbon
This indicator displays a ribbon of two individually configured Fast and Slow Moving Averages for a fixed time frame. It also displays the last close price of the configured time frame, colored green when above the band, red when below and blue when interacting with it. A label shows the percentage distance of the current price from the band (again red below, green above, blue interacting); when the price is within the band it shows the percentage distance from the median of the band.
The Fast and Slow Moving Averages can be set to:
Simple Moving Average (SMA)
Exponential Moving Average (EMA)
Weighted Moving Average (WMA)
Volume-Weighted Moving Average (VWMA)
Hull Moving Average (HMA)
Exponentially Weighted Moving Average (RMA) (SMMA)
Linear regression curve Moving Average (LSMA)
Double EMA (DEMA)
Double SMA (DSMA)
Double WMA (DWMA)
Double RMA (DRMA)
Triple EMA (TEMA)
Triple SMA (TSMA)
Triple WMA (TWMA)
Triple RMA (TRMA)
Symmetrically Weighted Moving Average (SWMA) ** length does not apply **
Arnaud Legoux Moving Average (ALMA)
Variable Index Dynamic Average (VIDYA)
Fractal Adaptive Moving Average (FRAMA)
I wrote this script after identifying some interesting moving average bands with my AMACD indicator and wanting to see them on the price chart. As an example look at the interactions between ETHBUSD 4hr and the band of VIDYA 32 Open and VIDYA 39 Open. Or start from the good old BTC Bull market support band, Weekly EMA 21 and SMA 20 and see if you can get a better fit. I find the Double RMA 22 a better fast option than the standard EMA 21.
AMACD - All Moving Average Convergence Divergence
This indicator displays the Moving Average Convergence and Divergence (MACD) of individually configured Fast, Slow and Signal Moving Averages. Buy and sell alerts can be set based on moving average crossovers, consecutive convergence/divergence of the moving averages, and directional changes in the histogram moving averages.
The Fast, Slow and Signal Moving Averages can be set to any of the following (a short assembly sketch follows the list):
Exponential Moving Average (EMA)
Volume-Weighted Moving Average (VWMA)
Simple Moving Average (SMA)
Weighted Moving Average (WMA)
Hull Moving Average (HMA)
Exponentially Weighted Moving Average (RMA) (SMMA)
Symmetrically Weighted Moving Average (SWMA)
Arnaud Legoux Moving Average (ALMA)
Double EMA (DEMA)
Double SMA (DSMA)
Double WMA (DWMA)
Double RMA (DRMA)
Triple EMA (TEMA)
Triple SMA (TSMA)
Triple WMA (TWMA)
Triple RMA (TRMA)
Linear regression curve Moving Average (LSMA)
Variable Index Dynamic Average (VIDYA)
Fractal Adaptive Moving Average (FRAMA)
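Given an ma(src, len, type) selector like the hypothetical sketch shown under AMASling above, the generalized MACD reduces to a few lines (the 12/26/9 lengths are the classic defaults, an assumption):

// continuing the AMASling selector sketch: assumes its ma() helper and maType input
fastLen   = input.int(12, "Fast length")
slowLen   = input.int(26, "Slow length")
signalLen = input.int(9, "Signal length")
macdLine   = ma(close, fastLen, maType) - ma(close, slowLen, maType)
signalLine = ma(macdLine, signalLen, maType)
histogram  = macdLine - signalLine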
If you have a strategy that can buy based on External Indicators use 'Backtest Signal' which returns a 1 for a Buy and a 2 for a sell.
'Backtest Signal' is plotted to display.none, so change the Style Settings for the chart if you need to see it for testing.
Nasdaq VXN Volatility Warning Indicator
Today I am sharing with the community a volatility indicator that uses the Nasdaq VXN Volatility Index to help you or your algorithms avoid black swan events. This is similar to the indicator I published last week that uses the SP500 VIX, but this indicator uses the Nasdaq VXN and can help inform strategies on the Nasdaq index or Nasdaq derivative instruments.
Variance is most commonly used in statistics to derive standard deviation (with its square root). It does have another practical application, and that is to identify outliers in a sample of data. Variance is defined as the squared difference between a value and its mean. Calculating that squared difference means that the farther away the value is from the mean, the more the variance will grow (exponentially). This exponential difference makes outliers in the variance data more apparent.
Why does this matter?
There are assets or indices that exist in the stock market that might make us adjust our trading strategy if they are behaving in an unusual way. In some instances, we can use variance to identify that behavior and inform our strategy.
Is that really possible?
Let’s look at the relationship between VXN and the Nasdaq100 as an example. If you trade a Nasdaq index with a mean reversion strategy or algorithm, you know that they typically do best in times of volatility. These strategies essentially attempt to “call bottom” on a pullback. Their downside is that sometimes a pullback turns into a regime change, or a black swan event. The other downside is that there is no logical tight stop that actually increases their performance, so when they lose they tend to lose big.
So that begs the question, how might one quantitatively identify if this dip could turn into a regime change or black swan event?
The Nasdaq Volatility Index (VXN) uses options data to identify, on a large scale, what investors overall expect the market to do in the near future. The Volatility Index spikes in times of uncertainty and when investors expect the market to go down. However, during a black swan event, historically the VXN has spiked a lot harder. We can use variance here to identify if a spike in the VXN exceeds our threshold for a normal market pullback, and potentially avoid entering trades for a period of time (i.e. maybe we don’t buy that dip).
Does this actually work?
In backtesting, this cut the drawdown of my index reversion strategies in half. It also cuts out some good trades (because high investor fear isn’t always indicative of a regime change or black swan event). But, I’ll happily lose out on some good trades in exchange for half the drawdown. Lets look at some examples of periods of time that trades could have been avoided using this strategy/indicator:
Example 1 – With the Volatility Warning Indicator, the mean reversion strategy could have avoided repeatedly buying this pullback that led to this asset losing over 75% of its value:
Example 2 - June 2018 to June 2019 - With the Volatility Warning Indicator, the drawdown during this period reduces from 22% to 11%, and the overall returns increase from -8% to +3%
How do you use this indicator?
This indicator determines the variance of VXN against a long term mean. If the variance of the VXN spikes over an input threshold, the indicator goes up. The indicator will remain up for a defined period of bars/time after the variance returns below the threshold. I have included default values I’ve found to be significant for a short-term mean-reversion strategy, but your inputs might depend on your risk tolerance and strategy time-horizon. The default values are for 1hr VXN data/charts. It will pull in variance data for the VXN regardless of which chart the indicator is applied to.
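The mechanics might be sketched as follows; the symbol, the "60" timeframe, the mean length, threshold and hold period are all assumptions to be tuned:

//@version=5
indicator("VXN Variance Warning (sketch)")
meanLen   = input.int(500, "Long-term mean length")
threshold = input.float(4.0, "Variance threshold")
holdBars  = input.int(24, "Bars to stay elevated")
vxn = request.security("CBOE:VXN", "60", close)      // 1hr VXN regardless of chart
variance = math.pow(vxn - ta.sma(vxn, meanLen), 2)   // squared distance from the long-term mean
var int lastSpike = na
if variance > threshold
    lastSpike := bar_index
warning = not na(lastSpike) and bar_index - lastSpike <= holdBars
plot(warning ? 1 : 0, "Volatility warning", style=plot.style_stepline)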
Disclaimer: Open-source scripts I publish in the community are largely meant to spark ideas or be used as building blocks for part of a more robust trade management strategy. If you would like to implement a version of any script, I would recommend making significant additions/modifications to the strategy & risk management functions. If you don’t know how to program in Pine, then hire a Pine-coder. We can help!
S&P500 VIX Volatility Warning Indicator
Today I am sharing with the community a volatility indicator that can help you or your algorithms avoid black swan events. Variance is most commonly used in statistics to derive standard deviation (with its square root). It does have another practical application, and that is to identify outliers in a sample of data. Variance in statistics is defined as the squared difference between a value and its mean. Calculating that squared difference means that the farther away the value is from the mean, the more the variance will grow (exponentially). This exponential difference makes outliers in the variance data more apparent.
Why does this matter?
There are assets or indices that exist in the stock market that might make us adjust our trading strategy if they are behaving in an unusual way. In some instances, we can use variance to identify that behavior and inform our strategy.
Is that really possible?
Let’s look at the relationship between VIX and the S&P500 as an example. If you trade an S&P500 index with a mean reversion strategy or algorithm, you know that they typically do best in times of volatility. These strategies essentially attempt to “call bottom” on a pullback. Their downside is that sometimes a pullback turns into a regime change, or a black swan event. The other downside is that there is no logical tight stop that actually increases their performance, so when they lose they tend to lose big.
So that begs the question, how might one quantitatively identify if this dip could turn into a regime change or black swan event?
The CBOE Volatility Index (VIX) uses options data to identify, on a large scale, what investors overall expect the market to do in the near future. The Volatility Index spikes in times of uncertainty and when investors expect the market to go down. However, during a black swan event, the VIX spikes a lot harder. We can use variance here to identify if a spike in the VIX exceeds our threshold for a normal market pullback, and potentially avoid entering trades for a period of time (I.e. maybe we don’t buy that dip).
Does this actually work?
In backtesting, this cut the drawdown of my index reversion strategies in half. It also cuts out some good trades (because high investor fear isn't always indicative of a regime change or black swan event). But I'll happily lose out on some good trades in exchange for half the drawdown. Let's look at some examples of periods when trades could have been avoided using this strategy/indicator:
Example 1 – With the Volatility Warning Indicator, the mean reversion strategy could have avoided repeatedly buying this pullback that led to SPXL losing over 75% of its value:
Example 2 - June 2018 to June 2019 - With the Volatility Warning Indicator, the drawdown during this period reduces from 22% to 11%, and the overall returns increase from -8% to +3%.
How do you use this indicator?
This indicator determines the variance of the VIX against a long-term mean. If the variance of the VIX spikes over an input threshold, the indicator goes up. The indicator will remain up for a defined number of bars after the variance returns below the threshold. I have included default values I've found to be significant for a short-term mean-reversion strategy, but your inputs might depend on your risk tolerance and strategy time horizon. The default values are for 1hr VIX data. It will pull in variance data for the VIX regardless of which chart the indicator is applied to.
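For readers who want to experiment, below is a minimal Pine sketch of the variance-spike logic described above. It is reconstructed from the description, not the published script: the symbol input, lengths, and threshold defaults are illustrative assumptions, and swapping the symbol to CBOE:VXN gives the Nasdaq variant described earlier on this page.

```pine
//@version=5
// A minimal sketch of the variance-spike logic described above, reconstructed
// from the description rather than taken from the published script. The symbol,
// lengths, and threshold defaults are illustrative assumptions.
indicator("Volatility Variance Warning (sketch)")

sym       = input.symbol("CBOE:VIX", "Volatility index")
meanLen   = input.int(500, "Long-term mean length", minval = 2)
threshold = input.float(25.0, "Variance spike threshold", minval = 0.0)
holdBars  = input.int(50, "Bars to stay up after variance settles", minval = 0)

// Pull the volatility index regardless of which chart the indicator is on.
vixClose = request.security(sym, timeframe.period, close)

// Squared deviation of the current value from its long-term mean.
variance = math.pow(vixClose - ta.sma(vixClose, meanLen), 2)

// Warning is up while variance exceeds the threshold, and stays up for
// `holdBars` bars after it drops back below.
spike      = variance > threshold
sinceSpike = ta.barssince(spike)
warning    = spike or (not na(sinceSpike) and sinceSpike <= holdBars)

plot(warning ? 1 : 0, "Warning", style = plot.style_histogram, color = color.red)
```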
Disclaimer: Open-source scripts I publish in the community are largely meant to spark ideas or be used as building blocks for part of a more robust trade management strategy. If you would like to implement a version of any script, I would recommend making significant additions/modifications to the strategy & risk management functions. If you don't know how to program in Pine, then hire a Pine-coder. We can help!
MACD Alert [All MA in one] [Smart Crypto Trade (SCT)]
This code is a gift from the "Smart Crypto Trade (SCT)" group.
The MACD indicator contains three EMAs; I think one of its best uses is trend detection and divergences.
In our indicator, you can select the type of moving average used in the MACD.
You can build the MACD from several types of moving averages, including:
Exponential Moving Average ( EMA )
Volume-Weighted Moving Average ( VWMA )
Simple Moving Average ( SMA )
Weighted Moving Average ( WMA )
Exponentially Weighted Moving Average (RMA), as used in RSI
Smoothed Moving Average ( SMMA )
Arnaud Legoux Moving Average ( ALMA )
Double EMA ( DEMA )
Double SMA (DSMA)
Double WMA (DWMA)
Double RMA (DRMA)
Triple EMA ( TEMA )
Triple SMA (TSMA)
Triple WMA (TWMA)
Triple RMA (TRMA)
Linear regression curve Moving Average ( LSMA )
Variable Index Dynamic Average ( VIDYA )
Fractal Adaptive Moving Average ( FRAMA )
In other words, we tried to collect the most popular MAs in one MACD indicator.
In addition, you can use four types of alert conditions to detect LONG or SHORT positions and trends. To use them, set an alert in the alert tab and choose one of the four default conditions; a minimal sketch of the overall approach follows below.
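As a rough illustration (not the SCT group's actual code), the sketch below shows how a single switchable MA function can drive every stage of the MACD, plus one example alert condition. Only a few of the listed MA types are included, and all names and defaults are assumptions.

```pine
//@version=5
// A minimal sketch of the MA-switch idea, not the SCT group's actual code.
// Only a few of the listed MA types are shown; names and defaults are assumptions.
indicator("MACD with selectable MA type (sketch)")

maType  = input.string("EMA", "MA type", options = ["EMA", "SMA", "WMA", "VWMA", "RMA", "DEMA"])
fastLen = input.int(12, "Fast length")
slowLen = input.int(26, "Slow length")
sigLen  = input.int(9, "Signal length")

// One function selects the MA type for every stage of the MACD.
ma(src, len) =>
    switch maType
        "EMA"  => ta.ema(src, len)
        "SMA"  => ta.sma(src, len)
        "WMA"  => ta.wma(src, len)
        "VWMA" => ta.vwma(src, len)
        "RMA"  => ta.rma(src, len)
        "DEMA" => 2 * ta.ema(src, len) - ta.ema(ta.ema(src, len), len)

macdLine   = ma(close, fastLen) - ma(close, slowLen)
signalLine = ma(macdLine, sigLen)

plot(macdLine, "MACD", color = color.blue)
plot(signalLine, "Signal", color = color.orange)
plot(macdLine - signalLine, "Histogram", style = plot.style_columns, color = color.gray)

// One example alert condition (a bullish signal-line cross); the published
// script offers four.
alertcondition(ta.crossover(macdLine, signalLine), "MACD Cross Up", "MACD crossed above its signal line")
```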
Enjoy
EvMA Bands
This indicator is like a final evolution of Bollinger Bands, weighted with exponential smoothing and volume.
The base line is my EvMA, a volume-weighted EMA, so it is quite responsive.
The standard deviation is also exponentially smoothed and volume-weighted; its reaction is too sharp to handle, so it is further smoothed with an EMA.
On charts without volume, volume is treated as 1 so no volume weighting is applied.
In trading, it can be used in the same way as Bollinger Bands.
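As a rough illustration, here is how such bands might be reconstructed from the description above. All names, lengths, and the exact dispersion formula are my assumptions, not the author's EvMA code.

```pine
//@version=5
// A minimal sketch reconstructing the idea as described: a volume-weighted EMA
// baseline with exponentially smoothed, volume-weighted deviation bands that
// are re-smoothed by a short EMA. Not the author's EvMA script; all names and
// defaults here are assumptions.
indicator("EvMA Bands (sketch)", overlay = true)

len       = input.int(20, "Length")
mult      = input.float(2.0, "Band multiplier")
smoothLen = input.int(5, "Extra EMA smoothing")

// On charts without volume, treat volume as 1 so no weighting is applied.
vol = na(volume) or volume == 0 ? 1.0 : volume

// Volume-weighted EMA: EMA of price*volume divided by EMA of volume.
evma = ta.ema(close * vol, len) / ta.ema(vol, len)

// Exponentially smoothed, volume-weighted dispersion around the baseline,
// then smoothed again because the raw value reacts too sharply.
rawDev = math.sqrt(ta.ema(math.pow(close - evma, 2) * vol, len) / ta.ema(vol, len))
dev    = ta.ema(rawDev, smoothLen)

plot(evma, "EvMA", color = color.teal)
plot(evma + mult * dev, "Upper band", color = color.gray)
plot(evma - mult * dev, "Lower band", color = color.gray)
```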
Moving Averages Linear Combinator
Linearly combining moving averages can produce relatively interesting results, such as low-lag moving averages or moving averages able to produce more pertinent crosses with the price.
As a reminder, a linear combination is a mathematical expression based on multiplying two variables (or terms) by two coefficients (also called scalars when working with vectors) and adding the results, that is:
ax + by
This expression is a linear combination, with x/y as variables and a/b as coefficients. Lots of indicators are made from linear combinations of moving averages; some examples include the double/triple exponential moving average, the least squares moving average, and the Hull moving average.
The proposed indicator allows the user to combine many types of moving averages together in order to get different results. We will introduce each setting of the indicator as well as how it affects the final output.
Explaining The Effects Of Linear Combinations
There are various ways to explain why linear combination can produce low-lag moving averages. Let's take, for example, the linear combination of a fast SMA of period p/2 and a slow SMA of period p, described as follows:
MA = 2SMA(p/2) + -1SMA(p)
Which is equivalent to:
MA = 2SMA(p/2) - SMA(p) = SMA(p/2) + SMA(p/2) - SMA(p)
We can see that the above linear combination amounts to adding a bandpass filter to the fast moving average, which reduces lag. It is important to note that lag is reduced when the first moving average term is more reactive than the second moving average term. If we instead use:
MA = -2SMA(p/2) + 1SMA(p)
we would have a combination between a low-pass and band-reject filter.
The Indicator
The indicator is based on the following linear combination:
Coeff × LeadingMA(length) - (Coeff-1) × LaggingMA(length)
The length setting controls the period of both moving averages. Leading controls the type of moving average used as the leading MA, while lagging controls the type used as the lagging MA; to get low-lag results, the leading MA should be more reactive than the lagging MA. Coeff controls the coefficients of the linear combination: higher values of coeff amplify the effect of the combination, negative values of coeff turn a low-lag moving average into a lagging one, coeff = 1 returns the leading MA, and (following the formula above) coeff = 0 returns the lagging MA. The leading period divisor divides the period of the leading MA by the selected number.
The available moving average types are: simple, exponentially weighted, triangular, least squares, Hull, and volume weighted. The lagging MA input also allows you to select another MA on the chart as input.
Example settings: length = 100, leading period divisor = 2, coeff = 2, with both MA types = SMA. Using coeff = -2 instead inverts the effect and yields a lagging average:
You can select "Plot leading and lagging" in order to show the leading and lagging MA.
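For reference, here is a minimal sketch of the combination formula above, reduced to two MA types for brevity. The defaults mirror the example settings, while the structure is my own reading of the description rather than the published script.

```pine
//@version=5
// A minimal sketch of the linear combination described above, reduced to two
// MA types for brevity (the published script offers more). Defaults mirror
// the example settings mentioned in the text.
indicator("MA Linear Combinator (sketch)", overlay = true)

length  = input.int(100, "Length")
divisor = input.int(2, "Leading period divisor", minval = 1)
coeff   = input.float(2.0, "Coeff")
leadTyp = input.string("SMA", "Leading MA type", options = ["SMA", "EMA"])
lagTyp  = input.string("SMA", "Lagging MA type", options = ["SMA", "EMA"])

ma(typ, src, len) =>
    typ == "EMA" ? ta.ema(src, len) : ta.sma(src, len)

leading = ma(leadTyp, close, math.max(1, length / divisor))
lagging = ma(lagTyp, close, length)

// coeff*Leading - (coeff-1)*Lagging: coeff = 1 gives back the leading MA,
// coeff = 0 gives back the lagging MA, and larger coeff amplifies the
// lag-reducing bandpass term.
combined = coeff * leading - (coeff - 1) * lagging

plot(combined, "Combined MA", color = color.purple)
```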
Conclusion
The proposed tool allows the user to create custom moving averages by making use of linear combinations. The script is not that useful when you think about it, and might be one of my worst, as it is relatively impractical; I'm not proud of it, but it still took time to make, so I decided to post it anyway.
Reflex & Trendflex
█ OVERVIEW
Reflex and Trendflex are zero-lag oscillators that decompose price into independent cycle and trend components using SuperSmoother filtering. These indicators isolate each component separately, providing clearer identification of cyclical reversals (Reflex) versus trending movements (Trendflex).
Based on Dr. John F. Ehlers' "Reflex: A New Zero-Lag Indicator" article (February 2020, TASC), both oscillators use normalized slope deviation analysis to minimize lag while maintaining signal clarity. The SuperSmoother filter removes high-frequency noise, then deviations from linear regression (Reflex) or current value (Trendflex) are measured and normalized by RMS for consistent amplitude across instruments and timeframes.
█ CONCEPTS
SuperSmoother Filter
Both oscillators begin with a two-pole Butterworth low-pass filter that smooths price data without the excessive lag of simple moving averages. The filter uses exponential decay coefficients and cosine modulation based on the cutoff period, providing aggressive smoothing while preserving signal timing.
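For readers who want to see the filter itself, here is a minimal Pine sketch of the two-pole SuperSmoother using the commonly published Ehlers coefficients; the parameter names are mine.

```pine
//@version=5
// A minimal sketch of Ehlers' two-pole SuperSmoother filter using the commonly
// published coefficients; parameter names are mine.
indicator("SuperSmoother (sketch)", overlay = true)

period = input.int(20, "Period")

superSmoother(src, len) =>
    a1 = math.exp(-1.414 * math.pi / len)
    b1 = 2.0 * a1 * math.cos(1.414 * math.pi / len)
    c2 = b1
    c3 = -a1 * a1
    c1 = 1.0 - c2 - c3
    var float ss = na
    // Two-pole recursion: blend the average of the last two inputs with the
    // two previous filter outputs.
    ss := c1 * (src + nz(src[1], src)) / 2 + c2 * nz(ss[1], src) + c3 * nz(ss[2], src)
    ss

plot(superSmoother(close, period), "SuperSmoother", color = color.blue)
```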
Reflex: Cycle Component
Reflex isolates cyclical price behavior by measuring deviation from a linear regression line fitted through the SuperSmoother output. For each bar, the filter calculates a linear slope over the lookback period, then sums how much the smoothed price deviates from this trendline. These deviations represent pure cyclical movement - price oscillations around the dominant trend. The result is normalized by RMS (root mean square) to produce consistent amplitude regardless of volatility or timeframe.
Trendflex: Trend Component
Trendflex extracts trending behavior by measuring cumulative deviation from the current SuperSmoother value. Instead of comparing to a regression line, it simply sums the differences between the current smoothed value and all past values in the period. This captures sustained directional movement rather than oscillations. Like Reflex, normalization by RMS ensures comparable readings across different instruments.
RMS Normalization
Both oscillators normalize their raw deviation measurements using an exponentially weighted RMS calculation: `rms = 0.04 * deviation² + 0.96 * rms[1]`, where `rms[1]` is the previous bar's value. This adaptive normalization ensures the oscillator amplitude remains stable as volatility changes, making threshold levels meaningful across different market conditions.
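Putting the three concepts together, the following self-contained sketch computes both oscillators as described, following the TASC article's published logic. It is a compact reading of the method, not necessarily this publication's exact source.

```pine
//@version=5
// A minimal, self-contained sketch of the Reflex and Trendflex calculations as
// described above, following Ehlers' February 2020 TASC article. This is a
// compact reading of the method, not this publication's exact source code.
indicator("Reflex & Trendflex (sketch)")

len = input.int(20, "Period")

// Same SuperSmoother as in the previous sketch, repeated so this block runs alone.
superSmoother(src, period) =>
    a1 = math.exp(-1.414 * math.pi / period)
    b1 = 2.0 * a1 * math.cos(1.414 * math.pi / period)
    c2 = b1
    c3 = -a1 * a1
    c1 = 1.0 - c2 - c3
    var float ss = na
    ss := c1 * (src + nz(src[1], src)) / 2 + c2 * nz(ss[1], src) + c3 * nz(ss[2], src)
    ss

filt = superSmoother(close, len)

// Reflex measures deviation from a straight line fitted over `len` bars;
// Trendflex measures deviation from the current filter value.
slope = (nz(filt[len], filt) - filt) / len
float sumReflex = 0.0
float sumTrend  = 0.0
for i = 1 to len
    sumReflex += filt + i * slope - nz(filt[i], filt)
    sumTrend  += filt - nz(filt[i], filt)
sumReflex /= len
sumTrend  /= len

// Exponentially weighted RMS normalization: ms = 0.04*dev^2 + 0.96*ms[1].
var float msReflex = 0.0
var float msTrend  = 0.0
msReflex := 0.04 * sumReflex * sumReflex + 0.96 * msReflex
msTrend  := 0.04 * sumTrend * sumTrend + 0.96 * msTrend

reflex    = msReflex > 0 ? sumReflex / math.sqrt(msReflex) : 0.0
trendflex = msTrend > 0 ? sumTrend / math.sqrt(msTrend) : 0.0

plot(reflex, "Reflex", color = color.blue)
plot(trendflex, "Trendflex", color = color.orange)
hline(0)
```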
█ INTERPRETATION
Reflex (Cycle Component)
Oscillates around zero representing cyclical price behavior isolated from trend:
• Above zero : Price is in upward phase of cycle
• Below zero : Price is in downward phase of cycle
• Zero crossings : Potential cycle reversal points
• Extremes : Indicate stretched cyclical condition, often precede mean reversion
Best used for identifying cyclical turning points in ranging or oscillating markets. More sensitive to reversals than Trendflex.
Trendflex (Trend Component)
Oscillates around zero representing trending behavior isolated from cycles:
• Above zero : Sustained upward trend
• Below zero : Sustained downward trend
• Zero crossings : Trend direction changes
• Magnitude : Strength of trend (larger absolute values = stronger trend)
Best used for confirming trend direction and identifying trend exhaustion. Less noisy than Reflex due to focus on directional movement rather than oscillations.
Combined Analysis
Using both oscillators together provides powerful signal confirmation:
• Both positive: Strong uptrend with positive cycle phase (high probability long setup)
• Both negative: Strong downtrend with negative cycle phase (high probability short setup)
• Divergent signals: Conflicting cycle and trend (choppy conditions, reduce position size)
• Reflex reversal with Trendflex agreement: Cyclical turn within established trend (entry/exit timing)
Dynamic Thresholds
Threshold bands identify statistically significant oscillator readings that warrant attention:
• Breach above +threshold : Strong bullish cycle (Reflex) or trend (Trendflex) behavior - potential overbought condition
• Breach below -threshold : Strong bearish cycle or trend behavior - potential oversold condition
• Return inside thresholds : Signal strength normalizing, potential reversal or consolidation ahead
• Threshold compression : During low volatility, thresholds narrow (especially with StdDev mode), making breaches more frequent
• Threshold expansion : During high volatility, thresholds widen, filtering out minor oscillations
Combine threshold breaches with zero-line position for stronger signals:
• Threshold breach + zero-line cross = high-conviction signal
• Threshold breach without zero-line support = monitor for confirmation
Alert Conditions
Six built-in alerts trigger on bar close (no repainting):
• Above +Threshold : Oscillator crossed above positive threshold (strong bullish behavior)
• Below -Threshold : Oscillator crossed below negative threshold (strong bearish behavior)
• Reflex Above Zero : Reflex crossed above zero (bullish cycle phase)
• Reflex Below Zero : Reflex crossed below zero (bearish cycle phase)
• Trendflex Above Zero : Trendflex crossed above zero (bullish trend shift)
• Trendflex Below Zero : Trendflex crossed below zero (bearish trend shift)
█ SETTINGS & PARAMETER TUNING
Oscillator Settings
• Source : Price series to decompose
• Reflex Period (5-50): SuperSmoother period for cycle component. Lower values increase responsiveness to cyclical turns but add noise. Default 20.
• Trendflex Period (5-50): SuperSmoother period for trend component. Lower values respond faster to trend changes. Default 20.
Display Settings
• Reflex/Trendflex Display : Toggle visibility and customize colors for each oscillator independently
• Zero Line : Reference line showing neutral oscillator position
Dynamic Thresholds
Optional significance bands that identify when oscillator readings indicate strong cyclical or trending behavior:
• Threshold Mode : Choose calculation method based on market characteristics
- MAD (Median Absolute Deviation) : Outlier-resistant, best for markets with occasional spikes (default)
- Standard Deviation : Volatility-sensitive, adapts quickly to regime changes
- Percentile Rank : Fixed probability bands (e.g., 90% = only 10% of values exceed threshold)
• Apply To : Select which oscillator (Reflex or Trendflex) to calculate thresholds for
• Period (2-200): Lookback window for threshold calculation. Default 50.
• Multiplier (k) : Scaling factor for MAD/StdDev modes. Higher values = fewer threshold breaches (default 1.5)
• Percentile (%) : For Percentile mode only. Higher percentile = more selective threshold (default 90%)
Parameter Interactions
• Shorter periods make both oscillators more sensitive but noisier
• Reflex typically more volatile than Trendflex at same period settings
• For ranging markets: shorter Reflex period (10-15) captures swings better
• For trending markets: shorter Trendflex period (10-15) follows trend shifts faster
█ LIMITATIONS
Inherent Characteristics
• Near-zero lag, not zero-lag : Despite the name, some lag remains from SuperSmoother filtering
• Normalization artifacts : RMS normalization can produce unusual readings during volatility regime changes
• Period dependency : Oscillator characteristics change significantly with different period settings - no "correct" universal parameter
Market Conditions to Avoid
• Very low volatility : Normalization amplifies noise in quiet markets, producing false signals
• Sudden gaps : SuperSmoother assumes continuous data; large gaps disrupt filter continuity requiring bars to stabilize
• Micro timeframes : Sub-minute charts contain microstructure noise that overwhelms signal quality
Parameter Selection Pitfalls
• Matching periods to dominant cycle : If period doesn't align with actual market cycle period, signals degrade
• Threshold over-tuning : Optimizing threshold parameters for past data often fails forward - use conservative defaults
• Ignoring component differences : Reflex and Trendflex measure different aspects - don't expect identical behavior
█ NOTES
Credits
These indicators are based on Dr. John F. Ehlers' "Reflex: A New Zero-Lag Indicator" published in the February 2020 issue of Technical Analysis of Stocks & Commodities (TASC) magazine. The article introduces a novel approach to isolating cycle and trend components using SuperSmoother filtering combined with normalized deviation analysis.
For those interested in the underlying mathematics and DSP concepts:
• Ehlers, J.F. (February 2020). "Reflex: A New Zero-Lag Indicator" - Technical Analysis of Stocks & Commodities magazine
• Ehlers, J.F. (2001). Rocket Science for Traders: Digital Signal Processing Applications . John Wiley & Sons
• Various TASC articles by John Ehlers on SuperSmoother filters and oscillator design
by ♚@e2e4
[BTX] TRIX + MA combined indicator (open version)
This indicator combines TRIX and an MA of TRIX in one. You can choose which type of moving average line to use (EMA or SMA).
Default values are 12 periods for TRIX and 10 periods for the MA of TRIX, which gives a better response to price movement.
This indicator can be used in all markets and on all timeframes. It is an update to my earlier indicator, which is a protected script. You can find it at the link: .
What is the TRIX (Triple Exponential Average) indicator?
TRIX is a momentum oscillator that displays the percent rate of change of a triple exponentially smoothed moving average. It was developed in the early 1980s by Jack Hutson, an editor for 'Technical Analysis of Stocks and Commodities' magazine. With its triple smoothing, TRIX is designed to filter out insignificant price movements. Chartists can use TRIX to generate signals similar to MACD. A signal line can be applied to look for signal line crossovers. A directional bias can be determined with the absolute level. Bullish and bearish divergences can be used to anticipate reversals.
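For illustration, here is a minimal sketch of a TRIX plus MA-of-TRIX pairing using the classic TRIX construction (percent rate of change of a triple EMA of log price). The 12/10 defaults mirror the description, but this is an assumed reconstruction, not the BTX script itself.

```pine
//@version=5
// A minimal sketch of a TRIX plus MA-of-TRIX pairing using the classic TRIX
// construction. Defaults mirror the description (12 / 10); this is an assumed
// reconstruction, not the BTX script itself.
indicator("TRIX + MA (sketch)")

trixLen = input.int(12, "TRIX length")
maLen   = input.int(10, "MA length")
maType  = input.string("EMA", "MA type", options = ["EMA", "SMA"])

// Triple exponential smoothing of log price, then the one-bar rate of change.
triple  = ta.ema(ta.ema(ta.ema(math.log(close), trixLen), trixLen), trixLen)
trixVal = 10000 * ta.change(triple)

signal = maType == "EMA" ? ta.ema(trixVal, maLen) : ta.sma(trixVal, maLen)

plot(trixVal, "TRIX", color = color.blue)
plot(signal, "Signal MA", color = color.orange)
hline(0)
```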