The focus and nature of investment analysis have gradually shifted over the decades, as the amount and complexity of the information available to the analyst have evolved.

There have long been two schools of investment analysis.

One school, concentrating on price information, is called “technical analysis”.

The other school, focusing on facts about the issuers of securities and the terms and conditions of the issue, is called “fundamental analysis”.

Easy facts: the advantage of technical analysis

Price information has always been easier to come by than facts about issuers and the securities themselves.

Technical analysis: massaging price and volume data

Technical analysis, in all its varied forms, dominates much of what passes for investment analysis today.

From Japanese candlestick charts, dating to the 18th century, to Modern Portfolio Theory, with its emphasis on alphas and betas, to the Black-Scholes equations that brought down Long Term Capital Management and the pride of the Nobel Gods — when we strip away the fancy graphs and esoteric equations, we find that the only facts involved seem to be price, volume, and simple data on corporate actions (like dividends, stock splits, or rights issues).
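
To see how little non-price information such models actually consume, consider the standard textbook form of the Black-Scholes price of a European call:

    C = S_0\,N(d_1) - K e^{-rT}\,N(d_2)
    d_1 = \frac{\ln(S_0/K) + (r + \sigma^2/2)\,T}{\sigma\sqrt{T}}
    d_2 = d_1 - \sigma\sqrt{T}

Every input is either a term of the contract (the strike K and the time to expiry T) or a quantity read off or estimated from the market: the spot price S_0, the risk-free rate r, and the volatility sigma, which is itself usually estimated from the price history. Nothing about the issuer's business enters the calculation.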

Some have compared technical analysis to astrology — more of a magical belief system than a rational science. I would agree with them if technical analysis were the only basis for selecting investments.

However, when combined with fundamental analysis, examining price and volume trends can provide useful insights.

Hard facts: the bane of fundamental analysis

The availability of facts about issuers and the terms and conditions of the securities issued has improved gradually over the years.

Prior to the regulatory reforms of the Great Depression, facts about securities (other than price and volume) were hard to come by.

When Benjamin Graham and David Dodd published “Security Analysis” in the 1930s, the availability of corporate information had improved considerably. They wrote in their book (which came to be the Bible of fundamental analysis):

… descriptive analysis consists of marshaling the important facts relating to an issue and presenting them in a coherent, readily intelligible manner. This function is adequately performed for the entire range of marketable corporate securities by the various manuals, the Standard Statistics and Fitch services and others. …

In an earlier article, I describe the quality of information available at the time “Security Analysis” was written.

The Standard Corporation Service volume of 1915 may be examined on Google Books.

More information than an analyst alone can handle

Whereas the complexity of the price and volume facts that support technical analysis has not changed much over the years, the same cannot be said of the facts needed to support fundamental analysis.

  1. Complexity: Corporations, corporate operations, and terms and conditions of securities have become exponentially more complicated and difficult to understand over the years since Graham & Dodd set forth the principles of fundamental analysis.
  2. Information Overload: Due to the Internet, the volume of fundamental facts now freely available to the analyst often surpasses the capacity of any individual to gather and organize.
See: The heroic, solitary security analyst is long gone.

Whereas, in the days of Benjamin Graham, the analyst could count on Standard Statistics to provide most of the essential facts, three-quarters of a century later this is no longer true of its successor, Standard & Poor’s.

Most of the book “Security Analysis” deals with the interpretation of the relatively simple data available in the market at the time. Little space is devoted to the task of obtaining this data.

The ratio of the work involved in fact-gathering to fact-interpretation was perhaps 1 to 20 in the 1930s.

Today, it would be more like 100 to 1!

Moving beyond Graham & Dodd

What this means for fundamental analysis is that in the 21st century much more attention must be given to the process of gathering, collating, and organizing basic facts about issuers, issues, and their operations.
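
As a small sketch of what that gathering and collating can look like with today's tools (the data source, the example issuer, and the reporting tag below are illustrative choices, not a prescription), the SEC's EDGAR system publishes machine-readable company facts that a short Python script can pull and organize:

    # Sketch: pull structured "company facts" for one issuer from SEC EDGAR.
    # The CIK below (Apple Inc.) and the "Revenues" tag are examples only;
    # issuers report under varying US-GAAP tags, so a real workflow would
    # first inspect the tags available. The SEC asks for a descriptive
    # User-Agent header identifying the requester.
    import json
    import urllib.request

    CIK = "0000320193"  # zero-padded, 10-digit CIK of the example issuer
    URL = f"https://data.sec.gov/api/xbrl/companyfacts/CIK{CIK}.json"

    req = urllib.request.Request(
        URL, headers={"User-Agent": "research-sketch example@example.com"}
    )
    with urllib.request.urlopen(req) as resp:
        facts = json.load(resp)

    # Organize one fundamental series: annual revenue as reported on Form 10-K.
    concept = facts["facts"]["us-gaap"].get("Revenues")
    if concept is None:
        print("No 'Revenues' tag for this issuer; inspect the available tags:")
        print(sorted(facts["facts"]["us-gaap"].keys())[:20])
    else:
        for item in concept["units"]["USD"]:
            if item.get("form") == "10-K":
                print(item["fy"], item["end"], item["val"])

The point is not this particular script, but the shift it represents: the collating that Standard Statistics once performed for the analyst now has to be scripted, scheduled, and checked by the analyst's own tools.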

Graham & Dodd give no clue about how to do this, for the simple reason that they knew nothing about the Internet, computer search techniques, capital market taxonomy, esoteric derivatives, or the myriad complexities that helped to crash the market in 2008.

Even the book A Modern Approach to Graham and Dodd Investing does not deal with the formidable task of mining the sea of information on the Internet, focusing instead on modern financial statement analysis. The problem is that, as was seen in the billion-dollar collapse of the market for Auction Rate Securities, the devil lay in the terms and conditions of the issue rather than in the financial statements of the issuer — facts which the statistical publishers failed to make clear to subscribers.

Where we are today

The Crash of 2008 did a pretty good job of trashing the Efficient Market Hypothesis, which had become a pillar supporting the notion that technical analysis in all its manifestations could reasonably serve as a stand-alone approach to investing.

The movers and shakers of Wall Street were revealed as ignorant of the basic facts needed to value securities.

For too long, price and volume, and all their derivative statistics and indices, had dominated investment behavior — leaving fundamental facts about issuers, issues, and the arcane details of operations in the closet.

The rating agencies are now in the dog house.

It is time to move on.

For more on the crisis of information, see:

  Crowdsourcing investment research: opportunities in OSINT
  Free information and the Efficient Market Hypothesis
  Crowdsourcing investment research: Capital Market Taxonomy
  Innovation in investment research; dealing with free information
  Modern technology for institutional investment research
 