Wednesday, 29 June 2011

Why we GIVE each big bank $12 billion per year

I've been reading quite a lot about the various political battles over efforts to regulate the banking industry, derivatives in particular. This afternoon, quite by coincidence, I noticed on a colleague's disordered desk a copy of a speech given last year by Andrew Haldane, Executive Director for Financial Stability of the Bank of England. Its message should be a matter of urgent public debate in the US, and elsewhere. Yet I haven't heard Haldane's arguments mentioned anywhere in the US media.

Banks, he argues, are polluters, and they need to be controlled as such. Let me run through several of his points in a little more detail. First, think of the car industry, where producers and users together pollute. Haldane:
"Exhaust fumes are a noxious by-product. Motoring benefits those producing and consuming car travel services – the private benefits of motoring. But it also endangers innocent bystanders within the wider community – the social costs of exhaust pollution."
He then goes on to make the point that we now face much the same situation with bankers, in part due to the proliferation of instruments which have made it possible for banks to spread risks far and wide and make profits even as they amplify system-wide risks. The banking industry is also a polluter:
"Systemic risk is a noxious by-product. Banking benefits those producing and consuming financial services – the private benefits for bank employees, depositors, borrowers and investors. But it also risks endangering innocent bystanders within the wider economy – the social costs to the general public from banking crises."
What is to be done about polluters? Back in the 1970s, in the face of rising air pollution, policy makers, guided by economists, worked out ways to address the problem through regulations and, where necessary, outright prohibitions of some practices. That's where we are now with the banks. The regulations under consideration recognize the social costs of systemic risk and try, sensibly, to find redress. The bankers' response has of course been to whine and scream.

The first practical step in addressing a pollution problem is to estimate how much pollution there is. How much do we pay to guarantee bank solvency during these not-too-infrequent systemic crises? Haldane points out that if you count only the monetary amount transferred to banks by governments, the cost of the recent crisis is fairly large -- about $100 billion in the US. But this is probably an extreme lower bound to the true collective cost. As Haldane remarks,
"...these direct fiscal costs are almost certainly an underestimate of the damage to the wider economy which has resulted from the crisis – the true social costs of crisis. World output in 2009 is expected to have been around 6.5% lower than its counterfactual path in the absence of crisis. In the UK, the equivalent output loss is around 10%. In money terms, that translates into output losses of $4 trillion [for the US] and £140 billion [for the UK] respectively."
Now, I'm not one to take GDP loss arguments too seriously. It's not a very good measure of human well-being, or even of the narrow economic well-being of a nation (in particular, it tells us nothing at all about how much of the store of natural resources may have been eaten up in achieving that GDP). However, the scale alone in this case suggests that the costs of the crisis -- and of other crises, which seem to strike more and more frequently with modern financial engineering and its global reach -- are immensely high.

But here is where Haldane's view becomes REALLY interesting -- and goes beyond anything you're likely to see in the popular media (especially in the US).

Lots of people recklessly borrowed in the run-up to the crisis; it wasn't only the fault of the recklessly lending banks. So how much of the systemic cost associated with financial crises really is due to the banks? One place to look, Haldane suggests, is the ratings given to banks by the ratings agencies, many of which take explicit note of the fact that governments give banks a subsidy in the form of implicit or explicit safeguards of their stability. Some agencies give banks two different ratings, one "with government support" and one "without." Haldane goes through a simple calculation based on these ratings and estimates that the effective subsidy to many big banks runs to billions per year. The data suggest the subsidy has been increasing for the past 50 years. The "too big to fail" problem shows up in the data as well -- the average ratings difference for large banks is much bigger than it is for small banks. As Haldane sums up:
"For the sample of global banks, the average annual subsidy for the top five banks was just less than $60 billion per year... the large banks account for over 90% of the total implied subsidy. On these metrics, the too-big-to-fail problem results in a real and on-going cost to the taxpayer and a real and on-going windfall for the banks."
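To make the arithmetic concrete, here is a minimal sketch of the kind of ratings-based calculation Haldane describes: the gap between a bank's "with support" and standalone ratings is mapped to a funding-cost advantage, which is then applied to the bank's rating-sensitive liabilities. The spread schedule and balance-sheet figures below are illustrative assumptions of mine, not Haldane's numbers.

```python
# Illustrative sketch of a ratings-based "implicit subsidy" estimate.
# The spread schedule and the balance-sheet figure are invented for
# illustration; Haldane's speech uses actual agency ratings and liability data.

# Assumed funding spread (basis points over a benchmark) at each rating notch.
SPREAD_BPS = {"AA": 30, "AA-": 40, "A+": 55, "A": 70, "A-": 90, "BBB+": 120}

def implied_subsidy(liabilities_bn, rating_with_support, rating_standalone):
    """Annual subsidy in $bn: funding-cost gap x rating-sensitive liabilities."""
    gap_bps = SPREAD_BPS[rating_standalone] - SPREAD_BPS[rating_with_support]
    return liabilities_bn * gap_bps / 10_000  # convert basis points to a fraction

# A hypothetical large bank: $1.5tn of rating-sensitive liabilities,
# rated AA- with government support but only A- on a standalone basis.
print(implied_subsidy(1_500, "AA-", "A-"))  # -> 7.5 ($bn per year)
```

Scaled up across a handful of giant balance sheets, numbers of this size add up quickly to the tens of billions Haldane reports.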
Haldane's speech -- as far as I'm aware -- is unique among statements by high-level banking figures in taking seriously the public costs entailed by banking practices. Imagine where the US might be today if we'd taken that $60 billion per year and, over the past decade, invested it in infrastructure, education and scientific research.

In the rest of his speech, Haldane explores what might be done to end or control this pollution, through taxes or by making some banking practices illegal. This too is illuminating, although what emerges is the view that finding a solution itself isn't so hard. What is hard is gathering the political will to take steps in the face of banking opposition. Publicizing how much they pollute, and how much we pay to let them do it, is perhaps a first step.

Friday, 24 June 2011

PIMCO questions US financial focus

It's quite something when the managing director of PIMCO comes out and announces that the US has gone too far in seeking "wealth creation via financial assets," rather than boring things like science and manufacturing. Sean Paul Kelley at The Agonist points to this, which is worth a read.

Thursday, 23 June 2011

UNCITRAL, IP and security interests

The most recent issue of the Uniform Commercial Code Law Journal, Vol. 43, No. 2, April 2011, published in coordination with the Penn State Dickinson School of Law, contains at p.601 a most interesting article by UNCITRAL Senior Legal Officer Spiros Bazinas, "Intellectual Property Financing Under The UNCITRAL Guide". To give readers some idea of its contents, the IP Finance blog quotes from the article's conclusions:
"The Guide and the Supplement are designed to facilitate intellectual property financing, without interfering with intellectual property law. This result is achieved by commentary and recommendations that deal with the creation, third-party effectiveness, priority, enforcement (even within insolvency) of a security interest in an intellectual property right, and the law applicable to such matters. 
The commentary of the Supplement explains how the recommendations of the Guide and the Supplement would apply in the context of an intellectual property financing transaction. They do so in a way that ensures better coordination between secured transactions and intellectual property law. With the same goal in mind, the recommendations of the Supplement modify the general recommendations of the Guide as they apply to security interests in intellectual property rights. 
The Supplement is intended to provide guidance to States as to issues relating to security interests in intellectual property rights. It is not intended to deal with purely intellectual property law issues. However, it includes mild suggestions as to how States that may wish to enact the recommendations of the Guide and the Supplement could coordinate their intellectual property laws with their enactment of the recommendations of the Guide and the Supplement. 
The integrity of intellectual property law is preserved through a general rule that gives precedence to intellectual property law where it deals in an asset-specific and different way with a matter addressed in the Guide and the Supplement. 
The creation of a security interest in an intellectual property right is simplified by requiring only a written security agreement. At the same time, the rights of third parties are protected by third-party effectiveness requirements that refer to the registration of a notice of a security interest in an intellectual property right in the general security interests registry (or, if there is a specialized intellectual property registry, in that registry). 
Similarly, the interests of competing claimants are protected by a set of rules according to which priority among competing claimants is to be determined on the basis of the time of registration of a notice of the security interest in the general security interest registry, with appropriate exceptions (for example, giving priority to a security interest registered in an Intellectual property registry over a security interest registered in the general security interest registry). 
A comprehensive set of enforcement provisions in the Guide and the Supplement is designed to ensure certainty as to the remedies of the secured creditor in the case of default with due protection of the rights of the grantor and other parties with interests in the encumbered intellectual property, and with due recognition being given to the basic principles of intellectual property law. 
Discussion of applicable law issues completes the treatment of security interests in intellectual property rights in the Guide and the Supplement in a practical way that is consistent with intellectual property law. The recommendation adopted breaks new ground and should enhance certainty of the law applicable to security interests in intellectual property rights and thus facilitate intellectual property financing. 
Finally, the discussion of insolvency-related issues is intended to supplement the regime of security interests in intellectual property with an analysis of the impact of the licensor's or licensee's insolvency on a security right in that party's rights under a licence agreement. It can be reasonably expected that the Guide will become a common reference tool in all secured transactions reform efforts. The reference to the Guide in the Australian and South Korean secured transactions law reform initiatives, as well as the use of the Guide in the Draft Common Frame of Reference and in World Bank secured transaction law reform documents justify such an expectation. 
The expectation of the Supplement can be no less, in particular, as it is the first text of its kind that deals with the issues at the intersection of secured transactions, insolvency, conflict-of-laws and intellectual property law, and cuts in an Alexandrian way many Gordian knots in that respect or even breaks new ground with rules that attracted a great deal of consensus among experts who initially were not of one mind".
Meanwhile, preparations for next week's Commission session (Vienna, 27 June to 8 July 2011) are going well. The provisional agenda (A/CN.9/711) is here.  This blog is indebted to Spiros for letting us know that
"On security interests, we have just a brief report on the progress of WG VI and on the coordination with: (i) the security interests texts of Unidroit and the Hague Conference; (ii) the WB in preparing a joint UNCITRAL/WB set of standards on ST; and (iii) the EU Commission on the law applicable to the proprietary effects of assignments (the BIICL is preparing a study). Security interests will probably be discussed between 4 and 6 July".
Long-standing readers of this weblog will recall that it was UNCITRAL's first efforts in the IP-security interests area that triggered this blog's founding in January 2008.  The blog and UNCITRAL have both grown through an experience which, though initially marked by friction, suspicion and misunderstanding, has brought a good deal of benefit to both the IP and the finance communities. Thank you, Spiros and UNCITRAL, for taking our comments, and our commitment, so seriously.

(Corrupt) Lawmakers challenge derivatives rules

I'm not the least bit surprised by this story detailing the pushback by well-funded (by whom, do you think?) US senators and representatives against proposed rules to rein in derivatives. Here's some choice material:
The lawmakers, Republicans and Democrats alike, argue that some proposed rules could force Wall Street’s derivatives business overseas. They also say that regulators are ignoring a crucial exemption to the rules spelled out in the Dodd-Frank financial regulatory law.

The law excused airlines, oil companies and other nonfinancial firms known as end-users from new restrictions, including a rule that derivatives must be cleared and traded on regulated exchanges. The firms use derivatives to hedge against unforeseen market changes, say a rise in fuel costs or interest rates, rather than to speculate.

“We are concerned that recent rule proposals may undermine these exemptions, substantially increasing the cost of hedging for end-users, and needlessly tying up capital that would otherwise be used to create jobs and grow the economy,” Senator Debbie Stabenow, Democrat of Michigan and chairwoman of the Senate Agriculture Committee, and Representative Frank D. Lucas, her Republican counterpart in the House, said in a letter this week to regulators.
I particularly like the phrase "airlines, oil companies and other nonfinancial firms known as end-users" identifying those who are exempt from the restrictions. Does anyone doubt that lawyers and accountants at Goldman Sachs, JP Morgan and virtually every other big bank are working overtime right now deciding how they can turn the bank, or some subsidiary in the Cayman Islands, into an airline, oil company or other nonfinancial firm? I would bet they're just looking for a new pathway through which to route all their high-risk stuff - and they've counseled the lawmakers on how they can best carve out some useful routes.

The other thing that is simply precious is this (the likes of which we've heard many times already, of course):
The lawmakers, Republicans and Democrats alike, argue that some proposed rules could force Wall Street’s derivatives business overseas.
The proper response would be not to worry, but to say GOOD, PLEASE HURRY UP! Let them take their financial engineering business overseas and blow up someone else's economy.


The ticking CDS time-bomb

The looming mess in Europe, linked to financial distress in Greece, looks like a perfect if rather frightening illustration of the malign consequences of over-dense banking interdependence for global financial stability. In this case -- as with the crisis of 2007-2008 -- the root cause of the trouble is CDSs and other derivatives. No one in Europe is quite sure how many reckless gambles banks have made over Greek debt and potential default, and the European Central Bank appears to be deathly afraid that such gambles have the potential to bring down the entire financial house of cards.

What's happened to the CDS market over the past decade? It's exploded. Amazingly, the value of outstanding CDS linked to debt in Greece, Italy, Spain and Portugal has doubled in the past three years -- since 2008!! The New York Times today discusses what can only be described as a ridiculous situation -- Europe pushed to the brink of a financial disaster by the actions of a small number of people gambling with other peoples' money in the dark, and doing so in the direct aftermath of the greatest financial crisis since the Great Depression:
The uncertainty, financial analysts say, has led European officials to push for a “voluntary” Greek bond financing solution that may sidestep a default, rather than the forced deals of other eras. “There’s not any clarity here because people don’t know,” said Christopher Whalen, editor of The Institutional Risk Analyst. “This is why the Europeans came up with this ridiculous deal, because they don’t know what’s out there. They are afraid of a default. The industry is still refusing to provide the disclosure needed to understand this. They’re holding us hostage. The Street doesn’t want you to see what they’ve written.”
 Wonderful. We've known about this danger for at least several years and have done nothing about it. But in fact, we've actually known about such danger for far longer, and have only taken steps to make our problems worse. A couple of years ago CBS aired an examination of the CDS market, and it is still worth watching. One interesting comment from the program:
It would have been illegal [selling CDSs of any kind] during most of the 20th century under the gaming laws, but in 2000, Congress gave Wall Street an exemption and it has turned out to be a very bad idea.

Wednesday, 22 June 2011

Dirty little derivatives secrets...

The first dirty little secret of the derivatives industry -- probably not so secret to those in the financial industry, but unknown to most others who still think financial markets in some approximation are fair and efficient -- is that some of the big banks control the market and expressly inhibit competition to protect their profits. I just stumbled across this still highly relevant exposition by the New York Times of efforts to place derivatives trading within properly defined clearinghouses, and the banks' countervailing efforts to gain control over those clearing houses so as to block competition.

The banks (invoking some questionable claims of economic theory) like to argue that derivatives make markets more efficient because they make them more "complete." As Eugene Fama puts it: "Theoretically, derivatives increase the range of bets people can make, and this should help to wipe out potential inefficiencies." Available information, the idea goes, should flow more readily into the market. But the truth seems to be that derivatives make banks more profitable at everyone's collective expense, and not only because they make markets more unstable (see more on this below). From the New York Times article:
Two years ago, Kenneth C. Griffin, owner of the giant hedge fund Citadel Group, which is based in Chicago, proposed open pricing for commonly traded derivatives, by quoting their prices electronically. Citadel oversees $11 billion in assets, so saving even a few percentage points in costs on each trade could add up to tens or even hundreds of millions of dollars a year.

But Mr. Griffin’s proposal for an electronic exchange quickly ran into opposition, and what happened is a window into how banks have fiercely fought competition and open pricing.

To get a transparent exchange going, Citadel offered the use of its technological prowess for a joint venture with the Chicago Mercantile Exchange, which is best-known as a trading outpost for contracts on commodities like coffee and cotton. The goal was to set up a clearinghouse as well as an electronic trading system that would display prices for credit default swaps.

Big banks that handle most derivatives trades, including Citadel’s, didn’t like Citadel’s idea. Electronic trading might connect customers directly with each other, cutting out the banks as middlemen.

The article goes on to describe a host of maneuvers that Goldman Sachs, JP Morgan and other big banks used to block this idea, or at least to make sure they'd be locked into the gears of such an electronic exchange. Eventually the whole idea fell apart to the banks' relief. Guess who's paying the price?
Mr. Griffin said last week that customers have so far paid the price for not yet having electronic trading. He puts the toll, by a rough estimate, in the tens of billions of dollars, saying that electronic trading would remove much of this “economic rent the dealers enjoy from a market that is so opaque.”

"It’s a stunning amount of money,” Mr. Griffin said. “The key players today in the derivatives market are very apprehensive about whether or not they will be winners or losers as we move towards more transparent, fairer markets, and since they’re not sure if they’ll be winners or losers, their basic instinct is to resist change.”
But there's another dirty little secret about the derivatives industry, and this goes back to the question of whether these instruments really do have benefits, by making markets more efficient, perhaps, or if instead they might make them more unstable and prone to collapse. Warren Buffett was certainly clear in his opinion, expressed in his letter (excerpts here) to Berkshire Hathaway shareholders back in 2002: "I view derivatives as time bombs, both for the parties that deal in them and the economic system." But the disconcerting truth about derivatives emerges in more certain terms from new, fundamental analyses of how precisely they can stir up natural market instabilities.

I'm thinking primarily of two bits of research -- one very recent and the other a few years old -- both of which should be known by anyone interested in the impact that derivatives have on markets. Derivatives can obviously let people hedge risks -- locking in affordable fuel for the winter months in advance, for example. But they're used for risk taking as much as hedging, and can easily create collective market instability. These two studies show -- from within the framework of economic theory itself -- that adding derivatives to markets in pursuit of the nirvana of market completeness should indeed make those markets less stable, not more.

I'm currently working on a post (it's taking a little time) that will explore these works in more detail. I hope to get it up very shortly. Meanwhile, these two examples of science on the topic might be something to keep in mind as the banks try hard to confuse the issue and obscure what ought to be the real aim of financial reform -- to return the markets to their proper role as semi-stable systems providing funds for creative and valuable enterprise. Markets should be a public good, not a rigged casino benefiting the few and guaranteed by the public.

Sunday, 19 June 2011

The Grand Inquisitors of Rational Expectations

In this short snippet (two minutes and 23 seconds) of an interview, John Kay sums up quite succinctly the situation facing Rational Expectations theorists in the light of what has happened in the past several years. Reality just isn't respecting their (allegedly) beautiful mathematical theories.

In Bertolt Brecht's play The Life of Galileo, Kay notes, there's a moment when the Grand Inquisitors of the Church refuse to look through Galileo's telescope. Why? Because the Catholic church had essentially deduced the motion of the planets from a set of axioms. They refused to look, as Kay puts it,
...on the grounds that the Church has decreed that what he sees cannot be there. This makes me think of the way some of the economists who believe in Rational Expectations have reacted to events of the past few years. [They're like the inquisitors with Galileo]. ...they refuse to look through the telescope because they know on a priori grounds that what he saw wasn't actually there.

Saturday, 18 June 2011

Millisecond mayhem

The terrifying Flash Crash of 6 May 2010 has long since dropped out of the news. The news cycle more or less ended with the release of the SEC's final report on the event in October of last year, which concluded that...well... the event got kicked off by a big trade in E-Mini futures by Waddell and Reed and played out in two subsequent liquidity crises exacerbated -- and crammed into a very short time-scale -- by high-frequency traders. In essence, the report concluded that A happened, then B happened, which caused C to happen, etc., and we had this Flash Crash. What it didn't explore is WHY this kind of thing was possible, WHY the markets as currently configured should be prone to such instabilities, or WHY we should have any confidence similar things won't happen again.

I'm not sure what triggered my interest, but I had a quick look today to see if any similar events have taken place more recently. Back in November of last year the New York Times reported on about a dozen episodes it called "mini flash crashes" in which individual stocks plunged in value over a few seconds, before then recovering. In one episode, for example, stock for Progress Energy -- a company with 11,000 employees -- dropped 90% in a few seconds.  These mini whirlwinds are continuing to strike fear into the market today.

For example, this page at Nanex (a company that runs and tracks a whole-market datafeed) lists a number of particularly volatile events over previous months, events in which single stocks lost 5%, 17%, even 95% over a second or five seconds before then recovering. According to Nanex, events of this kind are now simply endemic to the market -- the 6 May 2010 event simply seems larger than similar events taking place all the time:
The most recent data available are for the first month and three days of 2011. In that period, stocks showed perplexing moves in 139 cases, rising or falling about 1% or more in less than a second, only to recover, says Nanex. There were 1,818 such occurrences in 2010 and 2,715 in 2009, Nanex says.
A few specific examples, as reported in this USA Today article:
•Jazz Pharmaceuticals' stock opened at $33.59 on April 27, fell to $23.50 for an instant, then recovered to close at $32.93. "There was no circuit break," says Joe Saluzzi, trader at Themis Trading, because Jazz did not qualify for rules the exchanges put in place after the flash crash for select stocks following extreme moves.

•RLJ Lodging Trust was an initial public offering on May 11. It opened at $17.25 its first day, then a number of trades at $0.0001 took place in less than a second before the stock recovered. The trades were later canceled, but it's an example of exactly what is not supposed to happen anymore, Hunsader says.

•Enstar, an insurer, fell from roughly $100 a share to $0 a share, then back to $100 in just a few seconds on May 13.

•Ten exchange traded funds offered by FocusShares short-circuited on March 31. One, the Focus Morningstar Health Care Index, opened at $25.32, fell to 6 cents, then recovered, says Richard Keary of Global ETF Advisors. The trades were canceled. "No one knows how frequently this is happening," he says.

•Health care firms Pfizer and Abbott Labs experienced the opposite of a flash crash on May 2 in after-hours trading. Abbott shares jumped from $50 to more than $250, and Pfizer shot from $27.60 to $88.71, both in less than a second, Nanex says. The trades were canceled.
 Apparently, according to the Financial Times, something similar happened just over a week ago, on 9 June, in natural gas futures.

I haven't seen anyone explain these events in a clear and natural way. I still see a lot of hand waving and vague talk about computer errors and fat fingers. But it seems unlikely these tiny explosions in the market are all driven by accidents. Much more likely, it seems to me, is that these events are somehow akin to the dust devils you see when driving through a desert -- completely natural if rather violent little storms whipped up by ordinary processes. The question is: what are those processes? Also -- how dangerous are they?

The best hint at an explanation I've seen comes from this analysis by Michael Kearns of the University of Pennsylvania and colleagues. Their idea was to study the dynamics of the limit order mechanism which lies at the mechanical center of today's equity markets, and to see if it is perhaps prone to natural instabilities -- positive feedbacks that would make it likely for whirlwind-like movements in prices to take place quite frequently. In other words, are markets prone to the Butterfly Effect? Their abstract gives a pretty clear description of their study and results:
We study the stability properties of the dynamics of the standard continuous limit-order mechanism that is used in modern equity markets. We ask whether such mechanisms are susceptible to "Butterfly Effects" -- the infliction of large changes on common measures of market activity by only small perturbations of the order sequence. We show that the answer depends strongly on whether the market consists of "absolute" traders (who determine their prices independent of the current order book state) or "relative" traders (who determine their prices relative to the current bid and ask). We prove that while the absolute trader model enjoys provably strong stability properties, the relative trader model is vulnerable to great instability. Our theoretical results are supported by large-scale experiments using limit order data from INET, a large electronic exchange for NASDAQ stocks.
The "absolute" traders in this setting act more like fundamentalists who look to external information to make their trades, rather than the current state of the market. The "relative" traders are more akin, at least in spirit, to momentum traders -- they're responding to what just happened in the market a split second ago and changing their strategies on the fly. Without any question, there are indeed many high-frequency traders who are "relative" traders -- probably most. So mini-flash crashes -- perhaps -- are merely a sign of natural instability and chaos in the micro dynamics of the market.

Friday, 17 June 2011

Real steps on banking reform?

Don't want to get too wildly optimistic. When it comes to banking regulations, disappointment always lies just around the corner and everything important happens behind the scenes. But a couple of things today give me some cautious hope that regulators interpreting the new Basel III banking rules may actually take some real steps to curb systemic risks -- they may even take the structure of banking network interactions into account.

First, Simon Johnson gives an excellent summary of recent developments in the US where banking lobbyists seem to have been caught flat-footed by recent steps taken by Federal Reserve governor Dan Tarullo. I hope this isn't just wishful thinking.

The banks are apparently pushing four key arguments to explain why it's a really horrific idea to make the banking system more stable, and why, especially, the world will probably end quite soon in a spectacular fiery cataclysm if the biggest and most well-connected banks are required to keep an additional few percentage points of capital. Johnson dissects these arguments quite effectively and suggests, encouragingly, that regulators at the Fed, charged with interpreting Basel III, aren't convinced either.

Elsewhere, this Bloomberg article suggests -- and this really surprises me -- that the measures under consideration would...
.. subject banks to a sliding scale depending on their size and links to other lenders.
Now this is an interesting development. Someone somewhere seems to be paying at least a little attention to what we're learning about banking networks, and how some risks are tied more directly to network linkages, rather than to the health of banks considered individually. The density of network linkages itself matters.

Research I've written about here suggests that there's essentially no way to safeguard a banking system unless we monitor the actual network of links connecting banks. It's certainly an encouraging step that someone is thinking about this and trying to find ways to bring density of linkages into the regulatory equation. I hope they're pondering the figure below from this study which shows how (in a model) the overall probability of a banking failure (the red line) at first falls with increasing diversification and linking between different banks, but then abruptly begins rising when the density gets too high.


The implication is that there's likely to be a sweet spot in network density (labeled in the figure as diversification, this being the number of links a bank has to other banks) from which we should not stray too far, whether the big banks like it or not.
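For readers who want to play with this kind of question, here is a deliberately crude default-cascade simulation on a random interbank lending network (my own toy, not the model behind the figure): each bank lends a fixed slice of its balance sheet to a handful of random counterparties, one bank is hit by an external shock, and failures propagate to lenders whose losses exceed their capital buffer. Sweeping the number of links per bank shows how sharply contagion depends on network density; richer models of the kind the linked study uses (with correlated shocks, liquidity hoarding or fire sales) are what produce the non-monotonic, sweet-spot behaviour in the figure.

```python
import random

def prob_large_cascade(n_banks=50, n_links=2, trials=1000,
                       capital=0.05, interbank_frac=0.3, shock=0.5):
    """Toy default-cascade model on a random interbank network.

    Each bank lends interbank_frac of its (unit) balance sheet, split equally
    across n_links random counterparties. One random bank takes an external
    shock; a bank fails once its losses exceed its capital buffer, and its
    lenders then lose their exposure to it. Returns the fraction of trials
    in which more than 10% of banks fail. All parameters are illustrative.
    """
    exposure = interbank_frac / n_links      # loss per failed borrower
    big_cascades = 0
    for _ in range(trials):
        lenders_of = {j: [] for j in range(n_banks)}
        for i in range(n_banks):
            for j in random.sample([k for k in range(n_banks) if k != i], n_links):
                lenders_of[j].append(i)      # bank i has lent to bank j
        losses = [0.0] * n_banks
        failed = set()
        first = random.randrange(n_banks)
        losses[first] = shock                # the initial external hit
        queue = [first]
        while queue:
            b = queue.pop()
            if b in failed or losses[b] < capital:
                continue
            failed.add(b)
            for lender in lenders_of[b]:     # propagate the loss upstream
                losses[lender] += exposure
                if lender not in failed and losses[lender] >= capital:
                    queue.append(lender)
        if len(failed) > 0.1 * n_banks:
            big_cascades += 1
    return big_cascades / trials

for k in (1, 2, 4, 8, 16):
    print(k, prob_large_cascade(n_links=k))
```

Even in a toy this simple, the message is that the danger of a system-wide cascade is a property of the web of exposures, not of any single bank's balance sheet viewed in isolation.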

IP – the value and risks to a fast growing business

- was the title of a seminar recently held at the London office of insurance company MunichRe.

In addition to launching a new IP insurance product, the seminar also presented the findings of a survey of attitudes to IP insurance recently carried out by Morwenna Rees-Mogg, proprietor of AngelNews, a media business focused on business angels, VCs and early stage funded companies.

One interesting conclusion of the survey was that “investors should and will pay for the right insurance”, with over half of the investors surveyed agreeing that a company’s valuation would be enhanced by having IP insurance in place. Around 75% of those surveyed also felt that they would be more willing to do business with an SME having IP insurance.

Thursday, 16 June 2011

Complexity economics

Complexity is one of the hottest concepts currently nipping at the fringes of economic research. But what is it? Richard Holt, David Colander and Barkley Rosser last year wrote a nice essay on the concept which makes some excellent points, and they're certainly optimistic about the prospects for a sea change in the way economic theory is done:
The neoclassical era in economics has ended and has been replaced by an unnamed era. We believe what best characterizes the new era is its acceptance that the economy is complex, and thus that it might be called “the complexity era.”
Indeed, something like this seems to be emerging. I've been writing about complexity science and its applications in physics, biology and economics for a decade, and the ideas are certainly far more fashionable now than they were before.

But again -- what is complexity? One key point that Holt and colleagues make is that complexity science recognizes and accepts that many systems in nature cannot be captured in simple and timeless equations with elegant analytical solutions. That may have been true with quantum electrodynamics and general relativity, but it's decidedly not true for most messy real world systems -- and this goes for physics just as much as it does for economics. Indeed, I think it is fair to say that much of the original impetus behind complexity science came out of physics in the 1970s and 80s as the field turned toward the study of collective organization in disordered systems -- spin glasses and glassy materials, granular matter, the dynamics of fracture and so on. There are lots of equations and models used in the physics of such systems, but no one has the intention or hope of discovering the final theory that would wrap up everything in a tidy formula or set of axioms.

Most of the world isn't like that. Rather, understanding means creating and using a proliferation of models, every one of which is partial and approximate and incomplete, which together help to illuminate key relationships -- especially relationships between the properties of things at a lower level (atoms, molecules, automobiles or people) and those at a higher level (crystal structures, traffic jams or aggregate economic outcomes).

Holt and colleagues spend quite some time discussing definitions of complexity, a task that I'm not sure is really worth the effort. But they do arrive at a useful distinction between three broad views -- a general view, a dynamic view, and a computational view. The second of these seems most interesting and directly related to emerging research focusing on instabilities and rich dynamics in economic systems. As stated by Rosser, a system is "dynamically complex" if...
it endogenously (i.e. on its own, and not because of external interference) does not tend asymptotically to a fixed point, a limit cycle, or an explosion.
In other words, a dynamically complex system never settles down into one equilibrium state, but has rich internal dynamics which persist. I'm not sure this definition covers all the bases, but it comes close and certainly strikes in the right direction.

Holt, Colander and Rosser go on to outline a number of areas where the complexity viewpoint is currently altering the landscape of economic research. I wouldn't quibble with the list: evolutionary game theory is bringing institutions more deeply into economic analysis, ecological economics is actually bringing the consideration of biology into economics (imagine that!), behavioural economics is taking account of how real people behave (again, what a thought!), agent-based models are providing a powerful alternative to analytical models, and so on.

This is all insightful, but I think perhaps one point could be more strongly emphasized -- the absolute need to recognize that what happens at higher macro-levels in a system often depends in a highly non-intuitive way on what happens at the micro-level. One of the principal barriers to progress in economics and finance, in my opinion, has been the systematic effort by theorists over decades to avoid facing up to this micro-to-macro problem, typically through various analytical tricks. The most powerful trick -- a pair of tricks, really -- is 1) to assume that individuals are rational (hence making the study of human behaviour a problem not of psychology but of pure mathematics) and 2) to assume (in what is called the representative agent method) that the behaviour of collective groups, indeed entire markets and economies, can be calculated as the simple sum of the rational actions of the people making them up.

The effect of this latter trick is actually quite amazing -- it eliminates from the problem, by definition, all of the interesting interactions and feedbacks between people which make economies and markets rich and their dynamics surprising. Having done this, economics becomes a mathematical task of exploring the nature of rational behaviour -- it essentially places the root of complex collective economic outcomes inside the logical mind of the individual.

To put it most simply, this way of thinking tends to attribute outcomes in collective systems directly to properties of parts at the individual level, which is a terrific mistake. This might sometimes be the case. But we know from lots of examples in physics, biology and computer science that very simple things, in interaction, can give rise to astonishing complexity in a collective group. We should expect the same in social science: many surprising and non-intuitive phenomena at the collective level may reflect nothing tricky at all in the behaviour of individuals, but only the tendency for rich structures and dynamics to emerge in collective systems, created by myriad pathways of interaction among the system's parts.
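A standard toy example (Granovetter's old threshold model, not something from the essay discussed above) makes the point starkly: two populations of agents can be almost indistinguishable individual by individual, and completely indistinguishable to a representative-agent average, yet produce wildly different aggregate outcomes.

```python
# Granovetter-style threshold model: each agent acts (buys, sells in a panic,
# joins a run) once the number of others already acting reaches that agent's
# personal threshold. A textbook toy, included purely for illustration.

def cascade_size(thresholds):
    """Iterate until no new agent acts; return how many acted in total."""
    acting = 0
    while True:
        new_acting = sum(1 for t in thresholds if t <= acting)
        if new_acting == acting:
            return acting
        acting = new_acting

# Population A: thresholds 0, 1, 2, ..., 99 -> a full domino cascade.
pop_a = list(range(100))
# Population B: identical except one agent's threshold moves from 1 to 2.
pop_b = [0, 2] + list(range(2, 100))

print(cascade_size(pop_a))  # 100 -- everyone ends up acting
print(cascade_size(pop_b))  # 1   -- the cascade dies immediately
```

The average threshold barely moves between the two populations, yet the aggregate outcome flips from everyone acting to almost no one; no representative agent, however rational, can capture that.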

Taking this point seriously is what I think most of complexity science -- as applied to social systems -- is about. It's certainly central to everything I write about under the phrase the "physics of finance." I take physics in the broad sense as the study of how organization and order and form well up in collective systems. It so happened that physics started out on this project in the context of physical stuff, electrons, atoms, molecules and so on, but that's merely a historical accident. The insights and methods physics has developed aren't bound by the nature of the particular things being discussed, and this project of understanding the emergence of collective order and organisation goes well beyond the traditional subject matter of physics.

Holt, Colander and Rosser make one other interesting point, about the resistance of macro-economists in general to the new ways of thinking:
Interestingly, these cutting edge changes in micro theory toward inductive analysis and a complexity approach have not occurred in macroeconomics. In fact, the evolution of macroeconomic thinking in the United States has gone the other way. By that, we mean that there has been a movement away from a rough and ready macro theory that characterized the macroeconomics of the 1960s toward a theoretically analytic macro theory based on abstract, representative agent models that rely heavily on the assumptions of equilibrium. This macro work goes under the name new Classical, Real Business cycle, and the dynamic stochastic general equilibrium (DSGE) theory, and has become the mainstream in the U.S.
This is quite depressing, of course, but if most of what Holt and colleagues write in their essay is true, it cannot possibly stay this way for long. Economics as a whole is changing very rapidly and macro can't remain as it is indefinitely. Indeed, there are already a number of researchers aiming to build up macro-economic models "from the bottom up" -- see this short essay by Paul De Grauwe, for example. All this spells certain near-term doom for the Rational Expectations crowd, and that doom can't come a moment too soon.

Wednesday, 15 June 2011

UK R&D – more consultation

In addition to the patent box paper, the Treasury also published the next round of R&D consultation – this is rather more a summary of responses to the last consultation than new information.


In summary:


  • not ruling out ‘above the line’ R&D relief/credit but the Government still needs to be convinced
  • large company subcontractor costs to qualify for the subcontractor only where the subcontractor is aware that it is qualifying R&D and has evidence to this effect
  • no plans to extend qualifying expenditure to cover (eg) rent of premises used for R&D
  • draft legislation in the Autumn to allow a wider range of externally provided workers to qualify
  • no plans to restrict internally created software from being qualifying costs
  • improved guidance on whether prototypes will be qualifying R&D – whether the ‘uncertainty’ principle applies
  • plans for a pilot scheme will be brought in during the Autumn so that small companies and start-ups can get advance assurances that can be relied on in making R&D claims for several years

The Government is looking for specific responses (by 2 September 2011) on:

  • qualifying indirect activities: should the relief be retained? (QIA are hard to define and harder to get relief for)
  • should there be some form of certification or election process to provide certainty for subcontractors as to whether the work they are doing is R&D?
  • does the removal of the PAYE/NICs limit on the repayable credit require any safeguards?
  • does the ‘going concern’ definition need to be reformed, to make it closer to that for other tax reliefs?

The UK Patent Box - more details, not necessarily much more clarification

The Treasury has published (a few days later than originally advertised) the next round of consultation on the patent box (pdf).


In summary:


  • extended to plant variety rights, data exclusivity and supplementary protection certificates
  • will apply to UK and EPO patents only
  • will apply to all UK and EPO patents, no matter when commercialised or granted
  • to be phased in over 5 years, from 2013/14 – full benefit not until 2017/18
  • a seriously complex round of computations will be involved, with analysis of income and expenses across the company and possibly by division required

Responses to the consultation are requested by 2 September 2011.


The consultation paper builds on the previous round of consultation, with more detail on the proposals following the responses to that round. There is still no draft legislation to consider, as the paper focuses on the high-level principles involved.


Qualifying IP


The patent box proposals are to be extended to income from: data exclusivity (regulatory data protection), plant variety rights, and supplementary protection certificates (for pharma and agrochemical patents) as these are considered to be appropriately subject to external scrutiny before being granted.


Qualifying patents will be those granted by the UK Patent Office and the European Patent Office only – the UK is usually quite accommodating about foreign IP but the wider range of patents granted in (say) the US is clearly a step too far for the UK Revenue. The consultation indicates that the Government is open to the idea of including patents granted by other national authorities where the local examination process, and scope of patentable ideas, is similar to that in the UK. However, it will not include non-UK/EPO patents that could have been protected by patent in the UK but have not been so protected – the consultation paper notes that HMRC is not in a position to judge whether the UK Patent Office would have granted a patent. This may lead to an increase in patent registrations in the UK; I doubt whether anyone has considered the additional resource issues that might arise for the Patent Office as a result.


Outright ownership of the patent will not be required: the patent box will be available to UK companies with an exclusive licence (as to field or territory) to a patent, where there is effective market exclusivity. It appears that the licensor (if a UK taxpayer) will have the advantage of the patent box for the royalty income, and the licensee could (presumably) have the advantage of the patent box on the 'embedded income' in profits from manufacturing.


This is fairly theoretical because it is also proposed that – to be able to claim the benefit of the patent box – the taxpayer must be actively involved in the ongoing decision making in respect of exploitation of the patent. In addition, the taxpayer must have performed "significant activity" to develop the patented invention or its application. It's not impossible, as a result, that neither the licensor (where it is not actively involved in the ongoing decision making once a licence has been granted) nor the licensee (where it has not performed significant development activity) could claim the patent box. Legal protection and management of a financial investment won't count as activity or involvement.


Qualifying income


Qualifying income will be that earned worldwide by a UK company on such patents – this is perhaps likely to be more important for embedded income relief under the patent box (that is, income from utilisation of the IP rather than income from licensing).


When considering embedded income (the profit on sale of product that relates to the patent underlying the product), the patent must genuinely contribute to the product producing the income (it's not enough to simply say it's protected by patent).


Compensation and damages for infringement of a patent will qualify as income from patents.


Income from products made using a patented process may qualify where an arm's length royalty can be imputed for the use of the patent.


Income from the sale of patents will qualify for the patent box rate (although if the patent is held in a separate company, and the company's shares are sold rather than the patent, the substantial shareholdings exemption could mean a 0% tax rate on the gain instead ...)


The patent box rate won't apply to income between application and grant of the patent at the time it arises, but once the patent has been granted the company can claim the benefit of the rate for that income (up to four years' worth). Rather than having to re-open previous returns, this will be done by way of a reduction of tax in the year the patent is actually granted.


Calculating the profits


The patent profits (for tax purposes) are to be calculated on a three-step basis which I'm not even going to attempt to explain, beyond saying that it will involve analysing the income and expenses of the company, possibly by division, separating out the qualifying income, reducing it by a fixed percentage for 'routine' profits, and rinsing and repeating as necessary.


Commencement date


The November consultation suggested that all patents first commercialised after 29 November 2010 would qualify; this consultation paper now considers that this sort of cut-off date might not work all that well – the date of initial commercialisation can be hard to define (no kidding!), and a single date would require transitional rules for up to 20 years until all older patents have expired.


So the suggestion is now that the patent box will apply to all UK/EPO patents, but phasing in the benefits over the first five years of the patent box – effectively, 60% of the benefit would apply in 2013/14, then 70% in 2014/15 etc until 100% is available in 2017/18. It will still only apply to profits arising after 1 April 2013.
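As a quick sketch of what that phasing means in cash terms, the toy calculation below applies the rising fraction of the benefit to a notional slab of qualifying patent profits. The 10% patent box rate and 24% main corporation tax rate are illustrative assumptions of mine, not figures taken from the consultation paper.

```python
# Sketch of the proposed phase-in: only a rising fraction of the patent box
# benefit is available in each of the first five years. The 10% box rate and
# 24% main corporation tax rate are illustrative assumptions.

MAIN_RATE, BOX_RATE = 0.24, 0.10
phase_in = {"2013/14": 0.6, "2014/15": 0.7, "2015/16": 0.8,
            "2016/17": 0.9, "2017/18": 1.0}

patent_profits = 1_000_000  # notional qualifying profits after the three-step calculation
for year, fraction in phase_in.items():
    # effective rate = main rate minus the allowed fraction of the full saving
    effective_rate = MAIN_RATE - fraction * (MAIN_RATE - BOX_RATE)
    print(year, f"{effective_rate:.1%}", f"tax = {patent_profits * effective_rate:,.0f}")
```

On these assumptions the effective rate on patent profits would drift down from the mid-teens in 2013/14 to the full box rate in 2017/18.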

Slippery bankers slither through the constraints...

A couple of months ago I wrote this short article in New Scientist magazine (it seems freely available without subscription). It pointed out a troubling fact about recent measures proposed to stop bankers (and CEOs and hedge fund managers, etc.) from taking excessive risks which bring them big personal profits in the short run while saddling their firms (and taxpayers) with losses in the long run. The problem, as economists Peyton Young and Dean Foster have pointed out, is that none of these schemes will actually work. As long as executives keep tight control over the details of how they are running their firms and how they are investing, they can always game the system by making it look outwardly as though they're not taking excessive risks. There's no way to control this without much greater transparency, so that shareholders or investors can see clearly what strategies executives are using.

Here's the gist of the article:
You might think that smarter rules could get around the problem. Delay the bonuses for five years, perhaps, or put in clauses allowing the investor to claw back pay if it proves undeserved. But it won't work. Economists Dean Foster of the University of Pennsylvania in Philadelphia and Peyton Young of the University of Oxford have now extended Lo's argument, showing that the problem is simply unsolvable as long as investors cannot see the strategies used by managers (Quarterly Journal of Economics, vol 125, p 1435).

As they point out, delaying bonuses for a few years may deter some people as the risky strategies could explode before the period is up. But a manager willing to live with the uncertainty can still profit enormously if they don't. On average, managers playing the game still profit from excess risk while sticking the firm with the losses.

So-called claw back provisions, in the news as politicians ponder how to fix the system, will also fail. While sufficiently strong provisions would make it unprofitable for managers to game the system, Young and Foster show that such provisions would also deter honest managers who actually do possess superior management skills. Claw back provisions simply end up driving out the better managers that incentives were meant to attract in the first place.
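To see the logic in numbers, here is a toy Monte Carlo in the spirit of Lo's original example (not Foster and Young's actual model): a manager books steady fake 'alpha' by quietly selling tail risk, accrues a bonus on reported profits, and forfeits everything deferred if the position blows up before the deferral period ends. All the numbers are invented for illustration.

```python
import random

def expected_manager_pay(defer_years=5, blowup_prob=0.10, trials=100_000):
    """Toy model of a tail-risk-selling manager under deferred bonuses.

    Each year the fund reports a +15% 'return' unless a rare blow-up
    (probability blowup_prob per year) wipes it out. The manager accrues a
    bonus of 20% of reported profits but is paid only if the fund survives
    the whole deferral period. All parameters are illustrative.
    """
    fund, bonus_rate, fake_return = 100.0, 0.20, 0.15
    total = 0.0
    for _ in range(trials):
        accrued, alive = 0.0, True
        for _year in range(defer_years):
            if random.random() < blowup_prob:
                alive = False                # blow-up: deferred bonuses forfeited
                break
            accrued += bonus_rate * fake_return * fund
        total += accrued if alive else 0.0
    return total / trials

print(expected_manager_pay(defer_years=1))   # paid out each year, no deferral
print(expected_manager_pay(defer_years=5))   # five-year deferral, forfeited on blow-up
```

Even with full forfeiture on a blow-up, the deferred scheme still hands the manager a healthy expected payout, because the blow-up losses themselves land on the firm and its investors; that is exactly the asymmetry the deferral was supposed to remove.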
This work by Young and Foster is among the more profound things I've read on the matter of pay for performance and the problems it has created. You would think it would be front and center in the political process of coming up with new rules, but in fact it seems to get almost no attention whatsoever. From this article in today's Financial Times, it seems that bankers have quite predictably made some quick adjustments in the way they get paid, conforming to the letter of new rules, without actually changing anything:
Bank chiefs' average pay in the US and Europe leapt 36 per cent last year to $9.7m, according to data compiled for the Financial Times, despite variable performance across the sector....

Regulators have declined to impose caps on bank pay, instead introducing changes they believe will limit incentives to take excessive risks. That has led many banks to increase fixed salaries, reduce employees’ reliance on annual bonuses and defer cash and stock awards over several years.

“The real story around pay is the progress on ensuring bonuses are deferred, paid in shares and subject to clawback and performance targets, rather than the headline figure,” said Angela Knight, British Bankers’ Association chief executive.
 It's just too bad that we already know it won't work.

The Fear Index rises...

So far this month the so-called Fear Index -- the Chicago Board Options Exchange's volatility index, the VIX -- has risen by 24%. The VIX is a measure, roughly speaking, of how volatile market participants expect the markets to be over the next month, and these people, apparently, have growing uncertainty about a whole lot of things. What's going to happen with Greece and is the Eurozone itself safe? In the US, what happens now that the Federal Reserve has stopped its program of "quantitative easing" -- i.e. increasing the supply of money?
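For readers who want a feel for what the index level means: the VIX is quoted as an annualized volatility, so a rough rule of thumb divides by the square root of 12 to get the expected one-standard-deviation move over the coming month. The specific levels below are only illustrative of a roughly 24% jump, not the actual quotes.

```python
from math import sqrt

def expected_one_month_move(vix_level):
    """Rough one-standard-deviation S&P 500 move over the next month implied
    by a VIX quote, which is expressed as an annualized percentage volatility."""
    return vix_level / 100 / sqrt(12)

# Illustrative only: a VIX rising about 24%, e.g. from 15 to roughly 18.6.
for vix in (15.0, 18.6):
    print(f"VIX {vix}: expected one-month move of about {expected_one_month_move(vix):.1%}")
```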

In the Financial Times, Gavyn Davies suggests the world's economic leaders are "out to sea" and simply unable to coordinate their actions. The situation, he suggests, has been...
...triggered by the realisation that policy-makers around the world are no longer in any condition to rescue the global economy from a further slowdown, should that occur. Economies, and therefore markets, are currently flying without an automatic safety net from either fiscal or monetary policy.  And that brings new risks, compared to the situation which has been in place for most of the period since the 2009.
This is the world as reflected in the financial press -- one in which emotions and fears and hunches and uncertainty play a huge, indeed central role. Hardly surprising.

What is surprising and rather odd, is how far this is from the view of academic economists who assume (most of them, at least) that irrational fears and crazy expectations have little to do with market outcomes, which actually reflect in some way individuals' rational assessments of future prospects. This is the Rational Expectations view of economics, currently still considered, amazingly enough, as the gold standard in economic theorizing. It's been dominant for around 40 years, since first proposed by economists John Muth and Robert Lucas. Fortunately, it is finally giving way to a more realistic and less rigid view (more on this below) which takes note of something that is quite important to human beings -- our ability to learn and adapt.

I've long found it hard to understand how the Rational Expectations idea has come to be taken seriously at all by economists. Its primary assertion is that the predictions of individuals and firms about the economic future, while they may not be completely identical, are not on average incorrect. People make random but unbiased errors, the idea goes, and so the average expectation is rational -- it captures the true model of how the economy works. This is indeed patently absurd in the face of historical experience: the average view of the future in 2005, for example, was not that a global financial and economic crisis lay just around the corner. Perhaps a determined Rational Expectations enthusiast would disagree. What evidence is there, after all, about what peoples' expectations really were at the time?

The standard defense of the Rational Expectations framework follows the same three-tiered logic used in defense of the Efficient Markets Hypothesis (not surprisingly, given the close link between the two ideas). The first defense is to assert that people are indeed rational. Given evidence that they are not, the second defense is to admit that, OK, people make all kinds of errors, but to assert that they make them in different ways; hence, they are not irrational on average. Of course, the behavioral economics crowd has now thoroughly trashed this defense, showing that people are systematically irrational (overconfident, etc.) in similar ways, so their errors won't cancel out. Hence, the third line of defense -- arbitrage. Starting with Milton Friedman and continuing for decades, defenders of the rational core of economic theory have argued that competition in the marketplace will drive the irrational out of the market as the more rational participants take their money. The strong prey upon the weak and drive them out of existence.

For example, in 1986, Robert Lucas wrote the following, expressing his view that anything other than rational expectations would be weeded out of the market and ultimately play no role:  
Recent theoretical work is making it increasingly clear that the multiplicity of equilibria ... can arise in a wide variety of situations involving sequential trading, in competitive as well as finite-agent games. All but a few of these equilibria are, I believe, behaviorally uninteresting: They do not describe behavior that collections of adaptively behaving people would ever hit on.
Alas, this evolutionary or adaptive defense turns out not to work either. As Andrei Shleifer and others have shown, an arbitrageur would need access to an infinite amount of capital to drive the irrational (so-called "noise traders") from the market, essentially because market "inefficiencies" -- mispricings due to the stupid actions of noise traders -- may persist for very long times. The market, as the famous saying goes, can stay irrational longer than you can stay solvent.

None of this has stopped most economists from plodding right on along behind the Rational Expectations idea, and they may well continue this way for another 50 years, stifling macroeconomics, with the rest of us paying the price (well beyond our tax dollars that go to fund such research). But there is fresh air in this field, let in by researchers willing to throw open some windows and let high theory be influenced by empirical reality.

It is obvious that expectations matter in economics. What happens in the future, in markets or in the larger economy, often depends on what people think is likely to happen. The problem with RE (Rational Expectations) isn't with the E but with the R. A far more natural supposition would be that people form their expectations not through any purely rational faculty but through a process of learning and adjustment -- adaptively, as we do most other things. Indeed, a number of people have been exploring this idea in a serious way, through both models and experiments with real people. The results are eye-opening and demonstrate how much has been lost in a 40-year trance-like fixation on the (supposed) mathematical beauty of the rational expectations theory.
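To make the "E without the R" concrete, here is a minimal sketch, in Python with invented numbers, of the simplest adaptive-expectations rule: each period's forecast is last period's forecast nudged toward the latest observation, with a gain parameter standing in for how quickly people learn. This is only an illustration of the general idea, not a model taken from the literature discussed below.

```python
# Minimal adaptive-expectations sketch: the forecast is pulled toward each new
# observation by a fixed gain. All numbers are invented for illustration.

def update_expectation(previous_forecast, observed, gain=0.3):
    """Return the updated forecast after seeing the latest observation."""
    return previous_forecast + gain * (observed - previous_forecast)

prices = [100, 104, 103, 110, 108, 115]   # hypothetical observed prices
forecast = prices[0]                      # start by expecting the first price
for p in prices[1:]:
    forecast = update_expectation(forecast, p)
    print(f"observed {p}, next-period forecast {forecast:.1f}")
```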

Cars Hommes has written a beautiful review of the area that is well worth close study. As he notes, there have been about 1000 papers published in the past 20 years on "bounded rationality" -- the finite mental capacity of real people -- and learning. These include agent-based models of markets in which the agents all use different, imperfect strategies and learn as they go, and other studies looking more closely at the process by which people form their expectations. Here are a few highlights:

* Studies from about 10 years ago explored the actual strategies followed by traders in a hog and beef market, and found empirical evidence for heterogeneity of expectations. One study estimated that "about 47% of the beef producers behave naively (using only the last price in their forecast), 18% of the beef producers behaves rationally, whereas 35% behaves quasi-rationally (i.e. use a univariate autoregressive time series model to forecast prices)."

Think of that -- in an ordinary hog and beef market we have roughly half the market participants behaving on the basis of an extremely simple heuristic.
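For illustration only (hypothetical data, not the original study), here is how the two simpler rules differ in practice: the naive rule just repeats the last observed price, while the "quasi-rational" rule fits a univariate autoregressive model to past prices and extrapolates one step ahead.

```python
# Naive vs "quasi-rational" AR(1) forecasting on hypothetical price data.
import numpy as np

rng = np.random.default_rng(0)
prices = 50 + np.cumsum(rng.normal(0, 1, size=60))   # hypothetical price history

# Naive rule: tomorrow's price is expected to equal today's.
naive_forecast = prices[-1]

# Quasi-rational rule: fit p_t = a + b * p_{t-1} by least squares, then
# extrapolate one period ahead.
X = np.column_stack([np.ones(len(prices) - 1), prices[:-1]])
a, b = np.linalg.lstsq(X, prices[1:], rcond=None)[0]
ar_forecast = a + b * prices[-1]

print(f"naive forecast: {naive_forecast:.2f}")
print(f"AR(1) forecast: {ar_forecast:.2f}")
```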

* More recently, a number of researchers have used agent models to characterize data from markets for stock, commodities, foreign exchange and oil. As the review notes, "Most of these studies find significant time-variation in the fractions of agents using a mean-reverting fundamental versus a trend-following strategy."

That is, not only do the participants have distinct expectations, but those expectations and the basic strategies they follow shift over time. Not surprising really, but clearly not consistent with the rational expectations view.
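To give a flavour of what such agent models look like, here is a toy sketch in the spirit of the heterogeneous-agent models Hommes surveys; every parameter below is invented for illustration, not estimated from any of the cited studies. Fundamentalists expect the price to revert toward a fundamental value, trend-followers extrapolate the last price change, and the fraction using each rule drifts toward whichever rule has recently forecast better.

```python
# Toy heterogeneous-agent market: fundamentalists vs trend-followers, with the
# fraction using each rule shifting toward the recently more accurate rule.
# All parameters are illustrative.
import numpy as np

rng = np.random.default_rng(1)
fundamental = 100.0
prices = [100.0, 101.0]
frac_fund = 0.5                       # fraction of fundamentalists
err_fund = err_trend = 0.0            # discounted squared forecast errors

for t in range(100):
    f_fund = prices[-1] + 0.2 * (fundamental - prices[-1])    # mean reversion
    f_trend = prices[-1] + 0.9 * (prices[-1] - prices[-2])    # extrapolation

    # The realised price depends on the weighted average forecast plus noise.
    avg_forecast = frac_fund * f_fund + (1 - frac_fund) * f_trend
    new_price = avg_forecast + rng.normal(0, 0.5)

    # Update each rule's running error and shift the fractions toward the
    # better performer (a crude stand-in for discrete-choice switching).
    err_fund = 0.9 * err_fund + (f_fund - new_price) ** 2
    err_trend = 0.9 * err_trend + (f_trend - new_price) ** 2
    frac_fund = 1.0 / (1.0 + np.exp(np.clip(err_fund - err_trend, -50, 50)))

    prices.append(new_price)

print(f"final price {prices[-1]:.1f}, fraction of fundamentalists {frac_fund:.2f}")
```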

* The review also points to recent seminal work by a team of physicists led by Fabrizio Lillo. This study looked at investors in a Spanish stock market and found that they fell into different groups based on their strategies, including trend followers and contrarians who bought against trends.

* And who says simple surveys can't be instructive? Various studies over the past 15 years using this traditional technique have found that financial experts "use different forecasting strategies to predict exchange rates. They tend to use trend extrapolating rules at short horizons (up to 3 months) and mean-reverting fundamentalists rules at longer horizons (6 months to 1 year) and, moreover, the weight given to different forecasting techniques changes over time." Similarly, for people's views on inflation, a study found that "heterogeneity in inflation expectations is pervasive..."

So then, why does the use of rational and homogeneous expectations persist in economics and finance? The review doesn't answer this question, but in view of the massive quantities of data pointing to a much richer and more complex process by which people form their expectations, the persistence of the rational expectations view in mainstream macroeconomic models -- this is my understanding, at least -- seems fairly scandalous.

So what's going on with the VIX? No one really knows, and I'm absolutely sure that the answer cannot emerge from any general equilibrium model with rational expectations. Times of uncertainty lead, in general, to two effects -- 1) a growing diversity in views, given the lack of information that would push most people toward one, and 2) a growing urgency to find some clue about the future, some guide to future behaviour. Urgency in the face of uncertainty leads many people -- as this study in Science from a couple of years ago showed -- to see what they take to be meaningful patterns even in purely random noise. Hence, we need a theory of adaptive expectations, and it needs to include irrational expectations. The good news is that this idea is finally getting a lot of attention, and the old faith that evolutionary competition in some form would enforce the tidy picture of rational expectations can be consigned to the dustbin of history.

         *** UPDATE ***

I just want to clarify that what I've written in this post doesn't begin to do justice to the rich material in Cars Hommes's review article. I mentioned some of the earlier empirical work he discusses, but much of the paper reviews the results of extensive laboratory experiments, by his group and others, exploring how people actually form expectations in situations where the experimenter has close control over the process in question. For example, a price fluctuates from period to period and volunteers are asked to predict it over 50 periods. They are told very little, except some facts about where the prices come from -- from a speculative market such as a stock market, or instead from a very different market with perishable goods. Skeletal information of this kind can make them expect positive feedbacks, hence the potential for self-reinforcing trends, or quite the opposite.
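Here is a stylised sketch of the two feedback structures (my own simplified equations, not the actual experimental design): under negative feedback, as in a market for a perishable good, an optimistic average forecast leads to oversupply and so depresses the realised price; under positive feedback, as in a speculative asset market, optimism pushes the realised price up. With the same naive forecasters, the first case snaps back toward the fundamental, while in the second deviations die out only very slowly, if at all.

```python
# Stylised positive- vs negative-feedback price formation with naive
# forecasters (expected price = last price). Parameters are illustrative.

def simulate(feedback, periods=20, fundamental=60.0, start=50.0):
    """price_t = fundamental + feedback * (average forecast - fundamental)."""
    prices = [start]
    for _ in range(periods):
        forecast = prices[-1]                # naive expectation
        prices.append(fundamental + feedback * (forecast - fundamental))
    return prices

negative = simulate(feedback=-0.7)    # cobweb-like market: oscillates, converges
positive = simulate(feedback=+0.95)   # speculative market: deviations persist

print("negative feedback, last prices:", [round(p, 1) for p in negative[-3:]])
print("positive feedback, last prices:", [round(p, 1) for p in positive[-3:]])
```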

There's nothing in these experiments that explains why they weren't done 30, 40 or 50 years ago. Why didn't someone in the 1970s start such a program to really test the Rational Expectations idea? (Or did they, and I'm just not aware of it?) It's a shame, because these experiments yield a wealth of insight, as well as directions for future work. I'll just quote from Hommes's conclusions (my comments in brackets):
Learning to forecast experiments are tailor-made to test the expectations hypothesis, with all other model assumptions computerized and under control of the experimenter. Different types of aggregate behavior have been observed in different market settings. To our best knowledge, no homogeneous expectations model [rational or irrational] fits the experimental data across different market settings. Quick convergence to the RE-benchmark only occurs in stable (i.e. stable under naive expectations) cobweb markets with negative expectations feedback, as in Muth's (1961) seminal rational expectations paper. In all other market settings persistent deviations from the RE fundamental benchmark seem to be the rule rather than the exception.

Selasa, 14 Juni 2011

The latest IAM is the earliest

Issue 48 of Intellectual Asset Management (IAM) magazine is now available online and in digital format. The cover date is July/August, but don't let that worry you.

Editor Joff Wild has a few things to say about this issue:
"For six weeks during January and February this year, IAM and the IP Solutions business of Thomson Reuters offered readers of the magazine and the IAM blog the opportunity to take part in our second annual benchmarking survey. The 700-plus in-house and private practice IP professionals who agreed to do so answered detailed questions that focused on issues such as patent quality, portfolio creation and management, litigation, licensing and the alignment of IP with overall business strategies. The results of the survey – along with comments on its findings from a number of well-known individuals in the IP world – form the cover story of this issue.

IAM 48 is also special for another reason. It marks the first time that we have invited a highly regarded chief IP officer to guest edit the magazine. The person in question is Damon Matteo, from the Palo Alto Research Centre (PARC). ...  Our thanks go to him and to the authors he selected to write for us: Vincent Pluvinage, formerly of Intellectual Ventures; Peter Holden of Coller Capital; Nader Mousavi of Sullivan & Cromwell; and Dan Figueroa of Sony Computer Entertainment America. Look out, too, for the profile of Damon and PARC written by new IAM reporter Helen Sloan.

... In a very full issue, we also have our regular IAFS contribution and a fascinating insight into the developing IP transactions marketplace in China. In addition, we carry a management report on IP issues in the life sciences industries ...".
You can check out the contents of this issue in full here.

The British Patent Box: consultation advances

The Patent Box comes out of the closet ...
In November 2010 the United Kingdom announced its intention to establish a 'Patent Box'. As the Treasury explained at the time,
"The Government is consulting on a preferential regime for profits arising from patents, known as a Patent Box. The intention is to introduce rules in Finance Bill 2012.

The Patent Box will encourage companies to locate the high-value jobs and activity associated with the development, manufacture and exploitation of patents in the UK. It will also enhance the competitiveness of the UK tax system for high-tech companies that obtain profits from patents ...".
Last week the Government released the next stage of the consultation on its Patent Box proposals. You can read the consultation document here.  The Government now seeks views on the points raised in the consultation document.  These must be received by 2 September 2011.  Between now and that date, the Government will be continuing to consult businesses on its proposals.

This blogger's colleagues at Olswang LLP have taken quite an interest in these proposals and plan to respond to the consultation after seeking views on the proposals from business and interested parties. A summary of the firm's response to the first stage of the consultation, following discussions with clients and contacts in the high-tech, pharma and life sciences sectors, runs like this.
"Olswang's consultees were generally supportive of the Patent Box proposal.  However, a number of suggestions were made regarding the design of the regime, some of which were considered would be critical to its success. 
Appropriate conditions for a patent to qualify 

Eligible Patents
  • It is understood the Government intends to limit the regime by reference to patents registered in particular jurisdictions.  The Patent Box regime should recognise patents from EU member jurisdictions, the United States of America, Japan, Australia, Korea and the BRIC countries (although income subject to corporation tax from products sold in/licensed to any jurisdiction should fall within the regime if a relevant patent is registered in only one qualifying jurisdiction).
  • The Patent Box should apply to acquired patents, not just those based on technology developed by the patent holder. 
Ownership Criteria
  • Beneficial ownership should take precedence over legal ownership.  Exclusive licensees should also qualify for the regime.
Commercialisation Condition (the Government has proposed that all patents first commercialised after 29 November 2010 will qualify for inclusion in the Patent Box)
  • Initial commercialisation could be the date that a product protected by a patent is first offered for sale to end-consumers.  Similarly, for patent licence income from easily identifiable patents, the date of initial commercialisation could be the date that the patent was first licensed. 
  • A number of representations were made that, in order for the UK Patent Box regime to be both practicable and competitive, it should apply to all embedded patent profits and licensing related profits received from eligible patents from 1 April 2013.
Determining patent income
  • Receipts from disposals of patents and other receipts of a capital nature should be included within the regime.  This is to reflect (and avoid distortion of) commercial actions.
  • A simple formulaic approach to determine the proportion of income that falls within the regime is to be welcomed.  However, companies should be given the option of using transfer pricing methods to determine the correct amount of income.
Commercial alignment
  • The Patent Box should not result in a narrowing of the Research & Development (R&D) tax credit regime, nor should it claw back the benefit of R&D tax credits or enhanced deductions.
  • For the regime to reflect commercial reality it should apply to income generated prior to the grant of a patent; the date of publication may be appropriate.  However, we understand the Government may take the approach that the regime should not apply unless and until a patent is granted and that, once a patent is granted, credit for pre-grant profits be given by way of an enhanced corporation tax deduction in the tax year in which the patent is granted.  We have suggested that any such approach should be carefully considered as it may undermine the competitiveness of the UK's Patent Box regime.
  • If a patent is granted and then revoked there should be no claw-back of the benefit of the regime on the pre-revocation profits.
Preventing artificial tax avoidance
  • We would expect that the high costs of obtaining and maintaining a patent would, in themselves, deter artificial behaviours in relation to the Patent Box regime. 
  • If the Government insists on including an activities based restriction, it should be the case that historic R&D activity should also be taken into account (rather than only ongoing R&D or manufacturing which the Government stated it was considering).
Further general views
  • It is understood that the Government will not at this time introduce a regime that applies to income deriving from IP generally; it is suggested that the Government review this position in due course.
  • It is suggested that all IP income deriving from R&D activity should be included in the scope of the regime.  At the very least "know-how" that is inextricably linked to a qualifying patent should be included within the regime.
  • Income obtained whilst a product is protected by a supplementary protection certificate should be included within the regime. 
  • Income connected with services related to qualifying patents should be included within the regime.
  • The regime as currently proposed would appear to be easier to apply to products that are protected primarily by one or only a few patents and for companies that licence easily identifiable rights.  In particular, it would appear to be much more challenging to apply to complex technology products and licences of patents relating to complex technology.  The regime should be structured such that it provides a sufficient tax incentive for innovative companies across all high-tech sectors in order to achieve its stated aims.
  • The regime must be easily accessible and simple to apply.  Procedures for obtaining advance clearances should be introduced. 
  • The effectiveness of the regime should be reviewed on a regular basis and its terms adjusted to deal with perceived and practical difficulties".
Anyone wishing to comment on the consultation document, or on the views expressed above, should email Natasha Kaye or my SPC Blog colleague Robert Stephen.  IP Finance is also pleased to hear from other bodies, firms and individuals who are seeking comments or intending to make submissions: just email me here with the subject line 'Patent Box'.

Minggu, 12 Juni 2011

Patent Licensing Fees Modest in Total Cost of Ownership for Cellular

In this, the third in a series of features written by Keith Mallinson (WiseHarbor) for IP Finance, Keith addresses the claim that the aggregate of patent licence fees paid by anyone buying into patented mobile handset technology is prohibitive and stifles competition (NB: if you can't read Exhibits 1 and 3 clearly on-screen, try clicking them).
"Patent Licensing Fees Modest in Total Cost of Ownership for Cellular 
Patented technology is the lifeblood of today’s advanced mobile handsets, network equipment and operator services. As mobile services become increasingly sophisticated, manufacturing of handsets and network equipment represents a declining share of value compared to investments in innovative mobile technologies and software. There is no inherent maximum value share for the IP created with such investments. Aggregate IP fees are a small proportion of handset costs and are very modest compared to operator service charges. Handset costs as a percentage of total ownership expenditures including operator services are 17% in the US and Canada and 13% in Western Europe. 
My previous IP Finance posting showed markets for mobile phones and operator services have flourished with outstanding growth, technological innovation, significant competition and tumbling prices on the basis of (Fair) Reasonable and Non-discriminatory licensing for technologies required to implement mobile communications standards. Despite all these positives, some still complain IP fees are excessive in comparison to other costs. In this article, I evaluate fees paid upstream in technology licensing in comparison to downstream expenditures in supply of handsets and provision of operator services. 
Caps to fix IP charges 
There are concerted attempts to limit licensing fees in standards-essential IP.  For example, downstream equipment manufacturers seek to minimize out-payments for licensing standards-essential IP by promoting aggregate royalty caps.  In 2008, Alcatel-Lucent, Ericsson, NEC, NextWave Wireless, Nokia, Nokia Siemens Networks and Sony Ericsson announced their agreement that aggregate royalties for handsets implementing the 3G/4G LTE standard should be capped below 10% of handset prices. Similarly, mobile operators, who in many cases subsidize handset prices to consumers, also seek to limit these licensing fees.  A common proposal from several mobile operators is to limit aggregate essential-IP charges by establishing an LTE patent pool. Patent pooling will be the topic of my next IP Finance posting. However, one immediate and obvious observation is that if a patent pool is designed to limit aggregate license fees for the benefit of downstream licensees, then it will be unattractive to upstream licensors that depend on licensing revenue to fund continued investments in R&D and earn a return on prior investments.  Also, the major vertically-integrated companies have mostly preferred to enter into bilateral agreements with other vertically-integrated companies in order to be able to negotiate cross-licenses with trade-offs between their business interests and patent portfolios. 
Unproven suppositions of licensing excesses by some technology licensors, and of resulting harm, abound among the predominant voices downstream and their cheerleaders. For example, an August 2009 contribution to the European Competition Journal by Philippe Chappatte of Slaughter and May argues that: 
·         There is likely to be an upward spiral of royalty claims for many standards including telecoms standards resulting in higher costs for handsets and other standardised products; and

·         Operators will be reluctant to invest in new technologies or upgrade their networks to endorse faster and higher quality networks and the quality and range of services that will be available to consumers may be prejudiced. 
Contrary evidence is that handset prices and royalty costs have actually fallen (with handset prices, upon which royalty fees are based, declining 77% on average since 1993) despite the addition of many new technologies and increasing demand for advanced features and functionality. 
Estimates for “cumulative royalties” vary widely. In 1998, International Telecommunications Standards User Group (representing some operators and manufacturers) complained to the European Commission that “when GSM handsets first appeared on the marketplace cumulative royalties amounted to as much as 35 percent to 40 percent of the ex-works selling price”. Much lower estimates for the cumulative GSM royalty rate paid, by companies that do not have any patents to trade, include 10-13 percent (IP Law and Business reporting PA Consulting Group estimate, July, 2005). In September 2005, CSFB’s “3G Economics” report estimated cumulative royalties had fallen to single digits and predicted 17.3% cumulative royalties in WCDMA “for those vendors without an IPR position to trade off”. Whereas ABI Research described average WCDMA cumulative royalties of 9.4% in 2007 “a most challenging barrier... ...to the development of more affordable devices”, the market-leading handset manufacturer with 37% share was paying much less: Nokia stated that “until 2007 it has paid less than 3 percent aggregate license fees on WCDMA handset sales under all its patent license agreements”. 
In addition, there have been various attempts to determine aggregate fees sought by licensors for new technologies. In 2007, the Next Generation Mobile Network (NGMN) Alliance, an industry group led by mobile operators and including major 4G equipment vendors, established a confidential process for the ex ante disclosure and aggregation of expected licensing fees for a number of upcoming 4G standards including LTE. The process concluded in 2009 and the results are confidential. However, commentators have suggested the individual disclosures of expected licensing fees (which were in several cases accompanied by public disclosures on company websites) produced misleading and unrealistic figures.
The aggregate figures derived are not the actual prices paid, which reflect cross-licensing, nor do they capture other realities of negotiation, such as the identification of patents that are weak or inapplicable. Patent strengths and “essentiality” were not validated. In 2003, the 3G Patent Platform Partnership (including 19 telecommunications operators and equipment makers) estimated “that several hundred different patents, among several thousand publicly claimed as essential, will actually be determined to be ‘essential patents’ in implementing 3G standards”. Some candidate licensees would rather risk being sued than pay “rack rates” in these circumstances. Licensors prefer to negotiate settlements rather than litigate and subject their patents to invalidity and non-infringement claims. Vertically-integrated licensors are particularly concerned about their product revenues, with the risk of being counter-sued for infringement. 
Mobile operators are as eager as ever to invest in new technologies to improve performance and lower total costs. New technology cost savings outweigh licensing fees. For example, while mobile operators spend billions of dollars on spectrum, technological advancements have mitigated this cost with 20-fold spectral efficiency increases and much improved voice encoding since 1G analogue cellular. Operators worldwide are investing extensively in advanced technologies HSPA+ and LTE that have increased network capacity and maximum end-user data speeds 1,000-fold since the introduction of 2G technologies around 1993. In the US, for example, all the major operators (and smaller ones too) claim to have introduced “4G services” over the last couple of years. Operators are also making major investments in associated devices by significantly subsidising end-user prices. With demand for HSPA+ and LTE so strong, IP cost issues can be no more significant than they were with previously and currently successful 2G and 3G technologies. 
Increasing value share in software and patents 
There is no reason why any arbitrary percentage limit should be imposed on IP costs. It is widely accepted that when one pays, for example, $25 for a hardback or $10 for a paperback book, production costs in printing account for but a small proportion of these figures. Royalties to authors, illustrators and agents as well as costs in distribution, marketing and the publisher’s profit margin account for the vast majority of these prices. Similarly, other IP-intensive products, as illustrated in Exhibit 1, have a significant proportion of costs in the intangibles. 
Exhibit 1: Manufactured content value varies substantially by product category
                    Source: WiseHarbor
I have predicted a marked trend of increasing value in the intangibles in mobile devices (including embedded and aftermarket software predominating over hardware) since Apple’s 2008 3G iPhone launch. The success of the iPhone, including its App Store, proves my point. The iPhone leads the smartphone market and has a manufacturing cost of around just one third of its $600 average wholesale price (before operator subsidies to consumers). Gross profit margins approaching 60% provide a significant return on investments in software, brand and distribution, while Apple largely relies on the essential IP developed and contributed to mobile standards by others. 
Handset, network and services-essential IP 
Mobile phones are inextricable from the networks and operator services with which they are used: licensing fees should be considered in this broader context. In contrast to technologies that can be used offline, such as in audio and video players, standards-essential IP is implemented end-to-end in handsets and network equipment with the provision of cellular voice and data services. In addition to increased speeds and network capacity, end-to-end innovations include voice encoding, encryption, automatic roaming and location tracking. A handset in isolation from a network cannot make calls or receive data, let alone exploit any of these capabilities. By convention, licensing fees are charged on wholesale mobile phone prices. Whereas this royalty base is simple and convenient to administer in licensing, it overlooks where most ecosystem value is generated: in operator service revenues. In fact, phone prices are commonly subsidised, to a substantial extent in many cases, by operators in anticipation of these revenues. 
The average service life of a phone from purchase until retirement is around 20 months in the US where postpaid contracts predominate and 34 months in Western Europe where most users have prepaid or SIM-only service with unsubsidised phones. Exhibit 2 shows that during a handset’s service life, consumers spend on average around five or six times more on service fees than they or their operators spend on the handset. Handset costs in the US/Canada and Western Europe represent 17% and 13% respectively of total ownership expenditures including handset costs and operator service charges. 
Exhibit 2: Handsets, a small proportion of total ownership expenses

                                               | US and Canada | Western Europe
Average service revenue per user (per month)   | $50           | $32
Service life (in months)                       | 20            | 34
Total operator services expenditures           | $1,001        | $1,087
Average unsubsidised wholesale phone price     | $207          | $167
Total lifecycle expenditures                   | $1,208        | $1,254
Handset cost/total expenditures                | 17%           | 13%

       Source: WiseHarbor, based on 2009 and 2010 market figures
Royalty rates expressed as a percentage of total ownership lifecycle expenses are therefore much lower than rates based on handset prices. Exhibit 3 shows that converting aggregate handset cost-based royalty rates to rates based on total ownership expenditures reduces the rate to 13% and 17% of the rate based on handset costs for Western Europe and US/Canada respectively.  More frequent handset upgrades in the US account for most of the differences between the two regions. 
 
Exhibit 3 (chart). Source: WiseHarbor Research   * For companies with no IP to trade
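As a rough check on the arithmetic behind Exhibits 2 and 3, the conversion works like this; the figures below are the rounded Exhibit 2 values, and the 5% aggregate royalty rate is purely hypothetical, chosen for illustration only.

```python
# Convert a royalty rate quoted on the wholesale handset price into a share of
# total ownership spend (handset plus operator service charges over the
# handset's service life). Figures are the rounded Exhibit 2 values; the 5%
# aggregate royalty rate is hypothetical, for illustration only.

regions = {
    # region: (monthly service revenue, service life in months, handset price)
    "US and Canada":  (50, 20, 207),
    "Western Europe": (32, 34, 167),
}

royalty_rate_on_handset = 0.05   # hypothetical aggregate royalty rate

for region, (arpu, months, handset) in regions.items():
    total_spend = arpu * months + handset
    handset_share = handset / total_spend
    royalty_share_of_total = royalty_rate_on_handset * handset_share
    print(f"{region}: handset is {handset_share:.0%} of total spend; "
          f"a {royalty_rate_on_handset:.0%} handset royalty is about "
          f"{royalty_share_of_total:.1%} of total ownership spend")
```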

Competitive advantage with IP 
It is not the average level of IP charges that affects competition; it is the different rates paid among competitors. Aggregate royalty rates are significantly less than European Union VAT rates that have mostly ranged from 15% to 25% in recent years.  Applied uniformly among competitors, taxing phones and services at these VAT rates has not significantly impeded their sales versus nations where consumption taxes on phone sales are much lower. 
The asymmetry in licensing costs between manufacturers with IP who can cross-license to minimise their licensing expenditures and manufacturers without essential-IP patents who must pay more is a significant competitive factor. Manufacturers are faced with a business choice: bear the up-front costs and risks of investing in technologies with the aim to cross-license for much of the essential IP required, or pay to license others’ IP. Investing up to several billions of dollars per year in R&D in the hope that some of it will prove effective enough to be accepted in leading mobile standards merits competitive benefits and commercial returns. Nevertheless, latter-day cellular market entrants including Research in Motion, HTC, Apple and others succeeded with little or nothing in the way of essential IP at the outset".
Keith Mallinson's recent clients include several mobile phone technology IP owners. His work covers various other commercial issues as well as IP. He provides advisory services including market analysis and forecasts for operator services, network equipment and handsets. He also has significant testifying expert witness experience in the cellular sector, but has not yet testified on matters relating to standards-essential IP.
 
