Thursday, 29 December 2011

‘Different Sectoral Contexts’ approach is vital for IAM


… noted Jeremy in the last post to IP Finance. SMEs in the automotive sector may be interested in a recently completed study which maps the IP business models behind ten cases of US patent, trade secret and trade mark litigation brought by the most prolific US patent applicants in the field of braking technology. The study’s author runs the Engineering Intellectual Property Research Unit at Cranfield University’s School of Engineering.

The study shows that the predominant IP monetisation mechanism in the field of braking is that of “monopoly provision”, i.e. using IP to exclude competing suppliers. There is only one instance of a patent licensing relationship, that being with a company already related to the patentee, suggesting that SME developers of braking technology may struggle to license into prolific US patent applicant companies.

As regards the IP mix that such SME developers might employ, the study confirms that trade secret protection is not used above raw material / component-level manufacture. It also illustrates the risk of third party patent infringement associated with trade secret protection.

The greatest threat of infringement litigation appears to come from competitors at a similar level on the value chain. The study also shows that neither the small size of an SME nor the large size of a company's patent portfolio can guarantee immunity from suit.

Monday, 26 December 2011

Intellectual assets and innovation: a sector-by-sector study on SMEs

The IP Finance weblog is grateful to Chris Torrero for drawing its attention to a recent study, Intellectual Assets and Innovation: the SME Dimension, which has been published by OECD Publishing in its OECD Studies on SMEs and Entrepreneurship series.
"Intellectual Property Rights can be instrumental for SMEs to protect and build on their innovations; position themselves competitively vis-à-vis larger enterprises in global markets; gain access to revenues; signal current and prospective value to investors, competitors and partners; access knowledge markets and networks; open up new commercial pathways; or segment existing markets ['can be' -- but the same rights 'can be' a means by which SMEs bankrupt themselves through over-expenditure in acquiring them and over-extending themselves when seeking to enforce them. That's why it's a shame that ...] ... while there is increasing recognition of their significance, as well as the need for appropriate intellectual asset management for SMEs across OECD countries, there are few regulatory frameworks or specific instruments directed to SMEs. This is in part due to the pace of technological innovation, which often exceeds the time it takes for policy makers to create appropriate responses to the changing landscape of intellectual property.
This study explores the relations between SME intellectual asset management, innovation and competitiveness in different national and sectoral contexts [the 'different sectoral contexts' approach is vital, since IPRs behave so differently in their respective contexts and we have suffered too long from 'one size fits all' prescriptive analyses of the IP needs of SMEs]. It provides insights on the ability of SMEs to access and utilise the protection systems available to them and identifies key challenges for SMEs in appropriating full value from IPRs. It also investigates effectiveness of regulatory frameworks and policy measures to support SME access to IPRs, identifying best practices and proposing policy recommendations".

The study is available in print and pdf formats. Full details are available from the OECD's bookshop website here.

Sunday, 25 December 2011

The New York Times and Diminishing Derivative Works


Most of the readers of this blog are surely familiar with the struggle, some say existential, of print newspapers to find a viable business model in today's increasingly online world. The issue, as framed, is simple: having given away substantial content for free online access without a revenue stream parallel to the want-ads cum advertisements of print newspapers, can such newspapers now find a long-term model based on charging for online contents? This, while at the same time the newspapers continue trying to salvage something from their print product.

Against this print v online morality play, a little-mentioned event occurred late last week that highlights another aspect of the search for commercially viable platforms to deliver copyright content. The event in question was the announcement by the New York Times that it is discontinuing its broadcast of podcasts. First a confession: I am a podcast freak. For over two hours each day, starting with my one-hour constitutional in the morning and extending to the 45-minute bus commute to and from work, I combine either walking or bus travel with listening to a series of informative podcasts by the newspaper on business, politics, science and music.

Without a doubt the New York Times is well ahead of the class in the contents of these podcasts (and probably in all areas covered by the company's podcast broadcasts). It is not an exaggeration to say that these podcasts have helped frame the terms of my thinking in recent years. In effect, these New York Times podcasts are a unique form of derivative work, whereby journalists are called upon to adapt their print contents to the quite different medium of the spoken word. Unlike the mere reproduction of audio content broadcast on radio and then reworked for further podcast broadcast in MP3 format (such as Bloomberg, the BBC or National Public Radio), the New York Times had succeeded in creating a distinct content form elegantly adapted for an aural, non-visual platform. And yet this same newspaper, owner of one of the most august names in the journalistic world, has decided to shut down one (albeit small) aspect of its journalistic excellence.

While no specific reason was given in the announcement of the discontinuance of the podcasts, a strong hint was found near the end. There, the presenter urged listeners either to take up an offer to subscribe to the online version at an introductory low price or to continue with the print edition. This suggests that the podcasts were an expense that the newspaper was no longer interested in bearing, and/or that the podcasts had not succeeded in driving listeners to subscribe to either the online or print edition, and/or that the podcast broadcasts were merely cannibalizing potential revenue-generating customers. In shutting down its podcast service, the newspaper is apparently prepared to risk the loss of the goodwill accruing to its name by virtue of the podcasts or, even more, engendering a certain feeling of betrayal -- contents available via the iPod over a number of years have been suddenly yanked from the listening public.

Whatever the reason, the result of the decision is to impoverish the content available for a medium (MP3 iPod players) that is uniquely placed to facilitate the optimization of the multi-tasking experience, something neither the online nor the print experience can offer. After all, one can engage in various forms of physical activity while also listening to one's favourite podcast content. Nothing like this can be achieved by online or print content, which demands the complete attention of the reader.

It has been noted more than once that the burgeoning of derivative works and the success of their commercial exploitation have been among the main drivers of the expansion of copyright over the last several decades. Both commercially and artistically, therefore, the creation and exploitation of derivative works is one of the success stories of modern copyright. When it comes to MP3 content, however, that is no longer the case, at least for the New York Times. With this decision, at least one major copyright owner has apparently chosen to cut back on the production of quality derivative works. That is a result that should be lamented.

Thursday, 22 December 2011

The Higgs Boson -- and the value of money

What does the (tentatively discovered) Higgs Boson have to do with finance? Nothing. At least nothing obvious. Still, it's a fascinating topic with grand sweeping themes. I've written an essay on it for Bloomberg Views, which will appear later tonight or tomorrow. I'll add the link when it does.

The interesting thing to me about the Higgs Boson, and the associated Higgs field (the boson is an elementary excitation of this field), is the intellectual or conceptual history of the idea. It seems crazy to think that as the universe cooled (very shortly after the big bang) a new field, the Higgs field, suddenly appeared, filling all space, and giving particles mass in proportion to how strongly they interact with that field. It would be a crazy idea if it were just a proposal pulled out of thin air. But the history is that Higgs' work (and the work of many others at the same time, the early 1960s) had very strong stimulation from the BCS theory of superconductivity of ordinary metals, which appeared in 1957.

That theory explained how superconductivity originates through the emergence below a critical temperature of a condensate of paired electrons (hence, bosons) which acts as an extremely sensitive electromagnetic medium. Try to impose a magnetic field inside a superconductor (by bringing a magnet close, for example) and this condensate or field will respond by stirring up currents which act precisely to cancel the field inside the superconductor. This is the essence of superconductivity -- its appearance changes physics inside the superconductor in such a way that electromagnetic fields cannot propagate. In quantum terms (from quantum electrodynamics), this is equivalent to saying that the photon -- the carrier of the electromagnetic fields -- comes to have a mass. It does so because it interacts very strongly with the condensate.

This idea from superconductivity is pretty much identical to the Higgs mechanism for giving the W and Z particles (the carriers of the weak force) mass. This is what I think is fascinating. The Higgs prediction arose not so much from complex mathematics, but from the use of analogy and metaphor -- I wonder if the universe is in some ways like a superconductor? If we're living in a superconductor (not for ordinary electrical charge, but for a different kind of charge of the electroweak field), then it's easy to understand why the W and Z particles have big masses (more than 100 times the mass of the proton). They're just like photons traveling inside an ordinary superconductor -- inside an ordinary metal, lead or tin or aluminum, cooled down to low temperatures.

I think it's fitting that physics theory so celebrated for bewildering mathematics and abstraction beyond ordinary imagination actually has its roots in the understanding of grubby things like magnets and metals. That's where the essential ideas were born and found their initial value.

Having said that none of this has anything to do with finance, I should nonetheless mention a fascinating proposal from 2000 by Per Bak, Simon Nørrelykke and Martin Shubik, which draws a very close analogy between the process that determines the value of money and a Higgs-like mechanism. They made the observation that the absolute value of money is essentially undetermined:
The value of money represents a “continuous symmetry”. If, at some point, the value of money was globally redefined by a certain factor, this would have no consequences whatsoever. Thus, in order to arrive at a specific value of money, the continuous symmetry must be broken.
In other words, a loaf of bread could be worth $1, $10, or $100 -- it doesn't matter. But here and now in the real world it does have one specific value. The symmetry is broken.
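To make that symmetry concrete, here is a toy sketch in Python (the goods and prices are invented for illustration, not taken from the paper): globally rescaling every nominal price by the same factor leaves every exchange ratio between goods unchanged, which is exactly the continuous symmetry Bak and colleagues describe.

```python
from itertools import combinations

# Hypothetical nominal prices (in dollars) for a toy three-good economy.
prices = {"bread": 1.0, "milk": 2.5, "coffee": 4.0}

def relative_prices(p):
    """Exchange ratios between every pair of goods."""
    return {(a, b): p[a] / p[b] for a, b in combinations(sorted(p), 2)}

# Globally redefine the value of money by an arbitrary factor.
factor = 100.0
rescaled = {good: factor * price for good, price in prices.items()}

# Every relative price is unchanged: the rescaling is a symmetry,
# and nothing in the exchange ratios picks out one absolute value of money.
assert relative_prices(prices) == relative_prices(rescaled)
print(relative_prices(prices))
```

Nothing internal to the exchange ratios selects a particular `factor`; the economy's dynamics must break the symmetry and settle on one.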

This idea of continuous symmetry arises frequently in physics. And it is indeed the breaking of a continuous symmetry that underlies the onset of superconductivity. The mathematics of field theory shows that, any time a continuous symmetry is broken (so that some variable comes to take on one specific value), a new dynamical mode appears in the theory -- a so-called Goldstone mode -- corresponding to fluctuations along the direction of the continuous symmetry. This isn't quite the appearance of mass -- that takes another step in the mathematics -- but this Goldstone business is a part of the Higgs mechanism.
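In symbols (a standard textbook sketch, not drawn from the Bak-Nørrelykke-Shubik paper itself): take a complex field with a "Mexican hat" potential and see where the Goldstone mode comes from.

```latex
% A complex field \phi = \rho\, e^{i\theta} with the potential
\[
  V(\phi) = -\mu^2 |\phi|^2 + \lambda |\phi|^4, \qquad \mu^2,\ \lambda > 0,
\]
% is invariant under the continuous symmetry \theta \to \theta + \alpha.
% Its minima lie on the circle
\[
  |\phi| = \rho_0 = \sqrt{\mu^2 / 2\lambda},
\]
% so the ground state must pick one particular phase \theta_0, breaking the
% symmetry. Fluctuations of \theta along the circle of minima cost no
% potential energy: this massless mode is the Goldstone mode. Radial
% fluctuations of \rho away from \rho_0, by contrast, are massive.
```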

I'll try to return to this paper again. It offers a seemingly plausible dynamical model for how a value of money can emerge in an economy, and also why it should be subject to strong inherent fluctuations (because of the Goldstone mode). None of this comes out of equilibrium theory, nor should one expect it to as money is an inherently dynamical thing -- we use it as a tool to manage activities through time, selling our services today to buy food next week, for example.

Tuesday, 20 December 2011

Economics and IP: the Katonomics posts

After posting this item on the IPKat weblog, it occurred to me that there are probably a good many readers of the IP Finance weblog who are interested in the point at which economics intersects with IP but who do not read the IPKat. Accordingly I've listed the titles of a series of six posts on economics and IP, written by economist Dr Nicola Searle and published together under the term "Katonomics". A further series by the same author will follow in the New Year.
  • No.1: The social contract theory of IP
  • No.2: The economics of trade marks
  • No.3: Evidence-based policy: the challenge of data
  • No.4: Where to look for an IP-oriented economist
  • No.5: The paradox of fashion
  • No.6: The economics of IP in pharmaceuticals.

Thursday, 15 December 2011

Can Policy Right the Science Ship? The Case of Argentina


With another Nobel Prize season behind us after the winners picked up their prizes last weekend, it is worthwhile to consider the state of research and development in developing countries (or the "low end" of developed countries). Once again, and for the fourth time in less than a decade, an Israeli was awarded the Nobel Prize for Chemistry, this time Professor Dan Shechtman of the Technion--Israel Institute of Technology, here. However, setting aside the outsized success of Israelis, especially in Chemistry, the track record of similarly placed countries in Nobel Prize awards has become more and more sparse over the years.

Against that backdrop, The Economist published an interesting article in its November 5th issue. Entitled "Cristina the Alchemist: Science in Argentina", the article discusses the multi-faceted attempts of the Argentine government under successive presidents, first the late Nestor Kirchner and more recently his widow, Cristina Fernandez, to upgrade the state of science and technological research in the country.

Argentina is hardly without a respectable past in this regard. Using Nobel Prize awards as a rough proxy, the country has seen three winners in science. However, the last of these winners received the award in 1984 and the country has been witness to a precipitous decline since then. In response, the government has adopted a series of measures designed to right the sinking ship of Argentine science. Based on the article, the leading aspects of this push can be summarized as follows:

1. R&D expenditure has risen from 0.41% to 0.64% of GDP (although this is still far less than Brazil's R&D expenditure in 2009 -- 1.18% of GDP).

2. Under President Kirchner, researchers' salaries were increased, an organized scheme was put into place to repatriate Argentine scientists working abroad, and tax breaks were given to the software industry. Under President Fernandez, a new science ministry was created and grants for new product development have been increased.

3. The state is financially supporting the cost of registering patents in jurisdictions outside Argentina, as well as lawyers' fees in connection with defending those patents. It is also supporting the placement of PhDs with employers in the IT sector, including partial support of their salaries.

4. Perhaps the most interesting result of the foregoing is that 854 scientists have returned to Argentina, lured by new labs and increased compensation. In turn, researchers have increased their presence in the leading scientific journals to 179 published articles during the past decade, compared to only 30 articles published in the 1990s. As well, there seem to be particularly noteworthy developments in agriculture and horticulture.

5. That said, reservations have been raised about the extent to which these scientists are engaged in industry-oriented enterprises. Moreover, the article is stone-silent on the extent of patent activity as a result of these efforts. At the macro level, it remains to be seen whether the Argentine government will stay the political course and maintain these policies over an extended period of time, much less whether these efforts will bear fruit at the level of Nobel prizes and similar achievements 15-30 years down the line.

The example of Israel should show Argentina how a mix of public and private activities can enable a marginally developed country to reach world-class accomplishments in science. The Israeli situation should also serve as a caution that continued vigilance is essential. While the country basked last week in the Nobel Prize granted to Professor Shechtman, the professor himself sent a pointed and clear message to the country's leaders: unless you adopt a wide-reaching set of changes in the approach to and support of science, starting from primary education, there will not be another generation of Nobel Prize recipients. The implications for Argentina are clear.

Tuesday, 13 December 2011

a little more on power laws

I wanted to respond to several insightful comments on my recent post on power laws in finance. And, after that, pose a question on the economics/finance history of financial time series that I hope someone out there might be able to help me with.

First, comments:

ivansml said...
Why exactly is power-law distribution for asset returns inconsistent with EMH? It is trivial to write "standard" economic model where returns have fat tails, e.g. if we assume that stochastic process for dividends / firm profits has fat tails. That of course may not be very satisfactory explanation, but it still shows that EMH != normal distribution. In fact, Fama wrote about non-gaussian returns back in 1960's (and Mandelbrot before him), so the idea is not exactly new. The work you describe here is certainly useful and interesting, but pure patterns in data (or "stylized facts", as economists would call them) by themselves are not enough - we need some theory to make sense of them, and it would be interesting to hear more about contributions from econophysics in that area.
James Picerno said...
It's also worth pointing out that EMH, as I understand it, doesn't assume or dismiss that returns follow some specific distribution. Rather, EMH simply posits that prices reflect known information. For many years, analysts presumed that EMH implies a random distribution, but the empirical record says otherwise. But the random walk isn't a condition of EMH. Andrew Lo of MIT has discussed this point at length. The market may or may not be efficient, but it's not conditional on random price fluctuations. Separately, ivansmi makes a good point about models. You need a model to reject EMH. But that only brings you so far. Let's say we have a model of asset pricing that rejects EMH. Then the question is whether EMH or the model is wrong? That requires another model. In short, it's ultimately impossible to reject or accept EMH, unless of course you completely trust a given model. But that brings us back to square one. Welcome to economics.
I actually agree with these statements. Let me try to clarify. In my post I said, referring to the fat tails in returns and 1/t decay of volatility correlations, that  "None of these patterns can be explained by anything in the standard economic theories of markets (the EMH etc)." The key word is of course "explained."

The EMH has so much flexibility and is so loosely linked to real data that it is indeed consistent with these observations, as Ivansml (Mark) and James rightly point out. I think it is probably consistent with any conceivable time series of prices. But "being consistent with" isn't a very strong claim, especially if the consistency comes from making further subsidiary assumptions about how these fat tails might come from fluctuations in fundamental values. This seems like a "just so" story (even if the idea that fluctuations in fundamental values could have fat tails is not at all preposterous).

The point I wanted to make is that nothing (that I know of) in traditional economics/finance (i.e. coming out of the EMH paradigm) gives a natural and convincing explanation of these statistical regularities. Such an explanation would start from simple well accepted facts about the behaviour of individuals, firms, etc., market structures and so on, and then demonstrate how -- because of certain logical consequences following from these facts and their interactions -- we should actually expect to find just these kinds of power laws, with the same exponents, etc., and in many different markets. Reading such an explanation, you would say "Oh, now I see where it comes from and how it works!"

To illustrate some possibilities, one class of proposed explanations sees large market movements as having inherently collective origins, i.e. as reflecting large avalanches of trading behaviour coming out of the interactions of market participants. Early models in this class include the famous Santa Fe Institute Stock Market model developed in the mid 1990s. This nice historical summary by Blake LeBaron explores the motivations of this early agent-based model, the first of which was to include a focus on the interactions among market participants, and so go beyond the usual simplifying assumptions of standard theories which assume interactions can be ignored. As LeBaron notes, this work began in part...
... from a desire to understand the impact of agent interactions and group learning dynamics in a financial setting. While agent-based markets have many goals, I see their first scientific use as a tool for understanding the dynamics in relatively traditional economic models. It is these models for which economists often invoke the heroic assumption of convergence to rational expectations equilibrium where agents’ beliefs and behavior have converged to a self-consistent world view. Obviously, this would be a nice place to get to, but the dynamics of this journey are rarely spelled out. Given that financial markets appear to thrive on diverse opinions and behavior, a first level test of rational expectations from a heterogeneous learning perspective was always needed.   
I'm going to write posts on this kind of work soon looking in much more detail. This early model has been greatly extended and had many diverse offspring; a more recent review by LeBaron gives an updated view. In many such models one finds the natural emergence of power law distributions for returns, and also long-term correlations in volatility. These appear to be linked to various kinds of interactions between participants. Essentially, the market is an ecology of interacting trading strategies, and it has naturally rich dynamics as new strategies invade and old strategies, which had been successful, fall into disuse. The market never settles into an equilibrium, but has continuous ongoing fluctuations.

Now, these various models haven't yet explained anything, but they do pose potentially explanatory mechanisms, which need to be tested in detail. Just because these mechanisms CAN produce the right numbers doesn't mean this is really how it works in markets. Indeed, some physicists and economists working together have proposed a very different kind of explanation for the power law with exponent 3 for the (cumulative) distribution of returns which links it to the known power law distribution of the wealth of investors (and hence the size of the trades they can make). This model sees large movements as arising in the large actions of very wealthy market participants. However, this is more than merely attributing the effect to unknown fat tails in fundamentals, as would be the case with EMH based explanations. It starts with empirical observations of tail behaviour in several market quantities and argues that these together imply what we see for market returns.

There are more models and proposed explanations, and I hope to get into all this in some detail soon. But I hope this explains a little why I don't find the EMH based ideas very interesting. Being consistent with these statistical regularities is not as interesting as suggesting clear paths by which they arise.

Of course, I might make one other point too, and maybe this is, deep down, what I find most empty about the EMH paradigm. It essentially assumes away any dynamics in the market. Fundamentals get changed by external forces and the theory supposes that this great complex mass of heterogeneous humanity which is the market responds instantaneously to find the new equilibrium which incorporates all information correctly. So, it treats the non-market part of the world -- the weather, politics, business, technology and so on -- as a rich thing with potentially complicated dynamics. Then it treats the market as a really simple dynamical thing which just gets driven in slave fashion by the outside. This to me seems perversely unnatural and impossible to take seriously. But it is indeed very difficult to rule out with hard data. The idea can always be contorted to remain consistent with observations.

Finally, another valuable comment:
David K. Waltz said...
In one of Taleeb's books, didn't he make mention that something cannot be proven true, only disproven? I think it was the whole swan thing - if you have an appropriate sample and count 100% white swans does not prove there are ONLY white swans, while a sample that has a black one proves that there are not ONLY white swans.
Again, I agree completely. This is a basic point about science. We don't ever prove a theory, only disprove it. And the best science works by trying to find data to disprove a hypothesis, not by trying to prove it.

I assume David is referring to my discussion of the empirical cubic power law for market returns. This is indeed a tentative stylized fact which seems to hold with appreciable accuracy in many markets, but there may well be markets in which it doesn't hold (or periods in which the exponent changes). Finding such deviations would be very interesting, as it might offer further clues as to the mechanism behind this phenomenon.

NOW, for the question I wanted to pose. I've been doing some research on the history of finance, and there's something I can't quite understand. Here's the problem:

1. Mandelbrot in the early 1960s showed that market returns had fat tails; he conjectured that they fit the so-called Stable Paretian (now called Stable Levy) distributions which have power law tails. These have the nice property (like the Gaussian) that the composition of the returns for longer intervals, built up from component Stable Paretian distributions, also has the same form. The market looks the same at different time scales.
2. However, Mandelbrot noted in that same paper a shortcoming of his proposal. You can't think of returns as being independent and identically distributed (i.i.d.) over different time intervals because the volatility clusters -- high volatility predicts more to follow, and vice versa. We don't just have an i.i.d. process.
3. Lots of people documented volatility clustering over the next few decades, and in the 1980s Robert Engle and others introduced ARCH/GARCH and all that -- simple time series models able to reproduce the realistic properties of financial time series, including volatility clustering.
4. But today I found several papers from the 1990s (and later) still discussing the Stable Paretian distribution as a plausible model for financial time series.
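For illustration, here is a minimal GARCH(1,1) simulation in Python (the parameter values are arbitrary and merely illustrative, not fitted to any market); it shows how an i.i.d. Gaussian noise input is turned into a series with both volatility clustering and fat tails:

```python
import numpy as np

rng = np.random.default_rng(0)

# GARCH(1,1): r_t = sigma_t * eps_t,
#             sigma_t^2 = omega + alpha * r_{t-1}^2 + beta * sigma_{t-1}^2.
# Illustrative parameters; alpha + beta near 1 gives persistent volatility.
omega, alpha, beta = 1e-6, 0.10, 0.88
n = 50_000

r = np.empty(n)
sigma2 = omega / (1 - alpha - beta)  # start at the unconditional variance
for t in range(n):
    r[t] = np.sqrt(sigma2) * rng.standard_normal()
    sigma2 = omega + alpha * r[t] ** 2 + beta * sigma2

# Fat tails: excess kurtosis of the returns is well above the Gaussian value 0.
excess_kurtosis = np.mean(r ** 4) / np.mean(r ** 2) ** 2 - 3.0

# Volatility clustering: squared returns are positively autocorrelated.
sq = r ** 2 - np.mean(r ** 2)
acf1 = np.mean(sq[1:] * sq[:-1]) / np.mean(sq ** 2)

print(f"excess kurtosis: {excess_kurtosis:.2f}, lag-1 autocorr of r^2: {acf1:.2f}")
```

The point of the sketch is that the innovations are i.i.d. Gaussian; the fat tails and clustering emerge entirely from the feedback of past volatility into current volatility.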

My question is simply -- why was anyone even 20 years ago still writing about the Stable Paretian distribution when the reality of volatility clustering was so well known? My understanding is that this distribution was proposed as a way to save the i.i.d. property (by showing that such a process can still create market fluctuations having similar character on all time scales). But volatility clustering is enough on its own to rule out any i.i.d. process.

Of course, the Stable Paretian business has by now been completely ruled out by empirical work establishing the value of the exponent for returns, which is too large to be consistent with such distributions. I just can't see why it wasn't relegated to the history books long before.

The only possibility, it just dawns on me, is that people may have thought that some minor variation of the original Mandelbrot view might work best. That is, let the distribution over any interval be Stable Paretian, but let the parameters vary a little from one moment to the next. You give up the i.i.d. but might still get some kind of nice stability properties as short intervals get put together into longer ones. You could put Mandelbrot's distribution into ARCH/GARCH rather than the Gaussian. But this is only a guess. Does anyone know?

Friday, 09 December 2011

Prosecuting Wall St.

By way of Simolean Sense:
The following is a script of "Prosecuting Wall Street" (CBS) which aired on Dec. 4, 2011. Steve Kroft is correspondent, James Jacoby, producer.

It's been three years since the financial crisis crippled the American economy, and much to the consternation of the general public and the demonstrators on Wall Street, there has not been a single prosecution of a high-ranking Wall Street executive or major financial firm even though fraud and financial misrepresentations played a significant role in the meltdown. We wanted to know why, so nine months ago we began looking for cases that might have prosecutorial merit. Tonight you'll hear about two of them. We begin with a woman named Eileen Foster, a senior executive at Countrywide Financial, one of the epicenters of the crisis.

Steve Kroft: Do you believe that there are people at Countrywide who belong behind bars?

Eileen Foster: Yes.

Kroft: Do you want to give me their names?

Foster: No.

Kroft: Would you give their names to a grand jury if you were asked?

Foster: Yes.

But Eileen Foster has never been asked - and never spoken to the Justice Department - even though she was Countrywide's executive vice president in charge of fraud investigations...
See the video and transcript here.

Thursday, 08 December 2011

Patents and standards again: a valuable study

This weblog has focused a good deal in recent weeks on standards and patents. In this context, the Study on the Interplay between Standards and Intellectual Property Rights (IPRs), April 2011, is highly relevant. Commissioned and financed by the Directorate General for Enterprise and Industry of the European Commission, this study was produced by the Fraunhofer Institute for Communication Systems and Dialogic in collaboration with the School of Innovation Sciences at Eindhoven University of Technology, and enjoyed the support of two legal consultants.

Ruben Schellingerhout, who kindly drew the attention of the IP Finance weblog to this study, explains a bit about it:
"The study shows that distribution of patents in standards is very skewed, both in terms of standards and in terms of owners. A few standards cover a large number of patents while most standards include only a few patents, or no patents at all [I had no idea that this was the case]. And a relatively small group of companies own a large number of essential patents in standards, while most companies own only a few or none of these patents. 

In the telecommunications and the consumer electronics market, implementers ensure access to essential IPRs most often via cross-licensing and - to a lesser extent - via general licensing-in and patent pools.

Legal uncertainty can still arise on the obligation to disclose, the irrevocability and the geographic scope of the licensing commitment and in cases of transfer of IPRs if they are still subject to a FRAND licensing commitment. Companies expect standard setting organisations to improve transparency on essential IPRs".
Thanks, Ruben, for your kind assistance.  Readers can access the report in full here.

Selasa, 06 Desember 2011

Power laws in finance

My latest column in Bloomberg looks very briefly at some of the basic mathematical patterns we know about in finance. Science has a long tradition of putting data and observation first. Look very carefully at what needs to be explained -- mathematical patterns that show up consistently in the data -- and then try to build simple models able to reproduce those patterns in a natural way.

This path holds great promise in finance and economics, although it hasn't been pursued very far until recently. My Bloomberg column gives a sketch of what is going on, but I'd like to give a few more details here, along with some links.

The patterns we find in finance are statistical regularities -- broad statistical patterns which show up in all markets studied, with an impressive similarity across markets in different countries and for markets in different instruments. The first regularity is the distribution of returns over various time intervals, which has been found generically to have broad power law tails -- "fat tails" -- implying that large fluctuations up or down are much more likely than they would be if markets fluctuated in keeping with normal Gaussian statistics. Anyone who read The Black Swan knows this.

This pattern has been established in a number of studies over the past 15 years or so, mostly by physicist Eugene Stanley of Boston University and colleagues. This paper from 1999 is perhaps the most notable, as it used enormous volumes of historical data to establish the fat-tailed pattern for returns over times ranging from one minute up to about four days. One of the most powerful things about this approach is that it doesn't begin with any far-reaching assumptions about human behaviour, the structure of financial markets or anything else, but only asks -- are there patterns in the data? As the authors note:
The most challenging difficulty in the study of a financial market is that the nature of the interactions between the different elements comprising the system is unknown, as is the way in which external factors affect it. Therefore, as a starting point, one may resort to empirical studies to help uncover the regularities or “empirical laws” that may govern financial markets.    
This strategy seems promising to physicists because it has worked in building theories of complex physical systems -- liquids, gases, magnets, superconductors -- for which it is also often impossible to know anything in great detail about the interactions between the molecules and atoms within. This hasn't prevented the development of powerful theories because, as it turns out, many of the precise details at the microscopic level DO NOT influence the large scale collective properties of the system. This has inspired physicists to think that the same may be true in financial markets -- at least some of the collective behaviour we see in markets, their macroscopic behaviour, may be quite insensitive to details about human decision making, market structure and so on.

The authors of this 1999 study summarized their findings as follows:


Several points of clarification. First, the result for the power law with exponent close to 3 is a result for the cumulative distribution. That is, it gives the probability that a return will be greater than a certain value (not just equal to that value). Second, the fact that this value lies outside the range [0,2] means that the process generating these fluctuations isn't a simple stationary random process with independent, identically distributed increments. This was the idea initially proposed by Benoit Mandelbrot on the basis of the so-called Lévy-stable distributions. This study and others have established that this idea can't work -- something more complicated is going on.
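As a toy illustration of how such a tail exponent is measured, here is a minimal sketch using simulated Student-t data as a stand-in for real returns (the Hill-style estimator and the sample sizes are my own illustrative choices, not those of the 1999 study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "returns": a Student-t distribution with 3 degrees of freedom
# has power-law tails P(|r| > x) ~ x^-3, mimicking the cubic tail exponent
# reported for real markets. (Simulated data, not real market returns.)
returns = rng.standard_t(df=3, size=200_000)

def hill_tail_exponent(x, k=1000):
    """Hill-style estimate of the tail exponent from the k largest |values|."""
    tail = np.sort(np.abs(x))[-k:]               # the k largest magnitudes
    return 1.0 / np.mean(np.log(tail / tail[0]))

alpha = hill_tail_exponent(returns)
print(round(alpha, 2))  # typically lands close to 3
```

On real minute-scale returns the same estimator, applied to the upper tail, is what yields the exponent near 3 discussed above.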

That complication is also referred to in the second paragraph above. If you take the data on returns at the one-minute level and randomize the order in which they appear, you still get the same power law tails in the distribution of returns over one minute -- it's the same data. But this new time series has different returns over longer times, generated by combining sequences of the one-minute returns. The distribution over longer and longer times turns out to converge slowly to a Gaussian for the randomized data, meaning that the true fat-tailed distribution over longer times has its origin in rich and complex correlations between market movements at different times (which get wiped out by the randomization). Again, we're not just dealing with a fixed probability distribution and independent changes over different intervals.
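That shuffling experiment is easy to caricature on simulated data. Here a GARCH(1,1)-style recursion stands in for volatility-clustered one-minute returns (the parameters are illustrative, not fitted to any market):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Toy "one-minute returns" with volatility clustering, generated by a
# simple GARCH(1,1)-style recursion (illustrative parameters, not fitted).
r = np.empty(n)
var = 1.0
for t in range(n):
    r[t] = rng.normal(0.0, np.sqrt(var))
    var = 0.05 + 0.2 * r[t] ** 2 + 0.75 * var

def excess_kurtosis(x):
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2) ** 2 - 3.0

def aggregate(x, w):
    """Sum non-overlapping windows of w consecutive one-minute returns."""
    return x[: len(x) // w * w].reshape(-1, w).sum(axis=1)

w = 60  # "one-hour" returns
orig = excess_kurtosis(aggregate(r, w))
shuf = excess_kurtosis(aggregate(rng.permutation(r), w))
print(orig > shuf)  # clustering keeps the aggregated returns fat-tailed
```

The shuffled series has exactly the same one-minute distribution, yet its hourly returns are much closer to Gaussian (lower excess kurtosis), because the temporal correlations have been destroyed.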

To read more about this, see this nice review by Xavier Gabaix of MIT. It covers this and many other power laws in finance and economics.

Now, the story gets even more interesting if you look past the mere distribution of returns and study the correlations between market movements at different times. Market movements are, of course, extremely hard to predict. But it is very interesting where the unpredictability comes in.

The so-called autocorrelation of the time series of market returns decays to zero after a few minutes. This is essentially a measure of how much the return now can be used to predict a return in the future. After a few minutes, there's nothing. This is the sense in which the markets are unpredictable. However, there are levels of predictability. It was discovered in the early 1990s, and has been confirmed many times since in different markets, that the time series of volatility -- the absolute value of the market return -- has long-term correlations, a kind of long-term memory. Technically, the autocorrelation of this time series only decays to zero very slowly.

This is shown below in the following figure (from a representative paper, again from the Boston University group) which shows the autocorrelation of the return time series g(t) and also of the volatility, which is the absolute value of g(t):



Clearly, whereas the first signal shows no correlations after about 10 minutes, the second shows correlations and predictability persisting out to times as long as 10,000 minutes, which is on the order of 10 days or so.
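The contrast in that figure can be reproduced on simulated volatility-clustered data (again a GARCH(1,1)-style toy series; the parameters, the lag, and the 0.03 threshold are all illustrative choices of mine):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Toy volatility-clustered returns (GARCH(1,1)-style recursion;
# parameters are illustrative, not estimated from market data).
r = np.empty(n)
var = 1.0
for t in range(n):
    r[t] = rng.normal(0.0, np.sqrt(var))
    var = 0.05 + 0.2 * r[t] ** 2 + 0.75 * var

def autocorr(x, lag):
    """Sample autocorrelation of the series x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

lag = 20
acf_returns = autocorr(r, lag)             # direction: essentially memoryless
acf_volatility = autocorr(np.abs(r), lag)  # magnitude: long-lived memory
print(abs(acf_returns) < 0.03, acf_volatility > 0.03)
```

The signed returns decorrelate almost immediately, while their absolute values remain measurably correlated at long lags, just as in the Boston group's figure.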

So it's the directionality of price movements that has very little predictability, whereas the magnitude of changes follows a process with much more interesting structure. It is in the record of this volatility that one sees potentially deep links to other physical processes, including earthquakes. A particularly interesting paper is this one, again by the Boston group, quantifying several ways in which market volatility obeys quantitative laws known from earthquake science, especially the Omori law describing how the probability of aftershocks decays following a main earthquake. This probability decays in proportion to 1/time since the main quake, meaning that aftershocks are most likely immediately afterward and become progressively less likely with time. Episodes of high volatility appear to follow similar behaviour quite closely.
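A minimal sketch of how an Omori exponent is recovered from event times, using synthetic data drawn from a 1/t rate (the time horizon and binning are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic aftershock times with an Omori-law rate n(t) ~ 1/t between
# t = 1 and t = 1000: for exponent 1 the cumulative count grows like
# log(t), so event times can be drawn as t = exp(U * log(1000)).
t_max = 1000.0
times = np.exp(rng.uniform(0.0, np.log(t_max), size=50_000))

# Recover the exponent from a log-log fit of binned event rates.
edges = np.logspace(0, 3, 30)
counts, _ = np.histogram(times, bins=edges)
centers = np.sqrt(edges[:-1] * edges[1:])   # geometric bin centres
rate = counts / np.diff(edges)              # events per unit time
slope, intercept = np.polyfit(np.log(centers), np.log(rate), 1)
omori_p = -slope
print(round(omori_p, 1))  # recovers an exponent close to 1
```

Applied to the rate of high-volatility events after a crash instead of synthetic times, the same log-log fit is what yields the Omori exponent close to 1 reported in these papers.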

Perhaps even better is another study, which looks at the link to earthquakes with a somewhat tighter focus. The abstract captures the content quite well:
We analyze the memory in volatility by studying volatility return intervals, defined as the time between two consecutive fluctuations larger than a given threshold, in time periods following stock market crashes. Such an aftercrash period is characterized by the Omori law, which describes the decay in the rate of aftershocks of a given size with time t by a power law with exponent close to 1. A shock followed by such a power law decay in the rate is here called Omori process. We find self-similar features in the volatility. Specifically, within the aftercrash period there are smaller shocks that themselves constitute Omori processes on smaller scales, similar to the Omori process after the large crash. We call these smaller shocks subcrashes, which are followed by their own aftershocks. We also show that the Omori law holds not only after significant market crashes as shown by Lillo and Mantegna [Phys. Rev. E 68, 016119 2003], but also after “intermediate shocks.” ...
These are only a few of the power law type regularities now known to hold for most markets, with only very minor differences between markets. An important effort is to find ways to explain these regularities in simple and plausible market models. None of these patterns can be explained by anything in the standard economic theories of markets (the EMH etc). They can of course be reproduced by suitably generating time series using various methods, but that hardly counts as explanation -- that's just using time series generators to reproduce certain kinds of data.

The promise of finding these kinds of patterns is that they may strongly constrain the types of theories to be considered for markets, by ruling out all those which do not naturally give rise to this kind of statistical behaviour. This is where data matters most in science -- by proving that certain ideas, no matter how plausible they seem, don't work. This data has already stimulated the development of a number of different avenues for building market theories which can explain the basic statistics of markets, and in so doing go well beyond the achievements of traditional economics.

I'll have more to say on that in the near future.

Jumat, 02 Desember 2011

Consumable IP

Some OEMs derive a significant proportion of their profits from the sale of consumables (a recent article in The Times recalled the 2002 assertion by the Consumers Association that the ink in Hewlett-Packard’s printers “was more expensive, per millilitre, than Dom Perignon champagne”). Others make no attempt to prevent aftermarket suppliers, instead making their profit on the sale of original equipment.

Now there would appear to be a third way: in the field of aircraft brakes, Nasco has announced an “innovative alternative” approach to providing [corporate] customers with new brake designs involving a fixed-price design, development and production contract that includes re-procurement data rights. Such data are believed to include manufacturing drawings and material specifications.

According to Nasco’s website, “customers pay the development costs up front but reap the long-term benefits of lower cost spare parts through competitive sourcing.” Contrast this with the use of trade secrets in manufacturing drawings and material specifications to exclude competitors as previously reported here.

Interview with Dave Cliff

Dave Cliff of the University of Bristol is someone whose work I've been meaning to look at much more closely for a long time. Essentially he's an artificial intelligence expert, but he has devoted some of his work to developing trading algorithms. He suggests that many of these algorithms, even ones working on extremely simple rules, consistently outperform human beings, which rather undermines the common economic view that people are highly sophisticated rational agents.

I just noticed that Moneyscience is beginning a several-part interview with Cliff, the first part having just appeared. I'm looking forward to the rest. Some highlights from Part I, beginning with Cliff's early work, mid 1990s, on writing algorithms for trading:
I wrote this piece of software called ZIP, Zero Intelligence Plus. The intention was for it to be as minimal as possible, so it is a ridiculously simple algorithm, almost embarrassingly so. It’s essentially some nested if-then rules, the kind of thing that you might type into an Excel spreadsheet macro. And this set of decisions determines whether the trader should increase or decrease a margin. For each unit it trades, has some notion of the price below which it shouldn’t sell or above which it shouldn’t buy and that is its limit price. However, the price that it actually quotes into the market as a bid or an offer is different from the limit price because obviously, if you’ve been told you can buy something and spend no more than ten quid, you want to start low and you might be bidding just one or two pounds. Then gradually, you’ll approach towards the ten quid point in order to get the deal, so with each quote you’re reducing the margin on the trade.  The key innovation I introduced in my ZIP algorithm was that it learned from its experience. So if it made a mistake, it would recognize that mistake and be better the next time it was in the same situation.

HFTR: When was this exactly?

DC: I did the research in 1996 and HP published the results, and the ZIP program code, in 1997. I then went on to do some other things, like DJ-ing and producing algorithmic dance music (but that’s another story!)

Fast-forward to 2001, when I started to get a bunch of calls because a team at IBM’s Research Labs in the US had just completed the first ever systematic experimental tests of human traders competing against automated, adaptive trading systems. Although IBM had developed their own algorithm called MGD, (Modified Gjerstad Dickhaut), it did the same kind of thing as my ZIP algorithm, using different methods. They had tested out both their MGD and my ZIP against human traders under rigorous experimental conditions and found that both algorithms consistently beat humans, regardless of whether the humans or robots were buyers or sellers. The robots always out-performed the humans.

IBM published their findings at the 2001 IJCAI conference (the International Joint Conference on AI) and although IBM are a pretty conservative company, in the opening paragraphs of this paper they said that this was a result that could have financial implications measured in billions of dollars. I think that implicitly what they were saying was there will always be financial markets and there will always be the institutions (i.e. hedge funds, pension management funds, banks, etc). But the traders that do the business on behalf of those institutions would cease to be human at some point in the future and start to be machines. 
Personally, I think there are two important things here. One is that, yes, trading will probably soon become almost all algorithmic. This may tend to make you think the markets will become more mechanical, their collective behaviour emerging out of the very simple actions of so many crude programs.

But the second thing is what this tells us about people -- that traders and investors and people in general aren't so clever or rational, and most of them have probably been following fairly simple rules all along, rules that machines can easily beat. So there's really no reason to think the markets should become more mechanical as they become more algorithmic. They've probably been quite mechanical all along, and algorithmic too -- it's just that non-rational zero intelligence automatons running the algorithms were called people. 
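For flavour, the margin-adjustment idea Cliff describes can be caricatured in a few lines of Python. This is a hypothetical sketch in the spirit of ZIP, not the published algorithm; the class, update rule and parameters are all invented here:

```python
# A toy seller in the spirit of Cliff's ZIP trader: it quotes a price
# above its limit and learns, trade by trade, to shrink or grow its
# margin. This is a hypothetical sketch, not the published ZIP algorithm.
class ZipLikeSeller:
    def __init__(self, limit_price, margin=0.5, learning_rate=0.1):
        self.limit = limit_price   # never sell below this price
        self.margin = margin       # fractional markup over the limit
        self.rate = learning_rate

    def quote(self):
        return self.limit * (1 + self.margin)

    def observe(self, last_trade_price):
        # Nested if-then rules: if the market trades above our quote we
        # left profit on the table, so raise the margin; if below, cut
        # the margin to stay competitive -- but never below zero, so the
        # quote never drops under the limit price.
        if last_trade_price > self.quote():
            self.margin += self.rate * self.margin
        else:
            self.margin = max(0.0, self.margin - self.rate * self.margin)

seller = ZipLikeSeller(limit_price=10.0)
for price in [12.0, 11.0, 10.5, 10.5, 10.5]:
    seller.observe(price)
print(seller.quote() >= seller.limit)  # True: quotes never breach the limit
```

The point is how little machinery is involved: a couple of if-then rules plus a learning rate, which is roughly the "almost embarrassingly simple" structure Cliff describes.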

Kamis, 01 Desember 2011

The Missing IP Narrative

It remains my most vexing professional challenge. The "it" is how to integrate IP/IC into management education. The vexation comes from the seeming paradox that, while intellectual property and intellectual capital are routinely described as cornerstones of innovation, if not modern business itself, their systematic presence in MBA curricula remains sporadic at best. I was reminded of this in connection with two quite different experiences that I had during the week.

In the first, I had occasion to spend some time with the dean of a local business school. Recently appointed, he was taking bold action to modify the school's MBA program to make it more appropriate for today's student body. In that connection, he wanted to hear more about my class on IP and Management that I teach elsewhere. His question, half "devil's advocate", half an expression of curricular skepticism, was simply this: "I have space for 20 or so courses in the program. Why should a course such as yours be part of the curriculum?"

The case in favour of inclusion is not simple. In the face of multiple courses in strategy, finance, marketing, and operations, the role of a course focusing on IP is difficult to explain. The uneven diffusion of IP subject matter throughout an organization, the origin of IP as a branch of legal practice and its intangible character all give IP a bit of orphan status within the school's curriculum.

The Dean pushed me for examples of how the course works in practice. A pregnant pause ensued, finally punctuated by several examples of IP and management that seemed to pique his interest. All the while I stressed that one can look at MBA education as a platform for imparting relevant narratives to the students. From this perspective, the ultimate justification for the course is that it puts the IP narrative front and centre: "Can you imagine a manager who does not have the ability to apply the IP narrative to his daily business?", I asked. I am not sure that I convinced him. If I failed, cohort after cohort of young managers will be trained at his school without receiving any systematic training in this field. The managerial narrative for these students will simply lack a meaningful consideration of IP.

This absence of a narrative for IP was reinforced in listening to a podcast that featured a well-known venture capitalist describing the foundations of the VC world. The speaker did not disappoint. He described the flow of foundation money from university and similar endowments as the turning point for VCs in attracting substantial investment capital. He emphasized the importance of the human dimension in any investment, and observed that any prospective company that puts special emphasis on an exit strategy lacks the necessary patience. He also distinguished between great innovative ideas and ideas with real market potential; there are a lot more of the former than the latter.

These multiple narratives about the VC enterprise were interesting and instructive, except for one thing: the speaker mentioned IP only in passing. Based on his words, IP was not a central part of the VC narrative. In follow-up correspondence, he replied briefly that the firm "of course" takes an interest in a company's IP, i.e., "FTO and patentability". In his view, IP is largely limited to patents, and the work required is the purview of patent technocrats, far removed from most of the company's managers.


This podcast and email correspondence reinforced the sense of frustration that I had felt in my meeting with the dean, namely that IP is not part of the mainstream MBA narrative for most students. The upshot is that most MBA students will continue to go through their programs with scant or simply no attention being paid to IP. Is there a price to be paid for this? Perhaps. It is frequently observed that innovation has materially declined over the last few years. There are no doubt a number of reasons for this troubling state of affairs. Against that backdrop, one wonders whether the absence of a meaningful narrative regarding IP within most MBA programs is another source of the innovative malaise. This is at least narrative food for thought.

Rabu, 30 November 2011

Alan Greenspan's nirvana

I wrote a post a while back exploring some of the silliest things economists were saying before the crisis about how financial engineering was making our economy more robust, stable, efficient, wonderful, beautiful, intelligent, self-regulating, and so on. The markets were, R. Glenn Hubbard and William Dudley were convinced, even leading to better governance by punishing bad governmental decisions. [How that could be the case when markets have a relentless focus on the very short term is hard to fathom, but they indeed did assert this].

Paul Krugman has recently undertaken a similar exercise in silliness mining -- in this case going through the hallucinations of Alan Greenspan. The Chairman of the Fed was evidently drinking the very same Kool-Aid:
Deregulation and the newer information technologies have joined, in the United States and elsewhere, to advance flexibility in the financial sector. Financial stability may turn out to have been the most important contributor to the evident significant gains in economic stability over the past two decades.

Historically, banks have been at the forefront of financial intermediation, in part because their ability to leverage offers an efficient source of funding. But in periods of severe financial stress, such leverage too often brought down banking institutions and, in some cases, precipitated financial crises that led to recession or worse. But recent regulatory reform, coupled with innovative technologies, has stimulated the development of financial products, such as asset-backed securities, collateral loan obligations, and credit default swaps, that facilitate the dispersion of risk.

Conceptual advances in pricing options and other complex financial products, along with improvements in computer and telecommunications technologies, have significantly lowered the costs of, and expanded the opportunities for, hedging risks that were not readily deflected in earlier decades. The new instruments of risk dispersal have enabled the largest and most sophisticated banks, in their credit-granting role, to divest themselves of much credit risk by passing it to institutions with far less leverage. Insurance companies, especially those in reinsurance, pension funds, and hedge funds continue to be willing, at a price, to supply credit protection.

These increasingly complex financial instruments have contributed to the development of a far more flexible, efficient, and hence resilient financial system than the one that existed just a quarter-century ago.

As Krugman notes, this can all be translated into ordinary language: "Thanks to securitization, CDOs, and AIG, nothing bad can happen!"

Bail out everyone

I had wondered about this idea a couple years ago -- but that's all I did, wondered about it. The idea is that when banks need bailing out -- and sadly, we seem stuck with that problem for the moment -- we shouldn't bail them out directly, but indirectly. For example, just give every single person in the US $1,000. Or maybe a voucher for $1,000 that they have to spend somewhere, or put in a bank. This quickly amounts to $300 billion infused into the economy, a large portion of which would end up in banks. So cash would be pumped into the banks too, but only through people first.

You can imagine all kinds of ways to play around with such a scheme. Paying off some of people's mortgages. The amount injected could be much larger. Perhaps similar funds would be injected directly into banks and other businesses as well. Mark Thoma has thought through some of the details. But I'm quite surprised this is the first I've heard of any idea even remotely like this. It seems like a much better idea than just giving money to the bankers who created the problem in the first place. Why don't we hear more about such possibilities?

Modern European Tragedy

The endgame playing out in Europe is a tragedy in the usual sense, but also in the sense of Greek tragedy -- downfall brought about ironically through the very efforts, perhaps even well intentioned, of those ultimately afflicted. It's terrible to see Europe looming toward disaster, but also utterly fascinating that everyone involved -- Greeks, Germans, French, the European Central Bank -- has acted in what they thought was their own interest, yet those very actions have led the collective to a likely outcome much worse for all. A tragedy of the commons.

Philosopher Simon Critchley has written a brilliant essay exploring this theme more generally. Among the most poetic analyses of the situation I have seen:
The euro was the very project that was meant to unify Europe and turn a rough amalgam of states in a free market arrangement into a genuine social, cultural and economic unity. But it has ended up disunifying the region and creating perverse effects, such as the spectacular rise of the populist right in countries like the Netherlands, for just about every member state, even dear old Finland.

What makes this a tragedy is that we knew some of this all along — economic seers of various stripes had so prophesied — and still we conspired with it out of arrogance, dogma and complacency.  European leaders — technocrats whom Paul Krugman dubbed this week “boring cruel romantics” — ignored warnings that the euro was a politically motivated project that would simply not work given the diversity of economies that the system was meant to cover. The seers, indeed, said it would fail; politicians across Europe ignored the warnings because it didn’t fit their version of the fantasy of Europe as a counterweight to United States’ hegemony. Bad deals were made, some lies were told, the peoples of the various member countries were bludgeoned into compliance often without being consulted, and now the proverbial chickens are coming home to roost.

But we heard nothing and saw nothing, for shame. The tragic truth that we see unspooling in the desperate attempts to shore up the European Union while accepting no responsibility for the unfolding disaster is something that we both willed and that threatens to now destroy the union in its present form.

The euro is a vast boomerang that is busy knocking over millions of people. European leaders, in their blindness, continue to act as if that were not the case.

Senin, 28 November 2011

The end of the Euro?

Three interesting articles on what now seems to be considered an increasingly likely event -- the end of the Euro (in its current form, although some version might arise from the ashes).

First, Gavyn Davies speculates on several possible scenarios for the collapse of the Euro. It might persist as the new currency of a smaller union including Germany and The Netherlands (in which case the value of the Euro would rise significantly), or it might persist as the new currency of the periphery countries after Germany bolts (in which case the value of the Euro would fall significantly). Or the Europeans might finally find a way through the ongoing nightmare. Not betting on that one.

Second, Satyajit Das goes into a little more detail, and I think rightly sees some cultural issues as ultimately being most important. The three logical possibilities are easy to list:
The latest plan has bought time, though far less than generally assumed. The European debt endgame remains the same: fiscal union (greater integration of finances where Germany and the stronger economies subsidise the weaker economies); debt monetisation (the ECB prints money); or sovereign defaults. 
 Germany may be largely in favour of solution number 1. But the smaller periphery countries, and perhaps France as well, will favour solution number 2. Hence, we may by default find Europe hurtling inexorably into "solution" number 3 -- sovereign defaults:
The accepted view is that, in the final analysis, Germany will embrace fiscal integration or allow printing money. This assumes that a cost-benefit analysis indicate that this would be less costly than a disorderly break-up of the Euro-zone and an integrated European monetary system. This ignores a deep-seated German mistrust of modern finance as well as a strong belief in a hard currency and stable money. Based on their own history, Germans believe that this is essential to economic and social stability. It would be unsurprising to see Germany refuse the type of monetary accommodation and open-ended commitment necessary to resolve the crisis by either fiscal union or debt monetisation.

Unless restructuring of the Euro, fiscal union or debt monetisation can be considered, sovereign defaults may be the only option available.
Perhaps it betrays a little bit of anarchy in my own soul, but I'm rooting quite hard for sovereign defaults. I wish the Greeks had gone ahead with their referendum. For all the complaining about the slack morals of the Greek taxpayer, every debt-creating transaction has two sides -- and the creditors (French and German banks) bear as much responsibility as the debtors.

Then again, the end is likely to bring some severe social misery, not to mention riots (the UK is already advising its European embassies on the likelihood). A third article by Simon Johnson and Peter Boone points ominously in this direction, essentially echoing Davies' analysis in bleaker language:
The path of the euro zone is becoming clear. As conditions in Europe worsen, there will be fewer euro-denominated assets that investors can safely buy. Bank runs and large-scale capital flight out of Europe are likely.

Devaluation can help growth but the associated inflation hurts many people and the debt restructurings, if not handled properly, could be immensely disruptive. Some nations will need to leave the euro zone. There is no painless solution.

Ultimately, an integrated currency area may remain in Europe, albeit with fewer countries and more fiscal centralization. The Germans will force the weaker countries out of the euro area or, more likely, Germany and some others will leave the euro to form their own currency. The euro zone could be expanded again later, but only after much deeper political, economic and fiscal integration.

Tragedy awaits. European politicians are likely to stall until markets force a chaotic end upon them. Let’s hope they are planning quietly to keep disorder from turning into chaos.

Kamis, 24 November 2011

FRAND terms from a competition authority's approach: a new essay

The issue of FRAND licensing terms is critical for all IT companies involved in a standardization process, but it is also a headache for competition authorities. As Mario Mariniello, a member of the Chief Competition Economist's team at the EC's Directorate General for Competition, recently highlighted in an article published in OUP's Journal of Competition Law and Economics (JCLE) entitled "Fair, Reasonable and Non-discriminatory (FRAND) Terms: A Challenge for Competition Authorities", the adoption of a technology standard can raise competition concerns when the owner of the chosen technology abuses the additional market power gained through standardization. FRAND terms can therefore be seen as a corrective device seeking a balance of interests between the licensor, who is entitled to the incremental rent "that arises from standardization with respect to the next best alternative", and the licensees, who can be considered "locked-in" (that is, forced to adopt the chosen standard).

In this article, Mario Mariniello highlights the fact that "FRAND commitments involve an incomplete contract between licensors and licensees", so their implementation will necessarily be controversial. From an antitrust perspective, FRAND commitments are ambiguous because there is no commonly accepted method of assessing their violation. The author therefore proposes a four-pronged screening test to determine whether such a violation has occurred:

If the following four criteria are met:

(1) ex-ante, a credible alternative to the adopted technology exists;
(2) ex-ante, prospective licensees cannot reasonably anticipate the licensor’s ex-post requests;
(3) ex-post, the licensor requests worse licensing conditions than ex-ante; and
(4) ex-post, the licensee is locked into the technology,

then a FRAND violation could have occurred and a competition authority needs to investigate and decide whether the terms and conditions of the defendant are fair, reasonable and non-discriminatory, which involves "an objective valuation of the royalty rate that the patent holder would have been able to charge if the standard did not increase its market power, subject to the broader context of the license contract."
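Mariniello's four-pronged screen is, in effect, a conjunctive test: all four conditions must hold before an authority should suspect a violation. A toy sketch of that logic (the field names and structure are this blog's illustration, not the author's):

```python
from dataclasses import dataclass

@dataclass
class FrandCase:
    """Stylised facts about a standard-essential patent dispute.

    Field names are illustrative shorthand for Mariniello's four prongs,
    not his terminology.
    """
    ex_ante_alternative_existed: bool  # (1) a credible rival technology existed before adoption
    ex_post_terms_unforeseeable: bool  # (2) licensees could not anticipate the ex-post demands
    ex_post_terms_worse: bool          # (3) the licensor now asks for worse conditions than ex-ante
    licensee_locked_in: bool           # (4) the licensee cannot realistically switch away

def warrants_investigation(case: FrandCase) -> bool:
    """All four conditions must hold before a FRAND violation is suspected."""
    return (case.ex_ante_alternative_existed
            and case.ex_post_terms_unforeseeable
            and case.ex_post_terms_worse
            and case.licensee_locked_in)
```

If any one prong fails, the screen says the authority need not intervene; only when all four are satisfied does the harder question of objectively valuing the royalty rate arise.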

This very interesting analysis can be accessed here.

While on the subject of INTIPSA ...

Finding a new CIPO is the nearest thing many people get to big game hunting ...
Yesterday's IP Finance post mentioned INTIPSA, the International IP Strategists Association. By sheer coincidence, the blog has received news that the organisation has a webinar coming up next week on the topic "Where Will Your Next CIPO Come From?"  According to the synopsis:
"Peter Spours from Tom Tom and Andrew Sant from Crown Holdings will be reflecting on their experiences and career paths in IP and discussing the future of the Chief Intellectual Property Officer role".
This webinar takes place next Wednesday, 30 November, at 15:00 GMT. Further details and registration are available via the INTIPSA website at www.intipsa.com

Rabu, 23 November 2011

INTIPSA: one year on

IP Finance is seeing an increasing degree of interest in INTIPSA -- the International IP Strategists Association. Indeed, in the past few weeks I have received a number of emails asking whether this weblog will be mentioning the organisation's existence.

To put the record straight, IP Finance has written about INTIPSA. Almost exactly one year ago we published this post, which alerted readers to the intended formation of INTIPSA and to the LinkedIn group which preceded it.

INTIPSA now has a busy website, a handsome logo and great prospects for the future. This weblog wishes it the best of luck and looks forward to its contributions to the well-being of IP business strategy.

Facts, figures and fun with FRAND: the seminar

When GSM stood for "grandma's sewing machine" ...
Yesterday's seminar, “Facts and figures on FRAND licensing for standards-essential IP”, turned out to be a most enjoyable and interesting experience.  Keith Mallinson (WiseHarbor) gave a presentation which covered a wide range of issues concerning the way we understand and view FRAND licences, patent pools and the measurement of their impact on profit, competition and the uptake of new technologies in the mobile telecoms sector. Other topics which made brief appearances included sewing machine cartels, regression analysis and the sinister-sounding Herfindahl-Hirschman Index.
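For readers who have not met it, the Herfindahl-Hirschman Index is less sinister than it sounds: it is simply the sum of the squared percentage market shares of the firms in a market, ranging from near 0 (atomistic competition) up to 10,000 (monopoly). A quick illustration:

```python
def hhi(market_shares_pct):
    """Herfindahl-Hirschman Index: sum of squared percentage market shares.

    Ranges from near 0 (many tiny firms) to 10,000 (a single monopolist).
    """
    total = sum(market_shares_pct)
    if abs(total - 100.0) > 1e-6:
        raise ValueError("market shares should sum to 100%")
    return sum(s ** 2 for s in market_shares_pct)

# Four equal suppliers: 4 x 25^2 = 2500
print(hhi([25, 25, 25, 25]))  # 2500
# A dominant firm plus a fringe scores far higher, flagging concentration
print(hhi([70, 10, 10, 10]))  # 5200
```

Squaring the shares is what gives the index its bite: a market led by one 70% firm registers as more than twice as concentrated as one split four equal ways, which is why competition authorities lean on it when assessing mergers and, as at the seminar, patent pools.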

Panellists Enrico Bonadio (City Law School), Dan Hermele (Qualcomm) and Richard Vary (Nokia) threw in a number of further ingredients and we had a chance to debate the question whether the Dutch courts' approach to the resolution of infringement/refusal to license issues, as illustrated by the recent spat between Samsung and Apple, was the best way of encouraging the litigants to negotiate their own settlement.

IP Finance's expectations of empty seats were dashed when the number of chairs laid out for those attending proved insufficient since -- quite remarkably -- virtually every one of the statistically likely "no-shows" actually turned up, even though there were other exciting events in town on the same day.

IP Finance thanks Olswang LLP for once again providing a venue and refreshments. IP Finance also thanks Keith for all his hard work in preparing and delivering a most entertaining and informative paper: you can access his slides as a pdf file here.

Senin, 21 November 2011

IP Strategist testifies in Leveson Inquiry today

Yesterday the famous actor Hugh Grant gave evidence at the Leveson Inquiry. Today it is the turn of IP strategist Mary-Ellen Field. If you have ever wondered how IP licensing attracted the attention of the News of the World, her evidence should be enlightening. But it is more than that: it is the story of how the reputation and health of a professional businessperson and consultant were ruined by an overzealous press seeking a story about her famous client. It could happen to you.

For a previous post and some background please click here.

Jumat, 18 November 2011

Are economists good scientists?

I've had no time to post recently for several reasons, mostly the urgent need to work on a book closely related to this blog. The deadline is getting closer. I hope to resume something like my previous posting frequency soon.

But I would like to point everyone to a fascinating recent analysis of economists' opinions about the scientific method (that seems the best term for it, at least). Ole Rogeberg, a reader of this blog, alerted me to some work by himself and Hans Melberg in which they surveyed economists to see how much they looked to actual empirical tests of a theory's predictions in judging the value of a theory. The answer, it turns out, is -- not much. Internal consistency seems to be more important than empirical test.

This is so even for a theory -- the theory of "rational addiction", which seeks to explain heroin addiction and other life-destroying addictions as the consequence of fully rational choices on the part of individuals as they maximize their expected utility over their lifetimes -- which on the face of it seems highly unlikely, making the burden of empirical evidence (one would think) even higher. Some history: Gary Becker (a Nobel Prize winner) of the University of Chicago is famous for his efforts to push the neo-classical framework into every last corner of human life. He and many followers have applied the trusted old recipe of utility maximization to understand (they claim) everything from crime to patterns of having children to addiction. You may see a slobbering, shivering drunk or junkie in an alleyway in winter and think -- like most people -- there goes someone trapped in some very destructive behavioural feedback controlled by the interaction of addictive physical substances, emotions and so on. Not Becker. It's all quite rational, he argues.

Now, Rogeberg and Melberg. Here's their abstract:
This paper reports on results from a survey of views on the theory of rational addiction among academics who have contributed to this research. The topic is important because if the literature is viewed by its participants as an intellectual game, then policy makers should be aware of this so as not to derive actual policy from misleading models. A majority of the respondents believe the literature is a success story that demonstrates the power of economic reasoning. At the same time, they also believe the empirical evidence to be weak, and they disagree both on the type of evidence that would validate the theory and the policy implications. These results shed light on how many economists think about model building, evidence requirements and the policy relevance of their work.
Now, in any area of science there are disagreements over what evidence really counts as important. I've certainly learned this from following 20 years of research on high temperature superconductivity, where every new paper with "knock-down" evidence for some claim tends to be immediately countered by someone else claiming this evidence actually shows something quite different. The materials are complex, as is the physics, and so far it just doesn't seem possible to bring clarity to the subject.

But in high-Tc research, theorists are under no illusion that they understand. They readily admit that they have no good theory. The same attitude doesn't seem to have been common in economics. Rogeberg and Melberg have also described their survey work in this clearly written paper in a less technical style.

A few more choice excerpts from their (full) paper below:
The core of the causal insight claims from rational addiction research is that people behave in a certain way (i.e. exhibit addictive behavior) because they face and solve a specific type of choice problem. Yet rational addiction researchers show no interest in empirically examining the actual choice problem – the preferences, beliefs, and choice processes – of the people whose behavior they claim to be explaining. Becker has even suggested that the rational choice process occurs at some subconscious level that the acting subject is unaware of, making human introspection irrelevant and leaving us no known way to gather relevant data...

The claim of causal insight, then, involves the claim that a choice problem people neither face nor would be able to solve prescribes an optimal consumption plan no one is aware of having. The gradual implementation of this unknown plan is then claimed to be the actual explanation for why people over time smoke more than they should according to the plans they actually thought they had. To quote Bertrand Russell out of context, this ‘is one of those views which are so absurd that only very learned men could possibly adopt them’ (Russell 1995, p. 110).
On the nature of reasoning in rational addiction models (this is Nobel Prize-winning stuff, by the way):
[The addict]... looks strange because he sits down at (the first) period, surveys future income, production technologies, investment/addiction functions and consumption preferences over his lifetime to period T, maximizes the discounted value of his expected utility and decides to be an alcoholic. That’s the way he will get the greatest satisfaction out of life. (Winston 1980, p. 302)
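As a caricature of what Winston describes, the "rational addict" solves a lifetime discounted-utility problem with a habit stock that accumulates with consumption. The toy model below is this blog's invention -- the utility function, discount factor and stock dynamics are stand-ins loosely inspired by the Becker-Murphy setup, not taken from any of the papers discussed -- but it shows the planner literally "sitting down at the first period" and surveying every possible lifetime plan:

```python
from itertools import product

BETA, DELTA, T = 0.9, 0.5, 5   # discount factor, habit-stock decay, horizon (all invented)

def utility(c, stock):
    # Toy per-period utility: consuming is pleasant, but the habit stock
    # it builds up makes future consumption less rewarding ("tolerance")
    # and is unpleasant in itself ("withdrawal").
    return 2.0 * c - 1.5 * stock * c - 0.5 * stock

def lifetime_utility(plan):
    stock, total = 0.0, 0.0
    for t, c in enumerate(plan):
        total += (BETA ** t) * utility(c, stock)
        stock = (1 - DELTA) * stock + c   # habit stock accumulates, then decays
    return total

# The "rational addict" surveys every binary lifetime consumption plan
# at period 0 and commits to whichever maximizes discounted utility.
best = max(product([0, 1], repeat=T), key=lifetime_utility)
print(best, round(lifetime_utility(best), 3))
```

The point of the exercise is Rogeberg and Melberg's: nobody claims real addicts perform this computation, yet the theory's causal story requires that their behaviour be the solution to exactly such a problem.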

Kamis, 17 November 2011

Latest tax news from Malta

The Fenech Farrugia Fiott Legal newsletter, "Malta 2012 Budget -- Tax Highlights", contains some good news for IP owners:
"Copyright & IP royalty exemption

The tax exemption on royalties from qualifying patents introduced in 2010 has been extended to cover royalty income from works protected by copyright and other IP including books, film scripts, music and art".
Anne Fairpo covered Malta's tax exemption for patent royalties on IP Finance in April 2010, here.

LES offers cash awards in IP business plan competition for grad students

IP Finance doesn't know whether anything will come of the entries, but the 2012 International Graduate Student Business Plan Competition looks like a worthwhile cause. According to the information available:
"... the Licensing Executives Society Foundation, in cooperation with the Licensing Executives Society (U.S.A. and Canada) and the Licensing Executives Society International, officially kicked off registration for its 2012 International Graduate Student Business Plan Competition http://les2012.istart.org.

Again this year, LES registration is being kicked off during Global Entrepreneurship Week (GEW), an annual initiative of the Ewing Marion Kauffman Foundation designed to help people explore their potential as self-starters and innovators. ...
In response to the world’s growing reliance on innovation, the LES Foundation is working to ready the next generation of IP and licensing professionals through mentorship and educational programs, like the Competition, that build intellectual property (IP) and licensing know-how. ...

Starting today, graduate students, including MS/MBA/MD/JD/PhD and postdoctoral scholars, from across the globe are invited to register to participate in the 2012 LES Foundation Graduate Student Business Plan Competition, which uniquely focuses on business plans that include an overview of IP assets and describe how those assets will be managed and commercialized to achieve business goals.

This year, student teams will compete to win expenses-paid trips to the Final Round of Competition at the LES (USA & Canada) Spring Meeting in Boston, MA, May 15-17, where they will attend educational sessions, mingle with global IP leaders and compete for the $10,000 Grand Prize and valuable in-kind prizes or the $5,000 Global Award. Runner-up teams receive $1,000. Students receive comprehensive feedback throughout the process from IP business leaders who share valuable expertise earned in the trenches of businesses ranging from start-ups to Fortune 500 companies.

For more information on the 2012 Competition and the LES Foundation, click here".
IP Finance hopes that there will be a good response from Europe to this call for innovative creativity. If you know anyone who might be able to take advantage of this initiative, please forward this post to them as soon as possible.

Investec to fund civil litigation: what does this mean for IP?

A media release today informs IP Finance that Investec Specialist Private Bank has become the first UK bank to offer litigation funding to clients requiring specialist finance to pursue a civil claim in court. This is said to be "in response to increasing demand for innovative funding solutions from law firms and their clients". According to the media release:
"... Investec has no pre-defined lending criteria [well that's good, since most IP players have no pre-defined litigation criteria -- unless perhaps they are trolls], which means that it can provide fast decisions and competitively priced funding for commercial litigation. Each case is evaluated on its own merits and structured accordingly. The minimum funding is £250,000 [This doesn't necessarily bar loans to fund litigation in courts where costs awards are capped, though such a high minimum may tempt a potential claimant in England and Wales to opt for the more expensive Patents Court rather than the Patents County Court in order to justify the high minimum].

Jonathan Harvey, Specialised Lending, Investec Specialist Private Bank said, “The cost of litigation in the UK can be prohibitive [For many businesses the cost of borrowing is also prohibitive ...]. Many clients have strong cases but in such uncertain times are not prepared to take on the cash flow risk associated with pursuing their case. This can represent a significant opportunity cost in terms of lost revenue for the law firm and damages for the client.

“Over recent months we’ve been approached by a growing number of law firms [not IP owners or prospective defendants?] looking for alternative ways to fund litigation in the commercial sector. This is partly driven by changes in the way law firms fund their own working capital and partly by claimants’ growing need for flexible finance. Based on the success of our pilot transactions mid-year, we anticipate significant demand.”

Investec’s specialist finance team works with law firms and their clients to find innovative and flexible ways to finance their cases. The availability of litigation funding can itself be a powerful asset in bringing about a negotiated settlement, rather than going to court [but can't the same be said about the lack of availability of litigation funding?].

... The Investec professional services team was set up in response to increased demand from managing partners at law firms who are considering financial support to make structural changes to their businesses as a result of the impending introduction of the Legal Services Act 2011".
It would be good to receive readers' comments.