Government and open source: Public interest - Quantitative Valuation of Toxic Assets and "Open Data"

Bob Sutor (Dr. Robert S. Sutor), Vice President, Open Source and Linux, IBM Corporation, Somers, New York, has a great blog which I have been following for nearly a year now; he goes to the mark on many issues relevant to quantitative analytics. The blog front page is here;-

This week Bob asked a question which I have been thinking about for some time, and indeed have presented on here in Europe frequently in recent months. Here is Bob's question;-

This week I’m going to pose a series of questions in the hope of driving some discussion around the use of open source in government, as well as government involvement in open source.

1. (General warm-up question) In what ways is it in the public’s interest for a government to make the intellectual property it develops available to its citizens?
2. In the case of software, does exclusive patent licensing by a government patent holder have any advantages or disadvantages to the public interest compared with making it available for open source implementations?
3. Does a government allowing open source implementations of its software intellectual property foster potential security problems?
4. Does a government allowing open source implementations of its software intellectual property dilute any potential innovation and economic advantages to its citizens?

The Q&A is available here;-

Bob was good enough to publish my answer which is as follows;-

There is an obvious mini-case-study here in the challenge of the valuation of impaired assets for the TARP/TALF programmes in the US & the variants in the EU member states.

Whatever happens, it will be the governments who define the valuation algorithms which determine the final pricing of these assets. The governments are now the ‘counterparty of last resort’, so in effect they now have the final say on ‘fair value’. The algorithms which compute fair value, used by the government, will themselves probably be variants of those developed by academic institutions (or central banks) and published in the ‘academic’ public domain. So to your questions;-

1. It is in the public interest that these valuation algorithms, and possibly even the software which implements them, be in the public domain; arguably it is one of the most important aspects of public stewardship of the 21st century.

2. Exclusive patent rights in this context held by the governments (both US and EU) would, in my view, be of extreme detriment to the citizens: if the government develops the correct algorithms for valuation, why should they not be available to the banks so that they do it properly going forward? That is so important to the future welfare of all citizens.

3. I don’t see that issue in this case. I suppose the argument could be made, in a sort of game theory framework, that the government would always want to keep the ‘super-clever’ algorithm for itself; but even then, a second-best solution developed by a government and placed in the public domain for the banks to use would be progress from where we are today.

4. Quite the reverse: in this case, if the US Fed or Treasury, or the EU or ECB, invents an algorithm which properly analyses risk and quantifies risk capital and risk exposure, sharing that IP into the banking system would foster innovation in the macro-prudential process which was so sadly lacking prior to this credit crisis.
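To make the idea concrete, here is a purely illustrative sketch of the kind of valuation algorithm that could sit in the public domain: a toy model that fair-values an impaired (credit-risky) asset by discounting its expected cash flows under an assumed default probability and recovery rate. The model, the function name, and all parameters are my own assumptions for illustration, not any government's actual methodology;-

```python
def fair_value(cash_flows, discount_rate, default_prob, recovery_rate):
    """Toy fair value of a credit-risky asset.

    Each period's promised cash flow is weighted by the probability the
    asset has survived to that period; on default in a period, only a
    recovery fraction of that period's cash flow is received.
    """
    value = 0.0
    survival = 1.0  # probability of no default before the current period
    for t, cf in enumerate(cash_flows, start=1):
        discount = (1.0 + discount_rate) ** -t
        expected_cf = (survival * (1.0 - default_prob) * cf
                       + survival * default_prob * recovery_rate * cf)
        value += discount * expected_cf
        survival *= (1.0 - default_prob)
    return value

# A note promising 5 annual payments of 100, discounted at 3%,
# with a 10% annual default probability and 40% recovery on default.
print(round(fair_value([100] * 5, 0.03, 0.10, 0.40), 2))
```

The point is not the model itself (real impaired-asset pricing is far richer) but that if every counterparty can read, run, and challenge the same open code and assumptions, 'fair value' becomes contestable in public rather than hidden in a proprietary black box.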

My standard Value Proposition collateral which reflects this logic is available here;-

But, rather like when one is responding to an RFI, it is the questions which the customer does not ask which in the end become the most important. Here on Analytic Bridge I can add the bit I have been thinking about which Bob did not ask; it relates specifically to the quantitative pricing of "toxic assets", an issue of such importance right now;-

In regard to data sources for the fair valuation of structured products, there is no ‘silver bullet’ in that domain; in the run-up to the credit crunch it was precisely that type of belief which allowed the banks to believe they were impregnable to value depletion. In a sense there was a view in banking that they had a “secret formula”. A key lesson of the credit crisis is openness and transparency: fair valuation must be conducted on the basis of data available in the public domain, and not on the basis of proprietary data (or methods) relied upon as a panacea. Events at the ratings agencies have demonstrated the folly of that perspective. Thus I would recommend that data platforms for fair valuation hold data in the public domain (not necessarily market data), and that this be a pre-requisite of the process (as I would recommend of analytical technologies also).

Tags: asymptotix
