Tuesday, April 28, 2009

Avox partners with CUSIP/S&P

Like I said a few weeks ago, I anticipate a flurry of activity in the reference data space. OK, I had some inside information upon which to base this assertion, but the reality is: it's happening.

We are really excited about this arrangement with CUSIP. Some skeptics out there are suggesting that this is not a good partnership because of some negative press CUSIP is getting, particularly in Europe. I look at it the other way around. By partnering with Avox, CUSIP is embracing a more open approach to entity reference data. The content managed by Avox within this partnership will be consistent with that of all Avox clients which currently include the likes of Barclays, Citigroup, Nomura and Standard Bank of South Africa. It will also be consistent with entity data held by our partners including firms such as SWIFT, Interactive Data and Markit. Applying the CABRE will be quick and efficient for these companies as well as for existing CUSIP customers and partners. In essence, this partnership proves that collaboration is growing and succeeding in the reference data world.

Will the CABRE become the industry standard entity identifier? Well, frankly, that's up to the market. What do you think? If you have an opinion, we'd like to hear it and get some public discussion going on.

Ken

Future of Journalism (Alan Rusbridger) = Future of Data?

Alan gives an interesting commentary in this video about the future of journalism and how The Guardian is addressing the new world of open information. The parallels between this and data management keep ringing home for me. What do you think?

Ken

Sunday, April 26, 2009

Out with the old, in with the new.

I've been tracking Jeff Jarvis, author of "What Would Google Do?" and of the www.buzzmachine.com blog. He posted an article recently about the demise of newspapers in the form of a theoretical testimonial to Senator John Kerry's hearings. What's that got to do with data?

I find interesting parallels between the news business and our industry. Both have historically relied on IP and tight control over it. The Internet destroys control. But on the plus side, it proliferates knowledge. And on the down side, it proliferates much useless noise.

The question is, can large, blue chip businesses that rely heavily on conventional license revenue streams adapt quickly enough to this new regime of transparency to survive and prosper? A number of news organizations have found this a major challenge.

It's an exciting time to test these ideas, as firms in the financial services space are forced to seriously question everything they do and how they do it. Might there be a more efficient way to improve data quality? I suppose you know my opinion on that already...

Ken

Thursday, April 23, 2009

Wiki-data beta is live!

OK, back in business. Looking for feedback from those of you who have an interest.

Ken

--------------------------------------
Comment on note below: As expected, we've already received some great feedback which means we've got to do some revamping to make this content more useful for you. We'll get back up and running asap.

Best Regards,

Ken

-------------------------------------
310,000 verified, maintained business entity data records. Not a bad starting point. Some partners will soon be coming along with authoritative identifiers, documentation and linkages to securities.

It's happening folks.

Ken

Tuesday, April 21, 2009

Avox/Markit

We are tremendously excited about this new partnership with Markit. The efficiency gains available to mutual clients are substantial, without even considering the amount of risk that can be driven out by ensuring consistent data and documentation.

The folks at Markit are very customer focused as are we so please let us know if you have any suggestions for improvement or enhancement of our joint service.

Ken

Saturday, April 11, 2009

Is the industry finally getting "collaboration"?!

In the past 3 months, I've seen more concerted effort amongst the vendor community to work together than I have in my 15 years in this industry. Here's a prediction: before the end of Q2 2009, we are going to see a series of announcements and real progress toward global vendor collaboration in the business entity space.

We need to move faster in this industry if we want to remain relevant. That applies to all members of the food chain - data vendors, technology providers, consulting firms, regulators, utilities AND users. Business conditions are ripe for change that may initially be perceived as risky but which is already proving to deliver significantly higher value than has been achievable in the past.

I'm looking forward to some discussion on this topic.

WikiData

We've been speaking with a lot of firms in the industry over the past year about the trade-off between data quality, coverage, timeliness and cost. A challenge faced by data managers, particularly in the business entity space, is that a central data repository can feed many different groups with distinctly different requirements. Credit risk, for example, will have an extremely low tolerance for latency or errors but may not need massive volumes of entities - just those to which their firm has exposure. Marketing, on the other hand, may place a higher priority on a huge database of potential corporate customers which they can target for campaigns. In this case quality, while important, does not have the same value as it does for risk.

So the question is how does one balance these requirements while leveraging a central data repository? Six sigma level data quality across an entire CRM database would cost far too much. Incomplete data population for marketing initiatives significantly compromises the effectiveness of a campaign.

What about all of us sharing, for free, very basic data about entities? Say, legal name, country and region (where necessary) of incorporation, and perhaps a few other bits as agreed by the community. We can agree a mechanism that ensures contributor identities are not disclosed unless they choose otherwise. All contributors have the capability to check, update and comment on data, very much like a Wikipedia model.
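To make the idea concrete, here is a minimal sketch of what one such shared record and its wiki-style update mechanism might look like. The field names, the `EntityRecord` class and the `update` method are all illustrative assumptions on my part, not a proposed standard:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class EntityRecord:
    """One shared, community-maintained business entity record.
    Only the agreed 'basic' fields live here; richer content is left
    to downstream verification and certification services."""
    legal_name: str
    country_of_incorporation: str                   # e.g. ISO 3166-1 alpha-2 code
    region_of_incorporation: Optional[str] = None   # only where necessary (e.g. US states)
    history: list = field(default_factory=list)     # wiki-style audit trail

    def update(self, field_name: str, new_value: str, contributor_id: str,
               disclose_contributor: bool = False) -> None:
        """Apply a community edit, logging old and new values.
        Contributor identity stays hidden unless they opt in."""
        old_value = getattr(self, field_name)
        setattr(self, field_name, new_value)
        self.history.append({
            "field": field_name,
            "old": old_value,
            "new": new_value,
            "contributor": contributor_id if disclose_contributor else "anonymous",
        })

# A contributor corrects a legal name without disclosing who they are.
record = EntityRecord(legal_name="Acme Holdings Ltd", country_of_incorporation="GB")
record.update("legal_name", "Acme Holdings plc", contributor_id="user42")
```

Because every change lands in the audit trail, the community can spot and revert bad edits the same way Wikipedia editors do.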

This data then serves as a platform for a financial institution's internal team and/or a third party to perform additional verification and certification. Over time, the shared and free data asset becomes more comprehensive and reliable. Firms like mine will continue to provide verification services; however, we will need to continue to enhance our service and content offerings in order to increase our value proposition to our market over time. And most importantly, a free and open foundation of basic data will enable anyone on the internet to identify errors. Is there a chance that some contributors may corrupt the data by accident or intentionally? Absolutely. However, similar models like Wikipedia's are now proving that such abuse is rapidly identified and corrected by well-intentioned community members.

Financial institutions have consistently asked for a free data utility to help address business entity data quality and identification issues. Which of you is ready to proactively participate in such an initiative? It won't happen without your involvement.

I look forward to your comments, criticisms and ideas for improvement.

Ken