Role of standardisation in the DRM field

Leonardo Chiariglione, Telecom Italia Lab, Torino

Little or great, there are always turning points in history. Among the great ones we have Hannibal’s decision not to besiege Rome. Another was the kamikaze, the divine wind that blew away Kublai Khan’s fleet as it was about to invade Japan. Some say that Pickett’s failed charge at Gettysburg was also a turning point in history. Years, decades, maybe even centuries after the turning point, people keep on debating how history was shaped by events that turned one way and how history could have gone if events had turned another way.

We live in one of those epochal moments that should spawn a turning point. There is a war but, to be clear, it is not being fought in the Middle East. It is a war that has been going on for several years – fortunately without bloodshed – in cyberspace. The problem is that there is no sign that there will ever be a turning point. Our generation, our children and grandchildren will look back and shake their heads, lamenting the missed opportunities that our inability to make things move forward has cost mankind.

We have great technologies, great products, and great ideas of how consumers will be served by those technologies and products. But ideas keep on being ideas. The result is that we live in an age of stagnation where opposing forces battle on many fronts and all forces are strong enough to neutralise the others.

The enlightened ones say: Digital Rights Management (DRM) is the solution. I could not agree more with them, but I beg to differ on one point. I do not understand what problem it is that we want to solve using DRM. Let me give an example. The DRM Group of CEN/ISSS has done a very good job bringing together many market players, giving them an opportunity to make known their "views of the world". I am told, however, that the group attempted to agree on a definition of DRM, and failed.

I am not going to propose a definition myself. I will simply refer to the one used by the US National Institute of Standards and Technology (NIST) to make a point. NIST’s definition is: "Digital Rights Management is a system of information technology (IT) components and services along with corresponding law, policies and business models which strive to distribute and control intellectual property (IP) and its rights".

Whatever the level of agreement that this definition can achieve, one point is clear: it is law, policies and business models that drive DRM. The fact that DRM uses IT components and services is a mere technicality. The problem with DRM is not the technology – there are excellent examples of solutions – but the forces that drive it.

The reason that things are stagnating is shown by the many cases of DRM "standards" developed as the codification of technologies suited to a particular business model. Yes, I am aware that this is the way that industries have traditionally dealt with their technology needs when a new phase of their business has emerged. Unfortunately DRM, whatever its definition, is different.

DRM is not a piece of technology loaded onto an end-user device or a service offered through a server. It is a pervasive technology that has to extend across the entire value network if it is to perform its function. It cannot work on just a portion of it. Every industry would like to forge its own alloy of law, policies, business models and technology into the invincible sword that will beat the barbarians at the gates. This, however, is not possible in this case, because no single industry can control the entire value network. There will always be a need for other companies or industries, with conflicting interests in other areas. Under these conditions, DRM will remain an acronym.

It is time for policy to play a role. When I say this, some of my listeners translate my words to mean: we need regulation. Well, no, we don’t need regulation. On the other hand, yes, we need regulation. But not now, because first we must understand the societal goals that we want to achieve and lay down the neutral rules that market players will have to abide by. Incidentally this will provide a simple answer to the question: "What is the definition of DRM?" The answer is easy: "DRM is the technology that solves the problem".

The next step is working out the technical means – standards – that can be used to achieve those goals. Only at the end may we have to design the necessary regulatory measures. This, of course, assumes we find that, unless certain areas are regulated, technology left in the hands of market players and citizens cannot lead us to attain the goals that we want to achieve.

Speaking of standards I would like to issue a word of caution. As I have been working in this space for a good part of my professional life and, quite intensely, in the last 15 years, you might think that my purpose in life is producing standards that regulators would then convert into legislation. That is not my intention. I happen to have a very precise definition of standardisation that guides me: "Standardisation is the process by which individuals recognise the advantage of all doing certain things in an agreed way and codify that agreement". I personally do not think that law should impose standards. On the other hand I can see that they could be useful tools in the hands of regulators.

Let me give an example. DVD is a fully interoperable system and its success has surpassed all previous successes in the history of consumer electronics. Conversely, pay TV by satellite is a non-interoperable system, and its problems are known. While I strongly believe public authorities should not impose the use of interoperable technologies on market players, I even more strongly believe that the existence of standardised interoperable technologies can be a powerful weapon in the hands of public authorities when market players, after years of unwise choices, come to them with a statement of problems.

Being a technologist, I don’t know if the lofty societal goals I was talking about before have been agreed and already spelled out. I personally thought that there was no reason to wait because there are technologies that are going to be needed anyway and development of technology takes time.

For two years, starting in 1999, I was involved in an initiative that was a great personal – although difficult – experience that taught me a lot. That experience made me realise the extent of complexities I mentioned before and the need to include all actors in the value network in the effort if we are to get anywhere.

The result has been the ISO standardisation project called MPEG-21. It is not my intention to describe the project in detail but only show that technologies developed in a neutral fashion can provide the technology foundation to build interoperable DRM.

The goal of the MPEG-21 project is to enable electronic commerce of Digital Items (DI). Before you stop me to ask what a DI is, I will tell you that a DI is the unit of transaction between Users. An example of a DI is a music compilation, complete with MP3 files, metadata, all sorts of related links, etc. By "User", I mean all entities that act on the value network, i.e. creators, market players, regulators and consumers.

To realise the goal we need technologies, but which? I have just said that DRM is a set of technologies that span the entire value network, with manifold impacts on business and policy. Are we going to develop DRM technologies serving the needs of a specific industry segment? No way. The boundary conditions demand that technologies be generic. So, let’s have a short look at them.

The first component needed is a standard way of "declaring" DIs. Called "Digital Item Declaration" (DID), this standard has the purpose of defining multimedia content in terms of its components and structure (i.e. resources and metadata). DIDs are expressed in XML, the IT "lingua franca".
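To make the idea of a declaration concrete, here is a minimal sketch in Python that assembles a DID-like XML document for the music-compilation example mentioned earlier. It is loosely modelled on the DIDL vocabulary (Item, Descriptor, Component, Resource), but the element names, attributes and file names used here are illustrative only, not the normative MPEG-21 schema.

```python
import xml.etree.ElementTree as ET

# Build a simplified, DIDL-style declaration of a Digital Item.
# All names below are illustrative, not normative.
didl = ET.Element("DIDL")
item = ET.SubElement(didl, "Item")

# Metadata describing the compilation as a whole
descriptor = ET.SubElement(item, "Descriptor")
statement = ET.SubElement(descriptor, "Statement")
statement.text = "Summer Hits 2003 - a music compilation"

# Each track is a Component pointing at a Resource (e.g. an MP3 file)
for track in ["track01.mp3", "track02.mp3"]:
    component = ET.SubElement(item, "Component")
    ET.SubElement(component, "Resource",
                  {"ref": track, "mimeType": "audio/mpeg"})

xml_text = ET.tostring(didl, encoding="unicode")
print(xml_text)
```

The point of the sketch is simply that a DI is declared as structure (components) plus description (metadata), all in XML, so that any party on the value network can parse it with generic tools.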

For any transaction we need a means to identify the object of the transaction. That is why we need a standard to uniquely identify DIs. Called "Digital Item Identification" (DII), the standard plays very much the same role as ISBN does for books and ISSN for periodicals. At the last MPEG meeting we recommended, from a number of excellent candidates, that CISAC, the International Confederation of Authors and Composers Societies, be appointed as the Registration Authority for organisations that intend to act as assigners of DII.

Getting a number identifying a DI is important, but how are we going to put a "sticker" on it? This is where Persistent Association Technologies come in. The Secure Digital Music Initiative (SDMI) struggled with the selection of very advanced "Phase I" and "Phase II" screening technologies and its task was made harder by the fact that no established methods exist to assess the performance of these technologies. That is why we are developing another part of the standard called "Evaluation Methods for Persistent Association Technologies". This is not meant to be a "prescriptive" (normative) standard but more like "best practice" for those who need to assess the performance of watermarking and similar technologies.
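To illustrate what such an evaluation might measure, here is a toy sketch in Python. It is emphatically not a real watermarking scheme: it embeds payload bits by nudging sample values up or down, applies a noise "attack", and then measures the bit error rate of the recovered payload – the kind of robustness metric a "best practice" evaluation method would standardise.

```python
import random

def embed(samples, bits, strength=4):
    """Toy embedding: nudge each sample up or down by one payload bit.
    Purely illustrative, not a real persistent-association scheme."""
    return [s + (strength if b else -strength)
            for s, b in zip(samples, bits)]

def detect(original, marked):
    """Recover bits by comparing marked samples with the original."""
    return [1 if m > o else 0 for o, m in zip(original, marked)]

def bit_error_rate(bits, recovered):
    errors = sum(b != r for b, r in zip(bits, recovered))
    return errors / len(bits)

random.seed(0)
samples = [random.gauss(0, 50) for _ in range(1000)]
payload = [random.randint(0, 1) for _ in range(1000)]
marked = embed(samples, payload)

# The "attack": additive noise, as an evaluation method might prescribe
attacked = [m + random.gauss(0, 2) for m in marked]
ber = bit_error_rate(payload, detect(samples, attacked))
print(f"bit error rate after noise: {ber:.3f}")
```

A standardised evaluation method would define the attacks (noise, compression, cropping, etc.) and the metrics, so that claims made for competing technologies become comparable.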

The next step is a reference architecture of Intellectual Property Management and Protection (IPMP) to manage and protect DIs. This part is still under development and active participation from all players is needed so as to make the architecture truly generic, without any bias towards a particular way of trading DIs.

Even in the physical world we seldom have absolute rights to an object. This will be all the more true in the virtual world, where the disembodiment of content from its carrier increases the flexibility with which business can be conducted. That is why we have a "Rights Expression Language" (REL), so that rights to a Digital Item can be expressed in a way that a computer can interpret.

A right exists to perform actions on something. Today we use such verbs as "display", "print", "copy" or "store", and we humans think we know what we mean. But computers must be taught the meaning. This is why we need a "Rights Data Dictionary" (RDD) that gives the precise semantics of all the verbs that are used in the REL.
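The interplay of REL and RDD can be sketched as follows. Real MPEG-21 licenses are XML documents and the RDD is far richer; the verbs, item identifier and condition names below are invented purely for illustration. The point is that the dictionary fixes the semantics of the verbs, while the license grants (the expression) attach conditions to them.

```python
# Hypothetical sketch: a tiny rights "dictionary" and license checker.
# All names here are invented for illustration, not MPEG-21 normative.

RIGHTS_DATA_DICTIONARY = {
    "play":  "render the resource in perceivable form",
    "print": "produce a fixed, hard-copy rendition",
    "adapt": "produce a transformed rendition",
}

# A license expression: grants (verbs plus conditions) for one item
license_grants = {
    "di:compilation-42": {"play": {"max_count": 3}, "print": {}},
}

def is_permitted(item_id, verb, play_count=0):
    """Check a requested act against the grants for an item."""
    if verb not in RIGHTS_DATA_DICTIONARY:
        raise ValueError(f"verb '{verb}' has no agreed semantics")
    conditions = license_grants.get(item_id, {}).get(verb)
    if conditions is None:
        return False  # no grant covers this verb
    limit = conditions.get("max_count")
    return limit is None or play_count < limit

print(is_permitted("di:compilation-42", "play", play_count=1))  # True
print(is_permitted("di:compilation-42", "adapt"))               # False
```

Note that the checker refuses verbs absent from the dictionary: without agreed semantics, a grant is meaningless to a machine, which is exactly why REL and RDD come as a pair.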

Information and Communication Technologies (ICT) let people do more than just find new ways of doing old business. Content and service providers used to know their customers very well. They used to know – even control – the means through which their content was going to be delivered. Consumers used to know the meaning of well-classified services such as television, movies and music. Today we have fewer and fewer such certainties: end users are more unpredictable than ever; the same piece of content can reach them through a variety of delivery systems and can be enjoyed on a plethora of widely differing consumption devices. How can we cope with this unpredictability of end-user features, delivery systems and consumption devices? This is where "Digital Item Adaptation" (DIA) comes to help, providing the means to describe how a DI should be adapted (i.e. transformed) so that it best matches the specific features of the User, the Network and the Device.
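A minimal sketch of the adaptation idea: given several variants of the same resource, pick the richest one the usage environment can handle. The variant list, field names and device profile below are invented for illustration; DIA itself standardises the descriptions of the environment, not any particular selection algorithm.

```python
# Hypothetical sketch of Digital Item Adaptation: choose, from several
# variants of one resource, the best fit for device and network.
# All field names and values are illustrative.

variants = [
    {"uri": "movie_hi.mpg",  "width": 1280, "bitrate_kbps": 6000},
    {"uri": "movie_mid.mpg", "width": 640,  "bitrate_kbps": 1500},
    {"uri": "movie_lo.3gp",  "width": 320,  "bitrate_kbps": 300},
]

def adapt(variants, environment):
    """Pick the richest variant the environment can handle."""
    usable = [v for v in variants
              if v["width"] <= environment["screen_width"]
              and v["bitrate_kbps"] <= environment["bandwidth_kbps"]]
    if not usable:
        return None  # a real adapter might transcode instead
    return max(usable, key=lambda v: v["bitrate_kbps"])

phone = {"screen_width": 480, "bandwidth_kbps": 500}
print(adapt(variants, phone)["uri"])  # movie_lo.3gp
```

The essential point is that the DI stays one logical item; what varies is the rendition delivered, driven by standardised descriptions of User, Network and Device.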

I will complete the current list of basic technologies by mentioning "Event Reporting" (ER), whose purpose is to provide metrics and interfaces for reporting events; "File Format" (FF), which provides a standard way to store and transmit DIs; and "Digital Item Processing" (DIP), whose purpose is to provide the means to "play" a DI.

What I have explained above is not a declaration of intentions; it is the result of coordinated work involving hundreds of individuals that has lasted 3.5 years. At the moment we have two standards (DID and DII) that have achieved International Standard (IS) status. Two more (REL and RDD) will reach that status in July 2003. Another (DIA) will do so in December 2003. More will follow in 2004.

Normally my talk should end here. In this case, however, I would like to take this opportunity to raise an issue that is highly related to the subject of this talk.

As a rule, communication standards relate to interfaces and protocols, not to device internals. But DRM is different. To be trusted, a device must have internal security elements. I am aware of attempts being made to impose such security elements by legislation on any computing device – and by this we mean all future communication devices. The protection of valuable content can be achieved through successful DRM standards that have user-perceived interoperability as their main goal, not through piecemeal legislative interventions that run the risk of bringing to a standstill the continuing evolution of one of the technologies that hold the best promise for mankind in the third millennium.

Let me conclude with an observation and a call. Zen philosophy uses the metaphor of a finger pointing to the moon: although the finger points to the moon, the finger and the moon belong to two different worlds. By concentrating on the finger, one loses clarity of vision of the moon, and vice versa. MPEG-21 can provide the generic tools that will let society reach the moon, but it is not the moon. If we want to reach the goal we need an attentive scrutiny of the technologies that are being provided.

I call on Users, in the MPEG-21 sense, to check that what is being provided serves the needs of mankind in this epochal transition.