ICT Standards and Business Models

Leonardo Chiariglione – CEDEO.net

Probably the first invention leading to a standard fitting the current perception of an “Information and Communication Technology (ICT) standard” is the Morse code. This standard, later modified to become one of the first standards of the International Telecommunication Union (ITU), was instrumental to the inventor’s goal of providing an effective means of transmitting characters across space [1]. Morse was awarded – belatedly – 400,000 Francs by several European governments for the use of his patent in their telegraphy systems.
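
To make the idea concrete, here is a minimal illustrative sketch – in Python, and not part of the original article – of the kind of codified character-to-signal mapping that the Morse code established. The table is deliberately partial and the encoder is only a toy.

# Illustrative only: a partial International Morse Code table and a toy encoder.
# The full standard covers all letters, digits and punctuation; only a handful
# of letters are shown here.
MORSE = {
    "A": ".-", "E": ".", "I": "..", "M": "--",
    "N": "-.", "O": "---", "S": "...", "T": "-",
}

def encode(text):
    # Letters are separated by single spaces, words by " / ".
    # Characters outside the partial table are silently skipped.
    words = text.upper().split()
    return " / ".join(
        " ".join(MORSE[c] for c in word if c in MORSE) for word in words
    )

print(encode("SOS"))        # ... --- ...
print(encode("ME AT TEN"))  # -- . / .- - / - . -.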

Many ICT standards followed Morse’s to support the fast-evolving and metamorphosing new communication world.

Telecommunications were fostered by public authorities, who considered the provision of means of communication to their citizens to be their prerogative. Issuing standards – designed to provide interoperability to their users – was part of their mission. Sometimes it was difficult to find the border between a standard and a law.

The roots of what is today called Consumer Electronics (CE) can be found in T. A. Edison’s phonograph (and, with today’s eyes, in the original photographic camera half a century before). This industry developed without government intervention and was driven by the idea that standards were “good for business”. The fact that patents might be required to implement a standard was no deterrent; rather, it was considered a reward for those who had made the innovation.

Television receivers were manufactured by the CE industry, but broadcasting standards were firmly in the hands of public authorities, and any relevant patents in the hands of a few national champions.

The Information Technology (IT) industry developed in quite a different direction from the Telecommunication and CE industries. Originally very few IT standards existed, as every industry player competed with every other on every single aspect of the business. This did not mean that “standards” did not exist: IBM used to have one department for “internal” standards and one for “external” standards. Patents, it should be clear, were plentiful.

We have already mentioned the ITU as the body taking care of standards for the Telecommunication industry. As a result of a process started in the most industrially advanced countries, the International Electrotechnical Commission (IEC) was established at the beginning of the 20th century to cater to what could today be described as “CE industry needs”. Immediately after World War II the other industries revived the failed pre-war attempt at creating a third international standards body, the International Organisation for Standardisation (ISO).

The three organisations – and a host of other Standards Development Organisations (SDO) at all levels – agree that standards may legitimately include patented technology, provided that a user of a standard issued by one of those organisations can access the relevant Intellectual Property (IP) on Fair, Reasonable and Non-Discriminatory (FRAND) terms.

The need of the US Department of Defense for a powerful “computer communication” standard gave rise to a huge project that eventually produced a “transport layer” standard not just for computers but for telecommunication networks in general. Over several decades the seemingly bottomless availability of funds attracted the best minds to the project, and they not only developed a technically excellent solution but framed it in a specific philosophical set-up: essentially, that the foundational layer of digital communication should be accessible to all mankind. In other words, no patented technology should be accepted in a Request for Comments (RFC), as the project called its standards. The World Wide Web Consortium (W3C), the standards organisation for things “web”, tries to continue that tradition by making its Recommendations (as it, like the ITU, calls its standards) royalty free.

It is important to note, however, that in order to run the Internet Protocol (IP) over a physical network you are probably going to need some patented technology…

The Moving Picture Experts Group (MPEG) is the body catering to the convergence of media technologies in the digital world. In spite of its vision of creating the foundational layer of digital media communication, it had to accept patents in its standards, both because the Telecommunication and CE industries had been patenting digital media technologies for ages and to make sure that private money – not Uncle Sam – would feed innovation.

A body parallel to MPEG – the Joint Photographic Experts Group (JPEG) – has succeeded in defining two layers in two of its standards (JPEG and JPEG2000): a baseline that is royalty free and a better-performing layer that is not. The Motion Picture Industry has selected the royalty-free JPEG2000 standard for digital movie distribution instead of other, better-performing standards. Some have remarked that an industry that values its own IP should probably also value other industries’ IP.

The IT world has generated a peculiar form of standardisation. Some programs whose role is indeed foundational in a layered architecture have become standards either because users have decreed so with their wallets or because a group of idealists – or, sometimes, a different type of entrepreneur – has developed excellent code doing the right job as Open Source Software (OSS).

As has been noted elsewhere [2], it is remarkable that, quite independently of the OSS environment, MPEG has for the past 15 years developed its “reference software” with a process that has many points in common with the OSS process. One form of the MPEG reference software licence acknowledges the possible existence of patents required to exercise the code. Although something is changing in the OSS world in this regard, we are not yet ready for a convergence of the different approaches.

Before closing, let’s try a definition of “standard”. Webster’s defines a standard as “a conspicuous object (as a banner) formerly carried at the top of a pole and used to mark a rallying point especially in battle or to serve as an emblem”, or as “something set up and established by authority as a rule for the measure of quantity, weight, extent, value, or quality”. Although these definitions are good enough, over the years I have found that the following one does a better job: “Codified agreement between parties who recognise the advantage of all doing certain things in a given way”.

Some features that some claim should be required of a standard depend on the business models of the beholders and, if there is one thing we should not standardise, it is business models.

[1] Samuel F. B. Morse, “Improvement in the mode of communicating information by signals by application of electro-magnetism”, US patent granted 1840/06/20

[2] L. Chiariglione, “Open source in MPEG”, Linux Journal, 2001/03

[3] http://itscj.ipsj.or.jp/sc29/29w7scld.htm