Robustness principle

In computing, the robustness principle is a general design guideline for software:

Be conservative in what you send; be liberal in what you accept.

The principle is also known as Postel's Law, after Internet pioneer Jon Postel, who wrote in an early specification of the Transmission Control Protocol that:[1]

TCP implementations should follow a general principle of robustness: be conservative in what you do, be liberal in what you accept from others.

In other words, code that sends commands or data to other machines (or to other programs on the same machine) should conform completely to the specifications, but code that receives input should accept non-conformant input as long as the meaning is clear.
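
The split can be illustrated with a minimal sketch (assuming a hypothetical HTTP-style "Name: value" header line; the function names and the specific tolerances are illustrative, not drawn from any cited RFC):

    def send_header(name: str, value: str) -> bytes:
        """Conservative sender: emit the exact canonical form, CRLF-terminated."""
        return f"{name}: {value}\r\n".encode("ascii")

    def parse_header(line: bytes) -> tuple[str, str]:
        """Liberal receiver: tolerate LF-only endings, odd case, stray spaces."""
        text = line.decode("ascii").rstrip("\r\n")
        name, sep, value = text.partition(":")
        if not sep:
            # No colon at all: the meaning is not clear, so reject.
            raise ValueError("malformed header line: no colon")
        return name.strip().lower(), value.strip()

    # The receiver accepts input the sender would never produce:
    assert parse_header(b"Content-Length :  42 \n") == ("content-length", "42")

Here the sender always produces one canonical byte sequence, while the receiver normalizes harmless deviations and rejects input only when its meaning is genuinely ambiguous.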

Interpretation

RFC 1122 (1989) expanded on Postel's principle by recommending that programmers "assume that the network is filled with malevolent entities that will send in packets designed to have the worst possible effect".[2] Protocols should allow new codes to be added to existing fields in future versions by accepting, and possibly logging, messages with unknown codes. Programmers should avoid sending messages with "legal but obscure protocol features" that might expose deficiencies in receivers, and should design their code "not just to survive other misbehaving hosts, but also to cooperate to limit the amount of disruption such hosts can cause to the shared communication facility".
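
One common reading of the unknown-code advice is sketched below (the code table, message layout, and handler are invented for illustration):

    import logging

    # Hypothetical code registry; a real protocol defines its own.
    KNOWN_CODES = {0: "echo-reply", 8: "echo-request"}

    def handle_message(code: int, payload: bytes) -> None:
        kind = KNOWN_CODES.get(code)
        if kind is None:
            # Unknown code: log it for operators, but neither crash nor
            # reset the connection; a later protocol version may define it.
            logging.info("ignoring message with unknown code %d", code)
            return
        logging.debug("handling %s message (%d bytes)", kind, len(payload))

The receiver thus stays forward-compatible: codes added in later protocol versions pass through harmlessly instead of breaking older deployments.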

In RFC 3117, Marshall Rose characterized several deployment problems that arise when Postel's principle is applied in the design of a new application protocol.[3] For example, a defective implementation that sends non-conforming messages might be used only with implementations that tolerate those deviations from the specification until, possibly several years later, it is connected with a less tolerant application that rejects its messages. In such a situation, identifying the problem is often difficult, and deploying a solution can be costly. Rose therefore recommended "explicit consistency checks in a protocol ... even if they impose implementation overhead".
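
A sketch of such an explicit consistency check (the length-prefixed frame format here is invented for illustration, not taken from RFC 3117):

    import struct

    def decode_frame(frame: bytes) -> bytes:
        if len(frame) < 4:
            raise ValueError("frame shorter than its 4-byte length prefix")
        (declared,) = struct.unpack("!I", frame[:4])
        payload = frame[4:]
        if declared != len(payload):
            # Explicit consistency check: fail loudly so a defective sender
            # is caught at first contact rather than years into deployment.
            raise ValueError(
                f"length field says {declared}, payload is {len(payload)} bytes"
            )
        return payload

Rejecting the inconsistent frame immediately costs one comparison per message, but it surfaces the defective sender while the problem is still cheap to diagnose.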

References

  1. ^ RFC 761: Transmission Control Protocol. Jon Postel (ed.), January 1980.
  2. ^ RFC 1122: Requirements for Internet Hosts — Communication Layers. Robert Braden (ed.), October 1989.
  3. ^ RFC 3117: On the Design of Application Protocols. Marshall Rose, November 2001.
