Subject: Re: Security concerns with Common Lisp and EVAL
From: rpw3@rpw3.org (Rob Warnock)
Date: Sun, 28 Sep 2008 22:50:41 -0500
Newsgroups: comp.lang.lisp
Message-ID: <spadndcUY_8MzX3VnZ2dnUVZ_ovinZ2d@speakeasy.net>
verec <verec@mac.com> wrote:
+---------------
| "John Thingstad" <jpthing@online.no> said:
| > Well the golden rule for Internet communication is to be
| > generous in what you accept and strict in what you produce.
| 
| I'm not sure how "golden" that rule is...
+---------------

As John says, it is a *very* well-known Internet design principle,
called the "Robustness Principle" (or less often, "Postel's Law"):

    http://www.postel.org/postel.html
    http://en.wikipedia.org/wiki/Robustness_Principle

It was articulated by Internet RFC Editor Jon Postel *at least*
as far back as January 1980 in RFC 760 "Internet Protocol" (IP):

    http://www.ietf.org/rfc/rfc760.txt
    ...
    3.2. Discussion

      The implementation of a protocol must be robust.  Each implementation
      must expect to interoperate with others created by different
      individuals.  While the goal of this specification is to be
      explicit about the protocol there is the possibility of differing
      interpretations.  In general, an implementation should be conservative
      in its sending behavior, and liberal in its receiving behavior.
      That is, it should be careful to send well-formed datagrams, but
      should accept any datagram that it can interpret (e.g., not object
      to technical errors where the meaning is still clear).
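
To make the principle concrete (a minimal sketch in Lisp, since we're
on comp.lang.lisp; FORMAT-HEADER and PARSE-HEADER are names I just made
up, not anything from the RFCs): the sender emits only the canonical
"Key: Value" form, while the receiver tolerates harmless variations
such as case differences and stray spaces, yet still rejects lines
whose meaning is *not* clear:

    ;;; Conservative sender: emit only the canonical "Key: Value" form.
    (defun format-header (key value)
      (format nil "~:(~A~): ~A" key value))

    ;;; Liberal receiver: tolerate case differences and stray spaces
    ;;; around the colon, but reject lines with no discernible meaning.
    (defun parse-header (line)
      (let ((colon (position #\: line)))
        (unless colon
          (error "Malformed header line: ~S" line))
        (values (string-downcase (string-trim " " (subseq line 0 colon)))
                (string-trim " " (subseq line (1+ colon))))))

    ;; (format-header "content-length" 42)  => "Content-Length: 42"
    ;; (parse-header "content-LENGTH : 42") => "content-length", "42"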

By 1981, in RFC 793 "Transmission Control Protocol" (TCP), it had become
more stylized and succinct:

    http://www.ietf.org/rfc/rfc793.txt
    ...
    2.10. Robustness Principle

      TCP implementations will follow a general principle of robustness:
      be conservative in what you do, be liberal in what you accept from
      others.

This is actually *more* generous than the Golden Rule [think it through],
and more robust than the more commonly quoted version, which refers only
to protocol exchanges, as seen, say, in 1989's RFC 1122 "Requirements
for Internet Hosts -- Communication Layers" (Bob Braden, ed.):

    http://www.ietf.org/rfc/rfc1122.txt
    ...
    1.2.2  Robustness Principle

      At every layer of the protocols, there is a general rule whose
      application can lead to enormous benefits in robustness and
      interoperability [IP:1]:

            "Be liberal in what you accept,
            and conservative in what you send."

This last version (or the same with the clauses swapped) is the one
most people would recognize.

+---------------
| ...but it surely sounds like a call for general non-compliance,
| "anything goes", "the market will sort it out" amd other utter
| non sense we've been all spoon fed all those years.
| 
| How do you think evolution works? Rejection of the non-fitting.
| "Do not accept what doesn't follow the rules" would make a tad
| more sense, IMHO :-)
+---------------

Well, the Internet Robustness Principle served the industry *very* well
for quite a number of years, but you do have a point, one which is
shared by others such as Marshall Rose [well known for his work in
getting several ISO application protocols to run on top of IP/TCP].
As he noted in 2001 in RFC 3117 "On the Design of Application Protocols":

    http://www.ietf.org/rfc/rfc3117.txt
    ...
    4.5 Robustness
    ...
      Counter-intuitively, Postel's robustness principle ("be conservative
      in what you send, liberal in what you accept") often leads to
      deployment problems.  Why? When a new implementation is initially
      fielded, it is likely that it will encounter only a subset of
      existing implementations.  If those implementations follow the
      robustness principle, then errors in the new implementation will
      likely go undetected.  The new implementation then sees some, but
      not widespread deployment.  This process repeats for several new
      implementations.  Eventually, the not-quite-correct implementations
      run into other implementations that are less liberal than the initial
      set of implementations.  The reader should be able to figure out what
      happens next.

      Accordingly, explicit consistency checks in a protocol are very
      useful, even if they impose implementation overhead.
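
Rose's closing remark about explicit consistency checks is easy to
illustrate (a toy sketch of my own, not anything from RFC 3117): give
each frame a length field and a checksum, and a strict receiver can
detect a not-quite-correct sender the *first* time they talk, instead
of years later:

    ;;; Toy frame: a length octet, LENGTH payload octets, then a
    ;;; one-octet checksum (sum of the payload mod 256).  The receiver
    ;;; verifies both fields and rejects inconsistency outright.
    (defun check-frame (octets)
      (let ((len (aref octets 0)))
        (unless (= (length octets) (+ len 2))
          (error "Length field ~D inconsistent with frame size ~D"
                 len (length octets)))
        (let ((payload (subseq octets 1 (1+ len))))
          (unless (= (aref octets (1+ len))
                     (mod (reduce #'+ payload) 256))
            (error "Checksum mismatch"))
          payload)))

    ;; (check-frame #(3 10 20 30 60)) => #(10 20 30)
    ;; (check-frame #(3 10 20 30 61)) => error, caught immediately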

An even stronger argument has been made that the Robustness Principle
*encouraged* Microsoft's "embrace/extend/destroy" strategy when it came
to Web protocols, especially HTML & "the browser wars", leaving others
bending over backwards to be "liberal" enough to "accept" the non-standard
(and in many cases flat-out *broken*!) HTML produced by Windows-based
web page generation tools. Had other browsers simply refused to accept
HTML with malformed nesting of tags (say), at least some of the resulting
male bovine excrement could have been avoided.
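
A strict check along those lines is trivial (a deliberately simplified
sketch, with a made-up token representation rather than a real HTML
parser): properly nested tags need nothing more than a stack, and a
checker like this would have rejected <B><I>...</B></I> markup on
the spot:

    ;;; Strict nesting check over a pre-tokenized tag stream: an opening
    ;;; tag is a keyword like :B, a closing tag is a list like (:CLOSE :B).
    ;;; Real HTML is messier; the point is just the stack discipline.
    (defun check-nesting (tokens)
      (let ((stack '()))
        (dolist (tok tokens)
          (if (and (consp tok) (eq (first tok) :close))
              (unless (eq (pop stack) (second tok))
                (error "Mismatched closing tag: ~S" tok))
              (push tok stack)))
        (when stack
          (error "Unclosed tags: ~S" (nreverse stack)))
        t))

    ;; (check-nesting '(:b :i (:close :i) (:close :b))) => T
    ;; (check-nesting '(:b :i (:close :b) (:close :i))) => error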

But despite all that, the Robustness Principle still *is* thought
by many (and probably most) network engineers to be a good idea,
when not stretched beyond reasonable limits. Read the above-referenced
Wikipedia page for a more extensive discussion of the tradeoffs.


-Rob

-----
Rob Warnock			<rpw3@rpw3.org>
627 26th Avenue			<URL:http://rpw3.org/>
San Mateo, CA 94403		(650)572-2607