Subject: Re: Common LISP: The Next Generation
From: Erik Naggum <erik@naggum.no>
Date: 1996/10/01
Newsgroups: comp.lang.lisp,comp.lang.dylan
Message-ID: <3053120041249285@naggum.no>


[James Toebes]

|   Let me explain what I mean by 'getting the job done'.

Thanks.

|   I am not referring to the UNICODE character set.

Since Microsoft is a pusher of Unicode (ISO 10646) in Windows NT and also a
member of the Unicode Consortium, I find it amazing beyond compare if they
have called something else "Unicode", as well.  Somebody must have confused
the name, at least.

|   All computer operations regardless of the language can be broken down
|   to some basic functions like add two integers, store a variable,
|   compare two values, ....

There's an ISO standard in the works called "Language-Independent
Procedure Calling".  It sounds vaguely like the effort you mention.  That
standard is highly optimized for languages with trivial representations of
objects, i.e., languages whose values and storage locations are typed at
compile time only.  It fails utterly to address the needs of dynamic
languages, several of which have arrived on the scene "recently".

I believe in creating "interfaces" or "trampolines" or whatever between
different languages' optimal data representations and function calling
mechanisms.  Languages just differ too much in how they do things beyond
the most basic to be able to agree on a common format without seriously
hurting performance.  Assuming, that is, that "performance" still means
something when this gets used.  (In some existing systems, this overhead
can be ignored because it will always be much smaller than the next most
costly overhead, typically the interprocess communication mechanism.)

I have found that function calling mechanisms have been getting more and
more complex in recent years, both at the CPU level and in the coding
conventions employed, to the point where a function should be quite long
before it pays to call it instead of inlining its body.  Inlining code
between languages will be tricky at best, so the standard practice of
creating large numbers of teeny-weeny accessor functions into larger
objects will carry a tremendous overhead if those objects are not
interchanged at the highest possible level.

This means that type definitions (e.g., struct and class declarations)
need to be shared across the language interface.  This is a non-trivial
problem, not at all solved by reducing function calls to interchange of
some notion of "basic" building blocks.  That whole concept is seriously
dated; I'd say it's _pre_-object-orientation.

#\Erik
-- 
I could tell you, but then I would have to reboot you.