Lockless Inc

Why Lisp Failed

This article investigates why the Lisp programming language is no longer widely used. Long ago, this language was at the forefront of computer science research, and in particular artificial intelligence research. Now it is used rarely, if at all. The reason for this failure is not age: other languages of similar heritage are still widely used.

Some "old" languages are FORTRAN, COBOL, LISP, BASIC, and the ALGOL family. The primary difference between these languages is who they were designed for. FORTRAN was designed for scientists and engineers, for whom solving equations on computers was the primary task of programming. COBOL was designed for businesses, being especially expressive so that business people could take advantage of the computer age. LISP was designed by computer scientists, and was expressive enough for research into the fundamentals of computation. BASIC was designed for beginners to learn programming. Finally, the ALGOL language was modified by computer programmers, and evolved into a huge family of popular languages such as C, Pascal and Java.

Some of the above languages are no longer quite as popular as they once were. This will be the definition of "failure" we use here. The question is: why did they fail? The first stand-out is COBOL. Unfortunately, its design to be human-readable by business people was its downfall. Businesses found that it was possible to hire programmers to look after their computers. Programmers would then gravitate to languages designed for them, rather than for their managers. Thus, over time, more and more business functions were programmed in languages such as VB, C, C++ and Java. Now, only relic software tends to still be written in the language.

BASIC suffered a different fate. It was the language of beginners: those just learning to program on microcomputers would use the built-in BASIC language to start off with. As time progressed, microcomputers were replaced by personal computers running Microsoft operating systems, or Macintoshes running Apple's. The language evolved with the times, becoming Visual Basic once the desktop paradigm arrived. Since it could be used by those with little programming skill, it replaced COBOL for a while. Why pay for an expensive compiler if a cheap interpreter that comes with your machine is all you need? Recently, Microsoft has moved to the .NET system, leaving VB behind. Its replacement, C#, is an ALGOL family member closely related to Java.

FORTRAN usage has waxed and waned over the years. At one stage, nearly all scientific codes were written in it. Its advantage was that there were no pointers in the language, and recursion was disallowed. This meant that the location of every piece of data could be determined at compile time. FORTRAN compilers could use this extra information to produce extremely fast programs. Unfortunately, as time progressed, fixed-size arrays became obsolete as data structures. Science now works with arbitrarily shaped grids, and even more complex representations of the real world. This required the addition of pointers to the language. Around the time that happened, FORTRAN went into decline. Now it is relegated to high-performance computing workloads, where the parallel matrix and vector operations recently added to the language still give it a performance edge.

The ALGOL language family succeeded. The reason for this is that these were the languages written by programmers for programmers. As time progressed, these evolved into the system and application languages most commonly used today. Their advantage was that the more programmers that used them, the more the languages improved, and the more programs that were written in them. This provided a virtuous cycle, where more programmers were in turn hired to work on the programs that were written. This is an example of the network effect. The "worth" of a system is proportional to the square of the number of users of it, due to the number of interactions between users scaling at this rate.

So why did the Lisp programming language family end up on the failure side? Some say it is due to the syntax. Lisp is notorious for its parentheses. I do not believe that is the reason. Many users of Lisp say that formatting quickly allows them to match bracket pairs. Also, soon after the invention of the language, super-brackets were created to quickly close off an arbitrary number of open brackets. This language feature is rarely used today. Finally, syntax-understanding editors have made most of the layout problems of Lisp nonexistent in this age.

Another common complaint against Lisp is that it is a functional language. Could this be the reason for failure? Amongst all the early important languages, it alone is functional in nature. Unfortunately, I don't think reality is this simple. Lisp contains imperative features, and the ALGOL family can be used in a purely functional manner. If one wishes to code to a certain paradigm, certain languages may make that choice easier. However, modern languages are flexible enough to support many programming paradigms, and there is no reason a mostly imperative Lisp might not exist.

Perhaps the problem with Lisp was that it used a garbage collector? Again, amongst the early important languages, it alone had one. Garbage collection requires more memory and computational resources than manual memory management. Could the lack of memory and low performance of early computers have held Lisp back enough to slow its adoption? Again, I do not think this was the case. The complex programs Lisp was used to create would have required something with the complexity of a garbage collector to be written anyway, had they been implemented in another language. The proverbial statement that any sufficiently complex program eventually contains a poorly written implementation of Lisp does hold some weight after all.

The reason Lisp failed was that it was too successful at what it was designed for. Lisp, alone amongst the early languages, was flexible enough that the language itself could be remade into whatever the user required. Programming in the other early languages involved breaking a task into small sub-tasks that could then be implemented, with the larger tasks implemented in terms of the smaller ones. Lisp was different: due to its power, a programmer could design a domain-specific language that would perfectly solve the task at hand. Due to the orthogonality of the language, the extensions written would work seamlessly with the core language.

So what is the problem with creating domain-specific languages as a problem-solving technique? The results are very efficient. However, the process causes Balkanization: it results in many sub-languages, all slightly different. This is the true reason why Lisp code is unreadable to others. In most other languages it is relatively simple to work out what a given line of code does. Lisp, with its extreme expressibility, causes problems as a given symbol could be a variable, function or operator, and a large amount of code may need to be read to find out which.

The reason Lisp failed was because it fragmented, and it fragmented because that was the nature of the language and its domain-specific solution style. The network effect worked in reverse. Less and less programmers ended up talking the same dialect, and thus the total "worth" ended up decreasing relative to the ALGOL family.

If one were designing a language now, how could this problem be prevented? If expressibility is the goal of the language, then it must somehow be moderated. The language must have deliberate limitations that allow for readability of code written in it. Python is a successful language where this has been done: some of its limitations are hard-coded, while others exist by convention.

Unfortunately, so much time has passed, and so many Lisp variants have been created, that yet another new language based upon it is probably not the answer. There simply will not be enough users to make a difference. Perhaps the solution is to slowly add Lisp-like features to languages within the ALGOL family. Fortunately, this seems to be what is happening. The newer languages (C#, D, Python etc.) tend to have garbage collectors. They also tend to be even more orthogonal than the older languages. The future may eventually contain a popular language that behaves much like Lisp.


reader said...
There is just too much BS in this article to write a counter-argument. GOD! Here's a penny, go get a better programming book.
Felix said...
I really enjoyed reading this article. This might be one of the sanest summaries of Lisp's downfall on the Internet.

Anyway, there is one thing you didn't include: while the popularity of Common Lisp has waned over the years, it is now fluctuating. The essays and books of Paul Graham and others have caused some adoption. Nowadays, a small but active Common Lisp community prospers and provides libraries, documentation and support for newcomers.

The Scheme language is still in use at many universities as a means of teaching "fundamentals of computation" to freshmen.

Very recently, the Clojure language has gained some traction among ALGOL-family programmers. Clojure departs from the Common Lisp heritage quite a bit, but offers a rethought, modern Lisp-like language running on the Java virtual machine.
Sebastian said...
"This is the true reason why Lisp code is unreadable to others. In most other languages it is relatively simple to work out what a given line of code does. Lisp, with its extreme expressibility, causes problems as a given symbol could be a variable, function or operator, and a large amount of code may need to be read to find out which."

I could not agree more.

Lisp's strength seems to also be its weakness at the same time.

Sad, somehow...
Morrie said...
Going to put this article to good use now.
Antonio Bonifati said...
You actually *can* tell variables from functions and operators in standard Lisp, why not? First, functions and operators are the same thing, as in mathematics. You do not need to make any syntactic distinction, so it's simpler and more flexible than in other languages. And variables are introduced by certain pre-established kinds of forms, e.g. defvar, defparameter, let, defun, defmacro, etc. Common Lisp has separate namespaces for functions and variables to make the distinction even clearer in those few tricky cases where you want to call a function with a variable name.

You can argue a lot about the pros and cons of domain-specific languages, but the fact is that you decide whether to use a DSL or not. Lisp just makes it easy to develop DSLs, but it does not force you to do so all the time, for all aspects of a problem, etc.

I believe that Lisp failed for only one reason: ignorance! I personally did not know anything about Lisp; I had been taught and forced to use other ALGOL-like languages for more than 12 years. I discovered it by chance, and now I do not want to use the other, inferior languages any more. Why? I would just feel stupid, and there is no advantage: rather, they take more time and money to develop in. This is a common situation amongst people who are rediscovering Lisp.

My 1-cent piece of advice: learn Lisp well, use it for a project, compare the implementation of the same program with others made in other languages you know, and then you will agree with me and thank me forever :-)
cs124 said...
nice article! :D
David A. Wheeler said...
No, it's the syntax. Lisp syntax is horribly unreadable, and as a result, most developers choose to NOT use Lisp.

"Super-brackets" don't help; as you noted, they're rarely used. But saying that "editors have made most of the layout problems of Lisp nonexistent" is nonsense. The problem isn't *layout* (which a pretty-printer can do), the problem is that a properly-indented program is still painful to read. Its lack of infix notation - which even tiny BASICs supported - is absurd today. You need to have a *clear* notation when working with other people - Lisp is not that language.

I agree it can't be that it's functional, since it need not be used that way. And it can't be the garbage collector; nearly all modern languages have one. Supporting domain-specific languages isn't a problem; that's a strength, and should have made Lisps *more* popular.

The claim "Lisp failed was because it fragmented" may describe Scheme, but it doesn't describe Common Lisp at all. Common Lisp is quite standardized, and all the different implementations are highly compatible.

You later note that "The language must have deliberate limitations that allow for readability of code written in it." Well, no. You don't need limitations, you need a readable notation. Python is fantastically readable - indentation is enforced, infix is built-in, function call notation is just like math class.

An alternative is to add abbreviations to Lisp readers so that you can optionally use an easier-to-read notation for common conventions. Such abbreviations need to be general and homoiconic; past efforts failed because they didn't meet those criteria.

Please take a look at the "readable" project at:
You may find that it's possible to have an alternative and better Lisp notation.

Thanks for reading.
Scott said...
Good article, but it should be fewer and fewer, not less and less programmers.
NP said...
Sorry, FORTRAN is still very much in use today. Fortran 2008, with its coarrays, dynamic memory allocation, whole-array operations etc., is very well suited for scientific computing. I work in one of the world's top aerospace companies; we run jobs that take 1000+ cores and run for a month. All our big codes, with a few exceptions, are written in Fortran (the few others are in C), and this is true for the whole HPC industry (NASA, Boeing, ...).
Tom said...
Syntax can be learned.
What is harder to grasp is semantics.
In my opinion many programming languages have too many features and most programmers fail to understand how even a small fraction of those features is used properly.
Just my opinion...
puiu-bratu-nelu said...
Upgrade LISP to VirtualC+Lisp
Rahul said...
Clojure can be the solution.
puiu-bratu-nelu2 said...
A good solution is "VirtualC+LISP", a hybrid programming language between C and LISP.
You write C/C++ code and LISP code, and the resulting code after the compilation process will be 100% LISP (all C/C++ lines will become LISP source lines).
HomePage http://lisp2arx.3xforum.ro

Lispy lisper of lisp said...
Lisp is truly a god-like language that remains awaiting the saviour to reclaim the territory that is the Earth. Total domination and subjugation of earthly programmers is the final result of man returning to a god-like super programming language like Lisp. When the devil of this earth is gone, then the people will kick out their silly, tiresome, so-called bosses and implement the one true pure language, and not have to put up with stupid marketing junk for other stupid languages made by marketing people, not technical programmers. So yeah, Lisp will eventually be the one and only language that the whole world uses. So there.
Sylwester said...
I'm not entirely sure Algol syntax is easier to read than Lisp if you have never seen either before. I'll test it out when my daughter gets a little older :)

Both Algol and Basic are derived from the Fortran family. I think the Algol family and imperative way was easy for me to learn since I've written Basic in the beginning on my C64 and in that family C was easy since I knew assembly for both x86 and 68000.

When you think in Algol, programming in a Lisp dialect can be very difficult. Though I bet Lisp is easier to learn for an Algol programmer than Haskell is.
From my experience with Lisp, its unreadability is neither caused by too many parentheses, nor by the lack of distinction between operators, functions, and variables. In my eyes, what makes it unreadable is its lack of structural variety.

In the C family, for instance, we have blocks of statements in braces '{}', delimited by ';', and a convention to write one statement per line. We have a function call syntax that uses parentheses '()' and separates arguments with ',', and a convention not to split a function call across lines. We have a special syntax for accessing array elements '[]', even though 'a[b]' is equivalent to '*(a + b)'. We have a special syntax for loops with statements within parentheses 'for(;;)'. We have a special syntax for case labels 'case a:', ... I could go on like this for quite a while, but the point is that these diverse syntax elements convey a lot of meaning at a glance. A particular strength of C over languages like Pascal and Fortran is that this syntactic diversity is nevertheless very concise, allowing a programmer to grasp a lot of meaning from a small amount of text.

Lisp, on the other hand, has only one syntactic structure that's used over and over again: the list '(a b)'. The only syntactic variation is when a pair is explicitly used '(a . b)'. That means, whenever a programmer sees something listed within parentheses, he knows nothing about what the code is supposed to be doing. He even has to read the context to know whether this list is interpreted as data or as code. Perhaps it's first interpreted as data, modified further, and then executed as code. Even when he knows that it's actually executed as code, he still has to look at the first element of the list to see whether the code is supposed to implement a loop, a condition, call a function, define a function, declare some variables, ...

Of course, this deficiency can be somewhat mitigated by good formatting, but I for one believe that formatting becomes more effective with more diverse and expressive syntax, and that Lisp failed because it cannot easily convey the structure of the code to the reader.
PUMA said...
A nice article, and good ideas. I don't agree that limitations are the answer. I tend to agree with David Wheeler. I love everything about LISP except the syntax (lack of). And for that the immediate counter-reaction from any LISPy message board is outright vitriol. The lady doth protest too much, methinks.

The power of the LISP is great, but its syntax (lack of) renders it absolutely unreadable. Python is great and even C is *much* *more* readable. No, I shouldn't have to learn to "not see the parenthesis". They just shouldn't be there.

Add two things to the reason LISP failed:
- the explosion of open source, UNIX, and the web, all almost 100% using C. Before that other languages championed by a small or medium sized group might have had a chance.
- the smug superiority complex and unfriendly attitudes of a large number of anonymous, lower-echelon, would-be LISP advocates. It's framed as defense - after all, "LISP is the greatest, blah, blah, why are you hating on ()". Well, now that history has soundly *proved* otherwise, instead of getting defensive, let's just honestly ask ourselves "why?".

For that, asking why, this was refreshing to read! Thanks!
chelahmy said...
We read back our code because we want to make alterations or to correct bugs. But what if we never had to read back code? What if we could just write some code and forget about it? LISP failed because we want to use it like C, Java, or bla... bla... bla... Everybody knows that LISP is different and special. We are supposed to code in LISP once we already know exactly what the program should do, and to write it fast without structuring it other than with (). If LISP failed, then why is LISP still a fundamental language in major programming courses?
erik said...
>Of course, this deficiency can be somewhat mitigated by good formatting, but I for one believe that formatting becomes more effective with more diverse and expressive syntax, and that Lisp failed because it cannot easily convey the structure of the code to the reader.

But in Lisp, you are actually reading the AST that represents the code.
Yu said...
> From my experience with Lisp, its unreadability
> is neither caused by too many parentheses, nor
> by the lack of distinction between operators,
> functions, and variables. In my eyes, what makes
> it unreadable is its lack of structural variety.

I'd think this is mostly a training effect. My programming beyond lectures (both computer science lectures and "programming for scientists") was so far mostly restricted to data analysis, but overall I've used some amount of Python and Java. Later I got invested in Emacs, and starting from simple customizations of modes I began writing my own libraries. After some time I actually found the structural variety of other languages awfully distracting, even in Python.

Generally I found reading the Emacs Lisp libraries more feasible than reading another person's Python code. Surprisingly, the absence of a true module system generally helps readability, as it causes each function call or variable access to be clearly marked with the module prefix. Kind of as if Python required fully qualifying the module name for each function call (except that if it did, module hierarchies would have evolved to be much flatter, and module names as short as possible while remaining unique).
BAKL:U said...
All these comments were written by one person
another anon said...
I completely agree with "said..." (i.e. the man w/o a name :)). I started looking for a better language because C++ looked ugly, yet I needed something that could be fast enough and communicate with C/C++ code. Lisp seemed a good choice, especially because of its extension possibilities, small syntax, etc. Later, when I started to read books about Lisp, I thought the syntax was hard because it was very new to me, so I didn't pay much attention to that. Yet now, after reading some articles comparing Lisp and other languages, maybe even C++, C++ doesn't seem as alien to me as Lisp does: to read a small Lisp function I need a few minutes, while to read the same function in C++ I'll need no more than a minute (at least I'll have a general understanding of it). Also, no matter how I formatted the code, I couldn't read it as well as Algol-based languages.

Now, because of Lisp, I see Python as an ugly language (intermixed paradigms that demand synchronization when you need to change something small), but also I understand that that is an architectural compromise. In this world it's hard to be all nice and shiny.

Yet, I must admit I *really* want to use a Lisp-based language, even a functional-only one, e.g. Scheme, but it is very hard to do so and I'm afraid I won't be able to reach that "light at the end of the tunnel" in my lifespan.
another anon said...
Damn, it's always like this: I post an answer or question, and then almost immediately I come up with something better. Guess what, there *is* a way to solve the syntax problem, i.e. to make it readable. Know why? Because it's Lisp! Check these out: (1) https://www.youtube.com/watch?v=MHDmVRU4fqw, (2) http://readable.sourceforge.net/
Andy said...
First, C and C++ didn't spring from ALGOL; they are closer to PL/I. ALGOL was designed for mathematical algorithms. None of the current crop of programming languages implements call by name. As far as I know, the only widely used computers that provided that feature in hardware were the Burroughs ALGOL machines, the B5500 for example. The B5500 had a 48-bit word memory with 3 flag bits. The codes in the flag bits provided for indirect references and call by name on passed arguments. You could then use sum like

      Y = sum(x,1,4,sum(y,3,8,x+y));

In the above, the inner sum(y,3,8,x+y) would generate an unnamed function to pass to the outer sum call. The variables x and y are not passed by value but by name; in the case of simple variables this is akin to passing an address in C. It gets a bit confusing when recursion is involved.

The Burroughs B5500 was a stack machine: when a word with the call-by-name flag was loaded onto the stack, it would be called. The compiler would generate unnamed functions when expressions were used as arguments; a plain variable would be a simple indirect reference, and an error would occur on writing to a function.


ALGOL died with the machines designed to run it. It didn't have a big user base to start with.

The only thing modern block structure languages have in common with ALGOL is their block structure. By the time C was being developed there were several block structured languages.

LISP was developed for AI research projects. It was tried for other purposes and found lacking right from the start. APL is another failure, and like LISP it was unreadable. Those languages are easy to program in and near impossible to read.

There was a block-structured LISP, but nothing came of it. There certainly are applications where a list processing language is useful. AUTOLISP worked well in AUTOCAD.

One can implement a Lisp list class in C++. There are some specialized languages, like TREEMETA, that make use of lists.

The real reason that all the old programming languages are dying out is the personal computer. Very few would run on a small computer, and by the time the personal computer evolved to where you could run mainframe languages on it, no one was interested. I think PASCAL lasted the longest. PASCAL was meant to be used for teaching programming and lacked many features needed for real-world projects. Extended PASCAL compilers fixed the problems in different ways and were eventually dropped in favor of C, which was free and available on most platforms. And along came WINDOWS, and eventually object programming in C++.
Lemony Snicket said...
A lot of this information is incorrect. It's actually the computational errors that caused the downfall of C# and crushed tiny people.
your mom. said...
ITT: mostly idiots.
polos said...
Why did Lisp fail? (ehm... did it, really??)

Only now do we see a real renaissance of Lisp: why not earlier? Because it's not that easy (for common coders) to recognize genius!

But, in the long term, genius will always rule out all its competitors. Currently we are living the most popular Lisp age ever, so, Lisp seemed to fail, and when everybody thought it dead, it resurrected, to live forever!
Asgeir said...

This retrospective of old-school programming languages is interesting, but it goes against almost everything I've read about CS history and Lisp. Could you, please, cite your sources?
Samantha Atkins said...
Common Lisp is an ANSI standard. It is not fragmented. It is more powerful and far more flexible than any other language out there. That is its challenge. Frankly, few programmers are good enough to master it. Or at least few programmers get to work in it full time, which is pretty essential to mastery. There is actually far less special-case cruft to keep in mind with CL than with any other language. And it is handicapped by fewer libraries available out of the box ("all batteries included") than, say, Python or Ruby. The latter is mostly a matter of fewer users adding adapters to various things.

Java is acknowledged as being designed so mediocre programmers can do something useful without blowing a foot off. It limits the hell out of better developers.

I expect Common Lisp to have a resurgence, and things like Clojure to be gateway drugs to Common Lisp. :)

Lisp can be the most readable of all languages, as you can build the language up into one that deeply matches the type of problem at hand.
Chris Kohlhepp said...
I would propose that Lisp's conundrum regarding widespread adoption, or lack thereof, isn't unique to Lisp. There are other very powerful languages which see very limited uptake; OCaml and Haskell are two examples. Hindley-Milner type inference (OCaml), monads (Haskell), lambda calculus (Lisp)? These are terms that make the average developer's eyes glaze over. These languages are extremely expressive, but are all based on a mathematical approach to the formulation of problems that does not resonate with the mainstream software developer. Software development used to be the mainstay of computer science types. It is that no longer. Meanwhile, features are cherry-picked from these languages by mainstream languages: Python has a REPL. C++ now has type inference and lambda expressions. Java 8 has lambda expressions. So rather than specialist languages attracting the mainstream, the mainstream cherry-picks their features. Imperative languages are less expressive, but resonate with an ever less computer-science- and mathematics-minded mainstream developer base. That's Java, Python, etc. Mainstream adoption then creates an advantage in terms of available libraries etc., and the process becomes self-perpetuating.

Yet consider this: my SLR camera is far more powerful than the simple point-and-shoot camera in my iPhone. Yet there are more iPhone cameras in circulation than SLRs. What does this say about single-lens reflex cameras? Not much, except that a better photographer will prefer an SLR over an iPhone. So popularity is not a measure of fitness for purpose. It might even be the inverse. What makes the iPhone camera more accessible to non-photographers makes it less useful to professional photographers.

List processing is useful in generic programming. Generic C++ algorithms in modern C++11 almost invariably gravitate towards representing everything as a list - only there they are called iterators and ranges. C++ also took metaprogramming on board, but has made it intractable and obtuse: output and process are invisible. Lisp gets this right. The only problem is, only a very small number of C++ programmers ever progress to the point where this matters: very large and complex systems. Incidentally, this means that Lisp has a use case where metaprogramming is concerned: Clojure on the JVM with Java, and more recently Clasp on LLVM with C++. Watch this space...




Aaron Krister Johnson said...
Lisp and Scheme are beautiful languages. I'm mainly a Python programmer, and know some C, and have done exercises implementing small lisp-like interpreters for fun. Lisp has a fantastic and rich history.

I think the real reason for its lack of greater popularity has more to do with the combination of lack of one leading implementation, and real lack of standardized libraries, partly because of the lack of a standard reference implementation. Yes, you have an ANSI standard, but who implements it to the letter?

In Python, you have a killer and beautiful language with amazing and powerful libraries which allow one to get stuff done right away.

Jay said...
Funny - the thing I like most about Lisp is the thing everyone else hates: the syntax. The syntax is what makes it a powerful language; its simplicity allows you to create macros and domain-specific languages, which the author claims are what caused its failure, even though that is not true, since Lisp is not alone in providing macros. C also supports macros, though with one major difference: in C a macro is a direct text substitution, whereas in Lisp a macro is a function which generates proper code, i.e. you cannot use a Lisp macro to produce garbage like )(#sshdjos* gdgdh]%, whereas that is possible in C. Let's not even mention how C allows you to create aliases of existing types with typedef, which can just as easily confuse the crap out of anybody. C++ makes things worse with operator overloading and all the issues which come with it, such as ambiguity and automatic type promotions.

Anyway, macros are not the main reason why I like Lisp's syntax. Mainly it is because everything in Lisp is a form which returns a value, the only exception being declare expressions, which are compiler directives rather than forms. Even constructs such as if, cond, case, loop and do (if, if-else-if, switch, while) return a value. This means that you can write something like:

(setq var (if (condition) t-value f-value))

A literal translation to C would be (pretend x is an integer):
int x = if (condition) { t-value; } else { f-value; };

except this is not possible in C, since if is a flow-control construct which neither returns a value nor can appear in the middle of an expression; if is a statement, not an expression. To translate that code into C correctly you would have to declare the int first, and then use if to control which assignment is executed:

int x;
if (condition) { x = t-value; }
else { x = f-value; }

So much repetition, though I can of course use the ternary operator:

int x = condition ? t-value : f-value;

though not only did it take me a very long time to figure the ternary operator out, it is no good if you only have a t-value (true branch), only an f-value, or multiple conditions (if, else-if). Suddenly C seems so ugly (just look at all the braces and semicolons), verbose and limiting compared to Lisp. This syntactic feature alone made me love Lisp's syntax, since I can recall so many moments in C where I wished I could just write somevar = if (some-condition) { some-value; }; but couldn't. Note that I learned Lisp recently, after programming in C, C++ and Java, so this isn't some bad habit I picked up.

Another useful syntactic feature is the prefix notation, which everybody hates. I love prefix for two reasons. The first is that arithmetic operations are not special syntactic constructs but regular functions, which behave as regular functions. I can pass them as higher-order arguments to mapping and filter functions: if I want to add two lists element-wise, no problem, (mapcar #'+ list1 list2) passes the + operator as a regular function. How would you do that in C? You cannot take a function pointer to an operator. The second feature I love about prefix is the ability to perform comparison operations on more than two arguments at a time: (= a b c d) checks whether a, b, c and d are all equal. In C you may think it is possible to write a == b == c == d, but this is wrong, since a == b returns a boolean (0 or 1) which will then be compared to c; you have to write a == b && b == c && c == d. Again, so ugly and so much repetition. Checking whether y is between x and z? No problem: (< x y z). In C you would have to write x < y && y < z, and things get worse if you have a list of values.

I used to think Lisp's syntax was bad, but when I actually tried it out I realized it has numerous advantages and is the way it is for a reason: getting rid of the parentheses would get rid of extensibility through macros and would destroy the everything-returns-a-value rule. Using infix may seem more natural than prefix, but it creates many problems of its own which prefix does not have.
CPL (1963), Algol W (1966), Simula 67 (1967), and Algol 68 (1968) all had garbage collectors.
CPL is an ancestor of the simpler systems language BCPL, which is an ancestor of C.
Lek said...
Metrics of use and opinions here are those of the author, over many domains that human use, and are to be respected as such. LISP, both the so-called "pure/bare" and the functional, remains "best" for computers to use/understand and apply to themselves. That the various "LISP applications winters" have occurred is only anthropology, and $$$ market histories. This is not of/for interest to self referent, independent computer(s). Though small and not widely known, it exists and proceeds. For these, LISP is "best" (an undefined, and possibly undefinable, metric outside of LISP).
Lisp failed (in part) because of our (human) communication problem. An authority (Influence: Science and Practice by Cialdini) could help us negotiate on the merits (Getting to Yes by Roger Fisher and William Ury). But with such an authority Lisp may lose its freedom.
Every time Lisp fails, it's someone else's fault.

It's the government's fault for cutting funding to Lisp projects, not Lisp's fault for not delivering what the government wanted.
It's another OS's fault for being able to run on cheaper computers more efficiently, not Lisp's fault for needing expensive hardware.
It's another language's fault for being more readable, not Lisp's fault for being less readable.
Tom said...
Readability is in the eyes of readers. I personally think there is a degree of laziness that masquerades as preference--the superficial criticisms of Lisp are indicative of the lack of thought given to that for which Lisp can be used. But there is no doubt of Lisp's influence... the more "modern" programming languages only now are beginning to incorporate some of the things that Lisp pioneered...
LambdaHistorian said...
Fun stuff. LISP was implemented in FORTRAN first. The conditional expression (if-then-else) was invented by John McCarthy, the main creator of LISP, and its first implementation was in LISP. McCarthy later pushed it into ALGOL 60; he was one of the creators of that language too. Now it is everywhere.
kp said...
Lisp did not fail at all ... people failed to grasp it ;)
It does no harm to learn it, and many if not most people, as I experienced, changed their minds after some weeks of dealing with the "horrible syntax". Why is that so? I can't say for sure; however, I believe there is some "aha" effect, because it might radically change one's way of thinking (at least for non-math nerds). Certainly, without the existence of today's auto paren-completion editors I wouldn't bet my wig ...
Don't Be Stupid said...
I won't bother to specifically address most of the above comments. They are based on ignorance.

The fact is LISP has not failed. In fact LISP is still used for some of the most advanced work in AI. I know because this is what I do. I use LISP every day, and in fact I am developing a next generation LISP machine, because LISP is the best language for symbolic computation, but using it on existing systems, whether Linux or Windows, is very inconvenient.

It is programmers and systems developers who have failed. So many problems were already solved by the early LISP machines. It makes me sick to see progress taking the form of a drunkard's walk.

gzz said...
Good article.
Jacek said...
I have no problems with Lisp syntax. I learned to love the parentheses and the flexibility it gives me in moulding the code. Although I struggle with reading other people's code, I can still live with it. What bothers me most is the fragmentation of the Lisp community. They don't seem to care about making the Lisp ecosystem friendly for regular programmers. There is no influx of new, well-documented libraries appearing all over the place. On the contrary, some of them suggest that you should not use a library that has had its source or documentation edited in the last few years, claiming that such software is not mature enough. With an attitude like that it is no surprise that Lisp became irrelevant.
Anonymous said...
Algol was created by mathematicians and scientists who used computers, before computer science existed as a subject. Many Algol researchers focused on static typing and proofs. Popular languages have similarities to Algol because the creators of these languages thought Algol had good ideas, like static typing, block structure, free-form syntax, nested procedures, dynamic arrays, and arrays with arbitrary lower bounds.

Fortran and COBOL are still used today and are still being updated and revised. COBOL remains one of the most critical languages in the world. Algol 60 is still used today on Unisys mainframes. These languages are used by a smaller proportion of programmers because mainframes make up a smaller proportion of computing than they did in the 1960s.

Instead of saying the "ALGOL language family succeeded", it's more accurate to say that successful languages were influenced by Algol. If an entirely different person or people make an entirely different language, that's replacement, not evolution. C is not an evolution of Pascal or PL/I even though it replaced them in some places.

Garbage collection was an important feature of Algol W, Algol 68, and Simula 67, which are descendants of Algol 60 and from the 1960s. These languages were never widely used, but were widely known and influential to both theoretical computer scientists and practical language designers. Orthogonal as it relates to programming languages is also an Algol word. Algol programmers believed dynamic typing is a special case of static typing.
anonymous_prog said...
Well, it is true that at first glance at Lisp code you really do not know what the fragments do. But this is also true, especially, of C++ code. However, there are style conventions you can adopt that trade away a bit of the variety for the sake of clarity. What impresses me in the language is its expressiveness and the much better means of writing mostly bug-free code (wider debugging features, code change on the fly, trying things out in the REPL similarly to Python, ...).

I come from a C++ industry background. Code grows as time goes by, and the number of bugs grows with it. C++ might be "expressive", but it is a poor mixture of a lot of paradigms (usually pioneered by Lisp first, surprise), yielding a lot of exceptions to be considered when using them (just try to get your head around the rules for function parameter passing when you have references, rvalue references, pointers, values, implicit conversions, decay, the rules for const_cast<>(), ...: designed to cause big headaches and a lot of bugs on the fly, and I won't even start on templates!). In C++ things can be done in 100 ways, where 99 of them are pure buggy crap. It is not valuable variety if my likely choice is between the pest and cholera. There is a reason why we need the MISRA rules, by the way: when you cannot solve things technically, you need to make a law, simply put.

In earlier days lots of code was pushed to the open community, while Lisp dialects were kept proprietary and priced high, giving big momentum to those other languages. So the "Algol community" grew bigger. AI was not so important any more in the 90s (the AI winter), and Lisp rolled down the hill as no longer so needed. As someone else once put it: "It's the money, stupid!"


Copyright © Lockless Inc All Rights Reserved.