Why Lisp Failed
This article investigates why the Lisp programming language is no longer widely used. Long ago, this language was at the forefront of computer science research, and in particular artificial intelligence research. Now it is rarely used, if at all. The reason for this failure is not age: other languages of similar heritage are still widely used.
Some "old" languages are FORTRAN, COBOL, LISP, BASIC, and the ALGOL family. The primary difference between these languages is who they were designed for. FORTRAN was designed for scientists and engineers, for whom solving equations on computers was the primary task of programming. COBOL was designed for business, with an English-like syntax readable enough that business people could take advantage of the computer age. LISP was designed by computer scientists, and was expressive enough for research into the fundamentals of computation. BASIC was designed for beginners learning to program. Finally, the ALGOL language was reshaped by working programmers, and evolved into a huge family of popular languages such as C, Pascal and Java.
Some of the above languages are no longer quite as popular as they once were. This will be the definition of "failure" used here. The question is: why did they fail? The first stand-out is COBOL. Unfortunately, its design goal of being human-readable by business people was its downfall. Businesses found that it was possible to hire programmers to look after their computers, and programmers gravitated to languages designed for them rather than for their managers. Thus, over time, more and more business functions were programmed in languages such as VB, C, C++ and Java. Now, only legacy software tends to still be written in the language.
BASIC suffered a different fate. It was the language of beginners. Those just learning to program on microcomputers would start off with the built-in BASIC language. As time progressed, microcomputers were replaced by personal computers running Microsoft operating systems, or Macintoshes running Apple's. The language evolved with time, becoming Visual Basic once the desktop paradigm arrived. Since it could be used by those with little programming skill, it replaced COBOL for a while. Why pay for an expensive compiler, if a cheap interpreter that comes with your machine is all you need? Recently, Microsoft has moved to the .NET system, leaving VB behind. Its replacement, C#, is an ALGOL-family member closely related to Java.
FORTRAN usage has waxed and waned throughout the years. At one stage, nearly all scientific code was written in it. Its advantage was that the language had no pointers and disallowed recursion, which meant that every data location could be resolved at compile time. FORTRAN compilers could exploit this extra information to produce extremely fast programs. Unfortunately, as time progressed, fixed-size arrays became obsolete as data structures. Science now works with arbitrarily shaped grids, and even more complex representations of the real world, which required the addition of pointers to the language. Around the time that happened, FORTRAN went into decline. It is now relegated to high-performance computing workloads, where the parallel matrix and vector operations recently added to the language still give it a performance edge.
The ALGOL language family succeeded. The reason for this is that these were languages written by programmers, for programmers. As time progressed, they evolved into the system and application languages most commonly used today. Their advantage was that the more programmers used them, the more the languages improved, and the more programs were written in them. This provided a virtuous cycle, where more programmers were in turn hired to work on the programs that had been written. This is an example of the network effect: the "worth" of a system is proportional to the square of the number of its users, because the number of possible interactions between users scales at that rate.
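The quadratic scaling claimed above follows from simple counting: a network of n users contains n(n-1)/2 distinct user-to-user links, which grows proportionally to n². A minimal illustration (the function name is my own, used only for this sketch):

```python
def interactions(n):
    """Number of distinct user-to-user links in a network of n users."""
    return n * (n - 1) // 2

# Doubling the user base roughly quadruples the number of links.
for n in (10, 20, 40):
    print(n, interactions(n))  # prints 10 45, 20 190, 40 780
```

Note that doubling n from 20 to 40 takes the link count from 190 to 780, slightly more than a factor of four, approaching exactly four as n grows.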
So why did the Lisp programming language family end up on the failure side? Some say it is due to the syntax. Lisp is notorious for its parentheses. I do not believe that is the reason. Many users of Lisp say that consistent formatting quickly allows them to match bracket pairs. Also, soon after the invention of the language, super-brackets were created to close off an arbitrary number of open brackets at once, although this feature is rarely used today. Finally, syntax-aware editors have made most of the layout problems of Lisp nonexistent in this age.
Another common complaint against Lisp is that it is a functional language. Could this be the reason for failure? Amongst all the early important languages, it alone is functional in nature. Unfortunately, I don't think reality is this simple. Lisp contains imperative features, and the ALGOL family can be used in a purely functional manner. If one wishes to code in a certain paradigm, certain languages may make that choice easier. However, modern languages are flexible enough to support many programming paradigms, and there is no reason a mostly imperative Lisp could not exist.
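The point that paradigm is a choice rather than a fixed property of a language can be illustrated by writing the same computation both ways. Here is a sketch in Python, used purely as a neutral example language:

```python
from functools import reduce

# Imperative style: mutate an accumulator in a loop.
def sum_of_squares_imperative(numbers):
    total = 0
    for x in numbers:
        total += x * x
    return total

# Functional style: no mutation, just a folded expression.
def sum_of_squares_functional(numbers):
    return reduce(lambda acc, x: acc + x * x, numbers, 0)

print(sum_of_squares_imperative([1, 2, 3]))  # prints 14
print(sum_of_squares_functional([1, 2, 3]))  # prints 14
```

Both produce identical results; the language merely makes one spelling or the other more convenient.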
Perhaps the problem with Lisp was that it used a garbage collector? Again, amongst the early important languages, it alone had one. Garbage collection requires more memory and computational resources than manual memory management. Could the lack of memory and low performance of early computers have held Lisp back enough to slow its adoption? Again, I do not think this was the case. The complex programs Lisp was used to create would require something with the complexity of a garbage collector to be written anyway if they were implemented in another language. The proverbial statement that any sufficiently complex program eventually contains a poorly written implementation of Lisp does hold some weight after all.
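To make concrete what "the complexity of a garbage collector" amounts to, here is a minimal mark-and-sweep collector over a toy object graph, sketched in Python; the object model is invented purely for illustration and bears no relation to any real Lisp runtime:

```python
class Obj:
    """A toy heap object holding references to other objects."""
    def __init__(self, name):
        self.name = name
        self.refs = []       # outgoing references
        self.marked = False

def mark(obj):
    """Mark phase: recursively mark everything reachable from a root."""
    if obj.marked:
        return
    obj.marked = True
    for ref in obj.refs:
        mark(ref)

def sweep(heap):
    """Sweep phase: keep marked objects, discard the rest."""
    live = [o for o in heap if o.marked]
    for o in live:
        o.marked = False     # reset for the next collection cycle
    return live

# Build a tiny heap: a -> b, while c is unreachable garbage.
a, b, c = Obj("a"), Obj("b"), Obj("c")
a.refs.append(b)
heap = [a, b, c]

mark(a)                      # a is the only root
heap = sweep(heap)
print([o.name for o in heap])  # prints ['a', 'b']
```

Even this toy version shows the essential cost: the collector must traverse live data and scan the whole heap, work that manual memory management avoids but that complex programs end up reinventing piecemeal.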
The reason Lisp failed was that it was too successful at what it was designed for. Lisp, alone amongst the early languages, was flexible enough that the language itself could be remade into whatever the user required. Programming in the other early languages involved breaking a task into small sub-tasks that could then be implemented, with the larger tasks implemented in terms of the smaller ones. Lisp was different: due to its power, a programmer could design a domain-specific language that would perfectly solve the task at hand. Due to the orthogonality of the language, these extensions would work seamlessly with the core language.
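In Lisp, such sub-languages are typically built with macros. The flavour of the approach can be sketched in Python with a tiny interpreter over s-expression-like nested tuples; this is a hypothetical illustration of the technique, not any particular Lisp DSL:

```python
import operator

# A Lisp-style expression is a nested tuple: (operator, arg, arg, ...).
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr, env):
    """Evaluate a tiny s-expression language embedded in Python."""
    if isinstance(expr, tuple):
        op, *args = expr
        values = [evaluate(a, env) for a in args]
        result = values[0]
        for v in values[1:]:
            result = OPS[op](result, v)
        return result
    if isinstance(expr, str):      # a symbol: look it up in the environment
        return env[expr]
    return expr                    # a literal number

# (* (+ x 1) 2) with x = 4
print(evaluate(("*", ("+", "x", 1), 2), {"x": 4}))  # prints 10
```

The seamlessness is the point: once `evaluate` exists, new "syntax" is just new entries in `OPS`, and the embedded language grows without touching the host. It is exactly this ease of extension that the next section argues becomes a liability.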
So what is the problem with creating domain-specific languages as a problem-solving technique? The results are very efficient, but the process causes Balkanization: it produces many sub-languages, all slightly different. This is the true reason why Lisp code is unreadable to others. In most other languages it is relatively easy to work out what a given line of code does. Lisp, with its extreme expressiveness, causes problems: a given symbol could be a variable, a function or an operator, and a large amount of code may need to be read to find out which.
The reason Lisp failed is that it fragmented, and it fragmented because that was the nature of the language and its domain-specific solution style. The network effect worked in reverse. Fewer and fewer programmers ended up speaking the same dialect, and thus the total "worth" decreased relative to the ALGOL family.
If one were designing a language now, how could this problem be prevented? If expressiveness is the goal of the language, then it must somehow be moderated. The language must have deliberate limitations that keep code written in it readable. Python is a successful language where this has been done: some of these limitations are hard-coded, and others exist by convention.
Unfortunately, so much time has passed, and so many Lisp variants have been created, that yet another new language based upon it is probably not the answer. There simply would not be enough users to make a difference. Perhaps the solution is to slowly add Lisp-like features to languages within the ALGOL family. Fortunately, this seems to be what is happening. The newer languages (C#, D, Python, etc.) tend to have garbage collectors, and they tend to be even more orthogonal than the older languages. The future may eventually contain a popular language that behaves much like Lisp.
Copyright © Lockless Inc All Rights Reserved.