Silver Bullet

5/27/2006 9:06:47 AM


No Silver Bullet.

Decades ago, Fred Brooks wrote that there is no silver bullet: no single development that will improve productivity, reliability, or simplicity by an order of magnitude.

But, as we look to the horizon of a decade hence, we see no silver bullet. There is no single development, in either technology or in management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity....

Not only are there no silver bullets now in view, the very nature of software makes it unlikely that there will be any--no inventions that will do for software productivity, reliability, and simplicity what electronics, transistors, and large-scale integration did for computer hardware.

His contention was based on the distinction between essential and accidental complexity: accidental complexity can be removed, but essential complexity is intrinsic to the problem and can never be eliminated. Most of the software industry has accepted this “no silver bullet” credo on blind faith and has missed the many counterexamples: thanks to technologies like garbage collection and managed environments, developers are far more productive than they were two decades ago. The credo has also caused a greater emphasis on process rather than on tools, which is where the problem lies.

Large Code

Recently, in light of the repeated delays of the Vista operating system, there has been much press speculation about the difficulty of delivering reliable software on schedule. The Wall Street Journal wrote about how, in 2004, Jim Allchin, the head of Windows, determined that Longhorn was fundamentally broken and ordered a restart based on the XP codebase. In addition, a new group was created to examine how Windows could be layered and componentized; some members of that group have spoken about their goals on Channel 9. David Bau of the New York Times blames Microsoft’s software woes on the sheer number of lines of code carried in support of backward compatibility.

Alan Kay recently spoke to students at the University of Utah on “Is the Best Way to Predict the Future to {Invent/Prevent} It?” Kay invented Smalltalk, a language renowned for its elegant mix of simplicity and power. Smalltalk, like Lisp, embodies a set of principles such as “recursive design,” in which the “part is just as powerful as the whole.” These principles were inspired by biological systems: the complex human body, for example, is composed of trillions of simple cells, all ultimately derived from a surprisingly small set of genetic instructions.

One attendee, Windley, wrote in his summary of Kay’s speech:

This describes the state of computing. We live in the 80’s extended into the 21st century. The only thing that’s changed is the size. Windows XP has 70 million lines of code. It’s impossible for Alan to believe that it has 70 million lines of content. Microsoft engineers don’t dare prune it because they don’t know what it all does. Cathedrals have 1 millionth the mass of pyramids. The difference was the arch. Architecture demands arches.

Another attending student, Grace, chimes in with similar comments.

The pyramids in Egypt - these were an example of non-engineering. They basically brute forced (using about 200,000 slaves) a huge pile of junk together and covered it with a pretty user interface - limestone. The analogy here probably doesn't need to be expounded upon, but just in case, he also mentioned that Windows has 70 million lines of code in it, with nowhere near 70 million lines of content. But they're terrified to take anything out because they don't know what something else might rely on. All they can do is add to it.

Some authors have written in defense of the complexity of large development projects: TysonD in “The Law of Large Code Bases,” Eric Lippert on what goes into “Five Lines of Code,” and Paul Vick on UI taxes. I am not sure that I buy their arguments that this complexity is necessary. I think much of these taxes could go away if programs were more declarative (with control flow automated) and if code were written in a more regular, self-descriptive, and orthogonal manner.

One argument that I do buy was made by Scott Guthrie in a lecture he gave on Visual Studio’s development process. In a group of ten developers, a build break, which results in developer downtime, may occur every couple of weeks. When there are thousands of developers contributing to a project, costly build breaks would occur all the time, so the Visual Studio team has developed a procedure of forward and backward integration of code between subgroups. This seems to be more of a process issue, but one can imagine tools being improved so that they are less prone, or more resilient, to breaks.

Finding the Silver Bullet

Sometimes I feel alone in my belief that there is a “silver bullet,” although I know that there are others out there pursuing Intentional Programming and the like. I came across this post by Max, “Are Paul Graham and Joel Spolsky Right?”:

However, the overall picture is the same: the current prevailing models/methods/tools are defective and should be fixed/replaced. The next step in this reasoning is that if it isn't fixed within their organization, the best programmers should themselves start their own companies.

While I don’t think it logically follows that one should start a company because of the poor programming practices of one’s current employer, my own reasons for starting a software company are partly based on my realization that most software in the industry is written in a very brute-force manner, which I regard as “wrong” even if it may be more beneficial in the short term.

My overall approach has been to embrace a more declarative style of programming, in which the code looks exactly like the specification. That has to be the ideal; the only way to write less code would be to change the language of the specification itself. I would write the specification first as data, and then find a way to transform it into code or process it directly. In college, I was able to develop an application very quickly through the combination of a DSL and an AI (constraint-satisfaction) engine.
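
As a rough sketch of what I mean (the names here are hypothetical, not from any real project): the rules of a small validator are declared as a table of data, and a single generic loop interprets them, so adding a rule means adding a line of data rather than more control flow.

    using System;
    using System.Collections.Generic;

    // Hypothetical sketch: the "specification" is a table of data, and one small
    // generic loop interprets it, instead of hand-writing control flow per field.
    class FieldRule
    {
        public readonly string Field;
        public readonly Predicate<string> Check;
        public readonly string Message;

        public FieldRule(string field, Predicate<string> check, string message)
        {
            Field = field; Check = check; Message = message;
        }
    }

    class SpecDrivenValidator
    {
        static void Main()
        {
            // The specification, written as data rather than as if/else chains.
            FieldRule[] spec = {
                new FieldRule("Name", delegate(string s) { return s.Length > 0; }, "is required"),
                new FieldRule("Zip",  delegate(string s) { return s.Length == 5; }, "must be 5 digits")
            };

            Dictionary<string, string> input = new Dictionary<string, string>();
            input["Name"] = "";
            input["Zip"] = "98052";

            // The generic interpreter: the only control flow in the program.
            foreach (FieldRule rule in spec)
                if (!rule.Check(input[rule.Field]))
                    Console.WriteLine(rule.Field + " " + rule.Message);
        }
    }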

The trick to programming quickly and reliably is to automate as much as possible: to get the program to program for you. I love this quote from Software Factories:

We see a capacity crisis looming. The industry continues to hand-stitch applications … using strings, integers and line by line conditional logic. Most developers build every application as though it is the first of its kind anywhere… It will require a transition from craftsmanship to manufacturing like the ones we have seen in other industries …

Initially, I relied on code generation and data-driven programming to process my DSLs. I moved from XML to S-expressions because I wanted to be able to easily incorporate expressions (code as data) and eventually move to a simple, declarative, Turing-complete language processed by an AI system. C# 3.0, with its combination of object initializers and expression trees, is looking good for supporting inline DSLs complete with static type checking and references to global symbols, though at the cost of code size.
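
To illustrate the idea (a hypothetical sketch, not production code): an object initializer declares the DSL node, and an Expression<Func<...>> captures its formula as a statically checked expression tree that a back end could inspect, translate, or compile.

    using System;
    using System.Linq.Expressions;

    // Hypothetical sketch of an inline DSL in C# 3.0: an object initializer builds
    // the declarative node, and Expression<Func<...>> captures its formula as data
    // (an expression tree) while the compiler still type-checks every symbol.
    class PriceRule
    {
        public string Name;
        public Expression<Func<decimal, decimal>> Formula;   // code as data
    }

    class ExpressionDslDemo
    {
        static void Main()
        {
            // The DSL "program" is just a statically checked object graph.
            PriceRule rule = new PriceRule
            {
                Name = "TenPercentDiscount",
                Formula = price => price * 0.9m
            };

            // A back end can inspect or translate the tree...
            Console.WriteLine(rule.Formula);                     // price => (price * 0.9)
            // ...or simply compile and execute it.
            Func<decimal, decimal> discount = rule.Formula.Compile();
            Console.WriteLine(discount(100m));                   // 90.0
        }
    }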

This has also led me to look at functional and logic programming, both declarative paradigms. (I have often asked myself: what if Word were written in Lisp?) I also have an interpreted (compilable) functional language that is able to process expressions. Lately, I have been pursuing a more functional style of programming in C# itself. Yes, C# can be used like a functional language, although it is missing some niceties like optimized tail recursion. If you are willing to accept the cost of passing around anonymous delegates as parameters and return values, you can also get lazy evaluation, though all those temporary classes probably slow down the runtime.
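
Here is the kind of thing I mean, as a hand-rolled sketch (the Thunk class and Producer delegate are my own illustrative names, not library types): a delegate wraps a deferred computation, and forcing it evaluates the body at most once.

    using System;

    // A hand-rolled sketch of lazy evaluation via delegates. The cost is one
    // delegate (and its compiler-generated closure class) per deferred value,
    // which is the overhead mentioned above.
    delegate T Producer<T>();

    class Thunk<T>
    {
        private Producer<T> compute;
        private T value;
        private bool evaluated;

        public Thunk(Producer<T> compute) { this.compute = compute; }

        public T Force()
        {
            if (!evaluated)
            {
                value = compute();   // evaluated at most once, on first demand
                evaluated = true;
                compute = null;      // allow the closure to be collected
            }
            return value;
        }
    }

    class LazyDemo
    {
        static void Main()
        {
            Thunk<int> answer = new Thunk<int>(delegate
            {
                Console.WriteLine("computing...");
                return 6 * 7;
            });

            Console.WriteLine("before first use");
            Console.WriteLine(answer.Force());   // prints "computing..." then 42
            Console.WriteLine(answer.Force());   // cached: prints just 42
        }
    }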

Am I on the right track? So far I think I am…

I noticed this interesting article on the silver bullet, in which a company is trying to accomplish similar goals. They have developed a system called COSA, a signal-based, synchronous software model that attempts to drastically improve software reliability and productivity.

The system is based on a Behaving Machine rather than a Turing Machine, which the article calls Turing’s Monster. The article claims that the step-by-step algorithmic approach used by the Turing Machine’s descendants (procedural programming languages) is a disaster, but that the industry has a vested interest in maintaining the present software model to ensure high-paying, long-term jobs.

The author never mentions the benefits of functional programming based on the lambda calculus. I notice many parallels between COSA and functional programming, such as better support for concurrency and composition; neither relies on explicit control-flow constructs.
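
As a toy illustration of the functional side of that parallel (this is not COSA itself): the only loops live inside two generic combinators, and the caller states the computation as a composition of transformations, with no if or for of its own.

    using System;
    using System.Collections.Generic;

    // Toy sketch: loops are written once inside generic combinators, and the
    // caller expresses the computation purely as a composition.
    static class Seq
    {
        public delegate TOut Mapper<TIn, TOut>(TIn x);
        public delegate bool Filter<T>(T x);

        public static IEnumerable<TOut> Map<TIn, TOut>(IEnumerable<TIn> xs, Mapper<TIn, TOut> f)
        {
            foreach (TIn x in xs) yield return f(x);
        }

        public static IEnumerable<T> Where<T>(IEnumerable<T> xs, Filter<T> keep)
        {
            foreach (T x in xs) if (keep(x)) yield return x;
        }
    }

    class CompositionDemo
    {
        static void Main()
        {
            int[] orders = { 12, 250, 7, 530 };

            // "Double the large orders," stated as a composition of transformations.
            IEnumerable<int> result =
                Seq.Map<int, int>(
                    Seq.Where<int>(orders, delegate(int n) { return n > 100; }),
                    delegate(int n) { return n * 2; });

            foreach (int n in result) Console.WriteLine(n);   // 500, 1060
        }
    }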
