Anybody familiar with Wirth's law?

LOL, Wikipedia.

Wirth's Law is hardly a law; it's a hypothesis.

As written in that Wikipedia article, it's based on a flawed premise. Moore's Law (also not a law) doesn't say anything about hardware performance. It's about hardware density.

High-level languages don't always involve "many layers of interpretation", and some don't involve any at runtime.

On the other hand, some architects do use the anticipated increase in available memory and computing power to justify design decisions. The Microsoft managed languages and Java are examples of this.

Like so many Wikipedia articles, this one does a terrible job of representing the original idea. The conclusion that "software bloat comes from providing a ready solution for every conceivable problem that a computer programmer using such an abstract, high-level programming language might want solved" is a complete fabrication. (And I've deleted it; if you want to see that version of the article, check the history.)

Wirth is a great computer scientist, though not a visionary or hero like, say, Donald Knuth. It's kind of amusing that you think Wirth wrote something that supports your idea, rather than the other way 'round.
 
The most accurate of what set? It's nominated for deletion because it's completely unfounded and original research.
 
[Yeah ... 5-year-old thread but ... 'tis a timeless topic. (a Google hit that sucked me into responding :))]
I think OP & mikeblas might be able to agree on (at least some aspects of) the quote below. (From a May 1999 interview [link])

[
Q: What is the most often-overlooked risk in software engineering?

A: Incompetent programmers.
]
There are estimates that the number of programmers needed in the U.S. exceeds 200,000. This is entirely misleading. It is not a quantity problem; we have a quality problem. One bad programmer can easily create two new jobs a year. Hiring more bad programmers will just increase our perceived need for them. If we had more good programmers, and could easily identify them, we would need fewer, not more.

-- David Parnas
 
Still a hypothesis and not a law, after all these years.
 
LOL, Wikipedia.

Wirth's Law is hardly a law; it's a hypothesis.

It similarly annoys me when people talk about "Moore's Law," especially people who should actually know better. By using the scientific term "law" for things that are FAR from laws, you take all the credibility out of your statement. Besides, Moore's Observation is useless: there is absolutely zero benefit realized in making or knowing about that observation.

This Wirth's Observation thing is similarly useless.
 
I'm curious, what do you think is the salient difference between a phenomenon that has always been observed to be true, and a scientific law?

Moore's law may not always be true, but then Newton's law of universal gravitation also may not always be true.
 
Moore's observation will never be a law because it's not repeatable. We can't construct an experiment that proves Moore's observation true; we can only observe the current situation and see whether the law still describes that behavior. Moore's observation might become false when new fabrication techniques become available, for example, or when the pace of the industry broadly changes.

Newton's laws can be repeated in experiments at will, and the relationships his laws describe are demonstrable in experiments at any time. There's nothing that changes the relationships those laws describe, and therefore the laws are immutable.
 
Moore's law is repeatable in the sense that, ever since its formulation, we can make observations at different times and see that it still holds: 130 nm -> 90 -> 65 -> 45 -> 28 -> 22, and so on.
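
(A quick back-of-the-envelope Python sketch of what "repeatable" means here. It assumes the idealized scaling where device density goes as one over the square of the feature size; node names are largely marketing figures these days, so treat this as illustrative only.)

```python
# Rough sketch: ideal density scaling between successive process nodes.
# Assumes density ~ 1 / (feature size)^2, the idealized scaling relation;
# real node names have long since stopped tracking physical dimensions.
nodes_nm = [130, 90, 65, 45, 28, 22]

for old, new in zip(nodes_nm, nodes_nm[1:]):
    scaling = (old / new) ** 2
    print(f"{old} nm -> {new} nm: ~{scaling:.1f}x ideal density increase")

# Each shrink works out to roughly 1.6x-2.6x, i.e. about a doubling per node,
# which is what keeps the observation looking "repeatable" over time.
```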

Newton's laws are repeatable in the same sense. We do not know they are immutable, though many *believe* them to be in some loose sense via inductive inference (which of course carries its own epistemological problems; see http://plato.stanford.edu/entries/induction-problem/ ) . Do we presume that they were true prior or during the big bang? Will they continue to be true in a centillion years? I have no idea.

I don't want to go on about this topic any longer, but I'm open to further discussion if it's light-hearted.
 
100 years from now, Newton's laws will still apply. In 10 or 15 years, Moore's law will no longer apply. It also didn't apply before the invention of the silicon wafer. Moore's law is only approximately true: complexity doesn't double exactly every 18 months. Newton's laws are mathematically exact.
 
Yeah I tend to agree.

On the 10-to-15-year prediction, I think that's probably true too, although people have been making false predictions about the end for a while. Also, if we take Moore literally ("The complexity for minimum component costs has increased at a rate of roughly a factor of two per year"), then it's not necessarily device density that has to double; a focus on larger chips and yield improvements could keep it true.
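
(For what it's worth, the exact doubling period matters a lot once you compound it. A throwaway Python calculation, using the periods people commonly attach to the "law"; none of these is anything Moore committed to beyond that 1965 wording.)

```python
# Throwaway comparison: compounded growth over 10 years for the doubling
# periods people usually attach to "Moore's law".
years = 10
for period in (1.0, 1.5, 2.0):  # doubling every 1, 1.5, or 2 years
    growth = 2 ** (years / period)
    print(f"double every {period:g} yr -> ~{growth:.0f}x in {years} years")

# Doubling every year gives ~1024x over a decade; every two years, only ~32x.
# Whether the "law" holds depends heavily on which version you pin it to.
```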

In any case I am quite comfortable calling it a law for the window it was true. Very few "laws" are truly immutable.
 