Selected Reactions to Fant's article:



A Critical Review of the Notion of the Algorithm in Computer Science
On-line discussion of article: <http://www.itwire.com.au/content/view/13339/53/>

Maths, Logic and language
written by Trevor Batten, July 09, 2007

It seems to me that the problem may be more fundamental than it first appears: basically, in recent years, the (commercial) success of pragmatic computer systems seems to have undermined interest in the theoretical aspects. However, without a theoretical framework there is no external way to judge success or failure, and so everything that functions (even marginally) can easily be considered a "success". Java, for example, may be a wonder of pragmatism, but it clearly lacks the internal logical consistency of Algol, which was incidentally commercially supplanted by Fortran, despite the fact that C, which became fairly mainstream later, actually seems more Algol-like than Fortran-like. So, if "pragmatism" creates so many false trails, there may indeed be conceptual (and eventually pragmatic) disasters lying in wait for those who are too commercially "pragmatic" in their approach.

However, to understand the problem better we may need a better understanding of the complex relationship between theory and practice: in this context, questioning the "value" of mathematics seems difficult (but perhaps essential), simply because "mathematics" itself seems a rather poorly defined (or poorly understood) subject. Is it, for example, a subset of Logic, or is Logic a subset of Mathematics? Clearly computer programming is a "language"-based activity, so we might also need to ask how "language" relates to both Logic and Mathematics. The Anglo-American education system seems to make a great distinction between "arts" and "sciences", where apparently men of letters do not get involved with the world of numbers... and yet surely both Mathematics and Logic are forms of language, where perhaps for the true practitioners the "expressive" qualities might be more satisfying than their pragmatic uses. Our education system seems to have forgotten that Pythagoras was a mystic.

As an artist who has long been interested in computer programming, I'm very interested in the philosophical (ontological and epistemological) aspects of computer programming (including mathematics and language), but it does seem that (even in the arts) these aspects have become buried deeply out of sight, largely because of academic and commercial interests it seems.

I read somewhere that before the rise of "chaos theory", physics and mathematics in the US also had problems relating to each other: although modern (sub-atomic) physics is largely mathematical, the physicists didn't want non-physics-trained mathematicians teaching physics students their mathematics. So perhaps the same is true for Computer Science: it's not that mathematics isn't needed; it's just that the wrong type of maths may be being taught in many places.

Incidentally, I believe that many years ago in Holland a top international AI researcher (imported from the US) lost his job in a dispute with the faculty, and that he too believed this was because AI was still placed under Mathematics, which led to a poor understanding of his work.


Taxonomy seems an important part of epistemology, which is surely the ultimate basis for programming. How strange that taxonomy and epistemology have not been applied in a more self-reflective way by the academic system.
----------------------------------------------------------------------

Sysadmin, Mathematician, Programmer, Graduate Student
written by Bertrand, July 09, 2007
Disclaimer: I love mathematics, and I think everyone should know more about this beautiful subject.

To Will M,

What you seem to be gearing up for in school is a service-oriented model. I understand your position contends with clients every day. You seem to enjoy dealing with the business end of the computing world. And as you said, it is a profitable world that you live in.

Here is what mathematics is to me, and to anyone who looks at it with some scrutiny. Mathematics, together with its algorithmic cousin, is a language. I use math every day to describe a small chunk of my reality. Most algorithmic applications of mathematics, i.e. regular expressions, Turing machines, pushdown automata, come from very specific applications of modern algebra. It would be difficult to understand how a computer works in its abstract form without some degree of modern algebra. It would be difficult to understand how a computer works at the level of the JK flip-flops without some understanding of electrical engineering, which is heavily based on computed results that require mathematics.
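The automata mentioned here can be made concrete with very little machinery. A minimal sketch (the states and alphabet are invented for illustration): a deterministic finite automaton, the kind of algebraic object regular expressions compile down to, recognizing binary strings with an even number of 1s.

```python
# A deterministic finite automaton (DFA) as a transition table.
# It accepts binary strings containing an even number of 1s.

def make_dfa():
    # Two states tracking the parity of 1s seen so far.
    transitions = {
        ('even', '0'): 'even', ('even', '1'): 'odd',
        ('odd', '0'): 'odd',   ('odd', '1'): 'even',
    }

    def accepts(s):
        state = 'even'              # start state
        for ch in s:
            state = transitions[(state, ch)]
        return state == 'even'      # accepting state

    return accepts

accepts = make_dfa()
```

The transition table is exactly the "specific application of modern algebra" in question: a finite set of states acted on by an input alphabet.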

I understand that you want graduates to be more like you, and you seem to be successful at your endeavor. However, if I were hiring for a position that required previous study of a field at length, I would want candidates to understand the very foundations of that field. I might not be your average hirer; I live in the world of applied math and I deal with very large matrices every day. But the quality of a student, to me, is his ability to learn and his drive to understand the topic at study to its utmost completion. Without mathematics, I can't fathom how a computing graduate can do that.

What seems to be the quarrel here isn't ultimately about mathematics and computers. Of course they are related. There is no better descriptor of discrete automata than mathematics. They practically go hand in hand. Where I am seeing the divide is in the computational nature of entry-level mathematics.

University administrators have somehow observed a correlation between high computational math skills and better-than-average symbolic knowledge skills. They don't really know why; according to the people who teach and grade these classes, it just happens to be mostly true. Being a graduate of these classes, and having interacted extensively with people who do well at both the graduate and undergraduate level, I find that those with highly touted computational skills in the early levels of mathematics aren't the people who work extremely hard at taking notes or at the homework, but the ones who spend a great deal of time attempting to understand what is going on.

That is, say someone wants to work a Green's theorem problem. A classic overachiever will whip out the calc book, find the transformation for rewriting the integral, and go to town. Someone who is attempting to understand what is going on will look at the region they are integrating over, check out Green's theorem, see why the region integral and the line integral yield the same result, maybe bother the professor about it, and they'll figure out that beneath all this mathematical mumbo jumbo there is a down-to-earth, easy-to-grok understanding of the process. Mathematics as a hurdle to solid prospects encourages the pursuit of insight, because in the long run insight is the easiest way to deal with problems. When they get to computer logic, the pursuit of insight propels them to understand what is actually occurring in class.
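For readers who want the identity being discussed: Green's theorem relates the line integral around a positively oriented simple closed curve C to a double integral over the region D it encloses.

```latex
\oint_C \left( P\,dx + Q\,dy \right)
  = \iint_D \left( \frac{\partial Q}{\partial x}
                 - \frac{\partial P}{\partial y} \right) dA
```

Seeing why the two sides agree on a concrete region is precisely the kind of understanding described above, as opposed to mechanically applying the rewrite.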

The real irony of this is that, as much as the math people try to teach that the pursuit of understanding is the easiest way in the long run, they are working with their hands tied behind their backs. Mathematics is taught as a series of tricks from the time past geometry onward, where you are supposed to peruse these giant black boxes as part of your curriculum, never understanding what is really going on. In calculus you're taught pretty much the same way, except that if you get a real professor there is a constant tick at the back of his head that he is really supposed to let on a little more than "math is just tricks with numbers." So if you're lucky you might get a little more out of it than your average brethren.

Most real math doesn't get taught in the United States until the late sophomore or early junior stage of a collegiate mathematician's academic career. That means that all non-math majors, save the physics people, have no opportunity to see that mathematics, properly taught, actively encourages traversing difficulty with understanding, and denoting said traversal with clean, logically consistent diction as the answer.

But anyway, I hope you now see a little better why mathematics is in every engineering curriculum. If used properly, mathematics can do so much more for the student than it is doing now. Hope that was a good read.

----------------------------------------------------------------------

Missing the point...
written by JXST, July 09, 2007
The defensive "pro-math" people are missing the point here and need to open their minds a bit. Though the sensational article title would imply as much, I don't think Fant is trying so much to argue that mathematics is not an important aspect of computational theory (a strong mathematical basis for computation has already been demonstrated with state machines, Turing machines, etc., and in practical situations various specific applications will obviously require knowledge of higher-level math -- but many don't).

Fant, rather, is trying to say that the concept of the "algorithm" is a poor tool for general-purpose tasks. In this sense the "algorithm" is roughly defined as data declarations and sequential lists of operations, along with the power of conditional branches, loops, and maybe recursion -- the basis of the way we usually program computers to this day. Since the early pioneers of computational theory were mathematicians, it's no surprise that the algorithmic model is largely based around sequential operations; solving math problems is usually a step-by-step process which fits this model quite nicely. The algorithmic model becomes much more of a pain to work with, though, when we introduce things like concurrency, data sharing, events, and temporal interactions -- things often encountered in general-purpose programming. The number of bugs and exploits found in software is a testament to the difficulty and weakness of this model.
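The "algorithmic model" in the sense sketched here is easy to illustrate. A minimal example (my choice of algorithm, not Fant's): Euclid's gcd is the archetype of a process that fits the model perfectly, being nothing but declarations, sequential steps, a branch, and a loop.

```python
# The algorithmic model in miniature: sequential operations,
# a conditional branch, and a loop -- nothing concurrent,
# event-driven, or temporal about it.

def gcd(a, b):
    while b != 0:           # loop
        if a < b:           # conditional branch
            a, b = b, a     # sequential operation: swap
        a, b = b, a % b     # sequential operation: reduce
    return a
```

It is exactly the processes that do *not* reduce to this shape (servers, GUIs, protocols) that the commenter says the model handles poorly.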

While the algorithm has been trumpeted as the premier model for expressing processes, it is not the only model. As described, we have alternative models such as logic circuits and neural nets, and we also have improvements to the standard algorithmic model such as object-oriented programming (OOP, while an improvement, still fails to escape many of the problems associated with algorithms, since it's still fundamentally algorithmic). Each paradigm has advantages and disadvantages in terms of things like ease of learning/use, rapid development, performance limitations, concurrency, etc. Needless to say, a paradigm which performs optimally at all these tasks would be a massive contribution to CS and society at large. Why isn't there more focus on developing new paradigms and forming a wide theory of process expression? Why has the field of computer science focused so much on questions like "what can be computed?" while neglecting questions such as "how can we express a process/interactive system?".

Fant's argument is that the algorithm has been given too much focus due to a mathematical "cultural bias" in CS and needs to be booted as the basis of computation. Instead, the central focus of CS should be a comprehensive theory of process expression which includes, but is not limited to, the algorithm. Perhaps only then will we be able to discover new programming models which could revolutionize computing.
----------------------------------------------------------------------

Algorithm or process
written by J.B., July 09, 2007
Let's be clear that the original argument is that algorithmic analysis is a poor method to analyze computational systems and that process (or logistics) is better -- NOT that mathematics should no longer be taught; obviously mathematics is an essential tool for computer engineers, as in any other discipline. Someone suggested computer engineering is about shoveling coal into the engine; it's not, it's about making the computer shovel coal into an engine, or making it do any other task of interest. Hence it's about thinking in terms of process, or logistics. Of course logistics can employ algorithmic analysis for subproblems, but fundamentally logistics deals with NP-complete problems (like finding the most efficient way to shovel coal into an engine, in the face of changing external random parameters, and meeting particular business requirements), not a problem domain suitable for algorithmic analysis. Some components of it are, though, of course.

This is not about abandoning mathematics, only abandoning the notion that computation and the algorithm are interchangeable. CS *obviously* needs its own mathematical field of 'process of computation' that's neither strict algorithm nor strict process.

----------------------------------------------------------------------

Algorithms v Processes and CS v IT
written by Brian C, July 09, 2007
The author's ideas are also described in his 2006 (revised) paper http://www.theseusresearch.com/Downloads/algorithm article.pdf and it includes a line in the conclusion with the same sentiment that was misunderstood here sparking the IT v CS debate: "Mathematicians and computer scientists are pursuing fundamentally different aims and the mathematicians [sic] tools are not as appropriate as once supposed to the questions of the computer scientist". I don't think he is suggesting that computer science should be a less rigorous field, but rather that the strict mathematical definition of an algorithm is insufficient to describe the variety of processes witnessed therein. The excerpted bulk of page 6:
"The mathematicians consider the process as independent of its expression. A process may be expressed in any convenient language and executed on any convenient machine including a human with a pencil. Mathematics is primarily concerned with the nature of the behavior of process independent of how that process is expressed. ... Computer science is primarily concerned with the nature of the expression of processes regardless of what particular process might be expressed. ... There is much overlap between the interests of computer science and mathematics, but this core concern with the nature of process expression itself is the unique conceptual focus that distinguishes computer science from the other sciences and from mathematics. Computer science is the science of process expression. One published definition of computer science comes near the mark.
“computer science itself becomes no more (and no less) than the discipline of constructing appropriate descriptive languages.” [16] "

Now that we've dealt with the subject of the article, let's deal with IT vs CS. I recognize that the modern computer age would not work without the untold numbers of coders and admins who are interested more in the doing-it than in the theory, and who are handsomely rewarded for their efforts. They don't need calculus or diff eq or even Big O notation, and it doesn't make what they do less important. I'll go ahead and point out that without the mathy types, the computers themselves would not exist, but that doesn't make CS better (which does not then imply that CS is worse either). What I do resent is that the anti-academic money-is-superior attitude is invading CS programs, causing them to dumb themselves down to meet less rigorous career demands. Universities have IT/CIS programs and those not interested in the mathematics of engineering should avail themselves of those, technical schools, or their own self-sufficiency. We need more IT types than CS, but they are minimally overlapping parts of the same field and we need both.
----------------------------------------------------------------------

Did anyone actually read the paper?
written by A.N.Onymous, July 10, 2007
Well, I did. I find it interesting that it was first published by the ACM in 1993, and there's been virtually no discussion in academia since then.

Anyway, for those of you who haven't read the paper:
* Fant is concerned with the notion of "algorithm" as defined strictly by Church, Turing, Goedel (and others) and observes that it is narrowly construed (Section 4).
* He notes that practical "computing" (not computer science) is not often concerned with this strict algorithmic analysis (Section 5).
* He makes the basic distinction (I paraphrase) that computer scientists are concerned with how a particular piece of computing "stuff" works, whereas mathematicians are concerned with "what you're doing (independent of how you accomplish it)".
* He offers the opinion that the former is more interesting in modern practical computing and that the strict adherence to whether or not something is an algorithm (in the Turing sense) is less important.

Of course, it's hugely ironic that in order to understand his arguments, one needs to have a solid training in mathematics (how many non-mathematics-trained IT practitioners know who Church, Turing and Goedel are?).

Personally, I think Fant is wrong, because an understanding of the formal framework for algorithmic analysis lets a practitioner have insight into how his or her code works (whether or not it is an "algorithm"). Yes, there may be new formalizations possible, but they
1) will still be "mathematical"
2) won't make the concepts of algorithmic analysis obsolete

As someone who has formal training in math (Waterloo B.Math) before computer science, I can attest to the fact that the formal language/parsing theory courses, queueing theory courses, graph theory, functions and relations and all the rest came in most useful in my career writing compilers, operating systems kernels, command-shells and relational databases. I can't imagine how much harder it would have been for me to really understand how a relational database works if I didn't understand what a mathematical relation was. The SQL "select" statement is just a bad syntactic representation of relational calculus -- writing queries is a whole lot easier when one understands what the relational formulation really is.
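The point about SQL being a syntactic dressing of relational calculus can be shown side by side. A small sketch (the table and data are invented for illustration): the same query written as a SQL SELECT and as the set comprehension it denotes.

```python
# A SQL SELECT and its relational-calculus reading:
#   { e.name | e in employee, e.dept = 'eng', e.salary > 100 }

import sqlite3

rows = [("alice", "eng", 120), ("bob", "eng", 90), ("carol", "sales", 100)]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employee (name TEXT, dept TEXT, salary INT)")
conn.executemany("INSERT INTO employee VALUES (?, ?, ?)", rows)

# the SQL form
sql_rows = set(conn.execute(
    "SELECT name FROM employee WHERE dept = 'eng' AND salary > 100"))

# the comprehension form -- the same relation, minus the syntax
calc_rows = {(name,) for (name, dept, salary) in rows
             if dept == "eng" and salary > 100}
```

Both produce the same set of tuples, which is the commenter's point: understanding the relational formulation makes the SQL transparent.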

Fant's paper was intentionally controversial (if not inflammatory), and gained no traction whatsoever. Just let it rest.
----------------------------------------------------------------------
...
written by John E Grisinger, July 10, 2007
Computer science/programming is engineering. You apply what you know and the tools you have to solve a problem. Engineers were building bridges long before they had a theory of structures that allowed them to analyze stresses and strains. We use logic and mathematics where they are applicable and useful, including in describing the characteristics of the various types of information. However, logic and mathematics are unable to prescribe the characteristics of information, and they are unable to provide a foundational understanding of the nature of information. Those characteristics are independent of logic and mathematics. Until we have a theory of information, we will continue to be doing engineering. My attempt to develop such a theory is described in a paper at: http://www.varadata.com/AnInfo...ciples.pdf <off-line version>


----------------------------------------------------------------------

math in CS is not only about algorithms
written by A.Lecturer, July 11, 2007
"Computer science needs a comprehensive theory of process expression."

I agree ... and this theory (parts of which already exist) will be a piece of mathematics. Or what else?

BTW, many BSc's in CS do not teach or require math anymore.
----------------------------------------------------------------------

Two Notions of "Expressive"
written by Albert Baker, July 12, 2007
Interesting that a comment on an old paper can evoke such strong responses. Too bad we don't collectively realize that Computer Scientists and Software Developers (gml 7/9) are both part of the same whole - let's call it Information Science, just to avoid the loaded terms being bandied about.

Most of the usual issues are well hashed out in the earlier comments. I do think the first comments on the original article dance around an interesting attribute - "expressive". Computer Scientists and Software Developers are inclined to see this differently - and to underappreciate each other's viewpoint.

Corner's example of a logic circuit provides a good illustration. Let's first take the Computer Scientist's point of view. Of course a logic circuit is an algorithm. Consider a circuit with just one output. Convert the boolean logic for that circuit into, say, disjunctive normal form (the "or-ing" of terms that contain only "ands" - with no other depth to the expression). It might be worth noting that a BS degree in Computer Science would be sufficient to know that every boolean expression, regardless of the level of nesting, can be represented in disjunctive normal form. Assume the disjunctive normal form is T1 \/ T2 \/ ... \/ TN. The "more algorithmic feeling" representation of the logic would then be:
if T1 then
TRUE
else if T2 then
TRUE
...
else if TN then
TRUE
else
FALSE
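The equivalence behind that if/else chain can be checked mechanically. A small sketch (the one-output circuit and its DNF terms are invented for illustration):

```python
# Any one-output boolean circuit equals an OR of AND-terms (DNF),
# which evaluates exactly like the if/else chain above.

from itertools import product

def circuit(a, b, c):
    # hypothetical circuit: (a AND NOT b) OR (b AND c)
    return (a and not b) or (b and c)

# its disjunctive normal form as a list of AND-terms
dnf_terms = [
    lambda a, b, c: a and not b,   # T1
    lambda a, b, c: b and c,       # T2
]

def dnf(a, b, c):
    for term in dnf_terms:         # "if T1 ... else if T2 ..."
        if term(a, b, c):
            return True
    return False                   # the final "else FALSE"

# the two representations agree on every input
assert all(circuit(*v) == dnf(*v) for v in product([False, True], repeat=3))
```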

And, as it turns out, logic circuits can all be represented by simple Finite State Automata, so in the hierarchy of "computational expressiveness" they are at the lowest level. The Computer Scientist is primarily interested in this notion of "computational expressiveness".

However, this is not the first concern of the Software Developer (or the practicing Computer Engineer designing circuits). The Software Developer wants to see the circuit layout because that representation is the most expressive with respect to human readers/writers of the representation. The conversion of anything but the most trivial expression into disjunctive normal form will result in an expression that is harder to read. It is less expressive.

In my prior professional life as an academic, I worked in the area of formal behavioral specification languages. The drawback to many of the actual formal specification languages (Z, Larch/xx, ...) was that they used "mathematical notation" in the specification - even for those things that are readily expressible using the syntax of standard programming languages. Some of them actually required the use of special symbols, e.g., "epsilon" for "is an element of", "backwards E" for "there exists", etc.

One problem with these specification languages developed by Computer Scientists was that they were not expressive - in the sense of interest to software developers. In fact, they were not even algorithmic - in the computational expressiveness sense - one could write specifications for which there was no terminating computation. For example:
mySet.isIn(1) && forall(int x) [ mySet.isIn(x) ==> mySet.isIn(x+1) ]
(Note to my CS colleagues: Let's not quibble over whether this assertion is or is not algorithmic. We can all agree it is non-terminating.)

Most of the time, the Software Developer is unconcerned about computational expressiveness. They basically trust that compilers will correctly (in a formal sense) translate source code to some form of interpretable or executable "lower level" representation. They trust that libraries will work as advertised. However, compilers and libraries, e.g., the STL for C++ or java.util.*, were highly dependent on Computer Science for their development.

So, gentle colleagues, understand that when a Software Developer says "expressive", he or she means "readable", "understandable", "write-able", etc. - a critical concept for languages, IDEs, ETLs, etc. And, to my Computer Science friends, this notion of expressiveness can only be addressed scientifically through experimentation, which is generally too expensive to conduct. We can't "prove" this type of expressiveness.

And when a Computer Scientist says "expressive", they mean "computational expressiveness": what "computability class" does a particular representation support? And for a particular instance of a representation, say a good ol' sort method written in Java, what is the worst-case or expected-case performance of that algorithm?

We need each other. There's nothing to fight about here.

---------------------------------------------------------------------

Lots of religious people
written by hyper, July 12, 2007
The computing world is made up of many different parts which all function differently... and require different types of thought processes (this is the reason IT/computing has advanced, and is advancing, so fast over the last 20 years).

If humans wish to advance further with computing, a lot of humans will have to think outside of what we all currently understand and know about the world around us and this field we call IT/computing.

Exploring what we know now against what humans may know in 10,000 years' time, this could be described as the era of caveman computing...

Anyone proclaiming that subject X is the holy grail is surely trapped in a closed-minded world.

Surely, if the human mind is (or has the potential to be) more powerful than any computer of today, then there must be many times when humans come to the correct answer without knowing how they came to that answer, and without knowing how to explain how they keep coming to the right answer time and time again...

So let's stop being religious about whatever subjects one has studied. This has nothing to do with taking the maths out of CS or taking the maths out of computing... it is pointing out that the old ways of thinking may not be the most desirable for future development... and why not experiment with other ways of doing the same thing? We might come up with new solutions...

The thought processes of many will be randomly blurred between all the different parts of computing. (I may never understand everything about quantum computing, but that does not mean I should never try to build or use a quantum computer, or that I cannot add value to designs/ideas.)
----------------------------------------------------------------------

What should Computer Science students learn from Mathematics?
written by Y.C. Tay, July 12, 2007
http://www.math.nus.edu.sg/~mattyc/CS.Maths.pdf
<off-line version>
----------------------------------------------------------------------

Re: What to learn, if you need more math..
written by Alex M., July 19, 2007
Hmm.. that worked well. Here's the title of the book:

Concrete Mathematics: A Foundation for Computer Science (2nd Edition) (Hardcover)
by Ronald L. Graham (Author), Donald E. Knuth (Author), Oren Patashnik (Author)

Search for it on Amazon.
----------------------------------------------------------------------

Retired: Technical Principal in Computer Science/Technical Manager at JPL-NASA-Caltech: Spacecraft Ground Systems
written by Szabolcs Michael de Gyurky, July 20, 2007
Wow! All of your comments are correct. This is why we in our profession of CS/IT/Software Development are in trouble, and no longer competitive in the global marketplace! What to do before it is too late for our jobs? We need to define what Computer Science really is. What is the (academic/apprenticeship/trade-school) scope of our profession? I wrote the book The Cognitive Dynamics of Computer Science (see URL below) at the request of my staff and colleagues at JPL to answer the questions you are debating. All of you are correct in your subjective assessments of our profession. Please look at the comments on my book and see if you find resonance.
http://www.amazon.com/gp/product/0471970476/ref=pd_rvi_gw_1/103-1366852-0289401?ie=UTF8
----------------------------------------------------------------------

straw men and real applications
written by Steve B., July 24, 2007
Fant's argument, as excerpted in the article, appears to me to define "algorithm" extremely narrowly, so that in fact it is of limited applicability. Would anybody realistically say an operating system isn't built of algorithms, just because it's intended never to halt, or because it's intended to respond to inputs at unpredictable times? Would anybody realistically say a randomized algorithm, or a nondeterministic algorithm, doesn't qualify as an algorithm? Would anybody realistically say a circuit diagram doesn't qualify as an algorithm, just because it happens to be instantiated in simultaneous electronic components rather than sequential program steps? Nonsense! If that's really Fant's position (and no, I haven't read the book either), it's the flimsiest of straw men.

In fact, computer scientists HAVE been working for decades to find better ways to understand and express processes. And guess what tools and language they've been using to do it? MATHEMATICS, because process is an abstract concept, and mathematics is the best way anybody's found to express abstract concepts precisely.

Indeed, mathematics could almost be defined as the practice of abstraction: deciding which irrelevant details to ignore so as to see the essence of a problem more clearly. That happens to also be an incredibly useful technique in programming: if you find yourself writing similar code several times, it's a sign that you need to step back and refactor, creating an abstraction of the several pieces of similar code; the result is usually much shorter, clearer, more maintainable, and often more efficient than the original repetitive version was.
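The refactoring point can be sketched in a few lines. A minimal example (the functions and data are invented for illustration): three near-identical blocks collapse into one abstraction over "label plus transformation".

```python
# Before: the same shape of code written three times.
def report_before(prices):
    lines = []
    total = sum(prices)
    lines.append(f"total: {total:.2f}")
    taxed = sum(p * 1.2 for p in prices)
    lines.append(f"taxed: {taxed:.2f}")
    halved = sum(p / 2 for p in prices)
    lines.append(f"halved: {halved:.2f}")
    return lines

# After: the shared shape is abstracted into (label, transform) pairs.
def report_after(prices):
    variants = [("total", lambda p: p),
                ("taxed", lambda p: p * 1.2),
                ("halved", lambda p: p / 2)]
    return [f"{label}: {sum(f(p) for p in prices):.2f}"
            for label, f in variants]
```

The second version is shorter, and adding a fourth variant is one line instead of three; that is the "essence of the problem" the abstraction exposes.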

It takes mathematical thinking and mathematical language to design an operating system that's not vulnerable to deadlock, or to design a large network that's fail-safe against any one node going down, or to design an error-correcting code that sends data safely over noisy communication lines, or to understand the hierarchy of classes in a large OO program, or to express clearly the possible sequences of paths a user can take through a GUI, or to design a cryptographic protocol by which I can convince you that I am who I say I am, without enabling you to convince somebody else that you are me. These are practical things that affect how real programs really work. Does every programmer need enough mathematical background to invent such things? No, but every programmer should have enough mathematical background to at least understand them, so as to be able to implement them.

As a high school hacker, I decided to write a flight simulator. I already knew the necessary trigonometry, but my program refreshed the screen at about one pixel per second. A few years later I took linear algebra, which provides the tricks it would take to make it run thousands of times faster, enough to be useful. In that same linear algebra course, we studied something called eigenvalues and eigenvectors. They never made sense to me at the time, but guess what's at the heart of the original Google page-ranking algorithm, which helped Google produce better search results than most of the other search engines and become the industry leader.
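The eigenvector idea behind the original page-ranking algorithm fits in a few lines. A hedged sketch (the tiny three-page "web" is invented for illustration): repeatedly multiplying by a column-stochastic link matrix converges to its dominant eigenvector, whose entries rank the pages.

```python
# Power iteration: v converges to the dominant eigenvector of the
# matrix, i.e. the fixed point of v = M v for a stochastic M.

def power_iteration(matrix, steps=100):
    n = len(matrix)
    v = [1.0 / n] * n                 # start uniform
    for _ in range(steps):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = sum(w)
        v = [x / norm for x in w]     # renormalize
    return v

# Link matrix for 3 pages: column j splits page j's score among
# the pages it links to. Page 2 links only to page 0, etc.
M = [
    [0.0, 0.5, 1.0],
    [0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0],
]
ranks = power_iteration(M)   # page 0, linked from everywhere, ranks highest
```

No numerics library needed; the whole trick is the linear algebra fact that iteration converges to the eigenvector with the largest eigenvalue.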

Should computer science students also learn something about dealing with people? Absolutely, and in my classes they get a lot of teamwork practice. At the same time, they get a lot of practice at abstraction and generalization. It may not involve numbers or integral signs, but it's very definitely math.
----------------------------------------------------------------------

British Education Ministry
written by Quinn, July 10, 2007
This article sounds like it was written by the British Ministry of Education: having successfully destroyed academic physics, they're now looking to do the same with computer science.

----------------------------------------------------------------------

<Regarding Algorithms>
----------------------------------------------------------------------