
The third main programming language paradigm is logic programming (the other two being imperative and functional). Although logic-based languages and functional languages differ in many ways, they are similar enough to share the same application domains. They also share several characteristics that make parallel programming in them desirable. For example, in both, the outcome of an expression does not depend on the context of the running program (they are side-effect free).

Proponents of logic languages, however, believe that logic languages have advantages over functional programming. One is the process of unification, which allows for bidirectional computation. A sorting program, for example, could run backwards: given a sorted list, it can produce a (nondeterministic) permutation of that list. Takeuchi[1] also claims that logic language programmers are freed from having to think about the order of evaluation of expressions.
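To make the idea of unification concrete, here is a minimal sketch of syntactic unification in Python. The term representation (variables as capitalized strings, compound terms as tuples) and all names are illustrative assumptions, not taken from any particular Prolog implementation:

```python
# Minimal sketch of syntactic unification (illustrative, not a real
# Prolog engine). Variables are strings starting with an uppercase
# letter; compound terms are tuples like ("parent", "bob", "bill").

def is_var(t):
    return isinstance(t, str) and t[:1].isupper()

def walk(t, subst):
    # Follow variable bindings until a non-variable or unbound variable.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst=None):
    """Return a substitution making a and b equal, or None on failure."""
    subst = dict(subst or {})
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        subst[a] = b
        return subst
    if is_var(b):
        subst[b] = a
        return subst
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        # Unify compound terms element by element, threading the substitution.
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None
```

Note that variables on either side of the match can become bound, e.g. unifying ("parent", "X", "bill") with ("parent", "bob", "Y") binds both X and Y. This symmetry is what lets a logic program run "backwards" from outputs to inputs.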

Prolog, of course, is the mother of all logic languages.[2] A Prolog program consists of:

  • facts, such as "parent(bob, bill)." which we can translate as "bob is a parent of bill" (although Prolog does not care about the interpretation of clauses).
  • rules, such as "ancestor(X,Y) :- parent(X,Y)." which we can translate as "X is an ancestor of Y if X is a parent of Y".
  • goals, such as "?- ancestor(bob, mary)." which we can think of as asking the question "is bob an ancestor of mary?"
We can build a set of rules for ancestor like this:

        ancestor(X,Y) :- parent(X,Y).
        ancestor(X,Y) :- parent(Z,Y), ancestor(X,Z).
This can be translated as "X is an ancestor of Y if either X is a parent of Y, or Z is a parent of Y and X is an ancestor of Z."
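A Python analogue of the two ancestor clauses may help clarify how they combine. The parent facts below are invented for illustration, and this sketch only answers ground yes/no queries (real Prolog would also enumerate bindings, and would need care with cyclic facts):

```python
# Illustrative parent facts (invented for this example).
PARENTS = {("bob", "bill"), ("bill", "mary")}

def parent(x, y):
    return (x, y) in PARENTS

def ancestor(x, y):
    # Clause 1: ancestor(X,Y) :- parent(X,Y).
    if parent(x, y):
        return True
    # Clause 2: ancestor(X,Y) :- parent(Z,Y), ancestor(X,Z).
    # Try every known individual Z as the intermediate parent.
    return any(parent(z, y) and ancestor(x, z) for z, _ in PARENTS)
```

Querying ancestor("bob", "mary") succeeds through the second clause: bill is a parent of mary, and bob is an ancestor of bill.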

Logic languages can exploit two kinds of implicit parallelism.[3] The first is or-parallelism, which arises when there are several rules for one relation (as for ancestor above); in this case, each rule can be evaluated concurrently. The other is and-parallelism, which arises when there are multiple goals (terms) within a rule body, such as the two goals in the second ancestor clause; in this case, each goal may be evaluated concurrently. With and-parallelism, if one goal fails, all the other goals being evaluated concurrently have to be stopped.
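Or-parallelism can be sketched in Python by submitting the two ancestor clauses to a thread pool and succeeding as soon as either one does. The fact data, function names, and thread-based scheduling are illustrative assumptions; real parallel Prolog systems use far more sophisticated schedulers:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

# Illustrative parent facts (invented for this example).
PARENTS = {("bob", "bill"), ("bill", "mary")}

def clause1(x, y):
    # ancestor(X,Y) :- parent(X,Y).
    return (x, y) in PARENTS

def clause2(x, y):
    # ancestor(X,Y) :- parent(Z,Y), ancestor(X,Z).
    return any((z, y) in PARENTS and ancestor_or_parallel(x, z)
               for z, _ in PARENTS)

def ancestor_or_parallel(x, y):
    # Or-parallelism: evaluate both clauses concurrently and
    # accept the first one that succeeds.
    with ThreadPoolExecutor(max_workers=2) as pool:
        futures = [pool.submit(clause, x, y) for clause in (clause1, clause2)]
        for future in as_completed(futures):
            if future.result():
                return True
    return False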

Parallel logic programming is particularly useful in the problem domains where logic programming has traditionally excelled, such as natural language processing and expert systems, although it has been used for other things as well. It is attractive in the same way as functional languages: code is much more concise, simpler to write, and easier to reason about. In terms of speed, however, some (admittedly old) tests of parallel vector addition show that parallel logic programming, while achieving essentially the same speedup as hand-coded C, is more than 30 times slower in absolute terms.[4]

[1] Akikazu Takeuchi, Parallel Logic Programming, John Wiley & Sons, Inc., 1992.
[2] Robert W. Sebesta, Concepts of Programming Languages, Fourth Edition, 1999.
[3] Gopal Gupta and Enrico Pontelli, "Last Alternative Optimization for Or-parallel Logic Programming", Parallelism and Implementation of Logic and Constraint Programming, 1999.
[4] Evan Tick, Parallel Logic Programming, MIT Press, 1991.
