This process map states that the process is kicked off by the Individual Tax Return Received trigger. The Enter Details activity is performed by a Mail Handler. Once this activity is performed, the work is deposited into the Pending Returns queue. In the Assess Return activity, a Tax Clerk takes the work from this queue and assesses the tax return. If the return has missing information, then this information is requested from the return filer, and the activity is suspended until the filer provides the requested information (i.e., Requested Info Received trigger). The outcome of the assessment is either a fully processed return, or the initiation of a tax audit. Processed returns go into the Processed Returns queue and are then handled by the automatic Mailout Assessment activity. Auditable returns go into the Auditable Returns queue and result in the automatic generation of an Audit Tax Return process (defined elsewhere).
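The control flow just described can be summarized as a small routing table. The Python representation below is only an illustrative sketch (the triple and dictionary layout are our own invention, not notation from the process map); the activity, performer, and queue names are taken from the text:

```python
# Hypothetical encoding of the process map: (activity, performer, output queue(s)).
# A dict value means the activity branches on its outcome.
process = {
    "trigger": "Individual Tax Return Received",
    "flow": [
        ("Enter Details", "Mail Handler", "Pending Returns"),
        ("Assess Return", "Tax Clerk", {"processed": "Processed Returns",
                                        "auditable": "Auditable Returns"}),
        ("Mailout Assessment", "automatic", None),
        # Auditable returns spawn a separate process, defined elsewhere:
        ("Audit Tax Return", "automatic", None),
    ],
}

# The branch in Assess Return is what distinguishes the two possible outcomes:
assess = dict((name, queues) for name, _, queues in process["flow"])["Assess Return"]
print(sorted(assess))
```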
If you ask why variable names in this particular language should have this strange requirement, the answer has to do with ensuring that the Perl interpreter – the software which “understands” your lines of code and translates them into actions within the workings of the computer – can resolve any line mechanically and without ambiguity. Any programming language has to make compromises between allowing users to write in ways that feel clear and natural to human beings, and imposing constraints so as to make things easy for the computer, which cannot read the programmer’s mind and has to operate mechanically. Requiring dollar signs on variable names is a constraint which gives such large clues to the Perl interpreter that it frees the language up to be easygoing and tolerant of humans’ preferred usage in other respects. Although many other programming languages impose no similar requirement on variable names, overall they are more rigid than Perl, forcing users to code in less natural ways.
To the extent that there seems to be something of an Artinian condition (no infinite descending chain) on natural language constructions, we may want to inquire whether there is a Noetherian condition (no infinite ascending chain) as well. A construction whose head is some lexical category c is said to be a projection of c: the idea is that we obtain more and more complex constructions by successively adjoining more and more material to the head lexical entry. Can this process terminate in some sense? Linguistics has traditionally recognized phrases as maximal projections (i.e. as constructions that can no longer be extended in nontrivial ways). The most important example is the noun phrase, which is effectively closed off from further development by a determiner or quantifier. Once this is in place, there is no further adjective, numeral, or other modifier that can be added from the left (compare *three the books, *three every books to the three books, every three books) and only relative clauses are possible from the right (the three books that I saw yesterday). Once such a that-clause is in place, again there is no room for different kinds of modifications. Further relative clauses are still possible (the three books that I saw yesterday that you bought today), but no other kind of element is. Other notable examples include the verb phrase (VP), the prepositional phrase (PP), the adjectival phrase (AP), and the adverbial phrase (AdvP) – since this covers all major categories, it is commonly assumed that every construction is part of a maximal (phrasal) construction that can be further extended only by the trivial means of coordination.
A class hierarchy is usually illustrated using a simple graph notation. Figure 8.13 illustrates the UML notation that we will be using in this book. Each class is represented by a box which is labeled with the class name. Inheritance between two classes is illustrated by a directed line drawn from the derived class to the base class. A line with a diamond shape at one end depicts composition (i.e., a class object is composed of one or more objects of another class). The number of objects contained by another object is depicted by a label (e.g., n).
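The two UML relationships can be mirrored directly in code. The Python sketch below is purely illustrative (the class names Shape, Circle, and Drawing are hypothetical, not from the book): inheritance corresponds to the directed line from derived class to base class, and composition to the diamond-ended line, with the contained objects held in a collection whose size plays the role of the label n:

```python
class Shape:  # base class (target of the UML inheritance arrow)
    def area(self):
        raise NotImplementedError

class Circle(Shape):  # inheritance: arrow drawn from Circle to Shape
    def __init__(self, r):
        self.r = r
    def area(self):
        return 3.14159 * self.r ** 2

class Drawing:  # composition: a Drawing is composed of n Shape objects
    def __init__(self, shapes):
        # the diamond sits at the Drawing end; `shapes` holds the n parts
        self.shapes = list(shapes)
    def total_area(self):
        return sum(s.area() for s in self.shapes)

d = Drawing([Circle(1), Circle(2)])
print(len(d.shapes), round(d.total_area(), 5))
```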
Because the kernel does not synchronize accesses to shared memory, you must provide your own synchronization. For example, a process should not read from the memory until after data is written there, and two processes must not write to the same memory location at the same time. A common strategy to avoid these race conditions is to use semaphores, which are discussed in the next section. Our illustrative programs, though, show just a single process accessing the memory, to focus on the shared memory mechanism and to avoid cluttering the sample code with synchronization logic.
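In the same single-process spirit, the following sketch uses Python's standard-library wrapper over shared memory segments (`multiprocessing.shared_memory`) rather than the C API used in the book; the write and readback below stand in for what two separate processes would do, with the semaphore logic deliberately left out:

```python
# Single-process demonstration of a shared memory segment.
# In a real multi-process program, the write and the read would happen in
# different processes and would need semaphore-based synchronization.
from multiprocessing import shared_memory

shm = shared_memory.SharedMemory(create=True, size=64)
try:
    message = b"hello, shared memory"
    shm.buf[:len(message)] = message           # the "writer" side
    readback = bytes(shm.buf[:len(message)])   # the "reader" side
    print(readback.decode())
finally:
    shm.close()    # detach this process's mapping
    shm.unlink()   # remove the segment once no process needs it
```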
In my younger years, I was fortunate to meet a few outstanding scientists and to have an opportunity to take their classes and talk to them. I learned logic from Nikolai Shanin (1919– 2011), Sergei Maslov (1939–1982) and Grigori Mints (1939–2014), and artificial intelligence from John McCarthy (1927–2011) and Raymond Reiter (1939–2002). The most important influence on my professional work, next to that of my teachers, came from Michael Gelfond, an old friend and one of the founding fathers of answer set programming. Michael has read a draft of this book and suggested many ways to improve it. Useful comments have been provided also by Amelia Harrison, Roland Kaminski, Joohyung Lee, and Alfred Zhong.
Storage for local data can be allocated statically or managed via the normal stacking mechanisms of a block-structured language. Such strategies are not useful for the program text, however, or for the tables containing contextual information. Because of memory limitations, we can often hold only a small segment of the program text in directly-accessible storage. This constrains us to process the program sequentially, and prevents us from representing it directly as a linked data structure. Instead, a linear notation that represents a specific traversal of the data structure (e.g. prefix or postfix) is often employed. Information to be used beyond the immediate vicinity of the place where it was obtained is stored in tables. Conceptually, this information is a component of the program text; in practice it often occupies different data structures because it has different access patterns. For example, tables must often be accessed randomly. In some cases it is necessary to search them, a process that may require a considerable fraction of the total compilation time. For this reason we do not usually consider the possibility of spilling tables to a file.
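The replacement of a linked structure by a linear traversal can be made concrete with a tiny expression tree. The sketch below (the `Node` class and `postfix` function are illustrative, not from the book) flattens the tree a + (b * c) into its postfix linearization, which can then be processed strictly left to right:

```python
# A linked expression tree and its postfix linearization.
class Node:
    def __init__(self, op, left=None, right=None):
        self.op, self.left, self.right = op, left, right

def postfix(node, out=None):
    """Emit the operators/operands of `node` in postfix order."""
    if out is None:
        out = []
    if node.left:
        postfix(node.left, out)
    if node.right:
        postfix(node.right, out)
    out.append(node.op)
    return out

# a + (b * c)
tree = Node('+', Node('a'), Node('*', Node('b'), Node('c')))
print(postfix(tree))  # ['a', 'b', 'c', '*', '+']
```

Once the tree is in this form, the pointers are gone: a compiler pass can consume the list sequentially, which is exactly what the memory constraints described above force it to do.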
• For a very big yellow page list, one may want to speed up the dictionary building process by using two concurrent tasks (threads or processes). One task reads name-phone pairs from the head of the list, while the other reads from the tail. The building terminates when the two tasks meet at the middle of the list. What will the binary search tree look like after the building finishes? What if you split the list into more than two parts and use more tasks? • Can you find any more cases to exploit a binary search tree? Please
Is there another way? Can we model state change in the real world using only immutable functions? Taking mathematics as a guide, the answer is clearly yes: A time-changing quantity is simply modeled by a function f(t) with a time parameter t. The same can be done in computation. Instead of overwriting a variable with successive values, we represent all these values as successive elements in a list. So, a mutable variable var x: T gets replaced by an immutable value val x: List[T]. In a sense, we trade space for time – the different values of the variable now all exist concurrently as different elements of the list. One advantage of the list-based view is that we can “time-travel”, i.e. view several successive values of the variable at the same time. Another advantage is that we can make use of the powerful library of list processing functions, which often simplifies computation. For instance, consider the imperative way to compute the sum of all prime numbers in an interval:
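The book's own snippets are in Scala; as a hedged stand-in, the contrast can be sketched in Python. In the imperative version a single mutable `total` is overwritten at every step; in the list-based version all of its successive values exist at once, so we can inspect the whole history:

```python
# Hypothetical Python rendering of the two styles (the book uses Scala).
from itertools import accumulate

def is_prime(n):
    return n >= 2 and all(n % d for d in range(2, int(n ** 0.5) + 1))

# Imperative: one mutable variable, overwritten at every step.
def sum_primes(lo, hi):
    total = 0
    for n in range(lo, hi + 1):
        if is_prime(n):
            total += n
    return total

# List-based: the successive values of `total` all exist concurrently,
# so we can "time-travel" through them.
primes = [n for n in range(2, 20) if is_prime(n)]
history = list(accumulate(primes))
print(history)          # every intermediate value of the running sum
print(sum_primes(2, 19) == history[-1])
```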
The core of the material in this book grew out of laboratory classes and coursework prepared by the author for second-year computer science students at Bradford University, as part of the lecture course Symbolic and Declarative Computing – Artificial Intelligence. This is a two-semester course with an introduction to Functional Programming with Haskell, Logic Programming with Prolog and the basics of AI. The choice of examples and topics for this book is of course tinged by the context in which Prolog was presented. For example, I discuss the functional programming style since it is useful in producing concise, readable and elegant implementations also in Prolog. The selection of topics for the examples was influenced in part by the AI element of the course, though much new material has found its way into the book. To make set problems more easily accessible for the reader, I subdivide the overall task into manageable portions, indicating in each the desired outcome (if applicable, in the form of a sample session in Prolog) with suggestions for how best to attack the subtasks.
Exercise 2.14. (This exercise explores the idea mentioned in footnote 14, p. 83.) The ‘hand-knit’ solution outlined in Sect. 2.8.2 involved a manual implementation of link/2 by defining it by Prolog facts. These facts were, of course, specific to the puzzle to be solved. Having now defined link/2 by rules not referring to the particulars of the puzzle at hand, we have been able to automate the solution process. An alternative, closer to the original idea, would be to define link/2 in the database automatically, by the facts applicable to the particular problem. Use link/2 to define by facts an equivalent new link predicate and use it to solve the loop puzzle. Determine the number of nodes and the number of directed edges of the corresponding network. Determine these quantities also for the network associated with the ‘hand-knit’ solution (Sect. 2.8.2) to confirm
Most games rely almost exclusively on surface blits for their drawing (as opposed to drawing with individual pixels). For example, consider the game Civilization: Call To Power (which was ported to Linux using SDL). Other than the lines used to indicate paths and gridpoints, every character and building that you can see is stored in memory with surfaces, and they are drawn on the screen with blits. All of the game’s artwork was created by artists and stored in files. The game assembles its screen images almost entirely out of these predrawn graphics. We will now examine a series of SDL video-programming examples. It would be a good idea to compile and run each of these examples and to tweak them until you understand how they work. Don’t worry about typing in all of the examples; they are available on the book’s Web page. Throughout the rest of the chapter (and throughout chapters to come) we’ll make note of important structures and functions with references like this:
Purpose of this book. This short treatise is intended to serve as a text for a freshman-level college course that, among other things, addresses the issues mentioned above. The book investigates interrelationships between mathematics and music. It reviews some background concepts in both subjects as they are encountered. Along the way, the reader will hopefully augment his/her knowledge of both mathematics and music. The two will be discussed and developed side by side, their languages intermingled and unified, the goal being to break down the dyslexia that inhibits their mental amalgamation and to encourage the analytic, quantitative, artistic, and emotional aspects of the mind to work together in the creative process. Musical and mathematical notions are brought together, such as scales/modular arithmetic, octave identification/equivalence relation, intervals/logarithms, equal temperament/exponents, overtones/integers, tone/trigonometry, timbre/harmonic analysis, tuning/rationality. When possible, discussions of musical and mathematical notions are directly interwoven. Occasionally the discourse dwells for a while on one subject and not the other, but eventually the connection is brought to bear. Thus you will find in this treatise an integrative treatment of the two subjects.