\def\wysiwyg{{\it wysiwyg}}
\bigskip
[The following article is the text of a presentation given at the Royal College of Art on December 8th, 1988. The theme of the meeting was `Training for Desktop Publishing', and it was intended as a vehicle to inform `management' about some of the features of dtp which vendors rarely reveal. In the event, many of the other speakers were representatives of various software and hardware vendors. But having written it, it seemed a pity not to put it out into the world. Some of the article comprises fairly straightforward background material. I believe that this is an area which we seldom address well (myself included). So many people have a totally inaccurate view of \TeX\ and \LaTeX. Just glance at what the computer press has to say about \TeX, should its denizens ever deign to review it: clearly journalism requires neither research nor serious application. But as long as we fail to reach out and communicate, the fault is as much ours. We have nothing to lose except \TeX.]
\bigskip

\let\big=\bf
\centerline{\big\TeX, training, and a few other windmills}

\section{\TeX: background}

Let's first of all establish what \TeX\ is, and how you use it. \TeX\ is a computerised typesetting system which has been under development since about 1978, and whose development effectively ceased in 1982 or thereabouts; any work done since then has been purely bug fixes. It was created by an academic, Professor Donald Knuth of Stanford University. Knuth was dissatisfied with the quality of typesetting which his publishers were managing to achieve and, basically, he decided to do it right. His own books are largely mathematical in nature, and he gave a great deal of attention in \TeX\ to setting technical text. Even now, there is no computerised typesetting system which can hold a candle to \TeX\ for setting mathematical equations. Some of us would be bold enough to argue that its typographic excellence goes beyond mere mathematics.

All this despite the fact that the whole of \TeX\ resides in the public domain. Anyone can adapt or adopt its algorithms (or even its code) and make them part of their own product. There are a number of commercial organisations who will supply their own particular implementations of the software, and who will provide the backup services normally associated with software suppliers -- help, advice, consultancy, teaching, etc.

\TeX\ was developed on a Digital TOPS-20 system at Stanford: a mini-computer with probably less power than a Macintosh II or a 386 pc. Despite the fact that it started out on traditional multi-user computer systems, it was sufficiently well written that it could be transferred to all sorts of other machines. Today \TeX\ runs on almost every flavour of computing engine. The minimum requirement is that the machine supports 16-bit arithmetic. This encompasses the IBM pc (in all its variations), the Apple Macintosh (in all its varieties), the Atari ST, the Amiga, and even more obscure personal machines. \TeX\ is not fussy about where it lives. The micro versions of \TeX\ are not cut-down versions of the mainframe program: they are functionally equivalent in every detail.
\TeX\ not only runs on a variety of machines, it produces identical output for identical input, no matter what machine you choose -- no matter what output device you choose. \TeX\ has been driving phototypesetters and laser printers since 1978.

But \TeX\ isn't particularly a desktop publishing system. To a user of \TeX\ on a Cray supercomputer, the desktop would have to be very strong. To another, using his Macintosh at home, it would look pretty average. But both would have the same facilities at their fingertips. This gives \TeX\ users a very mobile quality. When they move from machine to machine they don't have to relearn \TeX, although of course they might have to relearn the details of the operating system and their editor.

\TeX\ would probably rather be called a document preparation system. It's really designed for producing books, scientific papers, and the other paraphernalia of the academic world. Despite that, it is also used to produce manuals, newsletters, magazines, the odd letter and quite a lot of lecture notes. Given that it was designed by an academic, with rather academic purposes in mind, it isn't altogether surprising that \TeX\ spread to other people in universities. In general they each had a fairly specific need: they wanted to be able to type in their own work. Why did they want to do it themselves? Not really out of any feeling of altruism, but simply because typing scientific material is often fraught with errors for the uninitiated (always assuming that you could read the handwriting). A typist who does not know that this squiggly thing is a $\zeta$, and this one a $\gamma$, may have difficulty distinguishing $\sum$ from $\Sigma$. One way out of the fairly inevitable revision/edit cycle is to do the job yourself.

So the spread of \TeX\ is largely a grass-roots one. It has spread because there was nothing else which was available (at least, not generally). It has spread amongst a sector of the population with rather limited spending power, and with a distinctly unglamorous public image. There has been no media hype to tell us how wonderful \TeX\ is, and how we can't possibly manage without it. Despite its unparalleled excellence in setting penalty copy, you don't see it featured in any expensive Seybold Reports. Nobody is getting rich out of \TeX\ (well, hardly anyone -- and certainly not Knuth).

\TeX\ is also unfashionable in the sense that it is not a direct-manipulation \wysiwyg\ system. You type page layout commands within your text, whose effect you will not see until you run the text through the \TeX\ program. Now this is quite an interesting contrast to the current wave of dtp systems like Ventura, PageMaker, Fleet Street Editor, and so on. If we examine the roots of these systems we can identify two starting points: the typing root, and the typesetting root. Of course, the typewriter was designed originally to emulate printing (which itself was of course designed to emulate the work of scribes and copyists). Nevertheless, we can note immediately that the notion of `what you see is what you get' is deeply embedded in typewriting, but that in the typesetting tradition, only those who could read intaglio mirror images of signature printing could guess what they were likely to get. Both traditions have co-existed for some time. There are many who would argue that it is not only irrelevant but positively distracting to see the layout of your text as you type it in.
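\par\smallskip\noindent
To make the contrast concrete, here is the sort of fragment a \TeX\ user actually types. It is a made-up scrap, displayed verbatim on the assumption that verbatim display macros like those used to typeset {\sl The \TeX book} itself are to hand; plain \TeX\ provides no verbatim mode of its own.
\begintt
The zeta function may be written as the sum
$$\zeta(s) = \sum_{n=1}^{\infty} {1\over n^s},$$
valid whenever the real part of $s$ exceeds 1.
\endtt
\noindent Nothing in the file looks like a finished page. Only when it is run through \TeX\ does the displayed equation appear, with the summation sign at display size and its limits set above and below, and the spacing around the relation and the fraction looked after automatically.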
Many believe that content and structure are inherently distinct. Certainly, in typesetting a book, it would be rather irritating to have to recall the exact details of the layout of each chapter opening.

Before I finally get down to saying something about training, let me quote the following from Arthur Keller, one of Knuth's associates: {\sl ``The use of \TeX\ offers not only the potential for producing material of the highest quality, but also the possibility of producing unreadable or un\ae sthetic documents. Elegant typesetting is not a substitute for clear, refined prose.''}

\section{Training}

My own experience with \TeX\ over 5 or 6 years indicates that there are many forms that training can take:

\smallskip\noindent
$\bullet$ There are people who will use \TeX\ without any training whatsoever. Armed only with the book (\TeX\ has a rather good 500-page book which goes with it) they will quite happily produce their thesis or book. This level of sophistication is no challenge to them at all. They probably have more difficulty with the mechanics of typing than with any other aspect of the problem.

\smallskip\noindent
$\bullet$ There are many who learn from others. It seems that once you have a `critical mass' of users, they become self-sustaining. Individually they may not know a great deal, but by pooling information, they solve most problems.

\smallskip\noindent
$\bullet$ Many people like to go on courses. Those who go on a course with a problem to solve, like a thesis to write, assimilate the information much more quickly than those along for the ride. Directed learning has no equal.

\smallskip\noindent
$\bullet$ The best courses allow people to use the system. Now this actually presents a problem, since once you encourage people to sit at a micro or terminal, you have to assume some level of competence with keyboard and computer. In my own courses, this has been the single biggest problem. I teach \TeX\ on IBM PCs, in an environment where we also have multi-user mainframe machines. Mainframe people, unfamiliar with micros, can find them awkward and unfriendly. Some positively relish learning a new system, others dread it. Given that we are covering new material, we cannot afford to let people be left behind because they can't use the editor. Lose them at the beginning, and you've lost them for good. Another consequence of practical work is that classes have to be reasonably small -- you can't spend ten minutes with each member of a 20-strong class. And practical work tends to be slow.

\smallskip\noindent
$\bullet$ Let's dispel the myth that says that you can't teach something this complicated to secretaries and typists. What nonsense! This is usually the refuge of some academic (read `manager') who doesn't really understand what is going on, feels he/she ought to, but does not want to be upstaged by people to whom he/she feels intellectually superior. But courses must be tailored to their audience.

\smallskip\noindent
$\bullet$ People who want to be trained usually do better than those who are drafted.

\medskip
\section{The windmills}

There is much more to `publishing' than merely putting the words down on the page. The whole history of professional publishing demonstrates that the `origination' side of things, although not without its own special problems, is only one part of the process. Within the desktop and electronic publishing world the following points may be made.

\smallskip\noindent
$\bullet$ Producing the typeset-quality output is usually only the beginning of the problem.
This requires amplification. The nuts and bolts of any system can almost always be mastered, whether intuitively, by rote, or with some understanding. The mechanical elements seldom present an overwhelming problem. What presents the problem is, to paraphrase Beatrice Warde's words, `making the type invisible'. As far as technical typesetting is concerned, the type ought to be transparent. It ought not to be a barrier to understanding. That's one reason that technical texts tend to be set in a rather bland variety of a `modern' serifed typeface. It's what we are used to. If we were preparing advertising material, our objectives might be rather different. There is nothing competitive about a maths book. The competition is between `shall I read the book or dig the garden', not `which article shall I read first', as might be the case in a magazine.

\smallskip\noindent
$\bullet$ Insufficient attention is given to design. We all seem to think we know what books should look like. Our level of criticism seldom advances above `I know what I like'. \TeX\ (or, more precisely, \LaTeX) has made some attempt to tackle the design problem, by providing a series of highly structured templates which will guarantee generation of an acceptable design for books, articles, reports, and so on. Not one of these could be described as exciting, but we are not in the business of exciting by our design: we are out to excite and inflame by the contents. Nevertheless, I still see students determined to emulate the typewriter. If you are preparing a thesis, the regulations do tend to assume that the typewriter is the medium, so there is some excuse.

\smallskip\noindent
$\bullet$ Where are the typographers? One of the real reasons that much desktop publishing is execrable is that typographers have been shirking their responsibilities. They probably were not involved in the desktop revolution. The apocryphal tale is told that Hermann Zapf (of Dingbats fame), who designed Optima, was not even asked by Adobe for his comments on their digital version. But to many typographers (and I'm using the broad term which encompasses type designers as well as the people who are concerned with the placement of the mass of type) the electronic revolution seems to have induced panic, rather like a rabbit fascinated by a car's headlights, and they seem unable (with a few notable exceptions) to communicate with us, except perhaps to tell us that we're doing it all wrong. That typographers had nothing to do with the design of the Macintosh interface is manifest just from the Underline fonts, or from some of those dreadfully unreadable Shadow fonts. Even when the Linotype fonts on the LaserWriter are used, the `base' set universally available encourages people to mix Times and Helvetica as their `serif' and `sans serif' fonts -- not a happy pairing (it is even worse if you need to use a monospaced font, since you will then almost certainly be using Courier -- the three do not blend well). The total lack of clarity in font sizes on the Macintosh is another example of control by people embedded in a typewriting approach. Why will packages allow you to set an excessive measure so easily? They should allow you to do it, but only with perseverance. Almost every dtp package should carry a government health warning!

\smallskip\noindent
$\bullet$ Since we've got managers and decision makers here, let's add that you shouldn't be taken in entirely by the claims made for a package. In the hands of an expert, the package, whatever it is, will produce stunning results.
Why then can't your staff achieve the same, instantly? Obviously you know the answer to this, or you wouldn't be here. But there is more to this than mere training. Any technological changeover leads to some reduction in quality. When Gutenberg started running off his 42-line Bible around 1455, there were many who thought that quality had been compromised (we don't actually know when the first book was printed by Gutenberg, or even when the first Bible was printed). When hot metal typesetting was ousted by film setting, many (Donald Knuth among them) felt that standards were in decline. The truth of the matter is that standards do decline, because there needs to be a build-up of experience, practice at doing the job. With any luck, this doesn't take too long.

A major drawback in the computing world is the distressing frequency with which new versions of the software appear. Usually they have fixed a whole host of bugs, and introduced a whole host of new ones. You will also have a range of new features which nobody knows how to use (or even what they are really for). It is probable that the first versions of PageMaker and Ventura (and \TeX\ too, for that matter) were rather limited. From a software supplier's point of view, in a highly competitive market, enhancements are needed to help attract media attention, and to keep the product firmly in focus. Fortunately, from my point of view, \TeX\ is out of this vicious spiral. I can use it instead of relearning it.

\smallskip
\rightline{\sl Malcolm Clark}