
Commentary on a British Geological Survey Computing Archive 1965-85 by T.V. Loudon

Author: Dr T V Loudon, c/o BGS, Murchison House, Edinburgh EH9 3LA, Scotland

Bibliographical reference: Loudon, T.V., 1996. Commentary on a British Geological Survey Computing Archive (1965-85). British Geological Survey Technical Report WO/96/3.

NERC copyright 1996.

Edinburgh, Scotland: British Geological Survey, 1996.

Introduction

It has been my good fortune to devote my career after 1959 to the study of computer methods in geology, watching with fascination as electronics began to fulfil their promise, if not our early expectations. Since 1969 I have worked within the British Geological Survey (BGS), and a number of documents have been in my care which chronicle the impact on BGS of these astonishing developments, and which may, at some future time, be of interest to historians of the shifting paradigm. My sixtieth birthday having decreed retirement, the documents have been lodged with the BGS Library in Keyworth, Nottingham, England, though not necessarily for public access. Enquiries should be addressed to the Chief Librarian. It may be helpful to publish this brief commentary as a marker of their existence and as an eye-witness view of the impact on a facet of geology of mankind's most amazing machine.

Note

The archived papers are stored in a set of numbered envelopes, with restricted access, at the British Geological Survey Library, Keyworth, Nottingham NG12 5GG, England. References in the text, such as [1], are to the number of the envelope. A list of the envelopes' contents is given in the appendix at the end of this report.

Computing in the 1960s

Most of the archived documents were prepared by others, but there is no point in pretending to write this from other than my own personal viewpoint, with all its limitations. Briefly, then, I obtained a geology degree from Edinburgh University, Scotland, in 1956. After two years as well-site geologist with British American Oil in western Canada, I joined the Alberta Oil and Gas Conservation Board in Calgary. In 1959, I was given the task (when time allowed) of looking into the possibility of computerizing their well index. Two obvious conclusions were, first, that this hopelessly expensive and unreliable machine should not be allowed to interfere with the excellent existing card index, and second, that in time the computer would change the face of geology, as of much else. Wishing to be involved in the change, I resigned and returned to Edinburgh to obtain some basic computing skills during PhD work in the geology department.

Learning statistics and computer programming was a case of finding and reading suitable textbooks, supplemented by a short programming course in Atlas Autocode, and sitting in on a course in statistics for mathematics undergraduates. The University computing facilities were provided by the University of Manchester on their Ferranti Atlas computer, some 200 miles away. Programs and data were prepared locally on paper tape, and the code transmitted, usually overnight, on standard telephone lines. Errors were corrected by the data operator editing the tape and retransmitting. I recall writing and eventually running some small programs for analysis of variance and for trend surface analysis. The geological value of the results was negligible.

The main thrust of my PhD work, which began in 1960, was to attempt to collect a statistically acceptable sample of information on the sedimentology and structural geology of some late Precambrian rocks (Dalradian) near Banff in Scotland with the intention of processing it by computer. One afternoon I sought shelter in a cafe from the rigours of fieldwork in particularly atrocious weather. While brooding over a second coffee, it came to me that by representing my orientation data as direction cosines, their geometrical characteristics might be summarized by methods similar to those of principal component analysis. Furthermore, aspects of geological surfaces could be represented as slopes and similarly processed, thus separating shape characteristics from scaling and locational constraints.
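
The following is a minimal latter-day sketch, in Python, of the eigenvector summary of orientation data described above. It illustrates the idea rather than reconstructing the original programs; the trend and plunge values, and the function name, are invented for the example.

```python
import numpy as np

def orientation_tensor(trend_deg, plunge_deg):
    """Build the 3x3 orientation (scatter) matrix from trend/plunge angles.

    Each measured line is converted to a unit vector of direction cosines;
    the matrix is the sum of the outer products v v^T over all observations.
    """
    t = np.radians(np.asarray(trend_deg, dtype=float))
    p = np.radians(np.asarray(plunge_deg, dtype=float))
    # Direction cosines (north, east, down) of each lineation
    v = np.column_stack((np.cos(p) * np.cos(t),
                         np.cos(p) * np.sin(t),
                         np.sin(p)))
    return v.T @ v

# Invented example: a cluster of lineations plunging gently north-east
trends = [40.0, 45.0, 50.0, 42.0, 47.0]
plunges = [10.0, 12.0, 8.0, 11.0, 9.0]
T = orientation_tensor(trends, plunges)

# The eigenvalues and eigenvectors summarise the geometry independently of
# any single measurement: one dominant eigenvalue indicates a point cluster,
# two comparable large eigenvalues indicate a girdle.
eigvals, eigvecs = np.linalg.eigh(T)
print(eigvals / eigvals.sum())   # normalised eigenvalues
print(eigvecs[:, -1])            # direction of the largest eigenvalue
```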

The concepts seemed full of promise (never quite fulfilled). It was some years before I realised that other work on eigenvectors of distributions of direction cosines was going on at that time in other fields, and twenty years before I realised that I had been enjoying the delights of imported analogy.

In 1964, having completed the thesis, I had the opportunity of spending six months at Northwestern University, Illinois. I wrote programs to implement the eigenvector methods and prepared a report (Loudon, 1964) for the US Office of Naval Research which, for reasons that I found somewhat opaque, funded the research. Here was a geology department with its own key punch equipment, a computer (IBM 709, a precursor of the fully transistorized 7090) on the campus, and graduate geology students who were actually being taught about computer applications, with access to a first-rate course on Fortran programming by Betty Benson. The contact with Bill Krumbein, Tim Whitten and many others was memorable indeed.

Later that year I returned to the UK, as a postdoctoral fellow at the Sedimentology Research Laboratory at Reading University, allegedly investigating the computer modelling of sedimentary basins. The Laboratory obtained an IBM keypunch for my research with the unusual but useful attachment of a printer. Programs could thus be prepared locally and the cards mailed in a fibre box to the Atlas Computer Laboratory, achieving a turnaround of a few days. An attractive alternative was to book priority time on the night shift, travel to the computer, which was an hour's drive away, and hand the work to the operator. On the midnight shift, there was a possibility of getting results within hours, and facilities for editing punched cards were on site. Thus, in the unlikely event of all going well, three consecutive runs might be obtained before dawn.

The attraction of the Atlas computer, run as a national research centre, was its huge capacity for the time, such as its ability to handle arrays of a few thousand data points. Above all, it had a Fortran compiler. There was an informal help system. On enquiring about differences in the handling of some Fortran statements compared with the IBM 709, I was surprised to realise that I was discussing the matter with a developer of the Atlas Fortran compiler (E B Fossey). The Atlas Computer Laboratory was adjacent to its main customers - the Rutherford High Energy Physics Laboratory and the Harwell Atomic Energy Research Establishment - but was outside the perimeter fence, and there was no obvious security. Customers walked in through empty halls and shouted for the operator. There was a closed-circuit television camera, but its purpose was to allow the operator to check movement on the tape decks. He could not always see these directly as the computer had been built on two floors to minimize cable length and thus reduce delays in signal transmission.

The key punch and printer at Reading were used to prepare a printed list of some geologists who computed, with an indication of the equipment they used and the programs they had prepared. A number of editions of the Geologically Oriented Scheme for Sharing Information on Programming (GOSSIP) were distributed to the participants. In April 1969, a final version was published internally [1] as Reading University Geological Report number 3. Nearly 200 contributors from 18 countries reported on their activities. The number of geological computer users had outgrown the list, which was therefore discontinued.

Another document, from 1967 [1], is pithily entitled "The Rokdoc Package - description and listing of a library of routines in Fortran IV for statistical analysis, summary and display of data concerning sedimentary rocks" (see Loudon, 1974). The two documents illustrate some of the problems of those working in a new field. There were few precedents to follow and few, if any, journals which would publish the detailed procedures of the isolated workers in the field. Writing each new program from scratch for each incompatible machine was clearly not an effective use of resources. In-house publication and personal mailing lists were one answer, and Rokdoc at least showed an appreciation of the problem of integration. It remained an isolated package of general-purpose programs which accepted data in a format defined in a prefatory data description, thus allowing the development of a library of datasets on punched cards, which were reasonably compatible with the programs.

With hindsight, that small prototype of a geological data library (Loudon, 1969) was probably worth while, exposing a small group of geologists to the possibilities of computer methods. Delusions of grandeur were never far away, however, and may account for recurrent focussing of effort on areas, not specific to geology, where waiting for external solutions would have made more sense.

Moves towards a national data bank

Most of the British workers in this field at that time could trace their enthusiasm back to an American connection. Bill Krumbein and Dan Merriam were particularly influential, and tireless visitors to UK geology departments. Wider appreciation of the potential of geological computing resulted from publication during the 1960s of the ONR series of reports from Northwestern University and the Kansas Geological Survey Special Publications and Computer Contributions. The International Association for Mathematical Geology was officially founded on 22 August 1968 at the 23rd Session of the International Geological Congress in Prague (just before its final interruption by the armies of the Warsaw Pact). All this led to geological computing emerging by the end of the decade as a recognised and almost respectable activity, which there is no need for me to describe further.

Let me turn therefore to the British Geological Survey and the activities of other groups described in the BGS archives. These developments are outlined by H E Wilson (1985) on pages 177-179 of his history of the Survey, Down to Earth. Bill Read was recognised as the first field geologist in the Survey to take a serious interest in computing, his interest fired by a visit by Krumbein and Merriam in 1965.

In February 1967, Dr K C Dunham (later Sir Kingsley Dunham), Director of the Institute of Geological Sciences (IGS, now the British Geological Survey) in London, appointed a Computer Committee [2,3] to investigate and advise on computer needs. Senior IGS staff had wide ranging discussions with the Ministry of Technology Computer Advisory Service, and with the Atomic Energy Research Establishment and the adjacent Atlas Computer Laboratory (mentioned above) which were funded by the Science Research Council [4]. An investigation of requirements was agreed. The Atlas Computer Laboratory, some fifty miles from London at Chilton, with its remit to provide a national service, provided IGS with initial facilities and advice.

The Committee saw the development of a "national data bank for the Earth Sciences" as a major concern. Three Working Parties made recommendations on a petrographical-lithological code [7], a stratigraphical code [6], and geographical location referencing procedures [7]. These developments involved discussion and consultation with a wide range of other organisations, often at a surprisingly high level, as evidenced by the rather chilly correspondence between the Committee Chairman and a high official of the Water Resources Board discussing detail of such matters as reference numbers for water wells [2].

The Committee also asked the Atlas Laboratory to undertake three pilot studies on the storage and retrieval of hydrogeological, gravity and shallow borehole data. On 2 January 1968 the Committee submitted a report to Director [3], recommending the appointment of a Computer Liaison Officer as the first step towards the formation of a Computer Unit, and the Committee's replacement by a Computer Unit Steering Committee chaired by Director. A data handling office and remote access to the Atlas Computer Laboratory were envisaged, with the future possibility of purchasing small departmental computers. Fifteen members of staff had already attended a programming course, which was seen as the appropriate introduction to computing.

The general tenor of these papers differs strikingly from the majority of the work recorded in the 1969 GOSSIP list [1]. While the academics were exploring the potential of the computer as a glorified calculating machine, analyzing and manipulating the data, simulating processes and presenting the results, the Institute of Geological Sciences, like the US and Canadian Geological Surveys, the National Oceanographic Data Centre, the Smithsonian Institution, and others, emphasised the role of the computer as a glorified card index. Arguments raged about the relative advantages of fixed and free format, that is, whether data items were identified by the columns they occupied on the punched card, or were placed in sequence and delimited by separators such as blank spaces. There were also experiments, recorded in the papers by Gover and Read [3], in recording borehole descriptions with a limited vocabulary of English words and a simple, rigid syntax. This was a British counterpart of the complex and highly coded borehole description system (DASCH, later DASP) which was being developed in the West German geological surveys, as reported later in the papers of the West European Geological Surveys' Advisory Group on Applications of Computers [25,26,27].
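
As a concrete illustration of the fixed versus free format argument, the short Python sketch below reads the same hypothetical record both ways. The column layout and the field names are invented for the example and are not taken from any IGS coding scheme.

```python
# Hypothetical card layout (an assumption for illustration only):
# columns 1-8 borehole identifier, 9-14 depth in feet, 15-20 lithology code.
fixed_card = "BH001234001250SDST  "

# Fixed format: each data item is identified by the columns it occupies.
borehole = fixed_card[0:8].strip()
depth_ft = int(fixed_card[8:14])
lithology = fixed_card[14:20].strip()

# Free format: items appear in sequence, delimited by blank spaces.
free_card = "BH001234 1250 SDST"
borehole_f, depth_f, lithology_f = free_card.split()

print(borehole, depth_ft, lithology)
print(borehole_f, int(depth_f), lithology_f)
```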

The obsession with compact coding arose partly from the mechanics of the punched card and storage limitations and perhaps partly from the card index metaphor. The delight with which Turnbull [1] explains how to save space by distinguishing positive and negative values by combining the minus sign with the initial digit (to give an alphabetic character) is not untypical. Work on semantic coding at the École des Mines in Paris [1] was known in Britain at this time largely through C J Dixon at Imperial College. This involved translating geological terms into codes comprising a sequence of bits, each bit carrying meaning in terms of a logical classification of concepts within the particular field. Thus, all sedimentary rocks, say, could be retrieved by searching the semantic code for the bit implying "sediment" regardless of the name assigned to them in the original record (sandstone, greywacke, etc).
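
The idea of semantic coding can be conveyed by a small sketch, given here in Python with invented categories and bit assignments rather than the actual École des Mines scheme: each bit of the code carries a meaning within a logical classification, and retrieval tests the bit rather than the original rock name.

```python
from enum import IntFlag

class GeoCode(IntFlag):
    """Illustrative bit flags only; the real semantic codes differed."""
    SEDIMENT  = 0b0001
    IGNEOUS   = 0b0010
    CLASTIC   = 0b0100
    CARBONATE = 0b1000

# Each rock name is translated once into a semantic code.
codes = {
    "sandstone": GeoCode.SEDIMENT | GeoCode.CLASTIC,
    "greywacke": GeoCode.SEDIMENT | GeoCode.CLASTIC,
    "limestone": GeoCode.SEDIMENT | GeoCode.CARBONATE,
    "granite":   GeoCode.IGNEOUS,
}

# Retrieval ignores the original name: any record whose code carries the
# SEDIMENT bit is returned, whatever it was called in the field notebook.
sediments = [name for name, code in codes.items() if code & GeoCode.SEDIMENT]
print(sediments)   # ['sandstone', 'greywacke', 'limestone']
```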

From a 1996 perspective, a startling feature of the IGS papers is the confidence and enthusiasm with which managers at the Survey, with at best a rather superficial understanding of computers, developed and drove forward implementations across most departments, despite widespread apathy and scepticism at lower levels. Much senior staff time was assigned without quibble to the untested concept of a geological survey data bank. Direct costs were more cautiously monitored, with Director's approval needed when the total outlay exceeded £1 000. Much of the enthusiasm may have sprung from the interest and inspirational qualities of the Director, Sir Kingsley Dunham, and from contact with exciting developments in other surveys, in government and industry and in geophysics, oceanography, library science and other areas. It reflected partly the atmosphere of the time. Pelican Books, for example, published many popular scientific titles describing the wonders to come [1]. The blurb of one (Sluckin, 1964) begins "Electronic brains as they are often called, inspire such admiration and awe that it has been said that they are almost capable of thought and that their behaviour is in many ways like that of living creatures." The computer was held in such awe by some in IGS that within topic areas, such as borehole records, some workers apparently wished to ensure that all their data would be stored by computer, to a more rigorous specification than before, perhaps fearing that unconverted records would be ignored or lost.

Some dissenting voices were raised [2].

Geophysicists who had been using computers for five years or more recorded their dissatisfaction with the proposed use of the Atlas Laboratory facilities. They worked with paper and magnetic tape and programs in Extended Mercury Autocode through a London bureau. The bureau telephoned the geophysicists with results, and instructions to edit the data and programs were returned by telephone. Fortran and punched cards were seen as an unattractive alternative, and the turnaround time, they suggested, would be up to a week. The Committee optimistically recorded that "those members of the Palaeontology Department who are not convinced of the relevance of A.D.P. [automatic data processing] to their current work will in time join their colleagues who are more eager to profit from its potentialities for the future and in contributing to joint studies." Despite his own reservations, the Chief Palaeontologist took the view that stratigraphy was so fundamental in geology that he would not wish palaeontologists to fall behind in computer implementation for fear of delaying the entire enterprise.

Some external activities mentioned in these documents had a lasting impact on IGS, and information on later collaboration is recorded in separate files. They include: investigations by the Museums Association which resulted in a project, involving Dr J L Cutbill and funded by the Office for Scientific and Technical Information, creating a computer system for palaeontological and other information curated by the Geology Department at Cambridge University [8,9]; the emergence of the National Computing Centre, and their proposal to establish a national program index; the Experimental Cartography Unit [10], led by D P Bickmore, which moved from Oxford to South Kensington and established close links with IGS; and a borehole study by the Edinburgh University Geography Department [5], which was undertaken by D B Rhind who later joined the Experimental Cartography Unit, then moved to Birkbeck College, London, and at the time of writing is Director of the Ordnance Survey and an influential figure in digital cartography.

Reconstructing an unexpressed metaphor is always dangerous. I assume, however, that in general the data bankers felt that there was a core of codifiable structure, an implicit classification scheme, in their science and that values from this schema could be assigned to each data record and stored on a computer. In a data bank at index level, records meeting a detailed set of criteria could be identified and selectively retrieved. The full records might be on paper, on microfilm perhaps set in a punched card (aperture card), or perhaps quantitative data held elsewhere on the computer. Therefore, a major objective of the IGS Computer Committee was to identify and provide standard codes for stratigraphy, petrology and mineralogy, and geographical location. Although many separate files were expected for departmental data collections, they should be cross-referenced at least at the level of these codes [6,7]. Local codes, such as those in the hydrogeological well records, carried the remaining information. With startling ingenuity, codes of one or two characters were devised, each representing much geological data. Concern was expressed about identification codes for such items as boreholes, collectors, specimens and samples. This may partly reflect the difficulties that were then arising from amalgamating collections from different offices which had been curated locally with little overall coordination.
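
A hypothetical index-level record, sketched below in Python, may make the metaphor more concrete. The codes, field names and grid references are invented for illustration and are not those adopted by the IGS working parties.

```python
from dataclasses import dataclass

@dataclass
class IndexRecord:
    """Index-level entry: standard codes only, plus a pointer to where
    the full record (paper file, aperture card, data file) is held."""
    borehole_id: str
    strat_code: str    # standard stratigraphic code
    lith_code: str     # standard petrographical/lithological code
    ngr: str           # National Grid reference for location
    full_record: str   # e.g. registry file or microfilm frame number

index = [
    IndexRecord("SE45SW/12", "CARB", "SDST", "SE 4102 5387", "film 0432"),
    IndexRecord("SE45SW/13", "PERM", "LMST", "SE 4188 5311", "file 1121"),
]

# Retrieval at index level: select by the standard codes, then fetch the
# full record by hand from wherever it is stored.
hits = [r for r in index if r.strat_code == "CARB" and r.lith_code == "SDST"]
for r in hits:
    print(r.borehole_id, "->", r.full_record)
```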

The business case was nowhere considered. To speculate again, the feeling may have been that computers were coming, like it or not, and the Survey must adapt. The Flowers Committee Report of 1966 was influential in proposing a high level of spending in Research Council computing. The funding and the backing were available, and a comprehensive approach might be considered necessary to gain the full benefits. There may even have been a hidden agenda in which computer methods were seen as a tool to overcome barriers and rivalries between individual internal fiefdoms, and to enhance the Survey's dominant position as a source of geoscience data in the UK. A separate initiative had suggested an independently (probably University) based geological data bank [2 - February 1967, Report on Computers in Geology, G Y Craig], which might have been construed as a potential competitor in an area of the Survey's activity. The report highlighted the difficulty of accessing existing data, and the rapid increase in the volume of data being collected. When the IGS initiative was announced, the idea of an independent data bank was dropped.

Standards were being defined by IGS with wide consultation, possibly as a means of securing the IGS position, leaving others to follow, then or later. Standard setting was a world-wide activity as shown in the IUGS report [2, August 1967]. Views that the standards might not hold or even be used, that alternative approaches should be explored, or that changing technology might render a lot of hard work obsolete, were not considered. The immediate costs of data banking were given some thought, but not the much larger costs of maintenance nor the likely change of costs with time.

The value of the data bank to the scientist was not quantified. A revealing comment in a perceptive report by W A Read to the IGS Computer Committee [1] dated 28 February 1967 is: "authorities who were consulted . . . strongly recommend that all the staff should be 'exposed to the computer' and be told how it can be used to solve their particular problems." Mathematicians were thought to have this prodigious gift of problem-solving, and the possibility of recruiting one to the survey was discussed. At the time, Read was vigorously exploring computer applications involving statistics and surface fitting in the Carboniferous of the Glasgow-Stirling area (Read, 1966), exploiting his knowledge of the geology by collaborating with enthusiasts with programming skills, mostly in universities.

The emphasis generally, however, was on data banking as opposed to analysis, particularly on data input and coding. The mechanics of retrieving data were so laborious and inconvenient that one can only suppose that advances in technology (discs, and remote access to the computer by telephone line are mentioned) were confidently expected to improve matters beyond recognition. Presumably it was expected that the output from data retrieval would be prepared in the limited format of the lineprinter, and carried by hand to the user. But most of the computer records were so tightly encoded as to be unreadable, and so complex and file-specific that retrieval programs had little generality. Look-up tables within the computer for translating into English were too demanding on programming skills and processing time for routine use.

For the benefit of younger readers, it may be helpful to point out that most manually prepared paperwork in the late 1960s involved typewriters, which by mechanical linkages from the key, or by using electromagnets, caused a metal slug carrying the form of the appropriate letter to strike an ink-soaked ribbon at the appropriate point on the paper, leaving an image of the letter. Up to four copies could be produced simultaneously by interleaving additional sheets of paper with carbon paper, which left an image of each letter on the sheet beneath. For a larger number of copies, the typewriter ribbon could be removed and the impact of the metal slugs used to create permeable images of the letters on an otherwise impermeable stencil. Ink was messily squeezed by rollers through the image on the stencil onto sheets of paper. Corrections or changes to the typed material were made in handwriting or by retyping the entire document. The process being somewhat laborious, the finished papers were carefully added to bound files, which were passed around as necessary and stored departmentally or in a central registry. The mail system was generally efficient and reliable, if on occasion a convenient excuse for delay. The long-term vision of how the information delivery system might be superseded by new technology remained unconsidered for many years.

Some other activities, previously mentioned, seem to have been based on different metaphors. The Cambridge project was concerned with automating the museum catalogue [8,9], and the Experimental Cartography Unit [10] with automating the production of maps, including geological ones. As later IGS, BGS and NERC papers show, the Survey kept in touch with this overlapping work, but proceeded independently. A reluctance to discard one's own work in favour of another's no doubt played its part, but the objectives did differ, and incompatibilities of hardware and software were major barriers to integration. In retrospect, the guilt which some of us felt at avoiding an integrated national effort may have been misplaced. At this early exploratory stage, innovative and competitive experimentation was probably more fruitful than attempts at routine, coordinated production. Perhaps the greater failing was in supposing that each new departure was a completed contribution to some ultimate all-embracing solution, not just a prototype to learn from and abandon.

The IGS Computer Unit

The IGS Computer Committee recommended the appointment of a Computer Liaison Officer, with a view to establishing an in-house Computer Unit. I applied for the post, and in 1969 was appointed and moved from Reading University to IGS in London. More by accident than design, day-to-day informal jottings of my activities in those early months still exist [11,12]. Disjointed and partly illegible, they nevertheless give a clearer flavour of the environment and range of activities of a new recruit than do the more formal documents.

Director accepted the recommendation that a Computer Unit Steering Committee (CUSC) should succeed the Computer Committee, and skilfully used its meetings to fan the flickering enthusiasm of his so-called Assistant Directors. The CUSC papers speak for themselves, recording the gradual build-up of the Computer Unit. The arrival from universities of staff who had been trained in both geology and computing methods (such as Farmer and Jeffery, who were in post in 1971) made a great difference to what the Computer Unit was able to achieve. Computer appreciation (rather than programming) courses were given in-house [14]. Collaborative work with the Experimental Cartography Unit is recorded in these papers, as is the growing influence on computing of the Natural Environment Research Council (NERC), of which IGS was a component body. Frustration with some loss of control in computing decisions (to NERC) is shown in the letter from Director to Secretary, NERC [14 - 18 October 1973] and was to become increasingly apparent.

By mid-1974 [15], the minutes indicate that 11 Computer Unit staff were in post, and record a first release of the G-Exec system, which was being developed by Jeffery and Gill within the IGS Computer Unit, using the facilities of the Atlas Computer Laboratory. Various publications describe the system (see Jeffery and Gill, 1977), which was intended to provide comprehensive support for the IGS data banking activities. It was a pioneering attempt to apply ideas of relational databases in a geological context, and also to provide an integration platform, whereby data and programs from many sources could operate within a shared framework of mutual compatibility. While it failed as a panacea, it helped to rationalise data structures, ensured that issues of data integration were not overlooked, and stimulated much interest in other surveys and elsewhere. The later adoption by BGS of relational database management systems, Mimer and then Oracle, owed much to this early work.
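
The relational idea at the heart of G-Exec can be suggested by a small sketch. The code below is illustrative Python, not G-Exec syntax, and the tables and field names are invented: the point is that datasets from different sources become flat tables sharing key fields, so that they can be combined by generic operations rather than by file-specific programs.

```python
# Two invented tables sharing the key field "bh_id".
boreholes = [
    {"bh_id": "BH01", "ngr": "NT 2530 7310", "total_depth_m": 120.0},
    {"bh_id": "BH02", "ngr": "NT 2601 7288", "total_depth_m": 85.5},
]
samples = [
    {"bh_id": "BH01", "sample_depth_m": 42.0, "lith_code": "SDST"},
    {"bh_id": "BH01", "sample_depth_m": 77.5, "lith_code": "MDST"},
    {"bh_id": "BH02", "sample_depth_m": 12.0, "lith_code": "LMST"},
]

def join(left, right, key):
    """Generic relational join on a shared key, usable for any two tables."""
    index = {}
    for row in right:
        index.setdefault(row[key], []).append(row)
    return [dict(l, **r) for l in left for r in index.get(l[key], [])]

for row in join(boreholes, samples, "bh_id"):
    print(row["bh_id"], row["ngr"], row["sample_depth_m"], row["lith_code"])
```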

The extension of computer networks to link various IGS offices is recorded in the proposals of May 1974 [15]. The IGS work was not of course done in isolation, and similar activities in other fields and places proceeded in parallel. Examples of other work were obtained during visits to Canada [16] and the United States [17] and from international conferences such as that in Bandung [18]. Work on storage of North Sea hydrocarbon exploration data, commissioned by the UK Department of Energy and its predecessors, gave the Computer Unit some experience in this field, which tied in well with these overseas activities.

The IGS hydrocarbons work and much of the offshore work were handled from the Edinburgh office, which also housed a major part of the Geophysics Division (whose computing activities were mostly independent of those described here), including the Global Seismology and Geomagnetism Units. The largest computer obtained by the Computer Unit, a DEC PDP-11/45, was therefore installed in Edinburgh, where a new building (Murchison House) was under construction. With security and temperature control in mind, a large windowless area originally designated "rock store" was reassigned to house the computing facilities. A number of Computer Unit staff were transferred from London or recruited. Delays in construction led to temporary accommodation being necessary, and logistics became a major preoccupation [19].

In this initial phase, it could be argued that IGS had been behaving like a child with a new toy, setting up computer applications without regard to long-term strategy or economic considerations. The enthusiasm and dedication of those involved was remarkable. The camaraderie and cooperation of the computing pioneers in every field was a delight. But it should also be pointed out that much of the early progress in computing depended on government funded research, and geological applications were no exception. Quite basic questions had no answer. For example, did the long-term solution to borehole description lie in the relational-type codes of G-Exec, the computer-specific descriptions of Canadian Stratigraphic Services [16], the semantic codes of the École des Mines, the syntax of DASP, the formalised English of Gover and Read, full English text, or none of these? How would the results be made available to users? Even solutions which are obvious now were less obvious then. As a small example, I assigned the task of writing a program to plot borehole logs to a newly recruited but experienced Fortran programmer. He reported back after a week that the task was impossible. The techniques he required are of course described in numerous texts on computer graphics, but in 1970 they had not yet been written. Instead, we had to search laboriously for the most impossible part, and do that first. Surely it was right for a research organisation to contribute to these developments.

For the technologist charged with implementing new techniques, there were delicate issues of adaptation to changing politics. Finding an optimal solution to ill-defined problems was made more complex by flux and uncertainty about what was really to be optimised and for whom. By implication, the initial aim of exploring the potential of new techniques was giving way to an aim of providing well-targeted support to the Survey's activities. But the solution which best suited an individual scientist or manager was not necessarily optimal for IGS as a whole. As human beings, scientific managers might conceivably have seen enhancement of their own status as an important objective. The ideal system would ensure that suboptimisation by line-managers automatically optimises the system as a whole, but in practice this is difficult to achieve. On a larger scale, the Rothschild proposals (see the IGS Annual Reports for this period) had similar objectives, aiming to make research councils more responsive to the needs of government departments by routing some research council funding through the departments. As the latter were even more enthusiastic about data banking than IGS, funding for computing activities benefitted.

NERC takes a hand

The Institute of Geological Sciences was the largest component body of the Natural Environment Research Council (NERC). NERC had an interest in computing as a means of integrating academic and survey research as well as different aspects of the environmental sciences. A NERC working party on data processing in geology and geophysics in 1971-72 reported on relevant work in universities and elsewhere [21].

Survey computing thus reflected a number of layers of management, whose interests did not always coincide. Government policy to support the manufacturers of British computers was at odds with reservations on the part of NERC staff about the suitability of those machines for their particular tasks and about the need for compatibility with colleagues internationally. NERC and the IGS Directorate were concerned with different levels of integration, while individual groups within IGS wished to get on with their work with as little interference as possible. The bureaucratic delays as well as the development of the overall NERC view are documented in the minutes of the NERC Computer Committee, which it is more appropriate for NERC to archive.

In 1975, to guide the development of NERC computing policy, Mr Gray, the Assistant Director in charge of IGS computing, assembled papers on future trends of earth science computerisation [20]. Based on the literature of the time, I suggested to him that the main future themes would be networking, database, models, text-handling and conferencing, attaching a copy of the now-famous paper by Brooks (1972) on the mythical man-month as a warning to expect delays. Gray, in his submission to NERC, pointed to the dichotomy between the enthusiasts and the many managers and scientists who considered that computers had brought little significant benefit to strategic geological research, and were unlikely to do so in the foreseeable future. Such influences together with lack of resources might, he suggested, be more significant than technical considerations. "Constraints inevitably raise painful questions as to the desirability of introducing new computing activities unless they can be shown to be cost-effective." He saw a degree of centralisation and rationalisation of resources as unpopular but appropriate in these circumstances.

This new note of cautious realism coincided with a period of expansion in the Computer Unit, taking staff numbers over twenty, covering a wide range of applications including library cataloguing and digital cartography. There were ambitious plans for the new Headquarters for the Survey, which was to move from London to Keyworth, near Nottingham. The new site was then occupied by a residential teachers' training college, and the chapel (now the De La Beche Conference Centre) and the gymnasium were each seriously proposed for conversion to a computer room. Experience at Edinburgh had shown the drawbacks of acquiring computers before suitable accommodation had been completed, and room for expansion would put IGS in a good position to bid to house central NERC computing facilities.

By 1978, relationships between NERC and IGS were generally rather tense, and the strains of an organisation moving to a new site were being felt. The IGS Computer Unit, despite its name, had been focussed on data and applications, and the actual computing power in IGS was supplied by large external machines and small departmental ones. The PDP-11/45 at Edinburgh had been a first move to in-house computing. Discussions within NERC determined that future computing services would be handled on a NERC-wide basis. Wilson (1985) describes on page 178 the difficulties that followed. After damaging delays and acrimonious debate, the IGS Computer Unit was incorporated in a new NERC Computing Service (NCS) [20]. The background and interests of the Computer Unit staff were not appropriate for the new service, and in the end NCS obtained the posts rather than the staff, to the benefit of various oil companies and academe.

The IGS Committee on Records and Archives at Keyworth (CRAK) [29] considered in 1979-80 the need to maintain and improve the IGS archives including data files and specimens after the move from London to the new Headquarters at Keyworth, and against the background of transfer of the Computer Unit to NCS. It reviewed existing collections and adopted a balanced view of the computer as only one tool among many in improving the management of the Survey's data. The debate on centralisation versus local control of records is reported, and a recommendation was made for establishment of a National Geosciences Data Centre, which would identify, index and ensure access to all geological data held in IGS. As far as I am aware, this important and influential document was not widely distributed and never published, and its valuable analysis of IGS data, for example, does not appear to have been the basis for later data catalogues.

The story ended happily. After a bad start, NCS recovered under the leadership of John Down. BGS eventually reconstituted specialist information support as Information Services and Information Systems Groups. Relationships between NERC and BGS gradually improved. For my own part, my wish to work within the hydrocarbons group at Edinburgh exploring the application of computer models to geology was granted. The story of the development of the NERC Computing (later Computer) Service is fully documented in their own publications, as is the work of the information groups in BGS. One further strand, however, has long roots, and may be of interest.

Digital cartography

It is not surprising, since geology is a visual science, that graphical output was seen as important at an early stage in IGS computer applications. Diagrams could be produced from the characters on a lineprinter [5] and drum plotters were obtained at an early stage in IGS computing. A paper in October 1973 records that up to 2000 working maps and diagrams were plotted each year. A large flatbed plotter was among the early equipment to be installed in the new IGS office in Edinburgh. Higher resolution graphics were of interest to IGS as well as to the Experimental Cartography Unit (ECU), which was located within a few hundred metres of the IGS Headquarters in South Kensington. As both came under the aegis of NERC, a committee was inevitable. It was called the Subcommittee on Computer Graphics [22], and reported to the NERC Computer Committee.

The October 1973 minutes [22] record that the one-inch geological sheet for Abingdon had been prepared with extensive computer assistance and was published, while the Swindon sheet was at colour-proof stage. Systems had also been developed for the production of geochemistry maps in conjunction with IGS geochemists. Dr Brian Kelk, who had worked with ECU on these products, indicated that in addition to further collaboration with ECU, IGS planned to move into digital cartography itself, and a digitising table was already in the Drawing Office. The ECU papers stressed the need for a centralised activity and a consideration of methodological issues rather than ad hoc solutions. IGS saw a threat to its map preparation function, and ECU was a powerful stimulus for automation of the IGS Drawing Office. For certain groups within IGS, on the other hand, the ECU offered an opportunity to bypass an overloaded Drawing Office and use fashionable techniques. It was time for another committee.

The Standing Working Group on IGS Automated Cartography [23] met some 13 times between July 1973 and July 1977. Its task was to establish the feasibility and cost of introducing automated geological mapping in the IGS as a full-scale production process. It started with 12 members, and grew to over 20, several at Assistant Director level, with one meeting chaired by Secretary, NERC. At the first meeting, ECU presented a paper from January 1973 explaining that after production of the Abingdon sheet, it was thought that costs would fall for the Swindon sheet, to a level comparable with manual costs, and that production would be faster. In reality costs were four times as high, and although the map was not complete, it had already taken twice as long. Kelk, for IGS, however, concluded in May 1973 that automated cartography had the potential to be of the greatest benefit to IGS. "It is undoubtedly true that the advantages seen to date, namely, speed of production and flexibility of output (in particular, variety of lines, mask cutting, ease of scale change, and ephemeral displays), are only precursors of a much greater future."

Kelk [23] argued that cartography was part of a continuum from data collection to map production and that all stages, with the possible exception of printing, should be handled in-house. He saw cartographic data as a principal constituent of the data bank, and maps as one aspect of its output. He indicated the value of a three-dimensional approach, pointing out that it was not currently possible to check all data against the geologist's hypotheses and mental model. "This, therefore, means that the maps themselves, when produced may have areas of only partly tested hypothesis - only partly tested, not because of the lack of data, but because of the great quantity of it." A more complex map, that for Merthyr Tydfil, was selected for the next experimental production by ECU.

Concerns about overlap between ECU activities and those of the IGS Computer Unit were discussed on 28 November 1973.

A paper by Read and Loudon [23, March 1975] pointed to the importance of compatibility with Ordnance Survey and other IGS work, implying that the ECU approach did not offer this. Not until March 1976 did P A Sabine point out the apparently obvious conclusion that the work on the Merthyr Tydfil sheet showed that automation of map production at this level was at best premature. However, he felt that as this route would eventually be taken, it was essential to build on the expertise already gained.

The Working Group continued for several more meetings, at each of which extraordinarily detailed bar charts of progress were presented. The Merthyr Tydfil map eventually appeared, having taken much longer to produce at much greater cost than a comparable manual product. Little seems to have been gained from the large investment of management time in the Standing Working Group. The successful introduction of digital methods to the BGS Drawing Offices many years later was based on commercially developed systems.

An internal IGS Cartographic Developments Committee [24] met during the same period (1973-77) to monitor developments in-house, including the installation of a digitizing table in the Princes' Gate office in London, and the development of programs to handle diagrammatic maps including those for sand and gravel resources. Attempts were made to link this with G-Exec developments.

In April 1973, IGS convened a meeting in London, following discussions at the West European Geological Surveys' Directors' meeting. Technical experts were assembled from most West European Surveys to discuss developments in digital cartography, with guests from NERC, the Experimental Cartography Unit and the Canadian and Kansas Geological Surveys. The discussions proved to be of sufficient interest that the group continued to meet regularly until 1990, hosted in turn by various surveys. The technical experts found that their principal shared interest was database rather than cartography, and the acronym WEGS AGAC was interpreted for some time as Advisory Group for Applications of Computers, rather than the original Automated Cartography. In the late 1980s, however, the focus of attention moved back to cartography, and the activities were eventually subsumed in the International Consortium of Geological Surveys for Earth Science Computing. Detailed papers are archived [25-27], as they reflect an interesting parallel development of computer methods world-wide.

All was not well. IGS had given a strong lead on computing matters from the inaugural meeting of the WEGS technical sessions in 1973 for several years. But by 1980, an overstated comment queried whether IGS retained sufficient expertise in the digital cartography and database areas for continued participation to be appropriate. A collateral effect of the changes connected with the establishment of the NERC Computing Service was the lapse of management support for database activities in areas like Hydrogeology, Engineering Geology, Cartography and Palaeontology. The results of many man-years of effort in data entry were lost. The assumption that the computer data banks were the start of a permanent contribution to the IGS archives was no longer tenable. It could be argued that IGS had moved or had been bounced into the field prematurely, diverting effort from its core activities, and making it difficult to re-establish the credibility of the techniques when the time was right. On the other hand, it could be argued that pushing ahead with new technology was an essential role of research council funding, and that the work of IGS and ECU in the 1970s laid the foundations for later developments.

Certainly, the same pattern of shifting focus can be seen in other surveys.

I am too close to the events to offer an impartial judgment, but rereading the documents, I have an impression that, despite the computing developments in many IGS departments recorded in no fewer than 89 internal reports (see Loudon 1980), many geologists were not prepared to accept the extent of the changes involved, and had little incentive to do so. The believers gave an impression that each change would result in a new, stable way of working, rather than in the increasing turbulence which in fact tended to be the outcome. Indeed, without the illusion that edge-punched cards, aperture cards, or whatever, were a long-term solution to a department's needs, it is difficult to see how the effort of preparing them could have been justified. Each strategic advance was therefore vigorously defended long beyond its sell-by date.

Computer Unit staff transferred to Edinburgh for the management of the Offshore Studies Data Bank [19] (being commercial in confidence, the records are not in this archive) adopted pragmatic solutions to meet user requirements, rather than working within a rigid G-Exec framework. The strong feelings aroused by this were remarkable, as were those when Mimer, a commercially available relational database management system from Sweden, was later introduced to take over some G-Exec functions, and again when Mimer in its turn was displaced by Oracle. Against the last-ditch loyalty of system protagonists, it may be that changes imposed from outside were the only means of maintaining momentum.

For Offshore Studies, the imposed change is recorded in the minutes of the IGS Data Coordinating Committee [29] for 1 March 1983, item 3.4. "The computerised information was first held on the PDP-11/45 in Edinburgh and, following the inception of NCS, was transferred to the Honeywell computer at Bidston. This transfer caused so many problems and interruptions to the programme of work that the Department of Energy had decided, after three years, to sever their connection with NCS. It is the intention that computerised hydrocarbons information be held at Thames House South, London, on the DEn Prime computer, with direct access by HCU [the BGS Hydrocarbons Unit] through another Prime computer in Edinburgh."

In the same document, item 4.4, it was suggested that recent costs on the hydrogeological data bank "were of the order of £10 000 per individual well record stored. The problems resulted from its implementation on G-EXEC on a remote mainframe, and secondly, from inadequate consideration of retrieval at the outset when most thought had been given to data manipulation. The system was under review and of six questions submitted to identify specific problems, only three had been answered in four months."

Computing finds its role

An interesting change in attitude had occurred. The early flounderings had been dressed up as successes, because of the great promise they showed for the day after tomorrow. In the mid-1980s when computing in geology was established, there was confidence to confront the failures (of others). The new mood of healthy criticism was matched by vigorous development of new applications throughout the Survey. Dr Kelk, whose work on digital cartography was mentioned earlier, became Assistant Director in charge of Corporate Coordination and Information. Under his leadership, the newly created National Geosciences Data Centre made full and effective use of computer indexes. With Keith Adlam, he developed a demonstrator of a map-based data retrieval system (Adlam et al, 1988) on which later systems were based. The early digital cartographic work also led through projects initiated by the Lumsden Working Party of 1981, which endorsed experimental development of digital geological spatial models [28], to projects commissioned by the Department of the Environment for the Southampton and, later, the Wrexham area, in which the production of many thematic maps from a single database was investigated. This line of development came together with other work centred in the Drawing Office to give rise to the Digital Map Production Initiative led by Dr P M Allen, the Assistant Director in charge of the Thematic Maps and Onshore Surveys Division. These activities are documented in BGS internal reports and publications (see Nickless and Jackson, 1993).

The past twelve years have seen much varied development in computing applications in BGS. Perhaps the main theme has been a move to a more professional approach and greater awareness of costs and commercial opportunities. As this is intended as a historical note, however, a suitable point to leave the history is the mid-1980s, with the Geological Survey established as a pioneer in computer applications to geology, buffeted by forces beyond its control, but with its map preparation facilities intact, retaining its position as the major supplier of geoscience data in the UK, and well placed to adapt to the more significant changes about to come.

References

Adlam, K.A.McL., Clayton, A.R. and Kelk, B., 1988. A 'demonstrator' for the National Geosciences Data Index. International Journal of Geographical Information Systems, 2(2), pp 161-170.

Institute of Geological Sciences, annually. Annual Report of the Institute of Geological Sciences. Institute of Geological Sciences, London.

Jeffery, K.G. and Gill, E.M., 1977. The use of G-EXEC for resource analysis. Int. Assoc. Math. Geol. J., 9(3), pp 265-272.

Loudon, T.V., 1964. Computer analysis of orientation data in structural geology. Technical Report No. 13 of ONR Task 389-135. Northwestern University, Evanston, Illinois.

Loudon, T.V., and Adams, Mrs E.P. (compilers), 1969. GOSSIP annotated list of some geologists who use a computer. Reading University Geological Report No. 3. Reading University, England.

Loudon, T.V., 1969. A small geological data library. Mathematical Geology, 1(2), pp 155-170.

Loudon, T.V., 1974. Analysis of geological data using ROKDOC, a Fortran IV package for the IBM 360/50 computer. IGS Report No 74/1. HMSO, London. (Revised version of Reading University document of 1967)

Loudon, T.V., 1980. Computer reports on open file at the British Institute of Geological Sciences. Computers and Geosciences, 6, pp 463-465.

Nickless, E.F.P. and Jackson, I., 1993. Digital map production in the UK - more than just a cartographic exercise. Proceedings of the 16th International Cartographic Conference, Cologne, May 1993.

Read, W.A. and Merriam, D.F., 1966. Trend-surface analysis of stratigraphic thickness data from some Namurian rocks east of Stirling, Scotland. Scottish Journal of Geology, 2(1), pp 96-100.

Wilson, H.E., 1985. Down to Earth - one hundred and fifty years of the British Geological Survey. Scottish Academic Press, Edinburgh.


Appendix: Index to archived documents (restricted access)

Envelope 1: External documents, 1964-72

Includes Cogeodata recommendations, US DoD manual for thesaurus, Computer in Geography (Tarrant), National System for storage and retrieval of geological data in Canada (Geological Survey of Canada), GOSSIP annotated list of some geologists who use a computer (Reading University), the ROKDOC package (Reading University), 2 documents on semantic coding (Ecole des Mines de Paris), 4 Pelican books illustrating general views at that time on the "electronic revolution".

Envelope 2: IGS Computer Committee, 1967-68

Includes the terms of reference, agenda and minutes of the Computer Committee, together with other papers presented or discussed at their meetings (Chairman D A Gray, Secretary R McQuillin).

Envelope 3: IGS Computer Committee, 1968-69

As above, includes report on investigation into the computer requirements of IGS.

Envelope 4: IGS internal reports on computing, 1967

Includes Investigation into the computer requirements of IGS (Gover, 1967), questionnaire to departments and analysis of replies (Gover), and Provisional Report on the Development of Automatic Data Processing Facilities in the Institute of Geological Science (IGS Computer Committee), and various internal comments on the Report. Notes on a range of visits by IGS staff to external establishments and conferences to discuss data banking (mostly 1967).

Envelope 5: Drift borehole data bank project (D W Rhind), 1968

Data sheet, and examples of data, programs and output from a Drift borehole data banking project for the Edinburgh area, by the University of Edinburgh Geography Department in collaboration with IGS. See also envelope 11.

Envelope 6: IGS Stratigraphic Code, 1969-70

Papers on the development of a stratigraphic code within IGS, considered by a working party chaired by R V Melville.

Envelope 7: IGS Petrological/lithological Code and geographical location referencing, 1967-70.

Papers on a petrographical/lithological code considered by a working party chaired by P A Sabine (1967), and on geographical location referencing, chaired by D Masson-Smith (1969-70).

Envelope 8: Cambridge research on data processing in geology, 1969-74.

Papers considered by IGS concerning a project in Cambridge (J L Cutbill) funded by the Office for Scientific and Technical Information, exploring automation of catalogues for the Fitzwilliam Museum collections, and related issues.

Envelope 9: Cambridge reports on data processing in geology 1971-73.

Reports from the above project.

Envelope 10: Experimental Cartography Unit (D P Bickmore) 1969

Includes an early report on their development of mapping programs. Much additional information on the ECU is contained in IGS and NERC files.

Envelope 11: Day files of the IGS Computer Liaison Officer (T V Loudon) 1969

This unedited record of day to day trivia gives an impression of the routine activities of a new member of staff attempting computer liaison in IGS. It contains additional information on the Drift borehole project of D W Rhind (envelope 5).

Envelope 12: Day files of the IGS Computer Liaison Officer (T V Loudon) 1971

As above, for some months in 1971.

Envelope 13: IGS Computer Unit Steering Committee 1969

The highlights in the IGS computer developments are described in these papers, together with the background discussions. The special meeting was called to consider the paper on proposals for IGS computer developments 1970-75.

Envelope 14: IGS Computer Unit Steering Committee 1970-73

As above, with slides of the Computer Unit staff and equipment at Exhibition Road, London and notes provided for a computer appreciation course (1972).

Envelope 15: IGS Computer Unit Steering Committee 1974-75

As above.

Envelope 16: Visits to Canada, 1974

Contains examples and comments on work in Canada seen during visits to geological surveys and oil companies.

Envelope 17: Conference on Environmental Data Management at Houston, Texas, 1974.

Discussions, under NATO auspices, of environmental data management, in which geology is well represented.

Envelope 18: CCOP seminar on data in petroleum resources development, Bandung, Indonesia, 1976.

Envelope 19: Computing activities and equipment at IGS Edinburgh, 1976-79.

Envelope 20: Discussions on the NERC Computing Service 1978-79

The NERC Computing Service was formed amid controversy, and it was some years before it settled down to a valued computer service.

Envelope 21: NERC Working Party on Data Processing in Geology and Geophysics, 1971-72

NERC assessed the various activities in computing in the universities, research councils and industry, and made recommendations for future support. "Data banking" work was seen as an important area.

Envelope 22: NERC Subcommittee on Computer Graphics 1973-74

Reporting to the NERC Computer Committee, with which its membership overlapped, the Subcommittee reviewed computer graphics in NERC and made recommendations for their more effective use, taking into account the role of the Experimental Cartography Unit.

Envelope 23: Standing Working Group on IGS Automated Cartography 1973-76

Chaired by A E Seddon of NERC, who also chaired the NERC Computer Committee, this group of 20 senior and technical ECU and IGS staff monitored the work of ECU for IGS, including the production of the one-inch Merthyr Tydfil geological map.

Envelope 24: IGS Cartographic Developments Committee 1973-77

This group monitored developments of simple cartographic systems in-house, with an eye on developments elsewhere, particularly in the Ordnance Survey and Experimental Cartography Unit.

Envelope 25: WEGS meeting on Modern Methods of Map Production 1973

In 1973, IGS was host to a meeting of technical experts largely from the West European Geological Surveys to discuss developments in digital cartography.

Envelope 26: WEGS technical sessions 1974-80

Technical experts on applications of computers in geological survey continued to meet and share information, following the pattern set in 1973.

Envelope 27: WEGS technical sessions 1980-90

As above. After 1990, the function of the WEGS Advisory Group was largely taken over by the International Consortium of Geological Surveys for Earth Computer Sciences.

Envelope 28: IGS Working Party on Computer Applications to Geological Maps 1981

G I Lumsden (later Director) chaired this working party which made various recommendations for the encouragement and coordination of computer developments in IGS, particularly in database and cartographic work.

Envelope 29: IGS Committee on records and archives at Keyworth 1979-80 and IGS Data Coordinating Committee 1982-83 and IGS Land Survey Databank Coordinating Group 1983

These groups reflect the reviving interest within IGS in coordinating its data and information. The CRAK committee considers data and material in a broad context, including the use of the computer as a tool to assist in data management.