
Published information

Use this page to capture references to published information about the Michigan Terminal System and related materials.

Additional articles that mention MTS may be found elsewhere in the "Discussion" section of the Michigan Terminal System Archive (this site), and the "MTS Bibliography" contains a list of published articles about MTS.

x'2C' (44): Exhibition bytes into the history of computers (Newcastle Helix)

posted Apr 3, 2019, 1:13 AM by Jeff Ogden   [ updated Apr 3, 2019, 2:17 AM ]



Exhibition bytes into the history of computers

Professor Brian Randell and Jon Dowland

Sixty years on from the creation of the first computing lab at Newcastle University, a special exhibition will highlight some of the ground-breaking IT developments that have since taken place.

The exhibition explores the evolution of computing – from the days when computers were so large that they would fill a room, to the personal desktop computers of the 1980s and 90s, and the earliest handheld devices.

Exhibits on display include early examples of home computers such as the iconic BBC Micro and Sinclair ZX81, as well as some of the first portable personal computers that paved the way for today’s tablets and laptops.

The exhibition has been put together by a group of volunteers from current and retired staff and students, led by researcher Jonathan Dowland and Professor Brian Randell, who joined the computing department at Newcastle in the late 1960s from IBM's T J Watson Research Center in New York.

Professor Randell said: “For this exhibition, we’ve brought together some fascinating examples that show how far computing technology has come - from the very early ‘mainframe’ computers via the first PC and the venerable Apple Mac Plus, to more recent handheld devices and tablets. The models on display were all revolutionary at the time and each has an important role in the story of how integral computers have become to modern life.”

Among the veteran models on display is the vacuum-tube technology of the type used in Ferdinand, the Ferranti Pegasus computer which was the very first computer at the University.

Installed in November 1957, Ferdinand (FERranti DIgital and Numerical Analyser Newcastle and Durham) is thought to have been the first computer in the whole of the North East.

As there were so few computers at this time, access was also made available to local industry, and major local companies such as C.A. Parsons, Reyrolles and Thomas Hedley were among those interested in Ferdinand’s capabilities.

To use Ferdinand, programs were prepared on paper tape which was then loaded into the computer by Elizabeth Barraclough, the University’s first Ferranti Pegasus computer operator, whose long career at Newcastle culminated in the role of Director of the University Computing Service.

In 1967, 10 years after its launch, the computing department had grown substantially, and the University – by then separate from Durham – obtained the IBM System 360/67. At the time, this was the largest IBM computer in any British university, and components from this will be on display during the exhibition. It was also Europe's first time-sharing computer - a computer that could be used simultaneously by a number of different users, dramatically lowering the cost and shortening the time of developing and running computer programs.

The original purpose of the computing lab at Newcastle University was to provide computing support to researchers. It quickly became clear that education for users was essential, and in 1958 Newcastle became the first British University to teach a course in computer programming to undergraduates.

Today Newcastle is ranked as one of the top universities in the world for computing science. This status as a world-leading centre in data science led to Newcastle leading the UK's £30m National Innovation Centre for Data. It has also been named a Government Academic Centre of Excellence in Cyber Security Research, as well as being invited to join the prestigious Alan Turing Institute, the flagship national institute for data science.

Professor John Fitzgerald, Head of the School of Computing, Newcastle University, added: “Sixty years on, computing science teaching and research at Newcastle continues to be ground-breaking, allowing us to be at the forefront of critical disciplines such as big data, artificial intelligence, cyber-security and cyber-physical systems.

“Just as the very first computer at Newcastle was also used by business, today we work with our industrial partners to connect research, accelerate innovation and boost skills to meet the needs of an increasingly-digitalised society.”

The History of Computing exhibition has been designed as part of Newcastle University’s ‘Inspired by’ Great Exhibition of the North programme. It is taking place in the entrance to the Urban Sciences Building, Newcastle Helix, weekdays 9.00am – 5.00pm and runs until 9th September.

For more information on a range of options for studying computing science that is based on world-leading research visit www.ncl.ac.uk/computing/

x'2B' (43): How computers have changed since 1968 [at Durham University]

posted Apr 2, 2019, 8:22 AM by Jeff Ogden   [ updated Apr 3, 2019, 1:16 AM ]

The following article appeared in the January 2005 issue of ITS News from Durham University.

See: https://www.dur.ac.uk/cis/news/archive/issues/january2005/complete/

How computers have changed since 1968

Picture of computer at Durham in 1968

We recently unearthed a short publication from 1968, which described the inauguration of N.U.M.A.C. The Northumbrian Universities Multiple Access Computer (N.U.M.A.C.) was the name given to a system installed to serve the computing needs of the Universities of Durham and Newcastle upon Tyne.
N.U.M.A.C. was hailed as the first computing system in the UK to be jointly owned and operated by two universities. Such co-operation enabled a much more powerful system to be made available than could have been purchased by either University acting alone.

"The computer chosen was the IBM system 360, Model 67. Situated in the University of Newcastle upon Tyne Computing Laboratory are the central processor unit, the core store of 512 K bytes, a drum of 4 million bytes, a multiple disc unit capable of holding 233 million bytes available for access on eight replaceable discs, magnetic tape drives, appropriate selector and multiplexor channels controlling the flow of information and peripheral devices including printing, card and paper tape equipment and graph plotters.

A small on-line satellite computer, the IBM 1130, also with printer, plotter, card and paper equipment, has been placed in Durham. Typewriter terminals have been installed in both Durham and Newcastle. A wide range of data preparation equipment for both cards and tape is available in both Universities.

When the Model 67 is operated in time-sharing mode, several users in Newcastle and Durham will be working simultaneously at typewriter terminals under the experimental time-sharing system TSS/360. The users will be able to employ a conversational approach, entering modifications to programs or data through the keyboard and receiving information from the typed output. Initially six terminals in the Newcastle laboratory and two in Durham will be connected, followed shortly by five more at different places in Newcastle and Durham. . . .

The majority of the demand on the computing system will arise from research workers in science, engineering and medicine, although workers, especially in the bibliographic and social science fields, will make substantial demands. Although the needs for computer time for each student example are slight, the numbers involved produce an appreciable demand from this source also."

(29 Jan 2005)

x'2A' (42): Research At 'U' May Revise Space Theories, Ann Arbor News, August 31, 1960

posted Jul 6, 2017, 10:47 AM by Jeff Ogden   [ updated Apr 3, 2019, 1:18 AM ]

This article appeared in the August 31, 1960, edition of the Ann Arbor News. It talks about:

Data from the experiments is now being processed on an electronic computer at the U-M Computing Center. ... the group arranged the experiments so data could easily be processed on the big computer, thus saving many months in evaluating the data.  

x'28' (40): Campus Networking Strategies

posted Jun 9, 2016, 11:00 AM by Jeff Ogden   [ updated Jun 9, 2016, 11:30 AM ]

Campus Networking Strategies (EDUCOM strategies series on information technology) Hardcover – August 10, 1988

by Caroline Arms (Editor)

In-depth case studies of ten higher education institutions, along with background chapters on protocols and standards, wiring, and national networks.

Ten institutions that are considered to be leaders in the implementation of computing technology are reviewed, each in an individual chapter, to assess their current status and future plans for computer networking. The universities included are Wesleyan University, Dartmouth College, Carnegie Mellon University, Rensselaer Polytechnic Institute, Massachusetts Institute of Technology, Stanford University, Cornell University, The University of Michigan, The University of Minnesota, and The Pennsylvania State University.
  • Series: EDUCOM strategies series on information technology
  • Hardcover: 336 pages
  • Publisher: Butterworth-Heinemann Ltd (August 10, 1988)
  • Language: English
  • ISBN-10: 1555580092
  • ISBN-13: 978-1555580094
  • Product Dimensions: 5.9 x 9.1 inches
  • Shipping Weight: 1.2 pounds


Posted on Facebook 9 June 2016:

 
Greg, I was happy to find a copy of this 1988 classic on a colleague's shelf and to read your chapter about the earliest days of networking at U-M. Things have come a fair distance, to say the least. Susan Harris also gets a thanks in the chapter for her editing.


x'27' (39): ARCH:MODEL: CADIA at Michigan

posted May 26, 2016, 12:18 PM by Jeff Ogden   [ updated May 26, 2016, 12:24 PM ]

"CADIA at Michigan", by Theodore Hall, University of Michigan, from ACADIA, the newsletter of the Association for Computer-Aided Design in Architecture, December 1983, Vol. III, No. 2

Describes the CAEADS project at the Architecture and Planning Research Laboratory (APRL) of the University of Michigan. ARCH:MODEL is a modeling program to assist in computer aided building design that ran on MTS.

x'26' (38): FOIL— a file-oriented interpretive language

posted May 24, 2015, 8:48 AM by Jeff Ogden

FOIL— a file-oriented interpretive language
John C. Hesselbart, The University of Michigan, Ann Arbor, Michigan
in Proceedings—1968 ACM National Conference, pages 93-98.

In the summer of 1967 a project was begun at The University of Michigan to provide users of a general-purpose, time-sharing system with the capability for exploring conversational uses of computers for instruction. The idea for the project developed from the interest of faculty members in a number of subject areas who wished to develop conversational programs and investigate the benefits of computer-assisted instruction in the classroom and laboratory using existing time-sharing facilities. Support was provided by the UNIVAC Division of Sperry Rand Corporation.

 . . .

FOIL (File-Oriented Interpretive Language) was developed to provide conversational lesson-writing capability for potential instructional programmers who have access to a general-purpose, time-sharing system. Programs written in FOIL reside on direct-access files and are processed by an interpreter written in FORTRAN. The interpretive mode places few constraints on the syntax of the language and a number of beneficial features are achieved.

 . . .

The source code for the processor is relatively machine independent and therefore easily adapted to other time-sharing systems. FOIL was originally implemented on an IBM 360/67 computer operating under the Michigan Terminal System. James Ruddell at the University of Maryland readily adapted the processor for the UNIVAC 1108 system and added capability for lesson building and editing.

 . . .

x'25' (37): In the beginning: how MTS came to UBC

posted Dec 25, 2014, 9:19 AM by Jeff Ogden   [ updated Dec 26, 2014, 8:11 AM ]

"In the beginning: how MTS came to UBC"
Ron McQuiggan
ComputerData, March 1979, page 12


x'24' (36): Did you know?

posted Dec 15, 2014, 3:22 PM by Jeff Ogden

The following note appeared in a sidebar in the 3 February 2011 issue of the University of Michigan Record Update:

DID YOU KNOW?

The Michigan Terminal System was developed in 1967 as a time-sharing system that allowed for efficient multi-user access to the university's IBM 360/67.


x'23' (35): TSS and Virtual Memory on the S/370

posted Dec 14, 2014, 10:22 PM by Jeff Ogden

This is the second of two excerpts from a 37-part oral history interview of Watts Humphrey by Grady Booch for the Computer History Museum. Humphrey was an executive at IBM, where for a time he was responsible for all software development at IBM. Later he was a Fellow at the Software Engineering Institute at Carnegie Mellon, where he provided the vision for, and early leadership in the development of, the widely used standard for assessing an organization's software development capability, the Capability Maturity Model (CMM). He is the author of several influential books on the software development process and software process improvement. Humphrey passed away in October 2010.

All 37 parts of the interview are available online, see:  http://www.informit.com/promotions/interview-with-watts-humphrey-137746.

This excerpt doesn't mention MTS directly, but covers two issues that were important in the history of MTS: IBM's TSS operating system and the decision to include virtual memory support on all models of the IBM S/370 computer line.

An Interview with Watts Humphrey, Part 10: The Fortune Interview, IBM Lawsuits, and Virtual Memory

April 26, 2010

http://www.informit.com/articles/article.aspx?p=1574479

In this transcript of an oral history, Grady Booch interviews SEI Fellow Watts Humphrey. In Part 10, they discuss IBM in the mid-1960s, including Humphrey's unfortunate Fortune interview, teaching lawyers about programming, and making the decision to use Virtual Memory in the IBM 370.

This interview was provided courtesy of the Computer History Museum.

 . . .

[TSS]

Also I think towards the end of that year we had a meeting -- the TSS system had gotten into trouble, as I mentioned. We had gone and added a whole lot of function to it. And instead of coming out three months late, it was about six to seven months late with performance problems and everything. And so basically the system 360 was coming along and people were starting to buy it and they were happier with it. And so people had switched back from TSS to 360. So 360 was now starting to go full tilt. And the management decision, the division presidents and that whole crowd all said, “No, we’re going to kill TSS, period.” And I objected because I thought it was a system we should have available, but no -- they were going to shut it down. So they did shut it down. It cost us about $30 million. We did actually get the system running, and it was installed in a few places with the Model 67, but it was stopped. But it was the early virtual memory system, and it was really a very good responsive system, but it was not compatible with 360, and that was a real problem that people were concerned about, and it was out.

So in any event, I remember meeting in the board room. I was talking about software and software phase plans and the whole thing, and Tom Watson interrupted me at one point. I had gone through what the phase plans were and when you announce things and when you do various stuff and he said, “Watts, I’m confused now. You did a marvelous job with the FAA and you’re probably the best guy we’ve got to run software. I don’t understand. The FAA was such a tremendous success and you’re doing so very well with the 360.” He said, “How come TSS was such a disaster?” We had just closed it out at a $30 million loss. So I said, “Look, here’s where we announced the 360 schedule.” And I showed him we had running code, we had a whole lot of stuff in place -- at least the beginning code -- and we had plans, and the design was done, et cetera. And I said, “And here’s where we set the schedule for the TSS,” and it was way back at the beginning before we knew anything. And I said a big part of the problem on controlling this stuff is announcing things you don’t know how to build. We didn’t have a good foundation for a plan. So Tom understood that, fortunately. He was quite a guy. He could be really tough. I remember an executive making a presentation to him and he actually reduced the guy to tears at one point. But he was logical. And if you could understand what he was concerned about and really get to the point, he’d switch, and he was great. So that was that.

 . . .

The Virtual Memory Decision

So I was made VP, and I had the architecture stuff. And as I said, Don Gavis had the OS 360 work. And he came to me -- I think it was in about 1969 – and said, "We've really got to go to virtual memory." And remember, the TSS was killed and the 360 didn't have virtual memory. And we had that big battle with Amdahl that… “just add more memory.”

But the programmers had concluded that virtual memory was probably the only way to go. We just had to do it, get out of the constraints of the physical memory. And just about that time, IBM was developing a newer version of 360 called the 370. We'd been out there for a while with the 360 systems, and people were beginning to catch up with us, in terms of performance and that sort of thing. And there was quite a lot of competition. It was still pretty vicious. And so they were coming up with upgrades and higher performance hardware and that sort of thing. And they decided to call it System 370. We were working on 370 development work. And so the recommendation was that we switch System 370 to be a virtual memory system.

Well, that was a radical change. Some hardware guys were happy to do it, but a bunch of them weren't. We had a bunch of microcode machines, which were fairly easy to switch. But the hard-wired, bigger systems were a much tougher problem. And right about that time, they had a re-organization, and a new Division President was brought in. And guess who was made the Division President? It was Bob Evans, the guy I'd had two previous battles with -- on both the timesharing stuff and on the FAA thing -- and I won both of them. And so Bob and I weren't on the best of terms. He'd gone down to run the Federal Systems Division, where they programmed the FAA system, if you remember. And he was brought back as Division President.

And so one of the first things we did was to go in to Bob with my programming team and the architects. The architects agreed that we ought to move to virtual memory. So we went to Bob. And of course, we were fighting with the hardware guys. And so Bob looked at it and went through the story. I mean, I have tremendous admiration for the way he was able to take a multibillion-dollar decision and make it in a day. I mean, he went through this, he looked at all the choices. And he said, "You're right. We'll do it." And here he'd been opposed to it, but he went through the logic and what the guys were talking about. And he was a very sharp guy. I had differences with him, but he knew what he was doing. He made that call and he was clearly right. So that's how we put virtual memory in 370. Bob made the call and it was, basically, Don Gavis that turned us around.

 . . .

x'22' (34): Marketing the IBM Model 67

posted Dec 13, 2014, 3:35 AM by Jeff Ogden   [ updated Dec 14, 2014, 10:31 PM ]

This is the first of two excerpts from a 37-part oral history interview of Watts Humphrey by Grady Booch for the Computer History Museum. Humphrey was an executive at IBM, where for a time he was responsible for all software development at IBM. Later he was a Fellow at the Software Engineering Institute at Carnegie Mellon, where he provided the vision for, and early leadership in the development of, the widely used standard for assessing an organization's software development capability, the Capability Maturity Model (CMM). He is the author of several influential books on the software development process and software process improvement. Humphrey passed away in October 2010.

All 37 parts of the interview are available online, see:  http://www.informit.com/promotions/interview-with-watts-humphrey-137746.

This excerpt talks about events leading to the development of the IBM S/360 Model 67, the first IBM system with support for virtual memory. It doesn't mention MTS, but does briefly mention Bernie Galler and the University of Michigan.

An Interview with Watts Humphrey, Part 6: The IBM 360
March 29, 2010
  • The IBM 360 Announcement
  • IBM Time Sharing
  • Marketing the IBM Model 67
http://www.informit.com/articles/article.aspx?p=1571987

In this transcript of an oral history, Grady Booch interviews SEI Fellow Watts Humphrey. In part 6, Humphrey discusses the IBM 360 announcement, battling GE and MIT for market share, and marketing the IBM Model 67.

This interview was provided courtesy of the Computer History Museum.

 . . .

Humphrey: . . . But in any event we put this together, we put in the proposal. It was a very simple design for the virtual memory, but it was a good one. And we put it in and we won the bid. We got the Lincoln Labs bid, and the marketing guys were going off with Orville Wright and his team, and they were putting out fires with this system. The Model 67 turned out to take the market by storm. I mean, people loved it. And it had multiprocessing -- the 67 was the first multiprocessor of the 360 line.

So we had to have that in place so we could have multiple computers come together with a virtual memory, which is a very attractive system, and they had all kinds of expansion capabilities and a big deal with some of the real-time communication that you needed and everything else. So it was a great system. When we put that in, we won and we were going great guns. The programming guys did extremely well. They were up in this lab in Yorktown Heights and then all hell broke loose. The Multics people and the GE folks had decided to leapfrog the technology, and so they had come up with an expanded virtual memory approach and they sold it to General Motors.

Well, the GE people had come up with a bid, GE and MIT together to put a much more expanded virtual memory into the Multics System, and it actually had a great deal of flexibility. The reason it was attractive to General Motors was, General Motors wanted to use this timesharing system for the graphic design for their automobiles. And they had a big graphic design system, and the marketing people got me out to GM to see what they were doing and why it was interesting. And they were working with the University of Michigan and Bernie Galler and folks out there. And very nice folk. I got to know Bernie quite well and he was pushing this stuff real hard. And he was sort of the intellectual push behind all of this stuff for this expanded virtual memory.

Booch: I have to ask. Were they doing their CAD work on it?

Humphrey: Well, they had big IBM displays. I think it was called the 2250 or something, but it was a big display running off the 360, and they had them on earlier systems but they were a big part of the 360 proposal. And so they were damn good systems. They were doing amazing things with them way back then. So it was quite something that GM had a lot going on, GM Research -- that's who we were working with. I didn't tell you also about the Bell Labs people we worked with -- Ed David and a bunch of those folks. So I got to know all of those guys. It was quite an interesting bunch of folks. In any event GM was really pushing us hard on this. They had to have this added memory and they argued they couldn't -- literally couldn't -- do without it. It was pretty obvious we were not going to win the GM bid unless we could build something substantially more than what we were doing. And the marketing people were all upset because they concluded that, if GM went with GE and Multics, we'd lose Bell Labs; and if we lost GM and Bell Labs, we were going to lose the momentum pretty completely. And so the GM win was a big deal. And so I got together with the programming guys and the architects to figure out how we would do it, and that it would take a hardware change, which we were told was straightforward. But the programming guys went through it and after like a weekend's -- a long weekend's -- worth of study… we were strong on long weekends in those days.

Booch: It sure sounds that way.

Humphrey: But in any event they concluded that it would add about three months to the schedule. Well, like a dummy, I bought it. And so we put in a proposal. We won the GM bid, and Lincoln Labs was very upset with the three-month delay, but we talked them into it and all the other customers -- we had a three-month delay for everybody. Everybody finally bought it; we got it sold in the market. They'd all do it. And so we got started on that. We had that bid. And so this was in the fall of '65, I think it was.
