
Maestro I

Article Id: WHEBN0021442872

Title: Maestro I  
Author: World Heritage Encyclopedia
Language: English
Subject: Integrated development environment, Computer-aided software engineering tools, Cirquent
Collection: Computer-Aided Software Engineering Tools, Integrated Development Environments
Publisher: World Heritage Encyclopedia

Maestro I

1978-79 Softlab Munich, Tucherpark

Maestro I was the world's first integrated development environment for software.[1] It was developed by Softlab Munich.

Softlab Munich originally called the software Program development system terminal (PET), but renamed it after Commodore International introduced a home computer called the Commodore PET in 1977.

At one time there were 22,000 installations worldwide. The first US installations were at Boeing in 1979, with eight Maestro I systems, and at Bank of America, with 24 systems and 576 developer terminals.[2]

By 1989 there were 6,000 installations in the Federal Republic of Germany.[1] Maestro I was the world leader in the field in the 1970s and 1980s.

A Maestro keyboard[3]

Maestro I holds a significant place in the history of technology.

One of the last Maestro I systems is at the Museum of Information Technology at Arlington.[4]


Contents

  • First presentation in 1975
  • Introduction
  • Historical context
    • Psychological phenomenon
  • Milestones
  • Technology
  • Operation
  • Contemporaries
  • References
  • External links

First presentation in 1975

Harald Wieler, co-partner of Softlab Munich, developed the first prototype of the system, then named PET, in 1974, basing it on the Philips X1150 data collection system (originally a Four-Phase system from the USA). Before joining Softlab, Wieler had been an architect and programmer in mainframe DOS operating system development for Siemens, under the license Siemens had taken from the Radio Corporation of America. The objective in developing Maestro I was a hardware and software programming tool rentable for 1,000 Deutsche Mark a month, roughly the monthly rent of a single-family house in the Munich area at the time.


Introduction

Maestro was an essential factor in the development of:

  • Software engineering
  • The emergence of development environments
  • Human-computer interaction, ergonomics
  • Methodology (software technology)

Historical context

Punched card

In order to understand the impact of Maestro, one has to understand the way programmers worked until about 1975. They would punch their code and test data onto paper tape or punched cards. After finishing the punching, the programmer would feed the tape and/or cards into the computer.

The introduction of the IBM 3270 terminals, together with IBM's ISPF (Interactive System Productivity Facility), constituted a real improvement. The text editor integrated in ISPF allowed source code to be entered in real time; the editor was controlled with commands, line editing and function keys. The disadvantage was that the reaction to input appeared only after a whole page had been entered, which made the operation slow and not very intuitive.

Psychological phenomenon

A delay in a dialogue operation causes an involuntary break in the thinking process and thus in the programmer's work; Fred Brooks calls this phenomenon immediacy in the landmark paper No Silver Bullet. The break is caused by the way short-term memory functions in the brain: Atkinson & Shiffrin proposed a model in 1968 stipulating that information entering short-term memory fades away within 18–20 seconds if it is no longer attended to. Another important factor is the recency effect, which causes a person to remember the last few items better than those in the middle or at the beginning of a period. Thus, when delays occur in the work, the programmer tends to lose the thread of his or her thoughts.

The introduction of Maestro was considered a real innovation in its time. According to the economist Joseph Schumpeter, innovation consists in the acceptance of a technological or organizational novelty, more than in its invention. In the case of Maestro, the “discovery” of short-term memory was turned towards a technical application: Maestro fed each keystroke directly to the CPU, producing immediate feedback. This feedback was also enabled by the particular characteristics of the hardware, specifically the use of a keyboard and console instead of the earlier punched cards or tape.

A comparison with another innovation, Ajax, is justified. The name Ajax could regularly be found in the media in 2005, when Google used its asynchronous communication paradigm in interactive applications such as Google Maps. Web applications traditionally worked with forms that had to be completed by the user. The IBM 3270 terminals of the 1970s also worked with “forms” (actually screens) that had to be completed, leading to delays and disturbing breaks in the work. Maestro remedied these delays, much as Ajax did some thirty years later.
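The contrast described above can be sketched in a few lines of Python. This is an illustrative model only, not Maestro or 3270 code, and all function names are hypothetical: a form-based dialogue buffers a whole screen and reacts once, while a keystroke-based dialogue reacts to every key.

```python
def form_based(keys):
    """IBM 3270 style: buffer the whole 'form', then process it in one batch.
    Feedback arrives only after the final submit."""
    buffer = []
    for k in keys:
        buffer.append(k)          # no host reaction while typing
    return ["".join(buffer)]      # a single response, after the whole page

def keystroke_based(keys):
    """Maestro style: every keystroke goes straight to the CPU,
    so feedback is produced immediately, key by key."""
    feedback = []
    for k in keys:
        feedback.append(k.upper())   # an immediate per-key reaction (illustrative)
    return feedback

print(form_based("abc"))       # one delayed response: ['abc']
print(keystroke_based("abc"))  # three immediate responses: ['A', 'B', 'C']
```

The point is not the trivial processing, but where the feedback happens: in the form-based loop the user sees nothing until the whole input is submitted, which is exactly the delay that breaks the short-term-memory thread described above.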


Milestones

1975: Introduction

The first prototype of PET was developed by Harald Wieler of Softlab, based on a Philips X1150 Data Entry system (actually a Four-Phase IV/70 system, made in the USA). Wieler worked as architect (and programmer) of operating systems for mainframe computers of RCA and Siemens before joining Softlab.

The development of Maestro was co-funded by the German government. The target was to create an interactive programming terminal rentable for 1,000 Mark (approx. US$500) per month.

“The creator of the Maestro program is an American. But Harald Wieler, 45 years old, has German parents. After completing his studies (Physics), he wanted to get acquainted with the country of his ancestors and found employment in a research laboratory with Siemens in Munich. He met his wife at the home of Bavarian friends of his mother and decided to stay in Germany. He became a co-founder of Softlab in 1971.”

Der Spiegel, Jan. 17, 1983, page 71

“Softlab’s charming specialist, Ms. Christiane Floyd PhD, demonstrated the Program Development System PET on the company’s stand to a large number of experts.”

Computerwoche, November 21, 1975

1977: Connection to Mainframe computers

“The release of data communication procedures for the linking of the PET-hardware, the Philips X1150 (data entry system) with IBM S360/370 and Siemens 4004/7000 completes the development activities by the Software company Softlab of Munich.”

Computerwoche, April 1, 1977

1978/79: Export to the USA

“The first US customer was the Boeing Company, the aerospace and defense corporation, with 7 systems. The biggest purchaser became the Bank of America, which ordered 24 Maestro computers with 576 terminals for its 10,000 programmers in its San Francisco computing center. Softlab founded a US branch which sold about 100 Maestro systems with some 2,000 terminals in the US.”[6]

1980: Interactive training

“‘There is much more education than knowledge in the world’, wrote Thomas Fuller as early as 1732. Learning is a mental activity, and its efficiency has always been fairly low. The same is shown, some 250 years later, by the poor production results of a modern form of mental activity: software development. This, at least, is the opinion of Rita Nagel of Softlab GmbH, Munich, who also believes that this need not be the case. Software company Softlab has developed an interactive program development system called PET/X1150 which has rationalized this mental activity. It therefore made sense to include training facilities for PET users within the same tool.”

Computerwoche, August 8, 1980

1982: Connection with IBM TSO, IMS and CICS

Geza Gerhardt, manager of the communications group at Softlab, extended[7] the IBM 3270 simulation in Maestro in 1982. This allowed further off-loading of processing from the mainframe to dedicated systems.

“The system now offers extended interactive support for design, documentation and testing as well as project management. Next to 3270-BSC dialog, SDLC/SNA is now also supported. Parallel connections to TSO, IMS and CICS are also possible.”

Computerwoche, April 30, 1982


Technology

Maestro central processing unit
Maestro tape, disk drives, printers


The basic system was a “key-to-disc” data entry system. Historical predecessors were “key-to-tape” systems such as the Mohawk Data Recorder, Olympia Multiplex 80 and Philips X1100.

Maestro used the Philips (Apeldoorn, the Netherlands) X1150 Data Entry system, which was built on a Four-Phase (Cupertino, California) IV/70 processor.

A typical configuration at the time of introduction[8] was:

  • System with 96–192 KB RAM
  • 6–24 (dumb) terminals
  • 10–80 MB disc
  • Magnetic tape
  • Line printer (various types and models were supported)
  • Data communication connection

"In the Four-Phase System IV/70 the memory and control requirements of up to 32 keyboard display terminals are combined with the mainframe memory and logic of the Central Processing Unit. As a result, data is displayed directly from refresh areas of the Four-Phase Systems parallel-accessed LSI memory, eliminating the cost of separate buffer memories in every terminal. Using this technique, exceptionally high video throughput results, enabling new information to be displayed at a rate of 395,000 characters per second."

Four-Phase commercial brochure, about 1972
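The brochure's throughput figure can be put in perspective with some back-of-the-envelope arithmetic. The 24×80 screen geometry below is an assumption for illustration; the brochure itself quotes only the 395,000 characters per second and the 32-terminal maximum.

```python
# Back-of-the-envelope arithmetic on the Four-Phase brochure figures.
# Assumption: a 24x80 character screen (1,920 characters), which the
# brochure does not state.
THROUGHPUT_CPS = 395_000          # characters/second (from the brochure)
MAX_TERMINALS = 32                # terminals per IV/70 (from the brochure)
CHARS_PER_SCREEN = 24 * 80        # assumed screen geometry: 1,920 characters

screens_per_second = THROUGHPUT_CPS / CHARS_PER_SCREEN
per_terminal = screens_per_second / MAX_TERMINALS

print(f"~{screens_per_second:.0f} full screens/s in total")    # ~206
print(f"~{per_terminal:.1f} screen refreshes/s per terminal")  # ~6.4
```

Under these assumptions, even with all 32 terminals active, every screen could be redrawn several times per second, which is consistent with the per-keystroke feedback described earlier.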

The hardware evolved over time: the Four-Phase IV/70 processor was replaced by the more powerful Four-Phase IV/90 system and more terminals, memory and disc capacity could be supported. The base Philips X1150 Data Entry system was rebranded as Philips P7000 Distributed Processing System as significant additional functionality was added.


The operating system was a proprietary Four-Phase Disc Operating System (rebranded by Philips), which supported the usual components of that time: a text editor, an assembler, various compilers, and a linkage editor.

The Four-Phase software offering consisted of packages for:

  • Data Entry (key-to-disc)
  • 3270 emulation
  • 3270 emulation with programming facilities
    • This unique package allowed the user to include local programming to off-load the mainframe

The original PET/Maestro software made extensive use of existing libraries from the above packages.


Contemporaries

1974: Structured Programming

“One of the cornerstones of modern methods in software technology was structured programming. This methodology became obligatory for all program development at Softlab in Munich. Peter Schnupp PhD, one of the founders of Softlab but also an associate professor and author of many professional publications, considered structured programming to be the ‘return of common sense’.”

Computerwoche, March 1974

The founders of structured programming, Prof. Edsger Dijkstra and Sir Charles Antony Richard (Tony) Hoare, were keynote speakers at a meeting for software specialists at the Max Planck Institute in Munich in December 1974. Peter Schnupp PhD was the president of the ACM at that time and presented a lecture with the content above.

1978: Is the life of COBOL eternal?

“Even if the new programming languages were considerably better than the existing ones, their widespread use would not be certain, because the prospective users feel little need for them. During the design of the new languages, decisions are often made which may bring advantages in scientific institutions but disadvantages in industrial software production. These problems often offset the advantages over the older designs.”

Peter Schnupp PhD, in Computerwoche April 3, 1978

1980: Art, Manual Labor or Science?

“Structure and originality are not necessarily exclusive of each other. This will be proven in the following:

  1. There are people who are against structures because they harm the originality.
  2. There are people who are against originality because it limits the usability and the maintainability of software products.
  3. There are people who are in favor of originality, because only then is it possible to realize creativity in programming.
  4. And, finally, there are those that are against tools that enforce structure, thus excluding originality and blocking ‘self-realization’.

Who is right and who is wrong? Everyone! It depends on your definitions of structure and originality. And, of course, it depends on the use of the right tools.

As it is still not clearly decided whether programming is art, manual labor or science – probably a bit of all three – it is necessary to discuss all three aspects.”

Computerwoche, May 9, 1980 Peter Schnupp PhD


How realistic is Software Technology?

Introduction to the history of Maestro (The effect of psychological mechanisms on Software technology)

“The author has not yet witnessed any larger, successful software project that was executed according to the rules of software technology for more than one third of its projected duration. Or that was explicitly discussed, specified and planned without programming of critical system parts, modeling or similar ….”

“On the other hand, there is no reason to disregard a successful project in which the most basic rules for (extensive) specification of the coding were totally neglected. …

This project was the PET system, to be regarded as Germany’s, and perhaps the world’s, most successful software development tool at that time. The first version of PET was started about 4 months before it was introduced at the Hannover Fair. Moreover, it was more or less “fumbled” into the software of the Philips X1150 Data Entry system, as an add-on to existing components of the base system, not even as separate ad-hoc programs. This method had the advantage that the system under development existed from day one, so that the developers were never separated from reality: in the end, they developed their system with the system itself, which constantly confronted them with the real requirements of their environment.”

Peter Schnupp PhD, 1973

Invariants of Software Engineering

“Prognoses are … a favorite theme in our profession’s press and its editorials. They like to speculate about how client/server systems will replace the mainframes, that Java is the programming language of the future, or how e-commerce will change the economy. But they never reflect on their predictions from yesterday, one year or even 5 years ago – it would be disgraceful, and probably not very interesting to anyone. Yet it would be an educational experiment to record, once a year, what the changes in informatics have been over the last two, five and ten years. At the same time, one should reflect on how the prognoses from last year have turned out – good training for one’s judgment, and mostly a disillusionment about one’s capabilities of prediction. If this experiment seems too laborious, one could replace it with self-reflection: What did I expect in 1980 for the state of the art in 1985, or in 1985 about 1990, etc.?”

Prof. Ernst Denert PhD: CeBIT, Heads & News / Corporate Consult, 1998


References

  1. ^ Computerwoche
  2. ^ Der Spiegel, January 17, 1983, p. 71
  3. ^ Image credit: Museum of Information Technology at Arlington
  4. ^ Image credit: Museum of Information Technology at Arlington
  5. ^ Translator's note (RvG): this is the best translation I can manage, but it is still not very good, as the original text is equally vague. I would recommend replacing the quotation.
  6. ^ Der Spiegel, January 17, 1983, p. 71
  7. ^ Translator's note (RvG): "realized" was changed to "extended", as the IBM 3270 emulation software was developed and introduced by Four-Phase Systems and was part of the original PET software well before 1982.
  8. ^ Translator's note (RvG): the original text was extended and changed based on my recollection of that period.
  9. ^ Translator's note (RvG): new text, not translated.

External links

  • Christiane Floyd
  • Peter Schnupp on the story of Maestro I
  • IEEE History Center: Ernst Denert interview (29 June 1993)
  • Museum of Information Technology at Arlington: Four-Phase
  • Four-Phase Systems: a multi-terminal display-processing system