HIROSE Michitaka
Research Center for Advanced Science and Technology
The University of Tokyo



Today, the nature of man's relationship to machines has re-emerged as a hot topic among information technology specialists. The driving force behind this trend is the emergence of a new breed of information technology, exemplified by mobile computing and ubiquitous computing, which calls for a new paradigm in human interfaces.

  The theme of “man and machine” is not new. It has been discussed extensively for many years, since establishing a link between the two involves a whole range of issues - after all, man and machine are poles apart. What puzzles us scientists when we try to insert “human” factors into the context of science and technology is that we have to deal with ourselves as an object of research. Now, when you deal with yourself, it's impossible to completely insulate yourself from subjectivity. No methodology exists to cope with the concept of subjectivity in the first place, since objectivity is very fundamental to scientific research.

  Subjectivity is premised on the fact that no two individuals see the world in the same way, so we have to base our discussion on the notion that we are all different. Since generalization is the key to science and technology, “being the same” is valued more highly than “being different.” Thus we scientists tend first to locate the mean and treat individuality as a secondary consideration. The tricky part, however, is that the dispersion among individuals sometimes speaks more eloquently than the mean itself. Think of the blood-type system, for instance. The four types - A, B, O and AB - are not distributed around a single “average” blood type.

  What should be noted here is that dispersion is more important than the mean value, and “difference” more important than “similarity.” Of course, this is not only a source of trouble. Because fingerprints and eyeground blood-vessel images differ from one person to the next, they can be used to identify individuals. If we start from the assumption that we are all different, we can broaden the applications of technological innovations. Take eyeglasses and clothing, for example. Their manufacture starts from the understanding that we are all different. You cannot simply mass-produce them at a factory; you need to make sure they fit each individual nicely - hence the need for vocations like opticians and tailors. The much-talked-about wearable computers are exactly like that - therein lies their essential novelty. Since the interface is in direct contact with a person, the computer can be tuned to the individual's characteristics to an extreme degree; indeed, unless it is, it will fail to demonstrate its inherent abilities. The greatest difference from the conventional computer is that it is not premised on being shared or lent among different individuals.

  Another intriguing feature of wearable computers is that they can also function as clothing. After putting on a wearable computer, the user might come to resemble a machine. Suppose that this computer is connected to a network around the clock, and that the user can access the network at will. Seen from the network, the user looks like just another terminal. When you put on a wearable computer that records your every move and all the input from your sensory organs, a huge quantity of data accumulates, which may be accessed via the network. In the past, the human interface implied a border between man's environment and the machine's. With wearable computers, however, the machine intrudes upon man's territory.

  This may sound like going overboard, but it seems to me that a whole new philosophy is emerging, centered on how to bring man closer to the machine, on top of the conventional one centered on how to bring the machine closer to man. In short, it is a philosophy of “cyborgs.” As fantastic as it may sound, the current fashion among mobile phone users in Japan - most people carry one and use its thumbing interface quite naturally - amounts to equipping a person with a keyboard interface. That being the case, it is possible to design highly practical interfaces without resorting to state-of-the-art pattern recognition technology.

  This implies that the border between man and machine is becoming increasingly blurred. True, a flesh-and-blood person and a machine are opposing concepts. However, artificial intelligence has developed so dramatically that we often hesitate to describe machines as “mechanical,” since they seem so close to human. If the trend toward mechanizing man accelerates, the definition of the border between the two will change dramatically. My guess is that this relationship will gain many more layers, rather than remaining a simple dichotomy between two different things. “Man and machine” and “human interface” are two research themes that never fail to fascinate me.