T 3131/19 (Body movement dependent user interface/PHILIPS) 07-04-2021
METHOD OF PROVIDING A USER INTERFACE
Amendments - allowable (yes)
Claims - clarity
Claims - main request (yes)
Novelty - main request (yes)
Inventive step - main request (yes)
I. This appeal is against the decision of the examining division, posted on 11 July 2019, refusing
European patent application No. 08 854 624.7. The application was refused for non-compliance with Article 123(2) EPC and lack of clarity (Article 84 EPC). The decision further contained remarks in respect of novelty of the claims (Article 54 EPC) over the disclosure of
D1: US 5 704 836.
The following documents were also cited in examination:
D2: FR 2 695 745,
D3: US 2007/139370.
II. Notice of appeal was received on 1 August 2019, and the appeal fee was paid on the same date. The statement setting out the grounds of appeal was received on
7 November 2019. The appellant requested that the decision under appeal be set aside and that a patent be granted on the basis of the claims on which the decision was based (main request) or on the basis of the claims according to one of the first to third auxiliary requests filed with the statement setting out the grounds of appeal. The appellant also requested oral proceedings in the event that the decision was not set aside.
III. Claim 1 of the main request reads as follows:
"Method of providing a user interface for controlling a system (1;27), including the steps of:
observing a presence of a body or body part of a user in a certain environment of a sensor device (6,14-17;30);
and being characterized by further comprising:
making available to the user at least one perceptible part (28,34) of the user interface in association with a particular functionality for interacting with the system (1;27) controlled by the user interface in dependence on a rate of displacement of the body or body part of the user observed in the certain environment; and
wherein making available to the user the at least one perceptible part of the user interface in association with the particular functionality for interacting with the system (1;27) controlled by the user interface comprises providing one of a plurality of user input controls associated with respective different actions of the system controlled by the user interface in dependence on the rate of displacement of the body or body part of the user observed in the certain environment."
The main request comprises a further independent claim (claim 9) directed to a corresponding system.
Due to the outcome of the appeal, there is no need to detail the claims of the auxiliary requests.
1. The appeal is admissible (see point II above).
2. Main request - Article 123(2) EPC
2.1 In the decision under appeal, it was objected that the introduction of the term "sensor" in claim 1 represented an intermediate generalisation of the feature "distance sensor" present in the description originally filed.
Claim 1 and the description as originally filed (see section "Summary of the invention") both comprise the feature that the observing step is performed in a certain environment of a device. The description relates to two embodiments of the claimed method, a first implemented in a home entertainment system (see Figures 1 and 2) and a second implemented in a coffee machine (see Figure 3). The first embodiment uses an infra-red transducer, an ultrasound transducer and a camera to detect the presence, distance and direction of movement of a body part of the user (see page 5, lines 32 to 34). The skilled person will understand from the whole passage relating to the first embodiment (from page 4, line 31 to page 6, line 23) that the device defined in claim 1 is one of the infra-red transducer 14, ultrasound transducer 15 and camera 16. These devices are undoubtedly sensor devices which are able to detect, at least, the presence of a body part in their environment.
In a variant of the first embodiment, a docking station 4 for a portable media player 5 comprises a distance sensor 17 which senses that the portable media player 5 crosses distance thresholds when approaching the docking station. The distance sensor 17 is thus also a sensor device detecting the presence of a body part, since the user is carrying the portable media player 5.
In the second embodiment (see Figure 3), a sensor 30 on the coffee machine determines the rate at which the user's hand approaches a switch 28 of the coffee machine. The sensor 30 is thus a device which, at least, detects the presence of a body part in its environment.
For these reasons, the board holds that the replacement of the term "device" by the wording "sensor device" in claim 1 does not contravene Article 123(2) EPC.
2.2 It was further objected in the impugned decision that the amendment from "controls" to "user input controls" in claims 1 and 9 was not supported by the application documents as originally filed. However, the board agrees with the appellant that the following passages provide unambiguous support for the introduction of the wording "user input controls". The passage on page 5, lines 27 to 29 describes user controls and output means as elements of the user interface, which implies to the skilled person that user controls are input elements of the user interface, and are thus user input controls. Further, the passage on page 6, lines 11 to 18 describes the provision of user controls to provide an input function to a media player. Since the media player is operated through a user interface, it is clear to the skilled person that the user controls are user input controls of the user interface. Moreover, the passage on page 7, lines 18 to 20 describes that the user provides input to the control system. Since the originally-filed claim 1 defines that the controls are associated with actions of the system controlled by the user interface, the skilled person will clearly understand that the controls in claim 1 as originally filed are indeed user input controls.
2.3 For these reasons, the board holds that independent claims 1 and 9 meet the requirements of Article 123(2) EPC.
3. Article 84 EPC
3.1 The impugned decision found that the feature of "observing a presence of a body or body part of a user in a certain environment of a sensor device" was unclear, since neither the kind of sensor device nor the certain environment were specified.
With respect to the wording "certain environment", the board agrees in substance with the appellant that the skilled person understands from the context of the present application that this wording merely defines the area that the sensor is observing. The term "certain" is merely to be understood as meaning "given" or "predefined".
In respect of the wording "sensor device", several types of sensor are mentioned in the description, e.g. on page 5, lines 13 to 14, for performing the specific task of, at least, detecting the presence of a body part in its coverage area. The device 13 mentioned on page 5, lines 9 to 12, contrary to what is stated in the decision, is not a sensor device which observes the presence of a body part within the meaning of claim 1, but rather a component of the user interface when the latter is a touch screen used to directly interact with the user interface. Thus the wording "sensor device", read in the light of the description, is clear.
3.2 The decision also found that the feature of "making available to the user at least one perceptible part of the user interface" was not clear, since in the embodiment relating to the coffee machine the perceptible part was a mechanical switch which was, per se, always physically present and thus available.
However, as argued by the appellant, the wording "making available" in claim 1 does not relate only to the "perceptible part" but rather to the "perceptible part of the user interface in association with a particular functionality for interacting with the system controlled by the user interface". In the embodiment relating to the coffee machine, the mechanical switch is made available in association with a functionality for interacting with the machine which is dependent on the displacement rate of the user's hand. The switch's functionality is either the control function "stop pouring coffee" or the control function "on/off", both of which control the operation of the coffee machine. The mechanical switch thus represents both a user interface and a perceptible part of it within the meaning of claim 1, the switch having two different functionalities in dependence on the rate of displacement of the user's hand toward the coffee machine.
3.3 For these reasons, the board holds that independent claims 1 and 9 meet the requirements of Article 84 EPC.
4. Novelty and inventive step
4.1 Prior art
D1 discloses a control system of a video game that displays a graphical user interface, for example an animated character, on a display unit, e.g. a television monitor (see column 9, lines 36 to 39). The control of the character is performed by the user moving parts of his body (see column 9, lines 43 to 45). The control system modifies an image displayed on the monitor based on a command signal by moving the animated character in a manner intended by the user (see column 9, lines 60 to 64). The speed and direction at which the user moves determine the command signal for controlling the graphical user interface on the display unit (see column 11, lines 37 to 38). For example, an upward movement of the right arm of the user results in the character displayed on the monitor jumping, with a slow upward movement resulting in the character jumping slightly, while a fast upward movement results in the character jumping strongly (see column 11, lines 43 to 46). The system of D1 enables the video game to be controlled without requiring a combination of buttons and joysticks.
D2 discloses a man-machine gesticulatory dialogue method in a virtual environment. It uses gesticulatory means of a user, the state of which is defined by a set of parameters, and makes use of two functional modes. The transition from one mode to the other is triggered by the user's hand, serving as the means for gesticulatory communication, crossing a boundary (see the hand in Figures 2, 3a, 3b, 5 and 6).
D3 discloses a motion recognition system for controlling an electronic device for interacting with a user. It is capable of detecting accelerations and angular velocities generated by the user's gestures, converting these data into a time-related gesture sequence, and then comparing the time-related gesture sequence with a predefined motion information sequence so as to enable the controlled electronic device to perform a specific operation according to the result of the comparison (see paragraph [0008]). In an embodiment, the motion recognition system is capable of defining a series of gestures to be used as a code for locking/unlocking an electronic device (see paragraph [0061]).
4.2 In the part "Further remarks" of the decision, a novelty objection based on D1 was raised.
The board however agrees with the appellant that D1 does not disclose that a part of the graphical user interface is associated, based on the displacement rate of a user's body part, with a particular functionality for interacting with the system controlled by the graphical user interface. Indeed, although the graphical display of D1 is modified based on the user's movement, no functionality for controlling the system, i.e. the video game, is associated with the modified graphical display.
The passage in column 11, lines 37 to 47 quoted in the decision discloses that the speed and direction at which the user moves determine a command signal. However, this command signal is used merely to control the movement of the graphical display output and not to provide a part of a user interface with an associated functionality for controlling the video game.
Further, the passage in column 11, lines 60 to 67 in conjunction with Figure 7 quoted in the decision relates to a subject frame captured from a camera observing the user and the derivation of a displacement rate of the user based on this subject frame. The subject frame shown in Figure 7 is an internal representation of the captured image of the user used for determining the user's motion. It cannot represent a perceptible part of the user interface with associated functionality within the meaning of claim 1.
The technical effect of the essential differences detailed above between the subject-matter of claim 1 and the disclosure of D1 is that a part of the user interface provides different functionalities for controlling the system depending on the rate of displacement of a user's body part towards the user interface.
The objective technical problem can thus be formulated, as proposed by the appellant, as how to provide an improved and more flexible user input interface for controlling a system.
The skilled person trying to solve this problem would not have looked into D2 or D3 since neither document is aimed at improving a user interface. Moreover, neither of these documents discloses providing different parts of a user interface with functionalities dependent on the displacement rate of the user.
4.3 For these reasons, the board holds that the subject-matter of independent claims 1 and 9 is novel (Article 54 EPC) and involves an inventive step
(Article 56 EPC), having regard to the prior art on file. Claims 2 to 8 are dependent claims and, as such, also meet the requirements of Article 56 EPC.
For these reasons it is decided that:
1. The decision under appeal is set aside.
2. The case is remitted to the department of first instance with the order to grant a patent on the basis of the following documents:
- claims 1 to 9 of the main request,
- description and figures to be adapted.