
DESIGN FOR INTUITIVE USE

MMM 130

MECHANICAL ENGINEERING

ADAM A MARSH

0404304

April 2008
The author would like to thank Dr. R M Setchi
for her continued support, discussion and
ideas throughout the duration of this project
and Dr. C Charron for his assistance in the
‘Blue Room’.
I hereby declare:
That except where reference has clearly been made to work by others,
all the work presented in this report is my own work;
That it has not previously been submitted for assessment; and
That I have not knowingly allowed any of it to be copied by another
student.

I understand that deceiving or attempting to deceive examiners by passing off the work of another as my own is plagiarism. I also understand that plagiarising the work of another or knowingly allowing another student to plagiarise from my work is against the University regulations and that doing so will result in loss of marks and possible disciplinary proceedings against me.

Signed ……………………………………………

Date ……………………………………………
ABSTRACT

This project explores the field of intuitive design. Existing ideas are researched and built upon to discover whether mobile phones are currently viewed as intuitive to use and whether there is a relationship between this factor and the success of a product, comparing Nokia and Sony Ericsson. An experiment was carried out to determine which areas of a mobile phone facia are commonly viewed as being used for specific tasks and to investigate what image schemas are used to carry out generic interaction tasks with a hand-held device.

It was found that the Nokia interface scheme is currently viewed as being very intuitive to use, being given a score of 4.7/5, and that Nokia's system of interface design has shaped the consumer's expectations of interaction with a mobile phone. It is shown that the nearer the top of the keypad a key is positioned, the more important, or more regularly used, it should be. The most common image schemas indicated by participants for interaction were vertical and depth schemas.

It has been decided that the following statement should be the definition of intuitive design:
Intuitive design is the seamless alignment of cognitive expectation with
interface actuality.

New technologies are changing the function of the mobile phone, but this study
suggests that these extra functions appear to be confusing to use on such a
small device. New techniques may need to be investigated to keep the mobile
phone a light, portable product that remains intuitive to use.
CONTENTS

List of figures 3

1. Introduction 4
1.1 Motivation 4
1.2 Perspectives 5
1.3 Aims and Objectives 7
1.4 Outline 7

2. Existing Approaches 8
2.1 Defining Intuitive Interaction 8
2.2 Understanding Intuitive Interaction 10
2.3 Principles of Intuitive Interaction 11
2.4 Tools for Design 13

3. Cognitive ergonomics 15
3.1 Image Schema 15
3.2 Gestalt Laws 18

4. Market Trends in the Mobile Phone Industry 19

5. Questionnaire 21

6. Experimental Procedure and Hypothesis 24


6.1 Visual Test 24
6.2 Image Schema Test 27

7. Results 30
7.1 Questionnaire 30
7.2 Visual Test 31
7.3 Image Schema Test 31

8. Discussion 32
8.1 Questionnaire 32
8.2 Experiment 35
8.2.1 Visual Test 35
8.2.2 Image Schema Test 44

9. Conclusions 48
9.1 Experimental Conclusions 48
9.2 Intuitive Designs in the Future 50

References and Bibliography 53

Appendix A – Questionnaire i
Appendix B – Results iii
Appendix C - Ethical Approval Form xv
Appendix D - Immersion CyberGlove xviii
Appendix E - Setup of the CyberGlove into Alias MoCap6 xix
Appendix F - Importing Data into Microsoft Excel xxi
Appendix G - The Analysis Code, Matlab xxiv
Appendix H – Record of Project Meetings xxix

1. INTRODUCTION

1.1 Motivation

The term ‘intuitive’ is referred to regularly in design, technical fields and by users
of products that require cognitive interaction. It has become what many call a
‘buzz’ word, clouding its meaning to all. Little work has been carried out to
determine what makes a design intuitive to use. The Oxford English Dictionary
(1983) has the following definition:

“Intuition n. the power of knowing or understanding something immediately without reasoning or being taught.”

Therefore, a product can be intuitive if its function is clear and the methods needed to implement its functions require little or no cognitive effort. It is apparent, then, that an intuitively designed product would be a very powerful competitor in its market. A mobile phone, for example, that responded to the owner's actions in the desired manner every time would not need an instruction manual, would be quick to use and would allow the customer to use the phone as the powerful tool such devices are increasingly developing into.

The website industry is perhaps the largest investor in research into ‘intuitive
design’. A client wants their website to be clear and easy to use for all who arrive
on their homepage, so that the desired information can be found.

Computers are used to control processes in areas that are unsafe for humans to enter, for example nuclear power stations. Such sectors can suffer catastrophic disasters if the process becomes unstable, Chernobyl being one example (E. Stang, 1996). When an emergency becomes apparent, it is desirable for the control engineers to use their cognitive power on determining the required actions to avoid disaster, rather than on how to implement these decisions by way of the control panel. In the case of Chernobyl, the control engineers were not aware of the situation, as the control panel failed to indicate the dangerous conditions developing in the reactor core.

Cardiff University MEC research teams (www.mec.cf.ac.uk) have recently developed a Tangible Acoustic Interface for Computer-Human Interaction (TAI-CHI). This research has made it possible to create an interface out of any surface by measuring the change in acoustics of a body. This opens up a huge new field for inventiveness in interaction with many devices in the future. Making these interfaces intuitive to use should be a key area of development, as the movements that can be measured are possibly endless in both action and subtlety.

Intuition, or to be intuitive, is a very difficult concept to define. The idea stands strong as being something that is subconsciously effortless to use, but achieving a definition requires a particularly fastidious approach. Unfortunately, in only five months, it would be impossible to investigate and define all the factors that lead to this ideal. A specific perspective, narrowing this project to explore just a few factors, has to be taken to ensure that design for intuitive use is explored to the depth it requires.

1.2 Perspectives

Possible perspectives are those of the consumer (what aspects of existing products are currently intuitive to the consumer), the designer (what current trends exist for products that are 'intuitively designed' and why), reaction control (how it is possible to force a desired result by making that specific path the easiest to follow) or the product (how successfully do products that are, by current standards, the most intuitive to use, compete on the market).

As intuition is not a physical or tangible variable, experimental procedures and technologies from many fields, such as psychology, science and ergonomics, may have to be combined to implement a suitable study.

In J. Sweller’s description of Cognitive Load Theory (1994), he explains that “learning best happens under conditions that are aligned with human cognitive architecture.” It has been argued, and backed up by experimentation by many psychologists, that the use of images is a very powerful teaching process, and it is safe to say that the majority of interfaces rely heavily on vision. This seems to suggest that images are aligned with human cognitive architecture. Therefore, the idea that the subconscious creates images to work out complex problems, or to describe an idea to another person, seems viable. Working a device is similar: determining what images are created in the subconscious, and from what experiences they are rooted, could result in a very powerful tool if these images are represented as closely as possible in a developing interface.

Such a study would be never-ending, as each individual would have had different experiences, and hence a different level of intuition. This is not to suggest that it is impossible to define a set of parameters which would be intuitive to all, but perhaps that this level of intuition would be too basic to be implemented solely into an interface without some level of learning being required for a completely effortless interaction. Patterns of intuition may be exposed between the generations and cultures from which the participants came, possibly resulting in a number of salient directions of intuition. To intuitively design a product for just one of these sub-cases may prove to be far easier and more successful.

1.3 Aims and Objectives

The aims and objectives of this project were to;

1.4 Outline

Objective 1 has been carried out in this chapter and future significance is
discussed later in chapter 9. Chapters 2 and 3 investigate current ideas of
intuition and factors leading to intuitive products whilst chapter 4 looks at
previous market trends. A questionnaire, chapter 5, was used to search for a
relationship between intuitive design and success of a product as part of the
experiment which is explained in chapters 6 and 7. The results are analysed in
chapter 8 and discussed in chapter 9 where a definition for intuitive design has
been stated and areas of future research are suggested.

2. EXISTING APPROACHES

2.1 DEFINING INTUITIVE INTERACTION

Through reviewing literature, Blackler (2002-2007) gave intuitive use the following definition:

Through literature, interviews and workshops the Intuitive Use of User Interfaces
research group (IUUI, 2005) defined an intuitive interaction with the following
statement;

J. Raskin (1994) believes that all intuition is derived from past experiences and
knowledge.

It is also stated that intuition is learnt over time through the use of technologies, so that intuition, as the dictionary defines it, cannot exist in a designed interface. Instead, an interface that can be easily learnt has the appearance of being intuitive. Therefore, intuition is something acquired by people whilst using technology, rather than a standard of technology with which interaction is effortless.

It is suggested that a superior product or new technology cannot be intuitive, as it cannot be the same as an existing product. It must therefore be different, unfamiliar, and hence inherently non-intuitive. Raskin concludes by redefining the use of the
word ‘intuitive’ with the following statement:

“Intuitive = uses readily transferred, existing skills.”

This is radically different from the dictionary definition given earlier in this text,
agreeing with Raskin that the word is used inappropriately. Some similarities
exist however, when compared to Blackler’s definition. Perhaps what is being
said is that there is no congenital or inherent knowledge of technology. Intuition
is a term that cannot be applied to the product itself but instead to the link
between interface and user. As every user is different, so too will be the level of
intuition between the two. A computer mouse seems intuitive to all computer
users, but the first time user may still need to be shown how to use it, although
it would be very rare for that person to need to be shown again. This would
suggest that a mouse becomes intuitive very quickly and this is perhaps the
ultimate aim of an intuitive interface.

2.2 UNDERSTANDING INTUITIVE INTERACTION

Hurtienne (2005) describes how prior knowledge, knowledge acquired before interaction with the new product, will come from a variety of sources. Hurtienne classifies these sources into innate knowledge, embodied interaction, culture and expertise. It is explained how specialist knowledge of tools and technologies can exist over the last three levels (Figure 2.1).

Hurtienne acknowledges that the higher up the continuum, the smaller the potential number of users possessing this knowledge, but also that the lower-level knowledge is used more frequently. These are the levels that are more likely to be applied unconsciously and therefore intuitively. During high mental workload and stress, a fallback onto lower stages of the continuum will occur; furthermore, a more intuitive interface, using these stages, will use less cognitive processing power.

2.3 PRINCIPLES OF INTUITIVE INTERACTION

Blackler (2005) states that there are three principles to work with to create an
intuitive interface;
1. Make functions, locations and appearances familiar for features that are
already known.
2. Make obvious how to use less well-known features by using similar things
to demonstrate their function, appearance and location.
3. Increase consistency of function, appearance and location within the
interface.

These principles lead to Blackler’s “Continuum of Intuitive Interaction”, shown in Figure 2.2. Each principle is linked to a set of terms. Principle 1 is compared to influencing the interface with body reflectors, population stereotypes and features from existing products in the same domain. Principle 2 relates to familiar features from other domains and metaphors, for example Microsoft ‘Windows’. Principle 3 is drawn outside the continuum as the decision to influence the interface by this principle draws upon the terms used for the previous principles and is applied to the entire design.

Similarly to Blackler, Hurtienne devised a number of principles through which to
design an intuitive interface (figure 2.3).

There are strong similarities between Blackler and Hurtienne here. Affordances
and consistency exist in both, with the same definition. Blackler has more
principles related to previous experiences whilst Hurtienne’s principles are all
related to the interface.

2.4 TOOLS FOR DESIGN

To aid designers in applying intuitive use to interfaces, Blackler developed a conceptual tool (Figure 2.4). The designer is required first to investigate who the target user is and to determine what technologies and interfaces these people are familiar with. These are then related to the product-to-be, and the designer can pass down the spiral applying each field to the design, gradually making the final product a more intuitive design. In theory, a design which only requires the top few levels will be more intuitive to use than one which requires influences from the lower levels. However, the more recent the technology, the further down the spiral one would have to go, as there are fewer previous experiences to work from.

It is important to note that each level of the tool has three spirals. The spirals represent function, appearance and location, as shown in figure 2.5. This order should be followed, as function precedes appearance in importance, which in turn precedes location, in accordance with previous work (Blackler, 2005).

The IUUI have produced a questionnaire to evaluate the intuitive interaction of a product, measuring four factors: perceived effortlessness, error rate, achievement of goals and effort of learning. A product with a high perceived effortlessness and achievement of goals would be classed as more intuitive than one with a high error rate and effort of learning, as the latter factors break the definitions of intuitive use.

3. COGNITIVE ERGONOMICS

3.1 IMAGE SCHEMA

Image schema (schemata in the plural) is a term used by psychologists to explain the functioning of the brain when receiving information. There are disputed uses of the term schema, especially among psychologists, resulting in vague definitions.

This description of image schemata aims to explain how this cognitive function can be viewed from a non-psychologist's perspective and ultimately from an engineering or design perspective.

Mark Johnson was one of the first to define the idea of image schemata. He
argues that;
“Human bodily movement, manipulation of objects and perceptual
interactions involve recurring patterns without which our experience would be
chaotic and incomprehensible.” (Mark Johnson, The Body In The Mind, 1987)

He later describes image schema as:

“…a dynamic pattern which functions somewhat like the abstract structure of an image and thereby connects up a vast range of different experiences that manifest this same recurring structure.”

An image schema is a mental pattern that recurrently provides a structured understanding of various experiences. Schemata can be used physically, describing objects or actions, or metaphorically as a source domain to provide an understanding of other experiences. For example, a force schema applied to gravity or wind would be a physical use of the schema, whilst applying it to love or justice would be a metaphorical use.

There are many schemata used by the brain in this way. Johnson noted a ‘partial list of schemata’ consisting of twenty-seven individual schemata, a small number of which are described in figure 3.1 below.

It is important to note that the term ‘image’ in image schema does not refer to an image that can be drawn or shaped in a three-dimensional world. An image schema does not have the rigidity or specificity of a picture or structure, but consists of parts that can be flexed and sculpted in an infinite number of ways, sometimes interacting with other image schemata, to align with perceptions, ‘visions’ and events.

In conjunction with Blackler’s principle of interaction to “make functions, locations and appearances familiar for features that are already known”, Aristotle, explaining why we need memory, pointed out that “Without a presentation, intellectual activity is impossible” (Arnheim, 1969).

It would appear that to teach a mind how to use something, it is the information presented to it that will determine its success. Going further, the success of this information in producing the ‘correct’ cognitive response lies in the similarity between the image schema formed whilst receiving information and the image schema used to implement interaction. Image schemata are flexible structures for the mental organisation of experiences and comprehension.

3.2 GESTALT LAWS

Hurtienne applied the Gestalt laws as one of his principles, and Johnson argues that Gestalt laws apply to his schemata. Gestalt is a German word meaning configuration or pattern. Gestalt psychology aims to explore the brain processes involved in the organisation of perception, with an approach focusing on the idea of the mind ‘grouping’ elements to perceive objects. Generally, there are five laws from which others stem, shown in figure 3.2.

The Gestalt laws can be applied to the design of a device with respect to cognitive ergonomics. It is assumed that buttons, for example, that are arranged to cause grouping by a single law will be perceived as having very similar levels of function, for example the number pad on a phone facia.

4. MARKET TRENDS IN THE MOBILE PHONE INDUSTRY
The mobile car phone emerged 25 years ago in 1982, and by 1987 the first handheld mobile phones had appeared, with the number of worldwide subscribers (phones connected to an in-use number) reaching one million. Sales of mobile phones have experienced exponential growth up to the current date (figure 4.1), whilst network capabilities and services are constantly innovating and improving. In 2007, one billion new handsets were sold worldwide (Hazel Gidleyr), bringing the worldwide subscriptions total to three billion, almost half the world's estimated population of a little over six billion (U.S. Census Bureau). Figure 4.2 indicates that the UK, with a population of 61 million in 2007, has 70 million subscribers!

In the third quarter of 2007, Nokia distributed over two and a half times as many mobile phones worldwide as its closest competitor, Samsung (figure 4.3), and has held the top spot since 1998.

We are now in the third generation (3G) of network technology. The 1G network worked with an analogue system, only sending and receiving voice calls. The 1990s spawned the digital 2G network with the ability to send text messages, and this remains the most used mobile technology by the consumer.

The UK introduced 3G in 2003. 3G systems are designed to process data, offering high data transmission rates and increased capacity, making them suitable for high-speed data applications. Voice signals are converted into digital ‘packets’, resulting in speech being dealt with in much the same way as any other form of data. 20% of UK mobile phone users are using 3G, but worldwide this figure drops to 6.7%.

Currently, some companies have started development of the 4G communication system. With a mixture of 3G and wireless technology, a high uplink rate of 200 Mbps can be achieved. So much data will be able to be transferred by the mobile phone that a 4G handset can support many more functions, such as operating a television and other home and office electronics.

5. QUESTIONNAIRE

To gain a view of what the sample population were using their mobile phones for and what phones they had used, the questionnaire (Appendix A) was distributed by e-mail. The answers were stored in an Excel spreadsheet to be analysed.
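
As a purely illustrative example of how such a spreadsheet might be summarised in Matlab (the language used for the analysis code in Appendix G), the sketch below averages the three self-rated scores. The file name, sheet name and column layout are assumptions for illustration, not the actual layout of the Appendix A workbook.

% Illustrative sketch only: the file name, sheet name and column layout
% are assumed, not those of the actual questionnaire workbook.
scores = xlsread('questionnaire_responses.xls', 'Responses');

awareness = scores(:, 1);   % assumed column 1: feature awareness (1-5)
ability   = scores(:, 2);   % assumed column 2: ability to use features (1-5)
learning  = scores(:, 3);   % assumed column 3: ease of learning (1-5)

fprintf('Mean awareness: %.1f / 5\n', mean(awareness));
fprintf('Mean ability:   %.1f / 5\n', mean(ability));
fprintf('Mean learning:  %.1f / 5\n', mean(learning));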

HYPOTHESIS

The current style of phone sales from a high street network store often offers a free mobile phone handset when a 12 or 18 month pay-monthly tariff is purchased. After this time, a new tariff is sold, resulting in a new handset. This market trend will undoubtedly mean that the vast majority of participants have owned their phone for a short period of time, with a positive correlation to the number of phones that have been used in the past.

This style of selling results in the user making their way through a high number of phones in a relatively short period of time, potentially learning a new interface every year or so. The user may, to make things easier, repeatedly choose the same phone manufacturer to replace the old handset. New technologies, appearances and styles are most probably the main factors that would cause the user to choose a new phone manufacturer. Users may also encounter new handsets through their peers, so when the time comes to choose a new handset, the buyer may already have done their market research and made their decision before walking into the store. The factors discussed here will become apparent, or be disproved, by the correlations between the handset currently used and those that have been used before.

Remaining, at heart, a telephone, it is hypothesised that calls will rank the highest score for use of the mobile phone. With the success of the text message after the advent of 2G network capabilities, text messaging will also score highly. Sony Ericsson created a range of Walkman mobile phones, with their MP3 capabilities being advertised heavily (figure 5.1). This technology comes almost as standard on most modern phones, but it is still too early to phase out dedicated, proven, large-memory MP3 players. Due to advertising, those that own Sony Ericsson handsets are more likely to have a high MP3 score than owners of other phones.

The digital camera function has hit a broader niche. A mobile phone is carried on one's person almost 24/7, including times when one would not want to take an expensive digital camera. These times do not require excellent resolution or swift transfer rates, making the camera phone (figure 5.2, Samsung SCH-V770, www.cellphones.techfresh.net) an excellent extra feature. Use of this will only increase as the miniaturisation of good camera technologies leaks its way into mobile phones. It is believed that the camera will score highly as a use for the mobile phone.

It is not expected that internet and e-mail will have high scores, as these are still relatively new technologies for the mobile phone and most people will have a computer linked to far faster internet at home. Those with fast-moving business lives would be more likely to utilise the on-the-go internet and e-mail services. Games, also, are expected to score low, simply because the most youthful users, those under 17 years of age, have not been included in this survey for ethical reasons.

The awareness and ability sections of the questionnaire will unveil how confident users feel about their current phone. Phones with a high awareness score may have an 'open' feel to their interface, where most features are on display at the same time, increasing awareness (figure 5.3). A high score will naturally appear if the user actually uses all the features of their phone. In theory, the ability score should not exceed the awareness score, as awareness must be present before one can have the ability to use a feature. There may, however, be some confusion here, and perhaps pride also, where the user may rank their ability against only the features they regularly use. The ease of 'learning' one's current phone is a direct score for specific manufacturers and will be analysed as such. If the hypothesis that most phones have been owned for one year or less is true, then this question holds great significance, as the early learning stages may still be fresh in the user's mind.

6. EXPERIMENTAL PROCEDURES AND HYPOTHESIS

Before the experiment began, each participant was asked to fill in the following consent form. The experiment was only carried out with those who had given their written permission.

6.1 VISUAL TEST


The experiment consisted of two parts. The first part involved a visual test with two schematics of simplified button configurations: one based on the Nokia system (figure 6.2) and the other based on the Sony Ericsson system (figure 6.3), which also holds similarities to the Motorola, Samsung and iPod systems. The test was videoed with a digital camera for analysis.

The aim of this part of the experiment was to determine which keys, or areas of the mobile phone facia, were most commonly viewed as being used for specific tasks, and to investigate whether there were any consistencies between the methods required for existing phones and these schematics. The schematics were considerably larger than life size, printed on A4 paper. Participants were informed about the scene in front of them:

“You have in front of you a simple schematic of a phone facia. The visuals, key
notation and screen interface are entirely that of your choice. You are currently
at your ‘home’ screen with a locked keypad.”
The participants were asked to carry out the following tasks, on each
configuration one at a time, using the schematics in any way they saw fit.

The results from this task were used to determine any consistency in the areas used for locking and unlocking the keypad, menu, select, back, number keys, delete, scroll down, scroll up, call and hang up.

HYPOTHESIS

It is hypothesised that there should be no problems with the typing of numbers, due to the long-standing standard of these keys being placed at the base of the keypad, ordered with 1 at the top left and 9 at the bottom right, and *, 0 and # along the bottom row.

The Nokia schematic may cause more inconsistencies than the Sony Ericsson
schematic as the only variable in the former is position, with identically sized and
shaped keys, thus removing most of the Gestalt laws. The results from this test
may therefore indicate broad similarities in desired positions of keys only.

Having the round shape in the top central position, the Sony Ericsson schematic bears a far closer resemblance to the items from which it has been derived, and may cause initial recollection of past experiences with such items. If this is the case, it would be expected that each test on this schematic will result in similar patterns of interaction.

6.2 IMAGE SCHEMA TEST

The second section of the experiment aimed to explore what image schemas were created in the mind when participants were asked to carry out typical tasks that occur on mobile phones. Movement of the thumb was recorded using an Immersion CyberGlove (figure 6.4), details of which are given in Appendix D. As the mobile phone handset is typically used mainly with the thumb, the analytical programme was written to track this motion. Matlab was used to carry out this task, and the written code can be found in Appendix G. The full process of handling the raw data and the calibration of the CyberGlove are found in Appendix F.

This test was carried out using the left hand to reduce the effects of muscle memory for right-handed people, in theory leading the participant to think a little more deeply about what they wanted to do. The participant was asked to make the 'thumbs up' gesture, as shown in figure 6.5. This was done to restrict the movement of the other fingers so that as much motion as possible was achieved solely by the thumb. The participants were then informed that they had the ability to control a hand-held device simply by the movement of their thumb. The device being imaginary, whatever motion was made, the desired interaction would take place. At the end of the experiment, the dimensions of the participant's thumb were measured using a ruler, to be input into the Matlab programme (Appendix G).
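
To illustrate how the recorded joint angles and the measured thumb dimensions might be combined, the sketch below treats the thumb as a simple two-segment planar linkage and traces an approximate tip position for each recorded frame. This is a minimal illustration only; the file name, segment lengths and planar simplification are assumptions, and it is not the Appendix G code.

% Illustrative sketch, not the Appendix G code. The thumb is treated as
% a two-segment planar linkage; theta is an n-by-2 matrix of joint angles
% (radians) per frame and L1, L2 are the measured segment lengths (mm).
L1 = 45;  L2 = 30;                        % assumed thumb segment lengths
theta = load('thumb_angles.txt');         % assumed angle data, one row per frame

n = size(theta, 1);
tip = zeros(n, 2);                        % approximate x-z tip position per frame
for k = 1:n
    a1 = theta(k, 1);                     % base (metacarpophalangeal) joint angle
    a2 = theta(k, 2);                     % tip (interphalangeal) joint angle
    tip(k, 1) = L1*cos(a1) + L2*cos(a1 + a2);   % x component
    tip(k, 2) = L1*sin(a1) + L2*sin(a1 + a2);   % z component
end

vec = diff(tip);                          % frame-to-frame displacement vectors
quiver(tip(1:end-1, 1), tip(1:end-1, 2), vec(:, 1), vec(:, 2), 0);
xlabel('x (mm)'); ylabel('z (mm)');
title('Approximate thumb-tip motion (planar sketch)');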

Three tasks were asked to be carried out;

Care had to be taken not to reduce the scope of the participant's imagination by asking, for example, to 'scroll down a list'. This would naturally have forced a list schema, and an image schema relating to 'down', to be created in the mind of the participant, rendering the experiment useless.

HYPOTHESIS

It is believed that this section of the experiment will show that mobile phones have greatly influenced our perception of what is possible in terms of interaction with a hand-held device. Having been carried out after the mobile phone facia test, these interactions will be fresh in the participants' minds, possibly causing the results from this latter part to follow a very similar pattern. The input of a number will no doubt result in many responses of key pressing on an imaginary keypad, as this is the way all mobile telephones operate.

It is believed that the majority of participants will refer to touching a surface. With the positioning of the hand, this surface will no doubt be their first finger.

It is hypothesised that interaction with a device must use more than one image schema, because the ability to interact with a man-made object will not be innate, but instead may be a complex thought process involving many links to past experiences. The tasks shape a primary image schema (list, increase or decrease), and it is the secondary, tertiary or further image schemas used in partnership with it that are being investigated. It is believed that the scale image schema will be used with another directional schema. The directions in which the participants move their thumb may, however, be shaped by the physical affordances that have occurred in past experiences.

For passing through a list, it is hypothesised that the list image schema will be paired with the vertical schema, passing 'down' and 'up' the list. Increasing and decreasing a value may well result in the same motion as the list, because to increase is typically synonymous with upwards, and vice versa.

7. RESULTS

7.1 QUESTIONNAIRE

The resultant charts from the questionnaire can be viewed in appendix B.1.

A total of 46 completed questionnaires were returned and analysed. Only one of these participants did not own a mobile phone. The average age was 28 years and, on average, each user had owned 5.5 phones in their time.

Text messaging had 33% of the score for use of the mobile phone (figure B.1.5),
and calls had 32%. The final third was split between all other uses displayed in
the questionnaire.

Nokia dominated the past market share with 49% (figure B.1.6). Sony Ericsson
and Motorola shared the runner up spot, holding 15% of past sales. However, of
currently owned phones (figure B.1.7), Nokia and Sony Ericsson share the top
spot with 32% each and Samsung took the third spot with 16% as Motorola
dropped to fourth with 12%.

Participants gave themselves average scores of 3.4, 3.2 and 3.6 out of a possible 5 (figure B.1.8) for phone feature awareness, ability to use all features and ease of learning their current phone respectively.

7.2 VISUAL TEST

The results from the visual test are displayed in appendix B.2.

There were a total of 8 participants in this section of the experiment. All participants carried out the tasks on both schematics, the Nokia schematic first, followed by the Sony Ericsson schematic.

The Nokia schematic had a 50% success rate for the correct interaction being carried out for the respective task. The Sony Ericsson scheme achieved 61%.

7.3 IMAGE SCHEMA TEST

The resultant vector graphs from the image schema test are displayed in appendix B.3.

A total of 7 participants completed this section of the experiment, apart from test C, which 6 participants completed as one result was not recorded properly. The majority of movement occurred in the vertical (Z) axis, and most participants made actions representing the pressing of a button as their choice of interaction. Of the 7 participants, 5 were right-handed and so were using their weaker hand to interact.

8. DISCUSSION

8.1 QUESTIONNAIRE

Persons under 24 years of age accounted for 74% of the completed questionnaires, indicating that the results will show the trends that exist for a small sample of the future independent generation. The vast majority of these results are the perceptions of students in this age range. This generation has grown up surrounded by technology and the computer era from a relatively young age. The typical student will have a high social use of their mobile phone, the possibility of higher peer pressure to keep up to date with styles and technology, a fair amount of 'free' time but less free cash flow, and may be open or naive to marketing techniques targeting the student customer base. The results gathered from older generations can be used as a slight comparison, but no claim is made that the perceptions of these generations are properly represented.

Almost 60% of those asked said they still used instruction manuals. This question was asked before any mention of a mobile phone, so the results indicate that either users are not willing to trust their existing ability with technology, or the majority of devices are still too complicated to use on first experience, suggesting that there is much scope for improvement in the way technology presents how it should be interacted with.

Mobile phones have been used by those asked for an average of 6.8 years, with an average of 5.5 phones being owned in this time. This correlates with the result that 55% of the participants have owned their current phone for less than 1 year and a total of 87% for fewer than 2 years. This result follows an exponential decay profile, showing that future sales of mobile phones have the opportunity to continue to soar as they currently are. If this influx of new phones to the population continues, they will have an even greater effect on shaping how one interacts with technology than this study will show.

Text messaging takes the top spot for mobile phone use with 33% of the total scores, whilst calling takes a further third with 32% of the scores. As predicted, the use of a phone as a camera scores highly amongst the remaining uses at 12%. MP3 capabilities score joint 4th with organisation at 7%, with Sony Ericsson users contributing 56% of the MP3 player score. Interestingly, games scored higher than both internet access and e-mail, which take only 4% of the scores between them. This agrees with the hypothesis that this technology is still new and slow in comparison to using a home computer. It also shows that access to the internet is rarely so important to this sample population that it cannot wait until one can find a better connection, at a fraction of the cost it would incur from a mobile device.

A staggering 49% of previously owned mobile phones were manufactured by Nokia. This Nokia flooding of the market must have had a great influence in inspiring other manufacturers and setting users' expectations of how to interact with a mobile device. Sony Ericsson, however, tied Nokia, with each holding a fraction under a third share of the current phones owned. This means the visual test section of the experiment is a comparison of the top two manufacturers for this sample. All other manufacturers have had a relatively constant share, with Samsung, LG and 3 enjoying a slight increase of the share, whilst Motorola has lost a little and Samsung has dropped to the ranks of Blackberry and Alcatel, with no current phones owned by this sample population.

These mobile phone owners gave themselves an average score of 3.4/5 (≈70%) for phone feature awareness. This is a good score, but combined with the low use scores for the newer features, it suggests that a mobile phone is still essentially a speech and text communicator between individuals. Over half of the sample gave themselves, perhaps modestly, a score of 4/5 for their ability to use the features available. This suggests that phones are generally well mastered by the sample; however, all five participants who gave a score of 1 for this question were under 24 years of age, and all owned a Nokia mobile phone. It is not known which model was owned. The source of this could prove to be a very important factor for future development. It may be that Nokia simply have too many features available on a mobile phone, or that these features are difficult to manipulate with the classic interface that Nokia mastered during the 2G era. Nokia did, however, score a remarkable 4.7/5 for ease of 'learning' the current phone. If Nokia decide to alter the way their interface functions to promote the use of the newer features, the interactions required for calling and sending a text message should not change. Sony Ericsson had the only two 1/5 scores for ease of learning, but overall scored a second-place average of 3.4/5.

8.2 EXPERIMENT

Only eight subjects took part in the experiment, due to time constraints. On average, the experiment took a total of 10 to 15 minutes for a participant to complete, but as most of the participants were fellow students, finding a suitable time for both parties proved to be the challenge. This small number of subjects and narrow age range makes it difficult to draw any valid conclusions, so this section of the dissertation has become a preliminary study and, to an extent, a test of the success of the experimental procedure. Computerisation of the collection and analysis of results would have made this task easier.

8.2.1 VISUAL TEST

Figures 8.1 to 8.13, excluding figures 8.7 and 8.8, are the author's own work, produced using Microsoft PowerPoint.

Unlock;
Although there was no deviation by any participant between their interaction processes to lock and unlock the keypad, there appear to be many different ideas of what these interactions should be. The Nokia system resulted in six different interactions, whilst the Sony Ericsson had seven. It does, however, appear to be the general consensus (81%) that locking and unlocking the keypad should involve the pressing of two keys, typically at opposite ends of the keypad. Neither top-to-bottom nor bottom-to-top comes out as dominant, with both schemes producing a balance of almost 50% each way. The only other technique involved a long hold of a single key, chosen once for the Nokia scheme and twice for the Sony Ericsson. Using two keys gives more safety against accidental unlocking of the keypad, which is perhaps why it is the favoured technique.

The bottom left of the keypad holds a significant influence in this task, being used by 75% of participants for the Nokia scheme and 62.5% of participants for the Sony Ericsson scheme. The entire top row has been used for both schemes, and in fact every interface interaction key has been used in the Sony Ericsson system. Generally, each participant has tried to fit the system they used for the first scheme to the second, showing that perhaps this task is not greatly affected by the configuration of keys; instead, any user's preferred method is simple and flexible enough to adapt easily to multiple keypads. The strong correlation in the use of the bottom section of the schemes, paired with the looser correlations for the top, indicates that perhaps this is how each phone manufacturer has tried to come close to an existing method to make learning their interface easier.

Only a single participant used the correct method for locking or unlocking the keypad on the Nokia scheme (figure 8.1), and only two on the Sony Ericsson scheme (figure 8.2). This is surprisingly low for both, as the results from the questionnaire show that Nokia have had a very large influence on the market, and Sony Ericsson have been growing rapidly in recent years. It can clearly be seen how Sony Ericsson have mirrored the Nokia method.

Menu;
The top left and top centre are clearly the preferred areas for the menu key. The enclosed key in the Sony Ericsson scheme attracted 75% of the participants' choices, and is one of the two existing menu keys on a Sony Ericsson mobile phone (figure 8.4). The top left is Nokia's choice of menu key (figure 8.3), which 50% of participants used, closely followed by the top central key.

As each participant chose an area at the top of the keypad and the right-hand side was not used by any participant for either scheme, an added importance has been given to the top and the left-hand side of the keypad. The menu key is probably one of the most used on a phone keypad and acts as the first port of call when carrying out almost any function on the phone, drawing similarities to how Western text is written, starting at the top left and moving right and down. This result could lead to the conclusion that the more important or common the use of a key, the closer to the top it should be, and if there is no obvious central location, to the left also.

Select;
The results for the select key and the menu key are exactly the same, and this was so for each participant. Nokia (figure 8.5) have used this system on their phones, and Sony Ericsson (figure 8.6) have made use of the enclosed central key for both features, although they have changed the top key from top right to top left. As it was Nokia who originally made their phones work this way, credit must be given, as it is now simply expected by the user.

Using the lone key for both features saves time and effort for the user and makes sense for the interface also. These top keys typically have no prescribed function attached to them as they stand alone, but are allocated as shown in figure 8.7, which shows a welcome screen on the left and the text message Inbox folder on the right. These keys are referred to as 'soft keys', as it is the phone software that determines their use.

Back;
Other than the general use of the top section of the keypad, no particular key indicates a strong correlation for the 'back' function. No central keys were used, however, so a side key should definitely be used, but there is no sway as to which side should take preference, with both left and right taking 50% of the results. Only two participants thought that back should be an option for the soft keys on the Sony Ericsson scheme.

It should be noted that with a Nokia phone, the red hang up key, shown in figure
8.8, acts as a cancel key when not in a phone call, returning the user to the
welcome screen, whilst the back key will take the user back only one step.
Therefore, those who pressed this key would have had the desired effect as the
question was to return to the welcome screen. This means half the participants
had a correct solution for the Nokia scheme, whilst only 3 of 8 did so for the
Sony Ericsson scheme. Almost all participants decided to hold their chosen key to
return to the home screen, with the same desired result as the Nokia hang up
key. The two participants that repeatedly tapped the same key did so at least 3
times.

Number Pad;
There is no discrepancy here. When asked to type in a number, all participants did so in the same way as they would have on a real phone (figures 8.11 and 8.12). This makes it clear that this part of a mobile handset is so set in one's mind because all phone manufacturers have followed the same system. The number that participants were asked to type in started with a zero, as do all UK numbers. With this as the bottom central key, a good reference point is made to determine where the other numbers lie in relation to it. More time was taken by participants to key in correctly on the Nokia scheme, no doubt because of the lack of Gestalt laws between the keys, whilst less time was spent on the Sony Ericsson scheme, agreeing with the hypothesis that a clear difference between the function of the top keys and the bottom keys is made apparent on this scheme.

Delete;
The results for delete share much similarity with the results for back, suggesting that the same key should be used for both; however, both have a wide spread of results. Both left- and right-hand sides have been used, and one participant used the bottom left key. There is a slight preference for the delete function to be placed on the left-hand side, which is the opposite of both Nokia (figure 8.13) and Sony Ericsson (figure 8.14). Only two participants thought that delete should be a soft key for the Sony Ericsson scheme, whilst half did for the Nokia.

Finding Contacts;
As with unlocking the keypad, finding contacts spawned many different results. Both Nokia and Sony Ericsson have multiple methods of finding contacts (figure 8.15, figure 8.16), and both share the same shortcut: pressing the 'down' key. Only one participant carried out the true manipulation for the Nokia scheme, using this shortcut, with another using the same method but using the top key as a joystick. Three selected the correct first key for the Nokia method (b), so with a screen they would have realised they just needed to select one other key to find their contacts successfully. Three used the correct scheme for the Sony Ericsson, but this would increase to four (50%) if the outer circle acted as a sliding scroll wheel, with only one using the shortcut. It is clear that the majority would like to find their contacts with the press of just one key, but it is believed that the multiple true methods available have caused some confusion. Of those that found their contacts via the menu, all indicated that contacts are not the first option in the menu, but can be found directly below it. During the text message part of this test, all indicated that messaging was the first option available in the main menu. This is the case for most phone manufacturers and is clearly embedded in the participants' expectations.

Scroll Down and Scroll Up;
The central column of keys was used by all participants in this exercise, and most would have achieved successful interaction with a real interface. Each participant also clearly used opposite motions for scroll up and scroll down. Three participants wanted to be able to use the outer circle in the Sony Ericsson scheme as a slide wheel, like that of the iPod, and all of these participants interacted with a clockwise motion to pass down the list and an anticlockwise motion to pass up. All other participants split the circle into separate keys. All participants scrolled down and up, as asked, not left and right.

Call and Hang Up;
As predicted, there is a strong tendency to use the left-hand side of the keypad for making a call and the right-hand side to hang up, as used by all phone manufacturers, although one participant used the top right in the Sony Ericsson scheme for both. Three participants used the correct Nokia interaction for both call and hang up, whilst five could have made a call on the Sony Ericsson scheme, but of these five, only three would have successfully hung up. There is a fair amount of use of the same key to call and hang up, typically a central key. Those who used two keys clearly used the corresponding key on the opposite side, placing call and hang up on the same level.

8.2.2 IMAGE SCHEMA TEST

On analysis of this part of the experiment, it was discovered that a few sources of error run through all the results and must be accounted for.

The biggest factor is that the thumb has a small range of motion in the y axis. To overcome this, the rest of the hand normally moves to increase the relative movement. As this relative movement was reduced as much as possible in the experiment by the forming of the thumbs-up gesture, any measured movements along this axis are much smaller than those in the x and z axes. This could not be rectified without causing a large-scale change to the operation of the experiment, but can simply be factored in when analysing the resultant vector graphs. This is most apparent in the typing of a number test (figure B.3.21).

Where participants have been ‘pressing a button’ a number of times, it was rare
for each pass to follow the same path, producing a number of lines with
approximately the same motion, but shifted along one of the axes. This can be
seen in figure B.3.3. The fact that the participants were asked to use their left
hand, which for 70% of the participants was their weaker hand, may have
resulted in their movements being less defined and controlled.

Some participants used the top of their first finger as a surface for interaction, whilst others did not. This can be seen where there is a clear, abrupt stop in the motion. However, as this surface is not flat, when it is touched in different areas the z position may differ, as in figure 7.24.

When participants were holding their thumb still, without touching a surface, the CyberGlove would often continue reading small movements, caused by twitching of the participant's thumb or relative movement of the glove on the hand. This resulted in a 'cloud' of small vector plots (figure B.3.6), preventing a vertical marker plot from being added. This could be rectified by setting a lower threshold vector value, for example 0.1 mm in all directions, below which readings are replaced with zeros. However, if participants had been asked specifically to pause to indicate they were pressing a button, then more participants would have used buttons to interact rather than their desired method.
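
A minimal sketch of the thresholding correction described above is given below; the 0.1 mm figure comes from the text, whilst the variable names are illustrative rather than taken from the Appendix G code.

% Illustrative sketch of the proposed fix: zero any frame-to-frame
% displacement vector whose components are all below the 0.1 mm threshold,
% so that sensor twitch is not plotted as intentional movement.
% Here 'vec' is assumed to be the matrix of displacement vectors,
% one row per frame (as in the earlier planar sketch).
threshold = 0.1;                          % mm, the value suggested in the text
small = all(abs(vec) < threshold, 2);     % rows where every component is tiny
vec(small, :) = 0;                        % treat these frames as no movement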

This part of the experiment was always carried out directly after the visual test using the mobile phone fascias, resulting in the participant imagining a mobile phone as their device. For this section to work as planned, ideally the participant would have imagined something else, as a comparison to the mobile phone. These effects could have been reduced by carrying out the two sections in alternating order for each participant.

Passing through a list and changing a variable

Results for passing through a list and changing a variable share striking similarities with each other, showing that very similar image schemas have been used to carry out both of these tasks.

Three results for each task show very little movement in the x or y axes but a large, straight motion in the z axis (figures B.3.4, B.3.6, B.3.7, B.3.11, B.3.12 and B.3.15). This shows that these participants paired a vertical image schema with the list schema to interact with their device. All of these participants preferred to make the downward motion followed by the upward motion for the list, and used upwards to increase and downwards to decrease a value.

The participant in figure 7.23 also makes use of some movement in the y axis, combining the vertical schema with the 'forward' or 'inwards' schema. The combination makes a movement similar to a typical volume symbol, shown in figure 8.21. This means that it could be images like this, the symbols used to indicate function, that shape the thought processes and hence the image schema used when carrying out a task.

It is believed that the participants represented in figures B.3.2, B.3.3, B.3.5 and B.3.8 used a forward and backward directional image schema for the list task, but these could also be interpreted as an in and out depth image schema. The 'inward/outward' or 'forward/backward' schema was used to increase and decrease a variable by the three participants in figures B.3.10, B.3.13 and B.3.14. Each of these participants also moved right as they went forward, creating a similarity to figure 8.14.

Figure B.3.8 shows a stark difference from all the other results in this test, being the only one to apply a scroll or circular motion. The arrows pass both ways around the movement, so reversed motion has been used to pass either way through the list. The shape is oval because of the limited motion the thumb has in the y axis, as discussed earlier. This result holds particular interest, as the devices closest to using this scheme have a roller wheel, like that of a computer mouse, but there are relatively few devices that work in this manner.

Typing a telephone number

All participants in this section chose to type a number in button fashion, as on an existing mobile phone keypad.

It can also be seen how the 0 key acts as the reference point to determine the positioning of all the other numbers. Figure B.3.16 and figure B.3.21 show a clear gap between where the number 0 was selected and the others, which are grouped far closer together and are almost indistinguishable.

Where participants paused for a short while between numbers, determining the position of the next, there is constant motion of the thumb at a similar height (figures B.3.17, B.3.18 and B.3.20). If this had occurred in tests A or B, it might have been analysed as an interaction. Not all movements made are intentional interactions; some are instead a passive extension of the mind whilst thinking. It is this level that needs to be tapped into to further this study: removing the idea of a device altogether and recording just the movements that the body passively performs whilst thinking about an operation, which can then be compared to the movements required to operate a device.

9. CONCLUSIONS

9.1 EXPERIMENTAL CONCLUSIONS

It can be concluded that the mobile phone, when used as a voice and text message communicator, is generally a very successful product in terms of being easy to use. The majority of phones are based on one of a few classic styles, and there has been much pressure from the consumer for these devices to be intuitive to use. However, phone manufacturers have a delicate time ahead of them, as the era of the mobile phone in its pure sense is coming to a close. New technologies are changing the function of the mobile phone, and currently these new functions appear to be confusing to use on such a small device. This justifies research into intuitive design in the future, as many different devices, along with their respective styles of interaction, are fused into what is currently the mobile phone.

On reviewing the literature and carrying out this study, the author has decided that the following is a fair statement of the definition of intuitive design for the interface of a device:

Intuitive design is the seamless alignment of cognitive expectation with interface actuality.

The participants in this study have shown that Nokia have set a very high standard of 'intuitive design' for the mobile phone. Even with an unmarked arrangement of keys offering limited Gestalt grouping, 50% of all the interactions asked to be carried out on the schematic would have been successful. Sony Ericsson, with a starkly different scheme in which Gestalt laws are far more prominent, had the higher score, with a 61% successful interaction rate. However, Sony Ericsson scored only 3.4/5 for ease of learning, whilst Nokia have clearly set the standard with a score of 4.7/5. This early appreciation of ease of use must have been a key factor in Nokia's early domination of the market, but now other manufacturers have learnt this fact and are far closer competitors to Nokia than they were just a few years ago.

It has been shown that the nearer the top of the keypad a key is positioned, the more important, or more regularly used, it should be. These upper keys are currently used to interact with the software of the phone, whilst the number pad should be used exclusively for number and text entry and positioned on 12 identically sized and shaped keys in a 3x4 arrangement. Future mobile phone design should keep this system to appear intuitive to use. Unlocking and locking the keypad should involve two keys, at opposite ends of the keypad, adapted from the Nokia system, and menu should be located top left or top centre. This key should also double up as the select key. There is no preferred side on which to locate a back button, but it should not be placed in a central position. The function of a delete key should be clearly marked, as there is no preferred choice of position for this function. Scrolling should be given a top central position, with a vertical alignment. Call should always be on the left-hand side and hang up on the right.

The image schema investigation proved difficult to analyse, but it has shown that the most prevalent schemas paired with function schemas when interacting with a mobile phone are vertical (up and down) or depth (forward/in and backward/out). Scale becomes more apparent when slide actions, rather than buttons, are involved. It is felt, however, that this section of the test requires a far more thorough experiment than the one carried out here. This conclusion simply shows that existing interfaces have shaped and moulded our schemas into these angular forms through button affordances. In the future, if interaction techniques develop, this area should be consulted far more heavily by designers, engineers and psychologists to open a path for far more subtle interactions.

9.2 INTUITIVE DESIGN IN THE FUTURE

One possible future method for making an interface intuitive to every user is to make the interface completely customisable, so that the user can decide which features, buttons or styles they would like, and where.

A new slide phone on the market, the LG KF400, has split its main screen into two. The top screen is the display, whilst the lower screen is fully touch sensitive and changes its buttons according to the functions available on the display (figure 9.1). However, the buttons are not customisable.

This could be developed into an interesting experiment for further research. If a piece of software were designed to allow the creation of individual interfaces, and participants used it to create and fine-tune their own personal interface, each creation could be analysed to find similarities in style, positioning and size between the creations and existing interfaces. Beyond this, the search for an intuitive design may flip current conventions around: interfaces could learn and develop with the user to find their intuitive or preferred motions for carrying out tasks. This is especially viable given the increasing popularity and success of the touch screen.
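Purely as a hypothetical illustration of how such creations could be recorded for comparison (none of the field names or values below come from this study), each participant-built layout might be stored in Matlab as a simple structure array of keys, each with a label, a normalised position and a size, so that layouts can later be compared key by key:

layout(1) = struct('label', 'menu',   'pos', [0.50 0.95], 'size', [0.20 0.08]);
layout(2) = struct('label', 'select', 'pos', [0.50 0.95], 'size', [0.20 0.08]);   % doubled with menu
layout(3) = struct('label', 'call',   'pos', [0.15 0.85], 'size', [0.15 0.08]);
layout(4) = struct('label', 'hangup', 'pos', [0.85 0.85], 'size', [0.15 0.08]);
% Two participants' layouts could then be compared, for example, by the distance
% between the positions they each gave to the same label.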

It is believed by the author that the field of image schemas involved in interaction is one that needs further research. It is, however, a very vague topic to narrow down and requires far greater knowledge of psychology than that offered by the author. It is debatable whether image schemas have been considered in depth in interface design, and it needs to be determined whether image schemas for navigation have any similarities to those used with technology. In any such test, when discovering the image schemas, the participant’s mind should be clear of any ideas of existing interfaces or technology, to allow a fair comparison.

When Alexander Bell invented his ‘electrical speech machine’ in 1876 he created a device that has since been developed and innovated to such a level that it has become a focal point for absorbing all forms of technology. The device itself has shrunk in size, but its function of sharing information has spread across the globe, and it now stands as the natural host into which seemingly unrelated products are injected. Pioneering technologies from all areas of the technological world seem simply to act as platforms for inventing new functions for the mobile phone. In homage to its roots, the future personal devices that spawn from this creation may still be called mobile phones, but as this study has shown, the classic phone call is being matched in popularity by technologies that have existed for a mere decade. The freedom made available to the current generations that use these technologies is a far cry from what was thought possible at the advent of mobile communications. They have changed the way people live their lives and will continue to do so as technology develops with new generations.

The mobile phone may on occasion have been adapted to mimic the interactions previously expected from the products fused into it, but with the mobile phone becoming the ultimate jack of all trades, intelligent, sensitive design will be required to keep it a simple, easy-to-use device. This will become increasingly difficult as more features are added, and a new approach may well have to be taken in the near future. This next step will be a major factor in determining the way we, as people, view and interact with the world. The mobile phone is an increasingly powerful tool and may one day, through intuitive design, become a part of us, rather than just a part of our lives.

APPENDIX A

QUESTIONNAIRE

APPENDIX B

RESULTS

B.1 QUESTIONNAIRE

Figures B.1.1 to B.1.8 are the author’s own work, produced using Microsoft
Excel.

Figure B.1.1 – Age of participants who completed the questionnaire.

Figure B.1.2 – Do you use instruction manuals?

Figure B.1.3 – How long have you owned a mobile?

Figure B.1.4 – How long have you had your current phone?

Figure B.1.5 – What are the main uses for your phone?

Figure B.1.6 - What phones have you owned in the past?

On average, including the phone currently in use, each participant has owned 5.5 phones.

Figure B.1.7 - What phone do you currently use?

Figure B.1.8 – Participant scores for phone features.

B.2 VISUAL TEST

If the same key was used by more than one participant, it has been colour coded according to the colour scale given on the left-hand side of figure B.2.2. A dotted line indicates passing from one key to the next, whilst a solid line indicates a ‘slide’: a movement keeping contact with the schematic. Keys or motions used in these scenarios have not been colour coded by occurrence; instead, the colours of the lines correspond to different movements. This is indicated by a colour chart on the right-hand side of the figure where applicable, with the enclosed number denoting the number of repeated occurrences.

Figures B.2.1 to B.2.11 are the author’s own work, produced using Microsoft PowerPoint.

Figure B.2.1: unlock the keypad

Figure B.2.2: menu

Figure B.2.3: select

Figure B.2.4: back

Figure B.2.5: keypad

Figure B.2.6: delete

Figure B.2.7: find contacts

Figure B.2.8: scroll down

Figure B.2.9: scroll up

Figure B.2.10: call

Figure B.2.11: hang up

B.3 IMAGE SCHEMA TEST

The resultant vector graphs from the image schema tests are displayed in the following format: 3D vector graph; XY plot; XZ plot; YZ plot. It is important to note that the X axis corresponds to left and right, the Y axis to forward and backward, and the Z axis to up and down, as shown in figure B.3.1. The ‘corner plots’ have been circled in pale blue to distinguish them from the others and should be disregarded in the analysis. The MoCap6 programme took data at a rate of 30 frames per second, so each vector plot indicates the movement over 1/30th of a second (approximately 0.033 s).

Figures B.3.2 to B.3.21 are the author’s own work, produced using The MathWorks Matlab and adapted using Paint and Microsoft PowerPoint.

Test A – PASS THROUGH A LIST

Figure B.3.1
(Author’s own)

Figure B.3.2

Figure B.3.3

Figure B.3.4

Figure B.3.5

Figure B.3.6

Figure B.3.7

Figure B.3.8

Test B – INCREASE AND DECREASE A VALUE

Figure B.3.1
(Source: Author’s own)

Figure B.3.9

Figure B.3.10

Figure B.3.11

Figure B.3.12

Figure B.3.13

Figure B.3.14

Figure B.3.15

Test C – INSERT A TELEPHONE NUMBER

Figure B.3.1
(Source: Author’s own)

Figure B.3.16

Figure B.3.17

Figure B.3.18

Figure B.3.19

Figure B.3.20

Figure B.3.21

APPENDIX C

THE IMMERSION CYBERGLOVE®

The Immersion CyberGlove measures the joint angles of the hand with the
exception of wrist rotation. It does so with the use of resistive bend sensing
technology to transform physical motion into digital angles. If hand position and orientation in space are required, motion tracking sensors can be used with appropriate software to receive signals from the wristband.

The glove used was the 22-sensor CyberGlove I model with open fingertips to allow the user full sensitivity when carrying out the experiment. It is powered and connected to a computer using a 25-pin parallel port via a transformation box. It has a resolution of a single degree and a repeatability value of 3 degrees. The CyberGlove II, which is currently available on the market and shown in figure 6.5, has a wireless network link to a PC and is battery powered.

Figure 6.5 – The Immersion CyberGlove II


(www.immersion.com)

The glove is fabricated with an upper stretch fabric to allow free movement of the hand, with a palm mesh for ventilation. The thin, lightweight sensor strips are held in place in pockets of the stretch material and flex with the hand’s movements. Each finger has three flexion sensors; in addition, there are four abduction sensors, a palm arch sensor and two sensors that measure the flexion and abduction of the wrist.

Figure 6.6 – Diagram indicating medical terminology of movements


(Author’s own, adapted from www.sifter.org)

APPENDIX D

ETHICAL APPROVAL FORM

APPENDIX E

SETUP OF THE CYBERGLOVE INTO ALIAS MOCAP6

Figures D.1 to D.8 are the author’s own work and include screen shots from Alias MoCap6.

Figure D.1 – Opening screen of Alias MoCap6.

Figure D.2 – Drag and drop ‘CyberGlove’ into ‘Viewer’; create ‘Model binding’; set Port to ‘Comm 7’; set Speed to ’38 400’.

Figure D.3
Zoom, rotate and pan viewer using buttons at the top.
Hold click on the respective icon and move mouse to alter image.
Click ‘Online’ and wait for the dial to turn green.
Begin calibration.

Calibration
The buttons for calibration are found at the bottom central position. The hand postures are shown below in figures D.4 to D.8. The hand should be put into position before the respective calibration button is selected.

Figure D.4 – All open

Figure D.5 – Flat closed

Figure D.6 – Fingers closed

Figure D.7 – Thumb to pinkie

Figure D.8 – Thumb closed

To record the test, make sure the frame number is set to zero and select record; recording will begin when the play button is also selected. Select stop at the end of the test. Save and then export as a .amc file. This file type can be opened as a text document consisting of angles, which can then be imported into Excel.

APPENDIX F

IMPORTING DATA INTO MICROSOFT EXCEL

Figures E.1 to E.7 are the author’s own work and include screen shots from Microsoft Excel.

Figure E.1 – Import data into Excel

Figure E.2 – Find data file.amc

Figure E.3 – Delimited data type

Figure E.4 – Select tab and space

Figure E.5 - Finish

Figure E.6 – Click O.K.

Figure E.7 – Summary of the spreadsheet. Annotations on the figure: nb denotes the number of time frames; each block corresponds to one sample number/time frame; and the rows of each block form the thumb matrix, Thumb A (ΦXA, ΦYA, ΦZA), Thumb B (ΦXB, ΦYB, ΦZB) and Thumb C (ΦXC, ΦYC, ΦZC).

The participant’s thumb dimensions A, B and C were input by hand into cells I3, J3 and K3 respectively. The raw data from each test of each individual participant was allocated a separate worksheet. The raw data selected for analysis must be copied and pasted into worksheet 01 and the file saved.

APPENDIX G

THE ANALYSIS CODE, MATLAB

Before the code can be run, each set of instructions must run up to, but not beyond, the number of samples recorded as raw data (in this example, 10). If this is not the case, the following error message will appear when the programme is run:
??? Error using ==> mtimes; Inner matrix dimensions must agree
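
As a hedged illustration (not part of the original procedure), one way to catch this condition before the mtimes error is raised is to check that each imported block really is a full 3x3 matrix, since xlsread may return a smaller or empty matrix when the requested range lies beyond the recorded data:

DataT001 = xlsread('Test Data.xls', 01, 'B7:D9');
if ~isequal(size(DataT001), [3 3])               % block not fully populated
    error('DataT001 is %dx%d; expected 3x3.', size(DataT001,1), size(DataT001,2));
end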

A summary of the operations carried out by the code is given below, for ten time
frames.

Define thumb dimensions of participant;


Dimensions01= xlsread('File Name.xls', Spreadsheet #,
‘Far left coordinate:Far right coordinate')

Dimensions01= xlsread('Test Data.xls', 01, 'I3:K3')


Dimensions = \begin{bmatrix} A \\ B \\ C \end{bmatrix}

Importing data into Matlab;


DataT### = xlsread('File Name.xls', Spreadsheet #, 'Top left
coordinate:Bottom right coordinate')

DataT001 = xlsread('Test Data.xls', 01, 'B7:D9')


DataT002 = xlsread('Test Data.xls', 01, 'B25:D27')
DataT003 = xlsread('Test Data.xls', 01, 'B43:D45')
DataT004 = xlsread('Test Data.xls', 01, 'B61:D63')
DataT005 = xlsread('Test Data.xls', 01, 'B79:D81')
DataT006 = xlsread('Test Data.xls', 01, 'B97:D99')
DataT007 = xlsread('Test Data.xls', 01, 'B115:D117')
DataT008 = xlsread('Test Data.xls', 01, 'B133:D135')
DataT009 = xlsread('Test Data.xls', 01, 'B151:D153')
DataT010 = xlsread('Test Data.xls', 01, 'B169:D171')

DataT001 = \begin{bmatrix} \varphi_{XA} & \varphi_{YA} & \varphi_{ZA} \\ \varphi_{XB} & \varphi_{YB} & \varphi_{ZB} \\ \varphi_{XC} & \varphi_{YC} & \varphi_{ZC} \end{bmatrix}

Converting degree data into radians for Matlab;
RadDataT001=(1/360)*2*pi*(DataT001)'
RadDataT002=(1/360)*2*pi*(DataT002)'
RadDataT003=(1/360)*2*pi*(DataT003)'
RadDataT004=(1/360)*2*pi*(DataT004)'
RadDataT005=(1/360)*2*pi*(DataT005)'
RadDataT006=(1/360)*2*pi*(DataT006)'
RadDataT007=(1/360)*2*pi*(DataT007)'
RadDataT008=(1/360)*2*pi*(DataT008)'
RadDataT009=(1/360)*2*pi*(DataT009)'
RadDataT010=(1/360)*2*pi*(DataT010)'
Note that the data values have been transposed in this part of the code, denoted by the apostrophe (the Matlab transpose operator) at the end of each line.

RadDataT001 = \begin{bmatrix} \phi_{XA} & \phi_{XB} & \phi_{XC} \\ \phi_{YA} & \phi_{YB} & \phi_{YC} \\ \phi_{ZA} & \phi_{ZB} & \phi_{ZC} \end{bmatrix}

Resolving angles;
ResT001=cos(RadDataT001)
ResT002=cos(RadDataT002)
ResT003=cos(RadDataT003)
ResT004=cos(RadDataT004)
ResT005=cos(RadDataT005)
ResT006=cos(RadDataT006)
ResT007=cos(RadDataT007)
ResT008=cos(RadDataT008)
ResT009=cos(RadDataT009)
ResT010=cos(RadDataT010)

ResT001 = \begin{bmatrix} \cos\phi_{XA} & \cos\phi_{XB} & \cos\phi_{XC} \\ \cos\phi_{YA} & \cos\phi_{YB} & \cos\phi_{YC} \\ \cos\phi_{ZA} & \cos\phi_{ZB} & \cos\phi_{ZC} \end{bmatrix}

Determining 3x1 xyz coordinate matrices;


PositionT001=ResT001*Dimensions01;
PositionT002=ResT002*Dimensions01;
PositionT003=ResT003*Dimensions01;
PositionT004=ResT004*Dimensions01;
PositionT005=ResT005*Dimensions01;
PositionT006=ResT006*Dimensions01;
PositionT007=ResT007*Dimensions01;
PositionT008=ResT008*Dimensions01;
PositionT009=ResT009*Dimensions01;
PositionT010=ResT010*Dimensions01;

PositionT001 = \begin{bmatrix} \cos\phi_{XA} & \cos\phi_{XB} & \cos\phi_{XC} \\ \cos\phi_{YA} & \cos\phi_{YB} & \cos\phi_{YC} \\ \cos\phi_{ZA} & \cos\phi_{ZB} & \cos\phi_{ZC} \end{bmatrix} \begin{bmatrix} A \\ B \\ C \end{bmatrix} = \begin{bmatrix} A\cos\phi_{XA} + B\cos\phi_{XB} + C\cos\phi_{XC} \\ A\cos\phi_{YA} + B\cos\phi_{YB} + C\cos\phi_{YC} \\ A\cos\phi_{ZA} + B\cos\phi_{ZB} + C\cos\phi_{ZC} \end{bmatrix} = \begin{bmatrix} x \\ y \\ z \end{bmatrix}

Determining 3x1 uvw vector matrices;
VectorT001=PositionT002-PositionT001
VectorT002=PositionT003-PositionT002
VectorT003=PositionT004-PositionT003
VectorT004=PositionT005-PositionT004
VectorT005=PositionT006-PositionT005
VectorT006=PositionT007-PositionT006
VectorT007=PositionT008-PositionT007
VectorT008=PositionT009-PositionT008
VectorT009=PositionT010-PositionT009
VectorT010=PositionT010-PositionT010

VectorT001 = PositionT002 - PositionT001 = \begin{bmatrix} x_{T002} \\ y_{T002} \\ z_{T002} \end{bmatrix} - \begin{bmatrix} x_{T001} \\ y_{T001} \\ z_{T001} \end{bmatrix} = \begin{bmatrix} u_{T001} \\ v_{T001} \\ w_{T001} \end{bmatrix}

Creating 3xT## xyz position matrix;


xyz=[PositionT001 PositionT002 PositionT003 PositionT004 ...
     PositionT005 PositionT006 PositionT007 PositionT008 ...
     PositionT009 PositionT010]

xyz = \begin{bmatrix} x_{T001} & x_{T002} & \cdots \\ y_{T001} & y_{T002} & \cdots \\ z_{T001} & z_{T002} & \cdots \end{bmatrix}

Creating 3xT## uvw vector matrix;


uvw=[VectorT001 VectorT002 VectorT003 VectorT004 ...
     VectorT005 VectorT006 VectorT007 VectorT008 ...
     VectorT009 VectorT010]

uvw = \begin{bmatrix} u_{T001} & u_{T002} & \cdots \\ v_{T001} & v_{T002} & \cdots \\ w_{T001} & w_{T002} & \cdots \end{bmatrix}

The last value in the uvw matrix will be zero, but is needed to ensure the matrix
dimensions agree to enable Matlab to produce the vector graph.
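
For reference only, the ten per-frame steps above could equally be collapsed into a single loop. The following is a minimal sketch, not the code used in the study; it assumes the same 'Test Data.xls' layout used above (thumb dimensions in I3:K3 and each 3x3 angle block starting at row 7 with an 18-row stride):

nFrames = 10;                                     % number of recorded time frames
Dims    = xlsread('Test Data.xls', 1, 'I3:K3')';  % thumb dimensions as a column [A; B; C]
xyz     = zeros(3, nFrames);                      % thumb-tip positions, one column per frame
for k = 1:nFrames
    rowTop  = 7 + (k - 1)*18;                     % first row of the k-th 3x3 angle block
    range   = sprintf('B%d:D%d', rowTop, rowTop + 2);
    DataT   = xlsread('Test Data.xls', 1, range); % joint angles in degrees
    RadData = (pi/180)*DataT';                    % convert to radians and transpose
    xyz(:, k) = cos(RadData)*Dims;                % resolve onto the axes and scale by A, B and C
end
uvw = [diff(xyz, 1, 2), zeros(3, 1)];             % frame-to-frame vectors; last column zero, as above

The resulting xyz and uvw matrices could then be exported or plotted exactly as described below.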

Copy and paste matrices xyz and uvw into notepad, save, and then import as
before into Excel file ‘Results.xls’.
Rearrange in Excel to produce the equivalent of figure F.1;

Figure F.1

The result from each individual participant was allocated a separate worksheet.
The results selected to be plotted must be copied and pasted into worksheet 01
and the file saved. MoCap6 started the recording from the last recorded position,
resulting in large initial vectors to the current starting position. As this was a
recurrent error, the first reading for each test was deleted so as not to appear in later parts of the analysis.

Any pauses in motion, possibly representing the pressing of a button or the application of continued pressure, resulted in time frames with identical positions and so, naturally, vectors of (0,0,0). Any plot without a vector would not appear on a vector graph. To indicate where such an act had occurred, the first reading in a packet of more than four time frames with a (0,0,0) vector was given a vector of (0,0,-5), resulting in a vertical downward arrow on the vector graph.
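
This marking was applied by hand in the ‘Results’ spreadsheets. A rough Matlab sketch of the same rule is given below purely as an illustration; it assumes uvw is the 3xN vector matrix built earlier and marks the first frame of every run of more than four consecutive zero vectors:

isZero   = all(uvw == 0, 1);                 % true where a frame has a (0,0,0) vector
runStart = 0;                                % index where the current zero run began (0 = none)
for k = 1:numel(isZero)
    if isZero(k)
        if runStart == 0
            runStart = k;
        end
    else
        if runStart > 0 && (k - runStart) > 4
            uvw(:, runStart) = [0; 0; -5];   % mark the pause with a downward arrow
        end
        runStart = 0;
    end
end
if runStart > 0 && (numel(isZero) - runStart + 1) > 4
    uvw(:, runStart) = [0; 0; -5];           % a run that reaches the final frame
end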

Run ‘Matlab graph’

Creating individual 1xT## x,y,z,u,v,w matrices;


x = xlsread('File Name.xls', Spreadsheet #, 'Top left
coordinate:Bottom right coordinate');
x = xlsread('Results.xls', 01, 'B1:K1')
y = xlsread('Results.xls', 01, 'B2:K2')
z = xlsread('Results.xls', 01, 'B3:K3')
u = xlsread('Results.xls', 01, 'B5:K5')
v = xlsread('Results.xls', 01, 'B6:K6')
w = xlsread('Results.xls', 01, 'B7:K7')

Plotting 3D vector graph;


quiver3(x,y,z,u,v,w)

The results from figure F.1 produce figure F.2. The red axes have been added to the image by hand and the values are given in millimetres. The blue arrows indicate the thumb’s motion, starting from their respective xyz coordinates and pointing along their respective uvw vector components. It can be seen in this example that the thumb has moved in a semi-circular path involving all axes during this time interval.

Figure F.2

It was realised during analysis that the axes were not all automatically plotted to the same scale, resulting in excessively magnified movements in the plane of least movement. This made analysis difficult, so two ‘corner’ positions were added to each plot, with coordinates at the minimum and maximum values of x, y or z and each with a vector of 5. This made the plot volume a cuboid, so the results could be analysed evenly. These points were added in the Excel spreadsheet ‘Results’ in columns B and C using the Excel max and min functions.
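
As a hedged Matlab sketch of the same idea (one reading of the description above, in which the single overall minimum and maximum across all three coordinates are used, so that every axis spans the same range; the (5, 5, 5) vectors are an assumption based on the ‘vector of 5’ mentioned), the corner points could be appended before plotting:

lo   = min(xyz(:));                              % smallest coordinate over x, y and z
hi   = max(xyz(:));                              % largest coordinate over x, y and z
xyzC = [xyz, [lo; lo; lo], [hi; hi; hi]];        % append the two 'corner' positions
uvwC = [uvw, [5; 5; 5], [5; 5; 5]];              % give each corner a small vector so it is drawn
quiver3(xyzC(1,:), xyzC(2,:), xyzC(3,:), uvwC(1,:), uvwC(2,:), uvwC(3,:))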
