COGNITIVE MODELLING: LECTURE 1, 2002
- How do we use language, problem-solve and operate effectively in a
complex, dynamic world?
- How do we learn to carry out these functions?
COGNITIVE MODELLING
http://www.informatics.sussex.ac.uk/users/bend/cogmod/outline.html
- Course Outline
- Aims and Objectives
- Reading List -- Green et al. ``Cognitive Science''
- Computer Classes -- Week 2 onwards: working in pairs (if possible)
- Seminars -- Week 1 onwards: Sharples et al. (1989), Chapter
2 of Green et al.
- Assignment -- Log of computer-based experiments
- Assessment -- Essay plus Unseen
- Types of Models
TYPES OF MODELS
- Non-runnable: description
  - diagrammatic -- box/arrow, flowchart (see Green et al.)
  - descriptive -- ``working memory can hold 7 plus/minus 2 chunks''
  - mathematical/logical, e.g. an equation or formula
- Runnable: description plus behaviour
  - Mechanical/hydraulic -- ``automata'', ``inference engine''
  - Computer program
ARTIFICIAL INTELLIGENCE
AI has contributed both runnable and non-runnable models to Cognitive Science.
Contrast this with ``engineering''-type AI.
For runnable cognitive models:
- Parts can be changed, rules of interaction of parts can be changed:
behaviour of model can be compared with behaviour of what is being
modelled
- Granularity and level of model -- what counts as ``behaviour''?
- What aspects are not being modelled?
BUGGY EXAMPLE -- SURFACE BEHAVIOUR
Which child would you like to test: a, b, c, or d? a
Now set some subtraction sums for the child.
You will be asked to type the top row of the sum,
then the bottom row.
Press <RETURN> instead of typing a number
when you want to finish with this child.
Testing child a
Top number: ? 23
Bottom number: ? 4
2 3
  4
------
2 1
The program will now set five sums and you should answer
them as if you were child "a". If you want to quit
earlier then press <RETURN> when asked for your answer
Top number: 683
Bottom number: 76
What answer would child "a" give? 613
6 8 3
  7 6
---------
6 1 3
Yes - that is the child's answer
Top number: 929
Bottom number: 235
What answer would child "a" give? 714
9 2 9
2 3 5
---------
7 1 4
Yes - that is the child's answer
Congratulations. You appear to understand child "a"
This is my description of the bug:
Child a: the child always subtracts the smaller digit from
the larger in a column
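The description of the bug is precise enough to sketch directly as code. A minimal sketch in Python (hypothetical, not the actual Buggy program; it assumes the top number has at least as many digits as the bottom, as in the sums above):

```python
def smaller_from_larger(top, bottom):
    """Model of child "a": in each column, subtract the smaller
    digit from the larger, ignoring borrowing entirely."""
    t = str(top)
    b = str(bottom).rjust(len(t), "0")   # pad the bottom row with leading zeros
    return int("".join(str(abs(int(x) - int(y))) for x, y in zip(t, b)))

# Reproduces the transcript above:
print(smaller_from_larger(23, 4))     # 21
print(smaller_from_larger(683, 76))   # 613
print(smaller_from_larger(929, 235))  # 714
```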
BUGGY EXAMPLE -- INNARDS
- production rules, e.g.:
prule fd: [m ?m][s ?s] => assert([nextcolumn]);
assert([finddiff]); endprule;
prule b2a: [s greater m] => assert([borrow]); endprule;
prule bs1: [borrow] => addtentom(); endprule;
prule bs2: [borrow] => decrement(); endprule;
prule cm: [m ?m][s ?s] => compare(); endprule;
prule in: [processcolumn] => readmands(); endprule;
prule nxt: [nextcolumn] => assert([processcolumn]);
shiftleft(); endprule;
- production rule interpreter
- subsidiary parts of the program, e.g. printing procedures
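The rules above are written in a POP-11-style production-rule notation: each rule's left-hand side is matched against working memory and its right-hand side asserts new items. As a rough sketch of the match-fire-update cycle such an interpreter runs, here is a hypothetical miniature in Python (the rule set and memory items are illustrative only, loosely echoing the rule names above, not the actual Buggy rules):

```python
# Minimal forward-chaining production-system sketch (illustrative only).
# Working memory is a set of items; each rule pairs a condition on
# working memory with an action that updates it.

def run(rules, memory, limit=20):
    """Repeatedly fire the first rule whose condition matches,
    until no rule matches or the cycle limit is reached."""
    for _ in range(limit):
        for name, cond, action in rules:
            if cond(memory):
                action(memory)
                print("fired", name, "->", sorted(memory))
                break
        else:
            break            # no rule matched: halt
    return memory

# Illustrative rules, loosely echoing the rule names above:
rules = [
    ("in",  lambda m: "processcolumn" in m,
            lambda m: (m.discard("processcolumn"), m.add("mands"))),
    ("cm",  lambda m: "mands" in m,
            lambda m: (m.discard("mands"), m.add("finddiff"))),
    ("fd",  lambda m: "finddiff" in m,
            lambda m: (m.discard("finddiff"), m.add("nextcolumn"))),
    ("nxt", lambda m: "nextcolumn" in m,
            lambda m: (m.discard("nextcolumn"), m.add("done"))),
]

run(rules, {"processcolumn"})
```

The dynamic ``working memory'' here is just a set; conflict resolution is simply rule order, which is itself one of the things an experimenter can vary.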
POSSIBLE EXPERIMENTS
- Parts can be changed, rules of interaction of parts can be changed:
select the rule mix and order, change the rules themselves,
change the way the rules are interpreted.
The behaviour of the model can be compared with the behaviour of
what is being modelled: offer different sums.
- Granularity and level of model -- what counts as ``behaviour''?
Choice of digits in columns of the answer, some intermediate digits.
- What aspects are not being modelled?
Timing, understanding the meaning of subtraction, subtraction in
context, motivation ...
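The first kind of experiment can be sketched concretely: run a model of the child and correct subtraction side by side on different sums and note where they diverge. A hypothetical harness in Python, assuming the ``smaller from larger'' bug of child ``a'' (not part of the actual Buggy program):

```python
def smaller_from_larger(top, bottom):
    # Buggy model of child "a": per-column absolute difference, no borrowing.
    t = str(top)
    b = str(bottom).rjust(len(t), "0")
    return int("".join(str(abs(int(x) - int(y))) for x, y in zip(t, b)))

# Offer different sums: the bug only shows when a column needs a borrow.
for top, bottom in [(48, 13), (683, 76), (929, 235)]:
    model, correct = smaller_from_larger(top, bottom), top - bottom
    note = "matches" if model == correct else "diverges"
    print(f"{top} - {bottom}: model {model}, correct {correct} ({note})")
```

Sums with no borrows cannot discriminate the buggy rule from correct subtraction, so the choice of sums offered is itself part of the experimental design.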
DIFFERENT FLAVOURS OF ARTIFICIAL INTELLIGENCE
``Nouvelle AI'' & ``GOFAI -- Good Old-Fashioned AI'':
How far do the processes being modelled seem to depend on there
being some kind of internal symbolic representation of
the ``world''?
- Complete simple organism interacting with its environment, e.g. a
simulated snail on a simulated leaf
- A robot fish in a real pond
- A (presumed) single (human) mental process -- problem-solving,
language understanding, planning
- The overall cognitive architecture in which such processes operate and
are integrated into a coherent whole.
REPRESENTATION OF WORLD IN BUGGY
- For the model of child ``a'' -- a static set of production rules that
mimics his/her behaviour.
- A production rule interpreter that uses the rules in a consistent
manner.
- A dynamic ``working memory'' that changes according to the state of
the problem solving.
Models of Symbolic Computation
For seminar in week 1 read Sharples et al. Foreword and Chapter 3
For seminar in week 2 read Green et al. Chapter 2
- propositional representations
- production systems
- connectionist and parallel distributed processing
- hybrid models
- architectures
CONCLUSIONS
- The AI modelling approach to Cognitive Science
- The strengths and weaknesses of modelling and of particular models
- Kinds of Cognitive Models
- Weeks 2-5:
Runnable | Symbolic | Type                       | Domain
---------+----------+----------------------------+-----------
Yes      | Yes      | Production Systems         | Arithmetic
Yes      | Yes      | Propositional              | Language
Yes      | No       | Neural Networks            | Language
Yes      | No       | Behaviour-based Modelling  | Robotics
Benedict du Boulay, Cognitive Modelling web pages updated on Sunday 21 April 2002