Wednesday, February 28, 2007
XMarks technical team meeting 28th Feb 07
1. Arran to work on hand-drawn UML sequence diagrams for these use cases we discussed:
* tutor uses assessment tool to create an assessment that MATCHES an
existing record on the admin database
* tutor uses assessment tool to create an assessment that then CREATES a
new record on the admin database
Also to draft out a sequence diagram for sending marks, for discussion at the next meeting.
I've found this article on UML extremely helpful:
2. Arran and Carol to go through the DTD and create some example XML
documents for various stages of the UML sequence diagrams
3. John and Arran to look at getting XML output from Sussex Direct.
4. Paolo to identify the various Moodle activities that might end up
generating marks data and to think about the steps we need to model
5. Carol and Paolo to take forward engaging with the Moodle community [not
sure we talked about this, but we should do it anyway :-) ]
I've set the next meeting up for this Friday, 9th March, at 11 am - in a tea bar, if people fancy an XMarks cuppa.
Tuesday, February 27, 2007
Data formats/approaches to consider - OKI OSID
The Open Knowledge Initiative maintains a number of data models aimed at supporting a service-oriented approach - these are the OSID models (Open Service Interface Definitions).
I am posting while I have the links to hand and will return to discuss the OSID assessment and grading definitions when I've had a chance to read them.
Data formats/approaches to consider - the Blackboard API
Analysis of the Blackboard API may give us some ideas about sensible ways of managing transactions with assessments and marks.
This is a quote from the Javadocs of the API on the gradebook:

Provides the public implementation classes for Gradebook subsystem. The Gradebook is basically a 3-D container for "attempts". The "columns" are represented as Lineitem objects. The "rows" are Users in the course. Each cell displays a single value that may be calculated from one or more Scores. (The third dimension exists by allowing multiple attempts for a given line item.)
A list of basic actions that can be carried out with the gradebook:
* Create a Lineitem and set appropriate properties
* Persist Lineitem
* Store Lineitem Id reference for later use
* When student interacts with "gradeable" resource, create a Score object and set appropriate properties, including Lineitem Id stored earlier
The description and the list of actions are quoted directly from the Javadocs documentation, which is viewable here: http://www.utm.utoronto.ca/~ajwang/bb/bb/
So far, in our thinking about scope we have stated that multiple attempts will not
be tracked by our XMarks data format.
The "lineitem" is basically an assessment/assignment, and the operations on it are defined here: http://www.utm.utoronto.ca/~ajwang/bb/bb/blackboard/data/gradebook/Lineitem.html
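The gradebook model and the list of actions above can be sketched as a small Python model. The class and method names below just mirror the Javadocs description (Lineitem, Score, the 3-D container); none of this is the real Blackboard API, it is only an illustration of the shape of the data.

```python
from dataclasses import dataclass

# Illustrative sketch of the Blackboard gradebook model described above.
# Names are taken from the Javadocs description, NOT from the real API.

@dataclass
class Lineitem:
    """A 'column' in the gradebook: one assessment/assignment."""
    id: str
    name: str
    points_possible: float

@dataclass
class Score:
    """One attempt by one user against one line item (a cell value)."""
    lineitem_id: str
    user_id: str
    value: float

class Gradebook:
    """A 3-D container: columns = line items, rows = users,
    third dimension = multiple attempts (Scores) per cell."""
    def __init__(self):
        self._lineitems = {}
        self._scores = []  # all attempts, across all cells

    def persist_lineitem(self, item: Lineitem) -> str:
        self._lineitems[item.id] = item
        return item.id  # id reference stored for later use

    def record_score(self, score: Score) -> None:
        # Called when a student interacts with a "gradeable" resource
        self._scores.append(score)

    def cell(self, lineitem_id: str, user_id: str):
        """Displayed value for one cell - here, the best of all attempts."""
        attempts = [s.value for s in self._scores
                    if s.lineitem_id == lineitem_id and s.user_id == user_id]
        return max(attempts) if attempts else None

# Walking through the four basic actions listed above:
gb = Gradebook()
item_id = gb.persist_lineitem(Lineitem(id="li-1", name="Essay 1", points_possible=100))
gb.record_score(Score(lineitem_id=item_id, user_id="student-42", value=55.0))
gb.record_score(Score(lineitem_id=item_id, user_id="student-42", value=62.0))  # a second attempt
print(gb.cell(item_id, "student-42"))  # 62.0
```

Note that the multiple-attempts dimension is exactly the part we have declared out of scope for XMarks, so our own model would collapse each cell to a single mark.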
Thursday, February 22, 2007
An information model for XMarks
XMarks needs an information model for data transfer, and we are still evaluating possible existing formats and getting advice and support from the community (see some recent postings on this blog).
But we are also looking at what we think the information model needs to contain.
At the Assessment SIG today, I presented the background to the XMarks project, and then circulated a very early draft information model for comment. There was a good response from the group, who made a number of useful suggestions. It sparked off quite a lot of discussion about exactly what the scope of the information model was.
For example, some of the issues raised and the discussion on scope were:
- managing penalty points for late submission - my feeling is that the management of penalty points, mitigating evidence etc. should all be kept out of scope for this model, and that all the model needs to do is collect the raw data from which deductions can be made if necessary. Not everyone shared this view; optional attributes could be provided to support the deduction of penalties.
- tracking which items of data might be displayed to potential employers etc - an interesting issue, and one we need to keep aware of but which probably falls outside the scope of XMarks
- supporting multiple marks for a given assessment (for example, if a peer review exercise results in several marks) - our feeling is that the aggregation of marks is Someone Else's Problem, and that XMarks should just model a single mark for a given assessment. But I know Sam had some thoughts about that, and I wonder what the Peer Pigeon team think?
As well as this useful discussion on scope, people suggested a number of useful additional data items that we would probably want to track.

Assessments
- assessment authored by (a person id)
- assessment authored date
- assessment moderated by (a person id)
- assessment moderated date
- date after which no submission is possible
- sourcedid of the assessment in the system where the assessment took place

Marks - at overall mark-list object level
- invigilated by (a person id)
- moderated by (a person id)
Marks - at a given student's mark object level
- marked by - this could be a machine process, in which case the system should indicate that, but also track who wrote the marking schema
- duration of actual test time
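As a sketch of how these suggested data items might hang together, here is one possible XML shape built with Python's ElementTree. Every element and attribute name below is invented for illustration; the real names will come from the XMarks DTD as it develops.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML shape for the data items suggested at the SIG.
# All element/attribute names are guesses, not the XMarks DTD.

# Assessment-level items
assessment = ET.Element("assessment", sourcedid="SussexDirect:ASMT-1234")
ET.SubElement(assessment, "authoredBy", personid="p-001")
ET.SubElement(assessment, "authoredDate").text = "2007-02-01"
ET.SubElement(assessment, "moderatedBy", personid="p-002")
ET.SubElement(assessment, "moderatedDate").text = "2007-02-10"
ET.SubElement(assessment, "finalSubmissionDate").text = "2007-03-30"

# Mark-list-level items, with one student's mark object inside
marklist = ET.Element("marklist", invigilatedBy="p-003", moderatedBy="p-002")
mark = ET.SubElement(marklist, "mark", studentid="s-42")
# markedBy could be a machine process; if so, also record who wrote the marking schema
ET.SubElement(mark, "markedBy", type="machine", schemaAuthor="p-001")
ET.SubElement(mark, "testDuration").text = "PT45M"  # duration of actual test time, ISO 8601
ET.SubElement(mark, "value").text = "67"

print(ET.tostring(marklist, encoding="unicode"))
```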
FREMA and XMarks
At today's Assessment SIG (special interest group) meeting in Southampton, it was good to hear Yvonne Howard talking about the semantic wiki that the FREMA project has developed. The FREMA project was funded by JISC to develop a reference model for e-assessment. The semantic wiki is a record of the careful modelling work that they did, but is also designed to grow and change along with the community.
I first came across the FREMA wiki a couple of months ago and have been finding it a very stimulating place to visit and browse. I've spent some time looking at the Summative Assessment Service Usage Model and trying to see where our XMarks endeavour fits in. There's a use case here for Tracking, which defines it as:

Players: SIS
- SIS logs that assessment has been assigned to candidate
- SIS logs marks achieved by candidate
- SIS logs assessment has been moderated
- SIS logs assessment grades have been awarded
(quoted from http://frema.ecs.soton.ac.uk/wiki/index.php?title=Tracking)
This is pretty much XMarks territory, although in XMarks we're aware that we need to do quite a lot of matching between assessment records on different systems, which isn't covered in this use case.
So is XMarks basically Track Service v1.0? I asked this as a question in the FREMA wiki: http://frema.ecs.soton.ac.uk/wiki/index.php?title=Talk:Summative_Assessment_%28
Wednesday, February 21, 2007
RPC style and document style web services
At our XMarks technical team meeting yesterday we discussed the issue of RPC vs document style web services. I think as a team we are more used to RPC-style services. So I emailed Warwick, as I thought I could remember him mentioning the issue at the JISC Toolkit and Demonstrator projects start-up day in January, to ask him if he had any good references on the subject.
He has set up a thread on the Icodeon site to discuss this more: http://icodeon.wiki-neon.adaptavist.com/display/soa/2007/02/21/
This has got lots of interesting stuff and some useful references, and can give us a place to discuss our questions about document style web services too.
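To make the RPC vs document distinction concrete, here is a small sketch of the two message body shapes. The element names are made up for the example, not taken from any real service: an RPC-style body encodes an operation call with parameters, while a document-style body is a self-describing business document that would be validated against an agreed schema (for us, the XMarks DTD).

```python
import xml.etree.ElementTree as ET

# RPC style: the body names an operation and wraps its parameters.
# (All element names here are invented for illustration.)
rpc_body = """
<submitMark>
  <studentId>s-42</studentId>
  <assessmentId>ASMT-1234</assessmentId>
  <mark>67</mark>
</submitMark>
"""

# Document style: the body is a standalone business document; the service
# contract is the document schema rather than an operation signature.
doc_body = """
<marklist assessment="ASMT-1234">
  <mark studentid="s-42">67</mark>
</marklist>
"""

rpc_root = ET.fromstring(rpc_body)
doc_root = ET.fromstring(doc_body)
print(rpc_root.tag, doc_root.tag)  # submitMark marklist
```

One practical consequence: with the document style, the same marklist document could be validated, stored and forwarded unchanged between systems, which fits the data-exchange emphasis of XMarks.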
Sunday, February 18, 2007
Potential data format: IMS-Enterprise
Could the "membership" data object of the IMS-Enterprise (IMS-E) data format be appropriate for the XMarks data exchange format? It contains attributes for interimresult and finalresult.
In a typical use of the IMS-E format, the "group" data object will be mapped to programme, course/module or teaching group within a given degree programme.
The "membership" object will allow the association of a given person with such a group.
The interimresult structure can be used to hold multiple interim results, for example 'Mid-Term', 'Part 1' etc.:
- resulttype (the type of interim result, e.g. 'Mid-term')
- mode (a descriptive name for the grading mode, e.g. 'Pass/Fail', 'Percentage' etc)
- values (a structure that contains one or more valid values for the result)
- result (the actual result that the member was assigned, ideally a member of the values list)
- comments (comments about the interim result)
The finalresult structure is similar, except that it does not contain the resulttype attribute. This isn't needed, as the result type is already known - it is defined as being final.
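The structures described above could be sketched as XML along these lines. This is an illustrative shape based on the attribute list, not the exact element names of the IMS Enterprise XML binding:

```python
import xml.etree.ElementTree as ET

# Illustrative sketch of an IMS-E-style membership carrying interim and
# final results. Element names approximate the attribute list above and
# are NOT the authoritative IMS Enterprise binding.

membership = ET.Element("membership")
ET.SubElement(membership, "sourcedid").text = "course-ABC123"  # the group (course/module)
member = ET.SubElement(membership, "member")
ET.SubElement(member, "sourcedid").text = "student-42"

# An interim result: resulttype identifies which interim result this is
interim = ET.SubElement(member, "interimresult", resulttype="Mid-term")
ET.SubElement(interim, "mode").text = "Percentage"
values = ET.SubElement(interim, "values")
ET.SubElement(values, "valuerange").text = "0-100"  # valid values for the result
ET.SubElement(interim, "result").text = "58"
ET.SubElement(interim, "comments").text = "Marked by tutor p-001"

# The final result: no resulttype attribute - it is final by definition
final = ET.SubElement(member, "finalresult")
ET.SubElement(final, "mode").text = "Pass/Fail"
ET.SubElement(final, "result").text = "Pass"

print(ET.tostring(membership, encoding="unicode"))
```

Note how the per-member mode attribute is what allows different students on the same course to be graded in different modes, as discussed below.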
So the membership data object can allow the transfer of interim and final marks data.

Why this approach is not sufficient for XMarks
XMarks is about the exchange of both marks and assessment information. A typical use case for exchanging assessment information would be that:
- The course definition contains a set of pre-defined assessments that will be used to determine the final result of a given course. This information needs to be passed from the course definition database into the learning management system, particularly where the learning management system is going to deliver the method of assessment.
But the IMS-E interimresult and finalresult data structures don't enable the assessment structure of the course to be passed separately from the results data.
Additionally, the IMS-E specification version 1.1 states that both the interimresult and finalresult data objects are deprecated and are going to be removed from IMS-E version 2.0. So it might not make sense to develop a service that relies on them.

What we can learn from IMS-E
If we view the summative assessment methods as a core part of a course definition, then it might make sense to pass them as attributes of the "course" data object.
Perhaps we could take the same view of formative assignments.
However, it also makes sense to view the actual modes of assessment and results as being an attribute of the student's membership of a course, as the IMS-E specification models them.
This is powerful because it allows different students to be assessed with different modes - which happens for example when a student has disabilities which mean that the standard mode is inappropriate for them.
The IMS-E structure also allows for the possibility that different students taking the same assessment mode will have their grades represented differently. This can occur when MA and MSc students are taking the same course, and the two degree programmes are using different notations for marking.
Thursday, February 15, 2007
Potential data format: ANSI TS 130
ANSI TS 130 describes a student transcript for use in Electronic Data Interchange, typically between schools, other educational institutions or school districts, to transmit current and historical data.
It contains structures within it that allow an exchange of courses taken and grades attained. The TST (test) structure includes:
- test name
- date test administered
- version of test
- level of test
- level of test/individual/course (e.g. 1st grade, 2nd grade, Post-secondary 1st year etc.)
- how test norming was carried out
- other details, e.g. language of test
For each TST structure, there must be at least one SBT (sub-test) structure containing information about the components of the test. For each SBT record, there must be at least one corresponding SRE (test results) score.

Could this be the data exchange format used by XMarks?
A key part of the XMarks project is that assessment data needs to be exchanged prior
to marks data being available. The TS 130 structure is much more geared to exchanging data on results. It doesn't contain all the information that would be needed for the exchange of assessment data, such as the conflation rules and weighting, location and mode of submission etc.
The emphasis for marks data is also not quite right for the XMarks project. In order for a learning management system to pass raw marks data back to a student record system, it must be possible to encode information relating to non-submission, late submission etc.
The emphasis in TS 130 is on exchanging final results where issues of conflation, lateness, penalties etc have all been already taken into account.
Thus our view is that this format wouldn't be appropriate for XMarks.
Saturday, February 10, 2007
Notes from the Enterprise SIG 6th Feb 2007
Paolo and I attended the Enterprise SIG on 6th Feb 2007, where we fed back on work we'd done in the MINTED project and talked about plans for XMarks.
There was then a discussion on some of the issues that the SIG members thought would be relevant for XMarks to consider.

Permissions and security
Discussion of XACML (eXtensible Access Control Markup Language), an access control policy language ratified by the OASIS consortium (Organization for the Advancement of Structured Information Standards).

Traceability
Where did this message come from?
Ensuring it is possible to determine how a mark gets modified
At least one HE institution has failed to get its final degree classifications ratified due to poor paper logging.
Is there a policy around the logging of changes?

Reliability of messaging
Discussion of WS-ReliableMessaging (Web Services Reliable Messaging).

Interoperability - learning from other projects
The Shell Project - information passed between institutions using the IMS LIP format. The Shell Project worked in a similar space to the XMarks project, in that it was looking at data exchange and at the way that data exchange would impact on business processes.