Ten Most Wanted: Hunting down missing information about cultural artefacts Marcus Winter, University of Brighton; Susan Lambert, Arts University Bournemouth; Phil Blume, Adaptive Technologies
Ten Most Wanted develops a game-based approach to crowdsourcing certain aspects of curatorial research, including the discovery and verification of previously undocumented facts about collection items. Rather than presenting artefacts as removed from people’s lives and explained only by experts, it encourages players of the game to find out and tell experts what they know about the artefacts and why this knowledge is important.
The Museums Computer Group 'Museums on the Web' conference 2013 (UKMW13)
https://twitter.com/search?q=%23ukmw13
Tate Modern, 15 November 2013.
The theme for UKMW13 was ‘Power to the people’.
The Museums Computer Group: connecting, supporting, inspiring museum technology professionals.
1. Ten Most Wanted:
Hunting down missing information
about cultural artefacts
Presented by
Susan Lambert ∙ Marcus Winter ∙ Phil Blume
2. Ten Most Wanted
A research project with three partners:
The Museum of Design in Plastics
Arts University Bournemouth
Susan Lambert, Museum Head
Interactive Technologies Research Group
University of Brighton
Dr Lyn Pemberton, Reader
Marcus Winter, Research Fellow
Adaptive Technologies Limited
Phil Blume, Project Manager
3. Ten Most Wanted
1. The problem we address
2. Research aspects
3. Design decisions
4. How it works
5. Wrap up
4. Ten Most Wanted
In the absence of maker’s marks,
packaging or additional information, we
are left only with the intrinsic properties
of size, colour, material and process.
8. Ten Most Wanted
Where do we fit in?
Classification
Correction & Transcription
Co-curation
Complementing Collection
Contextualisation
Oomen & Aroyo (2011)
9. Ten Most Wanted
What’s new?
Game-based crowd-sourcing: types of participation
Casual games vs exploratory games - Ridge (2011)
Single-user vs multi-user - Ridge (2011)
Crowd vs community - Haythornthwaite (2009)
10. Ten Most Wanted
Research questions
R1 - How to design, promote and facilitate
complex crowdsourcing games for collections?
R2 - How to integrate user-generated
content with curated collection data?
R3 - How to deal with IPR issues in a transparent,
non-limiting, user-friendly way?
11. Ten Most Wanted
The FBI focus the search and ask the public.
www.fbi.gov/wanted/topten
13. Ten Most Wanted
Social media channels were established
to promote the discussions.
Facebook group
10mostinvolved
Google+
Ten Most Wanted
Twitter
@TenMost
14. Ten Most Wanted
Social media APIs are used to pull the
posts into the website home page.
15. Ten Most Wanted
Posts are then filtered into the
individual object pages.
Key discoveries in the investigation
are recorded in the case notes.
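The routing described on these slides - posts pulled from the social channels via their APIs and then filtered onto the relevant object pages - can be sketched in a few lines. This is an illustrative assumption, not the project's actual code: it supposes posts arrive as simple records and that players tag contributions with an object's case number (e.g. AIBDC : 005528).

```python
# Sketch of routing aggregated social-media posts to individual object pages.
# The filter_posts_for_object helper and the post record shape are
# hypothetical; the real site's implementation is not documented here.

def filter_posts_for_object(posts, accession_number):
    """Return posts whose text mentions an object's case-number hashtag."""
    # "AIBDC : 005528" -> "#AIBDC005528" (strip spaces and colon)
    tag = "#" + accession_number.replace(" ", "").replace(":", "")
    return [p for p in posts if tag.lower() in p["text"].lower()]

# Example feed, as if pulled from the Twitter/Facebook/Google+ APIs
feed = [
    {"channel": "twitter", "text": "Found the patent! #AIBDC005528"},
    {"channel": "facebook", "text": "Welcome new players to 10mostinvolved"},
]

matches = filter_posts_for_object(feed, "AIBDC : 005528")
```

A curator would then review matched posts and promote key discoveries into the case notes by hand, since automated matching alone cannot judge which posts advance an investigation.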
16. Ten Most Wanted
Case notes are an important part of
the process.
1. Visitors can see progress without
reading the entire narrative.
2. The key discoveries are recorded in
our own domain.
3. This evidence is summarised in a
museum context.
4. Contributors can be credited by
name.
17. Ten Most Wanted
Credits earn points and points
earn prizes.
Hall of Fame
Promotion
Certificate
Opportunity to manage cases
Opportunity to blog
Invitation to our Summer Party
18. Ten Most Wanted
Players are asked to agree to our
Terms & Conditions on sign-up.
These are summarised in three
bullet points linked to a long form
document.
Put simply, a contributor owns
their own stuff and we have IPR
clearance to use it.
19. Ten Most Wanted
Case number - AIBDC : 005528
Designer: Unknown - Wanted
Manufacturer: Byson
Country: UK
Date: 1920 - 1959 (circa) - Wanted
Dimensions: height 38 mm, width 135 mm, depth 20 mm
Materials: plastic, phenol formaldehyde, bakelite - generic term
Method: compression moulded
Inscription: ""Byson" Design and method of manufacture Pat. in England & Abroad No. 392800. Tested and approved. Serial 924. Good Housekeeping Institute. Conducted by Good Housekeeping Magazine." (Under clip)
23. Ten Most Wanted
What we have learnt so far:
1. Case notes seem a good way to
> summarise findings
> present evidence
> credit players
2. Game mechanics and reward structures
are a difficult topic due to participants’
different backgrounds and motivations
3. The “crowd” has amazing expertise
Talk about the Ten Most Wanted project,
which explores game-based crowdsourcing to enhance
collection data at the Museum of Design in Plastics
The project runs from May 2013 to April 2014 and is supported by the Digital R&D Fund for the Arts.
Like all projects in this programme…
… it has 3 partners:
MoDiP - Arts partner
UoB - Research partner
ATL - Technology partner
Here’s a quick overview of our talk:
1 - The problem we address (Susan)
2 - Research aspects (Marcus)
3 - Design decisions (Phil)
4 - How it works (Susan: example)
5 - Wrap up (Marcus)
The project was Phil’s idea. His firm, Adaptive Technologies, had built our website.
This is a typical page - Phil was shocked by how much information was missing.
We don’t know who designed or made it.
This is not of course uniquely MoDiP’s problem but is shared by many different types of collections.
This slide gives some examples.
It isn’t that in each case the information is necessarily very difficult to find but that
there are so many objects and there isn’t the time to do the work.
Shared with the public, the work becomes doable, and it also engages people
who may never visit the physical collection in a meaningful way.
These aspects of the problem…
- many objects with missing information
- requires detective work, not special skills or tools
- secondary agenda: audience engagement.
… make it ideal for crowdsourcing…
… which is a term coined by Jeff Howe to describe the practice of using the Internet to outsource work to a large number of individuals.
Crowdsourcing has been used in various contexts in the cultural heritage sector.
Looking at Oomen & Aroyo’s classification of how crowdsourcing can support
“the digital content lifecycle in heritage organisations”, we can see that
most crowdsourcing projects, especially game-based ones, fall into Classification and Correction & Transcription.
We focus on Contextualisation, where people go out and
discover facts that are missing from the collection data
---
Oomen, J. & Aroyo, L. (2011). Crowdsourcing in the cultural heritage domain: opportunities and challenges. In: 5th International Conference on Communities & Technologies. Brisbane, Australia, 29 June - 2 July 2011. Available: http://www.cs.vu.nl/~marieke/OomenAroyoCT2011.pdf
There are many different interpretations and, in fact, typologies of crowdsourcing.
Some key aspects in this context are level of engagement, task difficulty, collaboration and level of co-determination in projects.
Ten Most Wanted looks at game-based crowdsourcing, and in particular the more complex end of the scale,
i.e. exploratory multi-user games that require sustained engagement and social interaction.
---
Ridge, M. (2011). Playing with Difficult Objects – Game Designs to Improve Museum Collections. Proceedings of Museums and the Web 2011. Available: http://www.museumsandtheweb.com/mw2011/papers/playing_with_difficult_objects_game_designs_to
Haythornthwaite, C. (2009). Crowds and communities: Light and heavyweight models of peer production. Proceedings of the Hawaii International Conference On System Sciences (pp. 1–11). IEEE. Retrieved from http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumber=4755627
We have three main research questions:
- How to design and facilitate complex crowdsourcing games
- How to integrate found information with the collection data
- How to deal with intellectual property issues in a transparent, non-limiting way that keeps our players happy.
---
Now Phil will talk about some key design decisions relating to these questions.
Although we see today as the real launch of the game, it has had a soft launch via volunteers connected to Adaptive Technologies. We already have two solved cases, with a third almost complete. I am going to talk you through this one. We knew the name of the manufacturer, although we hadn’t been able to find out anything about the firm. We also knew it was a patented design. We wanted to know who had designed it and when.
People in Brighton, London and Canada helped in the search. The case got going when someone found that the patent had been taken out by an Ernest Harrison of Bury in 1932. Someone else found that the manufacturer was also located in Bury; that the clips were exhibited at the White City British Industries Fair in 1934; and that Ernest Harrison and Byson were jointly associated with at least nine patents, suggesting that Harrison was either an owner or an employee of Byson.
Someone else had a collection of the clips that he had gathered together when renovating a 1920s house and sent us a picture of them.
And finally, after the involvement of six different people, the case was solved. The last piece in the puzzle was the discovery of Rosemary Phillips’s Canadian website which mentions her grandfather, Ernest Harrison, beginning a creative career in 1927 with the invention of a bread slicer. Contact with Rosemary revealed that, yes, Ernest Harrison was the inventor of the carpet clips and also the owner of the Byson company, plus a lot more information. Rosemary has now signed up to play the game and is encouraging her friends to do so.
To wrap up what we have learnt so far:
Curated case notes seem a good way to:
> summarise findings
> present evidence
> credit players
Game mechanics and reward structures are a difficult topic due to participants’ different backgrounds and motivations
The “crowd” has amazing expertise: we were really surprised at the quality of responses