Create a place where a GLAM can batch upload their own multimedia items and associated metadata for review/improvement/removal before publishing live to Commons.
(GLAM: Galleries, Libraries, Archives, Museums)
Toolset project
A project has been started to address this request. See GLAM/Toolset_project for updates and discussion of this project.
Use cases
- "...Why not a simple interface where I can upload as many files as possible in one go from my computer (open a folder and upload, like in Wordpress or Flickr)"
Feature requests
- institutional log-in accounts
- accept data in multiple formats but output in Commons-standard format
- allow mass categorisation/tagging/licensing/captioning
- allow Commons admins to review before publishing
- allow tracking/export of usage/change data - see Metrics request.
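The last item in the list above (tracking/export of usage data) can already be approximated with the Commons API's globalusage property, which lists the pages on all wikis that use a given file. A minimal sketch, shown offline against a canned response in the shape the API returns; a real tool would fetch the built URL over HTTP:

```python
import json
from urllib.parse import urlencode

def globalusage_url(filename):
    """Build the Commons API query URL for a file's cross-wiki usage."""
    params = {
        "action": "query",
        "prop": "globalusage",
        "titles": f"File:{filename}",
        "format": "json",
    }
    return "https://commons.wikimedia.org/w/api.php?" + urlencode(params)

def extract_usage(response_text):
    """Flatten the API response into (wiki, page title) pairs."""
    data = json.loads(response_text)
    pages = data["query"]["pages"]
    return [(u["wiki"], u["title"])
            for page in pages.values()
            for u in page.get("globalusage", [])]

# Canned response mimicking the API's structure (offline stand-in).
canned = json.dumps({
    "query": {"pages": {"123": {
        "title": "File:Example.jpg",
        "globalusage": [{"title": "Example_article",
                         "wiki": "en.wikipedia.org",
                         "url": "https://en.wikipedia.org/wiki/Example_article"}]}}}
})
print(extract_usage(canned))
```

Exporting this per-file list for all of an institution's uploads, on a schedule, would cover much of the metrics request.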
Past discussions/ideas
The main problem with GLAM mass uploads is so-called metadata ingestion: transferring the metadata about the files from whatever medium and format the institution holds it in into something suitable for Commons and its templates and categories (once that is done, data and metadata are easy to upload via scripts). Institutions understandably don't want to rebuild their (meta)database from scratch or by hand, and Commons users shouldn't be left with too many errors to fix.
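The ingestion step described above can be sketched as a mapping from an institutional export onto {{Information}} wikitext. The column names below (identifier, title, creator, date, source, rights) are hypothetical: real GLAM exports vary widely, and the mapping would need per-institution configuration.

```python
import csv
import io

# Target wikitext: the standard Commons {{Information}} template.
INFORMATION_TEMPLATE = """{{{{Information
|description={description}
|date={date}
|source={source}
|author={author}
|permission={permission}
}}}}"""

def row_to_wikitext(row):
    """Map one metadata record onto {{Information}} wikitext."""
    return INFORMATION_TEMPLATE.format(
        description=row["title"],
        date=row["date"],
        source=row["source"],
        author=row["creator"],
        permission=row["rights"],
    )

# A one-row stand-in for an institutional CSV export.
sample = io.StringIO(
    "identifier,title,creator,date,source,rights\n"
    "img001,Old town hall,Jane Doe,1902,Example Archive,PD-old\n"
)
records = [row_to_wikitext(r) for r in csv.DictReader(sample)]
print(records[0])
```

The hard part in practice is not the template rendering but the per-field mapping and cleanup, which is exactly what a reusable ingestion tool would configure rather than hard-code.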
An upload tool would not, by itself, solve the problem that Commons does not actually handle structured metadata, so most of it would be wasted (see strategy:Proposal:Dublin Core), although with some effort it can be reused and corrected (see Bundesarchiv).
- Documentation on the "multimedia staging area" from the 2009 Paris Multimedia meeting: usability:Multimedia:Meeting_in_Paris/Notes/Staging_Area
- "Media Review" proposal in StrategyWiki - referred to in the Product Whitepaper: strategy:Proposal:Media review
- Wiki Loves Monuments uploader: commons:Commons:Wiki_Loves_Monuments_upload
- Ryan Kaldari's uploader: tools:~kaldari/uploader/
- Runs on the Toolserver, which has reliability problems.
- It is only an uploader and has no metadata ingester: you still have to write a custom script to parse whatever metadata you have into its database.
- It currently works only in English, with no localization support.
- It was written in a few days, so it is still rough around the edges and likely has some bugs.
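The "custom script" mentioned in the list above might look like the sketch below. The uploader's actual database schema is not documented here, so the table and column names are assumptions for illustration only (shown with an in-memory SQLite database):

```python
import sqlite3

def load_records(conn, records):
    """Stage parsed metadata records into a (hypothetical) uploader table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS staged_files "
        "(filename TEXT PRIMARY KEY, description TEXT, license TEXT)"
    )
    conn.executemany("INSERT INTO staged_files VALUES (?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load_records(conn, [("img001.jpg", "Old town hall, 1902", "PD-old")])
rows = conn.execute("SELECT filename, license FROM staged_files").fetchall()
print(rows)
```

Each institution would still need its own parsing logic upstream of this step, which is the gap a generic ingestion tool is meant to close.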
- Europeana work
- David Haskiya's summary of the toolset project: File:GLAM_tooling_project_GLAMcamp.pdf
- A longer, more contextualized slide deck: http://www.slideshare.net/DavidHaskiya/europeana-and-wikipedians