Importing SFX MARC Records into Talis (Oct 2009 Talk)

Background

  • Number of journals available on the SFX A-Z list: 25,752
  • Number of journals using SCONUL criteria¹: 10,818
  • Number of SFX active targets: 90

The University of Sussex has used Talis Alto as our LMS (ILS) since 2001, and SFX as our link resolver since 2006. It was desirable to have all the journals we have access to - regardless of media type - on the catalogue, as that was where many users expected to find them, and it was used as a way of discovering relevant content. After briefly, and unsuccessfully, trying the simple export which comes as standard in SFX, we purchased MARCit in 2007.

Year | LMS                                            | Online Journals and Link Resolver                                      | Other
2001 | Talis                                          | In-house MS Access database of some online journals, exported to HTML | Reading lists: Talislist
2002 |                                                |                                                                        |
2003 |                                                |                                                                        |
2004 |                                                | Ebsco A-Z                                                              |
2005 |                                                |                                                                        |
2006 |                                                | SFX (first link resolver)                                              |
2007 |                                                | ↓ (MARCit)                                                             |
2008 |                                                |                                                                        | Federated search: Metalib
2009 | Web catalogue: Aquabrowser (alongside Prism 2) |                                                                        | Reading lists: Talis Aspire

¹ Print + e and e-only; excludes databases such as JSTOR.

Timeline

  • Purchased MARCit in June 2007.
  • Started off treating this as a small job, but inexperience with the Talis import process and the varying quality of the records led to duplicate records, some broken links and many missing journals.
  • September 2008: a small working group got together to oversee the process. A plan was drawn up on how to go forward - from removing the current records to loading records again, with testing/checking etc. - and on how to proceed in the future. Used an internal wiki to keep a log of all actions.
  • Back to square one (but perhaps the second best place to be, other than 'successfully finished'): we deleted all the records we had loaded.
  • Decided to load one publisher at a time, starting with publishers of which we only had a few journals. Lots of checking of records, and feedback from the cataloguer/academic liaison section.
  • Talis had an (undocumented) limit to the number of records that could be imported in one go: roughly 3,000 records.
  • We found many records did not include an 001 field, something our Talis LMS required, and so it rejected these records.
  • Many also lacked the 006/007 fields, which hold the physical description of the item, i.e. that it is an online copy. These were loaded, but our cataloguer was not pleased!
  • We developed a script which could add a 001 field to those records that lacked one (easily adapted to add/change other fields); a sketch of the idea follows this list. This allowed us to load the records which could not be loaded before.
  • Finished in February 2009 (with a gap around Christmas).
  • Still found some duplicates. They are a pain to remove.
  • Now we need to decide on a long-term method to keep Talis in sync with our e-journal holdings, bearing in mind demands on staff time, and avoiding anything that could cause problems with our catalogue (duplicates etc.).
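We're happy to share the script; as a minimal sketch of the idea, something like the following works with the CPAN MARC::Record / MARC::Batch modules. The file names, the 'SFX000001'-style identifier scheme and the 003 organisation code below are placeholders, not our production values.

    #!/usr/bin/perl
    # Sketch: add an 001 (and 003) control field to records that lack
    # them, so the Talis import will accept them. Placeholder file
    # names and ID scheme; easy to adapt to add/change other fields.
    use strict;
    use warnings;
    use MARC::Batch;
    use MARC::Field;

    my $batch = MARC::Batch->new( 'USMARC', 'marcit_export.mrc' );
    open my $out, '>', 'marcit_fixed.mrc' or die "Cannot open output: $!";

    my $n = 0;
    while ( my $record = $batch->next() ) {
        $n++;
        unless ( $record->field('001') ) {
            # Control fields (001-009) take a single data argument
            $record->insert_fields_ordered(
                MARC::Field->new( '001', sprintf( 'SFX%06d', $n ) ) );
        }
        unless ( $record->field('003') ) {
            # 003 names the agency whose scheme the 001 follows
            $record->insert_fields_ordered(
                MARC::Field->new( '003', 'OrgCode' ) );    # placeholder
        }
        print {$out} $record->as_usmarc();
    }
    close $out;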

Future

MARCit allows you to export only the items that are new since the last time you exported a file, and this lets you add new records to the catalogue. There is the wider question of keeping the two systems in sync. How do we delete records (or even know when to do so), or modify those that have changed? We could delete all records from the catalogue (perhaps annually) and then do a full reload, but this is time consuming, and is the LMS designed to cope with 20,000 records being suppressed and another 20,000 loaded on a regular basis? Are we trying to put the wrong shape into the wrong hole? What are we trying to achieve? Allowing our online journals to be discovered using the same path as our physical content, because that is where many (most) users go when searching for library resources. With the introduction of search tools such as Primo, Aquabrowser, VuFind and more, do we need to add them to the LMS catalogue?
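One hypothetical approach to the deletion question (a sketch only, not something we have implemented): keep the previous full MARCit export and compare its 001 identifiers with the current export, so that anything present last time but missing now becomes a candidate for suppression in Talis. File names here are illustrative.

    #!/usr/bin/perl
    # Sketch: list the 001 identifiers present in the previous full
    # export but absent from the current one. Illustrative file names.
    use strict;
    use warnings;
    use MARC::Batch;

    sub ids_in {
        my ($file) = @_;
        my %ids;
        my $batch = MARC::Batch->new( 'USMARC', $file );
        while ( my $record = $batch->next() ) {
            my $f001 = $record->field('001') or next;
            $ids{ $f001->data() } = 1;
        }
        return \%ids;
    }

    my $old = ids_in('marcit_previous.mrc');
    my $new = ids_in('marcit_current.mrc');

    # Anything in the old export but not the new one has gone away
    print "$_\n" for grep { !$new->{$_} } sort keys %$old;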

Aquabrowser

At the same time as doing our final MARCit load into Talis (autumn 2008) we were also setting up Aquabrowser, to go live as a beta in January 2009. Aquabrowser is a replacement web interface to a library catalogue. Every night a script on our Talis server creates a large MARC export (in several files). These are placed on the Aquabrowser server, which then loads them in (basically reloading from scratch each night). This is a seamless process. One night we placed the MARCit export file in the same directory, and the next morning all the online journal records were in our Aquabrowser-based catalogue! This had the added advantage that 'syncing' would not be a problem: we could periodically put a new MARCit file in the directory, with no issues about removing old or adding new records. So why have we not yet gone down this road? We were already most of the way through the process of loading records into Talis when we discovered this. Additionally there were some licensing issues with Aquabrowser (the MARCit export was considered another data source, something which came with an additional, not insignificant, charge). It's possible they are reviewing this policy.
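The 'drop the file in the directory' step really is that simple; in outline it is little more than the following (a sketch; both paths are invented for illustration):

    #!/usr/bin/perl
    # Sketch: stage the MARCit export alongside the nightly Talis MARC
    # export so Aquabrowser picks it up on its next full reload.
    use strict;
    use warnings;
    use File::Copy;

    copy( '/data/sfx/marcit_export.mrc',
          '/data/talis/nightly_export/ejournals.mrc' )
        or die "Copy failed: $!";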

Summary

  • Most records are of good quality; however a significant number (for us) were not.
  • MARCit is not cheap.
  • LMSs are not designed for large bulk imports and syncing with a separate system...
  • ...perhaps Next Generation Catalogues and search tools are the solution: bypass the LMS. The process was seamless with Aquabrowser (and I'm sure the same would be true for Primo).
  • Treat it as a mini-project, with testing, checking, acceptance, etc. Include cataloguers and someone from a front-end department (for us, a Senior Library Assistant from our Research Liaison department).
  • The ultimate aim is for users to be able to discover online journals via the catalogue, and access them. In this sense a record with a title, ISSN and URL would fit the need (see the sketch after this list)!
  • Issues around the quality of cataloguing are relevant and should be addressed; such records are not ideal, but not the end of the world.
  • Once you have done an initial load, you then need to think about keeping the e-journal records on the LMS up to date.
  • We created a Perl script to add 001/003 fields, which can easily be adapted to add/edit other fields. Happy to share.
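To make that concrete, a bare-bones record of the sort described might look like this (a sketch using MARC::Record; the journal details are invented, the fields are standard MARC 21 usage for ISSN, title and URL):

    #!/usr/bin/perl
    # Sketch: a minimal e-journal record - title, ISSN and URL - which
    # is enough for a user to discover the journal and reach it.
    use strict;
    use warnings;
    use MARC::Record;
    use MARC::Field;

    my $record = MARC::Record->new();
    $record->leader('00000nas a2200000 a 4500');    # continuing resource
    $record->append_fields(
        MARC::Field->new( '022', ' ', ' ', a => '1234-5678' ),      # ISSN
        MARC::Field->new( '245', '0', '0', a => 'Journal of Examples.' ),
        MARC::Field->new( '856', '4', '0',                          # online access
            u => 'http://www.example.com/joe', z => 'Full text' ),
    );
    print $record->as_formatted();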

Chris Keene
University of Sussex
October 2009
