Alpha-test: Evaluation report (T04)


Marianne Dujardin (ULB)
 
 

1. Overview of the project
2. Integration of VirLib in Impala - Current system & changes for the librarian
   2.1. Electronic delivery within Impala
   2.2. Users
3. What was tested?
4. Methodology - Strategy
   4.1. Schedule
      1. The internal tests
      2. The inter-institutions tests
   4.2. Evaluation grids
      1. Aspects analysed
      2. Grids for each test (for each Impala request)
      3. Synthesis grids of the whole system
5. Results
   5.1. First trials (internal)
   5.2. Inter-institutions trials
      I. The system (scanning, sending, quality of the document, reliability of the system, Ariel)
      II. Global improvements: interface & monitoring
6. Conclusion & improvements

1. Overview of the project

The evaluation of the VirLib system took place after the completion of the first two workpackages: WP1, "Improvements and Packaging of VirLib Software", and WP2, "Development of a PDF server". In the technical annex of the project, these two workpackages were prerequisites for the evaluation trials. The aim of WP4 is to "test the technical developments made under WP1 & 2 in a real environment".

The objective of WP1, managed by IRIS, was the consolidation and improvement of the existing software, the securing of file transfers, and the packaging of the software.

The purpose of WP2, coordinated by UIA, was the development of a PDF server for the electronic delivery of ILL materials between libraries in Belgium using the Impala document ordering system.

The evaluation of the software developed for VirLib was coordinated by ULB. The tests were performed by the three partners of the project: UIA, ULB and KBR.

In the technical annex, the start date mentioned was month 7 (May 1999) and the end date month 9 (July 1999). The completion of this evaluation report was delayed, mainly because we encountered technical problems and had difficulty drawing up a test schedule between the partners during the holiday period.
 
 

2. Integration of VirLib in Impala - Current system & changes for the librarian

2.1. Electronic delivery within Impala

Impala, the Instant mailing procedure for automated lending activities, provides Belgian libraries with a system for sending and receiving interlibrary loan (ILL) requests. It is used for returnables (books) as well as for non-returnables (photocopies). Since 1992 Impala has been open to all Antilope libraries (Antilope, created in 1981, is the union catalogue of periodicals for Belgium), and Impala has meanwhile become the national document ordering system for Belgian scientific libraries.

VirLib can be considered an add-on to Impala, specifically intended to facilitate the electronic delivery of documents (photocopies of journal articles) between Impala libraries. The Web version of Impala now includes options for electronic document delivery services: VirLib II allows libraries to launch the scanning of documents right from the Impala Web interface, convert the scans to PDF and send them to the requesting library.
 
 

The VirLib software is in fact a set of applications, which can be divided into two groups:

VirLib is integrated with Impala in such a way that, after successful receipt of the file, a clickable URL is displayed in the Impala application. Moreover, all processes besides scanning run in the background, so users in Inter Library Loan (ILL) departments are not confronted with the technical details involved in converting documents from TIFF to PDF or in sending and receiving PDF files through email or FTP. When a problem occurs, a monitoring system alerts the ILL desk and allows the user to intervene.
 
 

2.2. Users

This evaluation was made in collaboration with the ILL departments of the three institutions: UIA, ULB and KBR. VirLib will be used only by professional librarians: in Impala a user cannot send an ILL request directly to another Belgian institution, since every request must first be confirmed by a member of the ILL department of the user's own institution. Therefore the system did not need to be evaluated by non-professional users.

The test users who evaluated the system are the actual librarians of the three ILL departments. These people have known the system for a long time and work daily with Impala, so they were not confronted with an entirely new system to learn. Being used to working with Impala, they only had to learn the few new VirLib functions added to the system.

In fact, the librarian's work with VirLib II is nearly the same as before:

When a request is launched in Impala, electronic delivery is the default delivery option. Supplying libraries are prompted to deliver materials electronically only if they are registered as "electronic suppliers" in Impala. The supplying library receives the request and can use the "Scan" button to launch a scanning operation, which starts the VirLib acquisition software. The librarian only has to check the scanning process and does not need to understand the whole process in detail. All other operations run in the background, so the ILL librarians cannot act directly on the system. They can only verify that the whole system works (that the requested document is supplied), that the scanning process works, and that the quality of the received document is good enough. The computer specialist checks that the system (servers, communication, and so on) is working properly and solves technical problems when they occur.
 
 

3. What was tested?

All the operations except the launching of the scanning run in the background for the librarians of both libraries (the requesting library and the supplying library), so they cannot be directly controlled or evaluated with a grid: writing the TIFF files to the local drive, creating a *.dat file containing meta-information, wrapping everything into a TAR file, sending the TAR file to the server, untarring the TAR file and converting it to a single PDF file, and sending the PDF file through email to the VirLib workstation of the requesting library; and, on the requesting side, watching for incoming files, extracting the incoming PDF files from the mail message and putting them in a secure location, constructing a secure URL, and updating the Impala database. The tests could evaluate the ergonomic aspects of the Impala interface, verify that the system works well, and measure the time needed to receive a requested document.
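To make the supplying-side chain more concrete, here is a minimal Python sketch of the packaging steps just listed (TIFF pages plus a .dat meta-information file wrapped into a TAR archive). Only the C:\virlib\outgoing directory name is taken from the evaluation grids at the end of this report; the file names and the .dat format are illustrative assumptions, not the actual VirLib code.

    # Sketch of the supplying-side packaging steps described above.
    # The .dat format and file names are assumptions for illustration;
    # only the outgoing directory name comes from the evaluation grids.
    import tarfile
    from pathlib import Path

    OUTGOING = Path("C:/virlib/outgoing")

    def package_request(impala_no: str, tiff_pages: list[Path]) -> Path:
        """Bundle the scanned TIFF pages and a .dat meta file into one TAR."""
        dat_file = OUTGOING / f"{impala_no}.dat"
        dat_file.write_text(f"impala={impala_no}\npages={len(tiff_pages)}\n")
        tar_path = OUTGOING / f"{impala_no}.tar"
        with tarfile.open(tar_path, "w") as tar:
            tar.add(dat_file, arcname=dat_file.name)
            for page in tiff_pages:
                tar.add(page, arcname=page.name)
        # The archive is then sent to the VirLib server, which untars it,
        # converts the pages to a single PDF and mails it to the requester.
        return tar_path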

Two aspects were tested: on the one hand the usability of the system, and on the other hand the system itself and its reliability.

1. The librarian's point of view

The attitude of the librarians working in the ILL department: for this part of the work, we carried out a kind of "observation of work practice". We observed the users interacting with the new system in context, i.e. in their workplace. WP4 did not aim to cover the Impala system in its entirety; as said before, the VirLib functionality is only a small part of Impala, so the evaluation focused solely on the usability of the VirLib functionality.

For the first set of tests, a direct observation was made: the observer was physically on site, taking notes. This allowed us to assess the attitudes of the future users towards the system.

2. The system itself

The reliability and the speed of the system are the two aspects that were evaluated. The evaluation of the system itself was made under normal conditions of use, in two steps: the first trials were carried out internally, and the second set of trials was conducted in collaboration with the ILL departments of the three institutions involved in the project: UIA, ULB and KBR. This second aspect is the main point evaluated in the trials.
 
 

4. Methodology - Strategy

4.1. Schedule

The evaluation began later than expected, for the reasons detailed above: the VirLib software was only installed by the end of June, so the evaluation took place during the holidays.

The installation of the VirLib software on the PC of the ULB ILL department was carried out at the beginning of July by the ULB team. This installation was later accompanied by some explanations about the VirLib modules and about how the system works. The tests were made in two steps: the internal tests and the inter-institutions tests. During the internal trials, a comparison between VirLib and Ariel was also made.

1. The internal tests

The first tests were made by two institutions: KBR and ULB. UIA had already carried them out, having installed and tested the system for several months; at that time, however, the methodology had not yet been chosen and the evaluation grids had not yet been prepared. Therefore, we have no data from their trials.

An internal trial is a request made by an institution and supplied by the same institution. Such tests are possible because VirLib uses a single server for both operations: receiving documents from other VirLib institutions and sending documents to them. The same server can thus be used for the two sides of an Impala request, and a library can be at the same time the requesting library and the supplying library.

Each institution sent some Impala requests to itself. The staff was invited to become acquainted with the new system, to evaluate it and to point out the main technical problems. These first tests were expected to take roughly 10 days. A preliminary report was made to identify the problems to be solved and to suggest possible improvements to the system.

The evaluation tests were made by a member of the ILL staff and monitored by a neutral observer. A computer scientist also took part in the evaluation, providing information about the technical problems occurring at the VirLib workstation.

This approach enabled a detailed analysis of the performance of users doing real work with the product being evaluated. Users undertook real work tasks while observers took notes and timings. The observations were subsequently analysed in detail, and usability metrics for efficiency, effectiveness, productive period, etc. were produced. It was a user-based evaluation. At the end of the trials, the observer held a debriefing with the ILL staff member in order to fill in the grids.

This first phase was used to install the software on the local PC and to get it working properly. The main problems occurred during the scanning process. This phase was meant to focus on the main aspects of the evaluation: reliability (maturity, fault tolerance, recoverability), usability (understandability, learnability, operability), maintainability (analysability, changeability, stability, testability) and portability (adaptability, installability, conformance, replaceability).

These tests were also meant to compare VirLib and Ariel. Because of the delay incurred during the first set of trials, the comparison was reduced to a minimum and carried out in the remaining time. The tests were also meant to try out the new Ariel integration inside VirLib.

It had been planned to evaluate Ariel through a dedicated set of trials. Unfortunately, the Ariel tests were made only during the internal tests at ULB, and they only evaluated the scanning process in comparison with the same step in VirLib.
 
 

2. The inter-institutions tests

This second phase was carried out between the three institutions (UIA, KBR, ULB): a set of Impala requests was sent between them. These tests allowed us to evaluate the system on a "large scale".

This step was carried out by the ILL department staff itself and focused on the evaluation of the system. It aimed to time the several steps of the delivery of an article, to evaluate the quality of the documents and to determine the reliability of the system.
 
 

4.2. Evaluation grids

A request in Impala involves two libraries: a requesting library, which sends the request for an article through Impala, and a supplying library, which holds the article and can send a copy of it. Both libraries therefore played a part in the evaluation of a single document request.

The requesting library is concerned with the request itself, the reception and quality of the incoming files, and the updating of the Impala interface. The supplying library is concerned with the reception of the request, the scanning process (via the plug-in), the conversion of the files and the sending of the files to the requesting library. To cover both sides of the evaluation of a single request, two different grids were created: one for the requesting library and one for the supplying library. To allow the different grids to be matched, the Impala number of the request had to be noted at the top of the evaluation tables.
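Since each request thus produces one grid on each side, the Impala number serves as the key for pairing them. A minimal sketch of that pairing, with hypothetical field names and one record borrowed from the data table at the end of this report:

    # Pairing the requesting-library and supplying-library grids of one
    # request by Impala number. The field names are hypothetical.
    requesting_grids = {"847909": {"received": "17:04:14", "quality": "ok"}}
    supplying_grids = {"847909": {"scan_start": "16:48", "pages": 15}}

    for impala_no, req in requesting_grids.items():
        sup = supplying_grids.get(impala_no)
        if sup is None:
            print(f"{impala_no}: supplying grid missing -> incomplete request")
        else:
            print(f"{impala_no}: {sup['pages']} pages, received {req['received']}")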

Four evaluation grids were provided so that the persons in charge would evaluate the system in the same way and so that the results could be standardised: two grids for the tasks of the requesting library and two for the tasks of the supplying library.
 
 

To test VirLib, four grids were drawn up; they are reproduced under "Working Documents" at the end of this report.

1. Aspects analysed

The ISO/IEC 9126 characteristics and subcharacteristics provide a useful checklist of issues related to quality. The definitions of the software quality characteristics are:

2. Grids for each test (for each Impala request)

One grid is a working document for noting the problems encountered, the timing and the comments for each step of an Impala request, and finally the quality of the sent document. These grids served first of all to evaluate the reliability and the speed of the system.

For each request, the following aspects were evaluated:

1. The reliability of the system

   The trials aimed to verify that the system works well and, when it does not, to list the problems encountered and establish their frequency (special attention was paid to the reliability of the system during the scanning process).

2. The speed of the system (the timing)

   Most of the VirLib processes run in the background, so it was difficult, and sometimes impossible, to time the different steps. A precise timing of each step of the supplying of a request was not necessary, however; timings were therefore based, by preference, on the dates and times recorded in Impala and in the VirLib monitoring.

   It was also decided not to measure the time needed to find the requested document on the shelves of the library, nor the time needed to photocopy the article. The important figure was the time needed to process a request, that is, the time needed to scan the document and send it to the requesting library.

   The person in charge in the supplying library wrote down in the grids the date, the start and end times of the scanning, and the monitoring time (when the file was sent to the server of the requesting library). In the requesting library, the tester only noted the time indicated in the monitoring.

3. The quality of the document

   This field allowed the tester to specify the quality of the received document and its completeness.

In a specific Comments field, the user could mention any information he considered important.

For each library (the requesting library and the supplying library), the person in charge had to fill in the following fields concerning the request:

Impala N°:
Date:
Requesting library (acronym):
Supplying library (acronym):
Number of pages of the document:
Equipment:

The Impala number allowed the two halves of one request to be put together: the evaluation grids filled in by the requesting library and those provided by the supplying library. The information about the two libraries involved and about the tester could also be telling, for instance about the frequency of problems occurring in the tests made by a specific library or between two specific libraries.

The information concerning the number of pages of the document sent was also very important: in the previous tests, problems occurred frequently during the scanning process when the number of pages was too large.

The equipment used could also be a cause of problems, which is why information about it had to be written down.

3. Synthesis grids of the whole system

The grids also cover some aspects of the system that are not directly linked to individual tests. The first grids allowed us, on the one hand, to synthesise the problems encountered and their frequency and, on the other hand, to derive from the average over all the tests an indication of the time needed for each step of the process. These two grids made it possible to evaluate the functionality and the ergonomics of the system, and for each step of the process the person in charge could also suggest improvements. These grids were filled in at the end of the evaluation period, in one go. As said before, the Impala system was not changed in its entirety, so the evaluation concerned only the new VirLib additions.

The two libraries also evaluated the monitoring interface (incoming files and outgoing files) in a separate grid; the monitoring has to be evaluated separately.
     
     

For each step, the following aspects were evaluated:

1. The functionality of the system
2. The ergonomics of the screens, the tools, the messages sent by the system and the help function
3. The possible improvements

5. Results

5.1. First trials (internal)

a. Introduction

An internal trial is a request made by an institution and supplied by that same institution.

The aim of these tests was to check the local installation and to make sure that the inter-institutions tests could take place in good conditions. The ILL departments could thus identify the technical problems and solve them as they occurred. These tests were carried out by only two institutions, ULB and KBR; UIA, which had used the system since October 1998, had already done its internal tests.

This first part of the evaluation was conducted during July, August and September 1999, according to the availability of the persons involved: on the one hand, the persons in charge of the evaluation in each institution and, on the other hand, the librarians of the ILL departments.

At the end of the tests, the person in charge in each institution had to make a summary of all the trials. This synthesis listed all the problems encountered, the frequency of their occurrence and the solutions chosen to solve them. It also gave a first average of the time needed for each step and for the whole delivery process.
 
 

b. The internal tests at the KBR site

The battery of tests at the KBR was made in two steps:

- the first phase used the HP ScanJet 4c/T

- the second phase used the HP ScanJet 6200.

The KBR did the tests using HP scanners because the Minolta PS3000 book scanner still did not work with the plug-in.

During the tests, KBR installed the VirLib system on a dedicated server running Red Hat Linux 6.0.

The first phase (5 tests)

With the HP ScanJet 4c/T, the KBR team faced many problems, which appeared when the team tried to improve the quality of the scanned documents. Documents scanned in black and white were unreadable on screen as well as on paper. At first this poor image quality was attributed to the glazed paper of the original, but a scan of a photocopy of the same article was also of bad quality.

The team then tried to adapt the standard settings of the scanner software, but these adaptations caused problems: the scanning process was systematically interrupted during the scanning of the second page of the document. The ACQVIRLIB module displayed the message "This program has performed an invalid page fault in module Acqvirlib.exe, ...", indicating that the program would be closed. In fact the program was not closed, but the whole system was blocked. The system worked only with the standard settings of the software, but then the quality of the scanned document was bad.

The use of this scanner was completely abandoned by the KBR, and no solution to this settings problem has been found there yet. IRIS, on the other hand, carried out a set of trials on its own premises with the same equipment and did not face any similar problem, but could not explain what was done wrong at the KBR, so this problem remains unsolved.

The second phase (2 tests)

To improve the situation, a second set of tests was made with the HP ScanJet 6200; it worked well and caused no problems. However, this scanner has no ADF (automatic document feeder), so the librarian has to scan each page of the document manually and load the preference settings, which makes the whole process time-consuming. KBR reported this to IRIS; again, IRIS did not encounter any problem but could not tell what was done wrong.
 
 

c. The internal tests at the ULB site

The ULB also carried out two sets of tests, with two different scanners:

- HP Scanjet 4C

- Bell & Howell 5000FS

At the beginning of the trials, the ULB team faced many problems due to the scanner software. The scanner previously used in the ILL department of ULB (an HP ScanJet 4P) does not work with a feeder and consequently took a lot of time. To avoid manual intervention, to save time and to speed up the treatment of a request, it was decided to install another scanner working with a feeder. The new scanner (an HP ScanJet 4C) was installed in the ILL department at the beginning of the tests.

When the new scanner was installed, the corresponding scanner software was not installed on the hard disk of the PC, because the scanner appeared to work with the old version. Since the VirLib system uses the original scanner software, this situation caused many problems: the system was blocked. About twenty tests were carried out during July and August 1999, covering internal requests (from ULB to ULB) as well as normal Impala requests (from another institution to ULB). Only 4 tests were successful.

For several internal reasons, we resolved this problem late. We were misled by two things:

- the system sent the message "This program has performed an illegal operation and will be shut down" (the first line of the details was "AcqVirlib caused an invalid page fault in module AcqVirlib.exe"), which gave no information on the real problem; this error message led us to conclude that the problem came from the AcqVirlib module;

- the Ariel system was running without any problem (but Ariel developed its own acquisition software and does not use the original scanner software).

In any case, even when the correct version of the software was installed, the process with this scanner remained time-consuming because:

- the feeder worked slowly

- the scanning was made in two steps (a pre-scan was performed first)

- the librarian had to select the correct scanning zone himself, according to the quality and the contents of the document.

To be complete, we note that only two tests were successful at ULB with the HP ScanJet 4C. These tests were carried out during the internal evaluation (Impala requests N° 804950 [from UIA] and N° 816120 [from ULB]). For the two requested documents, the time needed to scan was respectively 10 minutes for 10 pages and 12 minutes for 11 pages. We can reasonably conclude that scanning with the HP ScanJet takes about a minute or more per page, a conclusion confirmed by the results of the tests carried out at the KBR.
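As a quick check, the two successful HP measurements can be converted to a per-page rate (a small sketch; the figures are the ones quoted above):

    # Per-page scanning rate for the two successful HP ScanJet 4C tests.
    tests = {
        "804950": (10 * 60, 10),  # 10 minutes for 10 pages
        "816120": (12 * 60, 11),  # 12 minutes for 11 pages
    }
    for impala_no, (seconds, pages) in tests.items():
        print(f"Impala {impala_no}: {seconds / pages:.0f} s/page")
    # Output: 60 s/page and 65 s/page, i.e. about a minute or more per page.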

Later, a second set of tests was carried out with the Bell & Howell 5000FS. Four tests were made with this scanner; we could not make more because too much time had been lost with the first scanner. All four tests were successful. Unlike with the HP scanner, the feeder worked, the selected zone was correct and the scanning was done in one go. The time needed to scan a document is consequently reduced, and the whole process takes half the time needed with the HP.

The situation was still not totally satisfactory, because the Bell & Howell 5000FS is not compatible with the Ariel software. ULB can receive a document sent by an Ariel partner through VirLib, but it cannot deliver a document to an Ariel partner (except to BLDSC, TUDelft and INIST).
 
 

d. Ariel

The few trials carried out with the Ariel software did not pose any problems. The small number of trials was due to several reasons:

The evaluation of the Ariel software in comparison with VirLib is therefore based on the few tests carried out at ULB with the HP ScanJet during the first set of tests.

First of all, we note that the Ariel software poses fewer problems than the VirLib software, because Ariel uses its own image acquisition software, while VirLib calls the standard TWAIN interface and is thus less device-dependent than Ariel.

Much depends on the equipment used to scan documents. The few trials allowed us to conclude that, when an HP scanner without ADF is used, the Ariel software is easier to use than VirLib. Moreover, with the HP TWAIN interface, manual intervention by the librarian is frequently needed (manual feeding of sheets, manual selection of the zone to be scanned), and each page is scanned twice with VirLib (pre-scan and scan). The tests with the Bell & Howell scanner with ADF indicate that VirLib can perform as well as Ariel.

With VirLib, the Impala service is improved, but the librarian's work becomes more complicated when the equipment used is not adequate. Ariel seems more efficient with non-ADF HP scanners; on the other hand, VirLib is competitive with Ariel when an HP scanner with ADF or a Bell & Howell scanner is used.
 
 

e. Conclusions of the first trials

The first trials were completed later than expected; this step fell behind schedule. The problems encountered reduced the amount of data collected, and few aspects of the system could be evaluated during this part of the evaluation. This step mainly served to install the scanner and the software on the local PC. The evaluation of the whole system (including the workstations and the network) was deferred to the second set of trials.

This difficult installation of the software leads us to insist that special attention must be paid to the selection of appropriate scanning equipment and to the installation of the system in the ILL departments of the Pro VirLib partners.
 
 

5.2. Inter-institutions trials

a. Introduction

The three VirLib institutions (KBR, UIA, ULB) participated in this second phase of the evaluation of the VirLib system. The trials lasted two weeks (from 18 October until 29 October). New grids were made to take into account the critical remarks noted during the first set of tests. The three partners filled in the grids for each document supplied electronically to the two other institutions or received electronically from them.

Requests sent before 18 October or received after 29 October could not be taken into account for the evaluation. Some data from those tests were nevertheless used; for instance, the time needed to scan a document is informative and interesting by itself, even when we did not have all the data about the request.

This set of tests took place later than foreseen. The holidays and some technical problems (for instance, KBR experienced problems with setting up an email account) delayed the start of the inter-institutions tests; they could only begin once the technical problems linked with the scanner installation were solved.

The tests were made only under the normal working conditions of Impala; the partners did not create specific requests for this period.

In agreement with the other partners, it was decided to measure, for the supplying library, only the time needed to scan the document and to fill in the request. The time span covering collecting the document, making a photocopy, scanning it and notifying Impala was not taken into account, because the time needed to collect, copy and scan the document was considered a matter for the ILL service of each institution rather than part of the VirLib system.
 
 

b. The participating libraries

1. UIA

   The University of Antwerp is the coordinating organisation of three institutions: the Universitaire Instelling Antwerpen (UIA), the Universitair Centrum Antwerpen (RUCA) and the Universitaire Faculteiten Sint-Ignatius te Antwerpen (UFSIA). Each institution has its own campus and library. Only the UIA participated in the evaluation, not RUCA or UFSIA.

2. ULB

   The ULB also comprises several libraries: the library of Law, the library of Human Sciences and the libraries of Applied Sciences on the Solbosch campus, and the library of Medicine on the Erasme campus. Each campus has its own ILL department. Only the ILL department of the Solbosch participated in the evaluation.

3. KBR

   The ILL department works for the whole library.

c. Conditions

The evaluation is based on the 25 requests that were supplied during the two chosen weeks, but some of them could not be taken into account, for several reasons detailed below.

Three documents requested by UIA from ULB were sent on paper, because the page format was not A4 and the scanning would have required several operations before the document was ready to be scanned and delivered electronically (Impala N° 847909, 851681, 852211).

Several other Impala requests were discarded because they were incomplete: some requests were made before the beginning of the evaluation period or were delivered after it, so either the evaluation grid of the requesting library or that of the supplying library was missing (Impala N° 836548, 833313, 833527, 851153).

The number of complete requests taken into account in this evaluation is therefore 18 Impala requests.

Another factor explains the small number of requests: the participating libraries do not cover the same fields. For the University of Antwerp, only the UIA participated, and at the ULB only the ILL department of the Solbosch took part. The ILL department of UIA involved in this evaluation covers the biomedical field, whereas the KBR and the ULB cover mainly the human sciences and, to some extent, the applied sciences. Several documents requested by ULB and KBR from UIA were titles held in other reading rooms, from which they could not be delivered electronically.

This situation also explains why all 18 requests came from UIA: 14 of them were sent to KBR (78% of the total) and 4 to ULB. UIA, on the other hand, did not receive any request from the other institutions. It seems natural that the users of the ULB and the KBR are less interested in the documents that UIA owns than the other way round.
 
 

REQUESTS

From   To    Number of requests
UIA    ULB          4
UIA    KBR         14
ULB    UIA          0
ULB    KBR          0
KBR    ULB          0
KBR    UIA          0
 

These trials were few in number, but they were conducted in a real-world working environment and allowed us to evaluate several aspects of the system. The organisation of a session of intensive trials was not judged useful by the other partners, so the system was not evaluated under conditions of misuse or under extreme or bad conditions. No unsuccessful request was sent again to the requesting library using the files kept in the "archives" or "outgoing" folders.
 
 

d. Results
 
 

I. The system

Note 1: since all the requests were sent by UIA, the results can be viewed as an evaluation of the capacity of the ULB and the KBR to deliver a document to UIA, and of the quality of that service. Each library played a specific role: UIA was exclusively the requesting library, while ULB and KBR were exclusively supplying libraries.

Note 2: the times noted in the grids by the persons in charge can differ from one team to another. The time is generally taken from the internal clock of the PC or of the server, or from the watch of a team member, and these are usually not synchronised (differences can amount to several minutes). These data must therefore be viewed as indicative.

Note 3: the grids of the tests taken into account are often not entirely completed; some data are missing.

Scanning

The calculation of the scanning time was based on all the data collected; incomplete requests and internal tests (documents delivered to LUC or UFSIA, and deliveries made after the test period at ULB) were taken into account in order to enlarge the data set.

In general, the time needed to set up and prepare the work is spread over the number of pages, so the average time needed to scan a page is shorter for long documents than for short ones. This effect is more pronounced for a faster, higher-performance scanner like the Bell & Howell than for an HP ScanJet.

With the Bell & Howell scanner, for instance, the time per scanned page ranges from 2.63 s/page for a document of 19 pages (Impala N° 845939) to 15 s/page for a document of 2 pages (Impala N° 852519). At the KBR, the average scanning time is 74.3 s/page; the scanning process usually takes 60 to 75 seconds per page, but for small documents the time per page is higher: 120 s/page for a document of 2 pages (Impala N° 852214) and 100 s/page for a document of 3 pages (Impala N° 852203).
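These figures fit a simple model in which a fixed per-document set-up time is spread over the pages, so the average time per page falls as the page count grows. The sketch below uses illustrative parameter values only; the report does not provide fitted constants:

    # Fixed-overhead model for the per-page scanning times above.
    # SETUP_S and PER_PAGE_S are assumed values chosen to mimic the
    # Bell & Howell figures; they are not measured constants.
    SETUP_S = 25.0      # assumed fixed preparation time per document (seconds)
    PER_PAGE_S = 1.5    # assumed marginal scanning time per page (seconds)

    def avg_seconds_per_page(pages: int) -> float:
        return SETUP_S / pages + PER_PAGE_S

    for n in (2, 19):
        print(f"{n:2d} pages: {avg_seconds_per_page(n):4.1f} s/page")
    # 2 pages: 14.0 s/page (reported: 15 s/page for Impala N° 852519)
    # 19 pages: 2.8 s/page (reported: 2.63 s/page for Impala N° 845939)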

This large difference in the time per page has several causes:

When the scanning is finished, it would be useful to display a message confirming to the user that the scanning went well.

A few tests were carried out at ULB scanning directly from the original book, without photocopying the articles first. The results clearly showed that direct scanning of the book is more complicated, less ergonomic and more time-consuming than photocopying the article and scanning the copies.

Sending

Unfortunately, some grids are incompletely filled in. We have data for 12 requests supplied by KBR and 6 supplied by ULB (including information from requests outside the test period). We calculated the time needed to send the document from the PC of the ILL department of the supplying library to the VirLib workstation of the requesting library (monitoring timing). For KBR, this time ranged from 7 to 17 minutes.

The average time needed for this step was 11 minutes 25 seconds. This average seems normal, given that the workstation checks every 15 minutes whether a new file containing a document is ready to be sent.
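The 15-minute check explains most of this average: a file that just misses a pass waits up to a full interval, about 7.5 minutes on average, before the transfer even starts. A minimal sketch of such a watch loop, assuming a simple directory hand-over (the folder name follows the evaluation grids; the send function is a placeholder, not the actual VirLib transfer code):

    # Sketch of a 15-minute watch loop over the outgoing folder.
    import time
    from pathlib import Path

    OUTGOING = Path("C:/virlib/outgoing")
    POLL_INTERVAL_S = 15 * 60  # the workstation checks every 15 minutes

    def send_to_requester(tar_file: Path) -> None:
        print(f"sending {tar_file.name} ...")  # placeholder for the transfer

    while True:
        for tar_file in sorted(OUTGOING.glob("*.tar")):
            send_to_requester(tar_file)
        time.sleep(POLL_INTERVAL_S)
    # A new file waits about 7.5 minutes on average for the next pass, which
    # is consistent with the 11 min 25 s average transfer time reported above.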

Quality of the document

This aspect did not pose problems. Only one article, sent by the KBR, was declared "unreadable" because the copies were too dark (Impala N° 852208); on the evaluation grids filled in by the KBR, a remark mentions that the original copies were "very dark".

It seems obvious that the quality of the original copies directly determines the quality of the scanned article. Sending documents scanned from bad-quality copies has to be avoided.

Reliability of the system
 
Articles          Number     %
Successful          18      86%
With problems        3      14%
Total               21     100%

When the system is properly installed, there is no specific problem. The VirLib system is easy to use and works rather well; only one request was unsuccessful without a known explanation.

In the inter-institutional tests, three requests posed problems: Impala N° 851171, 851681 and 852208.

As already said, the problem with request N° 852208 is linked to the bad quality of the copies used for the scanning.

The two remaining articles were received incomplete by the requesting library. For Impala request N° 851681, the ILL department of UIA received only the first page of the document; the ULB ILL department reports that the problem came from the format of the article, which could not be sent via VirLib. The other article not received in its entirety was Impala N° 851171; no precise cause was identified, except that the requested article contained 36 pages and was the largest of all the requests in this evaluation.
 
 

Ariel

Because the first step was disturbed and delayed by the problems that occurred during the installation of the scanners at ULB and KBR, the Ariel tests were reduced to a minimum. The few data collected about Ariel come from the first ULB trials.

At the moment, the ILL department of the ULB can receive Ariel requests but cannot supply a document using Ariel.
 
 
 

II. Global improvements: interface & monitoring

The evaluation of the global system was done mainly by the KBR and the ULB; the UIA built the system and had consequently already done this kind of evaluation internally.

Interface

No specific remarks were made during the evaluation; the Impala interface has been well known to the ILL department staff for a long time.

Monitoring

The monitoring is a system tool provided with VirLib: a new application for checking the output and input of the VirLib server (a function to check the dispatch and reception of files). Contrary to the well-known Impala interface, the monitoring is not oriented towards the end user.

This is a management tool intended for a technically skilled person (a system manager, or an ILL manager with some knowledge of the workings of the VirLib server). That is why its guidelines are included in the technical documentation.
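To give an idea of what such a check involves, the sketch below simply reports the contents of the outgoing and archive folders named in the evaluation grids; everything else is an illustrative assumption, not the actual monitoring tool:

    # Illustrative folder check in the spirit of the monitoring tool:
    # report what is waiting to be sent and what has been archived.
    from pathlib import Path

    def report(folder: Path) -> None:
        files = sorted(folder.glob("*"))
        print(f"{folder}: {len(files)} file(s)")
        for f in files:
            print(f"  {f.name}  {f.stat().st_size} bytes")

    for name in ("outgoing", "archive"):
        report(Path("C:/virlib") / name)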

The evaluation was made without extensive use, for two reasons. Firstly, the Impala interface gives enough information about requests, including electronic deliveries, so the staff did not use the monitoring often and it has not yet been introduced into their procedures; secondly, the small number of electronic deliveries did not encourage the staff to use it. Nevertheless, the use of this tool will increase rapidly with the participation of new partners and the extensive use of VirLib. Therefore, if possible, this tool will be improved and made more relevant; given its purpose, however, these improvements can be considered desirable rather than essential at the moment.

The user interface of the monitoring can obviously be improved. A real explanation of the system is given only in the technical documentation of the VirLib server and not in the Impala manual, which means that the monitoring is under-used and its real function is not well understood by the librarians. To resolve this problem, a short presentation of this tool could be added to the Impala manual, allowing users to better understand the global system and its possibilities.

Several aspects of the monitoring tool to be improved were pointed out. The presentation of information and of functionalities can be improved: for instance, a clear separation between the delete function and the display function would be a real improvement, as would asking for user confirmation when the delete function is used (adding a message box could be useful here).

Along the same lines, the display of information can be improved to make the tool easier to handle. The separation between the different Impala requests could be made more obvious, and deleting some unnecessary fields (such as the copyright fields), which clutter the display and show redundant information (the data in the "mail" and "mailto" fields are identical), would make the system easier to use.

The ProVirLib tests will be useful for evaluating this tool more thoroughly, because they will be preceded by a precise assignment of roles: the tool will be managed by a single person (playing the role of system manager) and will be evaluated only by that person.

6. Conclusion & improvements
 
 

The main problems were resolved during the first set of trials (the internal tests). They occurred during scanning and stemmed from a faulty installation of the scanner (ULB) and a settings problem (KBR). At the end of those trials, both partners decided to replace the scanner they used with a more effective one. At ULB, the new scanner is compatible with VirLib but not with Ariel.

These problems caused a loss of time, and the tests of the whole VirLib system were themselves delayed.


The VirLib evaluation is based on two sets of trials. The internal tests were delayed by technical problems, and the inter-institutions trials were few in number (18 complete trials). Several factors explain this: the tests were conducted in the real-world working environment of Impala, and the ILL departments involved do not deal with the same scientific domains (UIA's field is biomedical, while ULB and KBR cover the human sciences and the applied sciences). All the requests were consequently sent by UIA, which means that only the KBR and the ULB were evaluated as suppliers.

The evaluation focused on reliability (the capability of the software to maintain its level of performance) and efficiency (the relationship between the level of performance of the software and the amount of resources used) rather than on functionality or usability. The new VirLib functionality did not pose any problem, because it is integrated into the well-known Impala system. Pro VirLib will give additional information about the portability of the system (the ability of the software to be transferred from one environment to another).

Improvements

The quality and the format of the copies of the articles used for the scanning are paramount, and the daily work of the ILL department staff has to be adapted to the new workflow within VirLib.

The results are reasonably successful. Once the scanner installation problems are solved, the system works well. Two of the three failed requests were caused by a problem of format or of quality of the scanned copies and were not directly linked to the system. The reliability of the system is good; the sole unexplained unsuccessful request corresponds to the biggest file (36 pages).

With high-performance equipment, the VirLib system is easy to use and efficient. The time needed to scan and send a document is competitive with Ariel. The new VirLib functionality is well understood and does not change the work of the ILL librarian. The old way of delivery was simpler because it involved fewer steps; the new system improves the speed of the delivery process, but not the time needed to process the request inside the ILL department of the supplying library.

Using under-performing equipment is time-consuming and demands a lot of manipulation and supplementary work from the staff. This extra work automatically increases the cost of delivery inside Impala and makes the task more laborious. The efficiency of VirLib drops quickly when no ADF is used; it then becomes lower than that of the Ariel software (because Ariel developed its own acquisition software). At the moment the system is not frequently used, but this new way of delivery will quickly become the most efficient one.


Data about Requests


 
N° Impala   From   To    Pages   Date     Timing, supplying   Timing, requesting   Network transfer
                                          library             library              (min)
847909      UIA    ULB     15    21 Oct   16:48               17:04:14             16
849334      UIA    KBR      2    19 Oct   _                   10:11:59             _
849871      UIA    ULB      8    19 Oct   16:17               16:23                 6
849881      UIA    KBR     12    19 Oct   14:38               9:51                 13
849900      UIA    KBR      7    19 Oct   14:47               9:55                  8
849920      UIA    KBR      3    19 Oct   _                   10:22:03             _
850747      UIA    KBR      6    21 Oct   9:57                10:04                 7
851084      UIA    ULB     11    29 Oct   11:35               11:43:39              4
851171      UIA    KBR     36    21 Oct   12:00               12:13                13
851338      UIA    KBR      9    21 Oct   9:47                10:04                17
851492      UIA    KBR      4    21 Oct   11:16               11:24                 8
851681      UIA    ULB      _    21 Oct   16:27               16:33:56              6
852203      UIA    KBR      3    22 Oct   11:53               12:04                11
852208      UIA    KBR      7    22 Oct   13:38               13:55                17
852211      UIA    ULB     22    22 Oct   14:07               14:22:06             15
852214      UIA    KBR      2    22 Oct   13:43               13:55                12
852515      UIA    ULB      5    26 Oct   16:45               16:50:21              5
852518      UIA    KBR      5    25 Oct   10:57               11:06                 9
852519      UIA    ULB      5    26 Oct   16:45               16:50:21              5
853432      UIA    KBR      8    25 Oct   14:26               14:35                 9
854339      UIA    KBR      8    26 Oct   15:07               15:18                11

Average network transfer time: 10 min 6 sec

Working Documents

Evaluation grids

Test for the requesting library

Impala N°:
Date:
Requesting library (acronym):
Supplying library (acronym):
Name of the tester:
Number of pages of the document:
Equipment:

Columns to fill in for each step: Timing / step OK (y/n) / Problems & solution (if possible) / Comment

1. Request
   * camera icon on the Impala interface (time:)
   * notification of the request for an electronic supply
2. Reception of the document
   * reception of the .pdf file
   * storage of the .pdf file on the VirLib workstation
   * updating of the monitoring
   * creation of a URL (time:)
3. Updating of Impala
   * updating of Impala & URL available within Impala
4. Access to the document
   * transmission of the URL to the user (time:)
5. Verification of the document
   * printing of the document
   * quality of the document (quality of the images, front page, number of pages, copyright, ...)
   * checking of the document

Synthesis - Evaluation grid for the requesting library

Columns to fill in for each step: Comment on functionality & ergonomics (screen / tools / messages provided by VirLib / help) / Improvements

1. Request
   * camera icon on the Impala interface
   * notification of the request for an electronic supply
2. Reception of the document
   * reception of the .pdf file
   * storage of the .pdf file on the VirLib workstation
   * updating of the monitoring
   * creation of a clickable URL
3. Updating of Impala
   * updating of Impala & URL available within Impala
4. Access to the document
   * transmission of the URL to the user
5. Verification of the document
   * printing of the document
   * quality of the document
   * checking of the document

Monitoring (incoming files)

Comment on functionality:

Comment on ergonomics (screen / tools / messages provided by VirLib / help):

Improvement:

Evaluation grids

Test for the supplying library

Impala N°:
Date:
Requesting library (acronym):
Supplying library (acronym):
Name of the tester:
Number of pages of the document:
Equipment:

Columns to fill in for each step: Timing / step OK (y/n) / Problems & solution (if possible) / Comment

1. Reception of the request
   * request for an electronic document in Impala
2. Scanning
   * launching of the scanning (start:)
   * scanning of the pages
   * notification of the electronic supply of the document (end:)
   * creation & storage of the .tif files and the .dat file on the local PC in the directory C:\virlib\outgoing
3. Archiving & sending to the local VirLib workstation
   * creation of the .tar file
   * copy to the VirLib workstation
   * archiving in the directory C:\virlib\archive
4. Sending to the requesting library
   * reception & untarring of the .tar file
   * conversion to a .pdf file
   * creation of the front page
   * creation of the e-mail
   * sending to the server of the requesting library (monitoring time:)
5. Facultative test: resending of the file
   * getting back the Impala request
   * resending of the file (time:)
6. Archive
   * destruction of the archives on the local PC and on the local workstation
 

Synthesis - Evaluation grid for the supplying library

Columns to fill in for each step: Comment on functionality & ergonomics (screen / tools / messages provided by VirLib / help) / Improvements

1. Reception of the request
   * request for an electronic document in Impala
2. Scanning
   * launching of the scanning
   * scanning of the pages
   * notification of the electronic supply of the document
   * creation & storage of the .tif files and the .dat file on the local PC in the directory C:\virlib\outgoing
3. Archiving & sending to the local VirLib workstation
   * creation of the .tar file
   * copy to the VirLib workstation
   * archiving in the directory C:\virlib\archive
4. Sending to the requesting library
   * reception & untarring of the .tar file
   * conversion to a .pdf file
   * creation of the front page
   * creation of the e-mail
   * sending to the server of the requesting library
5. Facultative test: resending of the file
   * getting back the Impala request
   * resending of the file
6. Archive
   * destruction of the archives on the local PC and on the local workstation

Monitoring (outgoing files)

Comment on functionality:

Comment on ergonomics (screen / tools / messages provided by VirLib / help):

Improvement: