marc import


marc import

Christine Forte
hi everyone,

i've got several files of marc records i downloaded from oclc.  i would
like to do a bulk import, but have been reading about the need to add a
952 subfield.  will i need to change something in these oclc records?  or
can i just do a bulk import?  

i'm pretty sure they're all sitting in the breeding area right now waiting
to be uploaded,  (must i do this one at a time?)

Christine Forte
Psychology Library Director
Antioch University Santa Barbara
801 Garden Street
Santa Barbara, CA  93101
Phone:  805-966-5615
Fax:  805-962-4786
e-mail:  [hidden email]

_______________________________________________
Koha mailing list
[hidden email]
http://lists.katipo.co.nz/mailman/listinfo/koha

Re: marc import

Steven F. Baljkas
Wednesday, February 22, 2006     18:15 CST

Hi, Christine,

In quick reply to your message ...

> From: "Christine Forte" <[hidden email]>
> Date: 2006/02/22 Wed PM 05:38:52 CST
> To: " <[hidden email]>" <[hidden email]>
> Subject: [Koha] marc import
>
> hi everyone,
>
> i've got several files of marc records i downloaded from oclc.  i would
> like to do a bulk import, but have been reading about the need to add a
> 952 subfield.  will i need to change something in these oclc records?  or
> can i just do a bulk import?  

... yes, you can bulk import them. I would recommend decompiling them into editable text outside of Koha first, though, using a free tool like MARCEdit (about which see URL <http://www.loc.gov/marc/marctools.html>),
then adding the necessary 942 and 952 information, editing or deleting any other fields you like, and then using MARCEdit to reconvert the whole back into valid MARC21, which you will then be able to bulk import easily.

Plus, doing things this way will save you the trouble of trying to edit with Koha's editor.

As has been cautioned before (many times), Christine, please remember to keep your original (in this case OCLC) MARC records, as there are elements in MARC records which Koha does not currently handle correctly and some which it just plain ignores.

(Given your library type and depending on your long-term interest in keeping with Koha, Christine, you may well want to spend the time configuring Koha to include display of the MARC control fields it would otherwise ignore: see Thomas Dukleth's blessedly helpful explanation on this in the comments to the defining Systems Parameters section in Stephen Hedge's Koha User's Guide at
   URL <http://www.kohadocs.org/usersguide/ch01s03.html#lowtags> .)

> i'm pretty sure they're all sitting in the breeding area right now waiting
> to be uploaded,  (must i do this one at a time?)

I think it might be better to do it through running a bulkmarcimport script as I outlined above. (I think you might have to do them one at a time out of the breeding area: it has been a long time since I've looked at that, sorry.)

Hope this helps a bit.

Cheers,
Steven F. Baljkas
library tech at large
Koha neophyte
Winnipeg, MB, Canada


Re: marc import

Joshua Ferraro-3
Hi Christine, Steven,

On Wed, Feb 22, 2006 at 06:26:35PM -0600, Steven F. Baljkas wrote:

> > i've got several files of marc records i downloaded from oclc.  i would
> > like to do a bulk import, but have been reading about the need to add a
> > 952 subfield.  will i need to change something in these oclc records?  or
> > can i just do a bulk import?  
>
> ... yes, you can bulk import them. I would recommend decompiling them into
> editable text outside of Koha first though, using a free tool like
> MARCEdit (about which see URL <http://www.loc.gov/marc/marctools.html>)
> then adding the necessary 942 and 952 information, editing or deleting any
> other fields you like, and then using MARCEdit to reconvert the whole back
> into valid MARC21 which you will then be able to bulkimport easily.
That sounds like a bit too much work to me, editing every field. I
would recommend using a small Perl script to parse the record and mirror
your holdings data (wherever it may be located) to the default fields
that Koha uses (942 for the record-level and 952 for the item-level).

Another perfectly valid way to proceed is to map the Koha holdings fields
to the fields where your holdings data lives. With that approach, you
wouldn't need to edit the records at all.
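To make the script-based approach above concrete, here is a minimal sketch in standard-library Python (standing in for the Perl/MARC::Record route the thread suggests; the branch code and barcode below are purely hypothetical). It serializes fields into a raw ISO 2709 record, appending a 952 item field and recomputing the leader's record length, the base address, and the directory:

```python
# Sketch only: mirrors holdings into a 952 field on a raw MARC (ISO 2709)
# record using just the standard library. A real migration would more
# likely use Perl's MARC::Record or Python's pymarc.
FT, RT, SF = b"\x1e", b"\x1d", b"\x1f"  # field/record terminators, subfield delimiter

def build_record(leader, fields):
    """Serialize (tag, data) pairs, recomputing the directory, the base
    address (leader positions 12-16), and record length (positions 0-4)."""
    directory, body = b"", b""
    for tag, data in fields:
        data += FT
        directory += f"{tag}{len(data):04d}{len(body):05d}".encode()
        body += data
    base = 24 + len(directory) + 1          # leader + directory + terminator
    total = base + len(body) + 1            # + record terminator
    leader = f"{total:05d}".encode() + leader[5:12] + f"{base:05d}".encode() + leader[17:]
    return leader + directory + FT + body + RT

def parse_fields(record):
    """Walk the directory and return (tag, data) pairs, terminators stripped."""
    base, pos, fields = int(record[12:17]), 24, []
    while record[pos:pos + 1] != FT:
        tag = record[pos:pos + 3].decode()
        length, start = int(record[pos + 3:pos + 7]), int(record[pos + 7:pos + 12])
        fields.append((tag, record[base + start:base + start + length - 1]))
        pos += 12
    return fields

leader = b"00000nam a2200000 a 4500"
fields = [("245", b"10" + SF + b"aManaged care.")]
# Mirror holdings into Koha's default item-level field (values hypothetical):
fields.append(("952", b"  " + SF + b"bMAIN" + SF + b"pPSY-0001"))
record = build_record(leader, fields)
```

Round-tripping with `parse_fields(record)` returns the same (tag, data) pairs, which is the property a bulk import relies on.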

> Plus, doing things this way you will save any trouble trying to edit with
> Koha's editor.

> As has been cautioned before (many times), Christine, please remember to
> keep your original (in this case OCLC) MARC records, as there are elements in
> MARC records which Koha does not currently handle correctly and some which
> it just plain ignores.
I don't see any value at this point in keeping the original MARC records
as the bulkmarcimport tool doesn't rely on the frameworks to determine which
elements of the MARC record to save. In the 2.2 series, there is 0 loss
while importing. No ILS makes full use of MARC; it would be absurd if
one did. The important questions to ask are "is the system going to lose
information on import?" and "can the system export the records again in
valid MARC?" Koha will not lose information, and it will export in valid
MARC.

Where there is a problem is when a library has not properly set up
the frameworks to meet its needs (adding control fields, etc.). See
below.

>
> (Given your library type and depending on your long-term interest in
> keeping with Koha, Christine, you may well want to spend the time
> configuring Koha to include display of the MARC control fields it would
> otherwise ignore: see Thomas Dukleth's blessedly helpful explanation
> on this in the comments to the defining Systems Parameters section in
> Stephen Hedge's Koha User's Guide at
>    URL <http://www.kohadocs.org/usersguide/ch01s03.html#lowtags> .)
If you'd like an example of how this might look, please see this MARC
edit screen:
http://koha.liblime.com/cgi-bin/koha/acqui.simple/addbiblio.pl

> > i'm pretty sure they're all sitting in the breeding area right now waiting
> > to be uploaded,  (must i do this one at a time?)
The breeding farm is for records that don't have any holdings information
attached, not for records that already have holdings. I'd follow Steven's
advice and use bulkmarcimport (though I would recommend against manually
editing the files to update the holdings location).

Cheers,

--
Joshua Ferraro               VENDOR SERVICES FOR OPEN-SOURCE SOFTWARE
President, Technology       migration, training, maintenance, support
LibLime                                Featuring Koha Open-Source ILS
[hidden email] |Full Demos at http://liblime.com/koha |1(888)KohaILS

Re: Re: marc import

Joshua Ferraro-3
In reply to this post by Christine Forte
On Wed, Feb 22, 2006 at 08:56:53PM -0600, Steven F. Baljkas wrote:
> That said, Christine, you may need to revisit the call numbers used because
> Koha is still a touch fussy because it was originally designed for a
> specific type of call number (Dewey) and I'd bet that Christine will be
> making use of unique LC numbers.
Koha is able to support unique LC numbers so long as your MARC links
are set up correctly.

> Third: Joshua, I am sorry I keep having to repeat this, but the simple
> fact is that Koha doesn't properly use MARC records and so it is prudent
> -- and in fact, at least in my legal jurisdiction and probably in yours,
> too, Christine, since it would fall under the legal category of **due
> diligence** -- to protect the data that you have already harvested AT
> COST against loss (OCLC or not, your cataloguing department presumably
> isn't run by volunteers, directed by volunteers, on donated computers with
> free power from the electric company).
> In the past, I have worked at libraries where we did double back-ups of
> both our raw retrieved records and system-specific edited records (call it
> paranoid if you want, but it can help prevent disasters).
First off, I'm all for being paranoid. By all means, make backups. However,
I must reemphasize one fact: there is a major difference between 'properly
using' MARC records and 'properly storing' MARC records. It's true that
Koha does not _use_ MARC records to the fullest extent possible ... see
below ... however, it's misleading to imply that Koha can't _store_ MARC
properly.

>
> Fourth: yes, Virginia, there is a Santa Claus, but he doesn't give stuff
> away for free.
>
> This is my smarmy way of saying that, yes, there are ILS that actually
> do use MARC completely, Joshua. Think Voyager for one. There are others.
> Of course, they are not free and they do not respond to users' requests/
> demands/pleas for improvements/corrections/customisations at nearly the
> rate Koha attempts.
This is simply not true: no ILS fully uses MARC, they all use a subset
of it (which is usually how standards are adopted, think SQL, POSIX, W3C
and even Z39.50). If you've ever attempted to program something to a
standard you'll know why most standards are never fully realized :) Let
me cite just one example.

Standard MARC requires the following of the 245 tag:

If $f $g $h $k follow $b they refer to subtitle. If they follow $a they
refer to title. No ILS that I've ever seen handles this in its indexes.

As a friend once remarked: if the sun shines in Ecuador, then 245z
equals title in Swahili.
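The positional rule for 245 described above can be illustrated with a short sketch (illustrative only, assuming ordered (code, value) pairs as input; no claim is made about how any particular ILS indexes these subfields):

```python
def classify_245(subfields):
    """Given ordered (code, value) pairs from a 245 field, report whether
    each $f/$g/$h/$k follows $a (so it modifies the title) or follows $b
    (so it modifies the subtitle), per the positional rule above."""
    context, roles = None, {}
    for code, _value in subfields:
        if code in ("a", "b"):
            context = "title" if code == "a" else "subtitle"
        elif code in ("f", "g", "h", "k") and context is not None:
            roles[code] = context
    return roles

# $h directly after $a modifies the title ...
assert classify_245([("a", "Hamlet"), ("h", "[videorecording]")]) == {"h": "title"}
# ... but the same $h after $b modifies the subtitle.
assert classify_245([("a", "Hamlet :"), ("b", "a tragedy"),
                     ("h", "[sound recording]")]) == {"h": "subtitle"}
```

An index built on the flat subfield codes alone, without this positional context, is exactly the kind of semantic loss being described.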

Many bits of semantic information are lost the moment you import your
records into an ILS because the ILS doesn't have any reasonable way to
present such a vast semantic domain to humans in a way that it ends
up improving our use of the system.

Of course, there is also the issue of what you mean by MARC. Are
you referring to MARC syntax, MARC data elements, AACR? Another topic,
I suppose, though feel free to clarify...

> But, PLEASE, PLEASE, PLEASE, Joshua et alia, don't get into the
> habit of excusing Koha's current deficiencies in that regard by
> assuming that nothing else gets it completely right either!
>
> Joshua, I know that attitude isn't typical of what you have expressed in
> the past; it certainly isn't on the progressive path that people need to
> be on if problems are ever to be fixed and the software improved for
> everyone.
Of course, my goal as release manager for 3.0 is to improve Koha's use
of MARC. That's the whole reason we're switching to Zebra. I am aware of
our current limitations with regard to handling, for instance, Standard
MARC Holdings, and I fully intend to fix those limitations so that Koha
will be a strong candidate for even the most MARC-centric libraries.
 

>
> (And for the record, yes, I realise it is easy for us who were
> trained to catalogue to spot and harp on Koha's problems with
> cataloguing. Similarly, yes, I realise that I and most cataloguing-
> oriented folk have not contributed and likely will never contribute
> anything concrete towards resolving the problems we can detect,
> other than pointing them out to the talented programmers and
> developers who (may) have the skill set to solve the problems. And
> as a final concession/confession, yes, it is true that most ILS have
> quirks with their cataloguing components and how they handle MARC and
> that many don't use MARC completely/completely correctly: the fact
> that Koha has quirks isn't the problem; it is a problem that it doesn't
> completely handle MARC by the MINIMUM STANDARD rules yet -- because
> you have been so helpful to me, Joshua, I could clarify what that
> entails, but I can guarantee you would not find it pretty).
It's worth drawing yet another distinction, between Koha's ability
to _catalog_ Standard MARC and Koha's ability to _store_ Standard MARC.
Koha can store Standard MARC fine. And given the proper frameworks, it
can catalog Standard MARC as well. If you don't believe me, take a look
at the proof-of-concept here:

http://koha.liblime.com/cgi-bin/koha/acqui.simple/addbiblio.pl

That demo handles a Leader and Directory, 003, 005, 006, 007, 008,
and to my knowledge can catalog a BOOK material designation as per
the MARC standard. If you find otherwise, please let me know.

Also, I should point out that I don't know of any other MARC editor
that provides such a great interface for selecting possible values
for the fixed fields ...

The problem is, none of our clients want to catalog using 'Standard
MARC', so no one has bothered to write a complete set of frameworks to
accommodate that. Also, no professional cataloger that I know of has
taken the time to customize his/her own framework to support the
subset of the standard that he/she needs.

>
> Fifth: I have yet to see any example of Koha-exported MARC, so I
> cannot say definitively, but I would suggest that any system that
> does the violence of separating the record into multiple levels may
> not be capable of reintegrating into valid MARC. I would have to see
> actual samples to know. I have yet to receive any takers on that (and I
> have requested Koha-ised MARC exports in the past).
I'd be happy to supply some MARC records exported from Koha ... you
could of course export some yourself from the LibLime demo:

http://koha.liblime.com/cgi-bin/koha/export/marc.pl

> To begin with the obvious, though, Joshua, exactly how do you propose
> that Koha re-encodes the Directory portion (the series of numerals
> coding tag occurrence and field length, immediately following the Leader),
> when it currently ignores all the control fields? Any changes a
> Koha-library made to a record would be lost; any attempt to export that
> record would result in invalid MARC as the Directory, even if kept
> somewhere, would match the original MARC, not the edited version.
Again, you're missing the point. Koha doesn't need to 'use' the MARC
record to properly store it and export it, just like a directory
doesn't need to 'use' the MARC to store a MARC file. The field length
in the Directory portion is managed by the MARC::Record module.
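A sketch of what "managed by the module" means here (standard-library Python standing in for MARC::Record): the Directory is purely derived data, recomputed from the current field contents on every serialization, so an edited field can never leave a stale length or offset behind.

```python
def rebuild_directory(fields):
    """Recompute ISO 2709 directory entries (3-char tag, 4-digit length,
    5-digit offset) from the current (tag, data) pairs. Lengths include
    the 1-byte field terminator, per the MARC transmission format."""
    entries, offset = [], 0
    for tag, data in fields:
        length = len(data) + 1              # +1 for the field terminator
        entries.append(f"{tag}{length:04d}{offset:05d}")
        offset += length
    return "".join(entries)

before = rebuild_directory([("245", b"10\x1faHamlet")])
after = rebuild_directory([("245", b"10\x1faHamlet, revised")])
assert before == "245001100000"   # 10 bytes of data + terminator = length 0011
assert after == "245002000000"    # the edit is reflected automatically
```

Because the Directory is regenerated like this on export, a record edited inside the system can still be written out as structurally valid MARC.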

> Sixth: yes, the majority of problems Koha-ites complain of/question on
> list-serv do indeed seem to be, as Joshua suggests, occurrences of
> improper or incomplete configuration by the new would-be Koha library.
>
> That admitted, Joshua, you, like most of the programming-savvy, tend to
> forget that most of us actually working in library science/library tech
> are used to turn-key systems that require a great deal less set-up. (Take
> Voyager circulation config for one small example: it can autofill the
> circulation fields saving a lot of time Koha wastes requiring cut and paste.)
Absolutely, which is why some of us have formed support companies to help
out with that process. (BTW: autofill circulation fields, I'm curious;
could you expand on that?)
>
> That is not a complaint, in and of itself, just an observation of
> differences in experience that are apt to cause friction unless taken
> into account.
I think it's a healthy conversation. I'm certainly not going to claim
that I've got as much direct knowledge about the ins and outs of the
MARC standard as you; I know enough to build some tools that can manage
MARC correctly though, and I can read the free online specifications.
Certainly, if you are willing to lend help in defining how Koha should
'use' MARC in ways that it doesn't currently, I'm all ears.

> (I still think it would be ideal, one day in the distant future, to get
> Koha to a level where it would behave like a more typical turn-key ILS:
> that said, I do like the fact that Koha reminds us about all that goes
> into making the system work and what configuration really requires.)
ditto

I truly hope we can get a good thread going here ... I think it would be
good for the project if we addressed these issues now, as we're designing
the next major version of Koha.

Sincerely,

--
Joshua Ferraro               VENDOR SERVICES FOR OPEN-SOURCE SOFTWARE
President, Technology       migration, training, maintenance, support
LibLime                                Featuring Koha Open-Source ILS
[hidden email] |Full Demos at http://liblime.com/koha |1(888)KohaILS

Re: marc import

Stephen Hedges
In reply to this post by Joshua Ferraro-3
At the risk of feeding a flame war...

I don't understand why people think Koha is somehow deficient in MARC
support.  (Most of the "problems" I've seen are a result of not reading
the manual!)  When I got my MLS, I was taught that MARC is a way to store,
retrieve and share records between libraries.  I don't remember a single
word being said about how the records are displayed or manipulated by an
ILS.  If you can save, edit, and search on all valid MARC fields, then you
have full MARC support.  Since Koha can do exactly that when it's set up
properly (as well as importing and exporting MARC records), then Koha has
full MARC support.  Period.

NPL has been using Koha for 29 months now, and we haven't had any unusual
problems with any of our MARC records.  To imply that Koha is somehow
"MARC-deficient" may work in the hallowed halls of theory, but in the real
world of day-to-day library operations, it's just not true.

Someday soon, when all of the rest of the world is using XML and only a
few of us MLS folks still give a damn about MARC, I expect to see Koha as
one of the few ILSs flexible enough to still survive.

Stephen


--
Stephen Hedges
Skemotah Solutions, USA
www.skemotah.com  --  [hidden email]


Re: marc import

BWS Johnson
In reply to this post by Christine Forte
Salve!


>I don't understand why people think Koha is somehow deficient in MARC
>support. (Most of the "problems" I've seen are a result of not reading
>the manual!) When I got my MLS, I was taught that MARC is a way to store,
>retrieve and share records between libraries. I don't remember a single
>word being said about how the records are displayed or manipulated by an
>ILS. If you can save, edit, and search on all valid MARC fields, then you
>have full MARC support. Since Koha can do exactly that when it's set up
>properly (as well as importing and exporting MARC records), then Koha has
>full MARC support. Period.
>

The 650 x problem is a big deal. Other than that, I agree. Now, if there is a way to make the subject subheadings play nice with their bigger brothers, let me know. I do like the keyword search - it blows other ILSs out of the water. Much of this falls under "save the time of the reader" to me.


>NPL has been using Koha for 29 months now, and we haven't had any unusual
>problems with any of our MARC records. To imply that Koha is somehow
>"MARC-deficient" may work in the hallowed halls of theory, but in the real
>world of day-to-day library operations, it's just not true.
>


I generally tell people that most librarians wouldn't notice, and almost all users won't. To me, it boils down to this: I'm not paying for the fix, so if the fix comes slowly, great. As it is, the support with Koha is way better than commercial products. Frankly, and I know I'll get burned at the stake for this, Koha was really meant as a small to medium public library product in a nation that couldn't give a fig about MARC. Having perfect MARC compatibility is an academic or medium to large library problem.


>Someday soon, when all of the rest of the world is using XML and only a
>few of us MLS folks still give a damn about MARC, I expect to see Koha as
>one of the few ILS's that is flexible enough to still survive.


This I wholeheartedly agree on. How I already grin at the thought of being able to easily link my ebooks! It's awesome that Koha was all set to do this.

Cheers,
Brooke @ Hinsdale MA

Re: Re: marc import

Joshua Ferraro-3
In reply to this post by Christine Forte
On Thu, Feb 23, 2006 at 04:18:02PM -0600, Steven F. Baljkas wrote:
> Given the existence of the MARC21 standards, one such expectation is
> that an ILS will be able to handle the data encoded in a record WITHOUT
> the necessity of teaching the ILS about every field and subfield. In my
> library science training, that was considered to be the responsibility of
> programmers and developers: not library staff. (I daresay if programmers
> had to start correcting flaws in programming languages touted as ready for
> use, it wouldn't be seen as accommodating to tell them that there are
> easy solutions at hand.)
You have to understand, Steven, that most of the programmers working on
Koha have been paid by libraries to develop the functionality that it
currently has. They were paid to set things up the way that _those_ libraries
needed things set up.

Nelsonville is still running version 2.2.0 by the way, and their frameworks
haven't been set up properly to handle the 650s. They will be upgrading
shortly, and when they do, I think you'll be hard pressed to find a problem.

> All the confessions aside, thus, I see no point in pretending we are
> having an open conversation towards improving Koha's handling of MARC
> when the retorts on the programming side are consistently of the kind
> that 'it works well enough' and 'nobody but we MLS will care'. I daresay
> that if you were to continue with that line of thinking you would see,
> without any contribution on my part, how quickly it leads to disaster.
That's not what I'm saying at all. My point is simple:

Koha's MARC Frameworks can fully support the MARC Bibliographic Standard if
you set them up properly.

> Up until now, I always hoped that Koha would one day be 100% MARC-
> compliant -- able to understand and make use of (import, store, index,
> retrieve, display and export) valid MARC data -- but I am beginning to
> feel that is naive optimism on my part. I have always agreed with
> Brooke's reasoning -- 'I am not paying for the fix so if the fix comes
> slowly, great' -- and have been following Koha patiently for 4 years now
> because of that reasoning. But there has to be hope to make waiting
> worthwhile and it seems strained when the constant retort is that it
> already works.
Steven ... if you find a problem with the MARC support on the
http://koha.liblime.com site, please let me know. Thus far, all the
'problems' you have pointed to are relics of the past and have been
eliminated in 2.2.5.

> Brooke is incredibly generous in excusing the failure of handling the
> 650s problem. It has been almost 2 years since it was first detected.
> This would be considered a fatal flaw to any commercial ILS, great
> support notwithstanding.
I already pointed out to you that that problem has been fixed in the
latest version of Koha.

> My question would be, given what Joshua has shown, for example, with the
> Control Fields, when common sense demands ease of use, library science
> demands the traditional access points, and when the standards as they
> have emerged demand integrated access to what has been deemed -- by the
> relevant authorities -- core fields as a mark of compliance, WHY are
> these things left to be entered in at all by the end user? (You do not
> seem to take nearly the same casual attitude towards the much simpler
> matters involved in acquisitions.)
>
> Would it not be easier, more logical and more compliant to have them
> hard-wired in? If someone wants to ignore them, isn't it easier just to
> have the option to delete a field from view (whether that is in a
> standard view, an ISBD view or a MARC view)? (Again to take
> Acquisitions, you have different modes available there: was that not a
> similar kind of decision to what I am questioning?)
In fact, I've hired a consultant to do just that. He's systematically
setting up the MARC Frameworks to be fully compliant with Standard
MARC Bibliographic Records. When he's done, the frameworks will be
included in CVS and released with version 2.2.6 of Koha.

> I don't think, Brooke, you should fear being burned at the stake for
> your assessment of Koha's library type focus. I think what you said is
> both remarkably honest and completely accurate. I guess I just hoped
> that Koha would be expanding its domain rather than concentrating.
We're definitely aiming high with Zebra. I fully intend to have Koha
up-and-running in large public and academic libraries in the near
future.

Cheers,

--
Joshua Ferraro               VENDOR SERVICES FOR OPEN-SOURCE SOFTWARE
President, Technology       migration, training, maintenance, support
LibLime                                Featuring Koha Open-Source ILS
[hidden email] |Full Demos at http://liblime.com/koha |1(888)KohaILS