Can my ILS be added to MarcEdit’s ILS Integration?

By reeset / On / In Innovative Interfaces, Koha, Library, MarcEdit

This question has shown up in my email box a number of times over the past couple of days.  My guess is that it's related to the YouTube videos recently posted demonstrating how to set up and use MarcEdit directly with Alma.

  1. Windows Version: https://youtu.be/8aSUnNC48Hw
  2. Mac Version: https://youtu.be/6SNYjR_WHKU

 

Folks have been curious how this work was done, and whether it would be possible to do this kind of integration with their local ILS system.  As I was answering these questions, it dawned on me that others may be interested in this information as well — especially if they are planning to speak to their ILS vendor.  So, here are some common questions currently being asked, and my answers.

How are you integrating MarcEdit with the ILS?

About 3 years ago, the folks at Koha approached me.  A number of their users make use of MarcEdit and had wondered if it would be possible to have MarcEdit work directly with their ILS system.  I love the folks over in that community — they are consistently putting out great work, and had just recently developed a REST-based API that provided read/write operations into the database.  Working with a few folks (who happen to be at ByWater Solutions, another great group of people), I was provided with documentation, a testing system, and a few people willing to give it a go, so I started working to see how difficult it would be.  The whole time I was doing this, I kept thinking it would be really nice if I could do this kind of thing with our Innovative Interfaces (III) catalog.  While III didn't offer an API at the time (and for the record, as of 4/17/2017, they still don't offer a viable API for their product outside of some toy APIs for dealing primarily with patron and circulation information), I started to think beyond Koha and realized that I had an opportunity not just to create a Koha-specific plugin, but to use this integration as a model for developing an integration framework in MarcEdit.  And that's what I did.  MarcEdit's integration framework can potentially handle the following operations (assuming the system's API provides them):

  1. Bibliographic and Holdings Records Search and Retrieval — search can be via API call, SRU or Z39.50
  2. Bibliographic and Holdings Records creation and update
  3. Item record management

 

I’ve added tooling directly into MarcEdit that supports the above functionality, allowing me to plug and play an ILS based on the API that it provides.  The benefit is that this code is available in all versions of MarcEdit, so once the integration is created, it works in the Windows version, the Linux version, and the Mac version without any additional work.  If a community were interested in building a more robust integration client, then I (or they) could look at developing a plugin — but this would be outside of the integration framework, and it takes a significant amount of work to make a plugin cross-platform compatible (given the significant differences in UI development between Windows, macOS, and Linux).
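To make the plug-and-play idea a little more concrete, here's a rough sketch of the kind of contract an integration framework like this might expose.  This is not MarcEdit's actual internal code; the interface name and method signatures are purely illustrative.

// Illustrative only — not MarcEdit's actual internal API.
// A plug-and-play ILS integration boils down to implementing a small contract
// like this against whatever endpoints the vendor's API exposes.
public interface IIlsIntegration
{
    // Authenticate against the ILS API (direct call, or a session/token exchange)
    bool Authorize(string user, string password);

    // Search/retrieve bibliographic (and ideally holdings) data — via API call, SRU, or Z39.50
    string GetRecord(string id);

    // Create (id == null) or update (id != null) a bibliographic record
    bool UpdateRecord(string record, string id);

    // Item record management, where the API supports it
    bool UpdateItem(string recordId, string itemData);
}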

This sounds great, what do you need to integrate my ILS with MarcEdit?

This has been one of the most common questions I’ve received this weekend.  Folks have watched or read about the Alma integration and wondered if I can do it with their ILS.  My general answer, and I mean this, is that I’m willing to integrate any ILS system with MarcEdit, so long as they can provide API endpoints that make it possible to:

  1. Search for bibliographic data (holdings data is a plus)
  2. Allow for the creation and update of bibliographic data
  3. Utilize an application-friendly authentication process that, ideally, allows the tool to determine user permissions

 

This is a pretty low bar.  Basically, an API just needs to be present; and if there is one, then integrating the ILS with MarcEdit is pretty straightforward.

OK, so my ILS system has an API, what else do I need to do?

This is where it gets a bit trickier.  ILS vendors tend not to work well with folks who are not their customers or who are not other corporations.  I’m generally neither, and for the purposes of this type of development, I’ll always be neither.  This means that getting this work to happen generally requires a local organization within a particular ILS community to champion the development; by that, I mean either providing introductions to the necessary people at the ILS vendor, or providing access to a local sandbox so that development can occur.  This is how the Alma integration was first initiated.  There were some interested folks at the University of Maryland who spent a lot of time working with me and with ExLibris to make it possible for me to do this integration work.  Of course, after I got started and this work gained some interest, ExLibris reached out directly, which ultimately made this a much easier process.  In fact, I’m rarely impressed by our ILS community, but I’ve been impressed by the individuals at ExLibris on this specifically.  While it took a little while to get the process started, they have open documentation and, once we got started, have been very approachable in answering questions.  I’ve never used their systems, and I’ve had other dealings with the company that have been less positive, but in this case, ExLibris’s open approach to documentation is something I wish other ILS vendors would emulate.

I’ve checked, we have an API and our library would be happy to work with you…but we’ll need you to sign an NDA because the ILS API isn’t open

Ah, I neglected to mention above one of my deal-breakers, and why I have not, at present, worked with the APIs that I know are available in systems like Sirsi: I won’t sign an NDA.  In fact, in most cases, I’ll likely publish the integration code for those that are interested.  But more importantly, and I can’t stress this enough, I will not build an integration into MarcEdit for an ILS system where the API is something that must be purchased as an add-on service, or that requires an organization to purchase a license to “unlock” API access.  API access is a core part of any system, and the ability to interact, update, and develop new workflows should be available to every user.

I have no problem with ILS vendors building closed-source systems (MarcEdit is closed source, even though I release large portions of the components into the public domain to simplify supporting the tool), but if you are going to develop a closed-source tool, you have a responsibility to open up your APIs and provide meaningful gateways into the application to enable innovation.  And let’s face it, ILS systems have sucked at this, much to the library community’s detriment.  This really needs to change, and while the ability to integrate with a tiny, insignificant tool like MarcEdit isn’t going to make an ILS system more open, I do get to choose where I spend my time, and I have chosen to put development time only into integration efforts for ILS systems that understand their communities need choices and that actively embrace their communities’ ability to innovate.

What this means, in practical terms, is that if your ILS system requires you or me to sign an NDA to work with the API, I’m out.  If your ILS vendor requires its customers to pay for access to the API through an additional license, training, or an add-on to the system (and this one particularly annoys me), I’m out.  As an individual, you are welcome to develop the integrations yourself as a MarcEdit plugin, and I’m happy to answer questions and help individuals through that process, but I will not do the integration work in MarcEdit itself.

I’ve checked, my ILS system API meets the above requirements, how do we proceed?

Get in touch with me at reeset@gmail.com.  The actual integration work is pretty insignificant (I’m just plugging things into the integration framework); usually, the most time-consuming part is getting access to a test system and documenting the process.

Hopefully, that answers some questions.

–tr


C# Koha API Updates

By reeset / On / In Koha

I’ve updated the C# Koha API library that MarcEdit utilizes to include support for item updates when working with Koha version 3.8+.  Additionally, I added to the git repository a reference application that demonstrates how to use the library.  You’ll need to provide your own system and credentials to try it, but it should demonstrate how to make it all work together.

GitHub Repository: https://github.com/reeset/koha_api

–tr

Working with the Koha ILS HTTP API

By reeset / On / In General Computing, Koha

I’ve been spending the last week working with the Koha API, using it as an example for MarcEdit’s direct ILS integration platform.  After spending some time working with it and pushing some data through it, I have a couple of brief thoughts.

  1. I was pleasantly surprised at how easy the API was to work with.  Generally, the need for good authentication often stymies many a good API design because the process for doing and maintaining authentication becomes so painful.  I found the cookiejar approach that Koha implemented to be a very simple one to support and work with.  What’s more, error responses when working with the API tended to show up as HTTP status codes, so it was easy to work with them using existing HTTP tooling (see the short sketch after this list).
  2. While the API is easy to use, it’s also really, really sparse.  There isn’t a facility for deleting records, and I’m not sure if there is an easy way with the API to affect holdings for a set of records.  I do know you can create items, but I’m not sure if that is a one-off that occurs when you pass an entire bib record for update, or if there is a separate API that works just for item data.  Search is also disappointing.  There is a specific API for retrieving an individual record’s data, but the search API is essentially Z39.50 (or SRU).  I’m not particularly enamored with either, though Z39.50 works (and I’m told that it’s fairly universal in terms of implementation).  I’ve never really liked SRU, so it didn’t hurt my feelings too much to not work with it.  However, after spending time working with the Summon search API for other projects here at Oregon State, I was disappointed that search wasn’t something that the API specifically addressed.
  3. The API documentation leaves much to be desired.  I was primarily utilizing the wiki (http://wiki.koha-community.org/wiki/Koha_/svc/_HTTP_API), which includes a single page on the API.  The page provides some simple demonstrations to show usage, which are really helpful.  What is less helpful is the lack of information regarding what happens when an error occurs.  The authentication API returns an XML file with a status message; however, all other APIs return HTTP status codes.  This caught me a little by surprise, given the authentication response; it would be nice if that information were documented somewhere.
  4. One thing that I couldn’t find in the documentation (so I really can’t answer this question) is the impact of the API on system resources.  The API seems really to be geared towards working with individual records.  Well, MarcEdit is a batch records tool.  So, in my testing, I tried to see what would happen if I uploaded 1010 records through the API.  The process finished, sluggishly, but it appeared that uploading records through the API at high rates was having an impact on system performance.  The upload process itself slowed considerably as the records were fed through the API.  But more curious: after the process finished, I had to wait about 15 minutes or so for all the records to make it through the workflow.  I’m assuming the API must queue items coming into the system, but this made it very difficult to verify successful uploads, because the API was reporting success while the data changes were not visible for a considerable amount of time.  Since I’ve never worked in a library that ran Koha in a production environment, I’m not sure if this type of record queuing is normal, but a better description of what is happening in the documentation would have been nice.  When I first started working with the API, I actually thought that the data updates were failing because I was expecting the changes to appear in real time in the system; my experience, however, seemed to indicate that they do not.
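As a quick illustration of the point about HTTP status codes above: from C# (the language MarcEdit’s Koha library is written in), checking the result of a /svc call can be as simple as catching the status code on the response.  The host and record id below are hypothetical, and a real call would also need the session cookie from the authentication step.

using System;
using System.Net;

// Minimal sketch: check the HTTP status code returned by a Koha /svc call.
// The host and record id are hypothetical; a real request would also carry
// the session cookie obtained from /svc/authentication.
class SvcStatusCheck
{
    static void Main()
    {
        try
        {
            var request = (HttpWebRequest)WebRequest.Create(
                "http://koha.example.org/cgi-bin/koha/svc/bib/123");
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Status: " + (int)response.StatusCode);  // 200 on success
            }
        }
        catch (WebException ex)
        {
            // Non-2xx responses surface as WebExceptions; the status code
            // is available on the attached response object.
            var error = ex.Response as HttpWebResponse;
            if (error != null)
                Console.WriteLine("Error status: " + (int)error.StatusCode);
        }
    }
}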

Anyway – those are my quick thoughts.  I need to caveat these notes by saying I have never worked at a library where Koha has been used in production, so maybe some of these behaviors are common knowledge.

 

–TR

Using the Koha API with C#

By reeset / On / In Koha, MarcEdit

I’ve been tinkering with the Koha API to allow MarcEdit to do some direct ILS integration with Koha-based systems.  When I first agreed to look at doing this work, I wasn’t sure what the API looked like.  Fortunately for me, the folks that put it together made it simple and easy to work with.  There are a few gotchas when working with C#, and while I’ll be posting the source code for the Koha library that I’ve developed in C# on my GitHub account, I thought I’d post some of my initial notes for those that are interested.

Essentially, there are two things you need to do to work with the Koha API.  First, you need to authenticate yourself.  Upon authentication, Koha provides session data that is maintained as a cookie, and that cookie must be passed as part of future requests to the API.  Generally, this process is straightforward in that you create a cookiejar.  In C#, this looks like the following:

private bool Authorize()
{
    // Koha's /svc/authentication endpoint takes the credentials on the query string
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(var_host + "/cgi-bin/koha/svc/authentication?userid=" + var_user + "&password=" + var_pass);
    request.CookieContainer = cookieJar;

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    System.IO.StreamReader reader = new System.IO.StreamReader(response.GetResponseStream(), Encoding.UTF8);
    string tmp = reader.ReadToEnd();

    // Add the session cookies to the CookieJar (Cookie Container) so they can be
    // reused on subsequent requests
    foreach (Cookie cookie in response.Cookies)
    {
        cookieJar.Add(new Cookie(cookie.Name.Trim(), cookie.Value.Trim(), cookie.Path, cookie.Domain));
    }

    reader.Close();
    response.Close();

    // The authentication response is a small XML document; a successful login
    // includes an "ok" status
    if (tmp.IndexOf("ok") > -1)
    {
        return true;
    }
    else
    {
        return false;
    }
}

The cookie jar in this case is scoped as global to the class; this way, the session information doesn’t need to be regenerated unless the session expires.
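For completeness, the snippets in this post assume a handful of class-level fields that aren’t shown above.  The names below are taken from the snippets; the values are placeholders you’d swap for your own system.

// Assumed class-level state used by the snippets in this post (field names match the snippets).
// CookieContainer and Cookie come from System.Net; Encoding comes from System.Text.
private CookieContainer cookieJar = new CookieContainer();
private string var_host = "http://koha.example.org";   // placeholder: base URL of your Koha install
private string var_user = "api_user";                  // placeholder: account with cataloguing permissions
private string var_pass = "api_password";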

Retrieving individual records is also supported by the API:

private string GetRecord(string id)
{
    // /svc/bib/{id} returns the bibliographic record as MARCXML
    string uri = var_host + "/cgi-bin/koha/svc/bib/" + id + "?userid=" + var_user + "&password=" + var_pass;
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uri);
    request.CookieContainer = cookieJar;

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    System.IO.StreamReader reader = new System.IO.StreamReader(response.GetResponseStream(), Encoding.UTF8);
    string tmp = reader.ReadToEnd();

    reader.Close();
    response.Close();

    return tmp;
}

In the case of Koha, the GetRecord function returns data in MARCXML format, which is my preferred method for retrieving data.  For general search and retrieval, while Koha supports SRU, the most reliable search API is still Z39.50 (sadly).
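For reference, what comes back is a standard MARCXML document; the abbreviated record below (with made-up values) shows the general shape:

<record xmlns="http://www.loc.gov/MARC21/slim">
  <leader>00000nam a2200000 a 4500</leader>
  <controlfield tag="001">123</controlfield>
  <datafield tag="245" ind1="1" ind2="0">
    <subfield code="a">Sample title :</subfield>
    <subfield code="b">an illustrative MARCXML record.</subfield>
  </datafield>
</record>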

Finally, when you want to update or create a new record, you need to push the data up to the server in MARCXML format.  If the data is an update, you pass a record id; if it’s a new record, you don’t.

private bool UpdateRecord(string rec)
{
    // No id means the record should be created as a new bib
    return UpdateRecord(rec, null);
}

private bool UpdateRecord(string rec, string id)
{
    string uri = "";
    if (id == null)
    {
        // Create a new bibliographic record
        uri = var_host + "/cgi-bin/koha/svc/new_bib?userid=" + var_user + "&password=" + var_pass;
    }
    else
    {
        // Update an existing bibliographic record
        uri = var_host + "/cgi-bin/koha/svc/bib/" + id + "?userid=" + var_user + "&password=" + var_pass;
    }
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(uri);
    request.CookieContainer = cookieJar;
    request.Method = "POST";
    request.ContentType = @"text/xml";

    // POST the MARCXML record in the request body
    System.IO.StreamWriter writer = new System.IO.StreamWriter(request.GetRequestStream(), System.Text.Encoding.UTF8);
    writer.Write(rec);
    writer.Flush();
    writer.Close();

    HttpWebResponse response = (HttpWebResponse)request.GetResponse();

    System.IO.StreamReader reader = new System.IO.StreamReader(response.GetResponseStream(), Encoding.UTF8);
    string tmp = reader.ReadToEnd();

    reader.Close();
    response.Close();

    return true;
}
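Putting the pieces together, a minimal (and purely hypothetical) driver inside the same class might look like the following; error handling is omitted, and the method name is just illustrative.

// Hypothetical usage sketch tying the snippets together (inside the same class).
// Authorize once, pull a record down as MARCXML, then push it back up as an update.
public bool RoundTripRecord(string bibId)
{
    if (!Authorize())
        return false;                      // bad credentials or unreachable server

    string marcxml = GetRecord(bibId);     // MARCXML for the requested bib record

    // ... modify the MARCXML here as needed ...

    return UpdateRecord(marcxml, bibId);   // passing the id makes this an update, not a create
}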

And that’s pretty much it. Simple and straightforward. Koha supports a few more APIs (http://wiki.koha-community.org/wiki/Koha_/svc/_HTTP_API), but for my immediate purposes, these are the couple that I need to support some very simple integration with the ILS. Ideally, at some point, it would be nice to see these APIs also support automated deletion of records, as well as maybe an ability to set holdings/items — but for now, this is good enough. I’m sure that if these other functions are needed, the communities themselves will push for them, and when they show up, MarcEdit will indirectly benefit.

–TR