Helge,
I think there may be two dimensions to this problem, and I’m copying the solrmarc-tech list into the thread in case anyone there has further feedback/suggestions.
Thanks for raising this issue; I look forward to hearing more on the subject from others!
- Demian
From: Ahrens, Helge <Helge....@ulb.hhu.de>
Sent: Friday, February 18, 2022 9:17 AM
To: vufin...@lists.sourceforge.net
Subject: [EXTERNAL] [VuFind-Tech] Revival: MARC::Record Size exceeding 99999 Bytes
Dear group,
today we encountered an error concerning the MARC::Record size exceeding 99999 bytes. A search for information led me to conversations on this list and on the Koha forum from around 2012 (links below). Unfortunately, I feel there has never really been a solution for the case of existing records that are bigger than the given limit of 99999 bytes. Correct me if I’m wrong.
So far, if I just comment out lines L86-L88 in Iso2709.php ( https://github.com/vufind-org/vufind/blob/2969196c94df2f97b839debec20a2399c97e8f44/module/VuFind/src/VuFind/Marc/Serialization/Iso2709.php#L86-L88 ), the record is displayed without problems; with those lines in place, the exception causes VuFind to crash.
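For anyone following along: the 99999-byte ceiling comes from the ISO 2709 binary format itself, whose 24-byte leader stores the total record length as five ASCII digits. Here is a minimal illustrative sketch in Python (not VuFind's actual PHP code; function names are my own) showing why no binary MARC record can declare a larger size:

```python
# Sketch only: the ISO 2709 leader stores the total record length in its
# first five bytes as ASCII digits, so a binary MARC record can never
# declare more than 99999 bytes. Function names are illustrative.

def leader_record_length(record: bytes) -> int:
    """Read the declared record length from an ISO 2709 leader."""
    if len(record) < 24:
        raise ValueError("MARC leader must be at least 24 bytes")
    return int(record[0:5])  # bytes 00-04: five ASCII digits

def fits_in_iso2709(record: bytes) -> bool:
    """True if the serialized record can be expressed in binary MARC."""
    return len(record) <= 99999

# A leader declaring a 612-byte record:
leader = b"00612cam a2200193 a 4500"
print(leader_record_length(leader))  # -> 612
print(fits_in_iso2709(b"x" * 150000))  # -> False
```

This is why formats without the fixed-width leader, such as MARCXML, have no such limit.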
Now my question is: has anyone tried to overcome this size limit, and ideally succeeded? If the search crashes purely because of such an error, it is not really explainable to the user and therefore not acceptable.
This doesn’t work: https://katalog.ulb.hhu.de/Search/Results?lookfor=%22Rheinische+Post%22&type=Title&limit=10
This works: https://katalog.ulb.hhu.de/Search/Results?lookfor=Rheinische+Post&type=Title&limit=10
Best wishes and thank you!
Helge
https://groups.google.com/g/solrmarc-tech/c/TrFs6m3DW58
https://koha-devel.koha-community.narkive.com/iUaBE7PQ/marc-record-record-length-and-leader
--
You received this message because you are subscribed to the Google Groups "solrmarc-tech" group.
To unsubscribe from this group and stop receiving emails from it, send an email to solrmarc-tec...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/solrmarc-tech/BL1PR03MB6136F344F12157800DB0FDD3E8379%40BL1PR03MB6136.namprd03.prod.outlook.com.
Thanks, Bob – I wonder if some kind of custom method that takes a list of encodings would be a solution – then you could set a prioritized order of formats, and if the formatter fails, the code moves along to the next item in the list. That might cover not just the “too long for binary” scenario but also other oddball things like “data contains characters which cannot be included in XML.”
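To make the idea concrete, here is a rough sketch (in Python, with hypothetical serializer names, nothing from the actual VuFind or SolrMarc APIs) of a fallback that walks a prioritized list of formats and moves on whenever one fails:

```python
# Hypothetical sketch of a prioritized-format fallback: try each serializer
# in order and fall through to the next on failure. Serializer names and
# signatures are illustrative, not real VuFind/SolrMarc API.

def serialize_with_fallback(record, serializers):
    """Try serializers in priority order; return (format_name, output)
    from the first one that succeeds."""
    errors = []
    for name, fn in serializers:
        try:
            return name, fn(record)
        except Exception as exc:  # e.g. too long for binary, bad XML chars
            errors.append((name, exc))
    raise RuntimeError(f"all serializers failed: {errors}")

def to_iso2709(record: bytes) -> bytes:
    """Stand-in binary MARC writer that enforces the length limit."""
    if len(record) > 99999:
        raise ValueError("record exceeds ISO 2709 length limit")
    return record

def to_marcxml(record: bytes) -> str:
    """Stand-in for a real MARCXML writer (no length limit)."""
    return "<record>...</record>"

fmt, out = serialize_with_fallback(
    b"x" * 150000,
    [("ISO2709", to_iso2709), ("MARCXML", to_marcxml)],
)
print(fmt)  # -> MARCXML
```

An oversized record fails the binary step and silently lands in the XML step, which is exactly the behavior that would keep the search from crashing.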
- Demian