PerkinElmer Informatics Support Forum
Topic Title: Columbus db fails to initialise
Topic Summary: Columbus database component fails to start due to a corruption in the Lucene Index
Created On: 8/21/2018 3:26 PM
 8/21/2018 3:26 PM


RAH

Posts: 67
Joined: 5/16/2014

There are a number of reasons why the db component may fail to initialise, so the Blitz-0.log is the first place to look for clues as to why.

In this post we are interested in one specific root cause: a corruption in the Lucene index.

On the Columbus server the log is located here:

/var/log/columbus/db/Blitz-0.log

Search the log for entries relating to 'Lucene'.
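For example, grep can pull out the relevant entries together with their line numbers. The sample log created below is purely illustrative so the sketch can be run anywhere; on the Columbus server you would point grep directly at /var/log/columbus/db/Blitz-0.log:

```shell
# Create a small sample log to demonstrate the search (illustrative only;
# on a real server, grep /var/log/columbus/db/Blitz-0.log instead).
printf '%s\n' \
  'INFO  startup in progress' \
  'Caused by: org.hibernate.search.SearchException: Unable to open Lucene IndexReader' \
  > /tmp/Blitz-0.sample.log

# -n prints line numbers, -i makes the match case-insensitive.
grep -ni 'lucene' /tmp/Blitz-0.sample.log
```

A hit on "Unable to open Lucene IndexReader" is the signature to look for.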

If you see an error entry in the log which reads:

Caused by: org.hibernate.search.SearchException: Unable to open Lucene IndexReader
at org.hibernate.search.reader.SharingBufferReaderProvider.createReader(SharingBufferReaderProvider.java:96)
at org.hibernate.search.reader.SharingBufferReaderProvider.initialize(SharingBufferReaderProvider.java:73)
at org.hibernate.search.reader.ReaderProviderFactory.createReaderProvider(ReaderProviderFactory.java:64)
at org.hibernate.search.impl.SearchFactoryImpl.<init>(SearchFactoryImpl.java:130)
at org.hibernate.search.event.ContextHolder.getOrBuildSearchFactory(ContextHolder.java:30)
at org.hibernate.search.event.FullTextIndexEventListener.initialize(FullTextIndexEventListener.java:79)
at org.hibernate.event.EventListeners$1.processListener(EventListeners.java:198)
at org.hibernate.event.EventListeners.processListeners(EventListeners.java:181)
at org.hibernate.event.EventListeners.initializeListeners(EventListeners.java:194)
... 76 more
Caused by: java.io.IOException: read past EOF
at org.apache.lucene.store.BufferedIndexInput.refill(BufferedIndexInput.java:151)
at org.apache.lucene.store.BufferedIndexInput.readByte(BufferedIndexInput.java:38)
at org.apache.lucene.store.IndexInput.readVInt(IndexInput.java:78)
at org.apache.lucene.index.FieldInfos.read(FieldInfos.java:311)
at org.apache.lucene.index.FieldInfos.<init>(FieldInfos.java:60)
at org.apache.lucene.index.SegmentReader.initialize(SegmentReader.java:341)
at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:306)
at org.apache.lucene.index.SegmentReader.get(SegmentReader.java:228)
at org.apache.lucene.index.MultiSegmentReader.<init>(MultiSegmentReader.java:55)
at org.apache.lucene.index.ReadOnlyMultiSegmentReader.<init>(ReadOnlyMultiSegmentReader.java:27)
at org.apache.lucene.index.DirectoryIndexReader$1.doBody(DirectoryIndexReader.java:102)
at org.apache.lucene.index.SegmentInfos$FindSegmentsFile.run(SegmentInfos.java:653)
at org.apache.lucene.index.DirectoryIndexReader.open(DirectoryIndexReader.java:115)
at org.apache.lucene.index.IndexReader.open(IndexReader.java:316)
at org.apache.lucene.index.IndexReader.open(IndexReader.java:237)
at org.hibernate.search.reader.SharingBufferReaderProvider.readerFactory(SharingBufferReaderProvider.java:146)
at org.hibernate.search.reader.SharingBufferReaderProvider$PerDirectoryLatestReader.<init>(SharingBufferReaderProvider.java:220)
at org.hibernate.search.reader.SharingBufferReaderProvider.createReader(SharingBufferReaderProvider.java:91)

If you see this, the likelihood is that the Lucene index is corrupted.

To resolve the issue, it is necessary to move the existing /OMERO/OMERO4_4/FullText directory aside so that a new index can be created.

Stop the Columbus services:

$ sudo /etc/init.d/columbus stop

Rename the existing/original FullText directory, keeping it as a backup, using the 'mv' command:

$ mv /OMERO/OMERO4_4/FullText /OMERO/OMERO4_4/FullText.bak

Restart the Columbus services:

$ sudo /etc/init.d/columbus start

Check the status to make sure all services are up.

$ sudo /etc/init.d/columbus status

If the db component successfully starts, then a new FullText directory should have been created in /OMERO/OMERO4_4/
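A quick way to confirm is to test for the recreated directory on disk. In this sketch, OMERO_DIR is a hypothetical variable that defaults to a temporary demo path (and the mkdir simulates what the db component does on startup) so the example is self-contained; on a real Columbus server you would set OMERO_DIR=/OMERO/OMERO4_4 and omit the mkdir:

```shell
# OMERO_DIR is hypothetical; on the Columbus server the real location
# is /OMERO/OMERO4_4.
OMERO_DIR="${OMERO_DIR:-/tmp/omero_demo/OMERO4_4}"

# Demo only: simulate the directory the db component recreates on startup.
mkdir -p "$OMERO_DIR/FullText"

if [ -d "$OMERO_DIR/FullText" ]; then
    echo "FullText directory present: $OMERO_DIR/FullText"
else
    echo "FullText directory missing - check Blitz-0.log again" >&2
fi
```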

RAH
