Lucene 4.0: error overriding final tokenStream method

For various reasons, I have to work with the latest version of the Lucene API.

The API is not well documented yet, and I cannot get even a simple addDocument() call to work.

Here is the initialization of the IndexWriter:

    analyzer = new StopAnalyzer(Version.LUCENE_40);
    config = new IndexWriterConfig(Version.LUCENE_40, analyzer);
    writer = new IndexWriter(
            FSDirectory.open(new File(ConfigUtil.getProperty("lucene.directory"))),
            config);

A simple method that converts a User to a Document:

    public static Document getDocument(User user) {
        Document doc = new Document();

        FieldType storedType = new FieldType();
        storedType.setStored(true);
        storedType.setTokenized(false);

        // Store user data
        doc.add(new Field(USER_ID, user.getId().toString(), storedType));
        doc.add(new Field(USER_NAME, user.getFirstName() + user.getLastName(), storedType));

        FieldType unstoredType = new FieldType();
        unstoredType.setStored(false);
        unstoredType.setTokenized(true);

        // Analyze location
        if (user.getLocation() != null && !user.getLocation().isEmpty()) {
            String tokens = "";
            for (Tag location : user.getLocation())
                tokens += location.getName() + " ";
            doc.add(new Field(USER_LOCATION, tokens, unstoredType));
        }

        return doc; // the original snippet was missing this return statement
    }
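As a side note, Lucene 4.0 also ships convenience field classes that cover these two common FieldType combinations: StringField (indexed as a single un-tokenized token) and TextField (tokenized). A hedged sketch of the same method using them is below; note that StringField is always indexed, which differs slightly from a bare stored-only FieldType, and User, Tag, and the USER_* constants are assumed to exist as in the question's code base.

```java
import org.apache.lucene.document.Document;
import org.apache.lucene.document.Field;
import org.apache.lucene.document.StringField;
import org.apache.lucene.document.TextField;

public static Document getDocument(User user) {
    Document doc = new Document();

    // Stored, indexed verbatim (not tokenized)
    doc.add(new StringField(USER_ID, user.getId().toString(), Field.Store.YES));
    doc.add(new StringField(USER_NAME, user.getFirstName() + user.getLastName(), Field.Store.YES));

    // Tokenized by the analyzer, not stored
    if (user.getLocation() != null && !user.getLocation().isEmpty()) {
        StringBuilder tokens = new StringBuilder();
        for (Tag location : user.getLocation())
            tokens.append(location.getName()).append(' ');
        doc.add(new TextField(USER_LOCATION, tokens.toString(), Field.Store.NO));
    }

    return doc;
}
```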

At startup:

    Document userDoc = DocumentManager.getDocument(userWrap);
    IndexAccess.getWriter().addDocument(userDoc);
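(Independently of the error itself, note that a newly added document only becomes visible to index readers after the writer commits or closes. A minimal sketch, assuming the question's own IndexAccess helper:)

```java
// Sketch: IndexAccess.getWriter() is the helper from the question, assumed to
// return the IndexWriter configured above.
Document userDoc = DocumentManager.getDocument(userWrap);
IndexAccess.getWriter().addDocument(userDoc);
IndexAccess.getWriter().commit(); // flush so new IndexReaders can see the document
```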

This is the error message I get:

    class org.apache.lucene.analysis.util.ReusableAnalyzerBase overrides final method tokenStream.(Ljava/lang/String;Ljava/io/Reader;)Lorg/apache/lucene/analysis/TokenStream;

This may be a simple question, but I cannot find anything that addresses this problem. I use the default analyzer, and I followed the tutorial to avoid the deprecated Field.Index.ANALYZED.

1 answer

This is caused by a JAR version mismatch. You are probably depending on a contrib JAR that in turn depends on a different version of Lucene. Determine the exact set of dependencies on your runtime classpath and look for any version mismatches.
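If the project is built with Maven, one way to spot the clash is to print the resolved dependency tree and look for multiple Lucene artifacts at different versions (the command below is a generic sketch, not specific to this project):

```shell
# List every resolved artifact, including conflict-resolved duplicates;
# two different lucene-core versions in the output indicate the clash.
mvn dependency:tree -Dverbose | grep -i lucene
```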


Source: https://habr.com/ru/post/1442803/

