Speed up CoreNLP sentiment analysis

Can anyone think of ways to speed up CoreNLP sentiment analysis (below)?

I initialize the CoreNLP pipeline once at server startup:

    // Initialize the CoreNLP text processing pipeline once at startup
    public static Properties props = new Properties();
    public static StanfordCoreNLP pipeline;

    static {
        // Set text processing pipeline annotators
        props.setProperty("annotators", "tokenize, ssplit, pos, parse, sentiment");
        // Use Shift-Reduce Constituency Parsing (O(n),
        // http://nlp.stanford.edu/software/srparser.shtml) vs. the CoreNLP default
        // Probabilistic Context-Free Grammar Parsing (O(n^3))
        props.setProperty("parse.model", "edu/stanford/nlp/models/srparser/englishSR.ser.gz");
        pipeline = new StanfordCoreNLP(props);
    }

Then I call the pipeline from my controller:

    String text = "A sample string.";
    Annotation annotation = pipeline.process(text);
    List<CoreMap> sentences = annotation.get(CoreAnnotations.SentencesAnnotation.class);
    for (CoreMap sentence : sentences) {
        Tree tree = sentence.get(SentimentCoreAnnotations.SentimentAnnotatedTree.class);
        int sentiment = RNNCoreAnnotations.getPredictedClass(tree);
        ...
    }
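(getPredictedClass returns the predicted class on the sentiment model's five-point scale, from 0 = very negative to 4 = very positive.)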

I profiled the code: the main call into CoreNLP, Annotation annotation = pipeline.process(text), is very slow. A request that makes 100 calls to my controller takes 1.07 seconds on average, and the annotation itself takes ~7 ms per call. I need to reduce that to ~2 ms.
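For context, here is a minimal sketch of the kind of micro-benchmark behind those numbers (the sample text, warm-up count, and iteration count are illustrative; it assumes the pipeline field initialized above):

    // Illustrative micro-benchmark: average wall-clock time per process() call
    String text = "A sample string.";
    int warmup = 10, iterations = 100;

    // Warm-up pass so JIT compilation and model caching don't skew the average
    for (int i = 0; i < warmup; i++) {
        pipeline.process(text);
    }

    long start = System.nanoTime();
    for (int i = 0; i < iterations; i++) {
        pipeline.process(text);
    }
    double avgMs = (System.nanoTime() - start) / 1_000_000.0 / iterations;
    System.out.println("avg per call: " + avgMs + " ms");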

I can't remove any of the annotators, because sentiment depends on all of them. I already use the Shift-Reduce Constituency Parser, since it is much faster than the default PCFG (Probabilistic Context-Free Grammar) parser.

Are there any other options that I can configure to speed this up significantly?

1 answer

I have the same problem. I also tried the SR Beam model, which in my case was even slower than PCFG! According to the benchmarks Stanford publishes, SR Beam should be much faster than PCFG and only slightly slower than plain SR.

I think that, beyond replacing PCFG with the SR parser, the only remaining way to improve speed may be to play with the tokenizer options ...
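For reference, a sketch of the two configurations mentioned above; the beam model path is the one that ships in the srparser models jar, and the tokenize.options value is only an example of the PTBTokenizer flags you could experiment with:

    // SR Beam model (in my tests this was slower than PCFG, not faster)
    props.setProperty("parse.model",
        "edu/stanford/nlp/models/srparser/englishSR.beam.ser.gz");

    // Pass options through to the PTBTokenizer; see the PTBTokenizer
    // documentation for the full list (these values are illustrative)
    props.setProperty("tokenize.options", "untokenizable=noneDelete,americanize=false");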
