In this series, I will walk through how to integrate OpenNLP, UIMA, and Solr.
Integrate OpenNLP with UIMA
Covers how to install UIMA, build the OpenNLP PEAR, and run the PEAR in CVD or the UIMA Simple Server.
Integrate OpenNLP, UIMA and Solr via SOAP Web Service
Covers how to deploy the OpenNLP UIMA PEAR as a SOAP web service and integrate it with Solr.
Integrate OpenNLP, UIMA AS and Solr
Covers how to deploy the OpenNLP UIMA PEAR as a UIMA AS service and integrate it with Solr.
Installing the UIMA SDK
Follow the README in the UIMA SDK, section 2 "Installation and Setup":
Set JAVA_HOME and UIMA_HOME (a Windows example follows these steps)
Append %UIMA_HOME%/bin to your PATH
Run %UIMA_HOME%/bin/adjustExamplePaths.bat
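For example, on Windows the setup might look like the following; the install locations are placeholders, so adjust them to wherever your JDK and UIMA SDK actually live:

set JAVA_HOME=C:\Java\jdk
set UIMA_HOME=C:\apache-uima
set PATH=%PATH%;%UIMA_HOME%\bin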
Build OpenNLP UIMA PEAR
Follow the instructions at OpenNLP UIMA:
Download the latest source code, go to %opennlp_src_home%\opennlp, and run mvn install.
Go to %opennlp_src_home%\opennlp-uima and run ant -f createPear.xml (the full command sequence is shown after this list).
The built OpenNlpTextAnalyzer.pear will be in the %opennlp_src_home%\opennlp-uima\target folder.
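Put together, the build looks like this in a Windows console, assuming %opennlp_src_home% points at your OpenNLP source checkout:

cd %opennlp_src_home%\opennlp
mvn install
cd %opennlp_src_home%\opennlp-uima
ant -f createPear.xml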
Run OpenNLP PEAR in UIMA CAS Visual Debugger
Run set UIMA_JVM_OPTS=-Xms128M -Xmx8g to adjust the JVM heap size; we can also change this in runUimaClass.bat or add it to the system environment.
Execute runPearInstaller.bat, point the PEAR file field to the built OpenNlpTextAnalyzer.pear, and specify an installation directory, for example %PEARS_HOME_REPLACE_THIS%\opennlp.uima.OpenNlpTextAnalyzer.
To run the OpenNLP analysis engine, click "Run your AE in the CAS", paste some text, then click "Run" -> "Run OpenNlpTextAnalyzer", or use the shortcut Ctrl+R.
Later on, we can run cvd.bat, click "Run" -> "Load AE", and browse to the location where the OpenNLP PEAR is installed. Select %PEARS_HOME_REPLACE_THIS%\opennlp.uima.OpenNlpTextAnalyzer\opennlp.uima.OpenNlpTextAnalyzer_pear.xml, paste some text, then click "Run" -> "Run OpenNlpTextAnalyzer".
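If you would rather drive the analysis engine from Java than from the CVD GUI, a minimal sketch using UIMA's PEAR install API might look like the following; the PEAR location, install directory, and sample text are assumptions, so point them at your own files:

import java.io.File;

import org.apache.uima.UIMAFramework;
import org.apache.uima.analysis_engine.AnalysisEngine;
import org.apache.uima.cas.CAS;
import org.apache.uima.pear.tools.PackageBrowser;
import org.apache.uima.pear.tools.PackageInstaller;
import org.apache.uima.resource.ResourceSpecifier;
import org.apache.uima.util.XMLInputSource;

public class RunOpenNlpPear {
    public static void main(String[] args) throws Exception {
        // Install the PEAR into a local directory (both paths are placeholders).
        File pearFile = new File("OpenNlpTextAnalyzer.pear");
        File installDir = new File("pear_install");
        PackageBrowser installedPear = PackageInstaller.installPackage(installDir, pearFile, true);

        // Create the analysis engine from the installed PEAR descriptor.
        XMLInputSource in = new XMLInputSource(installedPear.getComponentPearDescPath());
        ResourceSpecifier specifier = UIMAFramework.getXMLParser().parseResourceSpecifier(in);
        AnalysisEngine ae = UIMAFramework.produceAnalysisEngine(specifier);

        // Process some text and report how many annotations were created.
        CAS cas = ae.newCAS();
        cas.setDocumentText("Apache UIMA and OpenNLP work together nicely.");
        ae.process(cas);
        System.out.println(cas.getAnnotationIndex().size() + " annotations created.");

        ae.destroy();
    }
}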
Deploy OpenNLP PEAR in UIMA Simple Server
We can deploy the OpenNLP PEAR as a REST web service. Follow the UIMA Simple Server User Guide to build the WAR:
http://uima.apache.org/downloads/sandbox/simpleServerUserGuide/simpleServerUserGuide.html
Then copy OpenNlpTextAnalyzer.pear to WEB-INF/resources and add an opennlp servlet to web.xml:
<servlet>
  <servlet-name>opennlp</servlet-name>
  <servlet-class>
    org.apache.uima.simpleserver.servlet.SimpleServerServlet
  </servlet-class>
  <!-- Define the path to the pear file -->
  <init-param>
    <param-name>PearPath</param-name>
    <param-value>
      WEB-INF/resources/OpenNlpTextAnalyzer.pear
    </param-value>
  </init-param>
</servlet>
<servlet-mapping>
  <servlet-name>opennlp</servlet-name>
  <url-pattern>/opennlp</url-pattern>
</servlet-mapping>

Browse to http://localhost:8080/uima-server/opennlp
Go to http://localhost:8080/uima-server/opennlp?mode=form, type some text, then hit the "Submit Query" button.
Or you can send a GET request: http://localhost:8080/uima-server/opennlp?text=some_text_here
Or send a POST request:
curl http://localhost:8080/uima-server/opennlp -X POST -d "text=some_text_here"
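The same POST can be sent from Java, which will come in handy later for the Solr integration. Here is a minimal sketch using only the JDK's HttpURLConnection; it assumes the servlet mapping above and the default Tomcat port, and the sample text is a placeholder:

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class OpenNlpServiceClient {
    public static void main(String[] args) throws Exception {
        // Endpoint and text are placeholders; adjust to your deployment.
        URL url = new URL("http://localhost:8080/uima-server/opennlp");
        String body = "text=" + URLEncoder.encode("Some text to analyze.", "UTF-8");

        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");

        // Write the form-encoded body, same as the curl example above.
        try (OutputStream out = conn.getOutputStream()) {
            out.write(body.getBytes(StandardCharsets.UTF_8));
        }

        // Read and print the service response.
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
            }
        }
        conn.disconnect();
    }
}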
Resources
UIMA Documentation Overview
UIMA Asynchronous Scaleout Documentation Overview