- *-i* install mode: create a virtual environment and install the library
- *-r* run mode: start Jupyter after installation of the library
- *-v* path to virtual environment (default: ./sparknlp_env)
- *-j* path to license json file for Spark NLP for Healthcare
- *-o* path to license json file for Spark OCR
- *-a* path to a single license json file for both Spark OCR and Spark NLP
- *-s* specify PySpark version
- *-p* specify port of Jupyter Notebook
Use the -i flag to install the libraries in a new virtual environment.

You can provide the desired path for the virtual environment using the -v flag; otherwise, the default location ./sparknlp_env will be used.
Replace `PATH_TO_LICENSE_JSON_FILE` with the path where the license file is available on the local machine. Depending on the libraries you want to use, pass the corresponding flag: -j, -o, or -a. The license files can be downloaded from the *My Subscription* section of your my.JohnSnowLabs.com account.
To start Jupyter Notebook directly after the installation of the libraries, use the -r flag. The install script also downloads a couple of ready-to-use example notebooks so you can start experimenting with the libraries right away.
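As an illustration only, a combined invocation could look like the sketch below; the script name `jsl_install.sh` and the file paths are placeholders, not necessarily what the installer and license file are called on your machine:

```bash
# Hypothetical example -- the script name and paths are placeholders.
# -i  install the libraries into a fresh virtual environment
# -v  put the virtual environment in a custom location instead of ./sparknlp_env
# -j  point to the Spark NLP for Healthcare license json file
# -r  start Jupyter Notebook as soon as the installation finishes
./jsl_install.sh -i -r -v ~/envs/sparknlp_env -j ~/licenses/spark_nlp_for_healthcare.json
```

If you only use Spark OCR, swap -j for -o; with a single combined license file, use -a instead.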
## Install NLP Libraries via Docker
We have prepared a Docker image that contains all the required libraries for installing and running Spark NLP for Healthcare. However, it does not contain the library itself, as it is licensed and requires installation credentials.
The `{secret.code}` is a secret code that is only available to users with a valid/trial license. If you did not receive it yet, please contact us at <a href="mailto:[email protected]">[email protected]</a>.
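As a rough, non-authoritative sketch of how such a secret is usually passed to a container (the image name, variable name, and port below are placeholders, not the official ones; follow the Docker instructions provided with your license):

```bash
# Placeholder sketch only -- the image name, env variable, and port are assumptions.
docker run -it \
  -e SECRET="{secret.code}" \
  -p 8888:8888 \
  your-jsl-healthcare-image:latest
```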
</div><div class="h3-box" markdown="1">
### Setup AWS-CLI Credentials for licensed pretrained models
Starting from Spark NLP for Healthcare version 2.4.2, you first need to set up your AWS credentials to be able to access the private repository for John Snow Labs Pretrained Models.
Make sure you configure your credentials with `aws configure`, following the instructions here:
<ahref="https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-configure.html">Configuring the AWS CLI</a>
Please substitute the `ACCESS_KEY` and `SECRET_KEY` with the credentials you have received from your Customer Owner (CO). If you need your credentials, contact us at <a href="mailto:[email protected]">[email protected]</a>.
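For reference, a minimal sketch using the standard AWS CLI (the key values and region are placeholders):

```bash
# Interactive configuration -- paste the ACCESS_KEY and SECRET_KEY you received when prompted.
aws configure
#   AWS Access Key ID [None]: <ACCESS_KEY>
#   AWS Secret Access Key [None]: <SECRET_KEY>
#   Default region name [None]: <your-region>
#   Default output format [None]: json

# The values end up in ~/.aws/credentials:
#   [default]
#   aws_access_key_id = <ACCESS_KEY>
#   aws_secret_access_key = <SECRET_KEY>
```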
</div>
### Start Spark NLP for Healthcare Session from Python
## Install on Databricks
### Automatic deployment of John Snow Labs NLP libraries
You can automatically deploy John Snow Labs libraries on Databricks by filling in the form available [here](https://www.johnsnowlabs.com/databricks/).
This will allow you to start a 30-day free trial with no limit on the amount of processed data. You just need to provide a Databricks Access Token that is used by our deployment script to connect to your Databricks instance and install John Snow Labs NLP libraries on a cluster of your choice.
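If you would like to verify the access token before submitting the form, one optional way (an assumption on our side, not part of the official deployment flow) is to try it with the Databricks CLI:

```bash
# Optional sanity check of the access token -- not required by the deployment form.
pip install databricks-cli
databricks configure --token    # prompts for the workspace URL and the access token
databricks clusters list        # succeeds only if the token and URL are valid
```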
### Manual deployment of Spark NLP for Healthcare
1. Create a cluster if you don't have one already
2. On a new cluster or an existing one, add the following to the `Advanced Options -> Spark` tab, in the `Spark Config` box:
## Use on Google Colab

Google Colab is perhaps the easiest way to get started with Spark NLP. It requires no installation or setup other than having a Google account.
Run the following code in a Google Colab notebook and start using Spark NLP right away.
The first thing you need to do is create the json file with the credentials and the configuration on your local system.
If you have a valid floating license, the license json file can be downloaded from the **My Subscriptions** section of your account on [my.JohnSnowLabs.com](https://my.johnsnowlabs.com/). To get a trial license please visit
Then you need to write the following piece of code to load the credentials that you created before.
0 commit comments