Commit 14dc766

Merge pull request #6455 from JohnSnowLabs/egenc
amazon linux 2 installation
2 parents a4c01d9 + 7bae0c0 commit 14dc766

2 files changed (+113, −0 lines)

docs/en/install.md

Lines changed: 43 additions & 0 deletions
@@ -467,6 +467,49 @@ gcloud dataproc clusters create ${CLUSTER_NAME} \
</div>

## Amazon Linux 2 Support

```bash
# Update the package list and install the required packages
sudo yum update
sudo yum install -y amazon-linux-extras
sudo yum -y install python3-pip

# Create a Python virtual environment and activate it
python3 -m venv .sparknlp-env
source .sparknlp-env/bin/activate
```

Check your Java version:
- For Spark NLP versions above 3.x, please use Java 11
- For Spark NLP versions below 3.x and for Spark OCR, please use Java 8

To check the Java versions installed on your machine:
```bash
sudo alternatives --config java
```

You can pick the index number of the version to use as the default (here java-8 is selected, index 2):

</div><div class="h3-box" markdown="1">

<img class="image image--xl" src="/assets/images/installation/amazon-linux.png" style="width:100%; align:center; box-shadow: 0 3px 6px rgba(0,0,0,0.16), 0 3px 6px rgba(0,0,0,0.23);"/>

</div><div class="h3-box" markdown="1">

If you don't have Java 11 or Java 8 on your system, you can easily install it via:

```bash
sudo yum install java-1.8.0-openjdk
```

Now we can install the required libraries:

```bash
pip install pyspark==3.1.2
pip install spark-nlp
```
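
To confirm the setup works, a quick sanity check can be run from the same virtual environment. This is a minimal sketch, assuming the `pyspark` and `spark-nlp` packages above installed cleanly and a supported Java version is the active default:

```python
# Minimal sanity check for the Amazon Linux 2 setup above.
# Assumes pyspark and spark-nlp are installed in the active virtual environment.
import sparknlp

# Start a local Spark session with Spark NLP on the classpath
spark = sparknlp.start()

print("Spark NLP version:", sparknlp.version())
print("Apache Spark version:", spark.version)
```

If this prints both versions without errors, the environment is ready for the rest of this guide.
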
## Docker Support

For having Spark NLP, PySpark, Jupyter, and other ML/DL dependencies as a Docker image you can use the following template:

docs/en/licensed_install.md

Lines changed: 70 additions & 0 deletions
@@ -417,6 +417,76 @@ As you see, we did not set `.master('local[*]')` explicitly to let YARN manage t
Or you can set `.master('yarn')`.

## Amazon Linux 2 Support

```bash
# Update the package list and install the required packages
sudo yum update
sudo yum install -y amazon-linux-extras
sudo yum -y install python3-pip

# Create a Python virtual environment and activate it
python3 -m venv .sparknlp-env
source .sparknlp-env/bin/activate
```

Check your Java version:
- For Spark NLP versions above 3.x, please use Java 11
- For Spark NLP versions below 3.x and for Spark OCR, please use Java 8

To check the Java versions installed on your machine:
```bash
sudo alternatives --config java
```

You can pick the index number of the version to use as the default (here java-8 is selected, index 2):

</div><div class="h3-box" markdown="1">

<img class="image image--xl" src="/assets/images/installation/amazon-linux.png" style="width:100%; align:center; box-shadow: 0 3px 6px rgba(0,0,0,0.16), 0 3px 6px rgba(0,0,0,0.23);"/>

</div><div class="h3-box" markdown="1">

If you don't have Java 11 or Java 8 on your system, you can easily install it via:

```bash
sudo yum install java-1.8.0-openjdk
```

Now we can install the required libraries:

```bash
pip install jupyter
```

We can start a Jupyter notebook via:
```bash
jupyter notebook
```

```python
# Now we are in the Jupyter notebook cell:
import json
import os

with open('sparknlp_for_healthcare.json') as f:
    license_keys = json.load(f)

# Defining license key-value pairs as local variables
locals().update(license_keys)

# Adding license key-value pairs to environment variables
os.environ.update(license_keys)

# Installing pyspark and spark-nlp
! pip install --upgrade -q pyspark==3.1.2 spark-nlp==$PUBLIC_VERSION

# Installing Spark NLP for Healthcare
! pip install --upgrade -q spark-nlp-jsl==$JSL_VERSION --extra-index-url https://pypi.johnsnowlabs.com/$SECRET
```
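
Once those cells finish, a Spark session with the Healthcare library can be started in a new cell. The snippet below is a sketch based on the usual quick-start pattern; it assumes the license JSON contains a `SECRET` entry, already loaded into local variables by `locals().update(license_keys)` above:

```python
# Next notebook cell: start a Spark session with Spark NLP for Healthcare.
# Assumes SECRET was defined from the license JSON loaded above.
import sparknlp
import sparknlp_jsl

spark = sparknlp_jsl.start(SECRET)

print("Spark NLP version:", sparknlp.version())
print("Spark NLP for Healthcare version:", sparknlp_jsl.version())
```
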
## Get a Spark NLP for Healthcare license
You can ask for a free trial of Spark NLP for Healthcare [here](https://www.johnsnowlabs.com/install/). This will automatically create a new account for you on [my.JohnSnowLabs.com](https://my.johnsnowlabs.com/). Log in to your new account and, from the `My Subscriptions` section, you can download your license key as a JSON file.
