Spark NLP Workshop


Showcasing notebooks and code examples of how to use Spark NLP in Python and Scala.

Table of contents

  • Python Setup
  • Colab setup
  • Main repository
  • Project's website
  • Slack community channel
  • Contributing
  • License

Python Setup

$ java -version
# should be Java 8 (Oracle or OpenJDK)
$ conda create -n sparknlp python=3.6 -y
$ conda activate sparknlp

# Install Spark NLP and PySpark 2.4.x
$ pip install spark-nlp pyspark==2.4.7
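Once the environment is ready, a quick way to verify the installation is to start a Spark NLP session and print the library versions. The snippet below is a minimal sketch: sparknlp.start() creates (or reuses) a SparkSession configured with the Spark NLP package, downloading it on the first run.

import sparknlp

# Start a SparkSession with Spark NLP loaded (downloads the package on first run)
spark = sparknlp.start()

print("Spark NLP version:", sparknlp.version())
print("Apache Spark version:", spark.version)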

Colab setup

import os

# Install JDK 8
! apt-get update -qq
! apt-get install -y openjdk-8-jdk-headless -qq > /dev/null

os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["PATH"] = os.environ["JAVA_HOME"] + "/bin:" + os.environ["PATH"]
! java -version

# Install PySpark 2.4.x
! pip install -q pyspark==2.4.7
! pip install -q spark-nlp
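After the setup cell finishes, you can confirm everything works in Colab by running a pretrained pipeline. The following is a minimal sketch using the publicly available explain_document_dl pipeline; the first call downloads the model, which may take a few minutes, and the exact output keys depend on the pipeline version.

import sparknlp
from sparknlp.pretrained import PretrainedPipeline

# Start Spark NLP on top of the PySpark installed above
spark = sparknlp.start()

# Download and load a pretrained pipeline, then annotate a sample sentence
pipeline = PretrainedPipeline('explain_document_dl', lang='en')
result = pipeline.annotate("John Snow Labs builds NLP libraries for Apache Spark.")

print(result['entities'])   # named entities found in the text
print(result['pos'])        # part-of-speech tags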

Main repository

https://github.com/JohnSnowLabs/spark-nlp

Project's website

Take a look at our official Spark NLP page for user documentation and examples: http://nlp.johnsnowlabs.com/

Slack community channel

Join Slack

Contributing

If you find any example that is no longer working, please create an issue.

License

Apache License 2.0
