
Commit

Merge branch 'master' of github.com:inpho/topic-explorer
JaimieMurdock committed Oct 26, 2018
2 parents 68089b7 + 5a41586 commit 0f0567b
Showing 2 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions ipynb/Topic-Explorer-Tutorial.py2.ipynb
@@ -304,7 +304,7 @@
"The above code shows the topic-word distributions and allows us to estimate the quality of our topics.\n",
"\n",
"#### `v.labels`\n",
"The property `v.labels` (without parentheses) returns a list of all documents in a corpus, and is useful for processing each document generically, wihtout having to look up the identifiers on the file system.\n",
"The property `v.labels` (without parentheses) returns a list of all documents in a corpus, and is useful for processing each document generically, without having to look up the identifiers on the file system.\n",
"\n",
"Below, we print the first 3 document labels:"
]
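The cell corrected above lends itself to a quick illustration. A minimal sketch, assuming `v` is the topic-model viewer built earlier in the tutorial (not shown in this diff):

```python
# `v.labels` is a property, so no parentheses are needed.
labels = v.labels

# Print the first 3 document labels, as the notebook cell below does.
for label in labels[:3]:
    print(label)
```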
@@ -386,7 +386,7 @@
"#### Alternative distance measures\n",
"By default, the Topic Explorer uses the Jensen-Shannon Distance to calculate the distance between documents. The Jensen-Shannon Distance (JSD) is a symmetric measure based on information theory that characterizes the difference between two probability distributions.\n",
"\n",
"However, several alternate methods are built into the `vsm.spatial` module. These include the Kullbeck-Liebler Divergence, which is an asymmetric component of the JSD and is used in [Murdock et al. (in review)](http://arxiv.org/abs/1509.07175) to characterize the cognitive surprise of a new text, given previous texts.\n",
"However, several alternate methods are built into the `vsm.spatial` module. These include the Kullback-Leibler Divergence, which is an asymmetric component of the JSD and is used in [Murdock et al. (in review)](http://arxiv.org/abs/1509.07175) to characterize the cognitive surprise of a new text, given previous texts.\n",
"\n",
"Rather than using the JSD and assuming symmetric divergence between items, we assume that the second document is encountered after the first, effectively measuring text-to-text divergence."
]
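For readers who want the two measures side by side, here is a minimal sketch using SciPy stand-ins rather than the `vsm.spatial` functions themselves (whose exact names are not shown in this diff), with toy distributions:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import entropy

# Two toy topic distributions over the same four topics.
p = np.array([0.70, 0.10, 0.10, 0.10])
q = np.array([0.10, 0.10, 0.10, 0.70])

# Jensen-Shannon Distance: symmetric, so argument order is irrelevant.
print(jensenshannon(p, q, base=2))  # same as jensenshannon(q, p, base=2)

# Kullback-Leibler Divergence: asymmetric, so argument order matters.
print(entropy(p, q, base=2))  # KL(p || q)
print(entropy(q, p, base=2))  # KL(q || p), generally different
```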
4 changes: 2 additions & 2 deletions ipynb/Topic-Explorer-Tutorial.py3.ipynb
@@ -344,7 +344,7 @@
"The above code shows the topic-word distributions and allows us to estimate the quality of our topics.\n",
"\n",
"#### `v.labels`\n",
"The property `v.labels` (without parentheses) returns a list of all documents in a corpus, and is useful for processing each document generically, wihtout having to look up the identifiers on the file system.\n",
"The property `v.labels` (without parentheses) returns a list of all documents in a corpus, and is useful for processing each document generically, without having to look up the identifiers on the file system.\n",
"\n",
"Below, we print the first 3 document labels:"
]
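The Python 3 notebook carries the same `v.labels` cell; a hedged sketch of the same idiom, this time building a label-to-index map for generic per-document processing (again assuming the tutorial's viewer `v`):

```python
# Because labels are plain identifiers, they can drive per-document
# processing without any file-system lookups.
label_index = {label: i for i, label in enumerate(v.labels)}

# First three labels, as in the cell that follows in the notebook:
print(list(label_index)[:3])
```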
@@ -426,7 +426,7 @@
"#### Alternative distance measures\n",
"By default, the Topic Explorer uses the Jensen-Shannon Distance to calculate the distance between documents. The Jensen-Shannon Distance (JSD) is a symmetric measure based on information theory that characterizes the difference between two probability distributions.\n",
"\n",
"However, several alternate methods are built into the `vsm.spatial` module. These include the Kullbeck-Liebler Divergence, which is an asymmetric component of the JSD and is used in [Murdock et al. (2017)](http://arxiv.org/abs/1509.07175) to characterize the cognitive surprise of a new text, given previous texts.\n",
"However, several alternate methods are built into the `vsm.spatial` module. These include the Kullback-Leibler Divergence, which is an asymmetric component of the JSD and is used in [Murdock et al. (2017)](http://arxiv.org/abs/1509.07175) to characterize the cognitive surprise of a new text, given previous texts.\n",
"\n",
"Rather than using the JSD and assuming symmetric divergence between items, we assume that the second document is encountered after the first, effectively measuring text-to-text divergence."
]
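The asymmetry described above is what lets divergence model reading order. A minimal sketch of that idea with toy data (the direction of the KL arguments here is a modeling choice, not something this diff specifies):

```python
import numpy as np
from scipy.stats import entropy

# Topic mixtures of three documents, in the order they are encountered.
docs = np.array([
    [0.60, 0.20, 0.10, 0.10],
    [0.50, 0.30, 0.10, 0.10],
    [0.10, 0.10, 0.20, 0.60],
])

# "Surprise" of each new text given the one before it: KL(current || previous).
for prev, curr in zip(docs, docs[1:]):
    print(entropy(curr, prev, base=2))
```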
