Does NLTK have TF-IDF implemented?

Published 2019-02-21 20:06

Question:

There are TF-IDF implementations in scikit-learn and gensim.

There are also simple standalone implementations, e.g. "Simple implementation of N-Gram, tf-idf and Cosine similarity in Python".

To avoid reinventing the wheel,

  • Is there really no TF-IDF in NLTK?
  • Are there sub-packages in NLTK that can be used to implement TF-IDF? If so, how?

This blog post says NLTK doesn't have it. Is that true? http://www.bogotobogo.com/python/NLTK/tf_idf_with_scikit-learn_NLTK.php

Answer 1:

The NLTK TextCollection class has a method for computing the tf-idf of terms. The documentation is here, and the source is here. However, the documentation notes that it "may be slow to load", so scikit-learn may be preferable for large corpora.
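A minimal sketch of that API (the toy documents and variable names here are my own illustration, not from the answer):

```python
from nltk.text import TextCollection

# Three toy documents, pre-tokenized: TextCollection expects
# sequences of tokens (or nltk.Text objects), not raw strings.
docs = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs are animals".split(),
]
collection = TextCollection(docs)

# tf_idf(term, text) = tf(term, text) * idf(term), where tf is the
# relative frequency of the term in the text and idf is
# log(N / number of documents containing the term).
score = collection.tf_idf("cat", docs[0])
print(score)  # "cat" occurs in just one document, so the score is positive
```

Note that `idf` results are cached per collection, and a term that appears in no document gets an idf of 0 rather than raising an error.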



Answer 2:

There seems to be enough evidence to conclude that NLTK does not ship a dedicated TF-IDF module:

  1. "Unfortunately, calculating tf-idf is not available in NLTK so we'll use another data analysis library, scikit-learn"

    from COMPSCI 290-01 Spring 2014 lab

  2. More importantly, the NLTK source code contains almost nothing related to tfidf (or tf-idf). The exception is NLTK-contrib, which contains a MapReduce implementation of TF-IDF.

Several libraries for tf-idf are mentioned in this related question.

Update: searching the source for tf idf or tf_idf turns up the function already found by @yvespeirsman.