BLKSerene/Wordless

wordless/wl_nlp/wl_word_tokenization.py

Summary

Maintainability: A (about 45 mins of estimated remediation)
Test Coverage: not reported
Avoid deeply nested control flow statements.
Status: Open

                            for sentence in sentences:
                                tokens_multilevel[-1].append(main.nltk_nist_tokenizer.international_tokenize(sentence))
                        case 'nltk_nltk':
Severity: Major
Found in wordless/wl_nlp/wl_word_tokenization.py - About 45 mins to fix
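
One common way to address this finding is to pull the innermost loop out into a helper so that each `case` arm stays a single statement instead of a nested loop. The sketch below is illustrative only, under assumptions: the helper names (`tokenize_sentences`, `tokenize_paragraph`) and the `nltk_nltk_tokenizer` attribute are hypothetical and do not mirror Wordless's actual internals; only `main.nltk_nist_tokenizer.international_tokenize` comes from the flagged excerpt.

    # Illustrative sketch only: names other than main.nltk_nist_tokenizer
    # are assumptions, not Wordless's real code.

    def tokenize_sentences(sentences, tokenize_sentence):
        # Run a single-sentence tokenizer over every sentence in the paragraph.
        return [tokenize_sentence(sentence) for sentence in sentences]

    def tokenize_paragraph(main, sentences, word_tokenizer):
        # Dispatch once on the tokenizer name; the looping lives in the helper,
        # so each case arm stays one level deep.
        match word_tokenizer:
            case 'nltk_nist':
                return tokenize_sentences(
                    sentences,
                    main.nltk_nist_tokenizer.international_tokenize,
                )
            case 'nltk_nltk':
                # Hypothetical attribute name for an NLTK default tokenizer.
                return tokenize_sentences(sentences, main.nltk_nltk_tokenizer.tokenize)
            case _:
                raise ValueError(f'Unknown word tokenizer: {word_tokenizer}')

Whether extraction, early returns, or a dispatch table is the right fix depends on how the surrounding function is structured; the point is to keep the per-sentence loop out of the match arms.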
