wikimedia/pywikibot


Showing 616 of 616 total issues

Function preload_entities has a Cognitive Complexity of 19 (exceeds 10 allowed). Consider refactoring.
Open

    def preload_entities(
        self,
        pagelist: Iterable[pywikibot.page.WikibaseEntity
                           | pywikibot.page.Page],
        groupsize: int = 50
Severity: Minor
Found in pywikibot/site/_datasite.py - About 1 hr to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules, illustrated in the sketch after this list:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"
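
For instance, in the hypothetical example below (not taken from pywikibot), the first version nests flow-breaking structures several levels deep, so each inner branch also pays for its nesting; the second version does the same work with early exits and language shorthand, which the first rule above leaves unpenalized:

def first_negative_per_row(rows):
    """Nested version: every flow break sits inside another flow break."""
    results = []
    for row in rows:                  # flow break
        if row:                       # nested one level deep
            for value in row:         # nested two levels deep
                if value < 0:         # nested three levels deep
                    results.append(value)
                    break
    return results


def first_negative_per_row_flat(rows):
    """Flatter version: same behaviour, but the checks stay near the top level."""
    results = []
    for row in rows:
        if not row:                   # early exit instead of wrapping everything
            continue
        # generator shorthand collapses the inner loop into one statement
        negative = next((value for value in row if value < 0), None)
        if negative is not None:
            results.append(negative)
    return results

Refactorings of the flagged functions typically follow the same pattern: hoist nested branches into early returns or small helper functions so that the remaining structures stay shallow.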


Function format_diff has a Cognitive Complexity of 19 (exceeds 10 allowed). Consider refactoring.
Open

    def format_diff(self) -> Iterable[str]:
        """Color diff lines."""
        diff = iter(self.diff)

        fmt: str | None = ''
Severity: Minor
Found in pywikibot/diff.py - About 1 hr to fix


Similar blocks of code found in 5 locations. Consider refactoring.
Open

        'ca': lambda m: multi(m, [
            (lambda v: dh_decAD(v, 'Dècada de %d'), lambda p: p == 1970),
            (lambda v: dh_decAD(v, 'Dècada del %d'), alwaysTrue)]),
Severity: Major
Found in pywikibot/date.py and 4 other locations - About 1 hr to fix
pywikibot/date.py on lines 1096..1098
pywikibot/date.py on lines 1113..1115
pywikibot/date.py on lines 1218..1222
pywikibot/date.py on lines 1302..1305

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code tends both to keep replicating and to diverge (leaving bugs behind as the near-identical implementations drift apart in subtle ways).
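
To make this concrete with the date.py duplicates flagged above: each of the five similar blocks pairs a special-case pattern with a fallback pattern. A hypothetical helper could give that knowledge a single representation (multi, dh_decAD, dh_centuryAD and alwaysTrue are the names visible in the snippets; two_pattern_entry itself is illustrative, not part of pywikibot):

# Illustrative sketch only. It assumes multi, dh_decAD, dh_centuryAD and
# alwaysTrue are importable module-level helpers, as the snippets suggest.
from pywikibot.date import alwaysTrue, dh_centuryAD, dh_decAD, multi


def two_pattern_entry(formatter, special_fmt, default_fmt, is_special):
    """Build a date-map entry preferring special_fmt when is_special(value) holds."""
    return lambda m: multi(m, [
        (lambda v: formatter(v, special_fmt), is_special),
        (lambda v: formatter(v, default_fmt), alwaysTrue),
    ])


# The flagged 'ca' and 'fr' entries would then collapse to single calls:
ca_entry = two_pattern_entry(dh_decAD, 'Dècada de %d', 'Dècada del %d',
                             lambda p: p == 1970)
fr_entry = two_pattern_entry(dh_centuryAD, '%Rer siècle', '%Re siècle',
                             lambda p: p == 1)

Whether that trade is worth it depends on how readable the call sites stay; the point is that the repeated shape then lives in exactly one place.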

Tuning

This issue has a mass of 55.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.


Similar blocks of code found in 5 locations. Consider refactoring.
Open

        'fr': lambda m: multi(m, [
            (lambda v: dh_centuryAD(v, '%Rer siècle'), lambda p: p == 1),
            (lambda v: dh_centuryAD(v, '%Re siècle'), alwaysTrue)]),
Severity: Major
Found in pywikibot/date.py and 4 other locations - About 1 hr to fix
pywikibot/date.py on lines 876..878
pywikibot/date.py on lines 1096..1098
pywikibot/date.py on lines 1218..1222
pywikibot/date.py on lines 1302..1305

This issue has a mass of 55.

Similar blocks of code found in 5 locations. Consider refactoring.
Open

        'el': lambda m: multi(m, [
            (lambda v: dh_centuryAD(v, '%dός αιώνας'), lambda p: p == 20),
            (lambda v: dh_centuryAD(v, '%dος αιώνας'), alwaysTrue)]),
Severity: Major
Found in pywikibot/date.py and 4 other locations - About 1 hr to fix
pywikibot/date.py on lines 876..878
pywikibot/date.py on lines 1113..1115
pywikibot/date.py on lines 1218..1222
pywikibot/date.py on lines 1302..1305

This issue has a mass of 55.

Function __init__ has 16 arguments (exceeds 7 allowed). Consider refactoring.
Open

    def __init__(self, url: list[str] | str, *,
Severity: Major
Found in pywikibot/specialbots/_upload.py - About 1 hr to fix
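
Code Climate attaches no explanation panel to argument-count issues, but the usual remedy is to bundle related settings into one object so the signature shrinks without losing expressiveness. A minimal sketch, using hypothetical option names rather than UploadRobot's real parameters:

from __future__ import annotations

from dataclasses import dataclass


@dataclass
class UploadOptions:
    """Related upload settings grouped into one value (illustrative names only)."""

    description: str = ''
    keep_filename: bool = False
    ignore_warnings: bool = False
    chunk_size: int = 0


class UploadBotSketch:
    """Stand-in for a bot whose initializer once took many separate flags."""

    def __init__(self, url: list[str] | str, *,
                 options: UploadOptions | None = None) -> None:
        self.url = url
        self.options = options or UploadOptions()


bot = UploadBotSketch('https://example.org/file.png',
                      options=UploadOptions(keep_filename=True))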

Function __init__ has 45 lines of code (exceeds 30 allowed). Consider refactoring.
Open

    def __init__(self, parent=None, **kwargs) -> None:
        """Initializer."""
        for module in (idlelib, tkinter):
            if isinstance(module, ImportError):
                raise module
Severity: Minor
Found in pywikibot/userinterfaces/gui.py - About 1 hr to fix

Function allow_asynchronous has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

def allow_asynchronous(func):
    """Decorator to make it possible to run a BasePage method asynchronously.

    This is done when the method is called with kwarg
    :code:`asynchronous=True`. Optionally, you can also provide kwarg
Severity: Minor
Found in pywikibot/page/_decorators.py - About 1 hr to fix


Function get_item_list has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

def get_item_list(item_name: str,
                  instance_id: str | set[str] | list[str]) -> set[str]:
    """Get list of items by name, belonging to an instance (list).

    Normally there should have one single best match. The caller should
Severity: Minor
Found in scripts/create_isbn_edition.py - About 1 hr to fix


Function main has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

def main(*args: str) -> None:
    """Process command line arguments and invoke bot.

    If args is an empty list, sys.argv is used.

Severity: Minor
Found in scripts/djvutext.py - About 1 hr to fix


Function process_entries has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

def process_entries(cache_path, func, use_accesstime: bool | None = None,
                    output_func=None, action_func=None, *,
                    tests: int | None = None):
    """Check the contents of the cache.

Severity: Minor
Found in scripts/maintenance/cache.py - About 1 hr to fix


Function _print has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

    def _print(self, text, target_stream) -> None:
        """Write the text to the target stream handling the colors."""
        colorized = (config.colorized_output
                     and self.support_color(target_stream))
        colored_line = False
Severity: Minor
Found in pywikibot/userinterfaces/terminal_interface_base.py - About 1 hr to fix


Similar blocks of code found in 4 locations. Consider refactoring.
Open

    def _handle_titleregexnot(self, value: str) -> Literal[True]:
        """Handle `-titleregexnot` argument."""
        if not value:
            value = pywikibot.input(
                'All pages except which ones?')
Severity: Major
Found in pywikibot/pagegenerators/_factory.py and 3 other locations - About 1 hr to fix
pywikibot/pagegenerators/_factory.py on lines 812..818
pywikibot/pagegenerators/_factory.py on lines 828..833
pywikibot/pagegenerators/_factory.py on lines 835..840

This issue has a mass of 53.

Function _post_process has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

    def _post_process(prop, data) -> None:
        """Do some default handling of data. Directly modifies data."""
        # Be careful with version tests inside this here as it might need to
        # query this method to actually get the version number

Severity: Minor
Found in pywikibot/site/_siteinfo.py - About 1 hr to fix


Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_wmf_versions(self):
        """Test comparison between wmf versions."""
        self.assertGreater(self._make('1.33wmf10'), self._make('1.33wmf9'))
        self.assertEqual(self._make('1.33wmf10'), self._make('1.33wmf10'))
Severity: Major
Found in tests/mediawikiversion_tests.py and 2 other locations - About 1 hr to fix
tests/mediawikiversion_tests.py on lines 31..34
tests/mediawikiversion_tests.py on lines 41..44

This issue has a mass of 53.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_combined_versions(self):
        """Test comparison between wmf versions and release versions."""
        self.assertGreater(self._make('1.33wmf10'), self._make('1.32.3'))
        self.assertGreater(self._make('1.33'), self._make('1.33wmf10'))
Severity: Major
Found in tests/mediawikiversion_tests.py and 2 other locations - About 1 hr to fix
tests/mediawikiversion_tests.py on lines 31..34
tests/mediawikiversion_tests.py on lines 36..39

This issue has a mass of 53.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_normal_versions(self):
        """Test comparison between release versions."""
        self.assertGreater(self._make('1.33'), self._make('1.32.0'))
        self.assertEqual(self._make('1.33'), self._make('1.33'))
Severity: Major
Found in tests/mediawikiversion_tests.py and 2 other locations - About 1 hr to fix
tests/mediawikiversion_tests.py on lines 36..39
tests/mediawikiversion_tests.py on lines 41..44

This issue has a mass of 53.

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    def _handle_grep(self, value: str) -> Literal[True]:
        """Handle `-grep` argument."""
        if not value:
            value = pywikibot.input('Which pattern do you want to grep?')
        self.articlefilter_list.append(value)
Severity: Major
Found in pywikibot/pagegenerators/_factory.py and 3 other locations - About 1 hr to fix
pywikibot/pagegenerators/_factory.py on lines 812..818
pywikibot/pagegenerators/_factory.py on lines 820..826
pywikibot/pagegenerators/_factory.py on lines 835..840

This issue has a mass of 53.

Similar blocks of code found in 4 locations. Consider refactoring.
Open

    def _handle_grepnot(self, value: str) -> Literal[True]:
        """Handle `-grepnot` argument."""
        if not value:
            value = pywikibot.input('Which pattern do you want to skip?')
        self.articlenotfilter_list.append(value)
Severity: Major
Found in pywikibot/pagegenerators/_factory.py and 3 other locations - About 1 hr to fix
pywikibot/pagegenerators/_factory.py on lines 812..818
pywikibot/pagegenerators/_factory.py on lines 820..826
pywikibot/pagegenerators/_factory.py on lines 828..833

This issue has a mass of 53.
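
The four nearly identical `-grep`/`-titleregexnot` handlers flagged above differ only in their prompt text and in the list they extend, so one way to remove the repetition is to generate them from a small factory. A hypothetical sketch, not pywikibot's actual implementation (only pywikibot.input and the two *filter_list attribute names come from the snippets; the rest is illustrative):

# Hypothetical sketch: build the near-identical handlers from one table.
from typing import Literal

import pywikibot


def _make_pattern_handler(prompt: str, list_attr: str):
    """Return a handler that prompts when no value is given, then records it."""
    def handler(self, value: str) -> Literal[True]:
        if not value:
            value = pywikibot.input(prompt)
        getattr(self, list_attr).append(value)
        return True
    return handler


class GeneratorFactorySketch:
    """Stand-in for the real factory class, holding only the two filter lists."""

    def __init__(self) -> None:
        self.articlefilter_list: list[str] = []
        self.articlenotfilter_list: list[str] = []

    _handle_grep = _make_pattern_handler(
        'Which pattern do you want to grep?', 'articlefilter_list')
    _handle_grepnot = _make_pattern_handler(
        'Which pattern do you want to skip?', 'articlenotfilter_list')


factory = GeneratorFactorySketch()
factory._handle_grep('linux')  # appends to articlefilter_list without prompting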

Function __init__ has a Cognitive Complexity of 18 (exceeds 10 allowed). Consider refactoring.
Open

    def __init__(self, site=None,
                 mime: dict | None = None,
                 throttle: bool = True,
                 max_retries: int | None = None,
                 retry_wait: int | None = None,
Severity: Minor
Found in pywikibot/data/api/_requests.py - About 1 hr to fix

