wikimedia/pywikibot

View on GitHub

Showing 735 of 735 total issues

Function toJSON has a Cognitive Complexity of 29 (exceeds 10 allowed). Consider refactoring.
Open

    def toJSON(self, diffto: dict | None = None) -> dict:
        """
        Create JSON suitable for Wikibase API.

        When diffto is provided, JSON representing differences
Severity: Minor
Found in pywikibot/page/_collections.py - About 3 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
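
As an illustration of these rules (a hypothetical sketch, not code from pywikibot), the two functions below do the same work; the nested version accumulates increments for nested flow breaks, while the flattened version keeps the flow closer to linear and would score lower:

    def count_valid_nested(records):
        """Deeply nested flow breaks; each extra nesting level raises the score."""
        total = 0
        for record in records:              # break in the linear flow
            if record is not None:          # nested flow break
                if record.get('valid'):     # nested even deeper
                    total += 1
        return total

    def count_valid_flat(records):
        """Guard clauses keep the flow flatter and the score lower."""
        total = 0
        for record in records:              # break in the linear flow
            if record is None:              # nested flow break
                continue
            if record.get('valid'):         # same nesting level, not deeper
                total += 1
        return total

Per the first rule, language shorthand such as a conditional expression (x if cond else y) would not by itself add to the score.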

Function treat_page has a Cognitive Complexity of 28 (exceeds 10 allowed). Consider refactoring.
Open

    def treat_page(self) -> None:
        """Treat a single page."""
        local_file_page = self.current_page
        file_on_commons = self.find_file_on_commons(local_file_page)

Severity: Minor
Found in scripts/nowcommons.py - About 3 hrs to fix

Function translate has a Cognitive Complexity of 28 (exceeds 10 allowed). Consider refactoring.
Open

def translate(code: str | pywikibot.site.BaseSite,
              xdict: str | Mapping[str, str],
              parameters: Mapping[str, int] | None = None,
              fallback: bool | Iterable[str] = False) -> str | None:
    """Return the most appropriate localization from a localization dict.
Severity: Minor
Found in pywikibot/i18n.py - About 3 hrs to fix

Function batched has a Cognitive Complexity of 28 (exceeds 10 allowed). Consider refactoring.
Open

    def batched(iterable, n: int, *,
                strict: bool = False) -> Generator[tuple, None, None]:
        """Batch data from the *iterable* into tuples of length *n*.

        .. note:: The last batch may be shorter than *n* if *strict* is
Severity: Minor
Found in pywikibot/backports.py - About 3 hrs to fix

Function translateAndCapitalizeNamespaces has a Cognitive Complexity of 28 (exceeds 10 allowed). Consider refactoring.
Open

    def translateAndCapitalizeNamespaces(self, text: str) -> str:
        """Use localized namespace names.

        .. versionchanged:: 7.4
           No longer expect a specific namespace alias for File:
Severity: Minor
Found in pywikibot/cosmetic_changes.py - About 3 hrs to fix

Function intersect_generators has a Cognitive Complexity of 28 (exceeds 10 allowed). Consider refactoring.
Open

def intersect_generators(*iterables, allow_duplicates: bool = False):
    """Generator of intersect iterables.

    Yield items only if they are yielded by all iterables. zip_longest
    is used to retrieve items from all iterables in parallel, so that
Severity: Minor
Found in pywikibot/tools/itertools.py - About 3 hrs to fix

TestFactoryGenerator has 37 functions (exceeds 30 allowed). Consider refactoring.
Open

class TestFactoryGenerator(DefaultSiteTestCase):

    """Test pagegenerators.GeneratorFactory."""

    def test_combined_generator(self):
Severity: Minor
Found in tests/pagegenerators_tests.py - About 3 hrs to fix

DataSite has 37 functions (exceeds 30 allowed). Consider refactoring.
Open

class DataSite(APISite):

    """Wikibase data capable site."""

    def __init__(self, *args, **kwargs) -> None:
Severity: Minor
Found in pywikibot/site/_datasite.py - About 3 hrs to fix

Function toJSON has a Cognitive Complexity of 27 (exceeds 10 allowed). Consider refactoring.
Open

    def toJSON(self, diffto: dict | None = None) -> dict:
        """Create JSON suitable for Wikibase API.

        When diffto is provided, JSON representing differences
        to the provided data is created.
Severity: Minor
Found in pywikibot/page/_collections.py - About 3 hrs to fix

Similar blocks of code found in 4 locations. Consider refactoring.
Open

def LonelyPagesPageGenerator(
    total: int | None = None,
    site: BaseSite | None = None
) -> Iterable[pywikibot.page.Page]:
    """Lonely page generator.
Severity: Major
Found in pywikibot/pagegenerators/_generators.py and 3 other locations - About 3 hrs to fix
pywikibot/pagegenerators/_generators.py on lines 519..530
pywikibot/pagegenerators/_generators.py on lines 533..544
pywikibot/pagegenerators/_generators.py on lines 617..628

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 71.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Refactorings

Further Reading
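
For instance, a .codeclimate.yml along the lines of the sketch below would raise the mass threshold for Python; the key layout follows the codeclimate-duplication documentation, and the value 60 is only an illustrative choice, not a recommendation for this repository:

    # .codeclimate.yml (sketch): raise the duplication mass threshold for Python
    version: "2"
    plugins:
      duplication:
        enabled: true
        config:
          languages:
            python:
              mass_threshold: 60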
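
Applied to the four generators flagged above, one possible refactoring is a single parameterized wrapper. This is only a sketch: the helper name _simple_querypage_generator is invented here, and the site methods it dispatches to (e.g. site.lonelypages) are assumed to accept a total argument, as the original wrappers suggest.

    from collections.abc import Iterable

    import pywikibot
    from pywikibot.site import BaseSite


    def _simple_querypage_generator(method: str,
                                    total: int | None = None,
                                    site: BaseSite | None = None
                                    ) -> Iterable[pywikibot.page.Page]:
        """Resolve the site and delegate to one of its query-page methods.

        The method name (e.g. 'lonelypages') is passed in, so the four
        thin wrappers no longer repeat the same body.
        """
        if site is None:
            site = pywikibot.Site()
        return getattr(site, method)(total=total)


    def LonelyPagesPageGenerator(total: int | None = None,
                                 site: BaseSite | None = None
                                 ) -> Iterable[pywikibot.page.Page]:
        """Lonely page generator."""
        return _simple_querypage_generator('lonelypages', total, site)

Whether the remaining per-generator docstrings justify keeping four public names is a design call for the maintainers; the sketch only removes the duplicated body.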

Similar blocks of code found in 4 locations. Consider refactoring.
Open

def WithoutInterwikiPageGenerator(
    total: int | None = None,
    site: BaseSite | None = None
) -> Iterable[pywikibot.page.Page]:
    """Page lacking interwikis generator.
Severity: Major
Found in pywikibot/pagegenerators/_generators.py and 3 other locations - About 3 hrs to fix
pywikibot/pagegenerators/_generators.py on lines 519..530
pywikibot/pagegenerators/_generators.py on lines 603..614
pywikibot/pagegenerators/_generators.py on lines 617..628

This issue has a mass of 71.

Similar blocks of code found in 4 locations. Consider refactoring.
Open

def UnusedFilesGenerator(
    total: int | None = None,
    site: BaseSite | None = None
) -> Iterable[pywikibot.page.FilePage]:
    """Unused files generator.
Severity: Major
Found in pywikibot/pagegenerators/_generators.py and 3 other locations - About 3 hrs to fix
pywikibot/pagegenerators/_generators.py on lines 533..544
pywikibot/pagegenerators/_generators.py on lines 603..614
pywikibot/pagegenerators/_generators.py on lines 617..628

This issue has a mass of 71.

Function generator has a Cognitive Complexity of 27 (exceeds 10 allowed). Consider refactoring.
Open

    def generator(self):
        """Submit request and iterate the response based on self.resultkey.

        Continues response as needed until limit (if any) is reached.

Severity: Minor
Found in pywikibot/data/api/_generators.py - About 3 hrs to fix

Similar blocks of code found in 4 locations. Consider refactoring.
Open

def UnwatchedPagesPageGenerator(
    total: int | None = None,
    site: BaseSite | None = None
) -> Iterable[pywikibot.page.Page]:
    """Unwatched page generator.
Severity: Major
Found in pywikibot/pagegenerators/_generators.py and 3 other locations - About 3 hrs to fix
pywikibot/pagegenerators/_generators.py on lines 519..530
pywikibot/pagegenerators/_generators.py on lines 533..544
pywikibot/pagegenerators/_generators.py on lines 603..614

This issue has a mass of 71.

File _wbtypes.py has 943 lines of code (exceeds 900 allowed). Consider refactoring.
Open

"""Wikibase data type classes."""
#
# (C) Pywikibot team, 2013-2024
#
# Distributed under the terms of the MIT license.
Severity: Major
Found in pywikibot/_wbtypes.py - About 3 hrs to fix

Claim has 36 functions (exceeds 30 allowed). Consider refactoring.
Open

class Claim(Property):

    """
    A Claim on a Wikibase entity.

Severity: Minor
Found in pywikibot/page/_wikibase.py - About 3 hrs to fix

Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_random_generator_ns(self):
        """Test random generator with namespace."""
        gf = pagegenerators.GeneratorFactory(site=self.site)
        gf.handle_arg('-ns:1')
        gf.handle_arg('-random:1')
Severity: Major
Found in tests/pagegenerators_tests.py and 2 other locations - About 3 hrs to fix
tests/pagegenerators_tests.py on lines 1127..1134
tests/pagegenerators_tests.py on lines 1214..1221

This issue has a mass of 70.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_randomredirect_generator_ns(self):
        """Test random generator with namespace."""
        gf = pagegenerators.GeneratorFactory(site=self.site)
        gf.handle_arg('-ns:1')
        gf.handle_arg('-randomredirect:1')
Severity: Major
Found in tests/pagegenerators_tests.py and 2 other locations - About 3 hrs to fix
tests/pagegenerators_tests.py on lines 1127..1134
tests/pagegenerators_tests.py on lines 1186..1193

This issue has a mass of 70.

Similar blocks of code found in 3 locations. Consider refactoring.
Open

    def test_recentchanges_ns(self):
        """Test recentchanges generator with namespace."""
        gf = pagegenerators.GeneratorFactory(site=self.site)
        gf.handle_arg('-ns:1')
        gf.handle_arg('-recentchanges:10')
Severity: Major
Found in tests/pagegenerators_tests.py and 2 other locations - About 3 hrs to fix
tests/pagegenerators_tests.py on lines 1186..1193
tests/pagegenerators_tests.py on lines 1214..1221

This issue has a mass of 70.

Function find_file_on_commons has a Cognitive Complexity of 26 (exceeds 10 allowed). Consider refactoring.
Open

    def find_file_on_commons(self, local_file_page):
        """Find filename on Commons."""
        for template_name, params in local_file_page.templatesWithParams():
            if template_name not in self.nc_templates:
                continue
Severity: Minor
Found in scripts/nowcommons.py - About 2 hrs to fix
