webcomics/dosage

View on GitHub

Showing 215 of 215 total issues

File comicfury.py has 1123 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
Severity: Major
Found in dosagelib/plugins/comicfury.py - About 2 days to fix

File old.py has 732 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
Severity: Major
Found in dosagelib/plugins/old.py - About 1 day to fix

File smackjeeves.py has 625 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
import re


Severity: Major
Found in dosagelib/plugins/smackjeeves.py - About 1 day to fix

File gocomics.py has 527 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
from ..scraper import _ParserScraper
Severity: Major
Found in dosagelib/plugins/gocomics.py - About 1 day to fix

File s.py has 514 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
Severity: Major
Found in dosagelib/plugins/s.py - About 1 day to fix

File scraper.py has 471 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
import os
Severity: Minor
Found in dosagelib/scraper.py - About 7 hrs to fix

File util.py has 391 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2019 Tobias Gruetzmacher
import html
Severity: Minor
Found in dosagelib/util.py - About 5 hrs to fix

File webtoons.py has 381 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2019-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
from ..scraper import _ParserScraper


Severity: Minor
Found in dosagelib/plugins/webtoons.py - About 5 hrs to fix

File c.py has 376 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
Severity: Minor
Found in dosagelib/plugins/c.py - About 5 hrs to fix

Function getScrapers has a Cognitive Complexity of 28 (exceeds 5 allowed). Consider refactoring.
Open

def getScrapers(comics, basepath=None, adult=True, multiple_allowed=False, listing=False):
    """Get scraper objects for the given comics."""
    if '@' in comics:
        # only scrapers whose directory already exists
        if len(comics) > 1:
Severity: Minor
Found in dosagelib/director.py - About 4 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

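These rules are easiest to see on a small example. The sketch below is hypothetical code, not taken from dosage; the per-line counts simply apply the three rules above, so treat the totals as approximate. Extracting the nested retry loop into a shallow helper is what brings the score down.

def save_with_retries(strip, retries):
    """Hypothetical helper: the nested retry loop moved out of the caller."""
    for attempt in range(retries):          # +1: break in linear flow
        try:
            strip.save()
            return
        except IOError:                     # +2: flow break, nested one level
            pass
    # rough cognitive complexity: 1 + 2 = 3


def fetch_all(strips, retries):
    for strip in strips:                    # +1: break in linear flow
        if strip.skipped:                   # +2: flow break, nested one level
            continue
        for attempt in range(retries):      # +2: flow break, nested one level
            try:
                strip.save()
                break
            except IOError:                 # +3: flow break, nested two levels
                pass
    # rough cognitive complexity: 1 + 2 + 2 + 3 = 8


def fetch_all_flat(strips, retries):
    for strip in strips:                    # +1
        if strip.skipped:                   # +2: one nested condition remains
            continue
        save_with_retries(strip, retries)   # deep nesting now lives in the helper
    # rough cognitive complexity: 1 + 2 = 3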

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def handle_url(self, url):
        """Parse one listing page."""
        data = self.get_url(url)

        for comicdiv in data.cssselect('ul.all-test li'):
Severity: Major
Found in scripts/creators.py and 1 other location - About 3 hrs to fix
scripts/comicskingdom.py on lines 21..30

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 105.

We set useful threshold defaults for the languages we support but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

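The two blocks flagged here (scripts/creators.py and scripts/comicskingdom.py) appear to differ only in the CSS selector they iterate over, at least in the lines shown. One common way to remove that kind of duplication is to hoist the shared page walk into a base class and leave only the varying selector in each script. The following is a sketch under that assumption: ComicListUpdater and handle_item are made-up names, while get_url, handle_url, and the two selectors come from the snippets.

class ComicListUpdater:
    """Hypothetical shared base for the two listing scripts."""

    # The only part that differs between the flagged blocks (assumed).
    item_selector = None

    def get_url(self, url):
        """Fetch and parse the listing page (implementation not shown here)."""
        raise NotImplementedError

    def handle_item(self, comicdiv):
        """Process one entry from the listing (site-specific)."""
        raise NotImplementedError

    def handle_url(self, url):
        """Parse one listing page."""
        data = self.get_url(url)
        for comicdiv in data.cssselect(self.item_selector):
            self.handle_item(comicdiv)


class CreatorsUpdater(ComicListUpdater):
    item_selector = 'ul.all-test li'


class ComicsKingdomUpdater(ComicListUpdater):
    item_selector = 'ul.comic-link-group li'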

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def handle_url(self, url):
        """Parse one listing page."""
        data = self.get_url(url)

        for comicdiv in data.cssselect('ul.comic-link-group li'):
Severity: Major
Found in scripts/comicskingdom.py and 1 other location - About 3 hrs to fix
scripts/creators.py on lines 22..31

This issue has a mass of 105.

File a.py has 326 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
# Copyright (C) 2019-2020 Daniel Ring
Severity: Minor
Found in dosagelib/plugins/a.py - About 3 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class OkCancel(_BasicScraper):
    url = 'http://okcancel.com/'
    rurl = escape(url)
    stripUrl = url + 'comic/%s.html'
    firstStripUrl = stripUrl % '1'
Severity: Major
Found in dosagelib/plugins/o.py and 1 other location - About 3 hrs to fix
dosagelib/plugins/c.py on lines 337..345

This issue has a mass of 99.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class CompanyY(_BasicScraper):
    url = 'http://company-y.com/'
    rurl = escape(url)
    stripUrl = url + '%s/'
    firstStripUrl = stripUrl % '2009/08/14/coming-soon'
Severity: Major
Found in dosagelib/plugins/c.py and 1 other location - About 3 hrs to fix
dosagelib/plugins/o.py on lines 80..88

This issue has a mass of 99.
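
OkCancel and CompanyY (and the similar pairs below) are scraper classes that differ only in a few string attributes, at least in the lines shown. When the only variation is data, one possible cure is to build the classes from that data. The sketch below assumes the attributes cut off in the snippets vary in the same way; make_basic_scraper is a made-up helper, and the _BasicScraper stub merely stands in for dosage's real base class so the example runs on its own.

from re import escape


class _BasicScraper:
    """Stand-in for dosagelib's real base class, only to keep this sketch runnable."""


def make_basic_scraper(name, url, strip_pattern, first_strip):
    """Build a scraper class from the handful of values that actually vary."""
    strip_url = url + strip_pattern
    attrs = {
        'url': url,
        'rurl': escape(url),
        'stripUrl': strip_url,
        'firstStripUrl': strip_url % first_strip,
        # imageSearch, prevSearch, etc. would be passed in the same way.
    }
    return type(name, (_BasicScraper,), attrs)


OkCancel = make_basic_scraper('OkCancel', 'http://okcancel.com/',
                              'comic/%s.html', '1')
CompanyY = make_basic_scraper('CompanyY', 'http://company-y.com/',
                              '%s/', '2009/08/14/coming-soon')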

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class Newshounds(_ParserScraper):
    stripUrl = 'http://www.newshounds.com/%s.html'
    url = stripUrl % 'nh2/20140929'
    firstStripUrl = stripUrl % 'nh1/19971101'
    imageSearch = '//img[@class="ksc"]'
Severity: Major
Found in dosagelib/plugins/n.py and 1 other location - About 3 hrs to fix
dosagelib/plugins/f.py on lines 148..160

This issue has a mass of 98.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

class FoxTails(_ParserScraper):
    stripUrl = 'http://foxtails.magickitsune.com/strips/%s.html'
    url = stripUrl % 'current'
    firstStripUrl = stripUrl % '20041024'
    imageSearch = '//img[contains(@src, "img/2")]'
Severity: Major
Found in dosagelib/plugins/f.py and 1 other location - About 3 hrs to fix
dosagelib/plugins/n.py on lines 63..75

This issue has a mass of 98.

Cyclomatic complexity is too high in method _getStrips. (15)
Open

    def _getStrips(self, scraperobj):
        """Get all strips from a scraper."""
        if self.options.numstrips:
            numstrips = self.options.numstrips
        elif self.options.cont or self.options.all:
Severity: Minor
Found in dosagelib/director.py by radon

Cyclomatic Complexity

Cyclomatic Complexity corresponds to the number of decisions a block of code contains plus 1. This number (also called McCabe number) is equal to the number of linearly independent paths through the code. This number can be used as a guide when testing conditional logic in blocks.

Radon analyzes the AST tree of a Python program to compute Cyclomatic Complexity. Statements have the following effects on Cyclomatic Complexity:

Construct          Effect on CC   Reasoning
if                 +1             An if statement is a single decision.
elif               +1             The elif statement adds another decision.
else               +0             The else statement does not cause a new decision. The decision is at the if.
for                +1             There is a decision at the start of the loop.
while              +1             There is a decision at the while statement.
except             +1             Each except branch adds a new conditional path of execution.
finally            +0             The finally block is unconditionally executed.
with               +1             The with statement roughly corresponds to a try/except block (see PEP 343 for details).
assert             +1             The assert statement internally roughly equals a conditional statement.
Comprehension      +1             A list/set/dict comprehension or generator expression is equivalent to a for loop.
Boolean Operator   +1             Every boolean operator (and, or) adds a decision point.

Source: http://radon.readthedocs.org/en/latest/intro.html
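
As a concrete illustration, the made-up function below uses only constructs from the table, so its McCabe number can be read off by hand; the optional cross-check assumes radon is installed (it is the tool that produced this issue) and uses its cc_visit and cc_rank helpers.

from radon.complexity import cc_visit, cc_rank   # pip install radon

SRC = '''
def pick(items, key=None, limit=0):
    out = []                              # every block starts at CC 1
    for item in items:                    # for             -> +1
        if key and key(item):             # if +1, and +1   -> +2
            out.append(item)
        elif limit and len(out) >= limit: # elif +1, and +1 -> +2
            break
    return out                            # total CC = 6
'''

for block in cc_visit(SRC):
    print(block.name, block.complexity, cc_rank(block.complexity))
    # expected to print something like: pick 6 B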

Scraper has 27 functions (exceeds 20 allowed). Consider refactoring.
Open

class Scraper(object):
    '''Base class for all comic scraper, but without a specific scrape
    implementation.'''

    # The URL for the comic strip
Severity: Minor
Found in dosagelib/scraper.py - About 3 hrs to fix

File events.py has 293 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-
# Copyright (C) 2004-2008 Tristan Seligmann and Jonathan Jacobs
# Copyright (C) 2012-2014 Bastian Kleineidam
# Copyright (C) 2015-2020 Tobias Gruetzmacher
import os
Severity: Minor
Found in dosagelib/events.py - About 3 hrs to fix