Showing 37 of 37 total issues

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        host_iface = ip.IPv4Interface(str(list(netobj.hosts())[1]) + net_pfx)
Severity: Minor
Found in node_tools/ctlr_funcs.py and 1 other location - About 45 mins to fix
node_tools/ctlr_funcs.py on lines 105..105

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 35.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.

Refactorings

Further Reading
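
One way to satisfy DRY here is to extract the repeated interface construction into a helper shared by both call sites. A minimal sketch, assuming ip is the ipaddress module and net_pfx is a prefix string such as '/30'; the helper name is hypothetical and not part of ctlr_funcs.py:

    import ipaddress as ip

    def nth_host_iface(netobj, net_pfx, index=1):
        """Hypothetical helper: build the IPv4Interface for the nth usable host in netobj."""
        return ip.IPv4Interface(str(list(netobj.hosts())[index]) + net_pfx)

    # both duplicated lines would then become:
    # host_iface = nth_host_iface(netobj, net_pfx)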

Avoid deeply nested control flow statements.
Open

                            if 'result' in reply[0]:
                                st.fpnState['wdg_ref'] = True
                            logger.error('HEALTH: network is unreachable!!')
Severity: Major
Found in node_tools/nodestate.py - About 45 mins to fix
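
The usual remedy for nesting like this is to invert the innermost checks into guard clauses, or hoist them into a small helper, so the happy path reads straight down. A minimal sketch of the pattern only; the helper name and the surrounding control flow are assumptions, since just three lines of the real nodestate.py block are visible above:

    def _mark_wdg_ref(reply, state, logger):
        """Hypothetical helper: flatten the nested health check with a guard clause."""
        if not reply or 'result' not in reply[0]:
            logger.error('HEALTH: network is unreachable!!')
            return False
        state['wdg_ref'] = True
        return True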

Function load_id_trie has 6 arguments (exceeds 4 allowed). Consider refactoring.
Open

def load_id_trie(net_trie, id_trie, nw_id, node_id, needs=[], nw=False):
Severity: Minor
Found in node_tools/trie_funcs.py - About 45 mins to fix
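
One way to get back under the argument limit is to group the related identifiers into a parameter object; while doing so it is also worth replacing the mutable default needs=[] with None or a default_factory, since a shared list default persists across calls. A minimal sketch using a dataclass; the class name and the exact grouping are assumptions, not part of trie_funcs.py:

    from dataclasses import dataclass, field

    @dataclass
    class TrieTarget:
        """Hypothetical parameter object for the values currently passed separately."""
        nw_id: str
        node_id: str
        needs: list = field(default_factory=list)
        nw: bool = False

    def load_id_trie(net_trie, id_trie, target):
        ...  # body unchanged, reading target.nw_id, target.node_id, target.needs, target.nw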

Function drain_msg_queue has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

def drain_msg_queue(reg_q, pub_q=None, tmp_q=None, addr=None, method='handle_node'):
Severity: Minor
Found in node_tools/network_funcs.py - About 35 mins to fix
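
Here the optional queues can travel together in a single mapping (or a small namedtuple), which brings the signature back to four parameters. A minimal sketch; the mapping keys are assumptions and the real body is not shown above:

    def drain_msg_queue(reg_q, out_queues=None, addr=None, method='handle_node'):
        """Hypothetical variant: optional queues travel together in one mapping."""
        out_queues = out_queues or {}
        pub_q = out_queues.get('pub')
        tmp_q = out_queues.get('tmp')
        ...  # remainder of the body unchanged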

Function update_id_trie has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

def update_id_trie(trie, nw_id, node_id, needs=[], nw=False):
Severity: Minor
Found in node_tools/trie_funcs.py - About 35 mins to fix

Function cleanup_state_tries has 5 arguments (exceeds 4 allowed). Consider refactoring.
Open

def cleanup_state_tries(net_trie, id_trie, nw_id, node_id, mbr_only=False):
Severity: Minor
Found in node_tools/trie_funcs.py - About 35 mins to fix

Function load_cache_by_type has a Cognitive Complexity of 20 (exceeds 18 allowed). Consider refactoring.
Open

def load_cache_by_type(cache, data, key_str):
    """Load or update cache by key type string (uses find_keys)."""
    from itertools import zip_longest
    key_list = find_keys(cache, key_str)
    if not key_list:
Severity: Minor
Found in node_tools/cache_funcs.py - About 35 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"

Further reading
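
For load_cache_by_type the typical fix is to pull each branch of the load/update logic into its own helper so the top-level flow stays flat and the nesting penalty disappears. A minimal sketch of that shape; the helper names and the create/update split are assumptions, since only the opening lines of the real function appear above:

    def load_cache_by_type(cache, data, key_str):
        """Load or update cache by key type string (uses find_keys)."""
        key_list = find_keys(cache, key_str)
        if not key_list:
            _create_cache_entries(cache, data, key_str)   # hypothetical helper
        else:
            _update_cache_entries(cache, data, key_list)  # hypothetical helper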

Similar blocks of code found in 2 locations. Consider refactoring.
Open

        for key in key_list:
            logger.debug('Deleting entry for: {}'.format(key))
            with cache.transact():
                del cache[key]
Severity: Minor
Found in node_tools/cache_funcs.py and 1 other location - About 35 mins to fix
node_tools/cache_funcs.py on lines 246..249

This issue has a mass of 33.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

                elif not item:
                    logger.debug('Removing cache entry for key: {}'.format(key))
                    with cache.transact():
                        del cache[key]
Severity: Minor
Found in node_tools/cache_funcs.py and 1 other location - About 35 mins to fix
node_tools/cache_funcs.py on lines 43..46

This issue has a mass of 33.
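
This block and the one flagged above (cache_funcs.py lines 43..46 and 246..249) both log and then delete a key inside a transaction, so a single shared helper removes the duplication at both sites. A minimal sketch, with a hypothetical name and log wording:

    def _delete_cache_entry(cache, key, reason='Deleting entry'):
        """Hypothetical helper shared by both deletion sites in cache_funcs.py."""
        logger.debug('{} for key: {}'.format(reason, key))
        with cache.transact():
            del cache[key]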

Similar blocks of code found in 2 locations. Consider refactoring.
Open

                for node in trie.suffixes(exit_net)[1:]:
                    if node_id != node:
                        exit_node = node
Severity: Minor
Found in node_tools/trie_funcs.py and 1 other location - About 35 mins to fix
node_tools/trie_funcs.py on lines 269..271

This issue has a mass of 33.

Function get_peer_status has a Cognitive Complexity of 20 (exceeds 18 allowed). Consider refactoring.
Open

def get_peer_status(cache):
    """
    Get status data for 'peer' endpoint from cache, return a
    list of dictionaries.
    """
Severity: Minor
Found in node_tools/cache_funcs.py - About 35 mins to fix
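
As with load_cache_by_type, one option is to move the per-peer dictionary construction into a helper and let get_peer_status itself be a flat comprehension. A minimal sketch; _peer_to_dict is a hypothetical helper, and only the docstring of the real function is shown above:

    def get_peer_status(cache):
        """
        Get status data for 'peer' endpoint from cache, return a
        list of dictionaries.
        """
        return [_peer_to_dict(cache[key]) for key in find_keys(cache, 'peer')]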

Similar blocks of code found in 2 locations. Consider refactoring.
Open

                for node in trie.suffixes(src_net)[1:]:
                    if node_id != node:
                        src_node = node
Severity: Minor
Found in node_tools/trie_funcs.py and 1 other location - About 35 mins to fix
node_tools/trie_funcs.py on lines 274..276

This issue has a mass of 33.
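
The exit_net and src_net loops above differ only in the prefix they scan and the variable they assign, so one lookup helper covers both. Note the sketch returns the first non-matching suffix, whereas the loops above keep the last assignment; with only two members per network the result is the same, but verify against the real data before swapping it in. The helper name is an assumption:

    def find_other_node(trie, prefix, node_id):
        """Hypothetical helper covering both suffix scans flagged above."""
        for node in trie.suffixes(prefix)[1:]:
            if node != node_id:
                return node
        return None

    # exit_node = find_other_node(trie, exit_net, node_id)
    # src_node = find_other_node(trie, src_net, node_id)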

Avoid too many return statements within this function.
Open

        return False
Severity: Major
Found in node_tools/sched_funcs.py - About 30 mins to fix

Avoid too many return statements within this function.
Open

                        return True
Severity: Major
Found in node_tools/sched_funcs.py - About 30 mins to fix

Avoid too many return statements within this function.
Open

                    return True
Severity: Major
Found in node_tools/sched_funcs.py - About 30 mins to fix
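
All three warnings point at the same pattern in sched_funcs.py: return statements scattered through nested branches. Where it does not obscure the logic, accumulating the outcome in one variable and returning once at the end quiets the check. A minimal sketch of the pattern only; it is not the actual sched_funcs.py code, which is not shown here:

    def _evaluate(result):
        """Hypothetical illustration: one exit point instead of several."""
        status = False
        if isinstance(result, bool):
            status = result
        elif isinstance(result, (list, tuple)) and result:
            status = bool(result[0])
        return status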

Function get_net_cmds has a Cognitive Complexity of 19 (exceeds 18 allowed). Consider refactoring.
Open

def get_net_cmds(bin_dir, iface=None, state=False):
    import os

    res = None
    if not os.path.isdir(bin_dir):
Severity: Minor
Found in node_tools/network_funcs.py - About 25 mins to fix

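Given the visible shape (a res accumulator plus a directory check), validating the inputs with early returns and selecting the command from a small mapping usually brings the complexity down. A minimal sketch; the command names and the single-path return are assumptions about what network_funcs.py actually builds:

    import os

    def get_net_cmds(bin_dir, iface=None, state=False):
        if not os.path.isdir(bin_dir):
            return None  # validate early instead of nesting further checks
        if iface is None:
            return None
        cmd_name = 'net-up' if state else 'net-down'  # hypothetical script names
        return os.path.join(bin_dir, cmd_name)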

Function echo_client has a Cognitive Complexity of 19 (exceeds 18 allowed). Consider refactoring.
Open

def echo_client(fpn_id, addr, send_cfg=False):
    import json
    from node_tools import state_data as st
    from node_tools.msg_queues import make_version_msg
    from node_tools.node_funcs import do_shutdown
Severity: Minor
Found in node_tools/network_funcs.py - About 25 mins to fix
