marcus67/little_brother

View on GitHub

Showing 152 of 152 total issues

Function get_process_statistics has a Cognitive Complexity of 51 (exceeds 5 allowed). Consider refactoring.
Open

def get_process_statistics(
        p_user_map,
        p_process_infos,
        p_reference_time,
        p_max_lookback_in_days,
Severity: Minor
Found in little_brother/process_statistics.py - About 7 hrs to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

  • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
  • Code is considered more complex for each "break in the linear flow of the code"
  • Code is considered more complex when "flow breaking structures are nested"

Further reading
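As a generic illustration of these rules (not code from this repository), the first function below accumulates complexity through nested flow breaks, while the flattened equivalent uses guard clauses and a small extracted helper to keep the flow linear; the increment comments are approximate:

    # Nested version: every flow-breaking structure adds complexity, and nesting adds more.
    def summarize_active_users(users):
        result = []
        for user in users:                              # approx. +1
            if user is not None:                        # approx. +2 (nested)
                if user.get("active"):                  # approx. +3 (nested deeper)
                    for session in user["sessions"]:    # approx. +4
                        if session["duration"] > 0:     # approx. +5
                            result.append((user["name"], session["duration"]))
        return result


    # Flattened version: guard clauses and an extracted helper keep the nesting shallow.
    def _positive_durations(user):
        if not user or not user.get("active"):
            return []
        return [s["duration"] for s in user["sessions"] if s["duration"] > 0]


    def summarize_active_users_flat(users):
        return [(user["name"], duration)
                for user in users
                for duration in _positive_durations(user)]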

File app_control.py has 437 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-

# Copyright (C) 2019-2021  Marcus Rickert
#
# See https://github.com/marcus67/little_brother
Severity: Minor
Found in little_brother/app_control.py - About 6 hrs to fix

Function prepare_services has a Cognitive Complexity of 39 (exceeds 5 allowed). Consider refactoring.
Open

    def prepare_services(self, p_full_startup=True):

        super().prepare_services(p_full_startup=p_full_startup)

        # TODO: Activate in memory sqlite backend for clients
Severity: Minor
Found in little_brother/app.py - About 5 hrs to fix

Function scan_processes has a Cognitive Complexity of 38 (exceeds 5 allowed). Consider refactoring.
Open

    def scan_processes(self, p_session_context, p_reference_time, p_server_group, p_login_mapping, p_host_name,
                       p_process_regex_map, p_prohibited_process_regex_map):

        events = []

Severity: Minor
Found in little_brother/client_device_handler.py - About 5 hrs to fix

AppControl has 38 functions (exceeds 20 allowed). Consider refactoring.
Open

class AppControl(PersistenceDependencyInjectionMixIn):

    def __init__(self, p_config,
                 p_debug_mode,
                 p_process_handlers=None,
Severity: Minor
Found in little_brother/app_control.py - About 5 hrs to fix

File app.py has 366 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-

# Copyright (C) 2019-2022  Marcus Rickert
#
# See https://github.com/marcus67/little_brother
Severity: Minor
Found in little_brother/app.py - About 4 hrs to fix

File process_handler_manager.py has 356 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-

# Copyright (C) 2019-2021  Marcus Rickert
#
# See https://github.com/marcus67/little_brother
Severity: Minor
Found in little_brother/process_handler_manager.py - About 4 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def save_for_user2device(self, p_form, p_session_context, p_user2device):

        changed = False
        a_persistent_user_2_device = self.user_2_device_entity_manager.get_by_id(
            p_session_context=p_session_context, p_id=p_user2device.id)
Severity: Major
Found in little_brother/web/users_view_handler.py and 1 other location - About 4 hrs to fix
little_brother/web/users_view_handler.py on lines 234..244

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency both to continue to replicate and to diverge (leaving bugs as two similar implementations differ in subtle ways).
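
As a small, generic illustration (not code from this repository), the two functions below write out the same normalization rule twice; extracting a shared helper gives that rule a single, authoritative home:

    # Before: the same normalization and validation rule is duplicated.
    def clean_username(p_raw):
        value = p_raw.strip().lower()
        if not value:
            raise ValueError("username must not be empty")
        return value


    def clean_hostname(p_raw):
        value = p_raw.strip().lower()
        if not value:
            raise ValueError("hostname must not be empty")
        return value


    # After: the rule lives in one place; the public functions become thin wrappers.
    def _clean_identifier(p_raw, p_label):
        value = p_raw.strip().lower()
        if not value:
            raise ValueError(f"{p_label} must not be empty")
        return value


    def clean_username_dry(p_raw):
        return _clean_identifier(p_raw, "username")


    def clean_hostname_dry(p_raw):
        return _clean_identifier(p_raw, "hostname")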

Tuning

This issue has a mass of 76.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
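
As a sketch, a .codeclimate.yml along the following lines raises the mass threshold for Python. The exact keys assume the version "2" (plugins) configuration format, so check the codeclimate-duplication documentation for the format your project actually uses:

    # .codeclimate.yml (sketch; assumes the version "2" configuration format)
    version: "2"
    plugins:
      duplication:
        enabled: true
        config:
          languages:
            python:
              # Only report duplicated blocks with a mass above this threshold.
              mass_threshold: 100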

Refactorings

Further Reading

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def save_for_rule_set(self, p_form, p_session_context, p_rule_set):

        changed = False
        a_persistent_rule_set = self.rule_set_entity_manager.get_by_id(
            p_session_context=p_session_context, p_id=p_rule_set.id)
Severity: Major
Found in little_brother/web/users_view_handler.py and 1 other location - About 4 hrs to fix
little_brother/web/users_view_handler.py on lines 221..232

Duplicated Code

This issue has a mass of 76.

Function retrieve_user_mappings has a Cognitive Complexity of 27 (exceeds 5 allowed). Consider refactoring.
Open

    def retrieve_user_mappings(self, p_session_context: SessionContext):

        if len(self._usernames_not_found) > 0:
            usernames_found = []

Severity: Minor
Found in little_brother/user_manager.py - About 3 hrs to fix

Function scan_processes has a Cognitive Complexity of 26 (exceeds 5 allowed). Consider refactoring.
Open

    def scan_processes(self, p_session_context, p_reference_time, p_server_group, p_login_mapping, p_host_name,
                       p_process_regex_map, p_prohibited_process_regex_map):

        current_processes = {}
        events = []
Severity: Minor
Found in little_brother/client_process_handler.py - About 3 hrs to fix

File rule_handler.py has 312 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-

# Copyright (C) 2019  Marcus Rickert
#
# See https://github.com/marcus67/little_brother
Severity: Minor
Found in little_brother/rule_handler.py - About 3 hrs to fix

ProcessHandlerManager has 28 functions (exceeds 20 allowed). Consider refactoring.
Open

class ProcessHandlerManager(PersistenceDependencyInjectionMixIn):

    def __init__(self,
                 p_config,
                 p_is_master,
Severity: Minor
Found in little_brother/process_handler_manager.py - About 3 hrs to fix

File process_statistics.py has 297 lines of code (exceeds 250 allowed). Consider refactoring.
Open

# -*- coding: utf-8 -*-

# Copyright (C) 2019  Marcus Rickert
#
# See https://github.com/marcus67/little_brother
Severity: Minor
Found in little_brother/process_statistics.py - About 3 hrs to fix

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def get_admin_session(self):
        if self._admin_session is None:
            fmt = "Open database for administrative access"
            self._logger.info(fmt)
            self._admin_session = sqlalchemy.orm.sessionmaker(bind=self._admin_engine)()
Severity: Major
Found in little_brother/persistence/persistence.py and 1 other location - About 2 hrs to fix
little_brother/persistence/persistence.py on lines 178..183

Duplicated Code

This issue has a mass of 60.

Similar blocks of code found in 2 locations. Consider refactoring.
Open

    def get_create_table_session(self):
        if self._create_table_session is None:
            fmt = "Open database for CREATE TABLE"
            self._logger.info(fmt)
            self._create_table_session = sqlalchemy.orm.sessionmaker(bind=self._create_table_engine)()
Severity: Major
Found in little_brother/persistence/persistence.py and 1 other location - About 2 hrs to fix
little_brother/persistence/persistence.py on lines 171..176

Duplicated Code

This issue has a mass of 60.
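
The two getters above differ only in the engine they bind to, the attribute they cache, and the log message, so one way to remove the duplication is a shared lazy-initialization helper. The sketch below is hypothetical: the class name, constructor, and helper are assumptions, and only the bodies of the two getters are taken from the snippets in this report.

    import sqlalchemy.orm


    class Persistence:
        # Hypothetical skeleton: only the attributes needed for the illustration.
        def __init__(self, p_admin_engine, p_create_table_engine, p_logger):
            self._admin_engine = p_admin_engine
            self._create_table_engine = p_create_table_engine
            self._logger = p_logger
            self._admin_session = None
            self._create_table_session = None

        def _get_or_create_session(self, p_attribute_name, p_engine, p_log_message):
            # Lazily create and cache a session bound to the given engine.
            session = getattr(self, p_attribute_name)
            if session is None:
                self._logger.info(p_log_message)
                session = sqlalchemy.orm.sessionmaker(bind=p_engine)()
                setattr(self, p_attribute_name, session)
            return session

        def get_admin_session(self):
            return self._get_or_create_session(
                p_attribute_name="_admin_session", p_engine=self._admin_engine,
                p_log_message="Open database for administrative access")

        def get_create_table_session(self):
            return self._get_or_create_session(
                p_attribute_name="_create_table_session", p_engine=self._create_table_engine,
                p_log_message="Open database for CREATE TABLE")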

Function set_metrics has a Cognitive Complexity of 20 (exceeds 5 allowed). Consider refactoring.
Open

    def set_metrics(self):

        with SessionContext(p_persistence=self.persistence) as session_context:
            if self.prometheus_client is not None:
                self.prometheus_client.set_uptime(p_hostname="master", p_uptime=time.time() - self._start_time)
Severity: Minor
Found in little_brother/app_control.py - About 2 hrs to fix

Function handle_rule_result_info has a Cognitive Complexity of 20 (exceeds 5 allowed). Consider refactoring.
Open

    def handle_rule_result_info(self, p_rule_result_info, p_stat_info, p_user):

        if p_rule_result_info.activity_allowed():
            self.check_issue_logout_warning(p_username=p_user.username, p_rule_result_info=p_rule_result_info)

Severity: Minor
Found in little_brother/process_handler_manager.py - About 2 hrs to fix

Function process_rule_sets_for_all_users has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.
Open

    def process_rule_sets_for_all_users(self, p_reference_time):

        fmt = "Processing rules for all users START..."
        self._logger.debug(fmt)

Severity: Minor
Found in little_brother/app_control.py - About 2 hrs to fix

Function migrate_ruleset_configs has a Cognitive Complexity of 19 (exceeds 5 allowed). Consider refactoring.
Open

    def migrate_ruleset_configs(self, p_ruleset_configs):

        session = self._persistence.get_session()

        for username, configs in p_ruleset_configs.items():
Severity: Minor
Found in little_brother/db_migrations.py - About 2 hrs to fix
