linkedin/kafka-tools


Showing 157 of 157 total issues

Avoid deeply nested control flow statements.

                        if proposed_broker not in partition.replicas:
                            newreplica = proposed_broker
                            break
                        attempts += 1
Severity: Major
Found in kafka/tools/assigner/actions/remove.py - About 45 mins to fix
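One way to remove the nesting is to move the broker search into a small helper built on a generator and next(); this is only a sketch, and find_new_replica, candidates, and max_attempts are illustrative names rather than code from remove.py:

```python
def find_new_replica(candidates, current_replicas, max_attempts=100):
    """Return the first candidate broker not already in the replica list.

    next() with a default replaces the nested if/break/attempts pattern,
    returning None when no eligible broker is found within max_attempts.
    """
    eligible = (b for b in candidates[:max_attempts] if b not in current_replicas)
    return next(eligible, None)


# Brokers 1 and 2 already hold the partition, so broker 3 is selected.
print(find_new_replica([1, 2, 3, 4], {1, 2}))  # prints 3
```

The call site then keeps a single level of control flow: it calls the helper and handles the None case.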

Similar blocks of code found in 2 locations. Consider refactoring.

def error_long(err_num):
    if err_num in errors:
        return errors[err_num]['long']
    else:
        return "This is an unknown error code"
Severity: Minor
Found in kafka/tools/protocol/errors.py - About 45 mins to fix
kafka/tools/protocol/errors.py on lines 158..162
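The two near-identical lookup functions can share a single helper; note that the errors table entries and the error_short twin below are assumed for illustration (the report only shows error_long):

```python
# Illustrative subset of the error table in kafka/tools/protocol/errors.py
errors = {
    0: {'short': 'NONE', 'long': 'No error'},
    1: {'short': 'OFFSET_OUT_OF_RANGE',
        'long': 'The requested offset is outside the range maintained by the server'},
}


def _error_field(err_num, field, default):
    # The single authoritative lookup shared by both public functions
    return errors[err_num][field] if err_num in errors else default


def error_long(err_num):
    return _error_field(err_num, 'long', "This is an unknown error code")


def error_short(err_num):
    return _error_field(err_num, 'short', "UNKNOWN")
```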

Duplicated Code

Duplicated code can lead to software that is hard to understand and difficult to change. The Don't Repeat Yourself (DRY) principle states:

Every piece of knowledge must have a single, unambiguous, authoritative representation within a system.

When you violate DRY, bugs and maintenance problems are sure to follow. Duplicated code has a tendency to both continue to replicate and also to diverge (leaving bugs as two similar implementations differ in subtle ways).

Tuning

This issue has a mass of 35.

We set useful threshold defaults for the languages we support, but you may want to adjust these settings based on your project guidelines.

The threshold configuration represents the minimum mass a code block must have to be analyzed for duplication. The lower the threshold, the more fine-grained the comparison.

If the engine is too easily reporting duplication, try raising the threshold. If you suspect that the engine isn't catching enough duplication, try lowering the threshold. The best setting tends to differ from language to language.

See codeclimate-duplication's documentation for more information about tuning the mass threshold in your .codeclimate.yml.
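For example, raising the Python mass threshold above 35 would suppress every duplication issue listed on this page. A minimal .codeclimate.yml along these lines should work, though the exact schema depends on your Code Climate configuration version, so check the codeclimate-duplication documentation:

```yaml
version: "2"
plugins:
  duplication:
    enabled: true
    config:
      languages:
        python:
          mass_threshold: 40
```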


Similar blocks of code found in 2 locations. Consider refactoring.

    schema = [
        {'name': 'group_id', 'type': 'string'},
        {'name': 'generation_id', 'type': 'int32'},
        {'name': 'member_id', 'type': 'string'},
        {'name': 'member_assignments', 'type': 'bytes'},
Severity: Minor
Found in kafka/tools/protocol/requests/sync_group_v0.py and 1 other location - About 40 mins to fix
kafka/tools/protocol/responses/group_coordinator_v0.py on lines 21..26

This issue has a mass of 34; see the Duplicated Code and Tuning notes under the first duplication issue above.

Similar blocks of code found in 2 locations. Consider refactoring.

    schema = [
        {'name': 'group_id', 'type': 'string'},
        {'name': 'topics',
         'type': 'array',
         'item_type': [
Severity: Minor
Found in kafka/tools/protocol/requests/offset_fetch_v0.py and 1 other location - About 40 mins to fix
kafka/tools/protocol/requests/alter_replica_log_dirs_v0.py on lines 33..39

This issue has a mass of 34; see the Duplicated Code and Tuning notes under the first duplication issue above.

Similar blocks of code found in 2 locations. Consider refactoring.

class GroupCoordinatorV0Response(BaseResponse):
    schema = [
        {'name': 'error', 'type': 'int16'},
        {'name': 'node_id', 'type': 'int32'},
        {'name': 'host', 'type': 'string'},
Severity: Minor
Found in kafka/tools/protocol/responses/group_coordinator_v0.py and 1 other location - About 40 mins to fix
kafka/tools/protocol/requests/sync_group_v0.py on lines 35..39

This issue has a mass of 34; see the Duplicated Code and Tuning notes under the first duplication issue above.

Similar blocks of code found in 2 locations. Consider refactoring.

     'item_type': [
         {'name': 'log_dir', 'type': 'string'},
         {'name': 'topics',
          'type': 'array',
          'item_type': [
Severity: Minor
Found in kafka/tools/protocol/requests/alter_replica_log_dirs_v0.py and 1 other location - About 40 mins to fix
kafka/tools/protocol/requests/offset_fetch_v0.py on lines 45..53

This issue has a mass of 34; see the Duplicated Code and Tuning notes under the first duplication issue above.

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def __init__(self, hostname, id=0, port=9092, sock=None, configuration=None):
Severity: Minor
Found in kafka/tools/models/broker.py - About 35 mins to fix
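A common fix for an over-long signature is to fold related parameters into a single value object; in this sketch BrokerEndpoint is an illustrative name, not a class from kafka-tools:

```python
from dataclasses import dataclass


@dataclass
class BrokerEndpoint:
    """Groups the connection-related arguments into one parameter."""
    hostname: str
    port: int = 9092


class Broker:
    # Four parameters besides self, back under the checker's limit
    def __init__(self, endpoint, id=0, sock=None, configuration=None):
        self.endpoint = endpoint
        self.id = id
        self.sock = sock
        self.configuration = configuration


broker = Broker(BrokerEndpoint('kafka01.example.com'))
print(broker.endpoint.port)  # prints 9092
```

The same grouping idea applies to the other 5-argument functions flagged below.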

Function run_preferred_replica_elections has 5 arguments (exceeds 4 allowed). Consider refactoring.

def run_preferred_replica_elections(batches, args, tools_path, plugins, dry_run):
Severity: Minor
Found in kafka/tools/assigner/__main__.py - About 35 mins to fix

Function __init__ has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def __init__(self, name, client_id=None, client_host=None, metadata=None, assignment=None):
Severity: Minor
Found in kafka/tools/models/group.py - About 35 mins to fix

Function add_member has 5 arguments (exceeds 4 allowed). Consider refactoring.

    def add_member(self, name, client_id=None, client_host=None, metadata=None, assignment=None):
Severity: Minor
Found in kafka/tools/models/group.py - About 35 mins to fix

Function _parse_command has 5 arguments (exceeds 4 allowed). Consider refactoring.

def _parse_command(broker, request_classes, request_cmds, cmd, cmd_args):
Severity: Minor
Found in kafka/tools/protocol/__main__.py - About 35 mins to fix

Function clone has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def clone(self):
        newcluster = Cluster()
        newcluster.retention = self.retention

        # We're not going to clone in the subclasses because we need to map partitions between topics and brokers
Severity: Minor
Found in kafka/tools/models/cluster.py - About 35 mins to fix

Cognitive Complexity

Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

A method's cognitive complexity is based on a few simple rules:

• Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
• Code is considered more complex for each "break in the linear flow of the code"
• Code is considered more complex when "flow breaking structures are nested"
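The nesting rule is usually what pushes a function over the threshold. In this sketch (not code from kafka-tools) both versions compute the same result, but the flat one merges its conditions so nothing sits more than one level deep:

```python
def first_even_square_over(nums, limit):
    # Nested form: the inner if is two levels deep, so it costs more
    # under the nesting rule than the loop or the outer if.
    for n in nums:
        if n % 2 == 0:
            if n * n > limit:
                return n * n
    return None


def first_even_square_over_flat(nums, limit):
    # Flat form: `and` collapses the conditions without adding a nesting
    # level, which lowers the cognitive complexity score.
    for n in nums:
        if n % 2 == 0 and n * n > limit:
            return n * n
    return None


print(first_even_square_over_flat([1, 2, 3, 4], 5))  # prints 16
```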


Function _send_list_offsets_to_brokers has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def _send_list_offsets_to_brokers(self, request_values):
        """
        Given a mapping of broker IDs to values for ListOffset requests, send the requests to all the brokers and
        collate the responses into a mapping of topic names to TopicOffsets instances
Severity: Minor
Found in kafka/tools/client.py - About 35 mins to fix


Function connect has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def connect(self):
        """
        Connect to all the cluster brokers and populate topic and partition information. If the client was created with
        a zkconnect string, first we connect to Zookeeper to bootstrap the broker and topic information from there,
        and then the client connects to all brokers in the cluster. Otherwise, connect to the bootstrap broker
Severity: Minor
Found in kafka/tools/client.py - About 35 mins to fix


Similar blocks of code found in 2 locations. Consider refactoring.

             {'name': 'partition_responses',
              'type': 'array',
              'item_type': [
                  {'name': 'partition', 'type': 'int32'},
                  {'name': 'error', 'type': 'int16'},
Severity: Minor
Found in kafka/tools/protocol/responses/list_offset_v0.py and 1 other location - About 35 mins to fix
kafka/tools/protocol/requests/describe_configs_v0.py on lines 30..36

This issue has a mass of 33; see the Duplicated Code and Tuning notes under the first duplication issue above.

Similar blocks of code found in 2 locations. Consider refactoring.

    schema = [
        {'name': 'resources',
         'type': 'array',
         'item_type': [
             {'name': 'resource_type', 'type': 'int8'},
Severity: Minor
Found in kafka/tools/protocol/requests/describe_configs_v0.py and 1 other location - About 35 mins to fix
kafka/tools/protocol/responses/list_offset_v0.py on lines 27..32

This issue has a mass of 33; see the Duplicated Code and Tuning notes under the first duplication issue above.

Function _query_prometheus has a Cognitive Complexity of 7 (exceeds 5 allowed). Consider refactoring.

    def _query_prometheus(self, hostname):
        size_metric_name = self.properties['size_metric_name']
        metrics_port = self.properties['metrics_port']
        metrics_path = self.properties.get('metrics_path', '/metrics')
        topic_label = self.properties.get('topic_label', 'topic')
Severity: Minor
Found in kafka/tools/assigner/sizers/prometheus.py - About 35 mins to fix

              Cognitive Complexity

              Cognitive Complexity is a measure of how difficult a unit of code is to intuitively understand. Unlike Cyclomatic Complexity, which determines how difficult your code will be to test, Cognitive Complexity tells you how difficult your code will be to read and comprehend.

              A method's cognitive complexity is based on a few simple rules:

              • Code is not considered more complex when it uses shorthand that the language provides for collapsing multiple statements into one
              • Code is considered more complex for each "break in the linear flow of the code"
              • Code is considered more complex when "flow breaking structures are nested"

              Further reading

Avoid too many return statements within this function.

        return None if val_len == -1 else buf.get(val_len)
Severity: Major
Found in kafka/tools/protocol/responses/__init__.py - About 30 mins to fix

Avoid too many return statements within this function.

        return None if val_len == -1 else buf.get(val_len).decode("utf-8")
Severity: Major
Found in kafka/tools/protocol/responses/__init__.py - About 30 mins to fix
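When each return statement corresponds to one wire type, a dispatch table can reduce the function to a single return path; the decoder names and types here are illustrative, not the actual helpers in responses/__init__.py:

```python
import io
import struct


def _decode_int16(buf):
    # '>h' is a big-endian signed 16-bit integer
    return struct.unpack('>h', buf.read(2))[0]


def _decode_int32(buf):
    # '>i' is a big-endian signed 32-bit integer
    return struct.unpack('>i', buf.read(4))[0]


# One table entry per wire type replaces a chain of return statements
_DECODERS = {'int16': _decode_int16, 'int32': _decode_int32}


def decode_value(type_name, buf):
    try:
        return _DECODERS[type_name](buf)  # the only return path
    except KeyError:
        raise ValueError('unknown type: {0}'.format(type_name))


print(decode_value('int32', io.BytesIO(b'\x00\x00\x00\x2a')))  # prints 42
```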

Similar blocks of code found in 2 locations. Consider refactoring.

        if (not cache) or (self._last_full_metadata < (time.time() - self.configuration.metadata_refresh)):
Severity: Minor
Found in kafka/tools/client.py and 1 other location - About 30 mins to fix
kafka/tools/client.py on lines 789..789
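Since both call sites repeat the same staleness test, hoisting it into one method leaves a single authoritative definition. This sketch simplifies the client for illustration (the real one keeps metadata_refresh on self.configuration):

```python
import time


class Client(object):
    def __init__(self, metadata_refresh=60):
        # Maximum metadata age in seconds before a refresh is needed
        self.metadata_refresh = metadata_refresh
        self._last_full_metadata = 0

    def _full_metadata_is_stale(self, cache=True):
        """The one home for the check duplicated on both flagged lines."""
        return (not cache) or (
            self._last_full_metadata < (time.time() - self.metadata_refresh))
```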

This issue has a mass of 32; see the Duplicated Code and Tuning notes under the first duplication issue above.
