tomato42/tlsfuzzer

Showing 302 of 302 total issues

Cyclomatic complexity is too high in class Runner. (14)
Open

class Runner(object):
    """Test if sending a set of commands returns expected values"""

    def __init__(self, conversation):
        """Link conversation with runner"""
Severity: Minor
Found in tlsfuzzer/runner.py by radon
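
The radon scores quoted in these entries can be reproduced locally. A minimal sketch, assuming the radon package is installed (pip install radon) and a checkout of the repository; the threshold of 10 mirrors the limit quoted above:

# Sketch of checking cyclomatic complexity with radon's Python API.
from radon.complexity import cc_visit

with open("tlsfuzzer/runner.py") as source:
    blocks = cc_visit(source.read())

for block in blocks:
    # each block is a function, method or class with a .complexity score
    if block.complexity > 10:
        print("%s: %d" % (block.name, block.complexity))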

Function _split_data_to_pairwise has a Cognitive Complexity of 22 (exceeds 10 allowed). Consider refactoring.
Open

def _split_data_to_pairwise(self, name):
    data = self._read_hamming_weight_data(name)
    try:
        pair_writers = dict()
 
 
Severity: Minor
Found in tlsfuzzer/analysis.py - About 2 hrs to fix
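
Cognitive complexity, unlike plain cyclomatic complexity, is dominated by nesting depth, so the usual remedy when refactoring is to move inner branches into small, flat helpers. A generic illustration of that pattern, not taken from tlsfuzzer:

# Illustrative only: extracting the nested branch into a helper keeps the
# caller's cognitive complexity low even though the branch count is the same.
def summarise(rows):
    results = []
    for row in rows:  # one level of nesting
        results.append(classify(row))
    return results

def classify(row):
    # the if chain now scores against this flat helper instead of adding
    # nesting increments inside the loop above
    if not row:
        return "empty"
    if row[0] < 0:
        return "negative"
    return "positive"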

Cyclomatic complexity is too high in method create_k_specific_dirs. (13)
Open

def create_k_specific_dirs(self):
    """
    Creates a folder with timing.csv for each K bit-size so it can be
    analyzed one at a time.
    """
Severity: Minor
Found in tlsfuzzer/analysis.py by radon

Cyclomatic complexity is too high in class MarvinCiphertextGenerator. (13)
Open

class MarvinCiphertextGenerator(object):
    """
    Generate a set of ciphertexts that should present the same timing behaviour
    from server.
 
 
Severity: Minor
Found in tlsfuzzer/utils/rsa.py by radon

Function process_rsa_keys has a Cognitive Complexity of 21 (exceeds 10 allowed). Consider refactoring.
Open

def process_rsa_keys(self):
    # list of values for the Hamming weight of d, p, q, dP, dQ, qInv
    values = []
    times = []
    max_len = 20
Severity: Minor
Found in tlsfuzzer/extract.py - About 2 hrs to fix

Function skillings_mack_test has a Cognitive Complexity of 21 (exceeds 10 allowed). Consider refactoring.
Open

def skillings_mack_test(values, groups, blocks, duplicates=None, status=None):
    """Perform the Skillings-Mack rank sum test.

    Skillings-Mack test is a Friedman-like test for unbalanced incomplete
    block design data. The null hypothesis is that no group stochastically
Severity: Minor
Found in tlsfuzzer/utils/stats.py - About 2 hrs to fix
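
The signature in the excerpt is enough for a basic call. A minimal usage sketch, assuming values, groups and blocks are parallel sequences; the numbers are made up, and the structure of the return value is not shown in the excerpt, so it is only printed:

# Hypothetical input: three groups measured over two incomplete blocks.
from tlsfuzzer.utils.stats import skillings_mack_test

values = [1.2, 1.5, 1.1, 1.3, 1.6]   # observations
groups = ["A", "B", "C", "A", "B"]   # group of each observation
blocks = [0, 0, 0, 1, 1]             # block of each observation

result = skillings_mack_test(values, groups, blocks)
print(result)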

Cyclomatic complexity is too high in method process_and_create_multiple_csv_files. (12)
Open

def process_and_create_multiple_csv_files(self, files = {
        "measurements.csv": "k-size",
        }, ecdh = False):
    original_measuremments_csv = self.measurements_csv
    skipped_h_weight_invert = False
Severity: Minor
Found in tlsfuzzer/extract.py by radon
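
Separately from the complexity score, the dict used as a default value in this signature is created once and shared between calls; if the method is refactored, the conventional None-default idiom avoids that. A sketch of the idiom on a simplified copy of the signature, not the project's actual fix:

# Generic Python idiom; not a change taken from tlsfuzzer itself.
def process_and_create_multiple_csv_files(self, files=None, ecdh=False):
    if files is None:
        files = {"measurements.csv": "k-size"}
    original_measuremments_csv = self.measurements_csv
    ...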

Cyclomatic complexity is too high in method process. (12)
Open

def process(self, state, msg):
    """
    :type state: ConnectionState
    :type msg: Message
    """
Severity: Minor
Found in tlsfuzzer/expect.py by radon

Cyclomatic complexity is too high in method process_measurements_and_create_hamming_csv_file. (12)
Open

def process_measurements_and_create_hamming_csv_file(
        self, values_iter, items_in_tuple = 20):
    """
    Processing all the nonces and associated time measurements from the
    given files and creates a file with tuples associating the Hamming
Severity: Minor
Found in tlsfuzzer/extract.py by radon

Function main has 50 lines of code (exceeds 25 allowed). Consider refactoring.
Open

def main():
    """Process arguments and start extraction."""
    logfile = None
    capture = None
    output = None
Severity: Minor
Found in tlsfuzzer/extract.py - About 2 hrs to fix
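
A long main usually shrinks once option parsing is separated from the work itself. A generic sketch of that split; the argparse options and the run_extraction helper are illustrative and only mirror the local variables visible in the excerpt, not tlsfuzzer's actual interface:

# Hypothetical split of a long main().
import argparse

def parse_arguments(argv=None):
    parser = argparse.ArgumentParser(description="start extraction")
    parser.add_argument("-l", "--logfile")
    parser.add_argument("-c", "--capture")
    parser.add_argument("-o", "--output")
    return parser.parse_args(argv)

def run_extraction(logfile, capture, output):
    # stand-in for the real extraction logic
    print(logfile, capture, output)

def main(argv=None):
    """Process arguments and start extraction."""
    args = parse_arguments(argv)
    run_extraction(args.logfile, args.capture, args.output)

if __name__ == "__main__":
    main()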

Cyclomatic complexity is too high in method process. (12)
Open

def process(self, state, msg):
    assert msg.contentType == ContentType.alert
    parser = Parser(msg.write())

    alert = Alert()
Severity: Minor
Found in tlsfuzzer/expect.py by radon

Cyclomatic complexity is too high in function _summarise_chunk. (12)
Open

def _summarise_chunk(args):
    global _groups
    groups = _groups
    global _values
    values = _values
Severity: Minor
Found in tlsfuzzer/utils/stats.py by radon
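
The module-level _groups and _values globals in this excerpt look like the usual multiprocessing worker-initializer pattern, where large arrays are installed once per worker instead of being pickled with every task; whether tlsfuzzer sets them up exactly this way is an assumption. A generic sketch of the pattern, with made-up function bodies that only echo the names in the excerpt:

# Generic worker-initializer sketch.
from multiprocessing import Pool

_values = None

def _pool_init(values):
    # runs once in every worker process
    global _values
    _values = values

def _summarise_chunk(indexes):
    # workers read the shared data from the module global instead of
    # receiving a copy with every task
    return sum(_values[i] for i in indexes)

if __name__ == "__main__":
    with Pool(2, initializer=_pool_init, initargs=([10, 20, 30, 40],)) as pool:
        print(pool.map(_summarise_chunk, [[0, 1], [2, 3]]))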

Cyclomatic complexity is too high in method conf_interval_plot. (12)
Open

def conf_interval_plot(self):
    """Generate the confidence interval for differences between samples."""
    if not self.draw_conf_interval_plot:
        return
    if self.verbose:
Severity: Minor
Found in tlsfuzzer/analysis.py by radon

Cyclomatic complexity is too high in method _split_data_to_pairwise. (12)
Open

def _split_data_to_pairwise(self, name):
    data = self._read_hamming_weight_data(name)
    try:
        pair_writers = dict()


Severity: Minor
Found in tlsfuzzer/analysis.py by radon

Cyclomatic complexity is too high in method _convert_to_binary. (12)
Open

def _convert_to_binary(self):
    timing_bin_path = join(self.output, "timing.bin")
    timing_csv_path = join(self.output, "timing.csv")
    legend_csv_path = join(self.output, "legend.csv")
    timing_bin_shape_path = join(self.output, "timing.bin.shape")
Severity: Minor
Found in tlsfuzzer/analysis.py by radon

Identical blocks of code found in 2 locations. Consider refactoring.
Open

if len(clnt_msgs) != len(clnt_msgs_acks): # pragma: no cover
    # no coverage; assert
    print(clnt_msgs)
    print()
    print(clnt_msgs_acks)
Severity: Major
Found in tlsfuzzer/extract.py and 1 other location - About 1 hr to fix
tlsfuzzer/extract.py on lines 773..779

Function _make_signature has a Cognitive Complexity of 20 (exceeds 10 allowed). Consider refactoring.
Confirmed

def _make_signature(self, status):
    """Create signature for CertificateVerify message."""
    if self.private_key is None:
        raise ValueError("Can't create a signature without "
                         "private key!")
Severity: Minor
Found in tlsfuzzer/messages.py - About 1 hr to fix

Identical blocks of code found in 2 locations. Consider refactoring.
Open

if len(clnt_msgs) != len(clnt_msgs_acks): # pragma: no cover
    # no coverage; assert
    print(clnt_msgs)
    print()
    print(clnt_msgs_acks)
Severity: Major
Found in tlsfuzzer/extract.py and 1 other location - About 1 hr to fix
tlsfuzzer/extract.py on lines 871..877
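
Both duplicated sites compare the same pair of lists, so the usual fix is a single shared helper. A sketch under that assumption; the helper name and the final raise are made up, since the excerpt only hints at an assert after the prints:

# Hypothetical helper; not an existing tlsfuzzer function.
def _check_msgs_match_acks(clnt_msgs, clnt_msgs_acks):
    if len(clnt_msgs) != len(clnt_msgs_acks):  # pragma: no cover
        # no coverage; assert
        print(clnt_msgs)
        print()
        print(clnt_msgs_acks)
        raise AssertionError("mismatched message and ACK counts")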

Function process_and_create_multiple_csv_files has a Cognitive Complexity of 20 (exceeds 10 allowed). Consider refactoring.
Open

def process_and_create_multiple_csv_files(self, files = {
        "measurements.csv": "k-size",
        }, ecdh = False):
    original_measuremments_csv = self.measurements_csv
    skipped_h_weight_invert = False
Severity: Minor
Found in tlsfuzzer/extract.py - About 1 hr to fix

Cyclomatic complexity is too high in method _checkParams. (11)
Open

def _checkParams(self, server_key_exchange):
    groups = []
    if self.valid_groups and any(i in range(256, 512)
                                 for i in self.valid_groups):
        groups = [RFC7919_GROUPS[i - 256] for i in self.valid_groups
Severity: Minor
Found in tlsfuzzer/expect.py by radon