├── scadl
│   ├── __init__.py
│   ├── non_profile.py
│   ├── multi_task.py
│   ├── augmentation.py
│   ├── multi_label_profile.py
│   ├── profile.py
│   └── tools.py
├── pyproject.toml
├── images
│   ├── ascad.png
│   └── cw_aes.png
├── .gitignore
├── presentation
│   └── optimist24.pdf
├── requirements.txt
├── setup.py
├── .github
│   └── workflows
│       └── ci.yml
├── tutorial
│   ├── visualization
│   │   ├── readme.md
│   │   ├── ascad_vis.py
│   │   └── simulator.py
│   ├── multi_task
│   │   └── cw
│   │       ├── test.py
│   │       └── profile.py
│   ├── profile
│   │   ├── cw
│   │   │   ├── test.py
│   │   │   └── profile.py
│   │   └── ascad
│   │       ├── test.py
│   │       └── profile.py
│   ├── multi_label
│   │   └── cw
│   │       ├── test_multi_label.py
│   │       └── multi_label.py
│   └── non_profile
│       ├── cw
│       │   └── non_profile.py
│       └── ascad
│           └── ascad_non_profile.py
├── README.md
├── COPYING.LESSER
└── COPYING
/scadl/__init__.py:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/pyproject.toml:
--------------------------------------------------------------------------------
1 | [tool.isort]
2 | profile = "black"
3 |
--------------------------------------------------------------------------------
/images/ascad.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Ledger-Donjon/scadl/HEAD/images/ascad.png
--------------------------------------------------------------------------------
/images/cw_aes.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Ledger-Donjon/scadl/HEAD/images/cw_aes.png
--------------------------------------------------------------------------------
/.gitignore:
--------------------------------------------------------------------------------
1 | dist/
2 | build/
3 | data/
4 | scadl.egg-info/
5 | *.pyc
6 | *__pycache__*
7 | *.keras
8 | .envrc
9 |
--------------------------------------------------------------------------------
/presentation/optimist24.pdf:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/Ledger-Donjon/scadl/HEAD/presentation/optimist24.pdf
--------------------------------------------------------------------------------
/requirements.txt:
--------------------------------------------------------------------------------
1 | h5py==3.11.0
2 | innvestigate==2.1.2
3 | keras==2.14.0
4 | matplotlib==3.9.2
5 | numpy==1.24.3
6 | scikit_learn==1.5.2
7 | setuptools==59.6.0
8 | tensorflow==2.14.0
9 | tqdm==4.66.5
10 |
--------------------------------------------------------------------------------
/setup.py:
--------------------------------------------------------------------------------
1 | from setuptools import find_packages, setup
2 |
3 | setup(
4 | name="scadl",
5 | version="0.1",
6 | description="A tool for state of the art deep learning based side-channel attacks",
7 | author="Karim M. Abdellatif",
8 | packages=find_packages(),
9 | install_requires=["numpy", "keras", "matplotlib", "tqdm", "h5py"],
10 | )
11 |
--------------------------------------------------------------------------------
/.github/workflows/ci.yml:
--------------------------------------------------------------------------------
1 | name: CI
2 |
3 | on:
4 | push:
5 | pull_request:
6 |
7 | jobs:
8 | build:
9 | runs-on: ubuntu-latest
10 |
11 | steps:
12 | - uses: actions/checkout@v4
13 | - uses: actions/setup-python@v5
14 | with:
15 | python-version: "3.12"
16 | - name: Check formatting
17 | uses: psf/black@stable
18 | with:
19 | options: "--check --verbose"
20 | src: "./scadl ./tutorial"
21 | - name: Check import sorting
22 | uses: isort/isort-action@v1
23 | with:
24 | requirementsFiles: "requirements.txt"
25 | configuration: "--profile black --check-only --diff"
26 |
--------------------------------------------------------------------------------
/tutorial/visualization/readme.md:
--------------------------------------------------------------------------------
1 | # Attribution methods
2 |
3 | Attribution methods are used in side-channel attacks (SCAs) for interpreting DL decisions. They are mainly used to identify leaking operations in cryptographic implementations, as proposed in [1](https://eprint.iacr.org/2019/143.pdf).
4 |
5 | Several open-source tools for attribution visualization, originally used in image processing, can be used for this purpose, such as:
6 | 1. [DeepExplain](https://github.com/marcoancona/DeepExplain)
7 | 2. [Innvestigate](https://github.com/albermax/innvestigate)
8 | 3. [Xplique](https://github.com/deel-ai/xplique)
9 |
10 | The scripts in this directory show the following:
11 | - An example of using [Innvestigate](https://github.com/albermax/innvestigate) as a case study on the [ASCAD](https://github.com/ANSSI-FR/ASCAD/tree/master/ATMEGA_AES_v1) dataset (a minimal sketch of this workflow follows below).
12 | - A second example that shows how efficiently such methods can be used for fault injection attacks.
13 |
14 |
15 |
16 |
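A minimal sketch of the Innvestigate workflow used in `ascad_vis.py`, assuming an already trained Keras softmax classifier and a 2-D array of preprocessed test traces (both names below are placeholders):

```python
import innvestigate
import numpy as np
import tensorflow as tf

# innvestigate 2.x works in graph mode
tf.compat.v1.disable_eager_execution()


def attribution_trace(model, traces: np.ndarray) -> np.ndarray:
    """Average absolute gradient attribution over a set of traces."""
    model_wo_sm = innvestigate.model_wo_softmax(model)      # strip the softmax layer
    analyzer = innvestigate.analyzer.Gradient(model_wo_sm)  # gradient-based attribution
    relevance = np.zeros(traces.shape[1])
    for trace in traces:
        relevance += analyzer.analyze(trace.reshape(1, -1))[0]
    return np.abs(relevance) / len(traces)
```

The peaks of the returned trace point to the samples the model relies on, i.e. the leaking operations.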
--------------------------------------------------------------------------------
/tutorial/multi_task/cw/test.py:
--------------------------------------------------------------------------------
1 | if __name__ == "__main__":
2 | import sys
3 | from pathlib import Path
4 |
5 | import matplotlib.pyplot as plt
6 | import numpy as np
7 | from keras.models import load_model
8 |
9 | from scadl.multi_task import compute_guessing_entropy
10 | from scadl.tools import sbox, standardize
11 |
12 | NB_BYTES = 16
13 |
14 | if len(sys.argv) != 2:
15 | print("Need to specify the location of the dataset")
16 | exit()
17 |
18 | dataset_dir = Path(sys.argv[1])
19 |
20 | # Load traces and metadata for the attack
21 |     # (combined_test.npy holds the plaintexts and the known key used for ranking)
22 | traces = np.load(dataset_dir / "test/traces.npy")
23 | metadata = np.load(dataset_dir / "test/combined_test.npy")
24 |
25 | correct_key = metadata["key"][0]
26 |
27 | traces = standardize(traces)
28 |
29 | model = load_model("model.keras")
30 |
31 | predictions = model.predict(traces)
32 |
33 | for i in range(NB_BYTES):
34 | guessing_entropy, number_traces = compute_guessing_entropy(
35 | predictions[i],
36 | lambda data, guess: sbox[guess ^ int(data["plaintext"][i])],
37 | metadata,
38 | 256,
39 | correct_key[i],
40 | 1,
41 | 3,
42 | )
43 | plt.plot(number_traces, guessing_entropy)
44 | plt.show()
45 |
--------------------------------------------------------------------------------
/tutorial/profile/cw/test.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import matplotlib.pyplot as plt
5 | import numpy as np
6 | from keras.models import load_model
7 |
8 | from scadl.profile import Match
9 | from scadl.tools import normalization, sbox
10 |
11 |
12 | def leakage_model(data: np.ndarray, guess: int) -> int:
13 | return sbox[guess ^ int(data["plaintext"][0])]
14 |
15 |
16 | if __name__ == "__main__":
17 | if len(sys.argv) != 2:
18 | print("Need to specify the location of testing data")
19 | exit()
20 |
21 | dataset_dir = Path(sys.argv[1])
22 | SIZE_TEST = 50
23 | leakages = np.load(dataset_dir / "test/traces.npy")[0:SIZE_TEST]
24 | metadata = np.load(dataset_dir / "test/combined_test.npy")[0:SIZE_TEST]
25 |
26 | correct_key = metadata["key"][0][0]
27 | # Select the same POIs and apply the same preprocessing as in the training
28 | poi = normalization(leakages[:, 1315:1325], feature_range=(0, 1))
29 |
30 | # Load the model and evaluate the rank
31 | model = load_model("model.keras")
32 | test_engine = Match(model=model, leakage_model=leakage_model)
33 | rank, number_traces = test_engine.match(
34 | x_test=poi, metadata=metadata, guess_range=256, correct_key=correct_key, step=1
35 | )
36 |
37 | # Plot the result
38 | plt.plot(number_traces, rank, "black")
39 | plt.xlabel("Number of traces")
40 | plt.ylabel("Average rank of K[0]")
41 | plt.show()
42 |
--------------------------------------------------------------------------------
/tutorial/multi_label/cw/test_multi_label.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import matplotlib.pyplot as plt
5 | import numpy as np
6 | from keras.models import load_model
7 |
8 | from scadl.multi_label_profile import MatchMultiLabel
9 | from scadl.tools import sbox
10 |
11 | TARGET_BYTE = 1 # or 0
12 |
13 |
14 | def leakage_model(data: np.ndarray, guess: int) -> int:
15 | return sbox[guess ^ int(data["plaintext"][TARGET_BYTE])]
16 |
17 |
18 | if __name__ == "__main__":
19 | if len(sys.argv) != 2:
20 | print("Need to specify the location of testing data")
21 | exit()
22 |
23 | dataset_dir = Path(sys.argv[1])
24 |
25 | SIZE = 50
26 | leakages = np.load(dataset_dir / "test/traces.npy")[0:SIZE]
27 | metadata = np.load(dataset_dir / "test/combined_test.npy")[0:SIZE]
28 |
29 | # Select which key byte needs to be attacked
30 | prob_range = (TARGET_BYTE * 256, 256 + TARGET_BYTE * 256)
31 | correct_key = metadata["key"][0][TARGET_BYTE]
32 |
33 |     # The POIs use the same indices as in the profiling phase
34 | poi = np.concatenate((leakages[:, 1315:1325], leakages[:, 1490:1505]), axis=1)
35 |
36 | model = load_model("model.keras")
37 |
38 | # Matching process
39 | test_engine = MatchMultiLabel(model=model, leakage_model=leakage_model)
40 | rank, number_traces = test_engine.match(
41 | x_test=poi,
42 | metadata=metadata,
43 | guess_range=256,
44 | correct_key=correct_key,
45 | step=1,
46 | prob_range=prob_range,
47 | )
48 |
49 | # Plot the key rank
50 | plt.plot(number_traces, rank, "black")
51 | plt.xlabel("Number of traces")
52 | plt.ylabel("Average rank of K[1] ")
53 | plt.show()
54 |
--------------------------------------------------------------------------------
/tutorial/profile/ascad/test.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import h5py
5 | import matplotlib.pyplot as plt
6 | import numpy as np
7 | from keras.models import load_model
8 |
9 | from scadl.profile import Match
10 | from scadl.tools import normalization, remove_avg, sbox
11 |
12 |
13 | def leakage_model(data: np.ndarray, guess: int) -> int:
14 |
15 | return sbox[guess ^ data["plaintext"][2]]
16 |
17 |
18 | if __name__ == "__main__":
19 | if len(sys.argv) != 2:
20 | print("Need to specify the location of training data")
21 | exit()
22 | dataset_dir = Path(sys.argv[1])
23 |
24 | # Load traces and metadata for attack
25 | file = h5py.File(dataset_dir / "ASCAD.h5", "r")
26 | leakages = file["Attack_traces"]["traces"][:]
27 | metadata = file["Attack_traces"]["metadata"][:]
28 |
29 | # correct key value to estimate the rank against
30 | correct_key = metadata["key"][0][2]
31 |
32 |     # Select POIs where the SNR gives the maximum value. They must have the
33 |     # same indices as those used in the profiling phase.
34 | poi = leakages
35 |
36 | # Same preprocessing as for the training
37 | poi = normalization(remove_avg(poi), feature_range=(-1, 1))
38 |
39 | # Load the model
40 | model = load_model("model.keras")
41 | SIZE = 1000
42 | TRIALS = 20
43 | test_engine = Match(model=model, leakage_model=leakage_model)
44 |
45 | for i in range(TRIALS):
46 | index = np.random.randint(len(leakages) - SIZE)
47 | sample_poi = poi[index : index + SIZE]
48 | sample_metadata = metadata[index : index + SIZE]
49 | # Test the correct key rank
50 | rank, number_traces = test_engine.match(
51 | x_test=sample_poi,
52 | metadata=sample_metadata,
53 | guess_range=256,
54 | correct_key=correct_key,
55 | step=10,
56 | )
57 | if i == 0:
58 | avg_rank = rank
59 | else:
60 | avg_rank += rank
61 | avg_rank = avg_rank / TRIALS
62 |
63 | # Plot the result
64 | plt.plot(number_traces, avg_rank, "black")
65 | plt.xlabel("Number of traces")
66 | plt.ylabel("Average rank of K[2]")
67 | plt.show()
68 |
--------------------------------------------------------------------------------
/tutorial/non_profile/cw/non_profile.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import keras
5 | import matplotlib.pyplot as plt
6 | import numpy as np
7 | import tensorflow as tf
8 | from keras.layers import Dense, Input
9 | from keras.models import Sequential
10 | from tqdm import tqdm
11 |
12 | from scadl.non_profile import NonProfile
13 | from scadl.tools import normalization, remove_avg, sbox
14 |
15 | TARGET_BYTE = 0
16 |
17 |
18 | def mlp_non_profiling(len_samples: int) -> keras.Model:
19 | model = Sequential()
20 | model.add(Input(shape=(len_samples,)))
21 | model.add(Dense(20, activation="relu"))
22 | model.add(Dense(10, activation="relu"))
23 | model.add(Dense(2, activation="softmax"))
24 | model.compile(optimizer="adam", loss="mean_squared_error", metrics=["accuracy"])
25 | return model
26 |
27 |
28 | def leakage_model(data: np.ndarray, guess: int) -> int:
29 | return 1 & ((sbox[int(data["plaintext"][TARGET_BYTE]) ^ guess])) # lsb
30 |
31 |
32 | if __name__ == "__main__":
33 | if len(sys.argv) != 2:
34 | print("Need to specify the location of training data")
35 | exit()
36 |
37 | SIZE_TRACES = 500
38 | dataset_dir = Path(sys.argv[1])
39 | leakages = np.load(dataset_dir / "test/traces.npy")[0:SIZE_TRACES]
40 | metadata = np.load(dataset_dir / "test/combined_test.npy")[0:SIZE_TRACES]
41 | correct_key = metadata["key"][0][0]
42 |
43 | # Subtract average from traces + normalization
44 | avg = remove_avg(leakages[:, 1315:1325])
45 | x_train = normalization(avg, feature_range=(0, 1))
46 |
47 | # Non-profiling DL
48 | EPOCHS = 100
49 | key_range = range(0, 256)
50 | acc = np.zeros((len(key_range), EPOCHS))
51 | profile_engine = NonProfile(leakage_model=leakage_model)
52 | for index, guess in enumerate(tqdm(key_range)):
53 | acc[index] = profile_engine.train(
54 | model=mlp_non_profiling(x_train.shape[1]),
55 | x_train=x_train,
56 | metadata=metadata,
57 | guess=guess,
58 | hist_acc="accuracy",
59 | num_classes=2,
60 | epochs=EPOCHS,
61 | batch_size=1000,
62 | verbose=0,
63 | )
64 | guessed_key = np.argmax(np.max(acc, axis=1))
65 | print(f"guessed key = {hex(guessed_key)}")
66 | plt.plot(acc.T, "grey")
67 | plt.plot(acc[correct_key], "black")
68 | plt.xlabel("Number of epochs")
69 | plt.ylabel("Accuracy ")
70 | plt.show()
71 |
--------------------------------------------------------------------------------
/scadl/non_profile.py:
--------------------------------------------------------------------------------
1 | # This file is part of scadl
2 | #
3 | # scadl is free software: you can redistribute it and/or modify
4 | # it under the terms of the GNU Lesser General Public License as published by
5 | # the Free Software Foundation, either version 3 of the License, or
6 | # (at your option) any later version.
7 | #
8 | # This program is distributed in the hope that it will be useful,
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 | # GNU Lesser General Public License for more details.
12 | #
13 | # You should have received a copy of the GNU Lesser General Public License
14 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
15 | #
16 | #
17 | # Copyright 2024 Karim ABDELLATIF, PhD, Ledger - karim.abdellatif@ledger.fr
18 |
19 |
20 | from collections.abc import Callable
21 | from typing import Optional
22 |
23 | import keras
24 | import numpy as np
25 | from keras.models import Model
26 |
27 |
28 | class NonProfile:
29 | """This class is used for Non-profiling DL attacks proposed in https://eprint.iacr.org/2018/196.pdf"""
30 |
31 | def __init__(self, leakage_model: Callable):
32 | """It takes a model and a leakagae_model function"""
33 | # super().__init__()
34 | self.leakage_model = leakage_model
35 | self.acc: Optional[np.ndarray] = None
36 | self.history = None
37 |
38 | def train(
39 | self,
40 | model: Model,
41 | x_train: np.ndarray,
42 | metadata: np.ndarray,
43 | guess: int,
44 | num_classes: int,
45 | hist_acc: str,
46 | epochs: int = 300,
47 | batch_size: int = 100,
48 | validation_split: float = 0.1,
49 | verbose: int = 1,
50 | **kwargs,
51 | ) -> np.ndarray:
52 | """
53 | x_train, metadata: leakages and additional data used for training.
54 | From the paper (https://tches.iacr.org/index.php/TCHES/article/view/7387/6559), the attack may work when hist_acc= 'accuracy'
55 | or 'val_accuracy'"""
56 | y_train = np.array([self.leakage_model(i, guess) for i in metadata])
57 | y = keras.utils.to_categorical(y_train, num_classes)
58 | self.history = model.fit(
59 | x=x_train,
60 | y=y,
61 | epochs=epochs,
62 | batch_size=batch_size,
63 | validation_split=validation_split,
64 | verbose=verbose,
65 | **kwargs,
66 | )
67 |
68 | acc = self.history.history[hist_acc]
69 |
70 | self.acc = acc
71 |
72 | return acc
73 |
--------------------------------------------------------------------------------
/tutorial/non_profile/ascad/ascad_non_profile.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import h5py
5 | import keras
6 | import matplotlib.pyplot as plt
7 | import numpy as np
8 | from keras.layers import Dense, Input
9 | from keras.models import Sequential
10 | from tqdm import tqdm
11 |
12 | from scadl.non_profile import NonProfile
13 | from scadl.tools import normalization, remove_avg, sbox
14 |
15 | TARGET_BYTE = 2
16 |
17 |
18 | def leakage_model(data: np.ndarray, guess: int) -> int:
19 | # return 1 & ((sbox[data["plaintext"][TARGET_BYTE] ^ guess]) >> 7) #msb
20 | return 1 & ((sbox[data["plaintext"][TARGET_BYTE] ^ guess])) # lsb
21 | # return hw(sbox[data['plaintext'][TARGET_BYTE] ^ guess]) #hw
22 |
23 |
24 | def mlp_ascad(len_samples: int) -> keras.Model:
25 | model = Sequential()
26 | model.add(Input(shape=(len_samples,)))
27 | model.add(Dense(20, activation="relu"))
28 | model.add(Dense(10, activation="relu"))
29 | model.add(Dense(2, activation="softmax"))
30 | model.compile(loss="mean_squared_error", optimizer="adam", metrics=["accuracy"])
31 | return model
32 |
33 |
34 | if __name__ == "__main__":
35 | if len(sys.argv) != 2:
36 | print("Need to specify the location of training data")
37 | exit()
38 | dataset_dir = Path(sys.argv[1])
39 |
40 | # Load traces and metadata for training
41 | SIZE_TEST = 15000
42 | file = h5py.File(dataset_dir / "ASCAD.h5", "r")
43 | leakages = np.array(file["Profiling_traces"]["traces"][:], dtype=np.int8)[
44 | 0:SIZE_TEST
45 | ]
46 | metadata = file["Profiling_traces"]["metadata"][:][0:SIZE_TEST]
47 | correct_key = metadata["key"][0][TARGET_BYTE]
48 |
49 | # Subtract average from traces + normalization
50 | x_train = normalization(remove_avg(leakages), feature_range=(-1, 1))
51 |
52 | # Non-profiling DL
53 | EPOCHS = 10
54 | guess_range = range(0, 256)
55 | acc = np.zeros((len(guess_range), EPOCHS))
56 | profile_engine = NonProfile(leakage_model=leakage_model)
57 | for index, guess in enumerate(tqdm(guess_range)):
58 | acc[index] = profile_engine.train(
59 | model=mlp_ascad(x_train.shape[1]),
60 | x_train=x_train,
61 | metadata=metadata,
62 | hist_acc="accuracy",
63 | guess=guess,
64 | num_classes=2,
65 | epochs=EPOCHS,
66 | batch_size=1000,
67 | verbose=0,
68 | )
69 | guessed_key = np.argmax(np.max(acc, axis=1))
70 | print(f"guessed key = {guessed_key}")
71 | plt.plot(acc.T, "grey")
72 | plt.plot(acc[correct_key], "black")
73 | plt.xlabel("Number of epochs")
74 | plt.ylabel("Accuracy ")
75 | plt.show()
76 |
--------------------------------------------------------------------------------
/scadl/multi_task.py:
--------------------------------------------------------------------------------
1 | from collections.abc import Callable
2 |
3 | import numpy as np
4 |
5 |
6 | def compute_guessing_entropy(
7 | predictions: np.ndarray,
8 | leakage_model: Callable[[np.ndarray, int], int],
9 | metadata: np.ndarray,
10 | guess_range: int,
11 | correct_key: int,
12 | step: int,
13 | num_attacks: int,
14 | ) -> tuple[np.ndarray, np.ndarray]:
15 | """Approximate the guessing entropy as defined in https://eprint.iacr.org/2006/139.pdf"""
16 |
17 | assert len(predictions) > 0 and len(predictions) == len(metadata)
18 | assert correct_key < guess_range
19 | assert step >= 1
20 | assert num_attacks >= 1
21 |
22 | sum_rank = np.zeros(len(range(0, len(predictions), step)), dtype=np.uint32)
23 | permutation = np.arange(len(predictions))
24 | for _ in range(num_attacks):
25 | np.random.shuffle(permutation)
26 | rank, x_rank = compute_rank(
27 | predictions[permutation],
28 | leakage_model,
29 | metadata[permutation],
30 | guess_range,
31 | correct_key,
32 | step,
33 | )
34 | sum_rank += rank
35 |
36 | return sum_rank / num_attacks, x_rank
37 |
38 |
39 | def compute_rank(
40 | predictions: np.ndarray,
41 | leakage_model: Callable[[np.ndarray, int], int],
42 | metadata: np.ndarray,
43 | guess_range: int,
44 | correct_key: int,
45 | step: int,
46 | ) -> tuple[np.ndarray, np.ndarray]:
47 | """Compute the key rank.
48 |
49 | Ref:
50 | - https://eprint.iacr.org/2006/139.pdf
51 | """
52 | assert len(predictions) > 0 and len(predictions) == len(metadata)
53 | assert correct_key < guess_range
54 | assert step >= 1
55 |
56 | chunk_starts = range(0, len(predictions), step)
57 | rank = np.zeros(len(chunk_starts), dtype=np.uint32)
58 | x_rank = np.zeros(len(chunk_starts), dtype=np.uint32)
59 | number_traces = 0
60 | guesses_score = np.zeros(guess_range)
61 | for i, chunk_start in enumerate(chunk_starts):
62 | pred_chunk = predictions[chunk_start : chunk_start + step]
63 | metadata_chunk = metadata[chunk_start : chunk_start + step]
64 | for row in range(len(pred_chunk)):
65 | m = np.min(pred_chunk[row, pred_chunk[row] != 0])
66 | for guess in range(guess_range):
67 | index = leakage_model(metadata_chunk[row], guess)
68 | # Avoid NaNs with log
69 | if pred_chunk[row, index] == 0:
70 | guesses_score[guess] += np.log2(m)
71 | else:
72 | guesses_score[guess] += np.log2(pred_chunk[row, index])
73 | rank[i] = np.where(sorted(guesses_score)[::-1] == guesses_score[correct_key])[
74 | 0
75 | ][0]
76 |
77 | number_traces += step
78 | x_rank[i] = number_traces
79 |
80 | return rank, x_rank
81 |
--------------------------------------------------------------------------------
/tutorial/multi_label/cw/multi_label.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import keras
5 | import numpy as np
6 | from keras.layers import Conv1D, Dense, Flatten, Input, MaxPooling1D
7 | from keras.models import Sequential
8 | from sklearn.preprocessing import MultiLabelBinarizer
9 |
10 | from scadl.multi_label_profile import MultiLabelProfile
11 | from scadl.tools import gen_labels, sbox
12 |
13 |
14 | def mlp_multi_label(
15 | sample_len: int, guess_range: int, nb_neurons: int = 50, nb_layers: int = 4
16 | ) -> keras.Model:
17 | """It takes :nb_neurons: as the number of neurons per layer and :nb_layers:
18 | as the number of layers."""
19 | model = Sequential()
20 | model.add(Input(shape=(sample_len,)))
21 | model.add(Dense(nb_neurons, activation="relu"))
22 | for _ in range(nb_layers - 2):
23 | model.add(Dense(nb_neurons, activation="relu"))
24 | # Dropout(0.1)
25 | # BatchNormalization()
26 | model.add(Dense(guess_range, activation="sigmoid"))
27 | model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
28 | return model
29 |
30 |
31 | def cnn_multi_label(sample_len: int, guess_range: int) -> keras.Model:
32 | model = Sequential()
33 | model.add(Input(shape=(sample_len, 1)))
34 | model.add(Conv1D(filters=20, kernel_size=5))
35 | model.add(MaxPooling1D(pool_size=5))
36 | model.add(Flatten())
37 | model.add(Dense(200, activation="relu"))
38 | model.add(Dense(guess_range, activation="sigmoid"))
39 | model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
40 | return model
41 |
42 |
43 | def leakage_model(data: np.ndarray, key_byte: int) -> int:
44 | return sbox[data["plaintext"][key_byte] ^ data["key"][key_byte]]
45 |
46 |
47 | if __name__ == "__main__":
48 | if len(sys.argv) != 3:
49 | print("Need to specify the location of training data")
50 | exit()
51 |
52 | dataset_dir = Path(sys.argv[1])
53 | leakages = np.load(dataset_dir / "train/traces.npy")
54 | metadata = np.load(dataset_dir / "train/combined_train.npy")
55 | size_profiling = len(metadata)
56 |
57 | # poi for sbox[p0^k0] and sbox[p1^k1]
58 | poi = np.concatenate((leakages[:, 1315:1325], leakages[:, 1490:1505]), axis=1)
59 |
60 | # Generate labels
61 | y_0 = gen_labels(
62 | leakage_model=leakage_model, metadata=metadata, key_byte=0
63 | ).reshape((size_profiling, 1))
64 | y_1 = gen_labels(
65 | leakage_model=leakage_model, metadata=metadata, key_byte=1
66 | ).reshape((size_profiling, 1))
67 |
68 | # Shift second label by 256
69 | combined_labels = np.concatenate((y_0, y_1 + 256), axis=1)
70 | label = MultiLabelBinarizer()
71 | labels_fit = label.fit_transform(combined_labels)
72 |
73 | # Build model
74 | GUESS_RANGE = 512
75 | if sys.argv[2] == "mlp":
76 | model = mlp_multi_label(poi.shape[1], GUESS_RANGE)
77 | elif sys.argv[2] == "cnn":
78 | model = cnn_multi_label(poi.shape[1], GUESS_RANGE)
79 | else:
80 | print("Invalid model type")
81 | exit()
82 |
83 | model.summary()
84 |
85 | # Call multi-label profiling engine
86 | profile = MultiLabelProfile(model)
87 | profile.train(x_train=poi, y_train=labels_fit, epochs=100)
88 | profile.save_model("model.keras")
89 |
--------------------------------------------------------------------------------
/tutorial/profile/cw/profile.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import keras
5 | import numpy as np
6 | from keras.layers import Conv1D, Dense, Flatten, Input, MaxPooling1D
7 | from keras.models import Sequential
8 |
9 | from scadl.augmentation import Mixup
10 | from scadl.profile import Profile
11 | from scadl.tools import normalization, sbox
12 |
13 |
14 | def model_mlp(sample_len: int, range_outer_layer: int) -> keras.Model:
15 | model = Sequential()
16 | model.add(Input(shape=(sample_len,)))
17 | model.add(Dense(500, activation="relu"))
18 | model.add(Dense(500, activation="relu"))
19 | model.add(Dense(500, activation="relu"))
20 | model.add(Dense(500, activation="relu"))
21 | model.add(Dense(range_outer_layer, activation="softmax"))
22 | model.compile(
23 | optimizer="adam",
24 | loss="categorical_crossentropy",
25 | metrics=["accuracy"],
26 | )
27 | return model
28 |
29 |
30 | def model_cnn(sample_len: int, range_outer_layer: int) -> keras.Model:
31 | model = Sequential()
32 | model.add(Input(shape=(sample_len, 1)))
33 | model.add(Conv1D(filters=20, kernel_size=5, activation="tanh"))
34 | model.add(MaxPooling1D(pool_size=6))
35 | model.add(Flatten())
36 | model.add(Dense(200, activation="relu"))
37 | model.add(Dense(range_outer_layer, activation="softmax"))
38 | model.compile(
39 | optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"]
40 | )
41 | return model
42 |
43 |
44 | def leakage_model(data: np.ndarray) -> int:
45 | """leakage model for sbox[0]"""
46 | return sbox[data["plaintext"][0] ^ data["key"][0]]
47 |
48 |
49 | def data_aug(
50 | x_training: np.ndarray, y_training: np.ndarray
51 | ) -> tuple[np.ndarray, np.ndarray]:
52 | """It's used for data augmentation and it takes x, y as leakages and labels"""
53 | mix = Mixup()
54 | x, y = mix.generate(x_train=x_training, y_train=y_training, ratio=0.6)
55 | return x, y
56 |
57 |
58 | if __name__ == "__main__":
59 | if len(sys.argv) != 3:
60 | print("Need to specify the location of training data and model")
61 | exit()
62 |
63 | dataset_dir = Path(sys.argv[1])
64 | leakages = np.load(dataset_dir / "train/traces.npy")
65 | metadata = np.load(dataset_dir / "train/combined_train.npy")
66 |
67 | # Select POIs where SNR gives the max value
68 | # Normalization improves the learning
69 | x_train = normalization(leakages[:, 1315:1325], feature_range=(0, 1))
70 |
71 | # Build the model
72 | len_samples = x_train.shape[1]
73 | GUESS_RANGE = 256
74 | if sys.argv[2] == "mlp":
75 | model = model_mlp(sample_len=len_samples, range_outer_layer=GUESS_RANGE)
76 | elif sys.argv[2] == "cnn":
77 | model = model_cnn(len_samples, GUESS_RANGE)
78 | else:
79 | print("Invalid model type")
80 | exit()
81 |
82 | model.summary()
83 |
84 | # Train the model
85 | profile_engine = Profile(model, leakage_model=leakage_model)
86 | profile_engine.data_augmentation(data_aug)
87 | profile_engine.train(
88 | x_train=x_train,
89 | metadata=metadata,
90 | guess_range=256,
91 | epochs=50,
92 | batch_size=100,
93 | data_augmentation=False,
94 | )
95 | profile_engine.save_model("model.keras")
96 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # SCADL
2 |
3 | Following the current direction in Deep Learning (DL), recent papers have
4 | increasingly paid attention to how effective DL is at breaking
5 | cryptographic implementations.
6 |
7 | Scadl is a **side-channel attack tool based on deep learning**. It implements most of the state-of-the-art techniques.
8 |
9 | This project has been developed within the research activities of the Donjon team (Ledger's security team), to help us during side-channel evaluations.
10 | ## Features
11 |
12 | Scadl implements the following previously published attacks:
13 | - Normal profiling: A straightforward profiling technique in which the attacker uses a known-key dataset to train a DL model, which is then used to attack the unknown-key dataset. This technique was presented in [1](https://eprint.iacr.org/2016/921) and [2](https://eprint.iacr.org/2018/053).
14 | - [Non-profiling](https://tches.iacr.org/index.php/TCHES/article/view/7387): A technique similar to differential power analysis ([DPA](https://paulkocher.com/doc/DifferentialPowerAnalysis.pdf)), but with several advantages over DPA when attacking protected designs (masking and desynchronization). A short usage sketch follows this list.
15 | - [Multi-label](https://eprint.iacr.org/2020/436): A technique to attack multiple keys using only one DL model.
16 | - [Multi-tasking](https://eprint.iacr.org/2023/006.pdf): Another technique for attacking multiple keys using a single model.
17 | - Data augmentation: A technique to enlarge the dataset in order to boost DL efficiency. Scadl includes [mixup](https://eprint.iacr.org/2021/328.pdf) and [random-crop](https://blog.roboflow.com/why-and-how-to-implement-random-crop-data-augmentation/).
18 | - [Attribution methods](https://eprint.iacr.org/2019/143.pdf): A technique to perform leakage detection using DL.
19 |
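As a taste of the API, here is a condensed sketch of the non-profiled attack from `tutorial/non_profile/cw/non_profile.py`; the `build_model` helper and the `x_train`/`metadata` arrays are placeholders for a user-supplied 2-class classifier and a preprocessed dataset:

```python
import numpy as np

from scadl.non_profile import NonProfile
from scadl.tools import sbox


def leakage_model(data, guess):
    """Label = LSB of sbox(plaintext ^ guess) for the targeted byte."""
    return 1 & sbox[int(data["plaintext"][0]) ^ guess]


engine = NonProfile(leakage_model=leakage_model)
acc = np.zeros((256, 50))
for guess in range(256):
    # A fresh 2-class model is trained for every key guess (build_model is hypothetical)
    acc[guess] = engine.train(
        model=build_model(x_train.shape[1]),
        x_train=x_train,
        metadata=metadata,
        guess=guess,
        hist_acc="accuracy",
        num_classes=2,
        epochs=50,
        verbose=0,
    )
# The guess whose model learns best is the key candidate
best_guess = int(np.argmax(np.max(acc, axis=1)))
```
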
20 | ## Installation
21 | Scadl can be installed using Python 3 and pip:
22 |
23 | pip install .
24 |
25 | ## Requirements
26 | - [keras](https://keras.io/)
27 | - [matplotlib](https://matplotlib.org/)
28 | - [numpy](https://numpy.org/)
29 | - [tensorflow](https://www.tensorflow.org/)
30 | - [h5py](https://pypi.org/project/h5py/)
31 |
32 | ## Tutorial
33 |
34 | ### Datasets
35 |
36 | Scadl uses two different datasets for its tutorials. The first dataset was collected by running an unprotected AES on a [ChipWhisperer-Lite](https://rtfm.newae.com/Targets/CW303%20Arm/). The figure below shows the power consumption of the first AES round (top) and the SNR of **sbox[P^K]** (bottom). The yellow zone corresponds to P^K and the gray zone to **sbox[P^K]** for the 16 bytes. The profiling and non-profiling tutorials use the first peak in the gray zone, which corresponds to **sbox[P[0] ^ K[0]]**. The multi-label tutorial uses the first two peaks, **sbox[P[0] ^ K[0]]** and **sbox[P[1] ^ K[1]]**.
37 |
38 | 
39 |
40 |
41 | The second dataset is [ASCAD](https://github.com/ANSSI-FR/ASCAD/tree/master/ATMEGA_AES_v1), which is widely used in the side-channel attack (SCA) domain. The figure below shows a power consumption trace (top), the SNR of **sbox[P[2] ^ K[2]] ^ mask** (middle), and the SNR of the **mask** (bottom). A short sketch of how the tutorial scripts load these datasets follows.
42 |
43 | 
44 |
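The tutorial scripts take the dataset directory as a command-line argument. A sketch of how they expect the data on disk (the `data` path below is a placeholder):

```python
from pathlib import Path

import h5py
import numpy as np

dataset_dir = Path("data")  # placeholder for the directory passed on the command line

# ChipWhisperer dataset: .npy trace arrays plus structured metadata arrays
traces = np.load(dataset_dir / "train/traces.npy")
metadata = np.load(dataset_dir / "train/combined_train.npy")
plaintexts, keys = metadata["plaintext"], metadata["key"]

# ASCAD dataset: an HDF5 file with profiling and attack groups
with h5py.File(dataset_dir / "ASCAD.h5", "r") as f:
    profiling_traces = f["Profiling_traces"]["traces"][:]
    profiling_metadata = f["Profiling_traces"]["metadata"][:]
```
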
45 | ### Labeling
46 | For all the experiments, we use the output of one or several AES sbox operations to label the traces for the DL models. For example:
47 | ```python
48 | def leakage_model(metadata):
49 | """leakage model for sbox[0]"""
50 | return sbox[metadata["plaintext"][0] ^ metadata["key"][0]]
51 | ```
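A condensed sketch of how such a leakage model is consumed by the profiling and matching engines, based on `tutorial/profile/cw`; `model`, `x_train`, `metadata`, `x_attack`, `attack_metadata`, and `correct_key` are placeholders:

```python
from keras.models import load_model

from scadl.profile import Match, Profile
from scadl.tools import sbox

# Profiling phase: label the known-key traces with leakage_model and train the classifier
profile_engine = Profile(model, leakage_model=leakage_model)
profile_engine.train(x_train=x_train, metadata=metadata, guess_range=256, epochs=50)
profile_engine.save_model("model.keras")


def attack_leakage_model(data, guess):
    """Same leakage model, parameterized by the key guess for the attack phase."""
    return sbox[guess ^ int(data["plaintext"][0])]


# Attack phase: rank the correct key byte against all 256 guesses
test_engine = Match(model=load_model("model.keras"), leakage_model=attack_leakage_model)
rank, number_traces = test_engine.match(
    x_test=x_attack,
    metadata=attack_metadata,
    guess_range=256,
    correct_key=correct_key,
    step=1,
)
```
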
52 | ### DL models
53 | For our experiments, we use CNN and MLP models, which are the DL models most commonly used by the SCA community. A minimal MLP of the kind used in the tutorials is sketched below.
54 |
55 |
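A sketch (layer sizes are illustrative, taken from the ChipWhisperer profiling tutorial):

```python
import keras
from keras.layers import Dense, Input
from keras.models import Sequential


def mlp_sca(sample_len: int, guess_range: int = 256) -> keras.Model:
    """A small MLP mapping a trace segment to sbox-output classes."""
    model = Sequential()
    model.add(Input(shape=(sample_len,)))
    model.add(Dense(500, activation="relu"))
    model.add(Dense(500, activation="relu"))
    model.add(Dense(guess_range, activation="softmax"))
    model.compile(
        optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"]
    )
    return model
```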
--------------------------------------------------------------------------------
/tutorial/visualization/ascad_vis.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import h5py
5 | import innvestigate
6 | import keras
7 | import matplotlib.pyplot as plt
8 | import numpy as np
9 | import tensorflow as tf
10 | from keras.layers import Dense, Input
11 | from keras.models import Sequential
12 |
13 | from scadl.augmentation import Mixup, RandomCrop
14 | from scadl.profile import Profile
15 | from scadl.tools import normalization, remove_avg, sbox
16 |
17 | tf.compat.v1.disable_eager_execution()
18 |
19 |
20 | def leakage_model(data):
21 | """leakage model for sbox[2]"""
22 | return sbox[data["plaintext"][2] ^ data["key"][2]]
23 |
24 |
25 | def aug_mixup(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
26 | """Data augmentation based on mixup"""
27 | mix = Mixup()
28 | x, y = mix.generate(x_train=x, y_train=y, ratio=1, alpha=1)
29 | return x, y
30 |
31 |
32 | def aug_crop(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
33 | """Data augmentation based on random crop"""
34 | mix = RandomCrop()
35 | x, y = mix.generate(x_train=x, y_train=y, ratio=1, window=5)
36 | return x, y
37 |
38 |
39 | def mlp_short(len_samples: int) -> keras.Model:
40 | """
41 | param len_samples: size of a single trace
42 | """
43 | model = Sequential()
44 | model.add(Input(shape=(len_samples,)))
45 | model.add(Dense(20, activation="relu"))
46 | # BatchNormalization()
47 | model.add(Dense(50, activation="relu"))
48 | model.add(Dense(256, activation="softmax"))
49 | model.compile(
50 | loss="categorical_crossentropy",
51 | optimizer="adam",
52 | metrics=["accuracy"],
53 | )
54 | return model
55 |
56 |
57 | if __name__ == "__main__":
58 | if len(sys.argv) != 2:
59 | print("Need to specify the location of training data and model")
60 | exit()
61 | dataset_dir = Path(sys.argv[1])
62 |
63 | # Load traces and metadata for training
64 | file = h5py.File(dataset_dir / "ASCAD.h5", "r")
65 | leakages = file["Profiling_traces"]["traces"][:]
66 | metadata = file["Profiling_traces"]["metadata"][:]
67 |
68 | # Select POIs where SNR is high
69 | poi = leakages
70 |
71 | # Preprocess traces
72 | x_train = normalization(remove_avg(poi), feature_range=(-1, 1))
73 | GUESS_RANGE = 256
74 |
75 | # Build the model
76 | model = mlp_short(x_train.shape[1])
77 | # Train the model
78 | profile_engine = Profile(model, leakage_model=leakage_model)
79 | profile_engine.data_augmentation(aug_mixup)
80 | profile_engine.train(
81 | x_train=x_train,
82 | metadata=metadata,
83 | guess_range=256,
84 | epochs=25,
85 | batch_size=128,
86 | validation_split=0.1,
87 | data_augmentation=False,
88 | )
89 | model = profile_engine.model
90 | # Call test traces
91 | test_traces = file["Attack_traces"]["traces"][:]
92 | test_metadata = file["Attack_traces"]["metadata"][:]
93 | test_traces = normalization(remove_avg(test_traces), feature_range=(-1, 1))
94 | model_wo_sm = innvestigate.model_wo_softmax(model)
95 | gradient_analyzer = innvestigate.analyzer.Gradient(model_wo_sm)
96 | vis_trace = np.zeros(700)
97 | for index, trace_sample in enumerate(test_traces):
98 | trace = trace_sample.reshape(1, 700)
99 | prob = model.predict(trace)
100 | vis_trace += gradient_analyzer.analyze(trace)[0]
101 | plt.plot(abs(vis_trace / len(test_traces)))
102 | plt.show()
103 |
--------------------------------------------------------------------------------
/tutorial/multi_task/cw/profile.py:
--------------------------------------------------------------------------------
1 | from keras.layers import BatchNormalization, Dense, Input
2 | from keras.models import Model
3 | from keras.optimizers import Adam
4 |
5 |
6 | def mlp_short_multi(len_samples: int, nb_bytes: int) -> Model:
7 | input_layer = Input(shape=(len_samples,))
8 |
9 | internal_layer = Dense(100, activation="relu")(input_layer)
10 | internal_layer = BatchNormalization()(internal_layer)
11 |
12 | output_layers = []
13 | for i in range(nb_bytes):
14 | output_layer = Dense(256, activation="softmax", name=f"byte_{i}")(
15 | internal_layer
16 | )
17 | output_layers.append(output_layer)
18 | model = Model(inputs=input_layer, outputs=output_layers)
19 |
20 | model.compile(
21 | loss=["categorical_crossentropy" for _ in range(nb_bytes)],
22 | optimizer=Adam(learning_rate=0.001),
23 | metrics=["accuracy" for _ in range(nb_bytes)],
24 | )
25 | return model
26 |
27 |
28 | def show_loss_history(history):
29 | plt.plot(history.history["loss"], label="Training loss")
30 | plt.plot(history.history["val_loss"], label="Validation loss")
31 | plt.xlabel("Epochs")
32 | plt.ylabel("Loss")
33 | plt.legend()
34 |
35 | plt.show()
36 |
37 |
38 | if __name__ == "__main__":
39 | import sys
40 | from pathlib import Path
41 |
42 | import matplotlib.pyplot as plt
43 | import numpy as np
44 | from keras.callbacks import ModelCheckpoint
45 | from keras.utils import to_categorical
46 | from sklearn.model_selection import train_test_split
47 |
48 | from scadl.tools import sbox, standardize
49 |
50 | NB_BYTES = 16
51 |
52 | if len(sys.argv) != 2:
53 | print("Need to specify the location of the dataset")
54 | exit()
55 |
56 | # Load traces and metadata for training
57 | dataset_dir = Path(sys.argv[1])
58 | traces = np.load(dataset_dir / "train/traces.npy")
59 | metadata = np.load(dataset_dir / "train/combined_train.npy")
60 |
61 | # Prepare inputs and labels
62 | x = traces
63 | y = metadata
64 |
65 | x_train, x_test, metadata_train, metadata_test = train_test_split(
66 | x, y, test_size=0.1
67 | )
68 |
69 | x_train = standardize(x_train)
70 | x_test = standardize(x_test)
71 |
72 | sbox_vectorized = np.vectorize(lambda x: sbox[x], otypes=[np.uint8])
73 |
74 | def leakage_model_vectorized(metadata: np.ndarray) -> np.ndarray:
75 | return sbox_vectorized(np.bitwise_xor(metadata["plaintext"], metadata["key"]))
76 |
77 | y_train = [
78 | to_categorical(
79 |             leakage_model_vectorized(metadata_train)[:, i],
80 | num_classes=256,
81 | )
82 | for i in range(NB_BYTES)
83 | ]
84 |
85 | y_test = [
86 | to_categorical(
87 |             leakage_model_vectorized(metadata_test)[:, i],
88 | num_classes=256,
89 | )
90 | for i in range(NB_BYTES)
91 | ]
92 |
93 | # Build the model
94 | model = mlp_short_multi(x.shape[1], NB_BYTES)
95 | model.summary()
96 |
97 | callbacks = [
98 | ModelCheckpoint(
99 | "model.checkpoint.keras",
100 | monitor="val_loss",
101 | save_best_only=True,
102 | ),
103 | ]
104 | history = model.fit(
105 | x_train,
106 | y_train,
107 | epochs=200,
108 | batch_size=256,
109 | verbose=True,
110 | validation_data=(x_test, y_test),
111 | callbacks=callbacks,
112 | )
113 |
114 | show_loss_history(history)
115 |
116 | model.save("model.keras")
117 |
--------------------------------------------------------------------------------
/scadl/augmentation.py:
--------------------------------------------------------------------------------
1 | # This file is part of scadl
2 | #
3 | # scadl is free software: you can redistribute it and/or modify
4 | # it under the terms of the GNU Lesser General Public License as published by
5 | # the Free Software Foundation, either version 3 of the License, or
6 | # (at your option) any later version.
7 | #
8 | # This program is distributed in the hope that it will be useful,
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 | # GNU Lesser General Public License for more details.
12 | #
13 | # You should have received a copy of the GNU Lesser General Public License
14 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
15 | #
16 | #
17 | # Copyright 2024 Karim ABDELLATIF, PhD, Ledger - karim.abdellatif@ledger.fr
18 | import numpy as np
19 |
20 |
21 | class Mixup:
22 | """This class used for data augmentation
23 | proposed in https://eprint.iacr.org/2021/328.pdf"""
24 |
25 | def generate(
26 | self, x_train: np.ndarray, y_train: np.ndarray, ratio: float, alpha: float = 0.2
27 | ) -> tuple[np.ndarray, np.ndarray]:
28 | """It taked x_train, y_train, which are leakages and labels"""
29 |
30 | len_augmented_data = int(ratio * len(x_train))
31 | augmented_data = np.zeros((len_augmented_data, x_train.shape[1]))
32 | augmented_labels = np.zeros((len_augmented_data, y_train.shape[1]))
33 | for i in range(len_augmented_data):
34 | # lam = np.clip(np.random.beta(alpha, alpha), 0.4, 0.6)
35 | lam = np.random.beta(alpha, alpha)
36 | random_index = np.random.randint(x_train.shape[0] - 1)
37 | augmented_data[i] = (lam * x_train[random_index]) + (
38 | (1 - lam) * x_train[random_index + 1]
39 | )
40 | augmented_labels[i] = (lam * y_train[random_index]) + (
41 | (1 - lam) * y_train[random_index + 1]
42 | )
43 |
44 | return np.concatenate((x_train, augmented_data), axis=0), np.concatenate(
45 | (y_train, augmented_labels), axis=0
46 | )
47 |
48 |
49 | class RandomCrop:
50 | """A data augmentation technique shown in
51 | https://blog.roboflow.com/why-and-how-to-implement-random-crop-data-augmentation/"""
52 |
53 | def generate(
54 | self, x_train: np.ndarray, y_train: np.ndarray, ratio: float, window: int
55 | ) -> tuple[np.ndarray, np.ndarray]:
56 | """It taked x_train, y_train, ratio which are leakages, labels, and data increase ratio"""
57 | len_augmented_data = int(ratio * len(x_train))
58 | augmented_data = np.zeros((len_augmented_data, x_train.shape[1]))
59 | augmented_labels = np.zeros((len_augmented_data, y_train.shape[1]))
60 | for i in range(len_augmented_data):
61 | random_index = np.random.randint(x_train.shape[0])
62 |             sample_trace = x_train[random_index].copy()  # copy so the original trace is not modified
63 | random_window = np.random.randint(x_train.shape[1] - window)
64 | sample_trace[random_window : random_window + window] = 0
65 | augmented_data[i] = sample_trace
66 | augmented_labels[i] = y_train[random_index]
67 | return np.concatenate((x_train, augmented_data), axis=0), np.concatenate(
68 | (y_train, augmented_labels), axis=0
69 | )
70 |
71 |
72 | if __name__ == "__main__":
73 | leakages = np.random.randint(10, size=(10, 10))
74 | labels = np.random.randint(3, size=(10, 5))
75 | mixup = Mixup()
76 | x, y = mixup.generate(x_train=leakages, y_train=labels, ratio=0.5)
77 | print("result of mixup")
78 | print(f"x={x}, y={y}")
79 | random_crop = RandomCrop()
80 | x, y = random_crop.generate(x_train=leakages, y_train=labels, ratio=0.5, window=5)
81 | print("result of random crop")
82 | print(f"x={x}, y={y}")
83 |
--------------------------------------------------------------------------------
/scadl/multi_label_profile.py:
--------------------------------------------------------------------------------
1 | # This file is part of scadl
2 | #
3 | # scadl is free software: you can redistribute it and/or modify
4 | # it under the terms of the GNU Lesser General Public License as published by
5 | # the Free Software Foundation, either version 3 of the License, or
6 | # (at your option) any later version.
7 | #
8 | # This program is distributed in the hope that it will be useful,
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 | # GNU Lesser General Public License for more details.
12 | #
13 | # You should have received a copy of the GNU Lesser General Public License
14 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
15 | #
16 | #
17 | # Copyright 2024 Karim ABDELLATIF, PhD, Ledger - karim.abdellatif@ledger.fr
18 |
19 |
20 | from collections.abc import Callable
21 |
22 | import numpy as np
23 | from keras.models import Model
24 |
25 |
26 | class MultiLabelProfile:
27 | """This class is used for multi-label classification"""
28 |
29 | def __init__(self, model: Model):
30 | super().__init__()
31 | self.model = model
32 | self.history = None
33 |
34 | def train(
35 | self,
36 | x_train: np.ndarray,
37 | y_train: np.ndarray,
38 | epochs: int = 300,
39 | batch_size: int = 100,
40 | validation_split: float = 0.1,
41 | **kwargs,
42 | ):
43 | """This function accepts
44 | x_train: np.array,
45 | y_train: np.array,
46 | """
47 | self.history = self.model.fit(
48 | x_train,
49 | y_train,
50 | epochs=epochs,
51 | batch_size=batch_size,
52 | validation_split=validation_split,
53 | **kwargs,
54 | )
55 |
56 | def save_model(self, name: str):
57 | """It takes a string name and saves the model"""
58 | self.model.save(name)
59 |
60 |
61 | class MatchMultiLabel:
62 | """This class is used for testing the attack"""
63 |
64 | def __init__(self, model: Model, leakage_model: Callable):
65 | super().__init__()
66 | self.model = model
67 | self.leakage_model = leakage_model
68 |
69 | def match(
70 | self,
71 | x_test: np.ndarray,
72 | metadata: np.ndarray,
73 | guess_range: int,
74 | correct_key: int,
75 | step: int,
76 | prob_range: tuple[int, int] = (0, 256),
77 | ) -> tuple[np.ndarray, np.ndarray]:
78 | """
79 | x_test, metadata: data used for profiling.
80 | prob_range depending on the targeted byte
81 | for ex: k0: (0, 256), k1: (256, 512), k2: (512, 768), .... etc
82 | """
83 | predictions = self.model.predict(x_test)[:, prob_range[0] : prob_range[1]]
84 |
85 | chunk_starts = range(0, len(x_test), step)
86 | rank = np.zeros(len(chunk_starts), dtype=np.uint32)
87 | x_rank = np.zeros(len(chunk_starts), dtype=np.uint32)
88 | number_traces = 0
89 | rank_array = np.zeros(guess_range)
90 | for i, chunk_start in enumerate(chunk_starts):
91 | pred_chunk = predictions[chunk_start : chunk_start + step]
92 | metadata_chunk = metadata[chunk_start : chunk_start + step]
93 | for row in range(len(pred_chunk)):
94 | for guess in range(guess_range):
95 | index = self.leakage_model(metadata_chunk[row], guess)
96 | if pred_chunk[row, index] != 0:
97 | rank_array[guess] += np.log(pred_chunk[row, index])
98 | rank[i] = np.where(sorted(rank_array)[::-1] == rank_array[correct_key])[0][
99 | 0
100 | ]
101 |
102 | number_traces += step
103 | x_rank[i] = number_traces
104 |
105 | return rank, x_rank
106 |
--------------------------------------------------------------------------------
/tutorial/profile/ascad/profile.py:
--------------------------------------------------------------------------------
1 | import sys
2 | from pathlib import Path
3 |
4 | import h5py
5 | import keras
6 | import numpy as np
7 | from keras.layers import Conv1D, Dense, Flatten, Input, MaxPooling1D
8 | from keras.models import Sequential
9 |
10 | from scadl.augmentation import Mixup, RandomCrop
11 | from scadl.profile import Profile
12 | from scadl.tools import normalization, remove_avg, sbox
13 |
14 |
15 | def leakage_model(data):
16 | """leakage model for sbox[2]"""
17 | return sbox[data["plaintext"][2] ^ data["key"][2]]
18 |
19 |
20 | def aug_mixup(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
21 | """Data augmentation based on mixup"""
22 | mix = Mixup()
23 | x, y = mix.generate(x_train=x, y_train=y, ratio=1, alpha=1)
24 | return x, y
25 |
26 |
27 | def aug_crop(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
28 | """Data augmentation based on random crop"""
29 | mix = RandomCrop()
30 | x, y = mix.generate(x_train=x, y_train=y, ratio=1, window=5)
31 | return x, y
32 |
33 |
34 | def mlp_short(len_samples: int) -> keras.Model:
35 | model = Sequential()
36 | model.add(Input(shape=(len_samples,)))
37 | model.add(Dense(20, activation="relu"))
38 | # BatchNormalization()
39 | model.add(Dense(50, activation="relu"))
40 | model.add(Dense(256, activation="softmax"))
41 | model.compile(
42 | loss="categorical_crossentropy",
43 | optimizer="adam",
44 | metrics=["accuracy"],
45 | )
46 | return model
47 |
48 |
49 | def model_cnn(sample_len: int, range_outer_layer: int) -> keras.Model:
50 | model = Sequential()
51 | model.add(Input(shape=(sample_len, 1)))
52 | model.add(
53 | Conv1D(
54 | filters=8,
55 | kernel_size=32,
56 | padding="same",
57 | activation="relu",
58 | )
59 | )
60 | model.add(MaxPooling1D(pool_size=3))
61 | model.add(
62 | Conv1D(
63 | filters=8,
64 | kernel_size=16,
65 | padding="same",
66 | activation="tanh",
67 | )
68 | )
69 | model.add(MaxPooling1D(pool_size=3))
70 | model.add(
71 | Conv1D(
72 | filters=8,
73 | kernel_size=8,
74 | padding="same",
75 | activation="tanh",
76 | )
77 | )
78 | model.add(MaxPooling1D(pool_size=2))
79 | model.add(Flatten())
80 | model.add(Dense(50, activation="relu"))
81 | model.add(Dense(range_outer_layer, activation="softmax"))
82 | model.compile(
83 | optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"]
84 | )
85 | return model
86 |
87 |
88 | if __name__ == "__main__":
89 | if len(sys.argv) != 3:
90 | print("Need to specify the location of training data and model")
91 | exit()
92 | dataset_dir = Path(sys.argv[1])
93 |
94 | # Load traces and metadata for training
95 | file = h5py.File(dataset_dir / "ASCAD.h5", "r")
96 | leakages = file["Profiling_traces"]["traces"][:]
97 | metadata = file["Profiling_traces"]["metadata"][:]
98 |
99 | # Select POIs where SNR is high
100 | poi = (
101 | leakages # np.concatenate((leakages[:, 515:520], leakages[:, 148:158]), axis=1)
102 | )
103 |
104 | # Preprocess traces
105 | x_train = normalization(remove_avg(poi), feature_range=(-1, 1))
106 | GUESS_RANGE = 256
107 |
108 | # Build the model
109 | if sys.argv[2] == "mlp":
110 | model = mlp_short(x_train.shape[1])
111 | elif sys.argv[2] == "cnn":
112 | model = model_cnn(x_train.shape[1], GUESS_RANGE)
113 | else:
114 | print("Invalid model type")
115 | exit()
116 |
117 | model.summary()
118 |
119 | # Train the model
120 | profile_engine = Profile(model, leakage_model=leakage_model)
121 | profile_engine.data_augmentation(aug_mixup)
122 | profile_engine.train(
123 | x_train=x_train,
124 | metadata=metadata,
125 | guess_range=256,
126 | epochs=50,
127 | batch_size=128,
128 | validation_split=0.1,
129 | data_augmentation=False,
130 | )
131 |
132 | profile_engine.save_model("model.keras")
133 |
--------------------------------------------------------------------------------
/tutorial/visualization/simulator.py:
--------------------------------------------------------------------------------
1 | import innvestigate
2 | import keras
3 | import matplotlib.pyplot as plt
4 | import numpy as np
5 | import tensorflow as tf
6 | from keras.layers import Dense, Input
7 | from keras.models import Sequential
8 |
9 | from scadl.augmentation import Mixup
10 | from scadl.profile import Profile
11 | from scadl.tools import normalization
12 |
13 | tf.compat.v1.disable_eager_execution()
14 |
15 |
16 | def aug_mixup(x: np.ndarray, y: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
17 | """Data augmentation based on mixup"""
18 | mix = Mixup()
19 | x, y = mix.generate(x_train=x, y_train=y, ratio=1, alpha=1)
20 | return x, y
21 |
22 |
23 | def model_mlp(sample_len: int, range_outer_layer: int) -> keras.Model:
24 | """param sample_len: number of samples
25 | param range_outer_layer: Number of guess
26 | """
27 | model = Sequential()
28 | model.add(Input(shape=(sample_len,)))
29 | model.add(Dense(50, activation="relu"))
30 | model.add(Dense(50, activation="relu"))
31 | model.add(Dense(range_outer_layer, activation="softmax"))
32 | model.compile(
33 | optimizer="adam",
34 | loss="categorical_crossentropy",
35 | metrics=["accuracy"],
36 | )
37 | return model
38 |
39 |
40 | def leakage_model(data: np.ndarray) -> int:
41 | """leakage model"""
42 | return data
43 |
44 |
45 | def simulator(
46 | len_traces: int, len_samples: int, value: int, randomize=False
47 | ) -> np.ndarray:
48 | """A leakage value simulator"""
49 | leakages = np.random.uniform(1, 1.1, size=(len_traces, len_samples))
50 | if randomize:
51 | for index in range(len_traces):
52 | random_offset = np.random.randint(len_samples)
53 | leakages[index, random_offset] = value
54 | return leakages
55 |
56 |
57 | def handy_ttest(group_a: np.ndarray, group_b: np.ndarray) -> np.ndarray:
58 | """t-test engine"""
59 | mean_a = np.mean(group_a, axis=0)
60 | mean_b = np.mean(group_b, axis=0)
61 | var_a = np.var(group_a, axis=0)
62 | var_b = np.var(group_b, axis=0)
63 | dec_1 = var_a / group_a.shape[0]
64 | dec_2 = var_b / group_b.shape[0]
65 | dec = np.sqrt(dec_1 + dec_2)
66 | num = mean_a - mean_b
67 | return num / dec
68 |
69 |
70 | def main():
71 | """main function"""
72 | len_traces = 50000
73 | len_samples = 500
74 | value_unprotect = 10
75 | traces_unprotect = simulator(
76 | len_traces=len_traces,
77 | len_samples=len_samples,
78 | value=value_unprotect,
79 | randomize=True,
80 | )
81 | value_protect = 20
82 | traces_protect = simulator(
83 | len_traces=len_traces,
84 | len_samples=len_samples,
85 | value=value_protect,
86 | randomize=True,
87 | )
88 | # t_test = handy_ttest(traces_protect, traces_unprotect)
89 | dif_mean = abs(
90 | np.average(traces_unprotect, axis=0) - np.average(traces_protect, axis=0)
91 | )
92 | plt.style.use("dark_background")
93 | _, (ax0, ax1) = plt.subplots(2)
94 | ax0.plot(traces_unprotect[0:100].T)
95 | ax0.plot(traces_protect[0:100].T)
96 | ax1.plot(dif_mean) # t_test
97 | plt.show()
98 |
99 | labels_0 = np.zeros(len_traces)
100 | labels_1 = np.ones(len_traces)
101 | leakages = np.concatenate((traces_unprotect, traces_protect), axis=0)
102 | metadata = np.concatenate((labels_0, labels_1), axis=0)
103 | x_train = normalization((leakages), feature_range=(-1, 1))
104 | guess_range = 2
105 | model = model_mlp(len_samples, guess_range)
106 | # Train the model
107 | profile_engine = Profile(model, leakage_model=leakage_model)
108 | number_epochs = 5
109 | profile_engine.data_augmentation(aug_mixup)
110 | profile_engine.train(
111 | x_train=x_train,
112 | metadata=metadata,
113 | guess_range=guess_range,
114 | epochs=number_epochs,
115 | batch_size=10,
116 | validation_split=0.1,
117 | data_augmentation=False,
118 | )
119 |
120 | # plt.plot(profile_engine.history.history['loss'], 'r')
121 | # plt.plot(profile_engine.history.history['val_loss'], 'b')
122 | # plt.show()
123 |
124 | model_wo_sm = innvestigate.model_wo_softmax(model)
125 | # gradient_analyzer = innvestigate.analyzer.Gradient(model_wo_sm)
126 | gradient_analyzer = innvestigate.analyzer.DeepTaylor(model_wo_sm)
127 |     vis_trace = np.zeros(len_samples)
128 |     for i in range(len_traces):
129 |         trace_sample = traces_protect[i]
130 |         trace = trace_sample.reshape(1, len_samples)
131 |         # accumulate the attribution over all protected traces
132 |         vis_trace += gradient_analyzer.analyze(trace)[0]
133 | _, (ax0, ax1, ax2) = plt.subplots(3)
134 | ax0.plot(traces_protect[0:100].T)
135 | ax0.plot(traces_unprotect[0:100].T)
136 | ax1.plot(trace_sample, "blue")
137 | ax2.plot(abs(vis_trace), "red")
138 | plt.show()
139 |
140 |
141 | if __name__ == "__main__":
142 | main()
143 |
--------------------------------------------------------------------------------
/scadl/profile.py:
--------------------------------------------------------------------------------
1 | # This file is part of scadl
2 | #
3 | # scadl is free software: you can redistribute it and/or modify
4 | # it under the terms of the GNU Lesser General Public License as published by
5 | # the Free Software Foundation, either version 3 of the License, or
6 | # (at your option) any later version.
7 | #
8 | # This program is distributed in the hope that it will be useful,
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 | # GNU Lesser General Public License for more details.
12 | #
13 | # You should have received a copy of the GNU Lesser General Public License
14 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
15 | #
16 | #
17 | # Copyright 2024 Karim ABDELLATIF, PhD, Ledger - karim.abdellatif@ledger.fr
18 |
19 | from collections.abc import Callable
20 | from typing import Optional
21 |
22 | import keras
23 | import numpy as np
24 | from keras.models import Model
25 | from sklearn.model_selection import train_test_split
26 |
27 |
28 | class Profile:
29 | """This class is used for normal profiling.
30 |     It takes two arguments: the DL model and the leakage model.
31 | """
32 |
33 | def __init__(self, model: Model, leakage_model: Callable[[np.ndarray], int]):
34 | super().__init__()
35 | self.model = model
36 | self.leakage_model = leakage_model
37 | self.data_aug: Optional[
38 | Callable[[np.ndarray, np.ndarray], tuple[np.ndarray, np.ndarray]]
39 | ] = None
40 | self.history = None
41 |
42 | def data_augmentation(
43 | self,
44 | func_aug: Callable[[np.ndarray, np.ndarray], tuple[np.ndarray, np.ndarray]],
45 | ):
46 | """to pass the self"""
47 | self.data_aug = func_aug
48 |
49 | def train(
50 | self,
51 | x_train: np.ndarray,
52 | metadata: np.ndarray,
53 | guess_range: int,
54 | epochs: int = 300,
55 | batch_size: int = 100,
56 | validation_split: float = 0.1,
57 | data_augmentation: bool = False,
58 | verbose: int = 1,
59 | **kwargs,
60 | ):
61 |         """Trains the model.
62 |         x_train: points of interest (POIs) selected from the leakage traces
63 |         metadata: the plaintexts/keys/ciphertexts used to compute the profiling labels
64 | """
65 |
66 |         assert self.data_aug is not None, "call data_augmentation() before train()"
67 |
68 | y_train = np.array([self.leakage_model(m) for m in metadata])
69 | y_train = keras.utils.to_categorical(y_train, guess_range)
70 | if data_augmentation:
71 | x, y = self.data_aug(x_train, y_train)
72 | else:
73 | x, y = x_train, y_train
74 |
75 | x_training, x_test, y_training, y_test = train_test_split(
76 | x, y, test_size=validation_split
77 | )
78 | self.history = self.model.fit(
79 | x_training,
80 | y_training,
81 | epochs=epochs,
82 | batch_size=batch_size,
83 | verbose=verbose,
84 | validation_data=(x_test, y_test),
85 | **kwargs,
86 | )
87 |
88 | def save_model(self, name: str):
89 |         """Saves the trained model under the given file name."""
90 | self.model.save(name)
91 |
92 |
93 | class Match:
94 | """This class is used for testing the attack after the profiling phase"""
95 |
96 | def __init__(self, model: Model, leakage_model: Callable[[np.ndarray, int], int]):
97 |         """model: the trained profiling model, used here to mount the attack
98 |         leakage_model: the same leakage model used during profiling
99 | """
100 | super().__init__()
101 | self.model = model
102 | self.leakage_model = leakage_model
103 |
104 | def match(
105 | self,
106 | x_test: np.ndarray,
107 | metadata: np.ndarray,
108 | guess_range: int,
109 | correct_key: int,
110 | step: int,
111 | ) -> tuple[np.ndarray, np.ndarray]:
112 |         """The key rank is computed from the sum of np.log2() of the predicted probabilities.
113 |         The success rate is calculated as shown in https://eprint.iacr.org/2006/139.pdf
114 | """
115 | predictions = self.model.predict(x_test)
116 |
117 | chunk_starts = range(0, len(x_test), step)
118 | rank = np.zeros(len(chunk_starts), dtype=np.uint32)
119 | x_rank = np.zeros(len(chunk_starts), dtype=np.uint32)
120 | number_traces = 0
121 | rank_array = np.zeros(guess_range)
122 | for i, chunk_start in enumerate(chunk_starts):
123 | pred_chunk = predictions[chunk_start : chunk_start + step]
124 | metadata_chunk = metadata[chunk_start : chunk_start + step]
125 | for row in range(len(pred_chunk)):
126 | for guess in range(guess_range):
127 | index = self.leakage_model(metadata_chunk[row], guess)
128 | if pred_chunk[row, index] != 0:
129 | rank_array[guess] += np.log2(pred_chunk[row, index])
130 |             # Rank of the correct key = number of guesses with a strictly
131 |             # higher accumulated score (ties favour the correct key).
132 |             rank[i] = np.count_nonzero(rank_array > rank_array[correct_key])
133 |
134 | number_traces += step
135 | x_rank[i] = number_traces
136 |
137 | return rank, x_rank
138 |
--------------------------------------------------------------------------------
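For reference, a minimal end-to-end sketch of how `Profile` and `Match` are intended to be chained, assuming the package is installed so that `scadl.profile` and `scadl.tools` are importable. It runs on synthetic data so that it is self-contained; the tiny MLP, the S-box leakage models, and all sizes are illustrative assumptions (not the repository's tutorials), and no key recovery is expected on random traces.

# Editor's sketch (synthetic data): profiling with Profile, key ranking with Match.
import numpy as np
from keras.layers import Dense
from keras.models import Sequential
from scadl.profile import Match, Profile
from scadl.tools import normalization, sbox

LEN_SAMPLES, N_TRACES, KEY = 50, 2000, 0x2B  # illustrative sizes and key byte

def leakage_profile(meta):
    # profiling label: first-round S-box output of the known key byte
    return sbox[meta["plaintext"] ^ meta["key"]]

def leakage_attack(meta, guess):
    # index of the probability accumulated for each key guess
    return sbox[meta["plaintext"] ^ guess]

def mlp(len_samples, guess_range):
    model = Sequential([
        Dense(20, activation="relu", input_shape=(len_samples,)),
        Dense(guess_range, activation="softmax"),
    ])
    model.compile(loss="categorical_crossentropy", optimizer="adam", metrics=["accuracy"])
    return model

rng = np.random.default_rng(0)
traces = normalization(rng.normal(size=(N_TRACES, LEN_SAMPLES)), feature_range=(-1, 1))
metadata = np.array(
    [(p, KEY) for p in rng.integers(0, 256, N_TRACES)],
    dtype=[("plaintext", np.uint8), ("key", np.uint8)],
)

model = mlp(LEN_SAMPLES, 256)
profiler = Profile(model, leakage_model=leakage_profile)
profiler.data_augmentation(lambda x, y: (x, y))  # train() asserts a function is registered
profiler.train(x_train=traces, metadata=metadata, guess_range=256, epochs=2, batch_size=128)

attacker = Match(model, leakage_model=leakage_attack)
rank, n_traces = attacker.match(
    x_test=traces, metadata=metadata, guess_range=256, correct_key=KEY, step=500
)
print(rank, n_traces)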
/scadl/tools.py:
--------------------------------------------------------------------------------
1 | # This file is part of scadl
2 | #
3 | # scadl is free software: you can redistribute it and/or modify
4 | # it under the terms of the GNU Lesser General Public License as published by
5 | # the Free Software Foundation, either version 3 of the License, or
6 | # (at your option) any later version.
7 | #
8 | # This program is distributed in the hope that it will be useful,
9 | # but WITHOUT ANY WARRANTY; without even the implied warranty of
10 | # MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
11 | # GNU Lesser General Public License for more details.
12 | #
13 | # You should have received a copy of the GNU Lesser General Public License
14 | # along with this program. If not, see <https://www.gnu.org/licenses/>.
15 | #
16 | #
17 | # Copyright 2024 Karim ABDELLATIF, PhD, Ledger - karim.abdellatif@ledger.fr
18 |
19 |
20 | import numpy as np
21 |
22 | # fmt: off
23 | sbox = np.array([
24 | 0x63, 0x7C, 0x77, 0x7B, 0xF2, 0x6B, 0x6F, 0xC5, 0x30, 0x01, 0x67, 0x2B, 0xFE, 0xD7, 0xAB, 0x76,
25 | 0xCA, 0x82, 0xC9, 0x7D, 0xFA, 0x59, 0x47, 0xF0, 0xAD, 0xD4, 0xA2, 0xAF, 0x9C, 0xA4, 0x72, 0xC0,
26 | 0xB7, 0xFD, 0x93, 0x26, 0x36, 0x3F, 0xF7, 0xCC, 0x34, 0xA5, 0xE5, 0xF1, 0x71, 0xD8, 0x31, 0x15,
27 | 0x04, 0xC7, 0x23, 0xC3, 0x18, 0x96, 0x05, 0x9A, 0x07, 0x12, 0x80, 0xE2, 0xEB, 0x27, 0xB2, 0x75,
28 | 0x09, 0x83, 0x2C, 0x1A, 0x1B, 0x6E, 0x5A, 0xA0, 0x52, 0x3B, 0xD6, 0xB3, 0x29, 0xE3, 0x2F, 0x84,
29 | 0x53, 0xD1, 0x00, 0xED, 0x20, 0xFC, 0xB1, 0x5B, 0x6A, 0xCB, 0xBE, 0x39, 0x4A, 0x4C, 0x58, 0xCF,
30 | 0xD0, 0xEF, 0xAA, 0xFB, 0x43, 0x4D, 0x33, 0x85, 0x45, 0xF9, 0x02, 0x7F, 0x50, 0x3C, 0x9F, 0xA8,
31 | 0x51, 0xA3, 0x40, 0x8F, 0x92, 0x9D, 0x38, 0xF5, 0xBC, 0xB6, 0xDA, 0x21, 0x10, 0xFF, 0xF3, 0xD2,
32 | 0xCD, 0x0C, 0x13, 0xEC, 0x5F, 0x97, 0x44, 0x17, 0xC4, 0xA7, 0x7E, 0x3D, 0x64, 0x5D, 0x19, 0x73,
33 | 0x60, 0x81, 0x4F, 0xDC, 0x22, 0x2A, 0x90, 0x88, 0x46, 0xEE, 0xB8, 0x14, 0xDE, 0x5E, 0x0B, 0xDB,
34 | 0xE0, 0x32, 0x3A, 0x0A, 0x49, 0x06, 0x24, 0x5C, 0xC2, 0xD3, 0xAC, 0x62, 0x91, 0x95, 0xE4, 0x79,
35 | 0xE7, 0xC8, 0x37, 0x6D, 0x8D, 0xD5, 0x4E, 0xA9, 0x6C, 0x56, 0xF4, 0xEA, 0x65, 0x7A, 0xAE, 0x08,
36 | 0xBA, 0x78, 0x25, 0x2E, 0x1C, 0xA6, 0xB4, 0xC6, 0xE8, 0xDD, 0x74, 0x1F, 0x4B, 0xBD, 0x8B, 0x8A,
37 | 0x70, 0x3E, 0xB5, 0x66, 0x48, 0x03, 0xF6, 0x0E, 0x61, 0x35, 0x57, 0xB9, 0x86, 0xC1, 0x1D, 0x9E,
38 | 0xE1, 0xF8, 0x98, 0x11, 0x69, 0xD9, 0x8E, 0x94, 0x9B, 0x1E, 0x87, 0xE9, 0xCE, 0x55, 0x28, 0xDF,
39 | 0x8C, 0xA1, 0x89, 0x0D, 0xBF, 0xE6, 0x42, 0x68, 0x41, 0x99, 0x2D, 0x0F, 0xB0, 0x54, 0xBB, 0x16,
40 | ], dtype=np.uint8)
41 | # fmt: on
42 |
43 | # fmt: off
44 | inv_sbox = np.array([
45 | 0x52, 0x09, 0x6A, 0xD5, 0x30, 0x36, 0xA5, 0x38, 0xBF, 0x40, 0xA3, 0x9E, 0x81, 0xF3, 0xD7, 0xFB,
46 | 0x7C, 0xE3, 0x39, 0x82, 0x9B, 0x2F, 0xFF, 0x87, 0x34, 0x8E, 0x43, 0x44, 0xC4, 0xDE, 0xE9, 0xCB,
47 | 0x54, 0x7B, 0x94, 0x32, 0xA6, 0xC2, 0x23, 0x3D, 0xEE, 0x4C, 0x95, 0x0B, 0x42, 0xFA, 0xC3, 0x4E,
48 | 0x08, 0x2E, 0xA1, 0x66, 0x28, 0xD9, 0x24, 0xB2, 0x76, 0x5B, 0xA2, 0x49, 0x6D, 0x8B, 0xD1, 0x25,
49 | 0x72, 0xF8, 0xF6, 0x64, 0x86, 0x68, 0x98, 0x16, 0xD4, 0xA4, 0x5C, 0xCC, 0x5D, 0x65, 0xB6, 0x92,
50 | 0x6C, 0x70, 0x48, 0x50, 0xFD, 0xED, 0xB9, 0xDA, 0x5E, 0x15, 0x46, 0x57, 0xA7, 0x8D, 0x9D, 0x84,
51 | 0x90, 0xD8, 0xAB, 0x00, 0x8C, 0xBC, 0xD3, 0x0A, 0xF7, 0xE4, 0x58, 0x05, 0xB8, 0xB3, 0x45, 0x06,
52 | 0xD0, 0x2C, 0x1E, 0x8F, 0xCA, 0x3F, 0x0F, 0x02, 0xC1, 0xAF, 0xBD, 0x03, 0x01, 0x13, 0x8A, 0x6B,
53 | 0x3A, 0x91, 0x11, 0x41, 0x4F, 0x67, 0xDC, 0xEA, 0x97, 0xF2, 0xCF, 0xCE, 0xF0, 0xB4, 0xE6, 0x73,
54 | 0x96, 0xAC, 0x74, 0x22, 0xE7, 0xAD, 0x35, 0x85, 0xE2, 0xF9, 0x37, 0xE8, 0x1C, 0x75, 0xDF, 0x6E,
55 | 0x47, 0xF1, 0x1A, 0x71, 0x1D, 0x29, 0xC5, 0x89, 0x6F, 0xB7, 0x62, 0x0E, 0xAA, 0x18, 0xBE, 0x1B,
56 | 0xFC, 0x56, 0x3E, 0x4B, 0xC6, 0xD2, 0x79, 0x20, 0x9A, 0xDB, 0xC0, 0xFE, 0x78, 0xCD, 0x5A, 0xF4,
57 | 0x1F, 0xDD, 0xA8, 0x33, 0x88, 0x07, 0xC7, 0x31, 0xB1, 0x12, 0x10, 0x59, 0x27, 0x80, 0xEC, 0x5F,
58 | 0x60, 0x51, 0x7F, 0xA9, 0x19, 0xB5, 0x4A, 0x0D, 0x2D, 0xE5, 0x7A, 0x9F, 0x93, 0xC9, 0x9C, 0xEF,
59 | 0xA0, 0xE0, 0x3B, 0x4D, 0xAE, 0x2A, 0xF5, 0xB0, 0xC8, 0xEB, 0xBB, 0x3C, 0x83, 0x53, 0x99, 0x61,
60 | 0x17, 0x2B, 0x04, 0x7E, 0xBA, 0x77, 0xD6, 0x26, 0xE1, 0x69, 0x14, 0x63, 0x55, 0x21, 0x0C, 0x7D,
61 | ], dtype=np.uint8)
62 | # fmt: on
63 |
64 |
65 | def is_valid(data: np.ndarray) -> bool:
66 | """Check if all elements of :data: are valid (real values without nan)."""
67 |
68 |     # np.isreal returns True for nan, hence the additional np.isnan check
69 | return np.isreal(data).all() and not np.isnan(data).any()
70 |
71 |
72 | def normalization(
73 | data: np.ndarray, feature_range: tuple[float, float] = (0, 1), check: bool = True
74 | ) -> np.ndarray:
75 | """Normalize :data: between :feature_range[0]: and :feature_range[1]:.
76 |
77 | If :check: is True, the result is checked for invalid values.
78 | """
79 | assert feature_range[0] < feature_range[1]
80 |
81 | data = (data - np.min(data, axis=0)) / (np.max(data, axis=0) - np.min(data, axis=0))
82 | data = data * (feature_range[1] - feature_range[0]) + feature_range[0]
83 |
84 | if check:
85 | assert is_valid(data)
86 |
87 | return data
88 |
89 |
90 | def standardize(data: np.ndarray, check: bool = True) -> np.ndarray:
91 | """Standardize :data:.
92 |
93 | If :check: is True, the result is checked for invalid values.
94 | """
95 | data = (data - np.mean(data, axis=0)) / np.std(data, axis=0)
96 |
97 | if check:
98 | assert is_valid(data)
99 |
100 | return data
101 |
102 |
103 | def remove_avg(traces: np.ndarray) -> np.ndarray:
104 |     """Returns the traces with the mean trace (average over all traces) subtracted."""
105 | return traces - np.mean(traces, axis=0)
106 |
107 |
108 | def gen_labels(leakage_model, metadata: np.ndarray, key_byte: int) -> np.ndarray:
109 |     """Generates the labels used for DL training from metadata.
110 |     leakage_model: the leakage function,
111 |     metadata: the plaintexts/keys/ciphertexts,
112 |     key_byte: index of the attacked key byte.
113 |     Returns the array of labels."""
114 | return np.array([leakage_model(m, key_byte=key_byte) for m in metadata])
115 |
--------------------------------------------------------------------------------
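A short, self-contained sketch of the tools.py helpers on random data; the structured metadata layout and the single-byte S-box leakage model are illustrative assumptions by the editor.

# Editor's sketch: exercising the tools.py helpers on synthetic data.
import numpy as np
from scadl.tools import gen_labels, normalization, remove_avg, sbox, standardize

rng = np.random.default_rng(0)
traces = rng.normal(size=(1000, 100))  # 1000 traces of 100 samples each
metadata = np.array(
    [(p, k) for p, k in zip(rng.integers(0, 256, 1000), rng.integers(0, 256, 1000))],
    dtype=[("plaintext", np.uint8), ("key", np.uint8)],
)

scaled = normalization(traces, feature_range=(-1, 1))  # per-sample min/max scaling
zscored = standardize(traces)                          # per-sample zero mean, unit variance
centered = remove_avg(traces)                          # mean trace subtracted

def leakage_model(meta, key_byte):
    # illustrative single-byte model; key_byte is unused since only one byte is simulated
    return sbox[meta["plaintext"] ^ meta["key"]]

labels = gen_labels(leakage_model, metadata, key_byte=0)
print(scaled.shape, zscored.shape, centered.shape, labels[:5])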
/COPYING.LESSER:
--------------------------------------------------------------------------------
1 | GNU LESSER GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 |
9 | This version of the GNU Lesser General Public License incorporates
10 | the terms and conditions of version 3 of the GNU General Public
11 | License, supplemented by the additional permissions listed below.
12 |
13 | 0. Additional Definitions.
14 |
15 | As used herein, "this License" refers to version 3 of the GNU Lesser
16 | General Public License, and the "GNU GPL" refers to version 3 of the GNU
17 | General Public License.
18 |
19 | "The Library" refers to a covered work governed by this License,
20 | other than an Application or a Combined Work as defined below.
21 |
22 | An "Application" is any work that makes use of an interface provided
23 | by the Library, but which is not otherwise based on the Library.
24 | Defining a subclass of a class defined by the Library is deemed a mode
25 | of using an interface provided by the Library.
26 |
27 | A "Combined Work" is a work produced by combining or linking an
28 | Application with the Library. The particular version of the Library
29 | with which the Combined Work was made is also called the "Linked
30 | Version".
31 |
32 | The "Minimal Corresponding Source" for a Combined Work means the
33 | Corresponding Source for the Combined Work, excluding any source code
34 | for portions of the Combined Work that, considered in isolation, are
35 | based on the Application, and not on the Linked Version.
36 |
37 | The "Corresponding Application Code" for a Combined Work means the
38 | object code and/or source code for the Application, including any data
39 | and utility programs needed for reproducing the Combined Work from the
40 | Application, but excluding the System Libraries of the Combined Work.
41 |
42 | 1. Exception to Section 3 of the GNU GPL.
43 |
44 | You may convey a covered work under sections 3 and 4 of this License
45 | without being bound by section 3 of the GNU GPL.
46 |
47 | 2. Conveying Modified Versions.
48 |
49 | If you modify a copy of the Library, and, in your modifications, a
50 | facility refers to a function or data to be supplied by an Application
51 | that uses the facility (other than as an argument passed when the
52 | facility is invoked), then you may convey a copy of the modified
53 | version:
54 |
55 | a) under this License, provided that you make a good faith effort to
56 | ensure that, in the event an Application does not supply the
57 | function or data, the facility still operates, and performs
58 | whatever part of its purpose remains meaningful, or
59 |
60 | b) under the GNU GPL, with none of the additional permissions of
61 | this License applicable to that copy.
62 |
63 | 3. Object Code Incorporating Material from Library Header Files.
64 |
65 | The object code form of an Application may incorporate material from
66 | a header file that is part of the Library. You may convey such object
67 | code under terms of your choice, provided that, if the incorporated
68 | material is not limited to numerical parameters, data structure
69 | layouts and accessors, or small macros, inline functions and templates
70 | (ten or fewer lines in length), you do both of the following:
71 |
72 | a) Give prominent notice with each copy of the object code that the
73 | Library is used in it and that the Library and its use are
74 | covered by this License.
75 |
76 | b) Accompany the object code with a copy of the GNU GPL and this license
77 | document.
78 |
79 | 4. Combined Works.
80 |
81 | You may convey a Combined Work under terms of your choice that,
82 | taken together, effectively do not restrict modification of the
83 | portions of the Library contained in the Combined Work and reverse
84 | engineering for debugging such modifications, if you also do each of
85 | the following:
86 |
87 | a) Give prominent notice with each copy of the Combined Work that
88 | the Library is used in it and that the Library and its use are
89 | covered by this License.
90 |
91 | b) Accompany the Combined Work with a copy of the GNU GPL and this license
92 | document.
93 |
94 | c) For a Combined Work that displays copyright notices during
95 | execution, include the copyright notice for the Library among
96 | these notices, as well as a reference directing the user to the
97 | copies of the GNU GPL and this license document.
98 |
99 | d) Do one of the following:
100 |
101 | 0) Convey the Minimal Corresponding Source under the terms of this
102 | License, and the Corresponding Application Code in a form
103 | suitable for, and under terms that permit, the user to
104 | recombine or relink the Application with a modified version of
105 | the Linked Version to produce a modified Combined Work, in the
106 | manner specified by section 6 of the GNU GPL for conveying
107 | Corresponding Source.
108 |
109 | 1) Use a suitable shared library mechanism for linking with the
110 | Library. A suitable mechanism is one that (a) uses at run time
111 | a copy of the Library already present on the user's computer
112 | system, and (b) will operate properly with a modified version
113 | of the Library that is interface-compatible with the Linked
114 | Version.
115 |
116 | e) Provide Installation Information, but only if you would otherwise
117 | be required to provide such information under section 6 of the
118 | GNU GPL, and only to the extent that such information is
119 | necessary to install and execute a modified version of the
120 | Combined Work produced by recombining or relinking the
121 | Application with a modified version of the Linked Version. (If
122 | you use option 4d0, the Installation Information must accompany
123 | the Minimal Corresponding Source and Corresponding Application
124 | Code. If you use option 4d1, you must provide the Installation
125 | Information in the manner specified by section 6 of the GNU GPL
126 | for conveying Corresponding Source.)
127 |
128 | 5. Combined Libraries.
129 |
130 | You may place library facilities that are a work based on the
131 | Library side by side in a single library together with other library
132 | facilities that are not Applications and are not covered by this
133 | License, and convey such a combined library under terms of your
134 | choice, if you do both of the following:
135 |
136 | a) Accompany the combined library with a copy of the same work based
137 | on the Library, uncombined with any other library facilities,
138 | conveyed under the terms of this License.
139 |
140 | b) Give prominent notice with the combined library that part of it
141 | is a work based on the Library, and explaining where to find the
142 | accompanying uncombined form of the same work.
143 |
144 | 6. Revised Versions of the GNU Lesser General Public License.
145 |
146 | The Free Software Foundation may publish revised and/or new versions
147 | of the GNU Lesser General Public License from time to time. Such new
148 | versions will be similar in spirit to the present version, but may
149 | differ in detail to address new problems or concerns.
150 |
151 | Each version is given a distinguishing version number. If the
152 | Library as you received it specifies that a certain numbered version
153 | of the GNU Lesser General Public License "or any later version"
154 | applies to it, you have the option of following the terms and
155 | conditions either of that published version or of any later version
156 | published by the Free Software Foundation. If the Library as you
157 | received it does not specify a version number of the GNU Lesser
158 | General Public License, you may choose any version of the GNU Lesser
159 | General Public License ever published by the Free Software Foundation.
160 |
161 | If the Library as you received it specifies that a proxy can decide
162 | whether future versions of the GNU Lesser General Public License shall
163 | apply, that proxy's public statement of acceptance of any version is
164 | permanent authorization for you to choose that version for the
165 | Library.
--------------------------------------------------------------------------------
/COPYING:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 | <one line to give the program's name and a brief idea of what it does.>
635 | Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program. If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 | <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
--------------------------------------------------------------------------------