Commit 166ded18 authored by Rishi Sharma

More graphs

parent 7575bd98
96
0 1
0 67
0 68
0 9
0 82
0 19
0 52
0 91
0 95
1 0
1 66
1 2
1 29
1 51
1 91
1 93
2 1
2 3
2 8
2 41
2 14
2 89
2 27
2 93
3 2
3 4
3 8
3 11
3 44
3 77
3 93
4 3
4 5
4 76
4 81
4 18
4 19
4 59
4 29
5 37
5 4
5 53
5 6
6 5
6 7
6 72
6 40
6 47
6 90
7 8
7 90
7 19
7 6
8 2
8 3
8 7
8 41
8 9
8 77
8 78
8 21
9 0
9 8
9 10
9 21
9 89
9 91
9 28
10 80
10 9
10 11
10 93
11 3
11 10
11 12
11 45
11 80
11 87
12 33
12 11
12 13
12 18
12 53
12 55
13 65
13 50
13 12
13 14
14 2
14 13
14 15
14 81
14 52
14 21
14 58
14 91
15 16
15 60
15 29
15 14
16 33
16 15
16 17
16 21
16 24
16 60
17 16
17 81
17 18
17 63
18 4
18 12
18 17
18 51
18 19
18 89
18 29
19 0
19 4
19 37
19 7
19 40
19 47
19 18
19 20
19 22
19 58
20 69
20 76
20 79
20 19
20 21
20 86
20 95
21 34
21 8
21 9
21 74
21 14
21 16
21 20
21 22
22 19
22 21
22 23
23 66
23 82
23 22
23 24
23 89
23 94
24 37
24 38
24 16
24 84
24 23
24 25
24 58
24 27
25 24
25 26
25 61
26 25
26 27
26 69
27 64
27 33
27 2
27 87
27 24
27 26
27 28
27 31
28 69
28 9
28 74
28 27
28 29
29 1
29 4
29 43
29 15
29 18
29 91
29 28
29 30
30 31
30 29
30 55
31 32
31 75
31 27
31 30
32 65
32 33
32 39
32 84
32 31
33 32
33 34
33 67
33 68
33 12
33 77
33 16
33 27
34 33
34 35
34 70
34 21
34 89
34 92
34 61
35 66
35 34
35 36
35 83
35 58
36 67
36 35
36 37
36 85
36 88
37 36
37 5
37 38
37 51
37 19
37 24
37 61
38 37
38 39
38 77
38 82
38 24
39 32
39 38
39 40
39 74
39 43
39 81
40 68
40 6
40 39
40 41
40 19
40 61
41 2
41 8
41 40
41 42
41 43
41 81
42 41
42 43
42 70
43 39
43 41
43 42
43 44
43 29
44 43
44 3
44 45
45 70
45 75
45 11
45 44
45 78
45 79
45 46
46 45
46 47
47 6
47 75
47 46
47 48
47 19
48 73
48 47
48 49
49 48
49 50
49 52
50 13
50 49
50 83
50 51
50 89
50 92
51 1
51 37
51 70
51 80
51 81
51 18
51 50
51 52
51 87
52 0
52 66
52 14
52 49
52 51
52 53
52 62
52 95
53 5
53 71
53 12
53 83
53 52
53 54
54 55
54 53
54 87
55 12
55 54
55 56
55 92
55 30
55 95
56 80
56 81
56 55
56 57
56 91
57 56
57 89
57 90
57 58
58 35
58 71
58 14
58 19
58 24
58 57
58 59
59 58
59 4
59 60
60 15
60 16
60 84
60 59
60 61
61 34
61 37
61 40
61 25
61 60
61 62
62 52
62 61
62 63
63 64
63 17
63 68
63 62
64 65
64 27
64 63
65 32
65 64
65 66
65 13
65 87
66 1
66 65
66 67
66 35
66 52
66 23
67 0
67 33
67 66
67 36
67 68
67 94
68 0
68 33
68 67
68 69
68 40
68 88
68 94
68 63
69 68
69 70
69 20
69 26
69 28
70 34
70 69
70 71
70 42
70 45
70 51
71 72
71 58
71 53
71 70
72 73
72 71
72 6
72 95
73 48
73 74
73 72
73 84
74 39
74 73
74 75
74 21
74 28
75 74
75 76
75 45
75 47
75 93
75 31
76 4
76 75
76 77
76 20
76 87
76 88
76 95
77 33
77 3
77 38
77 8
77 76
77 78
78 8
78 77
78 45
78 79
79 45
79 78
79 80
79 20
79 94
80 88
80 10
80 11
80 79
80 81
80 51
80 56
80 91
81 4
81 39
81 41
81 14
81 80
81 17
81 82
81 51
81 56
82 0
82 38
82 81
82 83
82 23
82 89
83 35
83 82
83 50
83 84
83 53
84 32
84 73
84 83
84 85
84 24
84 60
85 36
85 86
85 84
86 20
86 85
86 87
87 65
87 11
87 76
87 51
87 54
87 86
87 88
87 27
88 36
88 68
88 76
88 80
88 87
88 89
89 2
89 34
89 9
89 18
89 50
89 82
89 23
89 88
89 57
89 90
90 6
90 7
90 89
90 91
90 57
91 0
91 1
91 9
91 14
91 80
91 56
91 90
91 92
91 29
92 34
92 50
92 55
92 91
92 93
93 1
93 2
93 3
93 10
93 75
93 92
93 94
94 67
94 68
94 79
94 23
94 93
94 95
95 0
95 72
95 76
95 20
95 52
95 55
95 94
96
0 1
0 36
0 13
0 46
0 28
0 95
1 0
1 33
1 2
1 36
1 4
1 43
1 14
1 21
1 91
1 95
2 1
2 3
2 5
2 9
2 23
2 89
3 2
3 4
3 13
3 18
3 90
4 1
4 34
4 3
4 5
4 73
4 10
4 88
4 95
5 2
5 66
5 4
5 6
5 74
5 54
5 90
6 5
6 7
6 74
6 16
6 49
6 80
6 31
7 6
7 8
7 80
7 53
7 21
7 92
8 64
8 68
8 7
8 41
8 9
8 11
8 45
8 54
8 88
9 32
9 2
9 35
9 8
9 10
9 76
9 17
9 85
9 55
10 34
10 4
10 38
10 9
10 11
11 8
11 42
11 10
11 76
11 12
12 73
12 11
12 13
12 56
12 58
12 88
13 0
13 3
13 74
13 12
13 14
13 80
13 25
14 1
14 42
14 13
14 15
14 63
15 39
15 14
15 47
15 16
15 25
16 34
16 36
16 6
16 15
16 17
17 9
17 45
17 79
17 16
17 18
17 24
17 26
17 59
18 3
18 17
18 19
18 84
18 91
19 39
19 41
19 48
19 18
19 20
19 91
20 90
20 19
20 21
20 22
20 26
21 32
21 1
21 7
21 74
21 20
21 22
21 90
21 95
22 74
22 50
22 20
22 21
22 23
23 2
23 66
23 40
23 46
23 48
23 22
23 24
23 95
24 17
24 27
24 25
24 23
25 13
25 15
25 88
25 24
25 26
25 94
26 17
26 20
26 25
26 27
26 61
27 34
27 69
27 45
27 28
27 24
27 26
27 60
28 0
28 64
28 85
28 57
28 27
28 29
29 65
29 78
29 50
29 28
29 61
29 30
30 38
30 43
30 93
30 29
30 31
31 32
31 67
31 6
31 48
31 93
31 30
32 33
32 35
32 37
32 9
32 43
32 21
32 91
32 92
32 93
32 31
33 32
33 1
33 34
33 71
34 33
34 35
34 4
34 10
34 16
34 81
34 27
35 32
35 34
35 36
35 9
35 51
36 0
36 1
36 35
36 37
36 16
36 56
37 32
37 60
37 38
37 36
38 37
38 39
38 10
38 45
38 30
39 40
39 19
39 38
39 15
40 39
40 41
40 48
40 23
40 91
40 63
41 8
41 40
41 42
41 19
41 85
42 41
42 43
42 11
42 14
42 53
43 32
43 1
43 42
43 44
43 45
43 30
44 43
44 67
44 45
44 46
45 38
45 8
45 43
45 44
45 46
45 17
45 87
45 27
46 0
46 44
46 77
46 45
46 47
46 23
46 61
46 95
47 48
47 65
47 46
47 15
48 40
48 47
48 49
48 19
48 86
48 23
48 60
48 31
49 6
49 79
49 48
49 50
49 89
50 81
50 49
50 51
50 22
50 29
51 35
51 50
51 52
51 86
51 90
51 94
52 66
52 51
52 53
53 7
53 42
53 52
53 54
53 56
53 90
54 8
54 53
54 5
54 55
55 65
55 9
55 56
55 54
56 36
56 74
56 12
56 53
56 55
56 57
57 56
57 58
57 28
58 57
58 59
58 12
59 70
59 75
59 17
59 58
59 60
60 37
60 59
60 48
60 27
60 61
61 46
61 29
61 26
61 60
61 93
61 62
62 68
62 93
62 85
62 61
62 63
63 64
63 40
63 14
63 93
63 62
64 8
64 65
64 28
64 63
65 64
65 66
65 69
65 74
65 47
65 55
65 29
66 65
66 67
66 69
66 5
66 52
66 23
67 66
67 68
67 44
67 86
67 31
68 8
68 67
68 69
68 62
69 65
69 66
69 68
69 70
69 77
69 83
69 27
70 59
70 69
70 78
70 71
71 33
71 70
71 72
71 87
71 90
72 73
72 90
72 71
73 72
73 74
73 4
73 12
74 65
74 5
74 6
74 73
74 75
74 13
74 21
74 22
74 56
75 74
75 59
75 76
76 9
76 75
76 11
76 77
77 69
77 76
77 78
77 46
77 93
78 70
78 77
78 79
78 87
78 29
79 80
79 17
79 78
79 49
80 6
80 7
80 13
80 79
80 81
80 85
81 34
81 80
81 50
81 82
81 88
82 81
82 83
83 82
83 84
83 69
84 18
84 83
84 85
84 95
85 9
85 41
85 80
85 84
85 86
85 88
85 28
85 62
86 67
86 48
86 51
86 85
86 87
86 88
87 71
87 45
87 78
87 86
87 88
88 89
88 4
88 8
88 12
88 81
88 85
88 86
88 87
88 25
89 88
89 49
89 2
89 90
90 3
90 5
90 71
90 72
90 51
90 20
90 21
90 53
90 89
90 91
91 32
91 1
91 40
91 18
91 19
91 90
91 92
92 32
92 91
92 93
92 7
93 32
93 77
93 63
93 30
93 94
93 92
93 61
93 62
93 31
94 25
94 51
94 93
94 95
95 0
95 1
95 4
95 46
95 84
95 21
95 23
95 94
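
The two blocks above are graph files in the edge-list format used by the notebook below: the first line gives the number of nodes (96), and every following line is one directed edge `u v`. As a rough, hedged sketch of how such a file can be parsed into an adjacency structure (the `read_edges` helper below is hypothetical and not part of the decentralizepy API, which provides its own `Graph.read_graph_from_file`):

```
from collections import defaultdict

def read_edges(path):
    """Hypothetical helper: parse an edge-list file whose first line is the
    node count and whose remaining lines are directed edges "u v"."""
    adj = defaultdict(set)
    with open(path) as f:
        n_nodes = int(f.readline())
        for line in f:
            if line.strip():
                u, v = map(int, line.split())
                adj[u].add(v)
    return n_nodes, adj

# Usage (assuming a file written later in the notebook exists on disk):
# n, adj = read_edges("96_nodes_random1.edges")
# print(n, sorted(adj[0]))
```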
%% Cell type:code id: tags:
```
from datasets.Femnist import Femnist
from graphs import SmallWorld
from collections import defaultdict
import os
import json
import numpy as np
```
%% Cell type:code id: tags:
```
a = Femnist
a
```
%% Cell type:code id: tags:
```
b = SmallWorld(6, 2, 2, 1)
```
%% Cell type:code id: tags:
```
b.adj_list
```
%% Cell type:code id: tags:
```
for i in range(12):
    print(b.neighbors(i))
```
%% Cell type:code id: tags:
```
clients = []
```
%% Cell type:code id: tags:
```
num_samples = []
data = defaultdict(lambda : None)
```
%% Cell type:code id: tags:
```
datadir = "./leaf/data/femnist/data/train"
files = os.listdir(datadir)
total_users=0
users = set()
```
%% Cell type:code id: tags:
```
files = os.listdir(datadir)[0:1]
```
%% Cell type:code id: tags:
```
for f in files:
    file_path = os.path.join(datadir, f)
    print(file_path)
    with open(file_path, 'r') as inf:
        client_data = json.load(inf)
    current_users = len(client_data['users'])
    print("Current_Users: ", current_users)
    total_users += current_users
    users.update(client_data['users'])
print("total_users: ", total_users)
print("total_users: ", len(users))
print(client_data['user_data'].keys())
print(np.array(client_data['user_data']['f3408_47']['x']).shape)
print(np.array(client_data['user_data']['f3408_47']['y']).shape)
print(np.array(client_data['user_data']['f3327_11']['x']).shape)
print(np.array(client_data['user_data']['f3327_11']['y']).shape)
print(np.unique(np.array(client_data['user_data']['f3327_11']['y'])))
```
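
The loop above only inspects a single LEAF shard; the `clients`, `num_samples`, and `data` containers created earlier are never filled in the notebook. A hedged sketch (not from the original notebook) of how the per-user samples could be aggregated into them, using only the `users`/`user_data`/`x`/`y` keys that the shard above exposes:

```
# Hedged sketch: aggregate every user's samples into the containers above.
for fname in os.listdir(datadir):
    with open(os.path.join(datadir, fname), 'r') as inf:
        client_data = json.load(inf)
    for user in client_data['users']:
        clients.append(user)
        x = np.array(client_data['user_data'][user]['x'])
        y = np.array(client_data['user_data'][user]['y'])
        num_samples.append(len(y))
        data[user] = (x, y)
print(len(clients), sum(num_samples))
```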
%% Cell type:code id: tags:
```
file = 'run.py'
with open(file, 'r') as inf:
    print(inf.readline().strip())
    print(inf.readlines())
```
%% Cell type:code id: tags:
```
def f(l):
    l[2] = 'c'
a = ['a', 'a', 'a']
print(a)
f(a)
print(a)
```
%% Cell type:code id: tags:
```
l = ['a', 'b', 'c']
print(l[:-1])
```
%% Cell type:code id: tags:
```
from localconfig import LocalConfig
def read_ini(file_path):
    config = LocalConfig(file_path)
    for section in config:
        print("Section: ", section)
        for key, value in config.items(section):
            print((key, value))
    print(dict(config.items('DATASET')))
    return config
config = read_ini("config.ini")
for section in config:
    print(section)
#d = dict(config.sections())
```
%% Cell type:code id: tags:
```
def func(a = 1, b = 2, c = 3):
    print(a + b + c)
l = [3, 5, 7]
func(*l)
```
%% Cell type:code id: tags:
```
from torch import multiprocessing as mp
mp.spawn(fn = func, nprocs = 2, args = [], kwargs = {'a': 4, 'b': 5, 'c': 6})
```
%% Cell type:code id: tags:
```
l = '[0.4, 0.2, 0.3, 0.1]'
type(eval(l))
```
%% Cell type:code id: tags:
```
from decentralizepy.datasets.Femnist import Femnist
f1 = Femnist(0, 1, 'leaf/data/femnist/data/train')
ts = f1.get_trainset(1)
for data, target in ts:
    print(data)
    break
```
%% Cell type:code id: tags:
```
from decentralizepy.datasets.Femnist import Femnist
from decentralizepy.graphs.SmallWorld import SmallWorld
from decentralizepy.mappings.Linear import Linear
f = Femnist(2, 'leaf/data/femnist/data/train', sizes=[0.6, 0.4])
g = SmallWorld(4, 1, 0.5)
l = Linear(2, 2)
```
%% Cell type:code id: tags:
```
from decentralizepy.node.Node import Node
from torch import multiprocessing as mp
import logging
n1 = Node(0, l, g, f, "./results", logging.DEBUG)
n2 = Node(1, l, g, f, "./results", logging.DEBUG)
# mp.spawn(fn = Node, nprocs = 2, args=[l,g,f])
```
%% Cell type:code id: tags:
```
from testing import f
```
%% Cell type:code id: tags:
```
from torch import multiprocessing as mp
import torch
m1 = torch.nn.Linear(1,1)
o1 = torch.optim.SGD(m1.parameters(), 0.6)
print(m1)
mp.spawn(fn = f, nprocs = 2, args=[m1, o1])
```
%% Cell type:markdown id: tags:
%% Cell type:code id: tags:
```
o1.param_groups
```
%% Cell type:code id: tags:
```
with torch.no_grad():
    o1.param_groups[0]["params"][0].copy_(torch.zeros(1,))
```
%% Cell type:code id: tags:
```
o1.param_groups
```
%% Cell type:code id: tags:
```
m1.state_dict()
```
%% Cell type:code id: tags:
```
import torch
loss = getattr(torch.nn.functional, 'nll_loss')
```
%% Cell type:code id: tags:
```
loss
```
%% Cell type:code id: tags:
```
%matplotlib inline
from decentralizepy.node.Node import Node
from decentralizepy.graphs.SmallWorld import SmallWorld
from decentralizepy.graphs.Graph import Graph
from decentralizepy.mappings.Linear import Linear
from torch import multiprocessing as mp
import torch
import logging
from localconfig import LocalConfig
def read_ini(file_path):
    config = LocalConfig(file_path)
    for section in config:
        print("Section: ", section)
        for key, value in config.items(section):
            print((key, value))
    print(dict(config.items('DATASET')))
    return config
config = read_ini("config.ini")
my_config = dict()
for section in config:
    my_config[section] = dict(config.items(section))
#f = Femnist(2, 'leaf/data/femnist/data/train', sizes=[0.6, 0.4])
g = Graph()
g.read_graph_from_file("36_nodes.edges", "edges")
l = Linear(1, 36)
#Node(0, 0, l, g, my_config, 20, "results", logging.DEBUG)
mp.spawn(fn = Node, nprocs = g.n_procs, args=[0,l,g,my_config,20,"results",logging.INFO])
# mp.spawn(fn = Node, args = [l, g, config, 10, "results", logging.DEBUG], nprocs=2)
```
%% Output
Section: GRAPH
('package', 'decentralizepy.graphs.SmallWorld')
('graph_class', 'SmallWorld')
Section: DATASET
('dataset_package', 'decentralizepy.datasets.Femnist')
('dataset_class', 'Femnist')
('model_class', 'CNN')
('n_procs', 36)
('train_dir', 'leaf/data/femnist/per_user_data/train')
('test_dir', 'leaf/data/femnist/data/test')
('sizes', '')
Section: OPTIMIZER_PARAMS
('optimizer_package', 'torch.optim')
('optimizer_class', 'Adam')
('lr', 0.01)
Section: TRAIN_PARAMS
('training_package', 'decentralizepy.training.Training')
('training_class', 'Training')
('epochs_per_round', 1)
('batch_size', 1024)
('shuffle', True)
('loss_package', 'torch.nn')
('loss_class', 'CrossEntropyLoss')
Section: COMMUNICATION
('comm_package', 'decentralizepy.communication.TCP')
('comm_class', 'TCP')
('addresses_filepath', 'ip_addr.json')
Section: SHARING
('sharing_package', 'decentralizepy.sharing.Sharing')
('sharing_class', 'Sharing')
{'dataset_package': 'decentralizepy.datasets.Femnist', 'dataset_class': 'Femnist', 'model_class': 'CNN', 'n_procs': 36, 'train_dir': 'leaf/data/femnist/per_user_data/train', 'test_dir': 'leaf/data/femnist/data/test', 'sizes': ''}
%% Cell type:code id: tags:
```
```
%% Cell type:code id: tags:
```
from decentralizepy.mappings.Linear import Linear
from testing import f
from torch import multiprocessing as mp
l = Linear(1, 2)
mp.spawn(fn = f, nprocs = 2, args = [0, 2, "ip_addr.json", l])
```
%% Cell type:code id: tags:
```
from decentralizepy.datasets.Femnist import Femnist
f = Femnist()
f.file_per_user('../leaf/data/femnist/data/train','../leaf/data/femnist/per_user_data/train')
```
%% Cell type:code id: tags:
```
a = set()
a.update([2, 3, 4, 5])
```
%% Cell type:code id: tags:
```
a
```
%% Output
{2, 3, 4, 5}
%% Cell type:code id: tags:
```
print(*a)
```
%% Output
2 3 4 5
%% Cell type:code id: tags:
```
from decentralizepy.graphs.FullyConnected import FullyConnected
s = FullyConnected(96)
s.write_graph_to_file('96_node_fullyConnected.edges')
```
%% Cell type:code id: tags:
```
from decentralizepy.graphs.SmallWorld import SmallWorld
s1 = SmallWorld(96, 2, 1.0)
s1.write_graph_to_file('96_nodes_random1.edges')
s2 = SmallWorld(96, 2, 1.0)
s2.write_graph_to_file('96_nodes_random2.edges')
```
%% Cell type:code id: tags:
```
import sys
sys.argv
```
%% Output
['/home/risharma/miniconda3/envs/decpy/lib/python3.9/site-packages/ipykernel_launcher.py',
 '--ip=127.0.0.1',
 '--stdin=9008',
 '--control=9006',
 '--hb=9005',
 '--Session.signature_scheme="hmac-sha256"',
 '--Session.key=b"eac5d2f8-c460-45f1-a268-1e4b46a6efd6"',
 '--shell=9007',
 '--transport="tcp"',
 '--iopub=9009',
 '--f=/tmp/tmp-21212479paJaUBJBN84.json']
%% Cell type:code id: tags:
```
import torch
from decentralizepy.datasets.Femnist import CNN
m1 = CNN()
o1 = torch.optim.SGD(m1.parameters(), 0.6)
#print("m1_parameters: ", {k:v.data for k, v in zip(m1.state_dict(), m1.parameters())})
#print("m1_state_dict: ", m1.state_dict())
#print("o1_state_dict: ", o1.state_dict())
tensors_to_cat = [v.data for _, v in m1.state_dict().items()]
print("tensors to cat: ", tensors_to_cat)
```
%% Output
tensors to cat:  [tensor(...), tensor(...), ...]  (long, truncated dump of the CNN weight and bias tensors omitted)
[-2.9158e-02, 3.5350e-02, -6.7355e-04, 7.1955e-03, -2.8063e-02], [-2.9158e-02, 3.5350e-02, -6.7355e-04, 7.1955e-03, -2.8063e-02],
[ 1.7034e-02, 7.2418e-04, -1.2859e-02, 2.6168e-02, 1.9541e-02], [ 1.7034e-02, 7.2418e-04, -1.2859e-02, 2.6168e-02, 1.9541e-02],
[-3.5163e-03, 1.3653e-02, -2.5039e-02, 1.7310e-02, 1.2980e-02], [-3.5163e-03, 1.3653e-02, -2.5039e-02, 1.7310e-02, 1.2980e-02],
[-3.1598e-02, 1.0932e-02, -1.6623e-02, 2.7386e-03, -1.7496e-02]], [-3.1598e-02, 1.0932e-02, -1.6623e-02, 2.7386e-03, -1.7496e-02]],
[[ 3.2967e-03, 9.0435e-03, -6.3722e-03, 6.3557e-06, -1.6582e-04], [[ 3.2967e-03, 9.0435e-03, -6.3722e-03, 6.3557e-06, -1.6582e-04],
[-1.2603e-02, 2.8532e-02, 2.7132e-02, -3.3654e-02, -2.3288e-02], [-1.2603e-02, 2.8532e-02, 2.7132e-02, -3.3654e-02, -2.3288e-02],
[-3.3183e-02, -2.8453e-02, -3.4745e-02, 1.1531e-03, 1.3664e-02], [-3.3183e-02, -2.8453e-02, -3.4745e-02, 1.1531e-03, 1.3664e-02],
[ 3.2060e-02, 2.5101e-02, 2.1017e-02, 5.9951e-03, 2.6360e-03], [ 3.2060e-02, 2.5101e-02, 2.1017e-02, 5.9951e-03, 2.6360e-03],
[-2.0098e-02, 9.6565e-03, 1.4130e-02, -2.1930e-02, -1.9950e-03]]]]), tensor([ 0.0258, -0.0098, 0.0090, -0.0266, -0.0003, 0.0192, -0.0012, -0.0049, [-2.0098e-02, 9.6565e-03, 1.4130e-02, -2.1930e-02, -1.9950e-03]]]]), tensor([ 0.0258, -0.0098, 0.0090, -0.0266, -0.0003, 0.0192, -0.0012, -0.0049,
0.0099, 0.0069, 0.0019, 0.0319, 0.0208, 0.0209, -0.0220, -0.0238, 0.0099, 0.0069, 0.0019, 0.0319, 0.0208, 0.0209, -0.0220, -0.0238,
0.0177, -0.0307, 0.0308, 0.0066, -0.0072, -0.0034, 0.0152, 0.0293, 0.0177, -0.0307, 0.0308, 0.0066, -0.0072, -0.0034, 0.0152, 0.0293,
0.0169, 0.0327, -0.0059, 0.0043, 0.0057, 0.0174, -0.0194, -0.0216, 0.0169, 0.0327, -0.0059, 0.0043, 0.0057, 0.0174, -0.0194, -0.0216,
-0.0226, -0.0270, -0.0041, 0.0242, -0.0233, 0.0118, 0.0283, -0.0090, -0.0226, -0.0270, -0.0041, 0.0242, -0.0233, 0.0118, 0.0283, -0.0090,
-0.0180, -0.0280, -0.0086, -0.0122, 0.0003, -0.0277, -0.0038, 0.0353, -0.0180, -0.0280, -0.0086, -0.0122, 0.0003, -0.0277, -0.0038, 0.0353,
-0.0082, -0.0109, -0.0251, 0.0068, 0.0323, 0.0098, -0.0314, 0.0030, -0.0082, -0.0109, -0.0251, 0.0068, 0.0323, 0.0098, -0.0314, 0.0030,
0.0113, 0.0020, -0.0341, 0.0107, 0.0091, 0.0199, -0.0185, 0.0127]), tensor([[-0.0081, -0.0059, -0.0143, ..., 0.0117, -0.0148, 0.0108], 0.0113, 0.0020, -0.0341, 0.0107, 0.0091, 0.0199, -0.0185, 0.0127]), tensor([[-0.0081, -0.0059, -0.0143, ..., 0.0117, -0.0148, 0.0108],
[ 0.0151, 0.0105, 0.0061, ..., 0.0053, -0.0170, 0.0006], [ 0.0151, 0.0105, 0.0061, ..., 0.0053, -0.0170, 0.0006],
[-0.0148, -0.0145, -0.0035, ..., 0.0020, -0.0044, -0.0102], [-0.0148, -0.0145, -0.0035, ..., 0.0020, -0.0044, -0.0102],
..., ...,
[-0.0045, 0.0173, -0.0167, ..., -0.0103, 0.0056, -0.0051], [-0.0045, 0.0173, -0.0167, ..., -0.0103, 0.0056, -0.0051],
[ 0.0080, 0.0030, -0.0087, ..., 0.0165, -0.0167, 0.0033], [ 0.0080, 0.0030, -0.0087, ..., 0.0165, -0.0167, 0.0033],
[ 0.0098, 0.0124, 0.0158, ..., -0.0083, -0.0151, 0.0059]]), tensor([ 0.0140, -0.0175, 0.0051, -0.0030, -0.0019, -0.0134, -0.0133, -0.0009, [ 0.0098, 0.0124, 0.0158, ..., -0.0083, -0.0151, 0.0059]]), tensor([ 0.0140, -0.0175, 0.0051, -0.0030, -0.0019, -0.0134, -0.0133, -0.0009,
0.0023, 0.0163, -0.0042, -0.0130, -0.0113, -0.0176, -0.0152, 0.0139, 0.0023, 0.0163, -0.0042, -0.0130, -0.0113, -0.0176, -0.0152, 0.0139,
0.0032, 0.0127, 0.0173, 0.0080, -0.0009, -0.0097, -0.0020, 0.0114, 0.0032, 0.0127, 0.0173, 0.0080, -0.0009, -0.0097, -0.0020, 0.0114,
0.0059, -0.0070, 0.0143, -0.0143, 0.0008, 0.0067, -0.0162, 0.0094, 0.0059, -0.0070, 0.0143, -0.0143, 0.0008, 0.0067, -0.0162, 0.0094,
0.0165, 0.0118, -0.0017, -0.0064, -0.0134, -0.0012, 0.0107, 0.0039, 0.0165, 0.0118, -0.0017, -0.0064, -0.0134, -0.0012, 0.0107, 0.0039,
-0.0173, -0.0093, -0.0028, -0.0070, 0.0075, 0.0142, 0.0102, 0.0140, -0.0173, -0.0093, -0.0028, -0.0070, 0.0075, 0.0142, 0.0102, 0.0140,
-0.0120, 0.0057, 0.0095, 0.0077, -0.0151, -0.0022, 0.0130, 0.0005, -0.0120, 0.0057, 0.0095, 0.0077, -0.0151, -0.0022, 0.0130, 0.0005,
0.0123, 0.0055, 0.0120, -0.0058, -0.0142, -0.0129, 0.0048, -0.0020, 0.0123, 0.0055, 0.0120, -0.0058, -0.0142, -0.0129, 0.0048, -0.0020,
0.0016, 0.0016, 0.0061, -0.0174, -0.0123, -0.0010, 0.0032, -0.0170, 0.0016, 0.0016, 0.0061, -0.0174, -0.0123, -0.0010, 0.0032, -0.0170,
0.0020, 0.0172, 0.0146, -0.0086, 0.0032, 0.0093, 0.0139, 0.0085, 0.0020, 0.0172, 0.0146, -0.0086, 0.0032, 0.0093, 0.0139, 0.0085,
0.0061, -0.0039, 0.0154, -0.0014, -0.0082, -0.0030, 0.0009, 0.0150, 0.0061, -0.0039, 0.0154, -0.0014, -0.0082, -0.0030, 0.0009, 0.0150,
0.0054, 0.0015, 0.0047, 0.0063, 0.0095, 0.0081, 0.0125, -0.0027, 0.0054, 0.0015, 0.0047, 0.0063, 0.0095, 0.0081, 0.0125, -0.0027,
0.0130, -0.0039, -0.0038, 0.0035, 0.0163, -0.0174, 0.0106, 0.0007, 0.0130, -0.0039, -0.0038, 0.0035, 0.0163, -0.0174, 0.0106, 0.0007,
0.0087, 0.0070, -0.0141, -0.0139, -0.0068, -0.0126, 0.0130, -0.0088, 0.0087, 0.0070, -0.0141, -0.0139, -0.0068, -0.0126, 0.0130, -0.0088,
0.0088, -0.0121, -0.0087, 0.0134, 0.0101, -0.0139, -0.0002, -0.0055, 0.0088, -0.0121, -0.0087, 0.0134, 0.0101, -0.0139, -0.0002, -0.0055,
0.0039, -0.0103, -0.0068, 0.0082, 0.0059, -0.0069, -0.0138, -0.0164, 0.0039, -0.0103, -0.0068, 0.0082, 0.0059, -0.0069, -0.0138, -0.0164,
-0.0138, 0.0156, 0.0020, 0.0154, -0.0147, -0.0134, 0.0118, 0.0084, -0.0138, 0.0156, 0.0020, 0.0154, -0.0147, -0.0134, 0.0118, 0.0084,
0.0071, 0.0033, 0.0166, -0.0116, -0.0003, 0.0178, -0.0074, -0.0174, 0.0071, 0.0033, 0.0166, -0.0116, -0.0003, 0.0178, -0.0074, -0.0174,
-0.0094, 0.0002, 0.0010, -0.0066, -0.0090, -0.0143, 0.0048, -0.0169, -0.0094, 0.0002, 0.0010, -0.0066, -0.0090, -0.0143, 0.0048, -0.0169,
-0.0163, -0.0105, -0.0090, -0.0105, -0.0167, -0.0037, 0.0024, 0.0075, -0.0163, -0.0105, -0.0090, -0.0105, -0.0167, -0.0037, 0.0024, 0.0075,
-0.0023, -0.0143, 0.0052, -0.0041, -0.0139, -0.0123, 0.0161, -0.0053, -0.0023, -0.0143, 0.0052, -0.0041, -0.0139, -0.0123, 0.0161, -0.0053,
-0.0059, 0.0012, -0.0077, -0.0133, -0.0012, -0.0127, 0.0005, -0.0130, -0.0059, 0.0012, -0.0077, -0.0133, -0.0012, -0.0127, 0.0005, -0.0130,
-0.0176, -0.0114, 0.0139, -0.0153, 0.0124, 0.0151, -0.0092, -0.0168, -0.0176, -0.0114, 0.0139, -0.0153, 0.0124, 0.0151, -0.0092, -0.0168,
-0.0070, 0.0114, -0.0125, 0.0129, -0.0066, 0.0150, 0.0046, 0.0043, -0.0070, 0.0114, -0.0125, 0.0129, -0.0066, 0.0150, 0.0046, 0.0043,
0.0111, 0.0087, -0.0170, -0.0174, -0.0116, 0.0087, 0.0146, 0.0078, 0.0111, 0.0087, -0.0170, -0.0174, -0.0116, 0.0087, 0.0146, 0.0078,
-0.0055, -0.0119, -0.0041, -0.0017, -0.0093, 0.0170, 0.0093, -0.0026, -0.0055, -0.0119, -0.0041, -0.0017, -0.0093, 0.0170, 0.0093, -0.0026,
-0.0117, 0.0008, -0.0084, -0.0079, -0.0071, -0.0148, -0.0170, 0.0022, -0.0117, 0.0008, -0.0084, -0.0079, -0.0071, -0.0148, -0.0170, 0.0022,
-0.0049, 0.0149, 0.0078, -0.0042, 0.0118, -0.0100, 0.0136, 0.0112, -0.0049, 0.0149, 0.0078, -0.0042, 0.0118, -0.0100, 0.0136, 0.0112,
-0.0024, 0.0149, -0.0113, -0.0128, 0.0030, 0.0005, -0.0037, 0.0042, -0.0024, 0.0149, -0.0113, -0.0128, 0.0030, 0.0005, -0.0037, 0.0042,
0.0034, -0.0113, 0.0158, 0.0071, 0.0074, -0.0136, -0.0155, 0.0028, 0.0034, -0.0113, 0.0158, 0.0071, 0.0074, -0.0136, -0.0155, 0.0028,
0.0136, 0.0177, -0.0068, 0.0063, 0.0021, -0.0085, 0.0044, 0.0164, 0.0136, 0.0177, -0.0068, 0.0063, 0.0021, -0.0085, 0.0044, 0.0164,
-0.0075, 0.0109, 0.0103, -0.0105, -0.0167, 0.0114, 0.0093, -0.0098, -0.0075, 0.0109, 0.0103, -0.0105, -0.0167, 0.0114, 0.0093, -0.0098,
0.0115, -0.0011, 0.0097, 0.0087, 0.0075, -0.0120, 0.0011, -0.0067, 0.0115, -0.0011, 0.0097, 0.0087, 0.0075, -0.0120, 0.0011, -0.0067,
0.0058, -0.0050, -0.0077, 0.0039, -0.0112, -0.0015, -0.0152, 0.0074, 0.0058, -0.0050, -0.0077, 0.0039, -0.0112, -0.0015, -0.0152, 0.0074,
-0.0085, 0.0085, 0.0060, -0.0059, -0.0125, -0.0065, -0.0160, 0.0137, -0.0085, 0.0085, 0.0060, -0.0059, -0.0125, -0.0065, -0.0160, 0.0137,
0.0145, -0.0097, -0.0032, 0.0059, -0.0137, -0.0119, 0.0141, -0.0012, 0.0145, -0.0097, -0.0032, 0.0059, -0.0137, -0.0119, 0.0141, -0.0012,
0.0105, -0.0023, 0.0089, -0.0011, 0.0122, 0.0144, -0.0047, 0.0127, 0.0105, -0.0023, 0.0089, -0.0011, 0.0122, 0.0144, -0.0047, 0.0127,
-0.0046, 0.0057, -0.0112, 0.0062, 0.0176, 0.0162, 0.0004, -0.0061, -0.0046, 0.0057, -0.0112, 0.0062, 0.0176, 0.0162, 0.0004, -0.0061,
-0.0010, -0.0111, -0.0037, -0.0083, 0.0007, -0.0071, 0.0121, -0.0104, -0.0010, -0.0111, -0.0037, -0.0083, 0.0007, -0.0071, 0.0121, -0.0104,
0.0169, -0.0023, -0.0051, -0.0170, -0.0123, 0.0005, -0.0006, -0.0099, 0.0169, -0.0023, -0.0051, -0.0170, -0.0123, 0.0005, -0.0006, -0.0099,
0.0049, -0.0009, 0.0176, 0.0136, -0.0070, 0.0170, -0.0169, 0.0073, 0.0049, -0.0009, 0.0176, 0.0136, -0.0070, 0.0170, -0.0169, 0.0073,
-0.0058, -0.0111, -0.0014, -0.0076, -0.0080, 0.0006, 0.0075, 0.0064, -0.0058, -0.0111, -0.0014, -0.0076, -0.0080, 0.0006, 0.0075, 0.0064,
-0.0090, -0.0023, 0.0173, 0.0155, -0.0130, 0.0142, 0.0103, -0.0153, -0.0090, -0.0023, 0.0173, 0.0155, -0.0130, 0.0142, 0.0103, -0.0153,
-0.0011, 0.0139, 0.0086, -0.0130, 0.0029, 0.0074, -0.0003, 0.0030, -0.0011, 0.0139, 0.0086, -0.0130, 0.0029, 0.0074, -0.0003, 0.0030,
-0.0004, 0.0141, -0.0008, -0.0158, 0.0103, -0.0035, 0.0115, -0.0030, -0.0004, 0.0141, -0.0008, -0.0158, 0.0103, -0.0035, 0.0115, -0.0030,
-0.0049, -0.0023, -0.0098, 0.0058, -0.0002, 0.0170, 0.0124, 0.0086, -0.0049, -0.0023, -0.0098, 0.0058, -0.0002, 0.0170, 0.0124, 0.0086,
-0.0128, -0.0120, 0.0043, 0.0066, 0.0069, -0.0064, 0.0058, -0.0016, -0.0128, -0.0120, 0.0043, 0.0066, 0.0069, -0.0064, 0.0058, -0.0016,
0.0105, 0.0117, 0.0121, -0.0135, 0.0078, -0.0032, -0.0041, 0.0038, 0.0105, 0.0117, 0.0121, -0.0135, 0.0078, -0.0032, -0.0041, 0.0038,
0.0156, 0.0073, 0.0012, 0.0168, -0.0071, -0.0160, 0.0059, 0.0089, 0.0156, 0.0073, 0.0012, 0.0168, -0.0071, -0.0160, 0.0059, 0.0089,
-0.0043, 0.0019, -0.0178, 0.0018, 0.0017, -0.0090, 0.0136, -0.0167, -0.0043, 0.0019, -0.0178, 0.0018, 0.0017, -0.0090, 0.0136, -0.0167,
0.0018, 0.0094, -0.0153, -0.0061, -0.0172, -0.0072, -0.0088, -0.0058, 0.0018, 0.0094, -0.0153, -0.0061, -0.0172, -0.0072, -0.0088, -0.0058,
0.0103, -0.0004, 0.0163, 0.0048, 0.0093, 0.0106, -0.0093, 0.0023, 0.0103, -0.0004, 0.0163, 0.0048, 0.0093, 0.0106, -0.0093, 0.0023,
0.0003, -0.0150, -0.0174, -0.0041, -0.0008, -0.0068, 0.0174, 0.0111, 0.0003, -0.0150, -0.0174, -0.0041, -0.0008, -0.0068, 0.0174, 0.0111,
0.0168, 0.0034, -0.0021, 0.0051, 0.0053, 0.0017, -0.0137, 0.0046, 0.0168, 0.0034, -0.0021, 0.0051, 0.0053, 0.0017, -0.0137, 0.0046,
-0.0005, -0.0071, -0.0112, 0.0125, 0.0107, -0.0157, -0.0031, -0.0079, -0.0005, -0.0071, -0.0112, 0.0125, 0.0107, -0.0157, -0.0031, -0.0079,
-0.0030, 0.0099, -0.0033, 0.0061, 0.0009, 0.0099, -0.0055, -0.0099, -0.0030, 0.0099, -0.0033, 0.0061, 0.0009, 0.0099, -0.0055, -0.0099,
0.0057, 0.0126, -0.0092, -0.0071, -0.0156, -0.0092, -0.0086, 0.0043, 0.0057, 0.0126, -0.0092, -0.0071, -0.0156, -0.0092, -0.0086, 0.0043,
0.0026, 0.0127, -0.0045, 0.0163, -0.0036, 0.0123, 0.0060, 0.0123, 0.0026, 0.0127, -0.0045, 0.0163, -0.0036, 0.0123, 0.0060, 0.0123,
-0.0040, 0.0074, -0.0113, -0.0036, -0.0157, -0.0103, -0.0140, 0.0143, -0.0040, 0.0074, -0.0113, -0.0036, -0.0157, -0.0103, -0.0140, 0.0143,
0.0035, -0.0015, -0.0158, -0.0157, -0.0079, 0.0111, 0.0134, 0.0094, 0.0035, -0.0015, -0.0158, -0.0157, -0.0079, 0.0111, 0.0134, 0.0094,
-0.0120, -0.0178, -0.0065, -0.0007, 0.0016, -0.0102, -0.0133, -0.0095, -0.0120, -0.0178, -0.0065, -0.0007, 0.0016, -0.0102, -0.0133, -0.0095,
-0.0143, -0.0050, -0.0029, -0.0060, -0.0126, 0.0008, 0.0079, 0.0073, -0.0143, -0.0050, -0.0029, -0.0060, -0.0126, 0.0008, 0.0079, 0.0073,
-0.0082, -0.0120, 0.0162, -0.0058, 0.0126, 0.0147, -0.0131, -0.0136, -0.0082, -0.0120, 0.0162, -0.0058, 0.0126, 0.0147, -0.0131, -0.0136,
-0.0013, -0.0102, -0.0065, -0.0032, -0.0055, 0.0150, -0.0159, -0.0142]), tensor([[-0.0060, -0.0238, -0.0167, ..., -0.0295, -0.0283, 0.0013], -0.0013, -0.0102, -0.0065, -0.0032, -0.0055, 0.0150, -0.0159, -0.0142]), tensor([[-0.0060, -0.0238, -0.0167, ..., -0.0295, -0.0283, 0.0013],
[ 0.0272, -0.0164, 0.0261, ..., 0.0342, 0.0118, -0.0295], [ 0.0272, -0.0164, 0.0261, ..., 0.0342, 0.0118, -0.0295],
[-0.0431, -0.0081, -0.0005, ..., 0.0110, 0.0334, -0.0054], [-0.0431, -0.0081, -0.0005, ..., 0.0110, 0.0334, -0.0054],
..., ...,
[-0.0257, -0.0054, 0.0206, ..., -0.0191, 0.0412, -0.0227], [-0.0257, -0.0054, 0.0206, ..., -0.0191, 0.0412, -0.0227],
[ 0.0334, -0.0221, 0.0043, ..., 0.0383, -0.0305, 0.0005], [ 0.0334, -0.0221, 0.0043, ..., 0.0383, -0.0305, 0.0005],
[-0.0314, 0.0262, 0.0313, ..., -0.0048, 0.0006, 0.0426]]), tensor([-0.0431, -0.0134, -0.0107, 0.0229, 0.0287, -0.0393, -0.0060, -0.0009, [-0.0314, 0.0262, 0.0313, ..., -0.0048, 0.0006, 0.0426]]), tensor([-0.0431, -0.0134, -0.0107, 0.0229, 0.0287, -0.0393, -0.0060, -0.0009,
0.0412, -0.0344, -0.0020, 0.0278, -0.0317, -0.0283, 0.0094, 0.0142, 0.0412, -0.0344, -0.0020, 0.0278, -0.0317, -0.0283, 0.0094, 0.0142,
-0.0097, -0.0215, -0.0283, -0.0332, -0.0333, 0.0352, -0.0130, -0.0117, -0.0097, -0.0215, -0.0283, -0.0332, -0.0333, 0.0352, -0.0130, -0.0117,
-0.0346, -0.0030, -0.0087, -0.0178, 0.0225, 0.0430, 0.0153, -0.0233, -0.0346, -0.0030, -0.0087, -0.0178, 0.0225, 0.0430, 0.0153, -0.0233,
0.0231, -0.0374, -0.0200, 0.0007, 0.0341, 0.0077, -0.0098, -0.0249, 0.0231, -0.0374, -0.0200, 0.0007, 0.0341, 0.0077, -0.0098, -0.0249,
0.0400, -0.0381, -0.0155, 0.0188, -0.0362, 0.0095, -0.0313, 0.0109, 0.0400, -0.0381, -0.0155, 0.0188, -0.0362, 0.0095, -0.0313, 0.0109,
0.0298, -0.0086, 0.0114, -0.0137, -0.0284, 0.0243, 0.0336, 0.0356, 0.0298, -0.0086, 0.0114, -0.0137, -0.0284, 0.0243, 0.0336, 0.0356,
0.0076, 0.0008, -0.0358, -0.0435, 0.0303, 0.0354])] 0.0076, 0.0008, -0.0358, -0.0435, 0.0303, 0.0354])]
%% Cell type:code id: tags:

```
# Two small lists of integer pairs used for the indexing experiments below.
a = [(3, 2), (2, 5), (2, 6)]
# a.sort(reverse=True)
```

%% Cell type:code id: tags:

```
import torch

b = [(13, 12), (12, 15), (12, 16)]
a_t = torch.tensor(a)
b_t = torch.tensor(b)
```

%% Cell type:code id: tags:

```
print(a_t)
print(b_t)
```

%% Output

tensor([[3, 2],
        [2, 5],
        [2, 6]])
tensor([[13, 12],
        [12, 15],
        [12, 16]])

%% Cell type:code id: tags:

```
# Indexed assignment through the flattened tensor: flatten returns a view
# here, so this overwrites entries of a_t in place.
indices = torch.tensor([2, 3, 5])
torch.flatten(a_t)[indices] = torch.flatten(b_t)[:3]
```
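%% Cell type:markdown id: tags:

Added note (not part of the original notebook): the assignment above works because `torch.flatten` returns a view for a contiguous tensor, so writing through the flattened, index-selected slice mutates `a_t` itself. A minimal sketch of the same idea with an explicit `.view(-1)`; the names `x` and `y` are local to this sketch:

%% Cell type:code id: tags:

```
import torch

# Same shapes and values as a_t / b_t above.
x = torch.tensor([(3, 2), (2, 5), (2, 6)])
y = torch.tensor([(13, 12), (12, 15), (12, 16)])

flat = x.view(-1)                             # explicit view into x's storage
flat[torch.tensor([2, 3, 5])] = y.view(-1)[:3]
print(x)                                      # positions 2, 3, 5 of the flat view were overwritten
# tensor([[ 3,  2],
#         [13, 12],
#         [ 2, 12]])
```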
%% Cell type:code id: tags:

```
# Pair up corresponding elements of the two flattened tensors column-wise.
c = torch.stack((torch.flatten(a_t), torch.flatten(b_t)), dim=-1)
```

%% Cell type:code id: tags:

```
print(a_t)
print(b_t)
```

%% Output

tensor([[ 3,  2],
        [13, 12],
        [ 2, 12]])
tensor([[13, 12],
        [12, 15],
        [12, 16]])

%% Cell type:code id: tags:

```
c
```

%% Output

tensor([[ 3, 13],
        [ 2, 12],
        [ 2, 12],
        [ 5, 15],
        [ 2, 12],
        [ 6, 16]])
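%% Cell type:markdown id: tags:

Added note (not part of the original notebook): `torch.stack(..., dim=-1)` pairs the i-th elements of the two flattened tensors, which is why `c` has shape `(6, 2)`. The pairs above still contain the original values of `a_t` (3, 2, 2, 5, 2, 6), so the cell that built `c` was presumably executed before the indexed assignment. For contrast, `torch.cat` produces one long 1-D tensor instead of pairs:

%% Cell type:code id: tags:

```
import torch

u = torch.tensor([3, 2, 2, 5, 2, 6])
v = torch.tensor([13, 12, 12, 15, 12, 16])

pairs = torch.stack((u, v), dim=-1)   # shape (6, 2): element-wise pairs
joined = torch.cat((u, v))            # shape (12,): u followed by v
print(pairs)
print(joined)
```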
%% Cell type:code id: tags:

```
# Assign a scalar to the last index; it broadcasts over that slice.
c[-1] = 26
```

%% Cell type:code id: tags:

```
c
```

%% Output

tensor([ 3,  2,  2,  5,  2,  6, 13, 12, 12, 15, 12, 26])
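%% Cell type:markdown id: tags:

Added note (not part of the original notebook): `c[-1] = 26` broadcasts the scalar over whatever `c[-1]` is. The printed `c` above is 1-D (it matches `torch.cat` of the original flattened tensors), so only the last element became 26; had `c` still been the stacked 2-D tensor, the whole last row would have become `[26, 26]`:

%% Cell type:code id: tags:

```
import torch

c2d = torch.stack((torch.tensor([3, 2, 2]), torch.tensor([13, 12, 12])), dim=-1)
c2d[-1] = 26          # last *row* becomes [26, 26]
print(c2d)

c1d = torch.cat((torch.tensor([3, 2, 2]), torch.tensor([13, 12, 12])))
c1d[-1] = 26          # only the last element becomes 26
print(c1d)
```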
%% Cell type:code id: tags:

```
b
```

%% Output

[(13, 12), (12, 15), (12, 16)]

%% Cell type:code id: tags:

```
torch.tensor([1, 2, 3]).shape[0]
```

%% Output

3

%% Cell type:code id: tags:

```
```
@@ -36,6 +36,7 @@ def plot(means, stdevs, mins, maxs, title, label, loc):
 def plot_results(path):
     folders = os.listdir(path)
+    folders.sort()
     print("Reading folders from: ", path)
     print("Folders: ", folders)
     for folder in folders:
@@ -103,4 +104,4 @@ def plot_parameters(path):
 if __name__ == "__main__":
     assert len(sys.argv) == 2
     plot_results(sys.argv[1])
-    plot_parameters(sys.argv[1])
+    # plot_parameters(sys.argv[1])
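`os.listdir` returns entries in an arbitrary, filesystem-dependent order, so sorting the folder names makes the iteration (and the resulting plots) deterministic across runs. Below is a minimal sketch of the changed portion of `plot_results`, reconstructed only from the hunk above; the per-folder plotting body and `plot_parameters` are elided because they are not shown in this diff.

```
import os
import sys

def plot_results(path):
    # os.listdir gives no ordering guarantee; sort so the folders (and the
    # curves derived from them) always appear in the same order.
    folders = os.listdir(path)
    folders.sort()
    print("Reading folders from: ", path)
    print("Folders: ", folders)
    for folder in folders:
        ...  # per-folder plotting elided (not part of this hunk)

if __name__ == "__main__":
    assert len(sys.argv) == 2
    plot_results(sys.argv[1])
    # plot_parameters(sys.argv[1])  # disabled in this commit
```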