├── .gitattributes
├── datasets
│   └── processed
│       ├── data_0.pt
│       ├── data_1.pt
│       ├── data_2.pt
│       ├── data_3.pt
│       ├── data_4.pt
│       ├── data_5.pt
│       ├── data_6.pt
│       ├── data_7.pt
│       ├── data_8.pt
│       └── data_9.pt
├── __pycache__
│   └── outcome.cpython-36.pyc
├── 10_rndm_zinc_drugs_clean.smi
├── README.md
├── make.py
├── generator.py
├── discriminator.py
├── outcome.py
├── GCN.py
├── features.py
└── 10.sdf

/.gitattributes:
--------------------------------------------------------------------------------
# Auto detect text files and perform LF normalization
* text=auto
--------------------------------------------------------------------------------
/datasets/processed/data_0.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_0.pt
--------------------------------------------------------------------------------
/datasets/processed/data_1.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_1.pt
--------------------------------------------------------------------------------
/datasets/processed/data_2.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_2.pt
--------------------------------------------------------------------------------
/datasets/processed/data_3.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_3.pt
--------------------------------------------------------------------------------
/datasets/processed/data_4.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_4.pt
--------------------------------------------------------------------------------
/datasets/processed/data_5.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_5.pt
--------------------------------------------------------------------------------
/datasets/processed/data_6.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_6.pt
--------------------------------------------------------------------------------
/datasets/processed/data_7.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_7.pt
--------------------------------------------------------------------------------
/datasets/processed/data_8.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_8.pt
--------------------------------------------------------------------------------
/datasets/processed/data_9.pt:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/datasets/processed/data_9.pt
--------------------------------------------------------------------------------
/__pycache__/outcome.cpython-36.pyc:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/flawnson/Graph_convolutional_NN_molecule_maker/HEAD/__pycache__/outcome.cpython-36.pyc
--------------------------------------------------------------------------------
/10_rndm_zinc_drugs_clean.smi:
--------------------------------------------------------------------------------
CC(C)(C)c1ccc2occ(CC(=O)Nc3ccccc3F)c2c1
C[C@@H]1CC(Nc2cncc(-c3nncn3C)c2)C[C@@H](C)C1
N#Cc1ccc(-c2ccc(O[C@@H](C(=O)N3CCCC3)c3ccccc3)cc2)cc1
CCOC(=O)[C@@H]1CCCN(C(=O)c2nc(-c3ccc(C)cc3)n3c2CCCCC3)C1
N#CC1=C(SCC(=O)Nc2cccc(Cl)c2)N=C([O-])[C@H](C#N)C12CCCCC2
CC[NH+](CC)[C@](C)(CC)[C@H](O)c1cscc1Br
COc1ccc(C(=O)N(C)[C@@H](C)C/C(N)=N/O)cc1O
O=C(Nc1nc[nH]n1)c1cccnc1Nc1cccc(F)c1
Cc1c(/C=N/c2cc(Br)ccn2)c(O)n2c(nc3ccccc32)c1C#N
C[C@@H]1CN(C(=O)c2cc(Br)cn2C)CC[C@H]1[NH3+]
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
# Graph_convolutional_NN_molecule_maker

*WORK IN PROGRESS*

## Project
This project is a Graph Convolutional Network that aims to learn from molecular data represented as graphs. Ideally, the model would encode a molecular structure and learn distributions from both specific entities in the graph (the atoms and bonds of a molecule) and the overall structure (the molecule itself). The model of interest is an objective-reinforced generative model capable of learning from representations of inorganic molecules as well as viral structures.

## Future plans and outcomes
While the base prototype is still a work in progress, data augmentation of molecules, though volatile, may be a future direction, as is the use of transfer learning to improve model performance.

Upwards and onwards, always and only :rocket:!
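`outcome.py` splits the processed dataset 70/15/15 with two chained `train_test_split` calls. The same logic can be sketched in plain Python (a minimal illustration under the split fractions defined in `outcome.py`, not the project's actual code — sklearn's version also shuffles by default):

```python
import random

def splitter(datasets, training_size=0.70):
    """Sketch of the 70/15/15 split that outcome.splitter performs with sklearn."""
    data = list(datasets)
    random.shuffle(data)                        # shuffle before splitting
    n_train = round(len(data) * training_size)  # 70% for training
    n_val = (len(data) - n_train) // 2          # split the remainder evenly
    train = data[:n_train]
    val = data[n_train:n_train + n_val]
    test = data[n_train + n_val:]
    return train, val, test

train, val, test = splitter(range(100))
print(len(train), len(val), len(test))  # 70 15 15
```

Note that with only the 10 molecules in `datasets/processed/`, the validation and test sets are tiny, which is worth keeping in mind when interpreting any accuracy numbers.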
--------------------------------------------------------------------------------
/make.py:
--------------------------------------------------------------------------------
import os.path as osp
import numpy as np
import features

import torch

from torch_geometric.data import Data, Dataset

class MyOwnDataset(Dataset):
    def __init__(self, root, transform=None, pre_transform=None):
        super(MyOwnDataset, self).__init__(root, transform, pre_transform)

    @property
    def raw_file_names(self):
        return ["single file"]

    @property
    def processed_file_names(self):
        return ["single file"]

    def __len__(self):
        return len(self.processed_file_names)

    def download(self):
        # The raw data is expected to already be present in `self.raw_dir`.
        pass

    def process(self):
        i = 0
        for raw_path in self.raw_paths:
            # Read data from `raw_path`.
            for node_attr, edge_index, edge_attr, target in features.return_data(features.data_instance):
                data = Data(x=node_attr, edge_index=edge_index, edge_attr=edge_attr, y=target)

                if self.pre_filter is not None and not self.pre_filter(data):
                    continue

                if self.pre_transform is not None:
                    data = self.pre_transform(data)

                # Save as .pt so the filenames match datasets/processed/data_*.pt
                torch.save(data, osp.join(self.processed_dir, 'data_{}.pt'.format(i)))
                i += 1

d = MyOwnDataset(root=r"C:\Users\Flawnson\Documents\Project Seraph & Cherub\Project Outcome\datasets")
--------------------------------------------------------------------------------
/generator.py:
--------------------------------------------------------------------------------
import torch
import torch.nn.functional as F

import features
import outcome

from torch_geometric.nn import GCNConv

examples = list(outcome.yielder())
num_attr, list_num_atoms = outcome.get_network_params()
num_classes = features.num_classes

# class Net(torch.nn.Module):
#     def __init__(self):
#         super(Net, self).__init__()
#         self.conv1 = GCNConv(num_attr, 16)
#         self.conv2 = GCNConv(16, num_classes)

#     def forward(self, data):
#         x, edge_index = data.x, data.edge_index

#         x = self.conv1(x, edge_index)
#         x = F.relu(x)
#         x = F.dropout(x, training=self.training)
#         x = self.conv2(x, edge_index)

#         return torch.sigmoid(x)

# device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
# generator = Net().to(device)
# data = examples[0].to(device)
# optimizer = torch.optim.Adam(generator.parameters(), lr=0.01, weight_decay=5e-4)

# generator.train()
# for epoch in range(200):
#     optimizer.zero_grad()
#     out = generator(data)
#     loss = F.binary_cross_entropy(out[data.train_mask], data.y[data.train_mask])
#     loss.backward()
#     optimizer.step()

# generator.eval()
# _, pred = generator(data).max(dim=1)
# correct = pred[data.test_mask].eq(data.y[data.test_mask]).sum().item()
# acc = correct / data.test_mask.sum().item()
# print('Accuracy: {:.4f}'.format(acc))
--------------------------------------------------------------------------------
/discriminator.py:
--------------------------------------------------------------------------------
import torch
import torch.nn.functional as F

import features
import outcome

from torch_geometric.nn import GCNConv

examples = list(outcome.yielder())
num_attr, list_num_atoms = outcome.get_network_params()
num_classes = features.num_classes

class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = GCNConv(num_attr, 16)
        self.conv2 = GCNConv(16, num_classes)

    def forward(self, data):
        x, edge_index = data.x, data.edge_index

        x = self.conv1(x, edge_index)
        x = F.relu(x)
        x = F.dropout(x, training=self.training)
        x = self.conv2(x, edge_index)

        return torch.sigmoid(x)

device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
discriminator = Net().to(device)
data = examples[0].to(device)
optimizer = torch.optim.Adam(discriminator.parameters(), lr=0.01, weight_decay=5e-4)

discriminator.train()
for epoch in range(200):
    optimizer.zero_grad()
    out = discriminator(data)
    # NOTE: train_mask/test_mask are not yet set on the Data objects built by make.py;
    # they must be added before this loop will run.
    loss = F.binary_cross_entropy(out[data.train_mask], data.y[data.train_mask])
    loss.backward()
    optimizer.step()

discriminator.eval()
_, pred = discriminator(data).max(dim=1)
correct = pred[data.test_mask].eq(data.y[data.test_mask]).sum().item()
acc = correct / data.test_mask.sum().item()
print('Accuracy: {:.4f}'.format(acc))
--------------------------------------------------------------------------------
/outcome.py:
--------------------------------------------------------------------------------
import torch
import os

import numpy as np
import torch.nn.functional as F

from torch_geometric.data import Data
from torch_geometric.nn import GCNConv
from scipy.sparse import random
from scipy import stats
from numpy.random import normal
from sklearn import model_selection

# Define variables for the document below
training_size = .70
validation_size = .15
testing_size = .15

def yielder():
    for subdir, dirs, files in os.walk(r"C:\Users\Flawnson\Documents\Project Seraph & Cherub\Project Outcome\datasets\processed"):
        for file in files:
            filepath = subdir + os.sep + file
            if filepath.endswith(".pt"):  # processed files are saved as data_*.pt
                yield torch.load(str(filepath))

def get_network_params():
    datasets = list(yielder())
    datapoint = datasets[0]
    num_attr = datapoint.num_features
    list_num_atoms = []

    for datapoint in datasets:
        list_num_atoms.append(datapoint.num_nodes)

    # Other details are defined and included here
    return num_attr, list_num_atoms

num_attr, list_num_atoms = get_network_params()

def splitter(datasets):
    assert 0 <= training_size <= 1, "Invalid training set fraction"

    train, tmp = model_selection.train_test_split(datasets, train_size=training_size)
    val, test = model_selection.train_test_split(tmp, train_size=0.5)  # Split the remainder evenly into validation and test

    return train, val, test

train, val, test = splitter(list(yielder()))
--------------------------------------------------------------------------------
/GCN.py:
--------------------------------------------------------------------------------
# TODO: The damn thing still keeps showing the same error; either the features.py data-making algorithm is flawed or this model is doing something funky
# A potential source of the problem is in the dataset, datapoint, and data variables in the model training section of this program

import os
import outcome
import features
import torch

import numpy as np
import torch.nn.functional as F

from torch_geometric.data import Data
from torch_geometric.nn import GCNConv
from scipy.sparse import random
from scipy import stats
from numpy.random import normal

"""DATA IMPORTING"""
num_attr, list_num_atoms = outcome.get_network_params()
print(num_attr)
print(features.get_num_classes(features.get_atom_symbols(features.suppl)))

train, val, test = outcome.splitter(list(outcome.yielder()))

"""MODEL ARCHITECTURE"""
class Net(torch.nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.conv1 = GCNConv(3, 16)
        self.conv2 = GCNConv(16, 32)
        self.conv3 = GCNConv(32, 5)

    def forward(self, data):
        # Use the Data object passed in rather than re-reading the dataset on every call
        x, edge_index = data.x, data.edge_index

        x = self.conv1(x.float(), edge_index.long())
        x = F.relu(x)
        x = F.dropout(x, training=self.training)
        x = self.conv2(x.float(), edge_index.long())
        x = F.relu(x)
        x = self.conv3(x.float(), edge_index.long())

        return F.log_softmax(x, dim=1)

"""MODEL TRAINING"""
# torch.set_default_tensor_type('torch.DoubleTensor')
device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
model = Net().to(device=device, dtype=torch.float)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

model.train()
# for epoch in range(200):
for datapoint in train:
    data = datapoint.to(device)
    optimizer.zero_grad()
    out = model(data)
    # nll_loss expects integer class indices, so data.y must hold per-node class labels
    loss = F.nll_loss(out, data.y.long())
    loss.backward()
    optimizer.step()

model.eval()
correct = total = 0
for datapoint in test:
    data = datapoint.to(device)
    _, pred = model(data).max(dim=1)
    correct += pred.eq(data.y.long()).sum().item()
    total += data.num_nodes
print('Accuracy: {:.4f}'.format(correct / total))
--------------------------------------------------------------------------------
/features.py:
--------------------------------------------------------------------------------
"""This file is comprised of basic UDFs that may be useful in Project Outcome.
There are a couple possible input types for each function:
    1. molecules (the raw sdf file, opened to be iterated upon)
    2. atom list (the atoms that each molecule of the sdf is comprised of, which can be represented by any unique ID)
"""

# TODO: get_atom_properties is generalized; adding a .specific_property to element(atom) would allow
# access to different properties from the mendeleev package. Also note that this function is
# dependent on the get_atom_symbols generator directly above it.

import scipy
import numpy as np
import pandas as pd
import itertools

import tqdm
import torch

import os.path as osp
import rdkit.Chem as Chem
import networkx as nx

from torch_geometric.data import Data, Dataset
from mendeleev import element

suppl = Chem.SDMolSupplier('10.sdf')

smiles = open("10_rndm_zinc_drugs_clean.smi").read().splitlines()

# suppl = Chem.SDMolSupplier("qm9-100-sdf.sdf")

# targets = list(pd.read_csv("qm9-100.csv").mu)

# smiles = open("qm9-100-smiles.smi").read().splitlines()

# assert len(suppl) == len(targets) == len(smiles), "the datasets must be of the same length!"

# =================================================================================== #
""" GRAPH REPRESENTATION """
# =================================================================================== #

def get_adj_matrix_coo(molecules):
    for mol in molecules:
        adj_mat = scipy.sparse.csr_matrix(Chem.rdmolops.GetAdjacencyMatrix(mol))
        nx_graph = nx.from_scipy_sparse_matrix(adj_mat)
        coo_matrix = nx.to_scipy_sparse_matrix(nx_graph, dtype=float, format="coo")

        yield coo_matrix.row, coo_matrix.col

coo_adj_matrix = list(get_adj_matrix_coo(suppl))

# =================================================================================== #
""" GRAPH ATTRIBUTES """
# =================================================================================== #

def get_num_bonds(molecules):
    for mol in molecules:
        yield mol.GetNumBonds()

number_of_bonds = list(get_num_bonds(suppl))

# =================================================================================== #
""" NODE ATTRIBUTES """
# =================================================================================== #

def get_atom_symbols(molecules):
    for mol in molecules:
        atoms = mol.GetAtoms()
        atom_symbols = [atom.GetSymbol() for atom in atoms]
        yield atom_symbols

def get_atom_properties(atom_list):
    for atoms in atom_list:
        atomic_number = [element(atom).atomic_number for atom in atoms]
        atomic_volume = [element(atom).atomic_volume for atom in atoms]
        atomic_weight = [element(atom).atomic_weight for atom in atoms]
        all_atom_properties = list(zip(atomic_number, atomic_volume, atomic_weight))
        yield all_atom_properties

all_atom_properties = list(get_atom_properties(get_atom_symbols(suppl)))

# =================================================================================== #
""" EDGE ATTRIBUTES """
# =================================================================================== #

def get_bonds_info(molecules):
    for mol in molecules:
        bond_types = [int(bond.GetBondTypeAsDouble()) for bond in mol.GetBonds()]

        yield bond_types

bond_types = list(get_bonds_info(suppl))

# =================================================================================== #
""" TARGETS """
# =================================================================================== #

def get_targets(atom_list):
    # Same as the get_atom_properties function from the node attributes section
    for atoms in atom_list:
        boiling_points = [element(atom).boiling_point for atom in atoms]

        yield boiling_points

targets = list(get_targets(get_atom_symbols(suppl)))

def get_num_classes(atom_list):
    num_classes = []
    for atoms in atom_list:
        boiling_points = [element(atom).boiling_point for atom in atoms]
        classes = len(set(boiling_points))
        num_classes.append(classes)
    return max(num_classes)  # We return the max value, as this is the number of classes of the most diverse molecule

num_classes = get_num_classes(get_atom_symbols(suppl))

def normalize(numerical_dataset):
    raw = list(itertools.chain.from_iterable(numerical_dataset))

    maximum = max(raw)
    minimum = min(raw)

    # Accumulate one tensor per molecule instead of overwriting on each iteration
    norm_dataset = []
    for targets in numerical_dataset:
        norm = [(target - minimum) / (maximum - minimum) for target in targets]
        norm_dataset.append(torch.tensor(norm, dtype=torch.float))

    return norm_dataset

norm_targets = normalize(targets)

# =================================================================================== #
""" BUILD DATASETS """
# =================================================================================== #

def separator(datasets):
    for example in datasets:
        yield torch.tensor(example, dtype=torch.float)

edge_index = list(separator(coo_adj_matrix))
node_attr = list(separator(all_atom_properties))
edge_attr = list(separator(bond_types))
Y_data = list(separator(targets))

data_instance = list(map(list, zip(node_attr, edge_index, edge_attr, Y_data)))

def return_data(zipped_data):
    for instance in zipped_data:
        node_attr, edge_index, edge_attr, target = instance

        yield node_attr, edge_index, edge_attr, target

# def data_maker(datapoints):
#     for node_attr, edge_index, edge_attr, target in datapoints:
#         data = Data(x=node_attr, edge_index=edge_index, edge_attr=edge_attr, y=target)
#         yield data
# print(list(data_maker(return_data(data_instance))))
-------------------------------------------------------------------------------- /10.sdf: -------------------------------------------------------------------------------- 1 | 2 | OpenBabel03181914192D 3 | 4 | 24 26 0 0 0 0 0 0 0 0999 V2000 5 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 6 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 7 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 8 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 9 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 10 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 11 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 12 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 13 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 14 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 15 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 16 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 17 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 18 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 19 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 20 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 21 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 22 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 23 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 24 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 25 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 26 | 0.0000 0.0000 0.0000 F 0 0 0 0 0 0 0 0 0 0 0 0 27 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 28 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 29 | 1 2 1 0 0 0 0 30 | 2 3 1 0 0 0 0 31 | 2 4 1 0 0 0 0 32 | 2 5 1 0 0 0 0 33 | 5 24 1 0 0 0 0 34 | 5 6 2 0 0 0 0 35 | 6 7 1 0 0 0 0 36 | 7 8 2 0 0 0 0 37 | 8 23 1 0 0 0 0 38 | 8 9 1 0 0 0 0 39 | 9 10 1 0 0 0 0 40 | 10 11 2 0 0 0 0 41 | 11 12 1 0 0 0 0 42 | 11 23 1 0 
node_attr, edge_index, edge_attr, target in datapoints: 180 | # data = Data(x=node_attr, edge_index=edge_index, edge_attr=edge_attr, y=target) 181 | # print(type(data)) 182 | # yield data 183 | # print(list(data_maker(return_data(data_instance)))) -------------------------------------------------------------------------------- /10.sdf: -------------------------------------------------------------------------------- 1 | 2 | OpenBabel03181914192D 3 | 4 | 24 26 0 0 0 0 0 0 0 0999 V2000 5 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 6 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 7 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 8 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 9 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 10 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 11 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 12 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 13 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 14 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 15 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 16 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 17 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 18 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 19 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 20 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 21 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 22 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 23 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 24 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 25 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 26 | 0.0000 0.0000 0.0000 F 0 0 0 0 0 0 0 0 0 0 0 0 27 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 28 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 29 | 1 2 1 0 0 0 0 30 | 2 3 1 0 0 0 0 31 | 2 4 1 0 0 0 0 32 | 2 5 1 0 0 0 0 33 | 5 24 1 0 0 0 0 34 | 5 6 2 0 0 0 0 35 | 6 7 1 0 0 0 0 36 | 7 8 2 0 0 0 0 37 | 8 23 1 0 0 0 0 38 | 8 9 1 0 0 0 0 39 | 9 10 1 0 0 0 0 40 | 10 11 2 0 0 0 0 41 | 11 12 1 0 0 0 0 42 | 11 23 1 0 
0 0 0 43 | 12 13 1 0 0 0 0 44 | 13 14 2 0 0 0 0 45 | 13 15 1 0 0 0 0 46 | 15 16 1 0 0 0 0 47 | 16 21 1 0 0 0 0 48 | 16 17 2 0 0 0 0 49 | 17 18 1 0 0 0 0 50 | 18 19 2 0 0 0 0 51 | 19 20 1 0 0 0 0 52 | 20 21 2 0 0 0 0 53 | 21 22 1 0 0 0 0 54 | 23 24 2 0 0 0 0 55 | M END 56 | $$$$ 57 | 58 | OpenBabel03181914192D 59 | 60 | 23 25 0 0 1 0 0 0 0 0999 V2000 61 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 62 | 0.0000 0.0000 0.0000 C 0 0 2 0 0 0 0 0 0 0 0 0 63 | 0.0000 0.0000 0.0000 H 0 0 0 0 0 0 0 0 0 0 0 0 64 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 65 | 0.0000 0.0000 0.0000 C 0 0 3 0 0 0 0 0 0 0 0 0 66 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 67 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 68 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 69 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 70 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 71 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 72 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 73 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 74 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 75 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 76 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 77 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 78 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 79 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 80 | 0.0000 0.0000 0.0000 C 0 0 1 0 0 0 0 0 0 0 0 0 81 | 0.0000 0.0000 0.0000 H 0 0 0 0 0 0 0 0 0 0 0 0 82 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 83 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 84 | 1 2 1 0 0 0 0 85 | 2 3 1 0 0 0 0 86 | 2 23 1 0 0 0 0 87 | 2 4 1 0 0 0 0 88 | 4 5 1 0 0 0 0 89 | 5 6 1 0 0 0 0 90 | 5 19 1 0 0 0 0 91 | 6 7 1 0 0 0 0 92 | 7 18 1 0 0 0 0 93 | 7 8 2 0 0 0 0 94 | 8 9 1 0 0 0 0 95 | 9 10 2 0 0 0 0 96 | 10 11 1 0 0 0 0 97 | 11 12 1 0 0 0 0 98 | 11 18 2 0 0 0 0 99 | 12 16 1 0 0 0 0 100 | 12 13 2 0 0 0 0 101 | 13 14 1 0 0 0 0 102 | 14 15 2 0 0 0 0 103 | 15 16 1 0 0 0 0 104 | 16 17 1 0 0 0 0 105 | 19 20 1 0 0 0 0 106 
| 20 21 1 0 0 0 0 107 | 20 22 1 0 0 0 0 108 | 20 23 1 0 0 0 0 109 | M END 110 | $$$$ 111 | 112 | OpenBabel03181914192D 113 | 114 | 30 33 0 0 1 0 0 0 0 0999 V2000 115 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 116 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 117 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 118 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 119 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 120 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 121 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 122 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 123 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 124 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 125 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 126 | 0.0000 0.0000 0.0000 C 0 0 1 0 0 0 0 0 0 0 0 0 127 | 0.0000 0.0000 0.0000 H 0 0 0 0 0 0 0 0 0 0 0 0 128 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 129 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 130 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 131 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 132 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 133 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 134 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 135 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 136 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 137 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 138 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 139 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 140 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 141 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 142 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 143 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 144 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 145 | 1 2 3 0 0 0 0 146 | 2 3 1 0 0 0 0 147 | 3 30 1 0 0 0 0 148 | 3 4 2 0 0 0 0 149 | 4 5 1 0 0 0 0 150 | 5 6 2 0 0 0 0 151 | 6 7 1 0 0 0 0 152 | 6 29 1 0 0 0 0 153 | 7 28 1 0 0 0 0 154 | 7 8 2 0 0 0 0 155 | 8 9 1 0 0 0 0 156 | 9 10 2 0 0 0 0 157 
| 10 11 1 0 0 0 0 158 | 10 27 1 0 0 0 0 159 | 11 12 1 0 0 0 0 160 | 12 13 1 0 0 0 0 161 | 12 14 1 0 0 0 0 162 | 12 21 1 0 0 0 0 163 | 14 15 2 0 0 0 0 164 | 14 16 1 0 0 0 0 165 | 16 20 1 0 0 0 0 166 | 16 17 1 0 0 0 0 167 | 17 18 1 0 0 0 0 168 | 18 19 1 0 0 0 0 169 | 19 20 1 0 0 0 0 170 | 21 26 1 0 0 0 0 171 | 21 22 2 0 0 0 0 172 | 22 23 1 0 0 0 0 173 | 23 24 2 0 0 0 0 174 | 24 25 1 0 0 0 0 175 | 25 26 2 0 0 0 0 176 | 27 28 2 0 0 0 0 177 | 29 30 2 0 0 0 0 178 | M END 179 | $$$$ 180 | 181 | OpenBabel03181914192D 182 | 183 | 31 34 0 0 1 0 0 0 0 0999 V2000 184 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 185 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 186 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 187 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 188 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 189 | 0.0000 0.0000 0.0000 C 0 0 2 0 0 0 0 0 0 0 0 0 190 | 0.0000 0.0000 0.0000 H 0 0 0 0 0 0 0 0 0 0 0 0 191 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 192 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 193 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 194 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 195 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 196 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 197 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 198 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 199 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 200 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 201 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 202 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 203 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 204 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 205 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 206 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 207 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 208 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 209 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 210 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 211 | 
0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 212 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 213 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 214 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 215 | 1 2 1 0 0 0 0 216 | 2 3 1 0 0 0 0 217 | 3 4 1 0 0 0 0 218 | 4 5 2 0 0 0 0 219 | 4 6 1 0 0 0 0 220 | 6 7 1 0 0 0 0 221 | 6 31 1 0 0 0 0 222 | 6 8 1 0 0 0 0 223 | 8 9 1 0 0 0 0 224 | 9 10 1 0 0 0 0 225 | 10 11 1 0 0 0 0 226 | 11 12 1 0 0 0 0 227 | 11 31 1 0 0 0 0 228 | 12 13 2 0 0 0 0 229 | 12 14 1 0 0 0 0 230 | 14 25 2 0 0 0 0 231 | 14 15 1 0 0 0 0 232 | 15 16 2 0 0 0 0 233 | 16 17 1 0 0 0 0 234 | 16 24 1 0 0 0 0 235 | 17 23 1 0 0 0 0 236 | 17 18 2 0 0 0 0 237 | 18 19 1 0 0 0 0 238 | 19 20 2 0 0 0 0 239 | 20 21 1 0 0 0 0 240 | 20 22 1 0 0 0 0 241 | 22 23 2 0 0 0 0 242 | 24 30 1 0 0 0 0 243 | 24 25 1 0 0 0 0 244 | 25 26 1 0 0 0 0 245 | 26 27 1 0 0 0 0 246 | 27 28 1 0 0 0 0 247 | 28 29 1 0 0 0 0 248 | 29 30 1 0 0 0 0 249 | M END 250 | $$$$ 251 | 252 | OpenBabel03181914192D 253 | 254 | 29 31 0 0 1 0 0 0 0 0999 V2000 255 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 256 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 257 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 258 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 259 | 0.0000 0.0000 0.0000 S 0 0 0 0 0 0 0 0 0 0 0 0 260 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 261 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 262 | 0.0000 0.0000 0.0000 O 0 0 0 0 0 0 0 0 0 0 0 0 263 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 264 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 265 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 266 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 267 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 268 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 269 | 0.0000 0.0000 0.0000 Cl 0 0 0 0 0 0 0 0 0 0 0 0 270 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 271 | 0.0000 0.0000 0.0000 N 0 0 0 0 0 0 0 0 0 0 0 0 272 | 0.0000 0.0000 0.0000 C 0 0 0 0 0 0 0 0 0 0 0 0 273 | 
    0.0000    0.0000    0.0000 O   0  5  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  2  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
  1  2  3  0  0  0  0
  2  3  1  6  0  0  0
  3 24  1  1  0  0  0
  3  4  2  0  0  0  0
  4  5  1  6  0  0  0
  4 17  1  1  0  0  0
  5  6  1  0  0  0  0
  6  7  1  0  0  0  0
  7  8  2  0  0  0  0
  7  9  1  0  0  0  0
  9 10  1  0  0  0  0
 10 16  1  0  0  0  0
 10 11  2  0  0  0  0
 11 12  1  0  0  0  0
 12 13  2  0  0  0  0
 13 14  1  0  0  0  0
 14 15  1  0  0  0  0
 14 16  2  0  0  0  0
 17 18  2  0  0  0  0
 18 19  1  6  0  0  0
 18 20  1  1  0  0  0
 20 21  1  0  0  0  0
 20 22  1  0  0  0  0
 20 24  1  0  0  0  0
 22 23  3  0  0  0  0
 24 29  1  0  0  0  0
 24 25  1  0  0  0  0
 25 26  1  0  0  0  0
 26 27  1  0  0  0  0
 27 28  1  0  0  0  0
 28 29  1  0  0  0  0
M  CHG  1  19  -1
M  END
$$$$

 OpenBabel03181914192D

 19 19  0  0  1  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  3  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  2  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  2  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 S   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 Br  0  0  0  0  0  0  0  0  0  0  0  0
  1  2  1  0  0  0  0
  2  3  1  0  0  0  0
  3  4  1  0  0  0  0
  3  5  1  0  0  0  0
  3  7  1  0  0  0  0
  5  6  1  0  0  0  0
  7  8  1  0  0  0  0
  7  9  1  0  0  0  0
  7 11  1  0  0  0  0
  9 10  1  0  0  0  0
 11 12  1  0  0  0  0
 11 13  1  0  0  0  0
 11 14  1  0  0  0  0
 14 18  1  0  0  0  0
 14 15  2  0  0  0  0
 15 16  1  0  0  0  0
 16 17  1  0  0  0  0
 17 18  2  0  0  0  0
 18 19  1  0  0  0  0
M  CHG  1   3   1
M  END
$$$$

 OpenBabel03181914192D

 21 21  0  0  1  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  1  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
  1  2  1  0  0  0  0
  2  3  1  0  0  0  0
  3 20  1  0  0  0  0
  3  4  2  0  0  0  0
  4  5  1  0  0  0  0
  5  6  2  0  0  0  0
  6  7  1  0  0  0  0
  6 19  1  0  0  0  0
  7  8  2  0  0  0  0
  7  9  1  0  0  0  0
  9 10  1  0  0  0  0
  9 11  1  0  0  0  0
 11 12  1  0  0  0  0
 11 13  1  0  0  0  0
 11 14  1  0  0  0  0
 14 15  1  1  0  0  0
 15 16  1  6  0  0  0
 15 17  2  0  0  0  0
 17 18  1  6  0  0  0
 19 20  2  0  0  0  0
 20 21  1  0  0  0  0
M  END
$$$$

 OpenBabel03181914192D

 23 25  0  0  0  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 F   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
  1  2  2  0  0  0  0
  2  3  1  0  0  0  0
  2 10  1  0  0  0  0
  3  4  1  0  0  0  0
  4  9  2  0  0  0  0
  4  5  1  0  0  0  0
  5  6  2  0  0  0  0
  6  7  1  0  0  0  0
  7  8  1  0  0  0  0
  7  9  1  0  0  0  0
 10 15  1  0  0  0  0
 10 11  2  0  0  0  0
 11 12  1  0  0  0  0
 12 13  2  0  0  0  0
 13 14  1  0  0  0  0
 14 15  2  0  0  0  0
 15 16  1  0  0  0  0
 16 17  1  0  0  0  0
 17 23  1  0  0  0  0
 17 18  2  0  0  0  0
 18 19  1  0  0  0  0
 19 20  2  0  0  0  0
 20 21  1  0  0  0  0
 21 22  1  0  0  0  0
 21 23  2  0  0  0  0
M  END
$$$$

 OpenBabel03181914192D

 26 29  0  0  0  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 Br  0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
  1  2  1  0  0  0  0
  2 24  2  0  0  0  0
  2  3  1  0  0  0  0
  3  4  1  1  0  0  0
  3 13  2  0  0  0  0
  4  5  2  0  0  0  0
  5  6  1  6  0  0  0
  6 12  1  0  0  0  0
  6  7  2  0  0  0  0
  7  8  1  0  0  0  0
  8  9  1  0  0  0  0
  8 10  2  0  0  0  0
 10 11  1  0  0  0  0
 11 12  2  0  0  0  0
 13 14  1  0  0  0  0
 13 15  1  0  0  0  0
 15 23  1  0  0  0  0
 15 16  1  0  0  0  0
 16 17  2  0  0  0  0
 16 24  1  0  0  0  0
 17 18  1  0  0  0  0
 18 23  1  0  0  0  0
 18 19  2  0  0  0  0
 19 20  1  0  0  0  0
 20 21  2  0  0  0  0
 21 22  1  0  0  0  0
 22 23  2  0  0  0  0
 24 25  1  0  0  0  0
 25 26  3  0  0  0  0
M  END
$$$$

 OpenBabel03181914192D

 22 23  0  0  1  0  0  0  0  0999 V2000
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  2  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 O   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 Br  0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 C   0  0  1  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 N   0  3  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
    0.0000    0.0000    0.0000 H   0  0  0  0  0  0  0  0  0  0  0  0
  1  2  1  0  0  0  0
  2  3  1  0  0  0  0
  2 17  1  0  0  0  0
  2  4  1  0  0  0  0
  4  5  1  0  0  0  0
  5  6  1  0  0  0  0
  5 15  1  0  0  0  0
  6  7  2  0  0  0  0
  6  8  1  0  0  0  0
  8 13  1  0  0  0  0
  8  9  2  0  0  0  0
  9 10  1  0  0  0  0
 10 11  1  0  0  0  0
 10 12  2  0  0  0  0
 12 13  1  0  0  0  0
 13 14  1  0  0  0  0
 15 16  1  0  0  0  0
 16 17  1  0  0  0  0
 17 18  1  0  0  0  0
 17 19  1  0  0  0  0
 19 20  1  0  0  0  0
 19 21  1  0  0  0  0
 19 22  1  0  0  0  0
M  CHG  1  19   1
M  END
$$$$