├── .gitignore ├── Data └── placeholder ├── Figure_Iris_Ground_Truth.png ├── Figure_Iris_Star_Clustering.png ├── Figure_Plot_Cluster_Comparison.png ├── Figure_StarClustering.png ├── LICENSE ├── README.md ├── basic_english_AffinityPropagation.txt ├── basic_english_StarClustering_original.txt ├── basic_english_upper_angular_centered_lim0p618.txt ├── distances.py ├── iris_limit-0p618.png ├── plot_cluster_comparison.py ├── plot_cluster_iris.py ├── star_clustering.py └── word_vectors_test.py /.gitignore: -------------------------------------------------------------------------------- 1 | .idea 2 | *.vec 3 | __pycache__ -------------------------------------------------------------------------------- /Data/placeholder: -------------------------------------------------------------------------------- 1 | a 2 | -------------------------------------------------------------------------------- /Figure_Iris_Ground_Truth.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/josephius/star-clustering/f4c1e6991f03a0ac543aead1a68d4f6882e2f5b9/Figure_Iris_Ground_Truth.png -------------------------------------------------------------------------------- /Figure_Iris_Star_Clustering.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/josephius/star-clustering/f4c1e6991f03a0ac543aead1a68d4f6882e2f5b9/Figure_Iris_Star_Clustering.png -------------------------------------------------------------------------------- /Figure_Plot_Cluster_Comparison.png: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/josephius/star-clustering/f4c1e6991f03a0ac543aead1a68d4f6882e2f5b9/Figure_Plot_Cluster_Comparison.png -------------------------------------------------------------------------------- /Figure_StarClustering.png: -------------------------------------------------------------------------------- 
https://raw.githubusercontent.com/josephius/star-clustering/f4c1e6991f03a0ac543aead1a68d4f6882e2f5b9/Figure_StarClustering.png -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | Copyright 2020 Joseph Lin Chu 2 | 3 | Apache License 4 | Version 2.0, January 2004 5 | http://www.apache.org/licenses/ 6 | 7 | TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION 8 | 9 | 1. Definitions. 10 | 11 | "License" shall mean the terms and conditions for use, reproduction, 12 | and distribution as defined by Sections 1 through 9 of this document. 13 | 14 | "Licensor" shall mean the copyright owner or entity authorized by 15 | the copyright owner that is granting the License. 16 | 17 | "Legal Entity" shall mean the union of the acting entity and all 18 | other entities that control, are controlled by, or are under common 19 | control with that entity. For the purposes of this definition, 20 | "control" means (i) the power, direct or indirect, to cause the 21 | direction or management of such entity, whether by contract or 22 | otherwise, or (ii) ownership of fifty percent (50%) or more of the 23 | outstanding shares, or (iii) beneficial ownership of such entity. 24 | 25 | "You" (or "Your") shall mean an individual or Legal Entity 26 | exercising permissions granted by this License. 27 | 28 | "Source" form shall mean the preferred form for making modifications, 29 | including but not limited to software source code, documentation 30 | source, and configuration files. 31 | 32 | "Object" form shall mean any form resulting from mechanical 33 | transformation or translation of a Source form, including but 34 | not limited to compiled object code, generated documentation, 35 | and conversions to other media types. 
36 | 37 | "Work" shall mean the work of authorship, whether in Source or 38 | Object form, made available under the License, as indicated by a 39 | copyright notice that is included in or attached to the work 40 | (an example is provided in the Appendix below). 41 | 42 | "Derivative Works" shall mean any work, whether in Source or Object 43 | form, that is based on (or derived from) the Work and for which the 44 | editorial revisions, annotations, elaborations, or other modifications 45 | represent, as a whole, an original work of authorship. For the purposes 46 | of this License, Derivative Works shall not include works that remain 47 | separable from, or merely link (or bind by name) to the interfaces of, 48 | the Work and Derivative Works thereof. 49 | 50 | "Contribution" shall mean any work of authorship, including 51 | the original version of the Work and any modifications or additions 52 | to that Work or Derivative Works thereof, that is intentionally 53 | submitted to Licensor for inclusion in the Work by the copyright owner 54 | or by an individual or Legal Entity authorized to submit on behalf of 55 | the copyright owner. For the purposes of this definition, "submitted" 56 | means any form of electronic, verbal, or written communication sent 57 | to the Licensor or its representatives, including but not limited to 58 | communication on electronic mailing lists, source code control systems, 59 | and issue tracking systems that are managed by, or on behalf of, the 60 | Licensor for the purpose of discussing and improving the Work, but 61 | excluding communication that is conspicuously marked or otherwise 62 | designated in writing by the copyright owner as "Not a Contribution." 63 | 64 | "Contributor" shall mean Licensor and any individual or Legal Entity 65 | on behalf of whom a Contribution has been received by Licensor and 66 | subsequently incorporated within the Work. 67 | 68 | 2. Grant of Copyright License. 
Subject to the terms and conditions of 69 | this License, each Contributor hereby grants to You a perpetual, 70 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 71 | copyright license to reproduce, prepare Derivative Works of, 72 | publicly display, publicly perform, sublicense, and distribute the 73 | Work and such Derivative Works in Source or Object form. 74 | 75 | 3. Grant of Patent License. Subject to the terms and conditions of 76 | this License, each Contributor hereby grants to You a perpetual, 77 | worldwide, non-exclusive, no-charge, royalty-free, irrevocable 78 | (except as stated in this section) patent license to make, have made, 79 | use, offer to sell, sell, import, and otherwise transfer the Work, 80 | where such license applies only to those patent claims licensable 81 | by such Contributor that are necessarily infringed by their 82 | Contribution(s) alone or by combination of their Contribution(s) 83 | with the Work to which such Contribution(s) was submitted. If You 84 | institute patent litigation against any entity (including a 85 | cross-claim or counterclaim in a lawsuit) alleging that the Work 86 | or a Contribution incorporated within the Work constitutes direct 87 | or contributory patent infringement, then any patent licenses 88 | granted to You under this License for that Work shall terminate 89 | as of the date such litigation is filed. 90 | 91 | 4. Redistribution. 
You may reproduce and distribute copies of the 92 | Work or Derivative Works thereof in any medium, with or without 93 | modifications, and in Source or Object form, provided that You 94 | meet the following conditions: 95 | 96 | (a) You must give any other recipients of the Work or 97 | Derivative Works a copy of this License; and 98 | 99 | (b) You must cause any modified files to carry prominent notices 100 | stating that You changed the files; and 101 | 102 | (c) You must retain, in the Source form of any Derivative Works 103 | that You distribute, all copyright, patent, trademark, and 104 | attribution notices from the Source form of the Work, 105 | excluding those notices that do not pertain to any part of 106 | the Derivative Works; and 107 | 108 | (d) If the Work includes a "NOTICE" text file as part of its 109 | distribution, then any Derivative Works that You distribute must 110 | include a readable copy of the attribution notices contained 111 | within such NOTICE file, excluding those notices that do not 112 | pertain to any part of the Derivative Works, in at least one 113 | of the following places: within a NOTICE text file distributed 114 | as part of the Derivative Works; within the Source form or 115 | documentation, if provided along with the Derivative Works; or, 116 | within a display generated by the Derivative Works, if and 117 | wherever such third-party notices normally appear. The contents 118 | of the NOTICE file are for informational purposes only and 119 | do not modify the License. You may add Your own attribution 120 | notices within Derivative Works that You distribute, alongside 121 | or as an addendum to the NOTICE text from the Work, provided 122 | that such additional attribution notices cannot be construed 123 | as modifying the License. 
124 | 125 | You may add Your own copyright statement to Your modifications and 126 | may provide additional or different license terms and conditions 127 | for use, reproduction, or distribution of Your modifications, or 128 | for any such Derivative Works as a whole, provided Your use, 129 | reproduction, and distribution of the Work otherwise complies with 130 | the conditions stated in this License. 131 | 132 | 5. Submission of Contributions. Unless You explicitly state otherwise, 133 | any Contribution intentionally submitted for inclusion in the Work 134 | by You to the Licensor shall be under the terms and conditions of 135 | this License, without any additional terms or conditions. 136 | Notwithstanding the above, nothing herein shall supersede or modify 137 | the terms of any separate license agreement you may have executed 138 | with Licensor regarding such Contributions. 139 | 140 | 6. Trademarks. This License does not grant permission to use the trade 141 | names, trademarks, service marks, or product names of the Licensor, 142 | except as required for reasonable and customary use in describing the 143 | origin of the Work and reproducing the content of the NOTICE file. 144 | 145 | 7. Disclaimer of Warranty. Unless required by applicable law or 146 | agreed to in writing, Licensor provides the Work (and each 147 | Contributor provides its Contributions) on an "AS IS" BASIS, 148 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or 149 | implied, including, without limitation, any warranties or conditions 150 | of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A 151 | PARTICULAR PURPOSE. You are solely responsible for determining the 152 | appropriateness of using or redistributing the Work and assume any 153 | risks associated with Your exercise of permissions under this License. 154 | 155 | 8. Limitation of Liability. 
In no event and under no legal theory, 156 | whether in tort (including negligence), contract, or otherwise, 157 | unless required by applicable law (such as deliberate and grossly 158 | negligent acts) or agreed to in writing, shall any Contributor be 159 | liable to You for damages, including any direct, indirect, special, 160 | incidental, or consequential damages of any character arising as a 161 | result of this License or out of the use or inability to use the 162 | Work (including but not limited to damages for loss of goodwill, 163 | work stoppage, computer failure or malfunction, or any and all 164 | other commercial damages or losses), even if such Contributor 165 | has been advised of the possibility of such damages. 166 | 167 | 9. Accepting Warranty or Additional Liability. While redistributing 168 | the Work or Derivative Works thereof, You may choose to offer, 169 | and charge a fee for, acceptance of support, warranty, indemnity, 170 | or other liability obligations and/or rights consistent with this 171 | License. However, in accepting such obligations, You may act only 172 | on Your own behalf and on Your sole responsibility, not on behalf 173 | of any other Contributor, and only if You agree to indemnify, 174 | defend, and hold each Contributor harmless for any liability 175 | incurred by, or claims asserted against, such Contributor by reason 176 | of your accepting any such warranty or additional liability. 177 | 178 | END OF TERMS AND CONDITIONS 179 | 180 | APPENDIX: How to apply the Apache License to your work. 181 | 182 | To apply the Apache License to your work, attach the following 183 | boilerplate notice, with the fields enclosed by brackets "[]" 184 | replaced with your own identifying information. (Don't include 185 | the brackets!) The text should be enclosed in the appropriate 186 | comment syntax for the file format. 
We also recommend that a 187 | file or class name and description of purpose be included on the 188 | same "printed page" as the copyright notice for easier 189 | identification within third-party archives. 190 | 191 | Copyright [yyyy] [name of copyright owner] 192 | 193 | Licensed under the Apache License, Version 2.0 (the "License"); 194 | you may not use this file except in compliance with the License. 195 | You may obtain a copy of the License at 196 | 197 | http://www.apache.org/licenses/LICENSE-2.0 198 | 199 | Unless required by applicable law or agreed to in writing, software 200 | distributed under the License is distributed on an "AS IS" BASIS, 201 | WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 202 | See the License for the specific language governing permissions and 203 | limitations under the License. 204 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # Star Clustering 2 | 3 | ## Introduction 4 | 5 | The Star Clustering algorithm is a clustering technique loosely inspired by, and analogous to, the process of star system formation. It is intended as an alternative clustering algorithm that requires neither knowing the number of clusters in advance nor any hyperparameter tuning. 6 | 7 | ## Installation 8 | 9 | The following dependencies should be installed: 10 | 11 | * numpy 12 | * scipy 13 | 14 | ## Setup 15 | 16 | It is recommended that you also install Scikit-Learn, as the implementation provided here is meant to work with Scikit-Learn as a drop-in replacement for other algorithms.
17 | 18 | The actual algorithm is located in star_clustering.py and can be imported with: 19 | 20 | `from star_clustering import StarCluster` 21 | 22 | Then instantiate the algorithm: 23 | 24 | `star = StarCluster()` 25 | 26 | Then call the fit or predict methods as you would with any other clustering algorithm in Scikit-Learn. 27 | 28 | ## Test Scripts 29 | 30 | Three test scripts are provided to demonstrate the effectiveness of the algorithm on very different types of data: 31 | 32 | * plot_cluster_comparison.py 33 | * word_vectors_test.py 34 | * plot_cluster_iris.py 35 | 36 | Note that the word vectors test requires a copy of the FastText pretrained word vectors or some equivalent, which is not provided here. (https://fasttext.cc/docs/en/english-vectors.html) 37 | 38 | ## Example Plot Results 39 | 40 | ### Performance On Various Synthetic Test Data 41 | 42 | ![Plot Cluster Comparison - Star Clustering](Figure_StarClustering.png) 43 | 44 | ### Comparison To Other Algorithms 45 | 46 | ![Plot Cluster Comparison - Other Algorithms](Figure_Plot_Cluster_Comparison.png) 47 | 48 | ### Performance On Iris Data 49 | 50 | ![Plot Cluster Iris - Star Clustering](Figure_Iris_Star_Clustering.png) 51 | 52 | ### Iris Data Ground Truth 53 | 54 | ![Plot Cluster Iris - Ground Truth](Figure_Iris_Ground_Truth.png) 55 | 56 | ## High-Dimensional Clustering Focused Updates 57 | 58 | * angular distance metric added (see https://en.wikipedia.org/wiki/Cosine_similarity#Angular_distance_and_similarity) 59 | * distance calculations moved to classes in distances.py to enable switching via a method argument and future distance metric additions 60 | * optional centering of data rows and columns added to the distance metrics 61 | * optional limit_exp argument added to the StarCluster.fit() method to allow scaling of the constant used in calculating the limit value 62 | * optional upper threshold added that prevents connections from being made
to nodes above a certain mass in an attempt to (very) roughly emulate golden section search 63 | * new features were (hopefully) added in such a way that everything should work identically to the original code if default arguments are passed to the StarCluster.fit() method 64 | * **basic_english_upper_angular_centered_lim0p618.txt** contains the results of word clustering with arguments of fit(vectors, upper=True, limit_exp=-1, dis_type='angular') 65 | 66 | ## Refactoring Update 67 | 68 | * switched to using the pdist function from scipy for distance metrics 69 | * changed hyperparameter names to be a bit more verbose but clearer 70 | * allowed the constant factors to take any float value 71 | * added a verbose debugging flag 72 | 73 | ## License 74 | 75 | Apache 2.0 -------------------------------------------------------------------------------- /basic_english_AffinityPropagation.txt: -------------------------------------------------------------------------------- 1 | 0: the, of, a, for, by, group, left, history, part, form, development, position, number, current, among, death, control, general, test, interest, order, side, attack, growth, knowledge, structure, rate, size, base, natural, unit, loss, medical, degree, amount, physical, weight, star, hearing, division, mass, memory, flight, driving, distribution, exchange, motion, selection, transport, approval, payment, representative, ring, frame, square, waiting, chain, discovery, destruction, punishment, muscle, stem, dependent, receipt, arch, stocking, jewel, 2 | 1: and, to, in, on, with, as, from, at, after, over, where, through, under, between, against, future, second, present, early, open, lead, near, meeting, across, match, living, late, fall, increase, addition, middle, comparison, opposite, till, parallel, hanging, pocket, 3 | 2: this, about, page, here, way, name, how, place, much, change, point, discussion, why, process, question, list, line, war, view, material, same, language, record, reason, account, enough,
word, event, wrong, true, space, decision, statement, idea, rule, hope, theory, answer, simple, stage, table, act, argument, disease, picture, complex, agreement, bit, box, plant, peace, voice, error, balance, detail, connection, net, protest, expansion, thread, wave, root, cloud, trick, sock, bell, paste, 4 | 3: I, you, please, 5 | 4: that, all, some, other, any, new, right, last, different, free, every, old, full, top, together, common, almost, past, certain, complete, key, dead, normal, separate, respect, regular, false, fixed, rest, secret, equal, flat, mixed, fat, dust, probable, 6 | 5: talk, work, help, experience, play, field, design, writing, care, force, value, teaching, produce, attention, fight, credit, damage, condition, train, direction, mine, copy, waste, prose, observation, circle, plate, paint, clock, reward, invention, adjustment, stamp, brush, canvas, polish, stitch, plough, 7 | 6: or, not, have, be, but, he, will, so, only, if, like, who, when, there, use, then, now, because, than, no, first, such, before, while, even, make, may, support, again, still, say, very, well, need, though, possible, fact, little, thought, small, ever, level, example, face, request, note, cause, far, effect, offer, sense, opinion, mind, turn, sign, quite, cover, seem, range, judge, measure, yes, limit, able, chance, scale, doubt, mark, attempt, trouble, existence, touch, danger, surprise, frequent, lift, stretch, automatic, crack, twist, tendency, bent, crush, skirt, 8 | 7: government, system, public, country, power, law, money, education, political, land, military, science, society, private, tax, religion, authority, committee, nation, debt, army, island, branch, apparatus, 9 | 8: good, important, high, great, clear, bad, poor, quality, hard, necessary, strong, low, serious, safe, happy, clean, responsible, beautiful, quick, healthy, ready, solid, cheap, ill, bright, wise, smooth, rough, dear, foolish, feeble, 10 | 9: up, out, back, off, down, end, cut, forward, front, 
step, straight, button, edge, roll, kick, bone, lock, hook, boot, burst, fold, drain, cushion, 11 | 10: time, year, day, week, news, night, month, morning, yesterday, hour, tomorrow, minute, 12 | 11: company, business, market, industry, office, organization, competition, trade, learning, operation, manager, crime, store, insurance, profit, advertisement, shoe, 13 | 12: book, story, paper, reading, map, library, letter, guide, print, fiction, bag, ticket, verse, shelf, 14 | 13: do, see, get, take, put, go, look, start, run, move, give, keep, come, stop, let, watch, drop, join, send, walk, push, fly, stick, pull, jump, shut, swim, smash, rub, comb, 15 | 14: man, person, young, woman, girl, brain, machine, boy, animal, camera, earth, gun, hole, hat, moon, shirt, snake, monkey, worm, trousers, knot, porter, 16 | 15: water, air, oil, blood, fish, river, sea, heat, ice, chemical, coal, sugar, salt, sand, electric, liquid, grass, cup, grain, pump, sky, breath, steam, pipe, bottle, basin, soap, powder, bath, bucket, boiling, mist, sponge, 17 | 16: long, short, deep, wide, slow, narrow, tall, thin, thick, sharp, stiff, 18 | 17: music, song, band, art, sound, instrument, noise, rhythm, harmony, 19 | 18: family, school, building, house, town, property, station, church, room, hospital, price, owner, prison, window, wall, door, seat, tree, ball, floor, farm, stone, garden, wing, dress, roof, cook, brick, servant, basket, carriage, oven, parcel, drawer, ornament, 20 | 19: food, drink, meat, wine, rice, meal, soup, potato, tray, digestion, 21 | 20: love, hate, desire, pleasure, kiss, 22 | 21: body, head, chief, hair, arm, secretary, neck, blow, tail, chest, knee, stomach, wound, bite, umbrella, horn, whip, toe, 23 | 22: light, dark, sun, shade, brake, ray, bulb, 24 | 23: hand, foot, leg, finger, pin, thumb, grip, shake, glove, 25 | 24: fire, smoke, burn, flame, 26 | 25: mother, father, son, married, daughter, friend, birth, brother, baby, sister, egg, tongue, 27 | 26: road, 
street, distance, bridge, mountain, journey, rail, wheel, curve, slope, cart, 28 | 27: summer, spring, winter, 29 | 28: thing, special, round, kind, heart, purpose, sort, behaviour, expert, suggestion, self, substance, strange, soft, quiet, seed, sweet, angle, relation, sudden, cruel, conscious, nerve, poison, delicate, needle, hollow, nut, amusement, 30 | 29: board, ship, engine, plane, flag, boat, vessel, harbour, sail, 31 | 30: south, north, west, east, 32 | 31: female, sex, male, 33 | 32: fear, pain, feeling, reaction, belief, taste, shock, comfort, shame, smell, attraction, regret, impulse, disgust, 34 | 33: dog, horse, cat, rat, collar, 35 | 34: metal, glass, wood, steel, iron, copper, wire, brass, tin, rod, 36 | 35: bed, sleep, awake, 37 | 36: weather, wind, snow, rain, acid, thunder, 38 | 37: cold, dry, warm, wet, wash, wax, fertile, sticky, 39 | 38: black, white, red, green, blue, card, colour, yellow, brown, orange, grey, 40 | 39: gold, silver, spoon, 41 | 40: fruit, flower, leaf, apple, berry, 42 | 41: bird, feather, fowl, 43 | 42: violent, sad, angry, tired, loud, bitter, 44 | 43: broken, loose, tight, dirty, slip, screw, elastic, 45 | 44: bread, cheese, cake, butter, 46 | 45: eye, skin, mouth, nose, ear, throat, lip, chin, 47 | 46: laugh, cry, smile, humour, 48 | 47: pot, kettle, 49 | 48: knife, blade, fork, scissors, 50 | 49: cotton, coat, leather, cloth, silk, wool, cord, linen, 51 | 50: milk, sheep, cow, pig, goat, 52 | 51: hammer, nail, tooth, 53 | 52: whistle, 54 | 53: cough, sneeze, 55 | 54: pen, ink, pencil, chalk, 56 | 55: curtain, 57 | 56: insect, bee, ant, 58 | 57: jelly, 59 | 58: cork, 60 | -------------------------------------------------------------------------------- /basic_english_StarClustering_original.txt: -------------------------------------------------------------------------------- 1 | 0: south, north, west, east, 2 | 1: mother, father, son, married, daughter, friend, birth, brother, baby, sister, milk, bread, cheese, egg, butter, 
servant, porter, 3 | 2: female, sex, male, 4 | 3: light, black, white, red, green, blue, flag, card, dark, colour, yellow, fruit, hole, brown, grass, cloud, orange, grey, sky, knife, coat, powder, apple, blade, collar, mist, berry, 5 | 4: high, top, low, 6 | 5: the, and, of, to, in, a, that, for, on, with, as, or, by, from, at, this, not, all, about, but, page, he, so, only, if, time, some, like, who, other, when, any, after, there, talk, use, then, up, over, out, here, now, because, do, work, than, no, new, where, first, such, before, way, through, while, name, how, even, see, make, under, year, right, get, between, good, place, back, again, against, system, much, still, change, say, point, day, last, left, very, off, well, down, take, discussion, why, family, school, process, history, question, part, put, different, list, power, future, book, though, law, free, money, line, second, go, every, possible, look, form, water, view, present, development, early, material, position, please, thing, long, number, great, start, current, building, play, fact, old, among, story, death, field, same, language, control, run, record, general, end, education, little, open, move, thought, test, week, small, reason, account, clear, lead, house, give, full, keep, enough, come, near, ever, level, word, event, together, example, design, writing, interest, body, wrong, order, land, office, face, side, care, space, food, decision, news, common, meeting, stop, head, let, statement, almost, past, night, note, attack, force, rule, value, hope, town, certain, paper, cause, hand, far, effect, growth, complete, young, across, offer, bad, poor, short, reading, match, property, quality, sense, living, late, month, answer, air, fire, opinion, cut, station, church, mind, board, structure, hard, simple, rate, turn, sign, necessary, quite, strong, room, cover, special, size, base, fall, competition, stage, dead, increase, table, hospital, act, road, argument, price, disease, religion, picture, 
summer, operation, produce, attention, complex, loss, authority, agreement, addition, fight, round, map, bit, kind, box, plant, separate, forward, heart, credit, degree, library, respect, range, judge, purpose, crime, peace, voice, chief, serious, measure, damage, amount, error, physical, morning, yes, letter, weight, star, debt, watch, safe, blood, train, balance, hearing, front, direction, store, limit, happy, step, fear, mass, able, chance, street, mine, copy, memory, scale, fixed, brain, flight, doubt, detail, mark, feeling, river, sea, machine, rest, driving, engine, drop, prison, plane, sort, window, distribution, wall, door, seat, deep, spring, join, exchange, send, heat, net, distance, winter, wide, weather, tree, motion, bridge, ball, eye, print, clean, straight, slow, floor, camera, selection, behaviour, transport, trouble, broken, equal, protest, reaction, skin, ice, wind, profit, responsible, waste, beautiful, expansion, payment, walk, existence, hour, representative, touch, bed, chemical, quick, metal, button, ring, farm, flat, hair, sleep, foot, frame, square, mixed, waiting, healthy, ready, stone, self, middle, solid, snow, mountain, rain, garden, belief, fiction, chain, danger, wing, branch, substance, thread, arm, minute, discovery, wood, cheap, violent, edge, mouth, strange, opposite, journey, fly, surprise, stick, till, prose, instrument, noise, dress, smoke, sad, soft, destruction, parallel, narrow, shock, secretary, leg, roof, quiet, steel, punishment, angry, iron, wave, sand, roll, neck, bag, circle, plate, ill, root, ticket, bright, kick, stretch, seed, jump, sweet, electric, sun, shut, dust, clock, loose, rail, liquid, bone, blow, shame, acid, angle, automatic, nose, sheep, reward, hat, lock, hook, sharp, smooth, trick, cup, tail, tight, wheel, meal, muscle, laugh, wire, burn, curve, chest, pump, rough, pot, stem, dirty, breath, steam, cry, smell, smile, cake, hanging, invention, pipe, loud, sudden, crack, knee, attraction, verse, flower, 
ear, dependent, sock, stomach, cook, finger, wound, bitter, adjustment, pocket, slip, tongue, pen, stamp, bite, rhythm, conscious, basin, dear, boot, burst, brick, leaf, twist, nerve, humour, regret, poison, slope, tendency, probable, throat, pin, fold, drain, bell, soup, paste, harmony, apparatus, cow, ink, flame, delicate, basket, shade, canvas, pig, hammer, bent, receipt, shelf, fork, thumb, grip, shoe, carriage, umbrella, bath, rat, nail, insect, tooth, worm, wool, potato, arch, needle, crush, screw, hollow, cord, impulse, bucket, rod, whip, skirt, toe, goat, polish, bee, brake, fertile, nut, awake, oven, cough, pencil, lip, amusement, curtain, boiling, knot, smash, disgust, chin, parcel, ray, glove, bulb, ant, thunder, chalk, tray, rub, feather, kettle, digestion, jelly, comb, sponge, scissors, drawer, stocking, jewel, ornament, stitch, spade, plough, fowl, sneeze, 7 | 6: wise, tired, cruel, foolish, feeble, 8 | 7: important, key, secret, 9 | 8: gold, silver, rice, copper, shirt, cotton, leather, cloth, brass, silk, tin, trousers, spoon, linen, cork, 10 | 9: yesterday, tomorrow, 11 | 10: man, person, woman, girl, boy, 12 | 11: condition, connection, comparison, relation, 13 | 12: cold, dry, warm, paint, wet, soap, brush, wash, shake, wax, 14 | 13: ship, fish, gun, boat, meat, vessel, snake, harbour, swim, sail, 15 | 14: love, hate, desire, kiss, 16 | 15: country, nation, island, earth, moon, 17 | 17: music, science, art, sound, 18 | 18: have, be, seem, 19 | 19: group, organization, song, society, band, unit, committee, horn, whistle, 20 | 21: fat, tall, thin, thick, stiff, sticky, elastic, 21 | 23: public, private, 22 | 25: natural, normal, regular, frequent, 23 | 26: will, may, 24 | 27: request, idea, theory, attempt, approval, suggestion, 25 | 28: company, business, market, industry, oil, trade, manager, division, owner, animal, dog, drink, horse, wine, glass, cat, coal, sugar, bird, salt, grain, bottle, advertisement, monkey, cart, 26 | 29: I, you, 27 | 32: 
true, false, 28 | 33: learning, teaching, 29 | 34: government, war, political, military, tax, medical, army, insurance, 30 | 35: support, help, need, guide, lift, 31 | 36: push, pull, 32 | 37: experience, knowledge, pain, expert, taste, observation, pleasure, comfort, cushion, 33 | -------------------------------------------------------------------------------- /basic_english_upper_angular_centered_lim0p618.txt: -------------------------------------------------------------------------------- 1 | 0: south, north, west, east, middle, 2 | 1: light, black, white, red, green, blue, dark, skin, hair, colour, yellow, brown, orange, grey, shade, ray, bulb, 3 | 2: female, sex, male, 4 | 3: right, good, left, possible, important, high, great, general, clear, top, enough, wrong, common, bad, poor, key, necessary, strong, low, error, driving, responsible, beautiful, mixed, solid, bright, sharp, probable, 5 | 4: war, death, peace, balance, birth, hanging, rhythm, harmony, 6 | 5: mother, father, son, married, daughter, friend, mine, brother, baby, sister, coal, dear, 7 | 6: dry, wet, sticky, 8 | 7: early, late, 9 | 8: man, person, living, woman, dead, girl, boy, self, 10 | 9: stage, gold, net, metal, frame, wood, silver, steel, iron, blow, acid, grass, wheel, copper, wire, pipe, knife, blade, brass, horn, tin, rod, whistle, wax, brake, curtain, 11 | 10: the, and, of, to, in, a, that, for, on, with, as, or, by, from, at, I, this, you, not, have, be, all, about, but, he, will, so, only, if, some, like, who, other, when, any, after, there, use, then, over, here, now, because, than, no, where, such, before, through, while, even, see, make, may, under, get, between, back, again, against, much, still, change, very, well, take, part, put, different, though, go, every, look, form, please, fact, among, same, record, little, move, test, small, lead, give, keep, come, near, ever, together, example, side, stop, statement, almost, note, attack, certain, far, across, offer, match, turn, sign, 
quite, cover, produce, loss, addition, fight, bit, box, separate, forward, seem, damage, watch, front, fixed, join, send, broken, profit, push, button, edge, opposite, stick, destruction, parallel, pull, lift, kick, crack, dependent, wound, twist, fold, fork, shake, crush, whip, knot, smash, stitch, 12 | 11: window, door, drink, wine, glass, bottle, poison, cork, 13 | 12: body, head, mind, heart, brain, mouth, neck, chest, stomach, tongue, throat, cord, 14 | 13: time, year, day, week, night, month, fall, summer, morning, yesterday, spring, winter, hour, tomorrow, minute, 15 | 14: level, picture, degree, machine, engine, motion, camera, gun, discovery, angle, steam, invention, hammer, nail, plough, 16 | 15: flat, smooth, rough, 17 | 16: history, country, nation, island, 18 | 17: laugh, cry, smile, humour, 19 | 18: weight, fat, tall, thin, thick, elastic, 20 | 19: first, last, second, round, 21 | 20: animal, ball, milk, comfort, bread, sheep, cheese, egg, cake, cotton, butter, cloth, powder, cow, basket, pig, rat, monkey, silk, wool, bucket, goat, cushion, linen, jelly, sponge, spade, 22 | 21: point, question, view, answer, opinion, purpose, yes, doubt, instrument, 23 | 22: law, field, science, tax, act, religion, crime, debt, fiction, punishment, 24 | 24: music, art, sound, voice, noise, loud, 25 | 25: ship, boat, vessel, harbour, sail, 26 | 26: public, private, secret, 27 | 28: up, out, off, down, free, open, rate, trade, exchange, transport, shut, rail, lock, burst, 28 | 29: long, short, range, deep, wide, straight, narrow, bent, 29 | 30: work, writing, reading, 30 | 31: true, false, 31 | 32: thing, kind, sort, hat, trick, 32 | 33: ring, clock, hook, curve, slope, bell, 33 | 34: disease, snake, insect, worm, bee, ant, 34 | 35: request, idea, suggestion, observation, 35 | 36: violent, wise, cruel, foolish, hollow, feeble, 36 | 37: fire, burn, flame, 37 | 38: name, school, book, line, language, education, word, town, station, church, board, hospital, road, learning, 
teaching, medical, committee, library, store, street, memory, prison, tree, square, branch, circle, reward, shelf, drawer, stocking, 38 | 39: fruit, seed, leaf, apple, nut, berry, 39 | 40: force, cause, effect, special, natural, normal, serious, physical, regular, trouble, chemical, frequent, root, automatic, stem, 40 | 41: page, talk, discussion, material, account, argument, substance, 41 | 42: heat, cold, warm, 42 | 43: care, happy, protest, strange, sad, angry, ill, sweet, tired, sudden, bitter, potato, 43 | 44: power, position, control, order, office, authority, seat, approval, ticket, electric, stamp, grip, 44 | 45: development, building, event, design, land, space, meeting, growth, property, structure, competition, hearing, expansion, payment, roof, adjustment, receipt, parcel, 45 | 46: hard, soft, 46 | 47: interest, love, hope, attention, fear, chance, pain, feeling, hate, danger, desire, pleasure, shame, attraction, regret, tendency, kiss, impulse, amusement, disgust, rub, 47 | 48: way, how, why, reason, sense, 48 | 49: wall, stone, brick, arch, 49 | 50: scissors, 50 | 51: young, safe, healthy, quiet, fertile, 51 | 52: clean, dirty, drain, wash, 52 | 53: paper, pen, ink, pencil, chalk, 53 | 54: quality, touch, taste, smell, 54 | 55: reaction, surprise, shock, 55 | 56: government, family, political, house, military, society, base, plant, army, farm, garden, flower, jewel, ornament, 56 | 57: cough, sneeze, 57 | 58: group, list, number, organization, band, table, amount, 58 | 59: blood, cup, pot, soup, boiling, kettle, 59 | 60: full, complete, equal, 60 | 61: cut, copy, print, paint, bone, muscle, brush, paste, tooth, polish, comb, 61 | 62: water, story, news, condition, sea, weather, ice, wind, snow, mountain, rain, wave, liquid, 62 | 63: credit, card, dog, cat, bag, bite, pin, collar, 63 | 64: foot, arm, leg, knee, nerve, umbrella, toe, 64 | 65: room, rest, floor, bed, sleep, conscious, soap, bath, awake, 65 | 66: song, detail, prose, verse, 66 | 67: face, 
wing, nose, tail, lip, chin, 67 | 68: river, bridge, basin, 68 | 69: money, waste, dress, shirt, pocket, coat, boot, leather, shoe, trousers, skirt, 69 | 70: eye, ear, 70 | 71: roll, loose, tight, pump, slip, screw, stiff, 71 | 72: oil, sugar, salt, sand, dust, rice, grain, canvas, 72 | 73: rule, hand, size, map, mass, scale, thread, hole, finger, thumb, needle, glove, 73 | 74: connection, behaviour, comparison, relation, 74 | 75: simple, complex, slow, quick, cheap, delicate, 75 | 76: new, future, present, current, old, past, 76 | 77: start, end, mark, till, 77 | 78: train, horse, carriage, cart, 78 | 79: system, process, thought, decision, agreement, judge, letter, direction, step, able, attempt, waiting, ready, apparatus, 79 | 81: play, run, fish, flight, drop, plane, flag, walk, fly, bird, jump, swim, feather, fowl, 80 | 82: company, business, industry, operation, unit, division, distribution, insurance, selection, existence, advertisement, 81 | 83: air, star, earth, smoke, sun, cloud, sky, breath, moon, thunder, mist, 82 | 84: experience, knowledge, theory, distance, expert, belief, journey, 83 | 85: do, place, say, let, 84 | 86: manager, chief, owner, representative, secretary, 85 | 87: increase, measure, limit, stretch, 86 | 88: market, food, value, price, meat, chain, plate, meal, sock, cook, servant, oven, tray, spoon, digestion, porter, 87 | 89: support, help, need, respect, guide, 88 | -------------------------------------------------------------------------------- /distances.py: -------------------------------------------------------------------------------- 1 | import numpy as np 2 | ''' 3 | classes for creating a matrix of the distances between all members of a set of vectors 4 | ''' 5 | 6 | 7 | class Angular(object): 8 | 9 | @staticmethod 10 | def calc(vectors, center_cols=True, center_rows=True): 11 | n = vectors.shape[0] 12 | target = np.zeros((n, n), dtype='float32') 13 | if center_cols: 14 | vectors = vectors - np.expand_dims(np.average(vectors, 
axis=0), axis=0)
15 |         if center_rows:
16 |             vectors = vectors - np.expand_dims(np.average(vectors, axis=-1), axis=-1)
17 | 
18 |         # vector norms remain the same each loop iteration so only have to be calculated once
19 |         norms = np.linalg.norm(vectors, axis=-1)
20 |         # loop calculates cosine similarity between 1 vector and all vectors each iteration
21 |         for i in range(n):
22 |             target[i] = (vectors.dot(vectors[i])) / (norms[i] * norms)
23 | 
24 |         # convert cosine similarities to angular distance (clip to [-1, 1] so fp error cannot make arccos return NaN)
25 |         target = np.arccos(np.clip(target, -1.0, 1.0)) / np.pi
26 |         return target
27 | 
28 | 
29 | class Euclidean(object):
30 | 
31 |     @staticmethod
32 |     def calc(vectors, center_cols=False, center_rows=False):
33 |         n = vectors.shape[0]
34 |         d = vectors.shape[1]
35 |         target = np.zeros((n, n, d), dtype='float32')
36 | 
37 |         if center_cols:
38 |             vectors = vectors - np.expand_dims(np.average(vectors, axis=0), axis=0)
39 |         if center_rows:
40 |             vectors = vectors - np.expand_dims(np.average(vectors, axis=-1), axis=-1)
41 | 
42 |         for i in range(n):
43 |             for j in range(n):
44 |                 target[i, j] = vectors[i] - vectors[j]
45 | 
46 |         target = np.linalg.norm(target, axis=-1)
47 |         return target
48 | 
--------------------------------------------------------------------------------
/iris_limit-0p618.png:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/josephius/star-clustering/f4c1e6991f03a0ac543aead1a68d4f6882e2f5b9/iris_limit-0p618.png
--------------------------------------------------------------------------------
/plot_cluster_comparison.py:
--------------------------------------------------------------------------------
1 | """
2 | =========================================================
3 | Comparing different clustering algorithms on toy datasets
4 | =========================================================
5 | 
6 | This example shows characteristics of different
7 | clustering algorithms on datasets that are "interesting"
8 | but still in 2D.
With the exception of the last dataset,
9 | the parameters of each of these dataset-algorithm pairs
10 | have been tuned to produce good clustering results. Some
11 | algorithms are more sensitive to parameter values than
12 | others.
13 | 
14 | The last dataset is an example of a 'null' situation for
15 | clustering: the data is homogeneous, and there is no good
16 | clustering. For this example, the null dataset uses the
17 | same parameters as the dataset in the row above it, which
18 | represents a mismatch in the parameter values and the
19 | data structure.
20 | 
21 | While these examples give some intuition about the
22 | algorithms, this intuition might not apply to very high
23 | dimensional data.
24 | """
25 | print(__doc__)
26 | 
27 | import time
28 | import warnings
29 | 
30 | import numpy as np
31 | import matplotlib.pyplot as plt
32 | 
33 | from sklearn import cluster, datasets, mixture
34 | from sklearn.neighbors import kneighbors_graph
35 | from sklearn.preprocessing import StandardScaler
36 | from itertools import cycle, islice
37 | from star_clustering import StarCluster
38 | 
39 | np.random.seed(0)
40 | 
41 | # ============
42 | # Generate datasets.
We choose the size big enough to see the scalability
43 | # of the algorithms, but small enough to keep running times reasonable
44 | # ============
45 | n_samples = 1500
46 | noisy_circles = datasets.make_circles(n_samples=n_samples, factor=.5,
47 |                                       noise=.05)
48 | noisy_moons = datasets.make_moons(n_samples=n_samples, noise=.05)
49 | blobs = datasets.make_blobs(n_samples=n_samples, random_state=8)
50 | no_structure = np.random.rand(n_samples, 2), None
51 | 
52 | # Anisotropically distributed data
53 | random_state = 170
54 | X, y = datasets.make_blobs(n_samples=n_samples, random_state=random_state)
55 | transformation = [[0.6, -0.6], [-0.4, 0.8]]
56 | X_aniso = np.dot(X, transformation)
57 | aniso = (X_aniso, y)
58 | 
59 | # blobs with varied variances
60 | varied = datasets.make_blobs(n_samples=n_samples,
61 |                              cluster_std=[1.0, 2.5, 0.5],
62 |                              random_state=random_state)
63 | 
64 | # ============
65 | # Set up cluster parameters
66 | # ============
67 | plt.figure(figsize=(9 * 2 + 3, 12.5))
68 | plt.subplots_adjust(left=.02, right=.98, bottom=.001, top=.96, wspace=.05,
69 |                     hspace=.01)
70 | 
71 | plot_num = 1
72 | 
73 | default_base = {'quantile': .3,
74 |                 'eps': .3,
75 |                 'damping': .9,
76 |                 'preference': -200,
77 |                 'n_neighbors': 10,
78 |                 'n_clusters': 3}
79 | 
80 | datasets = [
81 |     (noisy_circles, {'damping': .77, 'preference': -240,
82 |                      'quantile': .2, 'n_clusters': 2}),
83 |     (noisy_moons, {'damping': .75, 'preference': -220, 'n_clusters': 2}),
84 |     (varied, {'eps': .18, 'n_neighbors': 2}),
85 |     (aniso, {'eps': .15, 'n_neighbors': 2}),
86 |     (blobs, {}),
87 |     (no_structure, {})]
88 | 
89 | for i_dataset, (dataset, algo_params) in enumerate(datasets):
90 |     # update parameters with dataset-specific values
91 |     params = default_base.copy()
92 |     params.update(algo_params)
93 | 
94 |     X, y = dataset
95 | 
96 |     # normalize dataset for easier parameter selection
97 |     X = StandardScaler().fit_transform(X)
98 | 
99 |     # estimate bandwidth for mean shift
100 |     bandwidth =
cluster.estimate_bandwidth(X, quantile=params['quantile']) 101 | 102 | # connectivity matrix for structured Ward 103 | connectivity = kneighbors_graph( 104 | X, n_neighbors=params['n_neighbors'], include_self=False) 105 | # make connectivity symmetric 106 | connectivity = 0.5 * (connectivity + connectivity.T) 107 | 108 | # ============ 109 | # Create cluster objects 110 | # ============ 111 | ms = cluster.MeanShift(bandwidth=bandwidth, bin_seeding=True) 112 | two_means = cluster.MiniBatchKMeans(n_clusters=params['n_clusters']) 113 | ward = cluster.AgglomerativeClustering( 114 | n_clusters=params['n_clusters'], linkage='ward', 115 | connectivity=connectivity) 116 | spectral = cluster.SpectralClustering( 117 | n_clusters=params['n_clusters'], eigen_solver='arpack', 118 | affinity="nearest_neighbors") 119 | dbscan = cluster.DBSCAN(eps=params['eps']) 120 | affinity_propagation = cluster.AffinityPropagation( 121 | damping=params['damping'], preference=params['preference']) 122 | average_linkage = cluster.AgglomerativeClustering( 123 | linkage="average", affinity="cityblock", 124 | n_clusters=params['n_clusters'], connectivity=connectivity) 125 | birch = cluster.Birch(n_clusters=params['n_clusters']) 126 | gmm = mixture.GaussianMixture( 127 | n_components=params['n_clusters'], covariance_type='full') 128 | star = StarCluster() 129 | 130 | clustering_algorithms = ( 131 | ('MiniBatchKMeans', two_means), 132 | ('AffinityPropagation', affinity_propagation), 133 | ('MeanShift', ms), 134 | ('SpectralClustering', spectral), 135 | ('Ward', ward), 136 | ('AgglomerativeClustering', average_linkage), 137 | ('DBSCAN', dbscan), 138 | ('Birch', birch), 139 | ('GaussianMixture', gmm), 140 | ('StarClustering', star), 141 | ) 142 | 143 | for name, algorithm in clustering_algorithms: 144 | t0 = time.time() 145 | 146 | # catch warnings related to kneighbors_graph 147 | with warnings.catch_warnings(): 148 | warnings.filterwarnings( 149 | "ignore", 150 | message="the number of connected 
components of the " +
151 |                 "connectivity matrix is [0-9]{1,2}" +
152 |                 " > 1. Completing it to avoid stopping the tree early.",
153 |             category=UserWarning)
154 |         warnings.filterwarnings(
155 |             "ignore",
156 |             message="Graph is not fully connected, spectral embedding" +
157 |             " may not work as expected.",
158 |             category=UserWarning)
159 |         algorithm.fit(X)
160 | 
161 |     t1 = time.time()
162 |     if hasattr(algorithm, 'labels_'):
163 |         y_pred = algorithm.labels_.astype(int)
164 |     else:
165 |         y_pred = algorithm.predict(X)
166 | 
167 |     plt.subplot(len(datasets), len(clustering_algorithms), plot_num)
168 |     if i_dataset == 0:
169 |         plt.title(name, size=18)
170 | 
171 |     colors = np.array(list(islice(cycle(['#377eb8', '#ff7f00', '#4daf4a',
172 |                                          '#f781bf', '#a65628', '#984ea3',
173 |                                          '#999999', '#e41a1c', '#dede00']),
174 |                                   int(max(y_pred) + 1))))
175 |     # add black color for outliers (if any)
176 |     colors = np.append(colors, ["#000000"])
177 |     plt.scatter(X[:, 0], X[:, 1], s=10, color=colors[y_pred])
178 | 
179 |     plt.xlim(-2.5, 2.5)
180 |     plt.ylim(-2.5, 2.5)
181 |     plt.xticks(())
182 |     plt.yticks(())
183 |     plt.text(.99, .01, ('%.2fs' % (t1 - t0)).lstrip('0'),
184 |              transform=plt.gca().transAxes, size=15,
185 |              horizontalalignment='right')
186 |     plot_num += 1
187 | 
188 | plt.show()
189 | 
--------------------------------------------------------------------------------
/plot_cluster_iris.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 | # -*- coding: utf-8 -*-
3 | 
4 | """
5 | =========================================================
6 | Star Clustering
7 | =========================================================
8 | 
9 | """
10 | print(__doc__)
11 | 
12 | # Adapted by: Joseph Chu
13 | # Code source: Gaël Varoquaux
14 | # Modified for documentation by Jaques Grobler
15 | # License: BSD 3 clause
16 | 
17 | import numpy as np
18 | import matplotlib.pyplot as plt
19 | # Though the following import is not directly being used, it
is required
20 | # for 3D projection to work
21 | from mpl_toolkits.mplot3d import Axes3D
22 | 
23 | from sklearn.cluster import KMeans
24 | from sklearn import datasets
25 | 
26 | from star_clustering import StarCluster
27 | 
28 | np.random.seed(5)
29 | 
30 | iris = datasets.load_iris()
31 | X = iris.data
32 | y = iris.target
33 | 
34 | estimators = [('star_clustering_iris', StarCluster())]
35 | 
36 | fignum = 1
37 | titles = ['star clustering']
38 | for name, est in estimators:
39 |     fig = plt.figure(fignum, figsize=(4, 3))
40 |     ax = Axes3D(fig, rect=[0, 0, .95, 1], elev=48, azim=134)
41 |     est.fit(X)
42 |     labels = est.labels_
43 | 
44 |     ax.scatter(X[:, 3], X[:, 0], X[:, 2],
45 |                c=labels.astype(float), edgecolor='k')
46 | 
47 |     ax.w_xaxis.set_ticklabels([])
48 |     ax.w_yaxis.set_ticklabels([])
49 |     ax.w_zaxis.set_ticklabels([])
50 |     ax.set_xlabel('Petal width')
51 |     ax.set_ylabel('Sepal length')
52 |     ax.set_zlabel('Petal length')
53 |     ax.set_title(titles[fignum - 1])
54 |     ax.dist = 12
55 |     fignum = fignum + 1
56 | 
57 | # Plot the ground truth
58 | fig = plt.figure(fignum, figsize=(4, 3))
59 | ax = Axes3D(fig, rect=[0, 0, .95, 1], elev=48, azim=134)
60 | 
61 | for name, label in [('Setosa', 0),
62 |                     ('Versicolour', 1),
63 |                     ('Virginica', 2)]:
64 |     ax.text3D(X[y == label, 3].mean(),
65 |               X[y == label, 0].mean(),
66 |               X[y == label, 2].mean() + 2, name,
67 |               horizontalalignment='center',
68 |               bbox=dict(alpha=.2, edgecolor='w', facecolor='w'))
69 | # Reorder the labels to have colors matching the cluster results
70 | y = np.choose(y, [1, 2, 0]).astype(float)
71 | ax.scatter(X[:, 3], X[:, 0], X[:, 2], c=y, edgecolor='k')
72 | 
73 | ax.w_xaxis.set_ticklabels([])
74 | ax.w_yaxis.set_ticklabels([])
75 | ax.w_zaxis.set_ticklabels([])
76 | ax.set_xlabel('Petal width')
77 | ax.set_ylabel('Sepal length')
78 | ax.set_zlabel('Petal length')
79 | ax.set_title('Ground Truth')
80 | ax.dist = 12
81 | 
82 | plt.show()
83 | 
--------------------------------------------------------------------------------
/star_clustering.py:
--------------------------------------------------------------------------------
1 | # Copyright 2020 Joseph Lin Chu
2 | #
3 | # Licensed under the Apache License, Version 2.0 (the "License");
4 | # you may not use this file except in compliance with the License.
5 | # You may obtain a copy of the License at
6 | #
7 | #     http://www.apache.org/licenses/LICENSE-2.0
8 | #
9 | # Unless required by applicable law or agreed to in writing, software
10 | # distributed under the License is distributed on an "AS IS" BASIS,
11 | # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
12 | # See the License for the specific language governing permissions and
13 | # limitations under the License.
14 | 
15 | import numpy as np
16 | from scipy.spatial.distance import pdist, squareform
17 | 
18 | class StarCluster(object):
19 |     """Star Clustering Algorithm"""
20 |     def __init__(self, has_upper_cutoff=False, limit_factor='golden_ratio', threshold_factor='golden_ratio_conjugate', distance_type='euclidean', debug=False):
21 |         self.has_upper_cutoff = has_upper_cutoff
22 |         # Set Constant of Proportionality
23 |         if limit_factor == 'golden_ratio':
24 |             self.limit_factor = ((1.0 + (5.0 ** 0.5)) / 2.0)
25 |         elif limit_factor == 'golden_ratio_conjugate':
26 |             self.limit_factor = ((1.0 + (5.0 ** 0.5)) / 2.0) - 1.0
27 |         elif isinstance(limit_factor, float):
28 |             self.limit_factor = limit_factor
29 |         else:
30 |             raise ValueError("limit_factor must be 'golden_ratio', 'golden_ratio_conjugate', or a float")
31 |         if threshold_factor == 'golden_ratio':
32 |             self.threshold_factor = ((1.0 + (5.0 ** 0.5)) / 2.0)
33 |         elif threshold_factor == 'golden_ratio_conjugate':
34 |             self.threshold_factor = ((1.0 + (5.0 ** 0.5)) / 2.0) - 1.0
35 |         elif isinstance(threshold_factor, float):
36 |             self.threshold_factor = threshold_factor
37 |         else:
38 |             raise ValueError("threshold_factor must be 'golden_ratio', 'golden_ratio_conjugate', or a float")
39 |         self.distance_type = distance_type
40 |         self.debug = debug
41 | 
42 |     def fit(self, X):
43 |         # Number of Nodes
44 |         n =
X.shape[0]
45 |         # Number of Features
46 |         d = X.shape[1]
47 | 
48 |         # Construct Node-To-Node Matrix of Distances
49 |         distances_matrix = squareform(pdist(X, self.distance_type))
50 | 
51 |         # Determine Average Distance And Extend By Constant of Proportionality To Set Limit
52 |         limit = np.sum(distances_matrix) / (n * n - n) * self.limit_factor
53 | 
54 |         # Construct List of All Pairwise Distances
55 |         distances_list = []
56 |         for i in range(n):
57 |             for j in range(n):
58 |                 if i < j:
59 |                     distances_list.append((i, j, distances_matrix[i, j]))
60 | 
61 |         # Sort List of Distances From Shortest To Longest
62 |         distances_list.sort(key=lambda x: x[2])
63 | 
64 |         # Build Matrix of Connections Starting With Closest Pairs Until Average Mass Exceeds Limit
65 |         empty_clusters = []
66 |         mindex = 0
67 |         self.labels_ = np.zeros(n, dtype='int32') - 1
68 |         if self.has_upper_cutoff:
69 |             self.ulabels = np.zeros(n, dtype='int32')
70 |         mass = np.zeros(n, dtype='float32')
71 |         while np.mean(mass) <= limit and mindex < len(distances_list):
72 |             i, j, distance = distances_list[mindex]
73 |             mass[i] += distance
74 |             mass[j] += distance
75 |             if self.labels_[i] == -1 and self.labels_[j] == -1:
76 |                 if not empty_clusters:
77 |                     empty_clusters = [np.max(self.labels_) + 1]
78 |                 empty_clusters.sort()
79 |                 cluster = empty_clusters.pop(0)
80 |                 self.labels_[i] = cluster
81 |                 self.labels_[j] = cluster
82 |             elif self.labels_[i] == -1:
83 |                 self.labels_[i] = self.labels_[j]
84 |             elif self.labels_[j] == -1:
85 |                 self.labels_[j] = self.labels_[i]
86 |             elif self.labels_[i] < self.labels_[j]:
87 |                 empty_clusters.append(self.labels_[j])
88 |                 self.labels_[np.argwhere(self.labels_ == self.labels_[j])] = self.labels_[i]
89 |             elif self.labels_[j] < self.labels_[i]:
90 |                 empty_clusters.append(self.labels_[i])
91 |                 self.labels_[np.argwhere(self.labels_ == self.labels_[i])] = self.labels_[j]
92 |             mindex += 1
93 | 
94 |         distances_matrix[distances_matrix == 0.0] = np.inf
95 |         # Reduce Mass of Each Node By Effectively Twice The Distance To Nearest
Neighbour 96 | for i in range(n): 97 | mindex = np.argmin(distances_matrix[i]) 98 | distance = distances_matrix[i, mindex] 99 | mass[i] -= distance 100 | mass[mindex] -= distance 101 | 102 | # Set Threshold Based On Average Modified By Deviation Reduced By Constant of Proportionality 103 | threshold = np.mean(mass) - np.std(mass) * self.threshold_factor 104 | 105 | # Disconnect Lower Mass Nodes 106 | for i in range(n): 107 | if mass[i] <= threshold: 108 | self.labels_[i] = -1 109 | if self.has_upper_cutoff: 110 | uthreshold = np.mean(mass) + np.std(mass) * self.threshold_factor 111 | for i in range(n): 112 | if mass[i] >= uthreshold: 113 | self.ulabels[i] = -1 114 | 115 | # Ignore Masses of Nodes In Clusters Now 116 | mass[self.labels_ != -1] = -np.inf 117 | acount = 0 118 | # Go Through Disconnected Nodes From Highest To Lowest Mass And Reconnect To Nearest Neighbour That Hasn't Already Connected To It Earlier 119 | while -1 in self.labels_: 120 | i = np.argmax(mass) 121 | mindex = i 122 | if self.has_upper_cutoff: 123 | while self.labels_[mindex] == -1: 124 | dsorted = np.argsort(distances_matrix[i]) 125 | not_connected = True 126 | sidx = 0 127 | while not_connected: 128 | cidx = dsorted[sidx] 129 | if self.ulabels[cidx] == -1: 130 | acount += 1 131 | sidx += 1 132 | else: 133 | mindex = cidx 134 | not_connected = False 135 | distances_matrix[i, mindex] = np.inf 136 | else: 137 | while self.labels_[mindex] == -1: 138 | mindex = np.argmin(distances_matrix[i]) 139 | distance = distances_matrix[i, mindex] 140 | distances_matrix[i, mindex] = np.inf 141 | self.labels_[i] = self.labels_[mindex] 142 | mass[i] = -np.inf 143 | 144 | if self.has_upper_cutoff and self.debug: 145 | print('Connections to nodes above upper mass threshold skipped: {}'.format(acount)) 146 | return self 147 | 148 | def predict(self, X): 149 | self.fit(X) 150 | return self.labels_ -------------------------------------------------------------------------------- /word_vectors_test.py: 
--------------------------------------------------------------------------------
1 | import codecs
2 | import numpy as np
3 | import nltk
4 | from star_clustering import StarCluster
5 | 
6 | wordlist = nltk.corpus.words.words('en-basic')
7 | 
8 | print('Getting vectors...')
9 | words = []
10 | vectors = []
11 | count = 0
12 | with codecs.open('Data/wiki-news-300d-1M.vec', 'r', 'utf-8') as f:
13 |     for i, line in enumerate(f):
14 |         """
15 |         if i > 0 and i <= 1000:
16 |             data = line.strip().split()
17 |             words.append(data[0])
18 |             vectors.append(np.array([float(value) for value in data[1:]]))
19 |         if i > 1000:
20 |             break
21 |         """
22 |         word = line.strip().split()[0]
23 |         if word in wordlist:
24 |             words.append(word)
25 |             vectors.append(np.array([float(value) for value in line.split()[1:]]))
26 |             count += 1
27 |             if count == len(wordlist):
28 |                 break
29 | print(len(words))
30 | vectors = np.array(vectors)
31 | 
32 | for name, algorithm in [('StarClustering', StarCluster(has_upper_cutoff=True, distance_type='cosine'))]:
33 |     print(name)
34 |     algorithm.fit(vectors)  # fit() only takes X; the upper-cutoff and distance options are set on the constructor above
35 |     # note: 'angular' is not a scipy pdist metric; 'cosine' (which ranks pairs identically) is used instead
36 |     labels = algorithm.labels_
37 |     with codecs.open('basic_english_' + name + '.txt', 'w', 'utf-8') as f:
38 |         for cluster in range(max(labels)+1):
39 |             if cluster in labels:
40 |                 f.write(str(cluster) + ': ')
41 |                 for index in np.argwhere(labels == cluster):
42 |                     f.write(words[int(index)] + ', ')
43 |                 f.write('\n')
44 | 
45 | 
46 | 
47 | 
--------------------------------------------------------------------------------
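For reference, the scale-setting step at the start of `StarCluster.fit()` — the connection limit derived from the mean off-diagonal pairwise distance times the golden ratio — can be sketched standalone. This is a minimal illustration of that one step; the helper name `connection_limit` is not part of the repo.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

GOLDEN_RATIO = (1.0 + 5.0 ** 0.5) / 2.0

def connection_limit(X):
    """Mean off-diagonal pairwise distance, scaled by the golden ratio.

    Mirrors the limit computation at the start of StarCluster.fit().
    """
    n = X.shape[0]
    distances_matrix = squareform(pdist(X, 'euclidean'))
    # the diagonal of the square-form matrix is zero, so dividing by
    # n*n - n averages over the off-diagonal entries only
    return np.sum(distances_matrix) / (n * n - n) * GOLDEN_RATIO

# four corners of the unit square: side pairs at distance 1, diagonals at sqrt(2)
X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
limit = connection_limit(X)
```

In `fit()`, pairs are then connected shortest-first until the average per-node mass exceeds this limit, after which low-mass nodes are disconnected and reattached.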