├── .gitignore
├── LICENSE
├── README.md
├── data
│   ├── eth
│   │   ├── hotel
│   │   │   ├── annotated.svg
│   │   │   ├── obs_map.pkl
│   │   │   ├── pixel_pos.csv
│   │   │   └── pixel_pos_interpolate.csv
│   │   └── univ
│   │       ├── obs_map.pkl
│   │       ├── pixel_pos.csv
│   │       └── pixel_pos_interpolate.csv
│   ├── pixel_pos_format.md
│   └── ucy
│       ├── univ
│       │   ├── annotated.svg
│       │   ├── obs_map.pkl
│       │   ├── pixel_pos.csv
│       │   └── pixel_pos_interpolate.csv
│       └── zara
│           ├── zara01
│           │   ├── annotated.svg
│           │   ├── obs_map.pkl
│           │   ├── pixel_pos.csv
│           │   └── pixel_pos_interpolate.csv
│           └── zara02
│               ├── annotated.svg
│               ├── obs_map.pkl
│               ├── pixel_pos.csv
│               └── pixel_pos_interpolate.csv
├── make_directories.sh
├── scripts
│   └── plot_training_curve.py
└── srnn
    ├── __init__.py
    ├── attn_visualize.py
    ├── criterion.py
    ├── helper.py
    ├── model.py
    ├── sample.py
    ├── st_graph.py
    ├── test
    │   ├── __init__.py
    │   ├── criterion_test.py
    │   ├── graph_visualization_test.py
    │   └── st_graph_test.py
    ├── train.py
    ├── utils.py
    └── visualize.py
/.gitignore:
--------------------------------------------------------------------------------
1 | *.rar
2 | *.avi
3 | *.vsp
4 | *.txt
5 | *.mat
6 | *.m
7 | *.png
8 | *.pyc
9 | save
10 | *.cpkl
11 | *.backup
12 | test.py
13 | save_backup
14 | save_lstm
15 | plot
16 | backup
17 | debug.py
18 | *.pdf
19 | debug*
20 | tests/
21 | save_*
22 | plot_*
23 | copy_from_server.sh
24 | log
25 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | GNU GENERAL PUBLIC LICENSE
2 | Version 3, 29 June 2007
3 |
4 | Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
5 | Everyone is permitted to copy and distribute verbatim copies
6 | of this license document, but changing it is not allowed.
7 |
8 | Preamble
9 |
10 | The GNU General Public License is a free, copyleft license for
11 | software and other kinds of works.
12 |
13 | The licenses for most software and other practical works are designed
14 | to take away your freedom to share and change the works. By contrast,
15 | the GNU General Public License is intended to guarantee your freedom to
16 | share and change all versions of a program--to make sure it remains free
17 | software for all its users. We, the Free Software Foundation, use the
18 | GNU General Public License for most of our software; it applies also to
19 | any other work released this way by its authors. You can apply it to
20 | your programs, too.
21 |
22 | When we speak of free software, we are referring to freedom, not
23 | price. Our General Public Licenses are designed to make sure that you
24 | have the freedom to distribute copies of free software (and charge for
25 | them if you wish), that you receive source code or can get it if you
26 | want it, that you can change the software or use pieces of it in new
27 | free programs, and that you know you can do these things.
28 |
29 | To protect your rights, we need to prevent others from denying you
30 | these rights or asking you to surrender the rights. Therefore, you have
31 | certain responsibilities if you distribute copies of the software, or if
32 | you modify it: responsibilities to respect the freedom of others.
33 |
34 | For example, if you distribute copies of such a program, whether
35 | gratis or for a fee, you must pass on to the recipients the same
36 | freedoms that you received. You must make sure that they, too, receive
37 | or can get the source code. And you must show them these terms so they
38 | know their rights.
39 |
40 | Developers that use the GNU GPL protect your rights with two steps:
41 | (1) assert copyright on the software, and (2) offer you this License
42 | giving you legal permission to copy, distribute and/or modify it.
43 |
44 | For the developers' and authors' protection, the GPL clearly explains
45 | that there is no warranty for this free software. For both users' and
46 | authors' sake, the GPL requires that modified versions be marked as
47 | changed, so that their problems will not be attributed erroneously to
48 | authors of previous versions.
49 |
50 | Some devices are designed to deny users access to install or run
51 | modified versions of the software inside them, although the manufacturer
52 | can do so. This is fundamentally incompatible with the aim of
53 | protecting users' freedom to change the software. The systematic
54 | pattern of such abuse occurs in the area of products for individuals to
55 | use, which is precisely where it is most unacceptable. Therefore, we
56 | have designed this version of the GPL to prohibit the practice for those
57 | products. If such problems arise substantially in other domains, we
58 | stand ready to extend this provision to those domains in future versions
59 | of the GPL, as needed to protect the freedom of users.
60 |
61 | Finally, every program is threatened constantly by software patents.
62 | States should not allow patents to restrict development and use of
63 | software on general-purpose computers, but in those that do, we wish to
64 | avoid the special danger that patents applied to a free program could
65 | make it effectively proprietary. To prevent this, the GPL assures that
66 | patents cannot be used to render the program non-free.
67 |
68 | The precise terms and conditions for copying, distribution and
69 | modification follow.
70 |
71 | TERMS AND CONDITIONS
72 |
73 | 0. Definitions.
74 |
75 | "This License" refers to version 3 of the GNU General Public License.
76 |
77 | "Copyright" also means copyright-like laws that apply to other kinds of
78 | works, such as semiconductor masks.
79 |
80 | "The Program" refers to any copyrightable work licensed under this
81 | License. Each licensee is addressed as "you". "Licensees" and
82 | "recipients" may be individuals or organizations.
83 |
84 | To "modify" a work means to copy from or adapt all or part of the work
85 | in a fashion requiring copyright permission, other than the making of an
86 | exact copy. The resulting work is called a "modified version" of the
87 | earlier work or a work "based on" the earlier work.
88 |
89 | A "covered work" means either the unmodified Program or a work based
90 | on the Program.
91 |
92 | To "propagate" a work means to do anything with it that, without
93 | permission, would make you directly or secondarily liable for
94 | infringement under applicable copyright law, except executing it on a
95 | computer or modifying a private copy. Propagation includes copying,
96 | distribution (with or without modification), making available to the
97 | public, and in some countries other activities as well.
98 |
99 | To "convey" a work means any kind of propagation that enables other
100 | parties to make or receive copies. Mere interaction with a user through
101 | a computer network, with no transfer of a copy, is not conveying.
102 |
103 | An interactive user interface displays "Appropriate Legal Notices"
104 | to the extent that it includes a convenient and prominently visible
105 | feature that (1) displays an appropriate copyright notice, and (2)
106 | tells the user that there is no warranty for the work (except to the
107 | extent that warranties are provided), that licensees may convey the
108 | work under this License, and how to view a copy of this License. If
109 | the interface presents a list of user commands or options, such as a
110 | menu, a prominent item in the list meets this criterion.
111 |
112 | 1. Source Code.
113 |
114 | The "source code" for a work means the preferred form of the work
115 | for making modifications to it. "Object code" means any non-source
116 | form of a work.
117 |
118 | A "Standard Interface" means an interface that either is an official
119 | standard defined by a recognized standards body, or, in the case of
120 | interfaces specified for a particular programming language, one that
121 | is widely used among developers working in that language.
122 |
123 | The "System Libraries" of an executable work include anything, other
124 | than the work as a whole, that (a) is included in the normal form of
125 | packaging a Major Component, but which is not part of that Major
126 | Component, and (b) serves only to enable use of the work with that
127 | Major Component, or to implement a Standard Interface for which an
128 | implementation is available to the public in source code form. A
129 | "Major Component", in this context, means a major essential component
130 | (kernel, window system, and so on) of the specific operating system
131 | (if any) on which the executable work runs, or a compiler used to
132 | produce the work, or an object code interpreter used to run it.
133 |
134 | The "Corresponding Source" for a work in object code form means all
135 | the source code needed to generate, install, and (for an executable
136 | work) run the object code and to modify the work, including scripts to
137 | control those activities. However, it does not include the work's
138 | System Libraries, or general-purpose tools or generally available free
139 | programs which are used unmodified in performing those activities but
140 | which are not part of the work. For example, Corresponding Source
141 | includes interface definition files associated with source files for
142 | the work, and the source code for shared libraries and dynamically
143 | linked subprograms that the work is specifically designed to require,
144 | such as by intimate data communication or control flow between those
145 | subprograms and other parts of the work.
146 |
147 | The Corresponding Source need not include anything that users
148 | can regenerate automatically from other parts of the Corresponding
149 | Source.
150 |
151 | The Corresponding Source for a work in source code form is that
152 | same work.
153 |
154 | 2. Basic Permissions.
155 |
156 | All rights granted under this License are granted for the term of
157 | copyright on the Program, and are irrevocable provided the stated
158 | conditions are met. This License explicitly affirms your unlimited
159 | permission to run the unmodified Program. The output from running a
160 | covered work is covered by this License only if the output, given its
161 | content, constitutes a covered work. This License acknowledges your
162 | rights of fair use or other equivalent, as provided by copyright law.
163 |
164 | You may make, run and propagate covered works that you do not
165 | convey, without conditions so long as your license otherwise remains
166 | in force. You may convey covered works to others for the sole purpose
167 | of having them make modifications exclusively for you, or provide you
168 | with facilities for running those works, provided that you comply with
169 | the terms of this License in conveying all material for which you do
170 | not control copyright. Those thus making or running the covered works
171 | for you must do so exclusively on your behalf, under your direction
172 | and control, on terms that prohibit them from making any copies of
173 | your copyrighted material outside their relationship with you.
174 |
175 | Conveying under any other circumstances is permitted solely under
176 | the conditions stated below. Sublicensing is not allowed; section 10
177 | makes it unnecessary.
178 |
179 | 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
180 |
181 | No covered work shall be deemed part of an effective technological
182 | measure under any applicable law fulfilling obligations under article
183 | 11 of the WIPO copyright treaty adopted on 20 December 1996, or
184 | similar laws prohibiting or restricting circumvention of such
185 | measures.
186 |
187 | When you convey a covered work, you waive any legal power to forbid
188 | circumvention of technological measures to the extent such circumvention
189 | is effected by exercising rights under this License with respect to
190 | the covered work, and you disclaim any intention to limit operation or
191 | modification of the work as a means of enforcing, against the work's
192 | users, your or third parties' legal rights to forbid circumvention of
193 | technological measures.
194 |
195 | 4. Conveying Verbatim Copies.
196 |
197 | You may convey verbatim copies of the Program's source code as you
198 | receive it, in any medium, provided that you conspicuously and
199 | appropriately publish on each copy an appropriate copyright notice;
200 | keep intact all notices stating that this License and any
201 | non-permissive terms added in accord with section 7 apply to the code;
202 | keep intact all notices of the absence of any warranty; and give all
203 | recipients a copy of this License along with the Program.
204 |
205 | You may charge any price or no price for each copy that you convey,
206 | and you may offer support or warranty protection for a fee.
207 |
208 | 5. Conveying Modified Source Versions.
209 |
210 | You may convey a work based on the Program, or the modifications to
211 | produce it from the Program, in the form of source code under the
212 | terms of section 4, provided that you also meet all of these conditions:
213 |
214 | a) The work must carry prominent notices stating that you modified
215 | it, and giving a relevant date.
216 |
217 | b) The work must carry prominent notices stating that it is
218 | released under this License and any conditions added under section
219 | 7. This requirement modifies the requirement in section 4 to
220 | "keep intact all notices".
221 |
222 | c) You must license the entire work, as a whole, under this
223 | License to anyone who comes into possession of a copy. This
224 | License will therefore apply, along with any applicable section 7
225 | additional terms, to the whole of the work, and all its parts,
226 | regardless of how they are packaged. This License gives no
227 | permission to license the work in any other way, but it does not
228 | invalidate such permission if you have separately received it.
229 |
230 | d) If the work has interactive user interfaces, each must display
231 | Appropriate Legal Notices; however, if the Program has interactive
232 | interfaces that do not display Appropriate Legal Notices, your
233 | work need not make them do so.
234 |
235 | A compilation of a covered work with other separate and independent
236 | works, which are not by their nature extensions of the covered work,
237 | and which are not combined with it such as to form a larger program,
238 | in or on a volume of a storage or distribution medium, is called an
239 | "aggregate" if the compilation and its resulting copyright are not
240 | used to limit the access or legal rights of the compilation's users
241 | beyond what the individual works permit. Inclusion of a covered work
242 | in an aggregate does not cause this License to apply to the other
243 | parts of the aggregate.
244 |
245 | 6. Conveying Non-Source Forms.
246 |
247 | You may convey a covered work in object code form under the terms
248 | of sections 4 and 5, provided that you also convey the
249 | machine-readable Corresponding Source under the terms of this License,
250 | in one of these ways:
251 |
252 | a) Convey the object code in, or embodied in, a physical product
253 | (including a physical distribution medium), accompanied by the
254 | Corresponding Source fixed on a durable physical medium
255 | customarily used for software interchange.
256 |
257 | b) Convey the object code in, or embodied in, a physical product
258 | (including a physical distribution medium), accompanied by a
259 | written offer, valid for at least three years and valid for as
260 | long as you offer spare parts or customer support for that product
261 | model, to give anyone who possesses the object code either (1) a
262 | copy of the Corresponding Source for all the software in the
263 | product that is covered by this License, on a durable physical
264 | medium customarily used for software interchange, for a price no
265 | more than your reasonable cost of physically performing this
266 | conveying of source, or (2) access to copy the
267 | Corresponding Source from a network server at no charge.
268 |
269 | c) Convey individual copies of the object code with a copy of the
270 | written offer to provide the Corresponding Source. This
271 | alternative is allowed only occasionally and noncommercially, and
272 | only if you received the object code with such an offer, in accord
273 | with subsection 6b.
274 |
275 | d) Convey the object code by offering access from a designated
276 | place (gratis or for a charge), and offer equivalent access to the
277 | Corresponding Source in the same way through the same place at no
278 | further charge. You need not require recipients to copy the
279 | Corresponding Source along with the object code. If the place to
280 | copy the object code is a network server, the Corresponding Source
281 | may be on a different server (operated by you or a third party)
282 | that supports equivalent copying facilities, provided you maintain
283 | clear directions next to the object code saying where to find the
284 | Corresponding Source. Regardless of what server hosts the
285 | Corresponding Source, you remain obligated to ensure that it is
286 | available for as long as needed to satisfy these requirements.
287 |
288 | e) Convey the object code using peer-to-peer transmission, provided
289 | you inform other peers where the object code and Corresponding
290 | Source of the work are being offered to the general public at no
291 | charge under subsection 6d.
292 |
293 | A separable portion of the object code, whose source code is excluded
294 | from the Corresponding Source as a System Library, need not be
295 | included in conveying the object code work.
296 |
297 | A "User Product" is either (1) a "consumer product", which means any
298 | tangible personal property which is normally used for personal, family,
299 | or household purposes, or (2) anything designed or sold for incorporation
300 | into a dwelling. In determining whether a product is a consumer product,
301 | doubtful cases shall be resolved in favor of coverage. For a particular
302 | product received by a particular user, "normally used" refers to a
303 | typical or common use of that class of product, regardless of the status
304 | of the particular user or of the way in which the particular user
305 | actually uses, or expects or is expected to use, the product. A product
306 | is a consumer product regardless of whether the product has substantial
307 | commercial, industrial or non-consumer uses, unless such uses represent
308 | the only significant mode of use of the product.
309 |
310 | "Installation Information" for a User Product means any methods,
311 | procedures, authorization keys, or other information required to install
312 | and execute modified versions of a covered work in that User Product from
313 | a modified version of its Corresponding Source. The information must
314 | suffice to ensure that the continued functioning of the modified object
315 | code is in no case prevented or interfered with solely because
316 | modification has been made.
317 |
318 | If you convey an object code work under this section in, or with, or
319 | specifically for use in, a User Product, and the conveying occurs as
320 | part of a transaction in which the right of possession and use of the
321 | User Product is transferred to the recipient in perpetuity or for a
322 | fixed term (regardless of how the transaction is characterized), the
323 | Corresponding Source conveyed under this section must be accompanied
324 | by the Installation Information. But this requirement does not apply
325 | if neither you nor any third party retains the ability to install
326 | modified object code on the User Product (for example, the work has
327 | been installed in ROM).
328 |
329 | The requirement to provide Installation Information does not include a
330 | requirement to continue to provide support service, warranty, or updates
331 | for a work that has been modified or installed by the recipient, or for
332 | the User Product in which it has been modified or installed. Access to a
333 | network may be denied when the modification itself materially and
334 | adversely affects the operation of the network or violates the rules and
335 | protocols for communication across the network.
336 |
337 | Corresponding Source conveyed, and Installation Information provided,
338 | in accord with this section must be in a format that is publicly
339 | documented (and with an implementation available to the public in
340 | source code form), and must require no special password or key for
341 | unpacking, reading or copying.
342 |
343 | 7. Additional Terms.
344 |
345 | "Additional permissions" are terms that supplement the terms of this
346 | License by making exceptions from one or more of its conditions.
347 | Additional permissions that are applicable to the entire Program shall
348 | be treated as though they were included in this License, to the extent
349 | that they are valid under applicable law. If additional permissions
350 | apply only to part of the Program, that part may be used separately
351 | under those permissions, but the entire Program remains governed by
352 | this License without regard to the additional permissions.
353 |
354 | When you convey a copy of a covered work, you may at your option
355 | remove any additional permissions from that copy, or from any part of
356 | it. (Additional permissions may be written to require their own
357 | removal in certain cases when you modify the work.) You may place
358 | additional permissions on material, added by you to a covered work,
359 | for which you have or can give appropriate copyright permission.
360 |
361 | Notwithstanding any other provision of this License, for material you
362 | add to a covered work, you may (if authorized by the copyright holders of
363 | that material) supplement the terms of this License with terms:
364 |
365 | a) Disclaiming warranty or limiting liability differently from the
366 | terms of sections 15 and 16 of this License; or
367 |
368 | b) Requiring preservation of specified reasonable legal notices or
369 | author attributions in that material or in the Appropriate Legal
370 | Notices displayed by works containing it; or
371 |
372 | c) Prohibiting misrepresentation of the origin of that material, or
373 | requiring that modified versions of such material be marked in
374 | reasonable ways as different from the original version; or
375 |
376 | d) Limiting the use for publicity purposes of names of licensors or
377 | authors of the material; or
378 |
379 | e) Declining to grant rights under trademark law for use of some
380 | trade names, trademarks, or service marks; or
381 |
382 | f) Requiring indemnification of licensors and authors of that
383 | material by anyone who conveys the material (or modified versions of
384 | it) with contractual assumptions of liability to the recipient, for
385 | any liability that these contractual assumptions directly impose on
386 | those licensors and authors.
387 |
388 | All other non-permissive additional terms are considered "further
389 | restrictions" within the meaning of section 10. If the Program as you
390 | received it, or any part of it, contains a notice stating that it is
391 | governed by this License along with a term that is a further
392 | restriction, you may remove that term. If a license document contains
393 | a further restriction but permits relicensing or conveying under this
394 | License, you may add to a covered work material governed by the terms
395 | of that license document, provided that the further restriction does
396 | not survive such relicensing or conveying.
397 |
398 | If you add terms to a covered work in accord with this section, you
399 | must place, in the relevant source files, a statement of the
400 | additional terms that apply to those files, or a notice indicating
401 | where to find the applicable terms.
402 |
403 | Additional terms, permissive or non-permissive, may be stated in the
404 | form of a separately written license, or stated as exceptions;
405 | the above requirements apply either way.
406 |
407 | 8. Termination.
408 |
409 | You may not propagate or modify a covered work except as expressly
410 | provided under this License. Any attempt otherwise to propagate or
411 | modify it is void, and will automatically terminate your rights under
412 | this License (including any patent licenses granted under the third
413 | paragraph of section 11).
414 |
415 | However, if you cease all violation of this License, then your
416 | license from a particular copyright holder is reinstated (a)
417 | provisionally, unless and until the copyright holder explicitly and
418 | finally terminates your license, and (b) permanently, if the copyright
419 | holder fails to notify you of the violation by some reasonable means
420 | prior to 60 days after the cessation.
421 |
422 | Moreover, your license from a particular copyright holder is
423 | reinstated permanently if the copyright holder notifies you of the
424 | violation by some reasonable means, this is the first time you have
425 | received notice of violation of this License (for any work) from that
426 | copyright holder, and you cure the violation prior to 30 days after
427 | your receipt of the notice.
428 |
429 | Termination of your rights under this section does not terminate the
430 | licenses of parties who have received copies or rights from you under
431 | this License. If your rights have been terminated and not permanently
432 | reinstated, you do not qualify to receive new licenses for the same
433 | material under section 10.
434 |
435 | 9. Acceptance Not Required for Having Copies.
436 |
437 | You are not required to accept this License in order to receive or
438 | run a copy of the Program. Ancillary propagation of a covered work
439 | occurring solely as a consequence of using peer-to-peer transmission
440 | to receive a copy likewise does not require acceptance. However,
441 | nothing other than this License grants you permission to propagate or
442 | modify any covered work. These actions infringe copyright if you do
443 | not accept this License. Therefore, by modifying or propagating a
444 | covered work, you indicate your acceptance of this License to do so.
445 |
446 | 10. Automatic Licensing of Downstream Recipients.
447 |
448 | Each time you convey a covered work, the recipient automatically
449 | receives a license from the original licensors, to run, modify and
450 | propagate that work, subject to this License. You are not responsible
451 | for enforcing compliance by third parties with this License.
452 |
453 | An "entity transaction" is a transaction transferring control of an
454 | organization, or substantially all assets of one, or subdividing an
455 | organization, or merging organizations. If propagation of a covered
456 | work results from an entity transaction, each party to that
457 | transaction who receives a copy of the work also receives whatever
458 | licenses to the work the party's predecessor in interest had or could
459 | give under the previous paragraph, plus a right to possession of the
460 | Corresponding Source of the work from the predecessor in interest, if
461 | the predecessor has it or can get it with reasonable efforts.
462 |
463 | You may not impose any further restrictions on the exercise of the
464 | rights granted or affirmed under this License. For example, you may
465 | not impose a license fee, royalty, or other charge for exercise of
466 | rights granted under this License, and you may not initiate litigation
467 | (including a cross-claim or counterclaim in a lawsuit) alleging that
468 | any patent claim is infringed by making, using, selling, offering for
469 | sale, or importing the Program or any portion of it.
470 |
471 | 11. Patents.
472 |
473 | A "contributor" is a copyright holder who authorizes use under this
474 | License of the Program or a work on which the Program is based. The
475 | work thus licensed is called the contributor's "contributor version".
476 |
477 | A contributor's "essential patent claims" are all patent claims
478 | owned or controlled by the contributor, whether already acquired or
479 | hereafter acquired, that would be infringed by some manner, permitted
480 | by this License, of making, using, or selling its contributor version,
481 | but do not include claims that would be infringed only as a
482 | consequence of further modification of the contributor version. For
483 | purposes of this definition, "control" includes the right to grant
484 | patent sublicenses in a manner consistent with the requirements of
485 | this License.
486 |
487 | Each contributor grants you a non-exclusive, worldwide, royalty-free
488 | patent license under the contributor's essential patent claims, to
489 | make, use, sell, offer for sale, import and otherwise run, modify and
490 | propagate the contents of its contributor version.
491 |
492 | In the following three paragraphs, a "patent license" is any express
493 | agreement or commitment, however denominated, not to enforce a patent
494 | (such as an express permission to practice a patent or covenant not to
495 | sue for patent infringement). To "grant" such a patent license to a
496 | party means to make such an agreement or commitment not to enforce a
497 | patent against the party.
498 |
499 | If you convey a covered work, knowingly relying on a patent license,
500 | and the Corresponding Source of the work is not available for anyone
501 | to copy, free of charge and under the terms of this License, through a
502 | publicly available network server or other readily accessible means,
503 | then you must either (1) cause the Corresponding Source to be so
504 | available, or (2) arrange to deprive yourself of the benefit of the
505 | patent license for this particular work, or (3) arrange, in a manner
506 | consistent with the requirements of this License, to extend the patent
507 | license to downstream recipients. "Knowingly relying" means you have
508 | actual knowledge that, but for the patent license, your conveying the
509 | covered work in a country, or your recipient's use of the covered work
510 | in a country, would infringe one or more identifiable patents in that
511 | country that you have reason to believe are valid.
512 |
513 | If, pursuant to or in connection with a single transaction or
514 | arrangement, you convey, or propagate by procuring conveyance of, a
515 | covered work, and grant a patent license to some of the parties
516 | receiving the covered work authorizing them to use, propagate, modify
517 | or convey a specific copy of the covered work, then the patent license
518 | you grant is automatically extended to all recipients of the covered
519 | work and works based on it.
520 |
521 | A patent license is "discriminatory" if it does not include within
522 | the scope of its coverage, prohibits the exercise of, or is
523 | conditioned on the non-exercise of one or more of the rights that are
524 | specifically granted under this License. You may not convey a covered
525 | work if you are a party to an arrangement with a third party that is
526 | in the business of distributing software, under which you make payment
527 | to the third party based on the extent of your activity of conveying
528 | the work, and under which the third party grants, to any of the
529 | parties who would receive the covered work from you, a discriminatory
530 | patent license (a) in connection with copies of the covered work
531 | conveyed by you (or copies made from those copies), or (b) primarily
532 | for and in connection with specific products or compilations that
533 | contain the covered work, unless you entered into that arrangement,
534 | or that patent license was granted, prior to 28 March 2007.
535 |
536 | Nothing in this License shall be construed as excluding or limiting
537 | any implied license or other defenses to infringement that may
538 | otherwise be available to you under applicable patent law.
539 |
540 | 12. No Surrender of Others' Freedom.
541 |
542 | If conditions are imposed on you (whether by court order, agreement or
543 | otherwise) that contradict the conditions of this License, they do not
544 | excuse you from the conditions of this License. If you cannot convey a
545 | covered work so as to satisfy simultaneously your obligations under this
546 | License and any other pertinent obligations, then as a consequence you may
547 | not convey it at all. For example, if you agree to terms that obligate you
548 | to collect a royalty for further conveying from those to whom you convey
549 | the Program, the only way you could satisfy both those terms and this
550 | License would be to refrain entirely from conveying the Program.
551 |
552 | 13. Use with the GNU Affero General Public License.
553 |
554 | Notwithstanding any other provision of this License, you have
555 | permission to link or combine any covered work with a work licensed
556 | under version 3 of the GNU Affero General Public License into a single
557 | combined work, and to convey the resulting work. The terms of this
558 | License will continue to apply to the part which is the covered work,
559 | but the special requirements of the GNU Affero General Public License,
560 | section 13, concerning interaction through a network will apply to the
561 | combination as such.
562 |
563 | 14. Revised Versions of this License.
564 |
565 | The Free Software Foundation may publish revised and/or new versions of
566 | the GNU General Public License from time to time. Such new versions will
567 | be similar in spirit to the present version, but may differ in detail to
568 | address new problems or concerns.
569 |
570 | Each version is given a distinguishing version number. If the
571 | Program specifies that a certain numbered version of the GNU General
572 | Public License "or any later version" applies to it, you have the
573 | option of following the terms and conditions either of that numbered
574 | version or of any later version published by the Free Software
575 | Foundation. If the Program does not specify a version number of the
576 | GNU General Public License, you may choose any version ever published
577 | by the Free Software Foundation.
578 |
579 | If the Program specifies that a proxy can decide which future
580 | versions of the GNU General Public License can be used, that proxy's
581 | public statement of acceptance of a version permanently authorizes you
582 | to choose that version for the Program.
583 |
584 | Later license versions may give you additional or different
585 | permissions. However, no additional obligations are imposed on any
586 | author or copyright holder as a result of your choosing to follow a
587 | later version.
588 |
589 | 15. Disclaimer of Warranty.
590 |
591 | THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592 | APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593 | HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594 | OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595 | THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596 | PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597 | IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598 | ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599 |
600 | 16. Limitation of Liability.
601 |
602 | IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603 | WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604 | THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605 | GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606 | USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607 | DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608 | PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609 | EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610 | SUCH DAMAGES.
611 |
612 | 17. Interpretation of Sections 15 and 16.
613 |
614 | If the disclaimer of warranty and limitation of liability provided
615 | above cannot be given local legal effect according to their terms,
616 | reviewing courts shall apply local law that most closely approximates
617 | an absolute waiver of all civil liability in connection with the
618 | Program, unless a warranty or assumption of liability accompanies a
619 | copy of the Program in return for a fee.
620 |
621 | END OF TERMS AND CONDITIONS
622 |
623 | How to Apply These Terms to Your New Programs
624 |
625 | If you develop a new program, and you want it to be of the greatest
626 | possible use to the public, the best way to achieve this is to make it
627 | free software which everyone can redistribute and change under these terms.
628 |
629 | To do so, attach the following notices to the program. It is safest
630 | to attach them to the start of each source file to most effectively
631 | state the exclusion of warranty; and each file should have at least
632 | the "copyright" line and a pointer to where the full notice is found.
633 |
634 |
635 | Copyright (C) <year>  <name of author>
636 |
637 | This program is free software: you can redistribute it and/or modify
638 | it under the terms of the GNU General Public License as published by
639 | the Free Software Foundation, either version 3 of the License, or
640 | (at your option) any later version.
641 |
642 | This program is distributed in the hope that it will be useful,
643 | but WITHOUT ANY WARRANTY; without even the implied warranty of
644 | MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645 | GNU General Public License for more details.
646 |
647 | You should have received a copy of the GNU General Public License
648 | along with this program. If not, see <https://www.gnu.org/licenses/>.
649 |
650 | Also add information on how to contact you by electronic and paper mail.
651 |
652 | If the program does terminal interaction, make it output a short
653 | notice like this when it starts in an interactive mode:
654 |
655 | <program>  Copyright (C) <year>  <name of author>
656 | This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657 | This is free software, and you are welcome to redistribute it
658 | under certain conditions; type `show c' for details.
659 |
660 | The hypothetical commands `show w' and `show c' should show the appropriate
661 | parts of the General Public License. Of course, your program's commands
662 | might be different; for a GUI interface, you would use an "about box".
663 |
664 | You should also get your employer (if you work as a programmer) or school,
665 | if any, to sign a "copyright disclaimer" for the program, if necessary.
666 | For more information on this, and how to apply and follow the GNU GPL, see
667 | <https://www.gnu.org/licenses/>.
668 |
669 | The GNU General Public License does not permit incorporating your program
670 | into proprietary programs. If your program is a subroutine library, you
671 | may consider it more useful to permit linking proprietary applications with
672 | the library. If this is what you want to do, use the GNU Lesser General
673 | Public License instead of this License. But first, please read
674 | <https://www.gnu.org/licenses/why-not-lgpl.html>.
675 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Structural RNN using PyTorch
2 |
3 | This implementation is available for research purposes. If you use this code in your work, please cite the following paper:
4 |
5 | > Anirudh Vemula, Katharina Muelling and Jean Oh. **Social Attention: Modeling Attention in Human Crowds**. *Submitted to the International Conference on Robotics and Automation (ICRA) 2018*.
6 |
7 | Or use the following BibTeX entry:
8 | ```
9 |
10 | @ARTICLE{2017arXiv171004689V,
11 | author = {{Vemula}, A. and {Muelling}, K. and {Oh}, J.},
12 | title = "{Social Attention: Modeling Attention in Human Crowds}",
13 | journal = {ArXiv e-prints},
14 | archivePrefix = "arXiv",
15 | eprint = {1710.04689},
16 | primaryClass = "cs.RO",
17 | keywords = {Computer Science - Robotics, Computer Science - Learning},
18 | year = 2017,
19 | month = oct,
20 | adsurl = {http://adsabs.harvard.edu/abs/2017arXiv171004689V},
21 | adsnote = {Provided by the SAO/NASA Astrophysics Data System}
22 | }
23 | ```
24 |
25 | **Author** : Anirudh Vemula
26 |
27 | **Affiliation** : Robotics Institute, Carnegie Mellon University
28 |
29 | **License** : GPL v3
30 |
31 | ## Requirements
32 | * Python 3
33 | * Seaborn (https://seaborn.pydata.org/)
34 | * PyTorch (http://pytorch.org/)
35 | * NumPy
36 | * Matplotlib
37 | * SciPy
38 |
39 | ## How to Run
40 | * Before running the code, create the required directories by running the script `make_directories.sh`
41 | * To train the model, run `python srnn/train.py` (see the script for the full set of command-line arguments)
42 | * To test the model, run `python srnn/sample.py --epoch=n`, where `n` is the epoch at which you want to load the saved model (see the script for the full set of command-line arguments); a typical session is sketched below
43 |
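44 | Putting these steps together, a typical session might look like this (the epoch number is only an example):
45 |
46 | ```
47 | ./make_directories.sh
48 | python srnn/train.py
49 | python srnn/sample.py --epoch=10
50 | ```
51 |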
--------------------------------------------------------------------------------
/data/eth/hotel/obs_map.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/data/eth/hotel/obs_map.pkl
--------------------------------------------------------------------------------
/data/eth/univ/obs_map.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/data/eth/univ/obs_map.pkl
--------------------------------------------------------------------------------
/data/pixel_pos_format.md:
--------------------------------------------------------------------------------
1 | # Format of the pixel_pos.csv file for each dataset
2 |
3 | Each file stores a matrix of size 4 x numTrajectoryPoints:
4 |
5 | * The first row contains all the frame numbers
6 | * The second row contains all the pedestrian IDs
7 | * The third row contains all the y-coordinates
8 | * The fourth row contains all the x-coordinates
9 |
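10 | As a quick sanity check of this layout, here is a minimal NumPy sketch that loads one of these files and slices out a single pedestrian's trajectory. The file path and pedestrian ID are illustrative, and the variable names are not from the repo:
11 |
12 | ```python
13 | import numpy as np
14 |
15 | # pixel_pos.csv holds a 4 x numTrajectoryPoints matrix: one row each for
16 | # frame numbers, pedestrian IDs, y-coordinates and x-coordinates.
17 | data = np.genfromtxt('data/eth/hotel/pixel_pos.csv', delimiter=',')
18 |
19 | frames, ped_ids, ys, xs = data
20 |
21 | # Example: the (x, y) trajectory of pedestrian 1, ordered by frame number.
22 | mask = ped_ids == 1
23 | order = np.argsort(frames[mask])
24 | trajectory = np.stack([xs[mask][order], ys[mask][order]], axis=1)
25 | ```
26 |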
--------------------------------------------------------------------------------
/data/ucy/univ/obs_map.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/data/ucy/univ/obs_map.pkl
--------------------------------------------------------------------------------
/data/ucy/zara/zara01/obs_map.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/data/ucy/zara/zara01/obs_map.pkl
--------------------------------------------------------------------------------
/data/ucy/zara/zara01/pixel_pos.csv:
--------------------------------------------------------------------------------
1 | 1,1,1,1,1,1,1,1,3,4,4,4,4,5,5,5,5,6,7,8,8,8,9,9,9,9,10,11,11,11,11,12,13,13,14,15,15,15,15,16,16,17,17,18,18,18,19,19,19,20,20,21,21,22,22,23,23,23,23,23,25,25,26,26,27,27,28,28,28,28,30,30,30,31,32,32,32,33,33,34,34,34,36,36,37,37,37,38,39,41,41,41,41,42,42,42,45,45,46,46,48,49,49,49,51,52,53,53,53,54,55,57,58,58,58,59,61,62,62,62,63,63,65,65,66,66,67,68,68,69,69,69,70,70,72,73,73,73,73,74,75,75,76,76,76,76,77,77,78,79,80,80,80,80,80,81,81,81,81,82,83,83,83,85,85,85,85,85,85,86,86,87,87,88,88,89,89,90,90,91,92,92,93,93,93,93,93,95,95,96,96,97,97,98,98,100,100,101,102,102,103,104,104,106,106,107,107,107,108,111,111,111,112,112,112,115,115,115,116,116,117,117,119,119,119,120,121,121,123,123,123,125,125,125,126,127,127,128,130,131,131,132,132,133,135,135,136,136,137,138,138,139,140,141,142,142,143,144,145,145,146,146,146,148,149,150,150,151,151,151,152,154,154,154,156,157,157,158,159,160,161,162,163,163,163,164,166,167,167,167,169,169,169,170,171,171,173,174,174,174,174,175,178,178,179,182,185,186,190,190,192,194,194,195,196,196,196,198,198,199,199,199,200,201,202,203,203,204,205,206,207,208,208,208,209,209,209,210,210,212,212,212,212,213,213,215,215,216,216,218,218,218,218,218,219,219,221,221,222,222,222,223,224,224,225,227,227,227,228,229,230,231,232,236,237,237,237,240,241,241,241,245,246,246,246,247,249,249,251,251,252,254,254,254,256,256,258,259,259,259,260,261,261,262,262,263,264,265,265,265,265,268,268,269,269,270,272,273,273,273,273,275,277,279,279,280,281,285,285,287,289,292,293,294,300,303,303,316,316,319,320,324,324,329,330,335,335,339,340,343,344,347,347,350,351,359,360,363,364,367,367,368,371,372,373,375,376,378,379,380,380,381,381,382,382,384,384,385,386,386,387,388,389,389,390,391,391,391,391,392,393,394,394,395,395,396,396,397,398,398,399,399,399,401,401,401,401,401,401,402,402,403,404,404,404,404,404,405,405,405,406,407,407,408,408,408,409,409,410,412,412,413,413,414,415,416,416,417,417,418,418,419,421,421,422,422,422,423,423,425,425,425,426,427,427,427,428,429,429,430,431,432,432,432,433,434,435,436,436,436,438,438,439,439,440,443,443,447,447,448,449,449,449,450,451,452,453,453,454,454,454,455,456,456,456,457,457,458,459,460,460,460,461,462,463,463,464,464,464,465,467,467,467,468,468,468,470,470,470,471,471,471,472,472,473,473,474,474,474,475,475,476,476,477,477,478,478,478,478,478,478,479,480,480,481,482,482,482,482,482,482,483,483,484,484,485,485,485,485,486,486,486,487,488,488,488,489,489,489,490,491,491,491,492,492,492,493,493,494,494,495,496,497,497,497,499,501,501,502,502,502,507,508,509,509,509,513,513,519,522,525,526,528,529,532,533,536,538,540,541,542,544,544,545,547,547,547,549,549,551,551,552,552,553,554,555,556,557,558,559,559,559,560,562,563,563,565,565,565,566,567,567,569,570,570,571,571,571,571,573,575,576,577,578,578,578,581,581,582,584,585,586,588,590,590,592,594,594,594,595,597,598,598,599,600,601,603,604,605,606,607,608,610,610,611,614,614,614,617,619,619,619,623,626,627,628,628,629,630,631,632,632,633,634,634,635,635,637,637,637,638,638,639,640,640,641,641,641,642,642,642,643,644,644,645,645,646,646,646,646,647,648,650,650,650,650,650,650,651,651,652,652,654,654,655,655,655,656,656,656,657,657,659,659,659,660,660,660,661,661,661,662,662,663,663,664,664,664,665,665,665,665,665,667,667,667,667,668,668,668,668,668,668,668,669,669,669,670,671,671,672,672,672,672,672,673,673,673,673,673,673,673,673,673,674,675,675,675,675,676,676,676,677,677,677,677,677,677,677,677,677,678,679,679,679,679,679,680,680,680,680,681,681,681,681,682,682,682,682,682,682,682,683,
683,683,683,684,684,685,685,685,685,685,686,686,686,686,687,687,687,687,687,687,688,688,688,688,688,688,690,690,690,690,691,691,691,692,692,692,693,693,693,693,693,693,693,693,694,695,695,695,695,695,696,696,697,697,697,697,698,698,698,698,699,699,699,700,700,701,701,701,701,702,702,702,703,703,703,703,704,704,704,705,705,705,706,706,706,707,707,707,708,708,709,710,710,710,710,712,714,714,714,714,715,715,715,716,717,718,718,718,719,721,721,722,723,723,724,728,728,728,733,733,735,736,737,738,740,740,742,743,744,744,747,747,748,749,749,750,752,754,754,755,756,756,759,759,760,761,762,762,763,763,766,767,767,767,767,768,768,768,770,771,772,772,773,773,773,775,775,776,777,778,778,781,782,783,784,785,786,787,787,788,789,789,791,791,792,793,796,796,797,797,798,798,798,800,800,801,803,804,805,807,810,811,812,812,814,814,815,816,817,817,817,817,818,818,819,819,820,820,820,820,821,822,822,823,823,824,825,825,826,826,826,827,827,827,827,828,830,830,830,831,831,831,831,833,835,835,835,835,835,835,836,836,836,837,839,839,839,840,840,840,841,843,843,843,843,844,845,845,846,847,847,847,847,848,850,850,850,850,851,851,852,852,852,853,853,854,855,855,855,855,855,856,856,857,857,857,858,858,859,859,860,861,862,863,864,865,865,867,869,869,873,873,874,874,875,876,876,877,877,877,879,880,881,881,881,881,881,884,885,885,885,885,886,889,889,890,891,891,892,893,895,897,897,898,899,903,903,904,904,907,908,908,908,912,913,913,917,917,917,918,921,921,921,922,922,925,926,926,926,927,930,930,931,931,933,934,935,936,936,938,940,940,940,942,943,943,945,945,946,947,949,949,950,950,951,953,954,955,955,955,955,957,959,959,960,960,960,960,961,963,963,964,964,964,965,965,966,967,968,968,968,970,970,971,971,971,972,973,976,976,977,977,977,978,980,981,983,983,983,987,987,987,989,991,991,992,992,996,996,996,999,1000,1001,1001,1002,1002,1004,1005,1006,1006,1006,1009,1009,1011,1011,1012,1013,1014,1014,1015,1017,1017,1019,1019,1023,1024,1045,1050,1051,1055,1055,1058,1060,1061,1064,1065,1066,1069,1070,1070,1073,1073,1074,1075,1076,1076,1078,1078,1079,1080,1080,1080,1081,1082,1082,1082,1083,1083,1083,1083,1083,1084,1085,1085,1085,1085,1086,1086,1086,1086,1086,1087,1088,1088,1088,1088,1089,1089,1089,1090,1090,1090,1091,1092,1092,1092,1093,1093,1094,1095,1095,1096,1096,1097,1097,1099,1100,1100,1101,1101,1101,1102,1102,1106,1106,1106,1107,1107,1110,1110,1111,1112,1114,1115,1115,1115,1116,1117,1119,1120,1121,1122,1123,1125,1126,1126,1127
2 | 1,2,3,4,5,6,7,8,9,1,7,8,2,3,6,4,5,9,1,6,2,7,8,3,5,9,6,4,7,1,2,9,6,8,7,1,2,3,10,5,9,4,8,7,6,1,2,9,10,6,8,3,7,5,1,4,2,9,6,10,11,8,2,3,1,9,4,11,6,10,8,2,5,11,3,9,10,4,6,1,2,11,8,3,10,9,11,4,3,8,4,11,9,12,13,10,11,13,12,8,10,13,11,12,8,11,13,10,12,8,10,13,14,12,15,8,15,14,16,13,12,8,15,17,14,16,8,12,13,17,9,15,16,14,17,13,12,15,18,9,14,16,17,18,19,13,12,15,18,13,12,9,18,20,14,15,16,17,19,12,15,18,20,14,15,17,16,18,19,20,15,14,18,15,19,16,14,17,15,19,15,14,16,18,20,15,17,20,15,19,18,15,16,20,17,18,19,20,17,16,18,19,20,16,17,22,19,18,21,16,17,21,18,19,22,21,19,16,17,22,18,8,21,19,22,16,17,8,21,19,22,16,8,21,17,19,22,21,8,19,16,22,21,17,23,19,22,21,24,25,23,22,21,24,23,25,22,21,22,26,23,24,25,26,21,23,24,8,26,22,25,26,23,8,24,26,25,23,8,26,24,23,25,26,8,27,27,24,26,23,25,27,8,26,23,27,24,26,27,25,23,8,26,27,8,27,8,27,8,27,27,28,29,8,30,27,31,28,8,27,30,29,31,28,30,31,8,29,30,28,31,30,32,33,29,8,31,28,30,33,32,31,30,29,28,30,33,31,32,29,28,30,33,32,8,31,32,33,30,31,28,29,32,33,8,28,31,30,34,29,28,34,35,34,32,8,35,32,34,35,8,35,33,34,8,32,33,35,32,34,33,36,32,35,33,36,32,34,35,36,37,33,32,37,36,34,35,32,37,33,36,37,34,35,36,37,32,36,33,34,37,35,37,36,34,35,37,37,35,34,37,37,34,35,37,34,35,38,39,38,39,39,38,39,38,39,38,39,38,38,39,38,39,38,39,40,41,40,41,40,42,41,42,40,41,42,40,41,43,42,44,45,40,41,43,44,45,42,40,43,41,44,45,42,43,40,46,47,41,44,45,43,42,46,47,41,40,44,43,42,45,46,48,47,40,41,42,43,49,44,50,48,40,46,42,45,50,49,47,44,41,43,48,50,46,45,49,44,47,48,50,45,46,49,47,44,45,48,46,51,50,49,51,53,47,48,52,50,49,53,51,52,49,48,46,50,53,47,52,51,49,48,50,53,51,52,53,49,48,50,49,51,52,53,50,52,53,54,53,52,55,56,51,54,53,52,55,56,53,58,54,53,55,56,58,59,54,57,55,56,58,54,59,57,55,58,56,54,59,57,58,55,59,56,57,54,59,58,60,55,61,62,57,56,54,59,60,58,62,61,57,56,63,55,59,63,54,60,58,62,57,61,63,56,55,62,58,60,57,54,59,63,64,55,61,56,64,60,63,57,54,62,58,59,55,64,63,56,60,57,61,62,55,59,54,64,63,58,57,60,59,64,62,61,63,60,62,64,65,66,61,65,66,60,61,62,66,65,66,65,66,65,66,65,66,65,66,65,67,66,65,67,68,69,66,67,68,65,69,68,67,66,69,65,68,67,69,70,66,68,67,65,69,70,68,69,65,66,67,70,71,68,69,71,70,67,65,66,68,71,70,68,71,68,67,70,70,67,68,72,70,68,72,70,68,72,73,74,68,70,73,72,74,70,73,72,74,75,73,72,75,74,73,75,72,74,73,75,72,75,74,73,74,76,73,74,75,77,76,78,77,75,79,78,76,75,77,79,81,80,78,82,76,77,85,80,78,81,79,82,83,85,77,84,80,76,81,83,78,82,79,84,81,83,76,77,78,80,82,84,85,79,81,78,80,83,84,82,76,77,79,81,82,84,80,83,85,78,81,86,76,79,77,82,84,80,83,85,86,87,88,89,81,78,76,79,82,80,84,77,90,86,89,83,87,88,85,81,78,89,84,91,92,86,82,80,79,83,87,76,77,93,90,88,85,89,79,81,84,86,78,92,88,83,93,94,87,91,82,77,95,80,88,89,96,84,97,83,94,76,87,86,82,93,95,81,88,92,78,91,84,89,77,90,96,83,97,94,87,86,89,84,88,83,93,82,95,81,85,90,92,78,91,77,96,88,89,97,94,86,76,87,93,81,88,95,89,78,82,90,77,97,91,85,81,92,94,96,86,88,87,95,93,89,78,97,76,92,94,91,90,88,77,89,87,95,96,93,90,97,86,94,78,92,88,76,91,89,95,90,77,78,87,93,88,94,97,92,96,90,89,93,95,89,91,90,92,94,93,90,95,94,98,96,91,97,92,93,90,95,98,94,91,92,90,95,93,98,95,90,98,99,100,90,98,99,100,101,90,98,100,99,101,90,102,98,101,100,99,102,98,101,100,99,102,103,104,101,102,100,99,104,103,101,99,104,105,103,106,100,102,105,106,99,103,100,104,102,105,100,99,106,103,104,105,106,104,103,107,105,108,104,106,103,107,108,105,104,106,107,105,108,103,104,105,106,109,110,107,108,109,110,107,108,111,110,107,112,111,113,108,109,114,115,116,112,111,113,107,110,115,108,114,111,112,116,113,115,111,114,117,109,116,118,112,113,111,115,110,114,117,118,113,116,111,
112,115,110,109,113,111,117,114,116,118,112,115,111,113,114,117,116,118,115,111,112,114,115,113,116,118,117,119,115,111,114,112,116,119,113,118,115,120,111,117,121,119,114,118,112,113,115,119,120,116,121,111,114,112,117,118,119,120,113,121,118,119,120,117,118,121,119,120,121,122,123,119,120,122,124,119,123,125,121,122,124,123,125,119,120,122,121,123,120,124,125,122,121,123,120,124,125,122,123,124,125,122,123,125,124,123,122,125,124,122,123,124,125,126,127,126,129,128,130,127,128,126,129,130,127,126,128,129,130,126,127,128,129,130,126,128,127,131,126,129,130,131,127,128,126,130,131,129,127,128,130,126,131,129,127,128,132,130,126,131,129,132,133,128,127,130,126,134,129,133,131,130,132,126,128,134,127,129,133,126,131,135,134,132,136,133,134,131,136,132,135,133,132,134,135,136,133,134,132,133,136,132,134,135,133,136,134,133,135,134,133,137,138,136,134,133,138,137,135,136,138,133,137,138,135,136,137,138,137,138,137,138,137,138,139,139,140,140,139,140,139,140,139,140,139,139,148,140,141,148,139,140,141,142,141,143,142,139,140,148,143,141,144,139,142,145,146,140,147,141,139,148,143,144,140,146,147,142,145,141,143,140,139,148,144,142,147,141,146,143,145,140,142,144,143,147,141,142,146,143,144,147,141,143,145,142,141,144,147,143,146,143,144,147,146,145,143,144,147,146,143,145,146,144,147,143,147,143,144,147,148,143,144,147,148
3 | 0.42708,0.35417,0.35417,0.25,0.35764,0.49653,0.44097,-0.069444,0.47222,0.42708,0.49653,-0.041667,0.35764,0.31944,0.51736,0.22222,0.38542,0.51736,0.40972,0.51389,0.33681,0.54167,-0.055556,0.29861,0.44792,0.52431,0.47569,0.22222,0.57639,0.46875,0.38194,0.50694,0.35069,-0.0625,0.59722,0.52431,0.44097,0.26736,-0.017361,0.5,0.55208,0.18403,-0.079861,0.59722,0.16667,0.54514,0.46528,0.57986,0.024306,0.038194,-0.10069,0.26389,0.51736,0.53819,0.55556,0.15625,0.46528,0.55903,-0.13542,0.065972,-0.079861,-0.090278,0.49306,0.25347,0.60764,0.57292,0.125,-0.0034722,-0.36111,0.097222,-0.10764,0.51389,0.53819,0.013889,0.20139,0.55208,0.13889,0.086806,-0.61111,0.65972,0.54861,0.083333,-0.090278,0.11111,0.14583,0.53819,0.17014,0.0034722,0.059028,-0.086806,-0.052083,0.27778,0.52778,0.44792,0.52778,0.17014,0.30556,0.52431,0.42708,-0.052083,0.14931,0.51736,0.34722,0.40972,-0.045139,0.37153,0.45486,0.15625,0.36111,-0.020833,0.097222,0.39931,0.53125,0.31597,0.41667,-0.024306,0.40278,0.5,-0.055556,0.34722,0.26389,-0.010417,0.35069,0.059028,0.45833,0.0069444,-0.065972,0.19097,0.28819,0.10069,0.50694,0.30556,0.0069444,0.36806,0.072917,0.26736,0.17708,0.27431,-0.0069444,0.49306,0.33333,0.020833,0.034722,0.045139,-0.22222,0.20139,0.14236,0.22222,0.090278,0.15972,0.15625,0.48958,0.11806,0.076389,0.27778,0.21181,0.0069444,0.03125,-0.10417,0.11806,0.20833,0.13194,0.13889,0.25694,0.19097,0.017361,-0.041667,0.13194,-0.0069444,0.21875,0.19444,0.26042,0.15972,0.19444,0.013889,-0.0625,0.24306,0,0.21875,0.0069444,0.24653,0.14931,-0.059028,0.17361,0.20139,0.26042,0.038194,0.24653,0.20139,0.03125,0.31944,0.13194,-0.041667,0.37153,0.034722,0.48958,0.027778,0.50694,0.048611,-0.024306,0.65972,0.013889,0.65972,0.055556,0.10764,-0.27778,0.03125,0.86111,-0.61806,0.14931,0.18403,-0.42361,0.9375,0.065972,-0.125,-0.25694,0.072917,0.21181,0.26736,0.0069444,1,-0.11458,-0.12153,0.10764,0.038194,0.28472,0.34722,-0.086806,0.055556,0.16667,0.034722,0.32639,-0.065972,0.13194,0.42014,0.25694,0.041667,0.17708,-0.03125,0.31597,0.35764,0.076389,0.21181,0.44097,0.03125,0.32639,0.13542,0.25694,0.57639,0.5,0.079861,0.16667,0.28125,0.57292,0.086806,0.48958,0.21181,0.27778,0.23611,0.52778,0.14931,0.61458,0.46181,0.47222,0.23958,0.1875,0.59028,0.010417,0.39931,0.33333,0.44097,0.35417,0.25694,-0.013889,0.56597,0.27778,0.42708,0.32986,-0.020833,0.25694,0.55903,0.34722,0.44444,0.25694,-0.034722,-0.0034722,0.034722,0.5625,0.31597,0.36111,0.45139,0.083333,-0.041667,0.41319,0.38889,0.12153,0.59375,0.47569,0.13889,0.49653,0.41319,-0.069444,0.50694,0.11458,-0.059028,0.097222,-0.038194,0.069444,-0.024306,0.0034722,-0.076389,0.28819,0.16319,-0.03125,0.33681,-0.21528,0.45486,0.29167,-0.034722,-0.30903,0.34028,0.18403,0.42361,0.30556,0.31597,0.42361,-0.034722,0.1875,0.3125,0.29861,0.43403,0.34028,0.41319,0.29514,0.19792,-0.0069444,0.4375,0.30556,0.36111,0.29167,0.40278,0.46528,0.36111,0.19097,0.29861,0.35417,0.27778,0.52083,0.38889,0.21181,0.3125,0.39931,0.22222,0.32292,0,0.53819,0.28819,0.19444,0.41667,0.52431,0.375,0.26042,0.28819,0.17014,0.0034722,0.41319,0.52778,0.39583,-0.024306,0.31944,0.44792,0.013889,0.16667,0.097222,0.26736,-0.020833,0.22569,0.25347,0.125,0.22222,0.010417,0.21875,0.15625,0.13542,-0.0069444,0.23958,0.15972,0.19792,0.28472,0.09375,0.20139,0.21875,0.31944,0.16667,0.21875,0.28472,0.35417,0.055556,0.15972,0.31944,0.32986,0.24653,0.375,0.27083,0.35417,0.052083,0.18403,0.37847,0.21528,0.28472,0.38194,0.15972,0.083333,0.22917,0.42014,0.13194,0.45139,0.45486,0.32639,0.14236,0.069444,0.29861,0.03125,0.54861,0.20139,0.35764,0.010417,-0.024306,0.420
14,0.23264,-0.079861,-0.12847,0.28125,0.41667,-0.30556,0.35417,0.49653,0.34722,0.27083,0.32292,0.24306,0.20139,0.30903,0.17708,0.29861,0.18056,0.29167,0.16667,0.35069,0.40625,0.25694,0.44444,0.3125,0.47222,0.35417,0.44097,0.38194,0.41319,0.35069,0.38194,0.16319,0.30556,0.17361,0.35764,0.26736,0.15972,0.31597,0.28472,0.33333,0.15625,0.29167,0.39583,0.27083,0.25,0.3125,0.30208,0.4375,0.15278,0.25694,0.34722,0.22917,0.32639,0.44792,0.15625,0.33333,0.20833,0.42014,0.52778,0.1875,0.35417,0.47917,0.27083,0.19097,0.40278,0.49653,0.14236,0.12847,0.36806,0.24306,0.22917,0.48264,0.37153,0.5,0.49653,0.024306,0.086806,0.25,0.22569,0.48264,0.38542,0.35417,0.51736,-0.055556,0.375,0.29167,0.5625,0.30556,0.44444,0.50694,0.43056,-0.010417,0.22917,0.52083,0.27778,0.40972,0.59028,0.37847,0.47917,0.52431,0.52778,0.21528,0.59722,0.42014,0.30208,0.52083,0.46528,0.57292,0.52778,0.42361,0.52778,0.18056,0.32986,0.53125,0.0069444,0.51042,0.52083,0.40972,0.15625,0.29167,0.045139,0.54861,0.42361,0.24653,0.52083,0.41667,0.12153,0.048611,0.53125,0.44792,0.57639,0.18056,0.54514,0.10417,0.0625,0.58681,0.46875,0.14583,0.090278,0.55903,0.017361,0.0069444,0.59375,0.46181,0.23264,-0.065972,0.48264,0.27083,0.42708,0.36111,0.50694,0.32986,-0.21181,0.625,0.38542,0.44444,0.57639,0.41319,-0.045139,0.51389,0.32292,0.33681,0.52431,0.46181,0.041667,0.26389,0.36458,0.32639,0.076389,0.44792,0.079861,0.20139,0.33333,0.26736,0.23264,0.4375,0.17014,0.097222,0.32639,0.25,0.29861,0.13542,0.4375,0.23611,0.12847,0.41319,0.34028,0.20486,0.079861,0.40625,0.42014,0.48264,0.57639,0.50694,0.20833,0.32292,0.10764,0.41667,0.017361,0.56597,0.49653,0.54167,0.23264,0.59722,0.40972,0.10764,0.61111,0.27431,0.41667,0.010417,0.60069,0.47917,0.51389,0.61111,0.26736,0.37153,0.60069,0.013889,0.42361,0.42708,0.26042,0.11111,0.65972,0.51042,0.39583,0.51736,0.30208,0.58333,0.40625,0.73264,0.38194,0.29514,0.5625,0,0.13194,0.45139,0.63542,0.80208,0.35764,0.38194,0.38542,0.46875,0.52778,0.47569,0.15625,0.37847,0.75347,0.94097,0.086806,0.44792,0.35069,0.20139,0.89931,0.52778,0.45486,1.0347,0.34375,0.51389,1.0312,-0.78125,-0.77778,0.44792,-0.58333,-0.64236,0.31597,0.42708,0.51389,-0.4375,-0.40625,-0.26736,-0.15278,-0.059028,-0.013889,0.065972,0.065972,0.17014,0.13194,0.21528,0.125,0,0.20486,0.083333,0.041667,0.42708,0.59722,0.15278,0.10764,0.40972,0.020833,0.58333,0.41667,0.15625,0.10764,0.56597,-0.0069444,0.42708,0.13194,0.56597,0.33681,0.083333,0.4375,0.079861,-0.034722,0.55556,0.34375,0.44792,0.54861,-0.013889,0.086806,0.069444,0.34028,0.18403,0.42361,0.53472,0.30556,0.32639,0.045139,-0.079861,0.041667,0.43403,0.43056,0.28819,0.41319,0.57292,0.375,0.048611,0.25694,0.23264,0.024306,0.29861,0.40625,0.15278,0.22917,0.375,0.059028,0.12847,0.42708,0.31944,0.24653,0.065972,0.017361,0.33681,0.40625,0.23264,-0.059028,0.375,0.39236,0.21528,0.36111,0.30903,0.43056,0.32639,0.21181,0.34722,0.26736,0.38542,0.21875,0.41319,0.21528,0.32639,0.15625,0.24653,0.44097,0.28819,0.43403,0.51042,0.30556,0.055556,-0.10764,0.43056,-0.052083,0,-0.052083,0.42708,0.059028,0.44097,-0.15972,0.079861,0.39583,-0.013889,0.43403,0.19792,0.11111,0.44097,0.16319,0.57639,0.39931,0.26736,0.013889,0.40972,0.10417,0.020833,0.57986,0.17708,0.15625,0.37847,0.4375,0.048611,0.14236,0.28125,0.17014,0.45486,0.23958,0.12153,0.19444,0.40625,0.19444,0.27778,0.37153,0.25,0.26042,0.57292,0.49306,0.15972,0.28125,0.46181,0.28819,0.35417,0.36111,0.37153,0.15625,0.49653,0.22569,0.40278,0.4375,0.52778,0.37847,0.55556,0.24653,0.30556,0.47569,0.31944,0.48611,0.12847,0.43403,0.52778,0.56944,0.46875,0.53819,0.46875,0.45486,
0.37153,0.0034722,0.29514,0.17361,0.28472,0.39583,0.42708,0.58681,0.58681,0.024306,-0.80208,0.48958,0.048611,0.53472,0.45139,0.35764,0.56944,0.33681,0.13542,0.10417,0.625,0.38194,0.54514,0.51042,0.44097,0.60417,0.29861,0.53472,0.46181,0.22569,-0.017361,0.44097,-0.65972,0.37847,0.54514,0.14931,0.28819,0.32292,0.60417,0.55208,0.083333,0.49306,0.4375,0.51736,0.44444,0.51736,0.51042,0.34722,0.39931,-0.034722,0.40278,0.62153,0.43056,0.090278,0.61806,0.62153,0.47917,0.52083,0.46181,0.15625,0.51389,0.51736,0.39236,0.37847,0.35417,0.29514,0.40972,0.46181,0.038194,0.29514,0.65278,0.034722,-0.055556,-0.51042,0.60764,0.48958,0.49306,0.40625,0.47917,0.53819,-0.024306,0.625,0.39931,0.47569,0.35417,0.43403,0.30903,0.30556,0.55556,-0.36806,0.38889,0.027778,0.26042,-0.090278,0.60764,0.42014,-0.083333,0.5,0.36111,0.55208,0.17361,0.52083,0.29514,0.36111,0.44097,0.25347,-0.090278,0.013889,0.48958,-0.15625,-0.11458,0.47569,0.24306,0.54514,0.40625,0.35764,0.28819,0.55903,0.54514,0.39583,0.50347,0.15625,0.28125,-0.086806,0.0069444,0.45486,0.20139,0.37153,0.17708,0.26736,0.041667,0.36111,-0.11458,-0.10069,0.45833,0.052083,0.54514,0.28125,0.10764,0.44444,0.53472,0.13889,-0.017361,0.44097,0.30208,0.23611,0.29167,-0.11111,0.017361,0.13542,-0.09375,0,0.37847,0.33333,0.27778,0.086806,0.44444,0.49653,0.54514,0.09375,-0.24306,0.375,-0.0069444,-0.38194,0.26736,0.03125,0.48611,0.086806,0.39583,-0.027778,-0.013889,0.052083,0.63194,0.53472,0.30556,0.48264,0.51042,0.41667,-0.020833,-0.048611,0.58333,-0.020833,0.38194,0.52778,-0.010417,-0.14931,0.45139,0.58681,-0.25347,-0.013889,0.60069,0.15625,0.28125,-0.0069444,0.57292,0.15972,0.29514,0.42361,-0.0625,0.61806,0.31944,0.1875,0.43056,-0.22917,0.49306,0.59375,0.49653,0.34375,0.20833,0.44792,0.61111,0.52778,0.36111,0.21528,0.47222,-0.024306,-0.16667,0.50694,0.45833,0.31944,0.19444,-0.069444,0.055556,0.51042,0.17361,0.013889,0.40625,0.17361,0.53472,0.27778,0.49306,0.37153,0.51736,0.076389,0.19792,0.13889,0.090278,0.51042,0.38194,0.090278,-0.055556,0.50694,0.27431,0.14236,0.39236,0.5,0.16319,0.29514,0.63889,0.32639,0.55208,0.19444,0.46528,0.37153,0.64583,0.55556,0.27083,0.30903,0.41667,0.63194,0.28472,0.53125,0.46875,0.34375,0.30556,0.43403,0.50347,0.62847,0.61111,0.49653,0.50347,0.60417,0.60069,0.47917,0.23264,0.59722,0.59375,0.36458,0.17361,0.24653,0.48264,0.48264,0.09375,-0.010417,0.46528,0.28472,0.16319,0.1875,0.64931,0.59028,-0.0034722,0.48264,0.11458,0.17014,0.29861,0.43403,0.19444,0.017361,0.17361,0.13194,0.31944,0.45833,0.39583,0.44444,0.31944,0.21181,0.15972,0.020833,0.59375,0.13194,0.32292,0.42708,0.20139,0.39236,0.17014,0.30208,0.027778,0.60764,0.50347,0.22917,0.17708,0.35069,0.076389,0.43056,0.45486,0.30208,0.03125,0.19444,0.22569,0.11111,0.40278,0.375,0.50347,0.069444,0.16667,0.28819,0.17014,0.086806,0.22569,0.34028,0.51042,0.39583,0.45833,0.090278,0.16667,0.17014,0.30208,0.29514,0.46181,0.20486,0.52083,0.0625,0.42708,0.17361,0.36806,0.53125,0.54861,0.13889,0.47222,0.28125,0.20486,-0.020833,0.59028,0.40278,0.20139,0.53472,0.21181,0.079861,0.33681,0.36111,0.44792,0.60764,0.39931,0.25347,0.53819,0.48611,0.61458,0.39236,0.375,0.48958,0.49306,0.62153,0.35069,0.49306,0.53819,0.37153,0.59375,0.29514,0.47569,0.43403,0.625,0.31944,0.52083,0.47569,0.41667,0.40625,0.28472,0.51042,0.59375,0.29167,0.37847,0.49653,0.24653,0.30556,0.33681,0.43056,0.39236,0.47222,0.21528,0.34375,0.28819,0.40972,0.38542,0.20139,0.28472,0.41667,0.39583,0.19792,0.37847,0.27083,0.26389,0.41667,0.38542,0.28819,0.50347,0.35417,0.36458,0.45486,0.1875,0.55903,0.18403,0.21875,0.32986,0.39236,0.47917,0.34722,
0.18403,0.25347,0.48264,0.50347,0.15625,0.35764,0.25347,0.53125,0.14583,0.5,0.38194,0.23958,0.61806,0.12847,0.36458,0.5,0.38542,0.090278,0.20139,0.61111,0.35417,0.47917,0.31944,0.055556,0.58333,0.33681,0.13889,0.40972,0.27778,0.51042,0.027778,0.3125,0.11806,0.29514,0.20833,0.51736,0.40278,0,0.29861,0.045139,0.51042,0.32639,0.10764,0.19444,0.26736,-0.059028,0.48958,-0.038194,0.3125,0.29514,0.1875,0.48958,-0.17708,-0.024306,0.46528,0.055556,-0.15278,0.34722,-0.28125,0.27431,0.58333,0.4375,0.46875,0.44444,0.32292,0.41319,0.25,0.43403,0.53472,0.55556,0.28125,0.52431,0.37153,0.54861,0.42708,0.22222,0.26736,0.47222,0.14931,0.41667,0.46181,0.22569,0.52778,0.065972,0.38889,0.12153,0.013889,0.52778,0.045139,-0.072917,1.0312,1.0312,0.36111,-0.12847,-0.17361,0.89583,0.93403,0.57639,0.40278,0.68056,-0.30556,0.65278,0.52083,0.5625,0.38889,0.57292,0.45486,0.48958,0.39583,0.48958,0.37847,0.48958,0.35417,0.47917,0.43403,0.048611,0.10417,0.42361,0.15278,0.42014,0.19444,0.41319,0.11458,0.40278,0.38889,-0.27083,0.086806,1.0347,-0.20486,0.36806,0.034722,0.83681,1.0347,0.57986,0.29514,0.92708,0.36111,0.034722,-0.10069,0.24306,0.45833,0.33333,0.375,0.77778,0.65625,0.54167,0.027778,0.20833,0.38542,0.37153,-0.059028,0.22569,0.30208,-0.03125,0.52778,0.18056,0.62847,0.63194,0.36111,0.27431,-0.15278,0.40625,0,0.30903,0.60069,0.15972,0.37847,0.5,0.28472,0.63889,-0.23958,0.59722,0.32639,0.29514,0.17361,0.39236,0.60764,0.47917,0.24306,0.30208,0.19097,0.42708,0.21875,0.59722,0.59028,0.45139,0.28472,0.1875,0.21875,0.44097,0.20486,0.27431,0.15625,0.43056,0.56597,0.16319,0.23611,0.125,0.40278,0.10069,0.5625,0.41319,0.25694,0.15278,0.09375,0.1875,0.12153,0.27778,0.21875,0.034722,0.17708,0.38889,0.28125,0.13889
4 | 0.775,0.76111,0.57222,0.56111,0.15,0.072222,-0.16667,-0.42778,0.98889,0.60556,-0.011111,-0.35556,0.58333,0.4,0.27778,0.36111,-0.022222,0.81389,0.41111,0.4,0.39722,0.23056,-0.275,0.20556,-0.18333,0.65,0.47222,0.11111,0.36389,0.22778,0.20833,0.50833,0.60556,-0.19722,0.54722,0.041667,0.022222,-0.058333,-0.95833,-0.43889,0.36111,-0.15556,-0.16944,0.74167,0.75556,-0.16944,-0.20278,0.18056,-0.84722,0.78056,-0.175,-0.3,0.96944,-0.67778,-0.38611,-0.4,-0.40556,-0.0083333,0.81667,-0.62778,-0.97222,-0.29722,-0.6,-0.525,-0.62222,-0.22778,-0.58889,-0.81944,0.81111,-0.39167,-0.38889,-0.79444,-0.98333,-0.60556,-0.75,-0.45556,-0.16667,-0.76111,0.80278,-0.96111,-0.97222,-0.36944,-0.51111,-0.90833,0.05,-0.71111,-0.13333,-0.88889,-0.98333,-0.60556,-0.96944,0.13889,-0.82222,0.99167,0.95833,0.33889,0.45833,0.74722,0.75833,-0.62778,0.60833,0.54444,0.75833,0.54167,-0.66944,0.99722,0.33056,0.89444,0.33333,-0.71667,0.98611,0.069444,0.97222,0.1,0.98889,-0.74722,0.78611,0.69722,-0.96667,-0.16667,-0.16111,-0.80278,0.52778,-0.975,0.425,-0.92222,-0.80833,-0.39444,-0.44167,-0.88056,-0.86667,0.28889,-0.89444,0.14722,-0.81111,-0.7,-0.64444,0.05,-0.95278,-0.9,-0.13889,-0.83889,-0.75278,-0.93611,-0.96944,-0.88611,-0.81389,-0.17778,-0.88611,-0.98611,-0.91389,-0.96944,-0.80278,-0.96389,-0.41944,-0.35833,-0.75278,-0.69444,-0.92778,-1,-0.45,-0.71944,-0.86944,-0.65556,-0.55,-0.60278,-0.63333,-0.64722,-0.81111,-0.83611,-0.60556,-0.78889,-0.58611,-0.65833,-0.68611,-0.50556,-0.875,-0.50556,-0.73333,-0.575,-0.79722,-0.98333,-0.38611,-0.59722,-0.80278,-0.87222,-0.37778,-0.80278,-0.95556,-0.44167,-0.66667,-0.99444,-0.23333,-0.81944,-0.24167,-0.75278,-0.28611,-0.88611,-0.11111,-0.1,-0.82778,-0.18333,-0.98056,0.055556,0.025,-0.95556,-0.097222,-0.9,0.85,0.225,0.19167,0.83056,-0.95278,0.038889,-0.88889,0.76944,0.16667,0.39444,0.35278,-0.79444,-0.99444,-0.80556,0.69167,0.28611,-0.65556,0.58889,0.53889,-0.81667,0.59444,0.43333,-0.48611,0.75,-0.83333,0.475,0.75,0.58333,-0.28611,0.29444,-0.85278,0.76667,0.96944,-0.083333,0.061111,0.975,0.98611,0.99722,0.11944,-0.18611,0.99167,0.98611,0.83333,0.28611,-0.475,0.76389,0.63611,0.74444,0.48611,-0.72222,0.59444,-0.96944,0.44722,0.49722,0.46944,-0.76389,-0.96111,0.22778,0.26389,-0.80556,-0.60556,0.95833,0.17222,-0.44444,0.013889,-0.73056,-0.052778,-0.25833,-0.10833,-0.14722,-0.61389,-0.10833,-0.31111,-0.35833,-0.38889,0.080556,-0.50556,0.98056,0.82778,-0.59444,0.28333,-0.60833,-0.68333,0.65278,-0.41111,0.49722,-0.81111,0.48889,-0.97222,0.69722,0.30556,-0.96944,-0.98056,-0.35556,0.975,0.061111,-0.38056,-0.21944,-0.47222,-0.475,-0.55278,-0.71944,-0.83333,-0.97778,-0.97778,-0.65,0.97222,-0.93889,0.96389,-0.80833,-0.68056,-0.98056,0.76111,-0.75,0.71667,-0.60556,0.55833,0.48889,-0.73056,-0.475,0.36389,-0.35833,0.26667,0.16111,-0.975,-0.98333,-0.21389,-0.78611,0.091667,-0.16111,0.011111,-0.81389,-0.78333,-0.072222,-0.10278,0.033333,0.033333,-0.275,-0.63333,-0.27222,-0.59167,0.28056,0.30278,-0.44167,-0.5,-0.49444,-0.76389,-0.46111,-0.40556,-0.35556,-0.64722,-0.68056,0.55833,0.575,-0.29167,-0.3,-0.69722,0.80278,-0.95833,-0.98056,-0.975,0.975,0.975,-0.90833,-0.96667,-0.81389,-0.24722,-0.76389,-0.85278,-0.20278,-0.73333,-0.76111,-0.85556,-0.69722,-0.275,-0.61944,-0.98889,-0.16389,-0.19444,-0.60556,-0.083333,-0.525,-0.058333,-0.99444,0.063889,-0.51667,0.14167,-0.78056,0.225,-0.49167,-0.46389,-0.60278,0.98611,0.38611,0.4,0.83611,-0.36111,-0.34722,-0.28889,0.55833,0.66667,0.55278,-0.12222,0.49444,-0.14722,-0.12778,0.18056,0.33056,0.97778,0.46944,0.975,0.033333,0.14167,0.052778,-0.091667,0.95556,0.23611,0.21389
,-0.30833,-0.57222,0.43056,0.46667,-0.77778,-0.90556,0.63611,0.71111,-0.93889,0.95833,0.96944,0.96944,0.96944,0.78056,0.78333,0.50556,0.48333,0.21667,0.19722,-0.066667,-0.10556,-0.28333,-0.36944,-0.53889,-0.55556,-0.75556,-0.74444,-0.96944,-0.98056,0.97778,0.98889,0.78333,0.80833,0.58333,-0.98333,0.60556,-0.76667,0.33056,0.37778,-0.54167,0.12778,0.16389,0.98056,-0.30833,-0.96667,-0.97778,-0.11667,-0.022222,0.72222,-0.78889,-0.77222,-0.041667,-0.32222,0.47222,-0.21944,-0.56111,-0.54722,0.19444,0.21944,-0.56111,-0.95833,-0.96389,-0.43611,-0.31944,-0.27222,-0.055556,0.44722,-0.75556,-0.775,-0.625,-0.75278,-0.075,-0.32778,0.66944,0.019444,-0.54722,-0.96111,-0.51111,-0.93611,-0.8,0.82778,-0.6,0.97222,0.18889,0.98333,-0.775,-1.0028,-0.3,0.96944,0.3,0.81667,0.77778,-0.25278,0.39444,-0.97778,-0.99167,-0.55556,0.63056,-0.047222,0.54167,0.51944,0.60833,0.025,-0.29722,0.38889,0.78611,0.20556,0.26389,0.31111,0.98333,0.98889,-0.083333,0.46667,-0.975,0.075,0.013889,-0.76111,-0.98333,0.63889,0.2,-0.96944,-0.18056,-0.225,-0.82222,-0.54167,-0.8,-0.38889,0.47778,0.99167,-0.39722,-0.63056,0.99167,-0.55833,-0.27222,-0.61389,0.75,-0.60278,-0.41944,-0.027778,-0.24444,-0.22222,-0.86111,0.98056,-0.79167,-0.96667,0.27778,0.05,0.038889,-0.93056,0.36667,0.28333,0.98611,0.53333,0.64167,0.98056,-0.97222,0.98889,0.89722,0.74722,0.97222,0.93889,-0.90833,0.93611,0.98611,0.87222,0.98611,0.82222,-0.75556,0.875,0.98611,0.76111,-0.97778,0.64444,-0.58889,0.69444,0.60833,0.75833,-0.81389,0.44444,0.51111,-0.40278,0.39722,0.60556,-0.64167,0.35556,0.21944,0.44444,-0.18333,-0.47222,0.16667,0.30556,0.21944,-0.96111,0.036111,-0.96111,-0.95278,-0.27778,0.0027778,-0.036111,0.15,-0.8,0.041667,-0.78889,-0.78056,-0.077778,0.23056,0.99444,-0.275,-0.069444,0.91667,-0.26667,-0.60833,-0.16389,-0.61111,0.14444,-0.56389,0.80833,0.47778,-0.43611,-0.44167,-0.36944,-0.425,0.35556,-0.48333,-0.29722,0.71111,0.98611,-0.62222,-0.32222,0.725,0.89167,-0.24722,0.62222,0.53333,-0.68056,-0.18333,-0.63333,-0.57778,-0.83611,0.75278,0.54444,0.98889,-0.055556,0.79167,-0.019444,0.022222,-0.98333,-0.79167,-0.98611,0.64167,0.5,-0.97778,0.98333,0.18056,-0.975,0.53056,0.33056,0.30833,0.45,0.44167,0.56389,0.43333,0.66944,0.76667,0.61111,0.71667,0.775,0.96389,0.975,0.98611,0.80833,0.71389,0.81389,0.71667,0.81111,0.68889,0.76389,0.63889,0.65556,0.50833,0.48056,0.32778,0.98333,0.21389,0.15278,0.79167,-0.98333,-0.96667,-0.027778,0.65,-0.85,-0.094444,-0.66111,-0.68611,0.42222,-0.23611,-0.41667,-0.29167,-0.51944,0.225,-0.10833,-0.35278,-0.47222,-0.325,0.044444,-0.525,0.20556,-0.15278,-0.10833,0.53333,-0.74722,-0.75556,-0.26667,0.047222,-0.98333,0.066667,0.98333,-0.84444,0.20556,-0.51111,-0.96944,-0.97222,0.25,-0.69167,0.33889,0.39444,-0.52222,0.46944,-0.84444,0.42778,0.49167,-0.97778,0.57778,-0.96944,0.625,0.68056,-0.69167,0.71667,0.83056,-0.45,0.98333,0.98611,0.975,0.84722,0.79722,-0.15,0.75556,0.98056,0.58333,0.069444,0.46111,0.97778,0.31111,0.31389,0.74167,0.16389,0.016667,0.48056,0.60833,-0.16389,-0.23333,0.21667,0.98056,-0.058333,-0.48889,-0.51111,-0.72222,0.97778,-0.98056,-0.97222,-0.63611,0.98611,0.80278,0.98611,0.88611,-0.88611,-0.96944,0.91667,0.75278,-0.975,0.80556,-0.78889,0.98889,-0.97222,0.825,0.98889,0.675,0.725,-0.97778,-0.78056,0.71944,0.85,-0.56111,0.88889,-0.98056,-0.85,0.60833,-0.975,-0.56944,0.54167,0.74722,-0.84444,0.57222,0.78056,-0.29722,-0.81111,0.61389,-0.66667,0.37222,0.43611,0.41111,-0.30556,0.58333,-0.65,-0.81667,-0.066667,0.49167,0.31111,-0.091667,-0.41389,-0.46667,0.44167,0.18056,0.27778,0.18333,0.37778,0.34444,-0.29167,0.047222,-0.21667,-0.7
75,0.15833,0.28333,-0.98056,0.044444,0.41667,0.13611,0.21944,-0.11667,0.2,-0.019444,-0.59444,-0.78333,-0.975,-0.96944,0.98333,0.13333,0.0027778,-0.075,0.69444,0.05,0.37778,0.097222,0.013889,0.59444,-0.61667,0.81389,0.18333,-0.76944,-0.73611,-0.32222,-0.027778,-0.10278,0.64444,0.34167,0.96944,0.98056,-0.41111,-0.15,0.65833,0.89722,0.40833,-0.56111,-0.22222,-0.10278,0.98056,0.58889,-0.51667,-0.072222,0.425,0.98889,-0.21389,0.55833,-0.26944,-0.225,0.775,-0.37778,0.58056,0.81111,0.975,-0.38333,0.75,-0.30556,-0.22778,0.97778,0.97222,-0.28889,0.18889,-0.97222,0.70278,-0.95833,0.725,0.81111,-0.43333,-0.21111,-0.061111,-0.48333,0.60278,0.79444,-0.44722,-0.15,0.525,-0.34444,0.50278,0.81111,-0.027778,-0.33056,0.58611,-0.75556,0.88889,-0.725,0.65,-0.038889,0.12778,-0.18056,0.98056,0.0055556,0.98889,0.40833,-0.69722,0.60556,-0.69167,0.65,0.59444,0.28611,-0.47778,0.26667,-0.46389,-0.46667,0.11111,-0.33611,-0.5,0.46111,0.36111,-0.58611,0.175,0.20556,-0.875,0.22222,0.40278,-0.47778,-0.61389,-0.975,0.61111,-0.63889,-0.26389,0.030556,0.99167,-0.98611,0.047222,0.25,-0.18056,0.61944,0.40556,0.44167,0.24167,0.030556,-0.61944,-0.76389,-0.027778,-0.79167,-0.097222,0.069444,-0.13056,0.61667,0.60556,-0.81667,-0.71944,0.69167,0.011111,0.15556,-0.13611,0.56667,0.18333,0.96944,-0.14167,-0.90278,-0.26667,0.80556,-0.95,-0.33333,-0.83611,-0.18611,0.43889,-0.96111,-0.98611,0.96944,-0.28056,0.98056,-0.38056,0.47778,-0.43611,0.55278,0.31111,-0.95,-0.41667,-0.42778,-0.98333,-0.56389,0.2,-0.57222,-0.61111,-0.59444,0.069444,-0.66667,-0.80556,0.975,0.97778,-0.75556,0.98889,-0.775,-0.81944,-0.097222,-0.81944,0.76667,-0.96389,-0.97778,-0.98611,-0.275,-0.92222,-0.97778,0.48611,-0.93611,-0.47222,0.28333,-0.95833,-0.975,-0.7,-0.047222,-0.76944,-0.78889,0.96944,-0.84722,-0.38611,-0.53056,-0.48056,0.675,-0.94167,0.99167,-0.68056,0.33056,-0.24722,-0.2,0.63889,-0.96389,-0.11111,0.038889,0.10556,0.31389,-0.97778,-0.94722,-0.51944,-0.044444,0.39444,0.40833,-0.81667,-0.84167,-0.97222,0.58056,-0.61944,0.98333,-0.61111,0.98611,0.67222,-0.56111,0.76667,0.725,0.78056,-0.36111,0.88611,-0.31944,-0.97778,0.45556,0.98333,0.97222,0.36667,-0.077778,-0.036111,0.083333,0.022222,0.23056,0.29722,-0.98889,-0.25,-0.97778,0.47222,-0.31389,0.54722,-0.74444,-0.75556,-0.54167,0.71667,-0.64444,-0.40278,-0.84167,-0.39722,0.96389,0.97778,-0.98056,-0.98611,-0.96667,-0.96111,-0.10278,-0.025,-0.76667,-0.7,0.25,0.40278,0.97222,-0.35278,0.53611,0.98611,0.79722,0.98611,0.74444,-0.086111,-0.95833,-0.97222,-0.96389,0.81667,0.62778,0.825,0.975,0.058333,-0.80556,0.95833,-0.79167,0.56944,0.59167,-0.74167,0.65,-0.60556,0.43056,-0.56944,-0.94167,0.40278,-0.54444,-0.95556,0.35278,0.45278,0.26389,-0.42222,0.56944,-0.31667,-0.71111,-0.75833,0.23333,-0.28333,0.10278,0.14722,-0.16944,0.95278,0.96667,0.066667,-0.052778,-0.47778,-0.083333,-0.047222,-0.50556,-0.058333,0.077778,-0.23056,-0.083333,0.10833,-0.27222,0.17778,-0.28611,0.23889,-0.38889,-0.34722,0.3,0.38889,-0.31944,0.45278,-0.036111,0.030556,0.98611,0.58056,-0.59167,0.52222,-0.58889,0.67222,0.78333,-0.58611,0.24167,0.8,-0.91667,-0.79444,0.33056,-0.95833,0.61389,0.8,0.43333,-0.88333,-0.78056,0.98611,0.46389,-0.80833,0.98333,-0.79444,-0.97222,0.98056,-0.975,0.66389,0.63889,0.23333,-0.63056,-0.96389,-0.53056,0.82778,0.0055556,-0.40278,0.98056,0.98333,-0.25278,-0.33056,-0.14167,0.094444,-0.96944,-0.97778,-0.63611,0.15278,-0.80278,-0.96667,-0.77778,-0.78333,-0.95833,0.40833,-0.60556,-0.74722,-0.58056,-0.75278,-0.975,0.43611,-0.38611,0.72222,-0.33333,0.66667,-0.48611,-0.44722,-0.094444,0.97778,-0.041667,0.97222,-0.15278,-0.1222
2,0.16111,0.20556,0.18333,0.175,0.45556,0.43611,0.475,0.50833,0.70278,0.75556,0.73889,0.75,0.96111,0.95833,0.98056,0.98889,0.96667,0.96111,0.85,0.96111,0.95556,0.96944,0.75,0.81111,0.7,0.79167,0.73611,0.52222,0.52778,0.575,0.52778,0.48611,0.35,0.29722,0.34722,0.27222,0.18333,0.16944,0.14444,0.030556,0.98056,-0.066667,-0.044444,-0.10278,0.77222,-0.24722,-0.175,-0.26667,-0.35556,0.51389,-0.3,-0.47222,-0.38333,-0.58333,-0.47778,0.24444,-0.56111,-0.63889,-0.6,-0.96944,-0.725,-0.67222,-0.072222,-0.73889,-0.74444,0.97778,-0.79167,-0.79444,-0.89444,-0.80278,0.98611,-0.90833,0.77222,-0.36944,-0.98611,-0.37778,-0.91111,-0.97222,0.74167,-0.95556,-0.96111,0.54444,-0.93056,-0.68611,-0.97222,0.525,-0.088889,-0.98333,0.31944,0.27222,-0.98333,-0.78056,0.20556,-0.73333,0.11389,0.40556,-0.0083333,-0.49444,-0.52222,-0.086111,-0.26111,0.74167,-0.27222,-0.26111,0.95556,-0.46667,-0.091667,-0.47222,0.058333,-0.67778,-0.66111,0.29722,-0.85556,-0.81389,0.15833,0.077778,0.41389,-0.99167,-0.89167,0.175,0.27222,0.64167,0.71944,0.28333,-0.95278,0.43889,0.39167,0.97778,0.96389,0.54167,0.50556,0.66389,0.64722,0.76111,0.74722,0.96111,0.96667,-0.975,-0.70556,0.98611,0.76111,-0.40556,0.525,-0.13056,0.325,-0.0083333,0.10278,0.05,0.17778,-0.93611,-0.15,-0.058333,-0.89167,0.42222,-0.43611,-0.12222,0.016667,-0.21111,0.98611,-0.027778,0.65833,-0.66667,-0.86944,0.85278,-0.30833,0.98056,0.73056,-0.14722,0.98611,0.98333,-0.83333,0.98056,-0.31667,0.81944,-0.80833,0.66944,0.82778,-0.91111,0.81389,0.83056,-0.27778,0.75833,-0.37222,0.51667,-0.975,0.97778,-0.8,0.64722,-0.44444,0.67222,-0.48333,0.51944,0.42778,0.45556,-0.98889,-0.54167,0.5,0.33611,0.48889,-0.65278,-0.73611,0.18056,0.22778,0.31389,0.31389,-0.82778,0.12222,-0.10556,-0.95556,-0.98056,0.12778,0.13889,0.0083333,-0.21389,-0.13611,-0.086111,-0.072222,-0.50556,-0.54444,-0.29444,-0.29167,-0.24167,-0.78611,-0.45,-0.98333,-0.98056,-0.51389,-0.49722,-0.58611,-0.64167,-0.725,-0.71944,-0.79167,-0.84722,-0.98056,-0.96944,-0.95556,-0.98611
5 |
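
Note: each pixel_pos.csv in this repository stores one dataset as four comma-separated rows, visible above and in the zara02 file below: row 1 is frame numbers, row 2 pedestrian IDs, and rows 3-4 coordinates normalized to roughly [-1, 1] (pixel_pos_format.md is the authoritative description). A minimal loading sketch under that assumption; the (y, x) ordering and the per-pedestrian grouping are illustrative, not copied from the repo's utils.py:

import numpy as np

def load_pixel_pos(path):
    """Read a 4-row pixel_pos.csv into per-pedestrian trajectories."""
    # Row 0: frame number, row 1: pedestrian ID,
    # rows 2-3: normalized coordinates (assumed y then x here).
    frames, peds, ys, xs = np.genfromtxt(path, delimiter=',')
    trajs = {}
    for f, p, y, x in zip(frames, peds, ys, xs):
        trajs.setdefault(int(p), []).append((int(f), float(y), float(x)))
    return trajs

# e.g. trajs = load_pixel_pos('data/ucy/zara/zara01/pixel_pos.csv')
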
--------------------------------------------------------------------------------
/data/ucy/zara/zara02/obs_map.pkl:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/data/ucy/zara/zara02/obs_map.pkl
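
The binary .pkl files are kept out of the text dump and fetched from the raw URL above; judging by the name, each holds a pickled obstacle map for its scene. A hedged loading sketch (the object's type and shape are assumptions, since a pickle's contents are defined by whatever code built it):

import pickle

with open('data/ucy/zara/zara02/obs_map.pkl', 'rb') as f:
    obs_map = pickle.load(f)  # likely a 2-D occupancy grid; inspect before use
print(type(obs_map))
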
--------------------------------------------------------------------------------
/data/ucy/zara/zara02/pixel_pos.csv:
--------------------------------------------------------------------------------
1 | 1,2,4,5,7,7,8,9,9,10,11,11,12,13,13,14,15,15,16,17,18,18,18,19,19,20,20,20,21,21,21,22,23,23,24,24,24,24,25,26,26,26,27,27,28,29,29,29,30,30,30,31,32,32,32,33,33,34,34,34,35,35,36,36,36,37,38,38,38,38,38,39,39,39,40,41,41,41,41,42,43,43,43,44,44,45,45,45,46,46,47,47,47,47,48,48,48,48,49,49,49,49,50,50,50,50,51,51,51,51,52,52,52,52,52,52,53,53,53,53,54,54,54,54,54,55,55,55,55,55,56,56,56,56,57,57,57,57,57,58,58,58,58,58,59,59,60,60,60,60,60,60,60,61,61,61,62,62,62,62,62,63,63,63,64,64,64,64,65,65,65,65,67,67,67,67,68,68,69,70,70,70,70,72,72,73,73,75,75,75,76,77,78,78,80,81,82,83,83,85,89,94,97,98,98,100,100,100,101,101,101,102,102,102,102,103,103,103,103,103,103,104,105,105,105,105,105,105,105,106,106,106,107,107,108,108,108,108,108,108,109,109,110,110,110,110,110,111,111,111,111,112,112,112,112,112,113,113,113,113,113,114,114,114,114,114,114,115,115,116,116,116,116,116,116,117,117,117,117,117,118,118,118,118,118,119,119,119,120,120,120,120,120,121,121,121,121,121,122,122,122,123,123,123,123,123,123,124,124,124,124,124,125,125,125,125,125,126,126,126,127,127,127,127,127,128,128,129,129,129,129,129,129,129,130,130,130,131,131,131,131,131,131,131,132,132,132,132,133,133,133,133,134,134,134,134,134,135,135,135,135,135,136,136,136,136,137,137,138,138,138,138,138,138,138,139,139,140,140,140,140,140,141,142,142,142,143,143,143,143,145,145,145,146,146,146,146,148,148,148,149,149,149,149,150,151,151,152,152,154,154,155,155,155,156,157,157,157,158,160,160,160,161,161,162,163,163,163,164,164,164,164,166,166,166,166,167,167,168,168,169,169,169,170,170,171,171,173,173,173,173,174,174,174,174,175,175,175,176,176,176,177,177,177,177,178,179,179,179,179,180,180,180,181,182,182,182,182,182,182,183,183,184,184,184,184,184,185,186,186,186,186,186,187,187,188,188,188,188,189,190,190,190,190,191,192,192,192,193,193,194,194,194,195,195,195,195,196,196,197,197,197,198,198,198,199,199,199,200,200,200,200,201,201,201,201,201,203,204,204,204,205,205,205,205,207,207,207,207,207,208,208,209,209,210,210,210,211,212,212,212,213,214,214,214,214,215,217,217,218,218,218,219,219,219,221,221,221,222,222,222,223,224,224,226,226,226,228,228,229,230,231,232,235,240,243,245,245,246,247,248,248,248,249,249,251,251,252,252,253,254,254,255,255,256,256,257,257,257,259,260,260,260,260,262,262,262,263,263,265,266,266,266,267,268,268,268,268,269,269,270,270,270,270,271,271,271,272,272,273,273,273,273,274,274,275,275,275,276,276,277,277,277,277,277,278,278,279,279,280,280,280,280,280,281,281,281,282,283,283,283,283,283,284,284,284,285,286,286,286,286,287,287,288,289,289,289,290,290,291,291,292,292,292,293,293,293,294,295,296,296,297,297,297,298,299,300,301,302,304,306,308,312,314,317,320,323,326,327,327,329,329,329,331,332,332,334,334,335,335,337,337,337,338,340,340,340,341,341,342,342,342,343,344,344,344,344,346,347,348,349,351,351,354,354,355,355,357,358,358,360,361,361,363,365,365,366,367,368,368,371,371,372,372,372,373,373,374,374,375,375,375,375,376,377,377,377,377,377,377,378,378,379,379,379,379,379,379,380,380,380,380,381,381,381,381,382,382,382,382,382,382,382,382,383,383,384,384,384,384,384,384,384,385,385,385,385,385,386,386,386,386,387,387,387,387,387,387,387,388,388,388,388,388,389,389,389,389,389,389,389,390,390,390,390,390,391,391,391,392,392,392,392,392,392,393,393,394,394,394,394,394,395,395,395,395,396,396,397,397,398,398,398,399,399,399,399,399,400,401,401,401,401,402,403,403,403,403,403,403,404,404,405,405,405,406,406,406,406,406,406,407,407,407,408,408,408,408,408,408,408,409,409,409,409,409,409,410,410,410,410,4
10,410,410,411,411,411,411,411,411,412,412,412,412,412,412,412,412,413,413,413,413,413,413,414,414,414,414,414,414,415,415,415,415,415,416,416,416,416,416,416,416,416,417,417,418,418,418,418,418,418,418,418,419,419,419,420,420,420,420,420,421,421,421,421,421,421,422,422,422,423,423,424,424,424,425,425,425,426,426,427,427,427,428,428,428,429,429,429,429,429,430,430,431,431,431,431,432,432,432,432,432,432,433,433,433,434,434,434,434,434,434,434,435,435,435,436,436,436,436,437,437,437,437,437,437,438,439,439,439,439,440,440,441,442,442,442,442,442,442,443,444,444,444,444,445,445,446,447,447,447,447,448,448,448,449,449,449,449,450,450,451,451,451,451,452,452,452,452,452,454,454,454,454,454,455,455,455,456,456,457,457,457,457,457,457,458,458,458,459,459,460,460,461,461,461,461,461,462,462,462,462,463,463,464,464,465,465,465,465,465,465,466,466,466,467,468,468,468,468,468,468,468,469,469,469,470,470,470,470,471,471,471,471,472,472,472,472,472,472,473,473,473,474,474,474,474,474,474,474,475,475,475,475,475,476,476,476,477,477,477,477,477,477,477,478,478,479,479,479,479,479,479,479,479,480,480,481,481,481,481,481,481,481,481,482,482,482,483,483,483,484,484,484,484,484,484,484,484,485,485,485,485,486,486,486,486,487,487,487,487,487,487,487,488,488,488,488,489,489,489,489,490,490,490,490,491,491,491,491,492,492,493,493,494,494,494,494,494,495,495,496,496,497,498,499,499,500,501,501,501,502,502,503,503,503,504,504,504,505,505,506,506,507,507,508,508,508,508,509,509,510,510,511,511,512,512,512,512,513,514,514,514,515,516,516,517,517,517,518,518,519,519,520,520,521,521,521,522,522,523,523,523,523,524,525,525,526,526,526,527,527,527,527,528,528,529,529,529,529,530,530,530,531,531,531,531,532,532,533,533,533,534,534,534,535,535,535,535,535,536,536,536,536,536,537,537,537,537,538,538,538,539,539,539,539,539,540,540,541,541,541,541,541,542,542,542,543,543,544,544,544,544,545,545,545,545,546,546,547,547,547,547,547,547,548,548,550,550,550,551,551,551,553,553,553,553,555,555,556,556,557,557,559,559,559,559,560,561,562,563,563,563,564,565,565,566,567,567,567,567,568,569,570,571,572,573,575,576,577,578,578,579,580,580,580,580,582,582,582,583,583,584,584,584,585,585,586,586,586,586,587,587,588,588,588,589,589,589,589,589,590,591,591,591,591,591,592,593,593,593,593,594,595,595,595,596,596,596,596,596,597,598,598,598,599,599,599,599,599,600,600,600,600,601,601,601,602,602,602,603,603,603,604,604,604,604,605,606,606,606,607,608,608,608,609,609,609,611,611,611,611,612,613,613,613,613,614,614,614,614,616,616,616,616,617,617,617,618,618,619,619,619,620,620,621,621,621,622,622,624,624,624,624,624,625,625,625,625,626,626,626,627,627,627,627,627,627,628,628,628,629,629,629,629,629,629,630,630,630,630,630,631,631,631,631,631,631,632,632,632,632,632,633,633,633,633,633,633,634,634,634,634,634,634,634,634,635,635,635,636,636,636,636,636,636,637,637,637,638,638,638,638,638,638,638,638,639,639,639,639,640,640,640,640,640,641,641,641,641,641,642,642,642,642,642,642,643,643,643,643,643,644,644,644,644,644,644,644,644,645,645,645,646,646,646,646,646,646,646,647,647,647,647,648,648,648,648,649,649,649,649,649,649,649,649,649,649,649,650,650,650,651,651,651,651,651,651,651,651,652,652,652,652,653,653,654,654,654,654,654,654,654,654,654,655,655,655,656,656,656,656,656,656,656,657,657,657,657,657,658,658,658,658,658,659,659,659,659,659,659,660,660,660,660,661,661,661,661,661,661,662,662,662,662,662,662,662,662,663,663,663,663,664,664,665,665,665,665,666,666,666,666,666,667,667,667,667,668,668,668,669,669,670,670,671,671,672,672,672,
672,673,673,674,674,675,675,675,676,676,676,677,678,679,679,679,679,680,680,681,682,683,683,684,685,685,685,685,686,687,687,688,688,688,688,689,690,690,690,690,691,692,692,692,692,692,692,693,693,693,693,693,694,694,695,695,695,695,695,695,695,696,696,696,696,697,697,697,697,698,698,698,698,698,698,699,699,699,700,700,700,700,700,701,701,701,701,701,702,703,703,703,703,703,704,704,704,704,704,705,705,706,706,706,707,707,707,707,707,707,708,708,709,709,709,709,710,710,710,710,711,711,712,712,712,712,712,712,713,713,713,714,714,714,715,715,716,716,716,716,716,717,718,718,718,718,719,719,719,720,720,720,721,721,722,722,722,723,724,724,724,724,724,725,726,726,727,727,727,728,728,729,729,730,731,731,732,733,734,735,735,735,737,737,739,740,740,741,742,742,743,744,744,745,745,746,747,748,749,749,749,750,751,751,751,752,753,754,754,754,756,756,757,757,758,758,759,759,760,760,760,762,762,762,763,764,764,764,765,765,767,768,768,768,768,769,770,771,771,772,772,774,774,774,774,775,776,776,777,777,777,778,779,779,779,779,780,781,781,782,782,783,783,784,784,784,784,785,786,786,786,787,787,787,788,788,789,789,790,790,791,791,791,791,792,792,792,793,793,793,794,794,795,795,795,796,796,797,797,798,798,799,799,799,799,799,799,799,800,800,800,800,800,801,802,802,802,802,803,803,803,803,803,803,804,804,804,805,805,805,805,806,806,806,806,806,806,806,807,807,807,807,807,808,808,808,808,809,809,809,809,809,809,810,810,810,811,811,811,812,812,812,812,812,812,812,814,814,814,815,815,816,816,818,818,819,820,821,821,821,822,823,823,824,824,825,826,826,826,827,827,827,828,828,828,828,829,829,829,830,830,831,831,831,831,832,832,832,833,833,834,834,834,834,835,835,835,835,836,837,837,837,837,838,838,838,839,839,839,840,840,840,840,840,841,841,841,842,843,843,843,843,843,843,843,844,844,845,845,845,846,846,846,846,846,847,847,847,847,847,848,848,848,848,849,849,850,850,850,850,850,850,851,851,851,852,852,852,852,853,853,853,854,854,855,855,855,856,856,856,856,856,858,858,859,859,859,859,859,859,860,860,861,862,862,862,862,862,862,863,864,865,865,865,865,865,866,867,868,868,868,868,868,868,869,869,869,871,871,871,871,872,872,873,874,874,874,875,875,875,875,876,877,877,877
2 | 1,2,2,1,2,3,3,1,2,3,1,2,3,1,2,3,3,1,2,3,5,6,1,4,3,2,5,6,3,1,4,5,2,6,4,5,3,1,6,2,5,4,1,6,7,5,1,4,2,6,7,5,6,7,8,4,9,6,5,10,8,7,9,11,4,10,7,12,13,8,10,11,9,4,10,12,8,13,7,11,7,9,10,13,12,14,7,8,11,10,13,14,9,12,7,15,16,8,11,13,10,7,12,14,15,16,7,10,9,13,12,11,17,7,14,8,15,10,16,12,11,17,18,7,13,14,9,10,16,15,17,12,11,18,8,10,7,13,14,16,9,12,17,15,18,10,13,14,11,12,16,9,8,15,17,18,11,13,14,10,12,16,13,17,11,15,18,12,10,14,17,16,14,15,18,16,10,17,14,15,18,16,17,15,18,16,17,15,16,18,17,18,15,17,18,17,18,19,17,19,19,19,20,19,22,20,23,22,24,21,19,25,26,27,20,23,24,28,22,29,21,25,19,26,23,28,27,24,20,21,29,22,25,26,24,23,20,27,29,21,28,22,20,26,25,21,22,23,24,29,27,20,28,22,29,21,30,31,23,26,24,20,25,29,22,27,21,28,30,23,20,31,24,21,22,26,25,29,27,20,28,23,30,21,32,29,31,24,20,25,26,27,23,21,29,28,30,32,20,31,25,24,23,29,21,28,27,26,30,20,33,32,29,31,25,23,28,21,24,33,34,20,23,30,35,32,25,33,31,34,21,24,23,20,30,35,23,25,31,36,38,34,32,33,24,35,20,30,34,21,31,32,25,36,38,34,33,35,24,30,31,39,32,34,25,36,33,35,32,24,25,40,37,38,39,34,36,35,33,39,37,40,32,34,36,38,35,37,33,32,37,34,35,36,38,32,33,40,35,37,34,36,33,35,41,36,34,33,37,41,36,34,33,37,36,41,39,37,34,37,41,38,42,41,37,38,40,42,39,43,44,38,39,42,40,38,44,43,40,39,42,44,38,40,43,42,44,42,44,45,43,41,44,46,47,45,43,42,44,46,41,47,45,43,44,46,42,41,47,44,45,46,43,44,41,47,48,43,46,49,44,45,41,48,47,49,44,46,41,43,48,45,49,44,47,41,48,46,49,49,45,48,41,47,46,49,48,41,45,47,50,51,46,49,48,41,52,45,50,51,47,48,52,49,41,50,48,53,51,46,45,52,47,48,49,50,53,51,48,49,52,47,43,53,50,48,51,49,52,54,53,50,47,48,54,51,49,53,50,52,48,49,54,51,53,52,50,54,44,51,53,52,50,44,54,51,50,53,52,43,44,54,44,53,44,54,43,54,44,43,54,44,54,54,54,55,44,43,55,43,55,56,57,43,55,56,57,55,58,59,56,57,55,58,59,56,57,58,55,59,56,57,58,55,55,59,57,56,58,57,56,59,44,58,60,61,56,57,44,59,60,62,57,63,58,56,61,59,57,56,60,62,63,61,58,59,56,64,60,63,61,65,62,66,59,58,64,66,63,60,65,61,58,59,66,62,64,44,63,58,65,59,60,66,61,62,64,65,60,63,66,61,62,64,66,60,65,63,61,62,64,66,44,65,60,63,61,64,66,62,63,64,67,65,66,67,66,67,66,67,66,67,67,68,68,68,68,68,69,70,68,69,70,68,69,70,68,71,69,70,68,71,70,69,71,70,69,68,71,72,70,69,71,70,69,72,71,44,72,70,71,72,71,72,44,73,74,72,73,74,71,73,74,72,73,44,75,74,73,75,75,76,74,73,77,76,78,75,77,73,76,78,75,79,80,81,77,75,76,78,79,74,73,80,69,81,82,75,77,79,76,78,83,84,80,82,79,75,85,76,77,81,83,84,78,85,82,80,79,75,84,83,76,77,78,85,81,84,79,76,82,83,80,78,85,75,84,77,44,79,76,85,83,82,80,84,78,81,79,76,85,83,77,84,78,82,79,80,76,83,85,81,84,77,82,78,79,76,80,83,77,82,78,79,84,81,76,77,78,80,44,77,79,76,78,86,87,88,81,80,78,86,87,88,76,77,89,44,90,87,86,88,77,89,90,87,93,86,76,88,94,78,77,93,89,90,94,87,91,86,88,92,77,78,93,91,94,76,90,89,87,86,92,77,78,88,94,89,91,90,87,93,86,92,89,76,94,77,88,93,90,87,86,91,94,89,78,92,88,93,87,94,95,86,89,90,91,93,87,88,92,95,94,86,76,89,93,94,91,87,88,95,90,92,86,95,89,93,94,91,88,71,92,86,95,90,87,94,91,93,89,92,90,94,91,93,92,89,88,94,90,91,93,96,92,89,97,96,91,93,94,92,90,97,89,96,94,93,44,89,91,98,96,97,92,90,89,99,44,98,77,96,97,98,44,99,94,96,93,97,98,88,44,96,97,99,98,88,96,44,97,99,98,88,99,100,101,98,88,102,98,101,99,100,88,102,98,88,101,99,98,100,88,102,103,101,88,98,99,104,101,77,100,105,102,101,103,98,99,104,101,98,103,105,99,100,102,76,104,101,103,77,76,98,99,105,104,102,100,106,101,103,105,98,106,101,104,99,100,76,102,103,105,106,107,104,105,100,109,110,102,106,108,103,107,110,105,109,106,104,108,111,102,100,107,112,110,103,109,106,113,111,108,102,104,105,110,112,107,113,109,106,100,103,111,
107,102,112,110,108,113,105,104,106,111,109,112,107,108,110,113,106,111,103,112,70,109,106,107,105,108,113,111,112,70,110,104,113,106,107,112,109,108,111,70,103,110,105,112,113,106,107,111,71,108,70,104,109,110,105,113,111,112,107,106,114,115,108,103,113,104,109,70,110,105,106,114,107,115,113,104,112,108,110,105,114,109,107,113,115,116,114,108,115,113,110,116,114,109,111,113,115,116,114,116,115,114,106,116,115,118,117,106,116,114,118,115,117,106,116,118,117,119,115,106,118,116,117,119,106,115,117,118,116,119,117,106,115,120,118,119,117,121,120,115,118,117,119,121,69,120,117,118,121,119,117,120,122,118,121,117,123,119,120,122,118,121,71,123,122,119,120,118,121,71,69,123,122,120,121,118,71,119,123,122,120,121,69,71,118,122,124,123,121,120,71,119,124,125,118,121,122,71,123,124,69,120,125,124,121,126,71,122,123,125,120,124,126,69,71,124,125,120,121,122,123,126,71,69,124,125,70,122,127,126,128,123,71,124,69,122,125,126,127,128,123,129,126,122,127,129,128,123,129,122,127,128,123,129,122,127,128,129,129,128,123,127,122,129,128,127,130,129,131,128,130,131,129,127,128,111,130,131,130,131,130,131,130,131,130,132,131,132,133,130,131,134,131,130,133,134,132,135,70,131,130,133,134,135,132,136,131,130,133,132,135,134,131,133,130,136,132,131,133,134,135,130,136,132,133,135,134,131,136,137,132,135,133,131,134,136,138,132,139,137,135,136,133,138,134,69,137,139,140,136,131,135,138,137,133,134,140,139,69,137,138,136,135,139,140,137,138,136,139,135,137,138,140,139,137,136,138,140,131,141,137,139,140,136,138,142,141,139,137,140,142,138,131,139,140,141,137,142,138,143,140,139,141,142,143,140,139,144,146,131,145,141,143,142,147,140,131,146,144,145,139,148,143,141,142,147,146,148,145,149,130,144,143,139,140,141,131,146,147,145,142,148,130,139,143,140,144,146,149,131,141,148,147,145,130,142,146,139,149,148,140,143,144,147,145,131,151,148,150,130,139,146,147,143,144,145,131,147,140,146,130,151,148,150,139,147,145,144,131,130,146,143,152,151,153,147,144,150,148,145,130,131,146,143,152,144,147,153,154,155,130,151,148,131,145,150,146,143,147,144,145,152,153,154,155,130,151,147,148,143,145,146,150,131,144,153,147,154,158,156,152,130,157,155,144,147,151,148,158,153,154,145,146,150,157,152,156,155,144,154,151,153,157,145,158,148,152,150,155,156,146,151,153,145,150,154,144,157,155,152,158,151,156,148,145,154,150,157,144,153,146,155,152,159,158,151,145,156,154,157,155,152,153,157,150,154,159,148,144,151,145,156,158,155,146,157,153,154,152,156,157,155,158,159,151,150,152,160,153,155,112,156,157,158,152,159,160,153,156,157,152,158,159,152,156,153,160,157,158,152,160,159,156,157,153,158,160,159,157,156,153,161,158,160,159,162,160,163,161,159,163,160,162,161,161,163,162,164,160,161,161,163,165,162,160,164,161,165,166,162,163,169,160,168,167,164,170,165,161,163,169,166,162,160,164,167,171,168,170,165,161,163,169,166,171,164,160,167,162,165,168,161,170,163,164,169,171,166,162,165,167,168,161,170,171,169,164,166,168,162,165,167,163,170,171,166,169,164,168,172,167,173,165,166,170,167,171,164,169,168,173,172,165,170,166,167,169,171,164,168,165,173,166,169,170,172,167,165,166,168,171,167,169,173,164,170,172,169,167,168,170,166,173,171,167,169,166,172,168,173,167,171,168,167,171,172,166,173,169,168,174,172,169,173,172,169,173,174,172,173,169,172,173,175,176,173,172,176,175,176,175,177,174,178,176,175,177,178,175,176,177,178,176,175,177,178,177,178,175,176,177,178,176,175,177,178,175,176,177,175,178,177,176,175,179,178,174,177,178,179,174,177,178,179,180,169,179,181,177,180,182,179,181,182,180,179,177,179,182,180,181,183,184,180,182,18
1,183,184,181,180,182,183,184,181,182,180,183,179,184,181,182,177,180,179,183,184,181,182,178,180,179,183,184,181,182,180,178,179,185,183,184,112,185,169,181,182,180,185,183,184,186,112,181,185,70,111,112,182,184,69,186,111,183,177,185,69,112,70,111,112,187,69,186,70,184,188,112,178,185,183,111,69,177,186,70,112,188,111,187,69,178,184,183,186,185,112,111,188,187,70,177,183,188,112,69,111,186,178,184,188,187,112,185,186,111,70,112,178,69,187,188,186,177,178,185,188,187,186,188,186,187,188,186,189,190,188,187,186,189,190,188,191,192,187,193,189,190,186,188,194,193,191,192,195,190,189,193,187,194,188,191,195,192,190,193,189,194,188,191,195,192,190,193,189,191,194,195,190,193,196,192,195,190,191,189,193,194,195,196,186,190,192,193,186,189,196,191,195,194,190,192,197,193,186,196,191,193,195,198,194,199,190,189,197,192,186,196,200,195,198,199,193,191,192,197,194,186,200,198,196,201,195,199,186,197,200,198,202,201,196,186,199,197,198,202,200,201,196,169,202,200,197,199,198,201,169,196,201,202,198,199,200,197,169,201,196,201,198,200,202,197,199,203,198,200,204,199,197,202,203,200,204,203,198,197,200,202,204,203,204,202,200,197,203,198,204,202,202,203,204
3 | 0.22917,0.15278,0.15625,0.21528,0.12153,0.99306,0.81597,0.21528,0.10764,0.625,0.20139,0.11111,0.46875,0.19792,0.10764,0.37153,0.32292,0.20833,0.10417,0.23611,0.98264,0.98611,0.23264,0.0069444,0.17361,0.125,0.73264,0.71181,0.083333,0.21528,0.010417,0.48264,0.13194,0.43056,0.034722,0.28819,-0.090278,0.23264,0.21875,0.10764,0.10417,0.076389,0.22222,0.048611,-0.11111,-0.125,0.17361,0.11458,0.076389,-0.14931,-0.024306,-0.25694,-0.29167,0.055556,0.25347,0.14583,0.40625,-0.42014,-0.44792,-0.26389,0.23958,0.10069,0.35764,0.18056,0.12153,-0.15278,0.10417,0.37153,0.24306,0.23611,-0.13194,0.16667,0.33681,0.13889,-0.10069,0.33333,0.23958,0.22222,0.11111,0.14236,0.12153,0.33333,-0.0625,0.21528,0.3125,0.21875,0.125,0.23611,0.10764,-0.024306,0.23958,0.15972,0.34028,0.31597,0.17014,0.22917,0.31597,0.22917,0.083333,0.24306,-0.024306,0.28125,0.30208,0.090278,0.15625,0.21528,0.39931,-0.013889,0.36111,0.23264,0.29861,0.13194,-0.46528,0.54514,0.010417,0.23611,0.090278,0.03125,0.13194,0.32986,0.13194,-0.35764,-0.29514,0.73611,0.26389,-0.048611,0.34722,0.079861,0.052083,-0.0034722,-0.27778,0.37153,0.14583,-0.19792,0.25,0.10764,0.98958,0.31944,-0.11111,-0.0034722,0.34722,0.42708,-0.20139,-0.069444,-0.11806,0.11806,0.32292,-0.18403,0.19444,0.44444,-0.048611,0.34375,0.24653,-0.10764,-0.14583,-0.076389,0.21528,0.33681,-0.21181,0.16667,0.4375,-0.083333,0.37847,-0.09375,0.28472,-0.17361,-0.024306,0.47569,0.23958,-0.20486,-0.055556,-0.12847,-0.24653,-0.19792,0.03125,-0.13542,0.38194,-0.041667,-0.35417,-0.19444,0.045139,-0.13542,-0.041667,-0.23264,0.041667,-0.17708,-0.027778,-0.31597,-0.25347,0.020833,-0.045139,-0.052083,-0.43403,-0.15278,-0.125,-0.20486,-0.16319,0.25,-0.23611,0.23264,0.22569,0.21875,-0.96875,0.27778,0.39931,-0.82292,0.24653,0.39583,-0.32639,-0.77083,0.29514,-0.19444,0.41667,0.39931,-0.66319,0.19792,-0.23264,0.28125,0.38542,0.36806,-0.625,-0.12847,0.30903,0.46528,0.15625,0.29514,0.39931,-0.13889,-0.48611,-0.44792,0.35417,0.30208,-0.072917,0.45486,-0.11458,0.11111,-0.32292,0.38542,0.34722,-0.27778,0.26736,0.22917,-0.19792,0.47569,-0.048611,-0.125,0.24653,0.11458,-0.097222,0.35764,0.34028,-0.13194,0.23611,0.30903,0.38889,-0.017361,0.25,0.34375,0.12153,0.46181,-0.076389,-0.079861,-0.027778,0.39236,0.38542,0.28819,0.041667,0.17708,0.21528,0.14236,-0.069444,0.3125,-0.045139,0.034722,0.40972,0.46875,0.027778,0.36458,0.28819,-0.125,0.17708,0.15625,0.19097,-0.024306,0.19444,0.37847,0.27778,0.017361,-0.16667,0.090278,0.45833,0.3125,0.14583,-0.090278,0.36806,0.21528,0.17708,0.13542,-0.19097,0.29861,0.13542,0.065972,0.15278,0.34722,-0.10417,0.30556,0.38542,0.45486,0.21181,-0.19444,0.29167,0.13194,0.32292,0.32292,0.14931,0.19792,0.38542,-0.083333,0.090278,0.23264,0.26389,-0.17708,0.26389,0.27431,0.097222,0.13194,0.22917,0.17361,0.38542,0.14236,-0.038194,0.15625,0.33333,-0.14236,0.35764,0.034722,0.35069,0.27431,0.42361,-0.0034722,-0.43056,0.072917,0.15625,0.097222,0.21875,-0.017361,-0.09375,0.39583,0.041667,0.013889,0.44097,0.18056,0.29167,0.020833,-0.37153,0.017361,0.083333,-0.086806,0.23264,0.35417,0.41319,-0.14931,0.21181,-0.010417,0.28472,0.038194,0.045139,-0.125,0.22917,0.1875,0.28125,-0.19444,-0.20486,-0.23611,-0.097222,-0.052083,0.045139,-0.18056,-0.0034722,-0.079861,-0.15625,-0.12847,0.22222,-0.086806,0.020833,-0.18056,-0.22222,-0.076389,-0.020833,0.25694,-0.0625,-0.11458,-0.27431,-0.010417,-0.14236,0.26736,-0.076389,-0.13889,-0.31597,-0.069444,-0.17708,-0.083333,-0.125,-0.39583,0.32986,-0.052083,-0.23264,-0.21181,-0.097222,0.3125,-0.0034722,-0.28819,-0.34722,-0.14583,0.020833,0.30556,-0.041667,-0.076389,
-0.40278,0.017361,0.22569,-0.11458,0.29861,0.23611,0.065972,-0.03125,-0.11806,0.30208,0.0034722,0.36806,0.28125,0.072917,0.10764,0.31944,-0.048611,0.19792,0.28472,0.375,0.034722,0.25694,0.39583,0.30556,0.41319,0.13194,0.375,0.45139,0.31944,0.46528,0.35417,0.20833,0.39236,0.27431,0.34028,0.17361,-0.125,0.20833,0.40972,0.44792,0.29514,0.14583,0.29167,-0.055556,0.17708,0.39931,0.28125,0.11806,0.41667,0.28472,0.0034722,0.27778,0.13889,0.065972,0.41667,0.28472,0.28819,0.041667,0.083333,0.47222,0.034722,0.17014,0.32986,0.14583,0.23264,0.076389,0.065972,0.16667,0.39931,0.020833,0.14236,0.47569,0.12847,0.12153,0.22569,0.48958,0.069444,0,0.15278,0.027778,0.25,0.23611,0.11806,0.17708,-0.16319,0.038194,0.052083,0.24306,0.13889,-0.37847,0.14236,0.041667,0.17708,0.23611,0.069444,0.24306,0.11458,-0.59028,0.40972,0.18056,0.15625,0.21181,0.055556,0.090278,0.40972,0.22222,-0.80903,0.12847,0.097222,0.26736,0.1875,0.065972,0.19792,0.41319,0.072917,0.10764,0.20486,0.11111,0.25694,0.16319,0.048611,0.13889,0.38542,0.09375,0.48264,0.24306,0.076389,-0.013889,0.125,0.10069,0.375,0.25347,0.30208,0.052083,0.10764,-0.034722,0.26389,0.079861,0.069444,0.31597,0.017361,0.37847,-0.010417,0.09375,0.26042,0.083333,0.30903,0.36806,0.024306,0.21875,0.47222,0.09375,0.3125,0.3125,0.034722,0.42708,0.16667,0.11806,0.055556,0.30556,0.35417,0.48958,0.40278,0.13889,0.29861,0.29167,0.14236,0.065972,0.49306,-0.034722,-0.048611,0.57639,-0.15278,-0.20139,-0.30208,-0.45486,-0.80556,-0.20486,-0.23958,0.63194,-0.13889,0.80208,-0.069444,0.16667,0.23264,0.99306,-0.045139,0.12847,0.20833,-0.013889,-0.010417,0.12847,0.072917,0.16319,-0.0034722,0.0069444,0.13542,0.010417,0.10069,0.024306,-0.059028,0.17361,-0.045139,0.034722,0.0625,-0.072917,-0.097222,0.21181,-0.027778,-0.125,0.076389,-0.083333,-0.17014,0.20139,-0.25347,0.097222,-0.0034722,-0.11806,-0.22917,-0.13194,-0.27083,0.19792,0.038194,-0.045139,-0.19792,0.03125,0.086806,-0.29167,-0.045139,0.19444,-0.29861,-0.38542,0.052083,-0.027778,0.055556,-0.017361,0.11111,0.25694,-0.48611,0.21875,0.055556,0.072917,-0.013889,0.38194,-0.024306,0.13542,0.30208,0.15278,0.20486,0.10764,0.079861,0.069444,0.39236,0.034722,0.22222,0.30903,0.072917,-0.0069444,0.1875,-0.28472,0.11458,0.20139,0.36806,0.29861,0.12847,0.0069444,0.0625,0.03125,0.20833,0.35764,0.20833,0.14583,-0.052083,0.097222,0.0625,0.23958,-0.069444,0.28472,0.34375,0.18403,0.16667,0.10069,0.25694,-0.086806,-0.27083,0.35069,0.3125,0.1875,0.22569,0.26736,-0.10069,0.11111,0.21181,0.26736,0.43403,0.35417,-0.16667,0.42708,-0.23264,0.37847,-0.32986,0.37847,-0.47222,0.36458,0.38542,0.23264,0.15972,0.10417,0.10417,0.13889,0.15972,0.26389,0.20139,0.12847,0.24306,0.27083,0.079861,0.17361,0.29861,0.086806,0.076389,0.15278,0.3125,0.14236,0.17361,0.079861,0.20139,0.17361,0.079861,0.25694,0.22917,0.31597,0.14236,0.069444,0.21181,0.13542,0.065972,0.32986,0.17014,-0.27431,0.34375,0.097222,0.1875,0.34722,0.15278,0.32292,-0.26736,0.29861,0.36111,0.31944,0.29861,0.37153,0.17708,0.30556,0.39931,0.35069,0.30556,-0.28125,0.17361,0.38889,0.30556,0.16667,0.15972,0.25,0.39583,0.29514,0.13889,0.23264,0.20833,0.23611,0.11111,0.31597,0.22222,0.18403,0.27431,0.20833,0.31944,0.43056,0.076389,0.30903,0.22569,0.15278,0.21181,0.40625,0.32292,0.34722,0.090278,0.43403,0.97222,0.29514,0.052083,0.24653,0.25347,0.13542,0.98611,0.98264,0.36111,0.77083,0.26736,0.22569,0.98611,0.29167,0.024306,0.44792,0.80208,0.73958,0.11111,0.73264,0.58681,0.36111,0.26736,0.1875,0.54167,0.64931,0.27083,0.013889,0.090278,0.5,0.4375,0.40972,0.29861,0.26736,0.43403,0.48611,0.35764,0.048611,0.33681,0.17014
,0.33333,0.013889,-0.26042,0.30556,0.28472,0.28125,0.40972,0.37153,0.38542,0.3125,-0.0069444,0.44792,0.29167,0.33333,0.21528,0.38889,0.059028,0.29167,-0.041667,0.32292,0.27431,0.37847,0.38889,0.38889,0.15972,0.45486,0.25694,0.10417,0.31944,-0.027778,0.26389,0.38889,0.375,0.375,0.11806,0.27778,-0.010417,0.24653,0.20139,0.44792,0.40278,0.16667,0.0034722,0.34375,-0.27431,0.18403,0.20833,0.38889,0.013889,0.21528,0.11111,-0.23611,0.40972,0.33681,0.024306,0.21875,0.090278,-0.20139,0.43056,0.1875,0.28472,-0.29167,0.39583,0.11458,0.25694,-0.14931,0.1875,0.28125,0.375,0.15278,0.26389,0.28819,0.44792,-0.14236,0.35069,0.052083,0.19792,0.23611,0.25347,0.36458,0.34375,0.19792,0.51736,0.32639,-0.11806,0.39236,0.26389,0.11806,0.25,0.42014,0.33681,0.4375,0.38889,0.23611,0.24306,0.32639,0.29514,0.33681,0.20486,-0.038194,0.34375,0.29167,0.31944,0.40278,0.26042,0.26389,0.29861,0.22569,0.34375,0.45139,0.39931,0.42361,0.038194,0.27431,0.42014,0.23958,0.25,0.23611,0.39583,0.35764,0.29514,0.15278,0.086806,0.31944,0.15972,0.41319,-0.25347,0.21528,0.35069,0.43403,0.12847,0.32639,0.125,0.14583,0.0625,-0.23611,0.39931,0.20833,0.48264,0.32292,0.30556,0.42708,0.027778,0.11458,0.21875,-0.27083,0.39236,-0.024306,0.19444,-0.29514,0.28472,0.28472,0.39583,-0.038194,0.26042,0.19444,-0.069444,0.22569,-0.30903,0.36111,0.13889,0.34722,-0.072917,0.23611,0.25,-0.09375,0.36111,0.32292,-0.072917,0.20833,-0.11458,0.23611,0.24306,0.29861,0.35069,-0.086806,0.18403,0.91319,-0.16667,0.22569,0.97569,0.69792,-0.125,0.17014,0.25694,-0.24653,0.30556,0.70139,0.19097,0.44792,0.27778,0.16319,-0.23264,0.18403,-0.22569,-0.48958,0.27778,0.44792,-0.34028,0.25,0.18403,-0.32986,-0.13194,-0.44792,0.45139,0.090278,0.26389,-0.39236,-0.076389,-0.25347,0.26389,-0.059028,0.15625,0.19097,-0.30903,0.22917,-0.045139,-0.24306,0.13542,-0.16319,-0.21181,0.21528,-0.40278,-0.027778,0.079861,-0.034722,-0.11458,0.25,0.052083,0.33333,0.14931,-0.083333,0.27083,0.375,-0.069444,0.072917,0.079861,0.34375,0.21528,0.37153,-0.086806,0.18403,-0.020833,0.020833,-0.12847,0.32986,0.17014,0.375,-0.22917,-0.027778,0.17708,-0.18056,-0.065972,-0.14931,-0.0069444,0.48264,0.35069,-0.21528,0.38889,0.024306,-0.19097,-0.16319,-0.083333,-0.09375,0.045139,-0.11111,-0.125,-0.16319,-0.017361,0.35069,0.37847,0.53125,-0.052083,0.076389,-0.065972,0.51389,0.64931,-0.048611,0.048611,-0.13194,-0.020833,0.35764,0.30903,0.37153,0.017361,-0.034722,-0.097222,0.0069444,0.37153,-0.11806,0.03125,0.10764,0.27431,0.99653,0.32986,-0.038194,-0.079861,0.40972,0.1875,0.03125,-0.076389,0.23611,0.33333,0.38194,0.27778,0.40625,0.28819,-0.065972,0.16319,0.39931,-0.10069,0.36458,0.40972,0,0.25347,0.35069,0.25,0.20139,0.14236,0.23958,0.39236,-0.10764,0.35417,0.37847,-0.39236,0.35417,0.18403,0.28125,-0.052083,-0.125,0.35069,0.26042,0.059028,-0.30208,0.29514,0.34375,0.24653,-0.12847,0.39236,0.020833,0.3125,0.32292,0.31597,0.086806,-0.19792,-0.14236,-0.055556,0.36458,0.43056,0.27431,0.36458,0.0069444,0.03125,0.33681,-0.11111,0.32292,0.41667,-0.12847,0.38194,0.11806,0.30903,0.22917,0.059028,-0.13889,0.0069444,-0.079861,0.43056,0.35764,0.18056,0.36111,-0.048611,-0.12153,0.125,0.11111,0.37153,0.32639,0.072917,0.4375,0.23611,-0.083333,0.34722,-0.11806,0.38194,-0.15625,-0.0034722,0.18056,0.42014,0.17361,0.086806,0.3125,-0.038194,0.29167,0.32639,-0.125,-0.15972,0.39583,0.38194,0.27431,-0.11111,0.31597,-0.47917,0.11458,-0.0625,-0.15278,-0.041667,0.27778,0.33681,0.29861,-0.17014,-0.16319,0.36458,0.33681,-0.375,-0.11806,-0.090278,0.37847,0.16667,0.29514,-0.21528,0.40625,0.25347,0.37153,-0.065972,-0.25694,0.34375,0.46528,0.295
14,-0.17361,-0.038194,0.29167,0.37153,0.52083,0.26042,0.39583,-0.010417,-0.076389,0.34722,0.49306,0.27431,-0.0625,0.42014,-0.10764,0.1875,-0.03125,-0.46181,-0.15972,-0.065972,0.16319,0.42361,-0.35417,0.013889,-0.10069,-0.0625,0.14236,-0.27083,-0.0625,-0.21875,0.076389,-0.083333,-0.25,0.14931,-0.090278,-0.1875,-0.055556,0.11111,-0.10417,-0.25694,0.20833,-0.1875,-0.09375,0.024306,0.12847,0.36458,-0.26042,-0.18056,-0.072917,0.31597,0.34722,0.21875,-0.22569,-0.13194,-0.15625,0.25,0.09375,0.32639,-0.19792,-0.1875,0.20833,-0.11806,-0.30903,0.27778,-0.28125,-0.12847,0.17708,-0.47569,-0.12847,-0.055556,0.21181,-0.17708,-0.072917,0.125,0.17014,-0.013889,-0.12847,0.020833,0.083333,-0.027778,0.059028,0.18403,0.10417,0.048611,-0.09375,0.013889,-0.0034722,0.027778,0.17361,0.12847,0.072917,-0.013889,-0.010417,-0.0625,0.12153,0.14236,0.12153,0.038194,-0.42708,0.10069,-0.097222,-0.041667,0.12153,0.26736,-0.42708,-0.23264,0.19792,-0.090278,0.041667,0.14583,0.069444,-0.42361,0.16319,-0.0069444,-0.30556,-0.47222,-0.041667,-0.26042,0.17014,0.013889,0.048611,-0.44792,0.041667,-0.54861,-0.36806,0.23958,0.19792,-0.66667,-0.61806,0.086806,0.020833,-0.038194,0.076389,-0.52083,0.26042,0.28125,-0.82292,-0.77778,0.29167,-0.013889,0.27431,-0.68403,0.39236,0.10764,0.36806,-0.98958,0.26736,0.03125,-1,-0.85069,0.20139,0.31944,0.18403,0.19792,-1,0.076389,0.13194,0.052083,0.22569,0.25,-0.069444,0.14583,0.083333,0.18056,0.31944,-0.10764,0.21528,0.097222,0.20486,-0.097222,-0.076389,0.28125,0.42014,0.15972,0.29861,-0.059028,0.35069,0.24306,-0.37847,-0.09375,-0.19444,0.40972,-0.26736,-0.10764,-0.19444,0.29514,0.44097,0.38542,-0.20486,-0.055556,-0.14931,-0.045139,-0.12847,-0.013889,-0.10417,0.017361,-0.069444,0.16319,0.027778,0.15972,0.27083,-0.055556,-0.0034722,0.13194,-0.010417,-0.059028,0.21875,0.097222,0.15278,0.16319,0.27083,0.010417,-0.069444,0.18056,0.079861,0.18403,0.11458,-0.14931,0.013889,-0.069444,0.14931,0.14583,0.15625,0.055556,0.017361,0.10764,-0.052083,-0.069444,0.17014,0.020833,0.11111,0.052083,0.076389,-0.09375,-0.024306,0.17708,0.14583,0.076389,0.079861,-0.017361,-0.027778,0.36806,0.20833,0.14236,0.20139,-0.017361,0.125,-0.086806,0.18056,0.26389,-0.78125,0.32292,0.1875,-0.14931,0.22222,0.16667,0.15972,0.28819,0.25347,-0.61111,-0.78472,-0.14583,-0.0069444,0.1875,0.13889,0.21528,0.22917,0.16667,-0.60764,-0.41319,0.25,0.17708,0.11806,-0.10417,0.18056,-0.24653,-0.39931,0.13542,0.079861,-0.010417,-0.12153,0.23264,0.16667,0.11111,-0.1875,-0.0034722,0.25347,0.034722,0.17361,-0.065972,0.0034722,0.29167,0.34375,0.072917,-0.038194,0.055556,0.22569,0.19444,0.25,0.10764,0.38194,-0.10764,0.14931,0.28472,-0.034722,0.055556,-0.17708,0.20486,0.38889,0.097222,0.3125,-0.14236,-0.15972,-0.03125,0.16319,0.09375,-0.059028,-0.12847,-0.079861,0.33333,-0.15972,-0.0034722,0.19097,0.23264,-0.013889,0.11806,0.39236,-0.14583,0.027778,-0.13194,0.30208,0.17014,-0.097222,-0.13542,-0.010417,0.29167,0.15278,0.37847,-0.090278,-0.072917,0.125,-0.39931,-0.079861,0.26389,-0.10764,-0.11111,-0.1875,0.3125,0.048611,-0.069444,0.34028,0.10417,0.15972,-0.013889,-0.034722,-0.12153,-0.15625,-0.1875,0.16319,-0.045139,-0.31597,0.0625,0.31597,0.010417,0.26042,0.045139,-0.020833,0.18056,-0.052083,-0.17014,-0.21875,0.027778,-0.25347,-0.18403,0.11806,0.20139,0.027778,0.090278,0.30903,0.0069444,0.20833,0.0069444,-0.23611,-0.055556,0.16319,-0.17361,0.11806,0.027778,0.17361,0.20139,-0.43403,-0.052083,0.079861,0.24653,-0.024306,0.17014,-0.32986,0.21528,0.03125,0.10764,0.20833,0.10764,-0.079861,-0.21181,0.39931,0.21181,0.29167,0.24306,0.11806,0.13542,-0.065972,0.041
667,0.086806,0.16319,-0.11806,-0.25347,0.38889,0.14583,0.21528,0.27431,0.28472,0.32639,0.048611,0.15972,-0.052083,0.13194,0.086806,0.097222,-0.14931,-0.30208,0.16319,0.16319,0.083333,0.35417,0.27083,0.25,0.35069,0.045139,0.14583,0.18403,-0.0625,-0.44097,0.083333,-0.16667,0.059028,0.20139,0.15972,0.25347,0.23611,0.18056,0.30208,0.17014,0.34028,0.083333,0.32292,0.40625,0.17014,0.32292,0.125,-0.03125,0.24653,0.21875,0.11806,0.055556,-0.16667,0.034722,0.28819,0.3125,0.13194,0.44097,0.16667,0.10417,0.090278,0.22569,0.24306,0.055556,0.22917,-0.034722,0.29861,-0.038194,0.50347,0.10417,-0.14931,0.038194,0.19097,0.09375,-0.041667,0.12153,0.17708,0.22222,0.54514,0.22917,0.17014,0.017361,0.045139,-0.017361,0.076389,0.21528,-0.03125,0.15972,0.18056,0.125,-0.13542,0.54167,0.15972,0.27431,0.076389,0.041667,0.097222,-0.052083,0.31597,0.076389,0.47917,0.14236,0.086806,0.041667,0.020833,0.37153,0.28819,0.0069444,0.21528,0.10764,0.12847,-0.083333,0.010417,0.45139,-0.090278,0.045139,0.079861,0.38194,0.17014,-0.090278,0.069444,0.42014,0.024306,0.21875,0.20139,0.079861,0.18403,0.29514,0.038194,0.39931,0.36806,-0.0625,0.11111,0.065972,0.12153,0.16667,0.26042,-0.024306,-0.020833,0.14236,0.083333,0.10417,0.10417,0.027778,0.027778,-0.11111,0.21528,0.17014,0.14931,-0.045139,0.13542,0.12153,0.097222,0.23958,-0.25694,0.22569,0.059028,0.16319,0.30903,0.1875,-0.38542,-0.80903,0.28472,-0.038194,0.20486,0.38194,-0.090278,0.30903,-0.47569,0.23264,0.27778,-0.14931,0.34028,-0.3125,-0.23611,0.20833,0.26042,0.13889,-0.18056,-0.18056,-0.14931,0.12847,0.64236,0.19444,-0.19097,0.11458,-0.13542,0.38194,-0.97222,0.1875,0.097222,0.4375,-0.26389,-0.15972,-0.79167,0.013889,0.20139,0.25694,-0.10764,0.12847,0.3125,-0.79514,0.27431,-0.26736,-0.045139,-0.60417,-0.097222,-0.097222,0.11458,0.11806,-0.17708,0.20833,0.17014,-0.61111,-0.013889,-0.10764,-0.25694,-0.40278,0.33333,0.076389,-0.055556,-0.24653,0.069444,0.23264,-0.11458,0.03125,0.065972,-0.41667,0.31944,-0.0069444,-0.14583,-0.027778,-0.41319,0.076389,0.10069,-0.013889,-0.11806,-0.26389,-0.038194,0.32986,-0.083333,0.12153,0.21181,0.10764,0.052083,-0.083333,-0.097222,-0.12153,-0.11111,-0.0625,0.36458,0.048611,-0.14236,0.041667,0.16667,0.45833,0.0034722,-0.11806,-0.1875,-0.17014,0.12847,-0.0034722,-0.20833,0.19097,0.26042,0.49653,-0.21875,-0.065972,-0.10417,-0.18056,-0.27778,0.1875,0.43403,-0.24653,0.14931,0.027778,0.49653,-0.34722,0.47222,-0.17361,-0.083333,0.52778,-0.28472,0.16319,0.017361,0.16319,-0.010417,-0.26389,0.52431,-0.1875,0.18403,0.52431,0.12153,-0.079861,0.61458,-0.28472,0.66667,-0.0034722,-0.1875,0.12153,0.75,-0.11806,-0.25,0.97569,-0.22569,0.03125,0.98264,0.17014,-0.29167,-0.39583,-0.069444,0.038194,-0.24306,0.19444,0.020833,-0.25,0.19097,-0.013889,0.059028,0.18056,-0.24306,0.10417,0.23611,0.28472,0.20486,0.22569,0.13542,0.23958,0.32986,0.21181,0.29167,0.40278,-0.013889,0.36111,0.125,0.20486,0.36111,0.28819,0.16319,0.076389,0.31597,0.25,0.059028,0.125,0.28472,0.24653,0.29167,0.23611,0.13194,0.024306,0.23611,0.16667,-0.065972,0.090278,0.16667,0.097222,-0.0069444,-0.14931,0.14931,-0.0034722,0.038194,0.14236,-0.072917,0.048611,-0.46528,0.0625,0.017361,0.14931,0.072917,-0.375,0.079861,0.15972,0.09375,-0.29167,0.40278,-0.22917,-0.25347,0.40625,0.17014,0.39583,0.38889,-0.19444,0.39931,0.38194,0.33681,-0.12847,0.16667,-0.076389,0.39583,0.22222,0.37153,0.045139,-0.048611,0.14236,0.33681,0.29167,0.010417,-0.065972,0.19792,0.14583,0.19792,0.041667,-0.055556,0.15972,0.15625,0.18403,0.065972,-0.048611,-0.020833,0.17361,0.16319,0.19792,0.15278,0.038194,0.052083,-0.0625,0.19097,0.1805
6,0.11458,0.10764,0.15625,-0.0069444,-0.0625,0.16667,0.14931,0.076389,0.11458,0.24306,0.017361,-0.024306,-0.10417,0.36806,-0.020833,-0.25694,0.14236,0.13889,0.065972,-0.052083,-0.034722,-0.097222,-0.31597,0.36806,0.11806,-0.048611,0.27083,0.38542,0.36806,0.13889,-0.10764,0.24653,-0.25347,0.38542,-0.034722,0.21181,-0.034722,0.23264,0.28472,0.32986,0.33333,0.20486,0.28125,0.25694,-0.17708,0.38194,-0.10417,0.28472,0.18403,0.12847,-0.03125,-0.038194,0.32292,0.27431,0.20486,-0.10069,0.40278,0.18403,0.26042,0.27083,0.25694,0.28125,0.13194,-0.14931,-0.11458,-0.0625,-0.0069444,0.14236,0.1875,0.26736,0.30556,0.42361,0.20486,-0.21875,0.28472,0.083333,0.29167,0.17708,-0.09375,0.14931,-0.27778,0.29514,0.34722,0.076389,0.038194,-0.14931,0.18403,0.44097,0.10764,0.19097,0.3125,0.36806,0.30903,-0.23264,0.25347,0.18403,0.0625,0.31944,0.36458,-0.25347,0.3125,-0.25,0.32639,0.27778,-0.26389,0.14236,0.048611,0.27778,0.31944,-0.27778,0.12847,0.055556,0.27778,0.25694,0.18056,0.35417,-0.27778,0.13889,0.055556,-0.30556,0.29861,0.33333,-0.14931,0.23958,0.18056,0.20486,0.090278,0.15278,-0.072917,0.38194,0.36806,0.32639,0.20486,0.22222,0.12153,0.12153,0.034722,0.22222,0.40278,0.32292,0.15625,0.27083,0.048611,0.17708,0.14236,0.25347,0.097222,0.40625,0.35764,0.19792,0.22569,0.24306,-0.024306,0.36806,0.1875,0.076389,0.25694,0.26736,0.36806,0.33681,0.17014,-0.29167,0.17708,-0.034722,0.37847,-0.27083,0.32639,0.12153,0.072917,0.31944,0.34722,0.23264,0.027778,-0.21528,0.45486,-0.21181,0.12153,0.11111,0.47222,0.29514,-0.23264,0.37847,0.24653,0.22222,0.33681,-0.14931,0.052083,-0.10417,0.14931,-0.20486,0.30556,-0.1875,0.24306,0.47569,0.14236,0.079861,-0.17708,0.42014,0.03125,-0.12847,-0.21528,0.16667,0.80208,0.31944,0.23611,0.14583,-0.18403,-0.10417,-0.23611,-0.24306,0.46181,0.16319,0.25,0.26389,-0.15625,-0.22917,-0.13889,-0.10764,0.21181,0.19792,-0.28125,-0.083333,-0.052083,-0.12153,0.21875,-0.1875,-0.059028,-0.27778,0.26389,-0.20139,-0.034722,-0.12153,0.22222,-0.0069444,-0.059028,-0.32639,-0.28819,0.31944,-0.32639,-0.065972,0.069444,0.045139,0.010417,0.25694,0.20833,-0.010417,0.13542,0.43056,0.30556,0.059028,0.065972,0.21528,0.14583,0.45486,0.24653,0.024306,0.097222,0.15972,0.076389,0.46181,0.23611,0.44792,0.10764,0.21528,0.15972,0.20833,0.083333,0.42361,0.14931,0.17361,0.18403,0.38542
4 | 0.99444,0.99444,0.86389,0.74444,0.675,-0.47222,-0.39444,0.45,0.50556,-0.26389,0.35833,0.35556,-0.11667,0.25278,0.19167,0.022222,0.14722,0.036111,-0.063889,0.34722,-0.033333,-0.11667,-0.19444,-0.98889,0.59444,-0.33333,-0.13333,-0.24167,0.76111,-0.425,-0.79722,-0.25556,-0.53333,-0.4,-0.56111,-0.36389,0.99722,-0.63611,-0.54722,-0.73611,-0.53333,-0.325,-0.84722,-0.70833,-0.99167,-0.71667,-1,-0.011111,-1.0139,-0.86389,-0.90278,-0.85556,-0.97222,-0.76389,-0.98889,0.32222,-0.98889,-1.0028,-0.96667,-0.99167,-0.80278,-0.58611,-0.75833,-0.99444,0.68889,-0.90278,-0.41389,-0.98056,-0.99444,-0.58333,-0.78889,-0.82222,-0.52778,0.99444,-0.66944,-0.77778,-0.40556,-0.76389,-0.16944,-0.60833,-0.058333,-0.26944,-0.45833,-0.53611,-0.54444,0.99444,0.016667,-0.11389,-0.33889,-0.23056,-0.31667,0.78333,0.022222,-0.3,-0.0055556,0.98889,0.98333,0.11111,-0.15556,-0.19722,-0.080556,0.022222,-0.16389,0.52778,0.81667,0.78056,0.088889,0.025,0.30833,0,-0.044444,0.077778,-0.97778,0.18611,0.31111,0.44722,0.59167,0.10556,0.56389,0.091667,0.24722,-0.94444,-0.99444,0.30556,0.21944,0.10833,0.60833,0.20278,0.4,0.39167,-0.88889,0.27222,0.46944,-0.89444,0.74167,0.325,0.48611,0.45556,-0.091667,0.225,0.83611,0.46667,-0.775,0.19722,-0.73889,0.51111,0.69722,-0.30278,0.76111,0.65278,0.055556,1.0028,0.99167,0.0027778,-0.60278,-0.56944,0.89722,0.9,-0.51667,0.69722,0.87222,-0.13333,1,-0.43056,0.99722,-0.21944,-0.40278,0.99722,0.85,-0.74167,-0.31667,-0.32778,-0.89444,-0.43889,-0.15278,-0.46389,0.99722,-0.097222,-1.0056,-0.64444,0.061111,-0.66111,0.10556,-0.825,0.26111,-0.86389,0.3,-0.95278,-0.99167,0.48889,0.51111,0.67222,-0.98056,0.69722,0.875,0.85,1,-0.98333,0.99444,-0.74444,-0.37778,0.075,0.74167,0.45833,-0.98889,0.74722,-0.99167,-0.75,-0.99167,0.80556,0.78056,-0.98611,0.99167,-0.98056,0.725,-0.78056,-0.91389,-0.98611,-0.43056,1,0.79722,-0.88333,1,0.78056,-0.60833,-0.78056,-0.74444,-0.78889,0.68333,0.76667,0.74722,-0.10556,-0.74722,0.51667,-0.66667,-0.40278,0.65833,-0.48056,0.51389,0.73333,-0.46389,0.17778,0.625,0.23889,-0.53889,0.66667,0.35556,-0.22778,-0.49444,0.25556,-0.16944,0.57222,-0.12778,0.54167,0.10833,0.58889,-0.98889,-0.99167,-0.061111,-0.027778,-0.32222,0.50278,-0.33333,0.0055556,0.75278,0.125,0.46111,0.18611,-0.75833,0.075,0.36111,-0.73333,-0.17778,0.31111,1.0028,-0.35,-0.15556,-0.24722,0.38333,0.20278,0.41667,0.2,-0.53333,0.14167,-0.98889,-0.43611,-0.45556,0.013889,0.033333,0.0055556,-0.66944,0.7,0.37222,-0.038889,-0.65556,0.70556,-0.24167,-0.80278,-0.14722,-0.17778,0.175,0.19167,0.56667,-0.86389,-0.23333,0.85833,1,-1.0028,0.016667,-0.36389,0.99167,-0.58333,-1.0028,0.11389,0.31667,0.68889,0.99722,-0.45,0.38611,0.85833,0.97778,-0.58333,0.79167,0.26944,0.98889,-0.29444,0.47222,0.7,0.39167,0.82778,-0.69722,0.525,0.92222,-0.75,0.46389,0.79167,1.0056,0.58611,0.58056,0.98333,-0.97222,0.63611,-0.086111,0.54167,0.65,0.6,-1,0.72778,0.53333,-1.0111,0.8,-0.030556,0.76111,0.78333,-0.95,0.43333,0.36944,0.31389,0.85,0.99167,0.98889,-0.99167,0.061111,0.29444,0.91667,0.57222,0.16667,0.058333,0.23611,1,1.0056,-0.99167,0.99444,-0.93056,-0.94722,0.091667,0.33889,-0.19722,-0.055556,-0.91944,0.81389,-0.95833,0.50278,-0.1,0.11944,-0.90833,-0.48611,0.62778,-0.26667,0.75556,0.41389,-0.33889,-0.75278,-0.17778,-0.86389,1.0028,-0.48889,-0.92778,-0.93611,0.15278,-0.57778,-0.4,-0.71389,-1.0083,-0.98056,-0.58889,-0.75556,-0.87222,-0.16667,-0.65833,-0.78889,-0.91111,-1,-0.46667,-0.99167,-0.30278,-0.89444,-0.65,-0.96667,-0.83056,-0.10833,-0.83056,-1,-0.0055556,-1,-0.79722,-0.86667,-0.76667,-0.88889,0.97778,0.98889,-0.81667,-0.90556,-0.5,-0.83889,-0.8
8056,0.73333,0.675,-0.91111,-1.0028,-0.21389,0.54444,-1.0083,-1,0.375,0.0083333,0.30278,0.26389,0.088889,0.99167,0.14722,0.05,-0.036111,0.99167,0.98889,0.85556,-0.030556,0.56111,-0.19444,0.82778,0.18611,0.9,0.65,-0.26667,-0.3,0.65556,1,0.35278,0.8,-0.43889,0.46667,0.5,-0.50278,-0.56389,0.54722,0.66111,0.99167,-0.58056,0.32222,0.98611,-0.60556,0.23889,0.67778,0.96111,0.51667,0.9,-0.68333,0.12778,0.75,-0.63056,0.925,0.0083333,0.83333,-0.73889,0.33889,0.79722,0.84444,-0.094444,0.71389,0.60278,-0.24167,0.68333,0.81944,0.13889,-0.35278,0.44444,0.48056,0.83611,-0.47222,-0.044444,0.98889,0.99167,-0.60556,0.23889,0.27778,0.82222,0.99167,-0.67778,0.81389,0.76111,-0.24722,0.14722,0.80278,-0.013889,0.825,0.6,0,0.98333,0.50556,-0.99444,-1.0028,0.58889,-0.475,-0.21389,-0.21389,0.39167,0.78333,0.23889,-0.43611,-0.43056,0.26667,-0.73889,-0.67778,0.50556,0.10556,-0.60833,-0.011111,-0.63333,0.013889,-0.98056,0.27778,-0.10833,-1.0056,-0.76389,-0.79167,-0.26389,-0.83056,0.030556,-0.35278,-0.30278,-0.99722,-1.0056,-0.52778,-0.51389,-0.21667,-0.50278,-0.60278,-0.30278,-0.775,-0.77222,-0.50278,-0.75833,-0.84167,-0.75833,-0.1,-0.99444,-1.0083,-0.75278,-1,-0.72778,-0.67222,0.18611,-0.64444,-0.99444,-0.61389,0.40556,-0.79444,0.53056,-0.60556,-0.875,0.60833,-0.6,0.68333,0.75556,0.80833,0.98889,-0.58889,-0.89722,0.74167,-0.91111,0.49722,0.98611,0.98333,-0.91944,0.29444,0.75556,0.76389,-0.0027778,-0.99444,-0.98611,0.50556,0.48611,-0.30833,-0.83333,-0.8,0.25278,0.22222,-0.70278,-0.61944,-0.61111,-0.0027778,-0.047222,-0.55556,-0.83333,-1,-0.41667,-0.26111,-0.26667,-0.34722,-0.48056,-0.49167,-0.19167,-0.55833,-0.1,-0.99722,-0.99167,-0.68333,-0.72222,-0.52778,0.027778,-0.85278,-0.98889,-0.90833,-0.99167,0.15833,-0.88056,-0.77222,0.23056,-1.0083,-0.95278,-0.60833,-0.78889,-0.81389,-0.55556,0.39722,0.40833,-0.975,-0.99444,-0.38611,-0.64167,-0.32778,-0.98333,-0.50278,0.99444,0.59167,0.63333,-0.74722,0.84167,-0.38611,-0.072222,-0.70833,-0.063889,0.84444,0.81944,0.68611,-0.21667,-0.41667,-0.56667,-0.10278,1.0028,-0.43056,1.0056,0.21944,0.45556,0.23889,0.022222,-0.11944,-0.11111,0.44167,0.2,0.23333,0.50556,0.29722,0.18333,0.041667,0.69444,0.26944,0.48056,0.76944,0.60556,0.53056,-0.22222,-0.61111,0.55278,1.0056,0.75833,1.0056,0.79722,-0.45278,1.0028,0.98611,1.0167,0.98889,1,-0.69167,0.75278,-0.84444,0.33611,-0.95556,-0.038889,-0.98889,-0.54722,-1.0028,-0.99722,-0.8,-0.58333,-0.36389,-0.13889,0.99167,0.99722,0.041667,0.75556,0.76667,0.23333,0.52222,0.52222,0.42222,-0.99167,0.26944,0.26667,0.65833,-0.74722,0.058333,0,-0.48889,-0.125,-0.17222,0.99722,-0.34722,-0.99722,-0.21944,-0.23056,-0.27778,-0.18056,-0.25833,-0.77778,-0.24722,-0.63889,-0.50556,-0.17778,-0.225,-0.18889,-0.25278,0.10556,-0.58333,-0.99444,-0.98889,0.40556,-0.78611,-0.76667,-0.21389,-0.50556,-0.49722,0.99444,-0.21389,-0.63889,0.99167,-0.025,0.072222,0.75,0.45556,0.99444,0.40278,0.41111,0.98611,0.85278,0.98889,0.22222,0.79722,0.64167,0.68611,0.82222,0.10556,0.99444,-0.99444,-0.98889,0.58333,-0.088889,0.48889,0.58611,0.82778,1.0028,1.0056,-0.81111,-0.23056,-0.78611,0.011111,-0.31944,0.38056,0.63889,0.27778,0.36944,0.16944,0.080556,-0.61111,0.11111,0.43889,-0.54722,0.46667,0.13889,0.18056,-0.56944,0.28333,0.094444,0.18056,0.53611,0.23889,-0.425,0.25278,-0.73056,0.175,0.36667,0.0083333,-0.011111,0.033333,0.58056,-0.31944,0.26944,0.066667,-0.083333,0.39167,0.47778,-0.20556,-0.072222,0.65833,-0.99722,0.37222,-0.21389,-0.69444,-0.05,-0.20833,0.73611,0.58611,0.53333,-0.05,0.49722,-0.17778,-0.069444,-0.17222,-0.30556,0.84167,0.70833,-0.40556,0.63611,-0.31944,0.68611,-0.3027
8,0.18611,-0.41944,0.83889,1.0028,0.24722,0.76944,-0.55833,0.82778,-0.48611,-0.51389,-0.55278,0.44722,1.0111,-0.66389,0.99722,-0.59167,-0.70278,1.0028,0.58056,-0.65,-0.70278,-0.675,0.71111,-0.66389,-0.73056,-1,-0.71667,-0.75278,0.99167,0.98889,-0.99722,1.0028,1.0083,-0.8,0.79444,0.77778,-0.88056,-0.75,-0.77778,-0.99444,-0.61944,-0.99444,0.54722,0.54444,-0.75556,-0.84167,-0.83611,-0.88889,0.32778,-0.98333,0.30556,-0.79722,-0.63056,-0.99167,-0.85278,-0.90556,-0.86389,-0.66111,-0.74722,-0.89722,0.12778,1,0.075,-0.53333,0.98611,-0.95,-0.91667,-0.74167,0.86111,-0.78056,-0.85278,-0.58611,-0.52222,-0.072222,-0.13056,0.83611,-0.96389,-0.96667,-0.46111,-0.66944,-0.48056,0.66944,-0.46944,-0.20833,-0.59444,-0.28056,0.65278,-0.43611,-0.91111,-0.61944,-0.97778,-0.43333,-0.53889,-0.39167,-0.30556,-0.38889,0.45278,-0.55278,-0.37222,-1.0056,0.46389,-0.39444,-0.50556,-0.41111,-0.48889,0.99167,-0.5,-0.25556,-0.24722,0.24167,-0.44167,-0.55278,-0.375,0.24167,0.93333,-0.43333,-0.66389,-0.92778,-0.097222,-0.30278,-0.325,0.052778,-0.70833,-0.34167,0.9,-0.063889,0.063889,-0.83333,0.94167,0.041667,-0.15556,-0.17222,-0.125,-0.31667,-0.23889,-0.12222,-0.99722,1.0028,0.11111,-1.0111,-0.041667,-0.30556,-0.011111,0.23889,-0.325,0.325,0.125,-0.49444,0.15833,-0.51667,0.43889,-0.33889,0.29722,0.55833,-0.71944,0.36389,-0.98333,-0.725,0.675,-0.85278,-0.91389,-0.875,0.52778,0.525,-0.87778,0.78056,-0.75833,0.83889,-0.85,0.69444,0.71667,-0.60833,0.90556,-1.0028,0.98889,-0.85,-0.71944,-0.99167,1.0083,1.0139,0.99167,-0.62778,0.89722,-0.99167,-0.89722,-0.73889,0.80833,-0.71667,0.88333,0.99722,-0.94722,1,-0.80278,0.72778,-0.35556,-0.83333,-0.98056,-0.86389,0.78611,0.68889,-0.38611,-0.99722,-0.99444,-1.0028,0.65556,0.575,-0.44167,0.51111,-0.99167,-0.99444,0.45556,-0.525,-0.98611,0.31389,-0.78611,0.29167,-0.82778,-0.65278,-0.80833,0.16944,-0.75556,-0.48889,0.066667,0.030556,-0.61944,-0.92222,-0.625,0.98056,-0.30556,-1.0028,-0.11667,-0.13056,0.99167,-0.16389,-0.99722,-0.37222,0.98889,-0.38333,0.0055556,0.82778,-0.34444,-0.3,0.89444,0.26111,-0.53056,0.67222,0.86944,-0.525,-0.13889,-0.15833,-0.92778,0.74722,0.50556,0.58333,-1.0028,-0.96389,-0.74722,-0.73056,0.68889,0.625,0.061111,0.10278,0.98056,0.8,0.44167,0.55556,-0.99722,0.76389,1.0056,0.46389,-0.99444,0.29444,-1,0.28611,0.31667,0.425,0.54444,-0.99167,0.32778,0.30278,0.50278,0.98889,0.98056,0.5,0.375,-0.98056,0.13333,-0.79167,0.82222,0.16667,0.77778,0.18889,0.097222,-0.77778,-0.99167,0.69444,0.74444,-0.58056,-0.98611,0.64167,-0.075,0.58056,0.05,0.98889,-0.76667,-0.57222,0.85278,-0.11944,-0.05,0.48611,-0.78611,-0.41389,0.86389,0.44167,-0.097222,0.99722,-0.3,-0.56389,-0.28333,0.99722,-0.63333,0.31389,-0.40556,0.725,-0.23611,-0.31944,-0.22778,-0.45556,0.25278,-0.52222,-0.11944,-0.29167,0.14167,0.59444,-0.31944,-0.39444,-0.53056,-0.44722,-0.15,0.05,-0.38611,0.036111,-0.48889,-0.14722,0.41389,-0.27778,-0.35833,-0.14722,-0.052778,-0.55556,0.26389,-0.44444,0.19167,-0.28056,-0.125,0.038889,-0.19444,-0.13611,-0.76111,-0.17778,-0.67778,-0.23611,0.10556,-0.48056,0.38611,-0.15833,-0.26111,0.26111,-0.12222,-0.79167,-0.30278,-0.325,-0.8,-0.077778,-0.12222,-0.20278,0.56389,-0.5,-0.99722,-0.99444,0.49444,-0.98611,-0.23056,-0.93611,-0.5,-0.12778,-0.55,-0.93333,-0.52778,-0.78611,0.75278,-0.93889,-0.4,-1,-0.18333,0.73611,-0.74722,-1.0056,-0.53889,-0.75278,0.99722,-0.6,-0.85556,-0.98611,-0.30556,0.98889,-0.75556,-0.81944,-1.0083,-0.78611,-0.061111,-1.0028,-0.10556,-0.99167,-0.59722,-0.54444,0.26111,-0.30833,-0.4,0.57222,-0.53333,-0.091667,-0.23056,-0.95556,0.98333,-0.50833,0.097222,1.0083,-0.94722,-0.02
5,0.78333,-0.53889,0.36389,-0.90278,0.58333,-0.98889,0.2,-0.62222,-0.80833,0.64444,0.34444,-0.83889,-0.75,0.425,0.14722,-0.65278,0.99444,-0.65,-0.086111,-0.98889,0.70556,0.99167,-0.45833,-0.475,-0.32222,0.98889,0.78889,1.0028,-0.28611,-0.58333,-0.25278,0.76944,-0.26667,0.59444,-0.78056,-0.088889,0.55278,-0.030556,-0.89722,0.37778,-0.98889,0.094444,0.33611,-0.96111,-0.98056,0.175,0.175,-0.92222,0.27778,0.16667,-0.29167,-0.85,-0.83333,0.38333,-0.033333,0.43889,0.027778,-0.24444,-0.22222,-0.71389,-0.72222,-0.16944,-0.12222,0.62222,-0.16667,0.63611,-0.59167,-0.64167,-0.29444,-0.27222,-0.18889,-0.091667,0.82778,-0.56667,0.99167,-0.46389,-0.43611,-0.48056,-0.041667,0.99167,0.88056,0.99167,1,-0.57778,-0.45833,0.030556,-0.33889,0.81389,-0.20833,-0.66667,0.88611,0.77222,-0.73333,0.98889,0.18889,-0.31944,-0.21389,0.80278,-0.825,0.72778,0.86667,-0.21667,0.43056,0.72222,0.75278,-0.99444,-0.99444,-0.15,-0.072222,0.78056,0.66667,-0.21944,0.71389,0.76111,-0.11667,0.013889,-0.98333,0.73333,-0.99444,0.091667,1.025,0.68611,-0.18056,0.16944,0.76944,0.725,-0.73889,-0.72778,0.275,-0.98611,0.70556,0.36667,-0.49722,-0.73333,-0.44722,0.43889,-0.48333,0.55833,-0.225,-0.21389,0.67778,-0.25278,0.775,0.047222,0.055556,-0.036111,0.16944,0.26389,1.0028,0.32222,1.0028,0.425,0.51944,0.58889,-0.99722,0.66667,-0.99167,0.76389,-0.94167,-0.90278,1,1.0056,0.99167,-0.088889,-0.84444,-0.77778,-0.73889,-0.675,-0.63889,-0.58333,-0.53056,-0.46111,-0.42222,0.99167,-0.35,0.825,-0.98889,-0.31944,-0.28056,-0.98889,-0.21111,-0.225,-0.75833,-0.74722,0.45556,-0.97778,-0.13611,-0.13056,-0.125,-0.50278,-0.46389,-0.83889,0.11667,0.99444,-0.019444,-0.027778,-0.28611,-0.069444,-0.64444,-0.22222,0.055556,-0.16111,0.047222,0.8,-0.24722,0.10556,-0.041667,-0.047222,-0.43889,0.083333,0.58889,-0.50556,0.17778,-0.28611,0.175,0.12222,0.41111,-0.99444,-0.76111,-0.075,0.45556,0.14722,0.45,0.27778,-0.98611,-1.0028,0.85278,-0.79167,0.16944,0.13333,0.71389,-0.77778,0.71667,-0.21111,-0.69167,0.83056,0.86111,-0.061111,0.125,0.43611,-0.59444,-0.55833,0.99167,1.0083,0.8,0.78333,-0.19444,-0.36667,-0.38333,-0.26944,0.76389,0.70833,0.725,-0.14167,-0.16389,-0.55,0.60833,0.99722,0.025,0.019444,0.675,0.47778,0.21111,-0.79722,0.23611,0.56111,0.15833,-0.99167,0.44444,0.31389,0.41667,-1.0083,0.49167,-0.98333,-0.71389,0.14444,0.68333,0.275,-0.71111,0.76667,0.16667,-0.011111,0.13056,-0.42222,0.99444,-0.44722,0.99722,0.99722,-0.038889,-0.19444,-0.13611,-0.17778,0.81944,-0.26944,-0.40556,-0.98889,0.98889,0.13611,-0.97778,0.16667,0.65278,0.13056,-0.99167,-0.48333,0.18056,0.85833,-0.8,-0.83611,-0.625,0.98889,0.43333,0.44722,0.45,-0.79167,0.74167,0.88889,-0.67778,-0.97778,0.094444,-0.61111,0.28333,-0.76389,-0.725,0.73889,0.20833,0.63611,-0.58611,-0.52222,0.75,0.72778,0.13611,-0.83056,0.10278,-0.84722,-0.4,0.575,-0.96389,0.25,0.99722,0.59722,-0.39722,-0.37222,0.21667,0.99722,0.53333,-0.86944,-0.99722,0.54444,-0.90556,-0.091667,-0.25,-0.26389,-0.22778,0.28056,-0.98056,0.475,-0.99444,0.26944,-0.93611,0.51111,-0.15556,-0.35,-0.091667,-0.14167,0.35833,-0.086111,-0.99444,0.44722,0.3,-0.81111,0.37222,-0.82222,-0.99444,0.036111,-0.066667,0.0083333,0.49444,0.38056,0.35556,-0.59444,0.99444,-0.63333,0.99167,0.20556,0.047222,-0.63333,0.2,0.036111,0.49444,0.59167,0.225,-0.8,0.77778,0.14167,0.41111,0.775,-0.975,0.99167,0.65278,-0.42222,0.013889,0.71111,0.16111,-0.41111,0.080556,-0.91944,0.60556,0.28611,0.25833,0.58056,0.60278,-0.73611,0.80556,0.84722,-0.23333,0.73611,-0.2,-0.99167,0.29167,-0.11944,-0.16111,0.99444,0.45556,0.43889,0.89167,-0.47778,-0.98611,-0.98611,0.37778,1,-0.99167,0.5472
2,0.48056,0.99722,-0.011111,-0.39167,-0.80833,0.29167,-0.20556,0.31944,-0.31389,0.052778,-0.82778,0.25278,-0.80278,0.31111,0.45278,-0.030556,0.18889,0.18056,-0.66944,0.36944,-0.63333,-0.58056,0.14167,0.24722,0.13889,-0.61667,-0.54167,0.3,0.10833,0.49722,0.32778,0.17222,0.50556,-0.52222,-0.055556,0.038889,-0.425,0.39722,-0.41944,-0.78333,0.63333,0.38333,0.50278,-0.35833,0.67222,0.013889,-0.78611,-0.26944,-0.058333,0.99167,-0.24722,0.57222,0.81111,-0.24444,0.60833,-0.21667,-0.43611,-0.15278,-0.11111,-0.125,0.73889,0.81111,0.74722,-1,0.98889,0.73889,0.99722,-0.097222,-0.066667,-0.58333,-1,-0.033333,-0.225,1,-0.33333,0.052778,0.072222,-0.78889,0.088889,0.44444,0.98889,0.99444,-0.48611,0.99167,-0.43611,-1.0056,-0.20278,0.21111,0.23333,0.24722,-0.675,0.21111,0.70556,-0.61944,0.36944,0.41667,-0.82222,0.43611,-0.052778,-0.93056,0.55556,-0.775,0.36944,0.60556,0.625,-1,0.18333,-0.275,0.75833,0.78333,-0.90278,0.79444,0.0027778,-0.52222,0.98889,0.99444,-0.95833,0.28056,0.99444,-0.19444,-0.76111,-0.98889,-0.36389,-0.99167,0.28889,-1.0056,-0.79722,-0.51667,-0.71111,0.28056,0.21111,-0.575,-0.46944,0.98611,-0.71667,0.10278,-0.033333,-0.34444,0.98333,-0.26111,-0.81944,0.73333,-0.27222,0.81944,0.71667,-0.047222,-0.10833,0.98056,-0.91944,0.99167,0.87222,0.54444,-0.99167,0.62778,-0.51667,0.1,0.79722,0.74722,0.18611,-0.95556,0.39444,0.86111,0.98333,0.80833,-0.76944,0.41944,-0.76111,0.31389,0.58611,0.76111,0.82778,0.25,-1,0.83889,0.45278,0.22222,0.63889,-0.87778,-0.575,0.56111,0.080556,0.34167,0.62778,0.75556,0.68889,-0.030556,0.80278,0.48056,-0.95833,-0.35833,0.41667,0.14444,-0.11389,0.70833,0.32778,0.98889,-0.28056,0.74167,0.99722,-0.175,0.21389,0.62778,-0.052778,-0.29444,0.13889,-0.98611,0.65833,-0.98889,-0.53333,0.55,0.066667,0.60833,-0.016667,-0.46944,-0.24722,-0.030556,-0.83889,-0.79167,-0.76667,0.25833,0.45556,0.58056,-0.40556,-0.23611,-0.69444,-0.2,-0.91389,-0.64167,0.36389,-0.50278,0.525,-0.54722,0.5,-1.0056,0.31944,-0.37222,-0.45556,0.43056,-0.63889,-0.39722,-0.99722,0.8,-0.26389,-0.725,0.40278,-0.56667,1.0083,0.31389,-0.15833,-0.66111,0.38333,-0.75833,0.28056,0.044444,-0.775,0.080556,0.37778,-0.875,-0.89722,0.36667,-1,0.34444,0.24167,0.31944,-0.78056,-0.97222,-0.99167,0.48889,-0.83889,0.49167,0.54722,-0.88611,0.54722,-0.92778,0.58056,0.61667,-0.85833,0.68333,0.81111,-0.99167,-0.97778,1,1.0111,-0.77222,-0.75833,-0.55,-0.525,0.98611,-0.90278,0.98333,-0.28333,-0.29722,0.66944,0.71111,-0.088889,-0.044444,0.48056,0.46389,0.18889,0.18056,0.3,0.33333,0.20278,0.22222,0.36111,0.4,0.086111,0.086111,0.58611,0.56667,-0.0083333,-0.025,0.71944,0.80278,-0.12222,0.86111,-0.15556,-0.22778,0.99167,1.0028,-0.97778,-0.26944,-0.90278,-0.34722,-0.35,-0.91111,-0.99167,-0.40556,-0.41389,-0.87778,-0.99167,-0.88333,-0.76389,-0.98056,-0.44167,-0.75833,-0.98056,-0.66389,-0.76389,-0.81389,-0.49167,-0.58611,-0.46944,-0.53056,-0.63056,-0.34167,-0.48333,0.98333,0.98611,-0.17222,-0.40556,-0.33611,0.825,0.78889,-0.22778,-0.019444,-0.29722,0.67778,0.61944,-0.11389,-0.16389,0.18056,0.49444,-0.55278,0.44444,0.066667,0,-0.42778,0.4,-0.60833,0.325,0.275,0.26667,0.21944,-0.40556,0.58333,-0.76111,0.125,0.097222,0.50278,0.46389,0.79167,-0.43333,-1,0.99167,-0.083333,-0.1,-0.21111,0.81667,-0.86667,0.74444,0.66667,1.0028,0.63611,-0.325,-0.30556,0.99444,-0.18333,1.0028,0.45833,-0.15556,-0.1,-0.15556,0.99722,-0.50833,-0.21111,0.85833,-0.027778,-0.56389,-0.46667,0.22778,-0.27222,-0.10556,-0.23889,0.10278,0.013889,0.98611,-0.35833,0.69167,-0.39167,-0.72778,0.98333,0.15556,-0.45556,-0.052778,-0.81111,0.31667,-0.47222,-0.50833,0.56944,-0.58056,0.36944
,0.81389,0.50278,0.72778,-0.61389,-0.52778,-0.88889,-0.94722,0.46389,-0.37222,0.54167,0.60833,0.71389,0.63333,-0.76389,-0.67222,-1.0056,0.66667,0.7,-0.81389,0.75278,0.35278,-0.73889,-0.99167,0.59444,0.49444,0.87778,-0.63611,0.25278,1.0139,-1.0028,1,-0.89167,-0.99167,0.30278,0.40556,0.13611,-0.99444,-0.99722,-0.99722,0.16667,0.063889,0.047222,-0.0027778,-0.033333,-0.17778,-0.13056,-0.11389,-0.99444,-0.98056,-0.26944,-0.4,-0.19722,-0.76389,-0.75556,-0.42778,0.99167,0.98889,-0.65278,-0.98889,-0.50556,-0.48611,-0.17778,-0.65833,0.98333,-0.88056,0.71944,0.70556,0.98889,-0.28056,-0.24722,-0.65,-1,0.71667,-0.84444,0.44444,0.75278,0.425,-0.069444,-0.45278,-0.027778,0.48333,-1.0028,0.23611,0.47222,0.20556,0.058333,-0.26667,0.14444,0.063889,0.18611,0.28333,0.17778,-0.063889,0.99444,-0.055556,0.13889,0.31944,-0.14444,0.425,0.12222,-0.12778,-0.027778,0.79444,-0.18056,0.46667,-0.27778,0.375,-0.23889,0.65,0.6,-0.425,-0.26111,-0.43056,0.70278,-0.51111,-0.99444,0.58333,-0.36111,0.39722,-0.69167,0.70278,-0.525,-0.98611,-0.69722,-0.98333,0.99167,1,-0.85278,-0.76111,-0.48333,0.16944,-0.975,-0.73611,-0.85,-0.8,0.99444,-1.0028,-0.98611,-0.69444,-1,-0.64722,-0.81389,-0.68611,-0.063889,-1,-1.0222,-0.53611,-0.78333,-0.54722,-0.61667,-0.51944,-0.97778,-0.96111,-0.30833,-1.0028,-0.22778,-0.36111,-0.35278,-0.85833,-0.37778,-0.925,-0.51111,-0.88611,-0.675,-0.16944,-0.15556,0.14444,-0.14167,-0.87222,-0.95,-0.71111,-0.88611,-0.42778,0.069444,0.43889,0.061111,0.075,-1.0028,-0.95556,-0.99444,-1.0056,0.28056,0.27222,-0.14167,0.30278,0.80556,0.98056,0.47222,0.47222,0.98889,0.99722,0.53056,0.13889,0.85,0.60833,0.83611,0.68889,0.70556,0.74167,0.76944,0.41944,0.63611,0.49167,0.45278,0.625,0.98889,0.99167,0.37222,1,0.35556,0.72778,0.85556,0.21111,0.22222
5 |
--------------------------------------------------------------------------------
/make_directories.sh:
--------------------------------------------------------------------------------
1 | #!/bin/bash
2 |
3 | # Create the save/ and log/ directory trees for the five dataset splits (0-4).
4 | # mkdir -p creates parent directories as needed and is a no-op if they already
5 | # exist, so the script is safe to re-run.
6 | for i in 0 1 2 3 4; do
7 |     mkdir -p "save/$i/save_attention"
8 |     mkdir -p "log/$i/log_attention"
9 | done
10 |
--------------------------------------------------------------------------------
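For reference, a minimal Python equivalent of the directory setup above, assuming it is run from the repository root (the indices 0-4 match the five dataset splits used throughout this repo):

    import os

    # Create save/<i>/save_attention and log/<i>/log_attention for splits 0-4;
    # exist_ok=True makes this safe to re-run.
    for i in range(5):
        os.makedirs(os.path.join('save', str(i), 'save_attention'), exist_ok=True)
        os.makedirs(os.path.join('log', str(i), 'log_attention'), exist_ok=True)
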
/scripts/plot_training_curve.py:
--------------------------------------------------------------------------------
1 | import matplotlib.pyplot as plt
2 | import seaborn  # importing seaborn restyles the matplotlib plots
3 | import numpy as np
4 | import os
5 | import argparse
6 |
7 | parser = argparse.ArgumentParser()
8 | parser.add_argument('--dataset', type=int, default=5,
9 |                     help='dataset index in 0-4, or 5 to plot all five datasets')
10 | args = parser.parse_args()
11 |
12 | log_directory = 'log'
13 | plot_directory = 'plot'
14 |
15 |
16 | def plot_curve(i):
17 |     # Each log_curve.txt holds comma-separated rows of (epoch, training loss, validation loss)
18 |     training_val_loss = np.genfromtxt(os.path.join(log_directory, str(i), 'log_attention', 'log_curve.txt'), delimiter=',')
19 |
20 |     xaxis = training_val_loss[:, 0]
21 |     training_loss = training_val_loss[:, 1]
22 |     val_loss = training_val_loss[:, 2]
23 |
24 |     plt.figure()
25 |     plt.plot(xaxis, training_loss, 'r-', label='Training loss')
26 |     plt.plot(xaxis, val_loss, 'b-', label='Validation loss')
27 |     plt.title('Training and Validation loss vs Epoch for dataset ' + str(i))
28 |     plt.legend()
29 |     plot_file = os.path.join(plot_directory, 'training_curve_'+str(i)+'.png')
30 |     plt.savefig(plot_file, bbox_inches='tight')
31 |
32 |
33 | if args.dataset == 5:
34 |     # Plot the curves for all five leave-one-out splits
35 |     for i in range(5):
36 |         plot_curve(i)
37 | else:
38 |     plot_curve(args.dataset)
39 |
--------------------------------------------------------------------------------
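The script above assumes each log_curve.txt is a comma-separated file whose columns are epoch, training loss, and validation loss, in that order. A minimal sketch that writes a synthetic log in that layout (all values made up) to exercise the plotting code locally:

    import os

    os.makedirs('log/0/log_attention', exist_ok=True)
    with open('log/0/log_attention/log_curve.txt', 'w') as f:
        for epoch in range(10):
            train_loss = 1.0 / (epoch + 1)   # made-up decreasing losses
            val_loss = 1.2 / (epoch + 1)
            f.write('{},{},{}\n'.format(epoch, train_loss, val_loss))
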
/srnn/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/srnn/__init__.py
--------------------------------------------------------------------------------
/srnn/attn_visualize.py:
--------------------------------------------------------------------------------
1 | '''
2 | Visualization script for the attention structural RNN model
3 |
4 | Author: Anirudh Vemula
5 | Date: 8th May 2017
6 | '''
7 | import numpy as np
8 | import matplotlib.pyplot as plt
9 | import pickle
10 | import argparse
11 | import seaborn
12 |
13 |
14 | def plot_attention(true_pos_nodes, pred_pos_nodes, nodes_present, observed_length, attn_weights, name, plot_directory):
15 | traj_length, numNodes, _ = true_pos_nodes.shape
16 |
17 | traj_data = {}
18 | for tstep in range(traj_length):
19 | pred_pos = pred_pos_nodes[tstep, :]
20 | true_pos = true_pos_nodes[tstep, :]
21 |
22 | for ped in range(numNodes):
23 | if ped not in traj_data and tstep < observed_length:
24 | traj_data[ped] = [[], []]
25 |
26 |             if ped in traj_data and ped in nodes_present[tstep]:
27 | traj_data[ped][0].append(true_pos[ped, :])
28 | traj_data[ped][1].append(pred_pos[ped, :])
29 |
30 | fig = plt.figure()
31 | for tstep in range(traj_length):
32 | if tstep < observed_length:
33 | # Observed part
34 | continue
35 | # Predicted part
36 | # Create a plot for current prediction tstep
37 | # TODO for now, just plot the attention for the first step
38 | if len(nodes_present[observed_length-1]) == 1:
39 | # Only one pedestrian, so no spatial edges
40 | continue
41 | for ped in nodes_present[observed_length-1]:
42 | true_traj_ped = (np.array(traj_data[ped][0]) + 1) / 2
43 | pred_traj_ped = (np.array(traj_data[ped][1]) + 1) / 2
44 |
45 | peds_other = attn_weights[tstep-observed_length][ped][1]
46 | attn_w = attn_weights[tstep-observed_length][ped][0]
47 |
48 | # fig = plt.figure()
49 | ax = fig.gca()
50 | c = 'r'
51 | # ipdb.set_trace()
52 | list_of_points = range(true_traj_ped[:, 0].size-1)
53 | plt.plot(true_traj_ped[:, 0], true_traj_ped[:, 1], color=c, linestyle='solid', linewidth=1, marker='o', markevery=list_of_points)
54 | plt.scatter(true_traj_ped[-1, 0], true_traj_ped[-1, 1], color='b', marker='D')
55 | # plt.plot(pred_traj_ped[:, 0], pred_traj_ped[:, 1], color=c, linestyle='dashed', marker='x', linewidth=1)
56 |
57 | for ind_ped, ped_o in enumerate(peds_other):
58 | true_traj_ped_o = (np.array(traj_data[ped_o][0]) + 1) / 2
59 | pred_traj_ped_o = (np.array(traj_data[ped_o][1]) + 1) / 2
60 |
61 | weight = attn_w[ind_ped]
62 |
63 | c = np.random.rand(3)
64 | list_of_points = range(true_traj_ped_o[:, 0].size-1)
65 | plt.plot(true_traj_ped_o[:, 0], true_traj_ped_o[:, 1], color=c, linestyle='solid', linewidth=1, marker='o', markevery=list_of_points)
66 | plt.scatter(true_traj_ped_o[-1, 0], true_traj_ped_o[-1, 1], color='b', marker='D')
67 | circle = plt.Circle((true_traj_ped_o[-1, 0], true_traj_ped_o[-1, 1]), weight*0.1, fill=False, color='b', linewidth=2)
68 | ax.add_artist(circle)
69 | # plt.plot(pred_traj_ped_o[:, 0], pred_traj_ped_o[:, 1], color=c, linestyle='dashed', marker='x', linewidth=2*weight)
70 |
71 | # plt.ylim((1, 0))
72 | # plt.xlim((0, 1))
73 | ax.set_aspect('equal')
74 | # plt.show()
75 | plt.savefig(plot_directory+'/'+name+'_'+str(ped)+'.png')
76 | # plt.close('all')
77 | plt.clf()
78 |
79 | break
80 | plt.close('all')
81 |
82 |
83 | def main():
84 |
85 | parser = argparse.ArgumentParser()
86 |
87 | parser.add_argument('--test_dataset', type=int, default=0,
88 | help='test dataset index')
89 |
90 | # Parse the parameters
91 | args = parser.parse_args()
92 |
93 | save_directory = 'save/'
94 | save_directory += str(args.test_dataset) + '/save_attention/'
95 | plot_directory = 'plot/plot_attention_viz/'+str(args.test_dataset)
96 |
97 |     with open(save_directory+'results.pkl', 'rb') as f:
98 |         results = pickle.load(f)
99 |
100 | for i in range(len(results)):
101 | # For each sequence
102 | print(i)
103 | true_pos_nodes = results[i][0]
104 | pred_pos_nodes = results[i][1]
105 | nodes_present = results[i][2]
106 | observed_length = results[i][3]
107 | attn_weights = results[i][4]
108 |
109 | name = 'sequence' + str(i)
110 |
111 | plot_attention(true_pos_nodes, pred_pos_nodes, nodes_present, observed_length, attn_weights, name, plot_directory)
112 |
113 |
114 | if __name__ == '__main__':
115 | main()
116 |
--------------------------------------------------------------------------------
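Each entry of the results.pkl that attn_visualize.py unpickles is a 5-tuple of (true positions, predicted positions, nodes present per frame, observed length, attention weights). A sketch of one synthetic entry for two pedestrians, with shapes inferred from plot_attention above and made-up values (positions are in the normalized [-1, 1] range that the script maps through (x + 1) / 2):

    import numpy as np

    traj_length, num_nodes, obs_length = 20, 2, 8
    true_pos = np.random.uniform(-1, 1, (traj_length, num_nodes, 2))
    pred_pos = np.random.uniform(-1, 1, (traj_length, num_nodes, 2))
    nodes_present = [[0, 1]] * traj_length
    # attn_weights[t][ped] = (weights over the other pedestrians, their IDs)
    attn_weights = [{0: (np.array([1.0]), [1]), 1: (np.array([1.0]), [0])}
                    for _ in range(traj_length - obs_length)]
    entry = (true_pos, pred_pos, nodes_present, obs_length, attn_weights)
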
/srnn/criterion.py:
--------------------------------------------------------------------------------
1 | '''
2 | Criterion for the structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 30th March 2017
7 | '''
8 |
9 |
10 | import torch
11 | import numpy as np
12 | from helper import getCoef
13 | from torch.autograd import Variable
14 |
15 |
16 | def Gaussian2DLikelihood(outputs, targets, nodesPresent, pred_length):
17 | '''
18 | Computes the likelihood of predicted locations under a bivariate Gaussian distribution
19 | params:
20 | outputs: Torch variable containing tensor of shape seq_length x numNodes x output_size
21 | targets: Torch variable containing tensor of shape seq_length x numNodes x input_size
22 |     nodesPresent : A list of lists, of size seq_length; each list contains the nodeIDs present in that frame. pred_length : the number of trailing frames over which the loss is averaged
23 | '''
24 |
25 | # Get the sequence length
26 | seq_length = outputs.size()[0]
27 | # Get the observed length
28 | obs_length = seq_length - pred_length
29 |
30 | # Extract mean, std devs and correlation
31 | mux, muy, sx, sy, corr = getCoef(outputs)
32 |
33 | # Compute factors
34 | normx = targets[:, :, 0] - mux
35 | normy = targets[:, :, 1] - muy
36 | sxsy = sx * sy
37 | z = torch.pow((normx/sx), 2) + torch.pow((normy/sy), 2) - 2*((corr*normx*normy)/sxsy)
38 | negRho = 1 - torch.pow(corr, 2)
39 |
40 | # Numerator
41 | result = torch.exp(-z/(2*negRho))
42 | # Normalization factor
43 | denom = 2 * np.pi * (sxsy * torch.sqrt(negRho))
44 |
45 | # Final PDF calculation
46 | result = result / denom
47 |
48 | # Numerical stability
49 | epsilon = 1e-20
50 | result = -torch.log(torch.clamp(result, min=epsilon))
51 |
52 | # Compute the loss across all frames and all nodes
53 | loss = 0
54 | counter = 0
55 |
56 | for framenum in range(obs_length, seq_length):
57 | nodeIDs = nodesPresent[framenum]
58 |
59 | for nodeID in nodeIDs:
60 |
61 | loss = loss + result[framenum, nodeID]
62 | counter = counter + 1
63 |
64 | if counter != 0:
65 | return loss / counter
66 | else:
67 | return loss
68 |
69 |
70 | def Gaussian2DLikelihoodInference(outputs, targets, assumedNodesPresent, nodesPresent):
71 | '''
72 | Computes the likelihood of predicted locations under a bivariate Gaussian distribution at test time
73 | params:
74 | outputs : predicted locations
75 | targets : true locations
76 | assumedNodesPresent : Nodes assumed to be present in each frame in the sequence
77 | nodesPresent : True nodes present in each frame in the sequence
78 | '''
79 | # Extract mean, std devs and correlation
80 | mux, muy, sx, sy, corr = getCoef(outputs)
81 |
82 | # Compute factors
83 | normx = targets[:, :, 0] - mux
84 | normy = targets[:, :, 1] - muy
85 | sxsy = sx * sy
86 | z = (normx/sx)**2 + (normy/sy)**2 - 2*((corr*normx*normy)/sxsy)
87 | negRho = 1 - corr**2
88 |
89 | # Numerator
90 | result = torch.exp(-z/(2*negRho))
91 | # Normalization factor
92 | denom = 2 * np.pi * (sxsy * torch.sqrt(negRho))
93 |
94 | # Final PDF calculation
95 | result = result / denom
96 |
97 | # Numerical stability
98 | epsilon = 1e-20
99 |
100 | result = -torch.log(torch.clamp(result, min=epsilon))
101 |
102 | # Compute the loss
103 | loss = Variable(torch.zeros(1).cuda())
104 | counter = 0
105 |
106 | for framenum in range(outputs.size()[0]):
107 | nodeIDs = nodesPresent[framenum]
108 |
109 | for nodeID in nodeIDs:
110 | if nodeID not in assumedNodesPresent:
111 | # If the node wasn't assumed to be present, don't compute loss for it
112 | continue
113 | loss = loss + result[framenum, nodeID]
114 | counter = counter + 1
115 |
116 | if counter != 0:
117 | return loss / counter
118 | else:
119 | return loss
120 |
--------------------------------------------------------------------------------
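A minimal sketch of calling Gaussian2DLikelihood on random tensors (run from inside srnn/, since the module uses flat imports; the function itself allocates nothing on the GPU, so a CPU build of PyTorch suffices). Shapes follow the docstring; all sizes below are hypothetical:

    import torch
    from torch.autograd import Variable
    from criterion import Gaussian2DLikelihood

    seq_length, pred_length, num_nodes = 20, 12, 3
    outputs = Variable(torch.randn(seq_length, num_nodes, 5))  # mux, muy, sx, sy, corr
    targets = Variable(torch.randn(seq_length, num_nodes, 2))  # true (x, y)
    nodes_present = [[0, 1, 2] for _ in range(seq_length)]     # all nodes in every frame
    loss = Gaussian2DLikelihood(outputs, targets, nodes_present, pred_length)
    print(loss)  # scalar, averaged over predicted frames and present nodes
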
/srnn/helper.py:
--------------------------------------------------------------------------------
1 | '''
2 | Helper functions for the structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 3rd April 2017
7 | '''
8 | import numpy as np
9 | import torch
10 | from torch.autograd import Variable
11 |
12 |
13 | def getVector(pos_list):
14 | '''
15 | Gets the vector pointing from second element to first element
16 | params:
17 | pos_list : A list of size two containing two (x, y) positions
18 | '''
19 | pos_i = pos_list[0]
20 | pos_j = pos_list[1]
21 |
22 | return np.array(pos_i) - np.array(pos_j)
23 |
24 |
25 | def getMagnitudeAndDirection(*args):
26 | '''
27 | Gets the magnitude and direction of the vector corresponding to positions
28 | params:
29 | args: Can be a list of two positions or the two positions themselves (variable-length argument)
30 | '''
31 | if len(args) == 1:
32 | pos_list = args[0]
33 | pos_i = pos_list[0]
34 | pos_j = pos_list[1]
35 |
36 | vector = np.array(pos_i) - np.array(pos_j)
37 | magnitude = np.linalg.norm(vector)
38 | if abs(magnitude) > 1e-4:
39 | direction = vector / magnitude
40 | else:
41 | direction = vector
42 | return [magnitude] + direction.tolist()
43 |
44 | elif len(args) == 2:
45 | pos_i = args[0]
46 | pos_j = args[1]
47 |
48 | ret = torch.zeros(3)
49 | vector = pos_i - pos_j
50 | magnitude = torch.norm(vector)
51 | if abs(magnitude) > 1e-4:
52 | direction = vector / magnitude
53 | else:
54 | direction = vector
55 |
56 | ret[0] = magnitude
57 | ret[1:3] = direction
58 | return ret
59 |
60 | else:
61 | raise NotImplementedError('getMagnitudeAndDirection: Function signature incorrect')
62 |
63 |
64 | def getCoef(outputs):
65 | '''
66 | Extracts the mean, standard deviation and correlation
67 | params:
68 | outputs : Output of the SRNN model
69 | '''
70 | mux, muy, sx, sy, corr = outputs[:, :, 0], outputs[:, :, 1], outputs[:, :, 2], outputs[:, :, 3], outputs[:, :, 4]
71 |
72 | # Exponential to get a positive value for std dev
73 | sx = torch.exp(sx)
74 | sy = torch.exp(sy)
75 |     # tanh to squash the correlation into the range [-1, 1]
76 | corr = torch.tanh(corr)
77 |
78 | return mux, muy, sx, sy, corr
79 |
80 |
81 | def sample_gaussian_2d(mux, muy, sx, sy, corr, nodesPresent):
82 | '''
83 | Returns samples from 2D Gaussian defined by the parameters
84 | params:
85 | mux, muy, sx, sy, corr : a tensor of shape 1 x numNodes
86 | Contains x-means, y-means, x-stds, y-stds and correlation
87 | nodesPresent : a list of nodeIDs present in the frame
88 |
89 | returns:
90 |     next_x, next_y : tensors of shape numNodes
91 | Contains sampled values from the 2D gaussian
92 | '''
93 | o_mux, o_muy, o_sx, o_sy, o_corr = mux[0, :], muy[0, :], sx[0, :], sy[0, :], corr[0, :]
94 |
95 | numNodes = mux.size()[1]
96 |
97 | next_x = torch.zeros(numNodes)
98 | next_y = torch.zeros(numNodes)
99 | for node in range(numNodes):
100 | if node not in nodesPresent:
101 | continue
102 | mean = [o_mux[node], o_muy[node]]
103 | cov = [[o_sx[node]*o_sx[node], o_corr[node]*o_sx[node]*o_sy[node]], [o_corr[node]*o_sx[node]*o_sy[node], o_sy[node]*o_sy[node]]]
104 |
105 | next_values = np.random.multivariate_normal(mean, cov, 1)
106 | next_x[node] = next_values[0][0]
107 | next_y[node] = next_values[0][1]
108 |
109 | return next_x, next_y
110 |
111 |
112 | def compute_edges(nodes, tstep, edgesPresent):
113 | '''
114 | Computes new edgeFeatures at test time
115 | params:
116 | nodes : A tensor of shape seq_length x numNodes x 2
117 | Contains the x, y positions of the nodes (might be incomplete for later time steps)
118 | tstep : The time-step at which we need to compute edges
119 | edgesPresent : A list of tuples
120 | Each tuple has the (nodeID_a, nodeID_b) pair that represents the edge
121 | (Will have both temporal and spatial edges)
122 |
123 | returns:
124 |     edges : A tensor of shape (numNodes * numNodes) x 2, with edge (a, b) stored at row a * numNodes + b
125 | Contains vectors representing the edges
126 | '''
127 | numNodes = nodes.size()[1]
128 | edges = (torch.zeros(numNodes * numNodes, 2)).cuda()
129 | for edgeID in edgesPresent:
130 | nodeID_a = edgeID[0]
131 | nodeID_b = edgeID[1]
132 |
133 | if nodeID_a == nodeID_b:
134 | # Temporal edge
135 | pos_a = nodes[tstep - 1, nodeID_a, :]
136 | pos_b = nodes[tstep, nodeID_b, :]
137 |
138 | edges[nodeID_a * numNodes + nodeID_b, :] = pos_a - pos_b
139 | # edges[nodeID_a * numNodes + nodeID_b, :] = getMagnitudeAndDirection(pos_a, pos_b)
140 | else:
141 | # Spatial edge
142 | pos_a = nodes[tstep, nodeID_a, :]
143 | pos_b = nodes[tstep, nodeID_b, :]
144 |
145 | edges[nodeID_a * numNodes + nodeID_b, :] = pos_a - pos_b
146 | # edges[nodeID_a * numNodes + nodeID_b, :] = getMagnitudeAndDirection(pos_a, pos_b)
147 |
148 | return edges
149 |
150 |
151 | def get_mean_error(ret_nodes, nodes, assumedNodesPresent, trueNodesPresent):
152 | '''
153 | Computes average displacement error
154 | Parameters
155 | ==========
156 |
157 | ret_nodes : A tensor of shape pred_length x numNodes x 2
158 | Contains the predicted positions for the nodes
159 |
160 | nodes : A tensor of shape pred_length x numNodes x 2
161 | Contains the true positions for the nodes
162 |
163 |     assumedNodesPresent : The nodeIDs assumed to be present over the whole predicted segment
164 |     trueNodesPresent : A list of lists, of size pred_length; each list contains the nodeIDs actually present at that time-step
165 |
166 | Returns
167 | =======
168 |
169 | Error : Mean euclidean distance between predicted trajectory and the true trajectory
170 | '''
171 | pred_length = ret_nodes.size()[0]
172 | error = torch.zeros(pred_length).cuda()
173 |
174 |     for tstep in range(pred_length):
175 |         # Reset per time-step so each frame's error is averaged over its own node count
176 |         counter = 0
177 | for nodeID in assumedNodesPresent:
178 |
179 | if nodeID not in trueNodesPresent[tstep]:
180 | continue
181 |
182 | pred_pos = ret_nodes[tstep, nodeID, :]
183 | true_pos = nodes[tstep, nodeID, :]
184 |
185 | error[tstep] += torch.norm(pred_pos - true_pos, p=2)
186 | counter += 1
187 |
188 | if counter != 0:
189 | error[tstep] = error[tstep] / counter
190 |
191 | return torch.mean(error)
192 |
193 |
194 | def get_final_error(ret_nodes, nodes, assumedNodesPresent, trueNodesPresent):
195 | '''
196 | Computes final displacement error
197 | Parameters
198 | ==========
199 |
200 | ret_nodes : A tensor of shape pred_length x numNodes x 2
201 | Contains the predicted positions for the nodes
202 |
203 | nodes : A tensor of shape pred_length x numNodes x 2
204 | Contains the true positions for the nodes
205 |
206 |     assumedNodesPresent : The nodeIDs assumed to be present over the whole predicted segment
207 |     trueNodesPresent : A list of lists, of size pred_length; each list contains the nodeIDs actually present at that time-step
208 |
209 | Returns
210 | =======
211 |
212 | Error : Mean final euclidean distance between predicted trajectory and the true trajectory
213 | '''
214 | pred_length = ret_nodes.size()[0]
215 | error = 0
216 | counter = 0
217 |
218 | # Last time-step
219 | tstep = pred_length - 1
220 | for nodeID in assumedNodesPresent:
221 |
222 | if nodeID not in trueNodesPresent[tstep]:
223 | continue
224 |
225 | pred_pos = ret_nodes[tstep, nodeID, :]
226 | true_pos = nodes[tstep, nodeID, :]
227 |
228 | error += torch.norm(pred_pos - true_pos, p=2)
229 | counter += 1
230 |
231 | if counter != 0:
232 | error = error / counter
233 |
234 | return error
235 |
236 |
237 | def sample_gaussian_2d_batch(outputs, nodesPresent, edgesPresent, nodes_prev_tstep):
238 | mux, muy, sx, sy, corr = getCoef_train(outputs)
239 |
240 | next_x, next_y = sample_gaussian_2d_train(mux.data, muy.data, sx.data, sy.data, corr.data, nodesPresent)
241 |
242 | nodes = torch.zeros(outputs.size()[0], 2)
243 | nodes[:, 0] = next_x
244 | nodes[:, 1] = next_y
245 |
246 | nodes = Variable(nodes.cuda())
247 |
248 | edges = compute_edges_train(nodes, edgesPresent, nodes_prev_tstep)
249 |
250 | return nodes, edges
251 |
252 |
253 | def compute_edges_train(nodes, edgesPresent, nodes_prev_tstep):
254 | numNodes = nodes.size()[0]
255 | edges = Variable((torch.zeros(numNodes * numNodes, 2)).cuda())
256 | for edgeID in edgesPresent:
257 | nodeID_a = edgeID[0]
258 | nodeID_b = edgeID[1]
259 |
260 | if nodeID_a == nodeID_b:
261 | # Temporal edge
262 | pos_a = nodes_prev_tstep[nodeID_a, :]
263 | pos_b = nodes[nodeID_b, :]
264 |
265 | edges[nodeID_a * numNodes + nodeID_b, :] = pos_a - pos_b
266 | # edges[nodeID_a * numNodes + nodeID_b, :] = getMagnitudeAndDirection(pos_a, pos_b)
267 | else:
268 | # Spatial edge
269 | pos_a = nodes[nodeID_a, :]
270 | pos_b = nodes[nodeID_b, :]
271 |
272 | edges[nodeID_a * numNodes + nodeID_b, :] = pos_a - pos_b
273 | # edges[nodeID_a * numNodes + nodeID_b, :] = getMagnitudeAndDirection(pos_a, pos_b)
274 |
275 | return edges
276 |
277 |
278 | def getCoef_train(outputs):
279 | mux, muy, sx, sy, corr = outputs[:, 0], outputs[:, 1], outputs[:, 2], outputs[:, 3], outputs[:, 4]
280 |
281 | sx = torch.exp(sx)
282 | sy = torch.exp(sy)
283 | corr = torch.tanh(corr)
284 | return mux, muy, sx, sy, corr
285 |
286 |
287 | def sample_gaussian_2d_train(mux, muy, sx, sy, corr, nodesPresent):
288 | o_mux, o_muy, o_sx, o_sy, o_corr = mux, muy, sx, sy, corr
289 |
290 | numNodes = mux.size()[0]
291 |
292 | next_x = torch.zeros(numNodes)
293 | next_y = torch.zeros(numNodes)
294 | for node in range(numNodes):
295 | if node not in nodesPresent:
296 | continue
297 | mean = [o_mux[node], o_muy[node]]
298 |
299 | cov = [[o_sx[node]*o_sx[node], o_corr[node]*o_sx[node]*o_sy[node]], [o_corr[node]*o_sx[node]*o_sy[node], o_sy[node]*o_sy[node]]]
300 |
301 | next_values = np.random.multivariate_normal(mean, cov, 1)
302 | next_x[node] = next_values[0][0]
303 | next_y[node] = next_values[0][1]
304 |
305 | return next_x, next_y
306 |
--------------------------------------------------------------------------------
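A minimal sketch of the getCoef / sample_gaussian_2d pair on one frame of fake model output (run from inside srnn/; the leading 1 x numNodes dimension is what sample_gaussian_2d indexes into, and the exact tensor-to-numpy conversion depends on the PyTorch version this repo targets):

    import torch
    from helper import getCoef, sample_gaussian_2d

    num_nodes = 4
    outputs = torch.randn(1, num_nodes, 5)     # fake single-frame SRNN output
    mux, muy, sx, sy, corr = getCoef(outputs)  # each of shape 1 x num_nodes
    next_x, next_y = sample_gaussian_2d(mux, muy, sx, sy, corr, nodesPresent=[0, 2])
    # next_x, next_y have length num_nodes; nodes not in nodesPresent stay zero
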
/srnn/model.py:
--------------------------------------------------------------------------------
1 | '''
2 | The structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 16th March 2017
7 | '''
8 |
9 | import torch.nn as nn
10 | from torch.autograd import Variable
11 | import torch
12 | import numpy as np
13 |
14 |
15 | class HumanNodeRNN(nn.Module):
16 | '''
17 | Class representing human Node RNNs in the st-graph
18 | '''
19 | def __init__(self, args, infer=False):
20 | '''
21 | Initializer function
22 | params:
23 | args : Training arguments
24 | infer : Training or test time (True at test time)
25 | '''
26 | super(HumanNodeRNN, self).__init__()
27 |
28 | self.args = args
29 | self.infer = infer
30 |
31 | # Store required sizes
32 | self.rnn_size = args.human_node_rnn_size
33 | self.output_size = args.human_node_output_size
34 | self.embedding_size = args.human_node_embedding_size
35 | self.input_size = args.human_node_input_size
36 | self.edge_rnn_size = args.human_human_edge_rnn_size
37 |
38 | # Linear layer to embed input
39 | self.encoder_linear = nn.Linear(self.input_size, self.embedding_size)
40 |
41 | # ReLU and Dropout layers
42 | self.relu = nn.ReLU()
43 | self.dropout = nn.Dropout(args.dropout)
44 |
45 | # Linear layer to embed edgeRNN hidden states
46 | self.edge_embed = nn.Linear(self.edge_rnn_size, self.embedding_size)
47 |
48 | # Linear layer to embed attention module output
49 | self.edge_attention_embed = nn.Linear(self.edge_rnn_size*2, self.embedding_size)
50 |
51 | # The LSTM cell
52 | self.cell = nn.LSTMCell(2*self.embedding_size, self.rnn_size)
53 |
54 | # Output linear layer
55 | self.output_linear = nn.Linear(self.rnn_size, self.output_size)
56 |
57 | def forward(self, pos, h_temporal, h_spatial_other, h, c):
58 | '''
59 | Forward pass for the model
60 | params:
61 | pos : input position
62 | h_temporal : hidden state of the temporal edgeRNN corresponding to this node
63 | h_spatial_other : output of the attention module
64 | h : hidden state of the current nodeRNN
65 | c : cell state of the current nodeRNN
66 | '''
67 | # Encode the input position
68 | encoded_input = self.encoder_linear(pos)
69 | encoded_input = self.relu(encoded_input)
70 | encoded_input = self.dropout(encoded_input)
71 |
72 | # Concat both the embeddings
73 | h_edges = torch.cat((h_temporal, h_spatial_other), 1)
74 | h_edges_embedded = self.relu(self.edge_attention_embed(h_edges))
75 | h_edges_embedded = self.dropout(h_edges_embedded)
76 |
77 | concat_encoded = torch.cat((encoded_input, h_edges_embedded), 1)
78 |
79 | # One-step of LSTM
80 | h_new, c_new = self.cell(concat_encoded, (h, c))
81 |
82 | # Get output
83 | out = self.output_linear(h_new)
84 |
85 | return out, h_new, c_new
86 |
87 |
88 | class HumanHumanEdgeRNN(nn.Module):
89 | '''
90 | Class representing the Human-Human Edge RNN in the s-t graph
91 | '''
92 | def __init__(self, args, infer=False):
93 | '''
94 | Initializer function
95 | params:
96 | args : Training arguments
97 | infer : Training or test time (True at test time)
98 | '''
99 | super(HumanHumanEdgeRNN, self).__init__()
100 |
101 | self.args = args
102 | self.infer = infer
103 |
104 | # Store required sizes
105 | self.rnn_size = args.human_human_edge_rnn_size
106 | self.embedding_size = args.human_human_edge_embedding_size
107 | self.input_size = args.human_human_edge_input_size
108 |
109 | # Linear layer to embed input
110 | self.encoder_linear = nn.Linear(self.input_size, self.embedding_size)
111 |
112 | # ReLU and Dropout layers
113 | self.relu = nn.ReLU()
114 | self.dropout = nn.Dropout(args.dropout)
115 |
116 | # The LSTM cell
117 | self.cell = nn.LSTMCell(self.embedding_size, self.rnn_size)
118 |
119 | def forward(self, inp, h, c):
120 | '''
121 | Forward pass for the model
122 | params:
123 | inp : input edge features
124 | h : hidden state of the current edgeRNN
125 | c : cell state of the current edgeRNN
126 | '''
127 | # Encode the input position
128 | encoded_input = self.encoder_linear(inp)
129 | encoded_input = self.relu(encoded_input)
130 | encoded_input = self.dropout(encoded_input)
131 |
132 | # One-step of LSTM
133 | h_new, c_new = self.cell(encoded_input, (h, c))
134 |
135 | return h_new, c_new
136 |
137 |
138 | class EdgeAttention(nn.Module):
139 | '''
140 | Class representing the attention module
141 | '''
142 | def __init__(self, args, infer=False):
143 | '''
144 | Initializer function
145 | params:
146 | args : Training arguments
147 | infer : Training or test time (True at test time)
148 | '''
149 | super(EdgeAttention, self).__init__()
150 |
151 | self.args = args
152 | self.infer = infer
153 |
154 | # Store required sizes
155 | self.human_human_edge_rnn_size = args.human_human_edge_rnn_size
156 | self.human_node_rnn_size = args.human_node_rnn_size
157 | self.attention_size = args.attention_size
158 |
159 | # Linear layer to embed temporal edgeRNN hidden state
160 | self.temporal_edge_layer = nn.Linear(self.human_human_edge_rnn_size, self.attention_size)
161 |
162 | # Linear layer to embed spatial edgeRNN hidden states
163 | self.spatial_edge_layer = nn.Linear(self.human_human_edge_rnn_size, self.attention_size)
164 |
165 | def forward(self, h_temporal, h_spatials):
166 | '''
167 | Forward pass for the model
168 | params:
169 | h_temporal : Hidden state of the temporal edgeRNN
170 | h_spatials : Hidden states of all spatial edgeRNNs connected to the node.
171 | '''
172 | # Number of spatial edges
173 | num_edges = h_spatials.size()[0]
174 |
175 | # Embed the temporal edgeRNN hidden state
176 | temporal_embed = self.temporal_edge_layer(h_temporal)
177 | temporal_embed = temporal_embed.squeeze(0)
178 |
179 | # Embed the spatial edgeRNN hidden states
180 | spatial_embed = self.spatial_edge_layer(h_spatials)
181 |
182 | # Dot based attention
183 | attn = torch.mv(spatial_embed, temporal_embed)
184 |
185 |         # Temperature scaling for the variable-size edge set: scale logits by num_edges / sqrt(attention_size) before the softmax
186 | temperature = num_edges / np.sqrt(self.attention_size)
187 | attn = torch.mul(attn, temperature)
188 |
189 | # Softmax
190 | attn = torch.nn.functional.softmax(attn)
191 |
192 | # Compute weighted value
193 | weighted_value = torch.mv(torch.t(h_spatials), attn)
194 |
195 | return weighted_value, attn
196 |
197 |
198 | class SRNN(nn.Module):
199 | '''
200 | Class representing the SRNN model
201 | '''
202 | def __init__(self, args, infer=False):
203 | '''
204 | Initializer function
205 | params:
206 | args : Training arguments
207 | infer : Training or test time (True at test time)
208 | '''
209 | super(SRNN, self).__init__()
210 |
211 | self.args = args
212 | self.infer = infer
213 |
214 | if self.infer:
215 | # Test time
216 | self.seq_length = 1
217 | self.obs_length = 1
218 | else:
219 | # Training time
220 | self.seq_length = args.seq_length
221 | self.obs_length = args.seq_length - args.pred_length
222 |
223 | # Store required sizes
224 | self.human_node_rnn_size = args.human_node_rnn_size
225 | self.human_human_edge_rnn_size = args.human_human_edge_rnn_size
226 | self.output_size = args.human_node_output_size
227 |
228 | # Initialize the Node and Edge RNNs
229 | self.humanNodeRNN = HumanNodeRNN(args, infer)
230 | self.humanhumanEdgeRNN_spatial = HumanHumanEdgeRNN(args, infer)
231 | self.humanhumanEdgeRNN_temporal = HumanHumanEdgeRNN(args, infer)
232 |
233 | # Initialize attention module
234 | self.attn = EdgeAttention(args, infer)
235 |
236 | def forward(self, nodes, edges, nodesPresent, edgesPresent, hidden_states_node_RNNs, hidden_states_edge_RNNs, cell_states_node_RNNs, cell_states_edge_RNNs):
237 | '''
238 | Forward pass for the model
239 | params:
240 | nodes : input node features
241 | edges : input edge features
242 | nodesPresent : A list of lists, of size seq_length
243 | Each list contains the nodeIDs that are present in the frame
244 | edgesPresent : A list of lists, of size seq_length
245 | Each list contains tuples of nodeIDs that have edges in the frame
246 | hidden_states_node_RNNs : A tensor of size numNodes x node_rnn_size
247 | Contains hidden states of the node RNNs
248 |         hidden_states_edge_RNNs : A tensor of size (numNodes * numNodes) x edge_rnn_size, with edge (a, b) at row a * numNodes + b
249 |             Contains hidden states of the edge RNNs; cell_states_node_RNNs and cell_states_edge_RNNs hold the matching LSTM cell states
250 |
251 | returns:
252 | outputs : A tensor of shape seq_length x numNodes x 5
253 | Contains the predictions for next time-step
254 |         hidden_states_node_RNNs, hidden_states_edge_RNNs : updated hidden states
255 |         cell_states_node_RNNs, cell_states_edge_RNNs, attn_weights : updated cell states and the per-frame attention weights
256 | '''
257 | # Get number of nodes
258 | numNodes = nodes.size()[1]
259 |
260 | # Initialize output array
261 | outputs = Variable(torch.zeros(self.seq_length*numNodes, self.output_size)).cuda()
262 |
263 | # Data structure to store attention weights
264 | attn_weights = [{} for _ in range(self.seq_length)]
265 |
266 | # For each frame
267 | for framenum in range(self.seq_length):
268 | # Find the edges present in the current frame
269 | edgeIDs = edgesPresent[framenum]
270 |
271 | # Separate temporal and spatial edges
272 | temporal_edges = [x for x in edgeIDs if x[0] == x[1]]
273 | spatial_edges = [x for x in edgeIDs if x[0] != x[1]]
274 |
275 | # Find the nodes present in the current frame
276 | nodeIDs = nodesPresent[framenum]
277 |
278 | # Get features of the nodes and edges present
279 | nodes_current = nodes[framenum]
280 | edges_current = edges[framenum]
281 |
282 | # Initialize temporary tensors
283 | hidden_states_nodes_from_edges_temporal = Variable(torch.zeros(numNodes, self.human_human_edge_rnn_size).cuda())
284 | hidden_states_nodes_from_edges_spatial = Variable(torch.zeros(numNodes, self.human_human_edge_rnn_size).cuda())
285 |
286 | # If there are any edges
287 | if len(edgeIDs) != 0:
288 |
289 | # Temporal Edges
290 | if len(temporal_edges) != 0:
291 | # Get the temporal edges
292 | list_of_temporal_edges = Variable(torch.LongTensor([x[0]*numNodes + x[0] for x in edgeIDs if x[0] == x[1]]).cuda())
293 | # Get nodes associated with the temporal edges
294 | list_of_temporal_nodes = torch.LongTensor([x[0] for x in edgeIDs if x[0] == x[1]]).cuda()
295 |
296 | # Get the corresponding edge features
297 | edges_temporal_start_end = torch.index_select(edges_current, 0, list_of_temporal_edges)
298 | # Get the corresponding hidden states
299 | hidden_temporal_start_end = torch.index_select(hidden_states_edge_RNNs, 0, list_of_temporal_edges)
300 | # Get the corresponding cell states
301 | cell_temporal_start_end = torch.index_select(cell_states_edge_RNNs, 0, list_of_temporal_edges)
302 |
303 | # Do forward pass through temporaledgeRNN
304 | h_temporal, c_temporal = self.humanhumanEdgeRNN_temporal(edges_temporal_start_end, hidden_temporal_start_end, cell_temporal_start_end)
305 |
306 | # Update the hidden state and cell state
307 | hidden_states_edge_RNNs[list_of_temporal_edges.data] = h_temporal
308 | cell_states_edge_RNNs[list_of_temporal_edges.data] = c_temporal
309 |
310 | # Store the temporal hidden states obtained in the temporary tensor
311 | hidden_states_nodes_from_edges_temporal[list_of_temporal_nodes] = h_temporal
312 |
313 | # Spatial Edges
314 | if len(spatial_edges) != 0:
315 | # Get the spatial edges
316 | list_of_spatial_edges = Variable(torch.LongTensor([x[0]*numNodes + x[1] for x in edgeIDs if x[0] != x[1]]).cuda())
317 | # Get nodes associated with the spatial edges
318 | list_of_spatial_nodes = np.array([x[0] for x in edgeIDs if x[0] != x[1]])
319 |
320 | # Get the corresponding edge features
321 | edges_spatial_start_end = torch.index_select(edges_current, 0, list_of_spatial_edges)
322 | # Get the corresponding hidden states
323 | hidden_spatial_start_end = torch.index_select(hidden_states_edge_RNNs, 0, list_of_spatial_edges)
324 | # Get the corresponding cell states
325 | cell_spatial_start_end = torch.index_select(cell_states_edge_RNNs, 0, list_of_spatial_edges)
326 |
327 | # Do forward pass through spatialedgeRNN
328 | h_spatial, c_spatial = self.humanhumanEdgeRNN_spatial(edges_spatial_start_end, hidden_spatial_start_end, cell_spatial_start_end)
329 |
330 | # Update the hidden state and cell state
331 | hidden_states_edge_RNNs[list_of_spatial_edges.data] = h_spatial
332 | cell_states_edge_RNNs[list_of_spatial_edges.data] = c_spatial
333 |
334 | # pass it to attention module
335 | # For each node
336 | for node in range(numNodes):
337 | # Get the indices of spatial edges associated with this node
338 | l = np.where(list_of_spatial_nodes == node)[0]
339 | if len(l) == 0:
340 | # If the node has no spatial edges, nothing to do
341 | continue
342 | l = torch.LongTensor(l).cuda()
343 | # What are the other nodes with these edges?
344 | node_others = [x[1] for x in edgeIDs if x[0] == node and x[0] != x[1]]
345 | # If it has spatial edges
346 | # Get its corresponding temporal edgeRNN hidden state
347 | h_node = hidden_states_nodes_from_edges_temporal[node]
348 |
349 | # Do forward pass through attention module
350 | hidden_attn_weighted, attn_w = self.attn(h_node.view(1, -1), h_spatial[l])
351 |
352 | # Store the attention weights
353 | attn_weights[framenum][node] = (attn_w.data.cpu().numpy(), node_others)
354 |
355 | # Store the output of attention module in temporary tensor
356 | hidden_states_nodes_from_edges_spatial[node] = hidden_attn_weighted
357 |
358 | # If there are nodes in this frame
359 | if len(nodeIDs) != 0:
360 |
361 | # Get list of nodes
362 | list_of_nodes = Variable(torch.LongTensor(nodeIDs).cuda())
363 |
364 | # Get their node features
365 | nodes_current_selected = torch.index_select(nodes_current, 0, list_of_nodes)
366 |
367 | # Get the hidden and cell states of the corresponding nodes
368 | hidden_nodes_current = torch.index_select(hidden_states_node_RNNs, 0, list_of_nodes)
369 | cell_nodes_current = torch.index_select(cell_states_node_RNNs, 0, list_of_nodes)
370 |
371 | # Get the temporal edgeRNN hidden states corresponding to these nodes
372 | h_temporal_other = hidden_states_nodes_from_edges_temporal[list_of_nodes.data]
373 | h_spatial_other = hidden_states_nodes_from_edges_spatial[list_of_nodes.data]
374 |
375 | # Do a forward pass through nodeRNN
376 | outputs[framenum * numNodes + list_of_nodes.data], h_nodes, c_nodes = self.humanNodeRNN(nodes_current_selected, h_temporal_other, h_spatial_other, hidden_nodes_current, cell_nodes_current)
377 |
378 | # Update the hidden and cell states
379 | hidden_states_node_RNNs[list_of_nodes.data] = h_nodes
380 | cell_states_node_RNNs[list_of_nodes.data] = c_nodes
381 |
382 |         # Gather the per-frame outputs into a seq_length x numNodes x output_size tensor
383 | outputs_return = Variable(torch.zeros(self.seq_length, numNodes, self.output_size).cuda())
384 | for framenum in range(self.seq_length):
385 | for node in range(numNodes):
386 | outputs_return[framenum, node, :] = outputs[framenum*numNodes + node, :]
387 |
388 | return outputs_return, hidden_states_node_RNNs, hidden_states_edge_RNNs, cell_states_node_RNNs, cell_states_edge_RNNs, attn_weights
389 |
--------------------------------------------------------------------------------
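
The spatial branch of the forward pass above queries an attention module with each node's temporal edgeRNN hidden state (self.attn(h_node.view(1, -1), h_spatial[l])). The module itself is defined earlier in model.py; purely to illustrate the mechanism, here is a self-contained scaled dot-product sketch with hypothetical names, using the default sizes from train.py (edge RNN size 256, attention size 64):

import torch
import torch.nn as nn
import torch.nn.functional as F

class DotAttentionSketch(nn.Module):
    # Illustration only: this is NOT the attn module defined earlier in
    # model.py; the class name, projections and scaling are assumptions.
    def __init__(self, edge_rnn_size=256, attention_size=64):
        super(DotAttentionSketch, self).__init__()
        self.temporal_embed = nn.Linear(edge_rnn_size, attention_size)
        self.spatial_embed = nn.Linear(edge_rnn_size, attention_size)
        self.scale = attention_size ** 0.5

    def forward(self, h_node, h_spatial):
        # h_node: (1, edge_rnn_size) query from the node's temporal edgeRNN
        # h_spatial: (k, edge_rnn_size) hidden states of its k spatial edges
        q = self.temporal_embed(h_node)                          # (1, attention_size)
        keys = self.spatial_embed(h_spatial)                     # (k, attention_size)
        scores = torch.mm(keys, q.t()).squeeze(1) / self.scale   # (k,)
        attn_w = F.softmax(scores, dim=0)                        # one weight per spatial edge
        weighted = torch.mv(h_spatial.t(), attn_w)               # (edge_rnn_size,)
        return weighted, attn_w
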
/srnn/sample.py:
--------------------------------------------------------------------------------
1 | '''
2 | Test script for the structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 2nd April 2017
7 | '''
8 |
9 |
10 | import os
11 | import pickle
12 | import argparse
13 | import time
14 |
15 | import torch
16 | from torch.autograd import Variable
17 |
18 | import numpy as np
19 | from utils import DataLoader
20 | from st_graph import ST_GRAPH
21 | from model import SRNN
22 | from helper import getCoef, sample_gaussian_2d, compute_edges, get_mean_error, get_final_error
23 | from criterion import Gaussian2DLikelihood, Gaussian2DLikelihoodInference
24 |
25 |
26 | def main():
27 |
28 | parser = argparse.ArgumentParser()
29 | # Observed length of the trajectory parameter
30 | parser.add_argument('--obs_length', type=int, default=8,
31 | help='Observed length of the trajectory')
32 | # Predicted length of the trajectory parameter
33 | parser.add_argument('--pred_length', type=int, default=12,
34 | help='Predicted length of the trajectory')
35 | # Test dataset
36 | parser.add_argument('--test_dataset', type=int, default=3,
37 | help='Dataset to be tested on')
38 |
39 | # Model to be loaded
40 | parser.add_argument('--epoch', type=int, default=49,
41 | help='Epoch of model to be loaded')
42 |
43 | # Parse the parameters
44 | sample_args = parser.parse_args()
45 |
46 | # Save directory
47 | save_directory = 'save/'
48 | save_directory += str(sample_args.test_dataset)+'/'
49 | save_directory += 'save_attention'
50 |
51 | # Define the path for the config file for saved args
52 | with open(os.path.join(save_directory, 'config.pkl'), 'rb') as f:
53 | saved_args = pickle.load(f)
54 |
55 | # Initialize net
56 | net = SRNN(saved_args, True)
57 | net.cuda()
58 |
59 | checkpoint_path = os.path.join(save_directory, 'srnn_model_'+str(sample_args.epoch)+'.tar')
60 |
61 | if os.path.isfile(checkpoint_path):
62 | print('Loading checkpoint')
63 | checkpoint = torch.load(checkpoint_path)
64 | # model_iteration = checkpoint['iteration']
65 | model_epoch = checkpoint['epoch']
66 | net.load_state_dict(checkpoint['state_dict'])
67 | print('Loaded checkpoint at {}'.format(model_epoch))
68 |
69 | # Dataset to get data from
70 | dataset = [sample_args.test_dataset]
71 |
72 | dataloader = DataLoader(1, sample_args.pred_length + sample_args.obs_length, dataset, True, infer=True)
73 |
74 | dataloader.reset_batch_pointer()
75 |
76 | # Construct the ST-graph object
77 | stgraph = ST_GRAPH(1, sample_args.pred_length + sample_args.obs_length)
78 |
79 | results = []
80 |
81 | # Variable to maintain total error
82 | total_error = 0
83 | final_error = 0
84 |
85 | for batch in range(dataloader.num_batches):
86 | start = time.time()
87 |
88 | # Get the next batch
89 | x, _, frameIDs, d = dataloader.next_batch(randomUpdate=False)
90 |
91 | # Construct ST graph
92 | stgraph.readGraph(x)
93 |
94 | nodes, edges, nodesPresent, edgesPresent = stgraph.getSequence()
95 |
96 | # Convert to cuda variables
97 | nodes = Variable(torch.from_numpy(nodes).float(), volatile=True).cuda()
98 | edges = Variable(torch.from_numpy(edges).float(), volatile=True).cuda()
99 |
100 | # Separate out the observed part of the trajectory
101 | obs_nodes, obs_edges, obs_nodesPresent, obs_edgesPresent = nodes[:sample_args.obs_length], edges[:sample_args.obs_length], nodesPresent[:sample_args.obs_length], edgesPresent[:sample_args.obs_length]
102 |
103 | # Sample function
104 | ret_nodes, ret_attn = sample(obs_nodes, obs_edges, obs_nodesPresent, obs_edgesPresent, sample_args, net, nodes, edges, nodesPresent)
105 |
106 | # Compute mean and final displacement error
107 | total_error += get_mean_error(ret_nodes[sample_args.obs_length:].data, nodes[sample_args.obs_length:].data, nodesPresent[sample_args.obs_length-1], nodesPresent[sample_args.obs_length:])
108 | final_error += get_final_error(ret_nodes[sample_args.obs_length:].data, nodes[sample_args.obs_length:].data, nodesPresent[sample_args.obs_length-1], nodesPresent[sample_args.obs_length:])
109 |
110 | end = time.time()
111 |
112 | print('Processed trajectory number : ', batch, 'out of', dataloader.num_batches, 'trajectories in time', end - start)
113 |
114 | # Store results
115 | results.append((nodes.data.cpu().numpy(), ret_nodes.data.cpu().numpy(), nodesPresent, sample_args.obs_length, ret_attn, frameIDs))
116 |
117 | # Reset the ST graph
118 | stgraph.reset()
119 |
120 | print('Total mean error of the model is ', total_error / dataloader.num_batches)
121 | print('Total final error of the model is ', final_error / dataloader.num_batches)
122 |
123 | print('Saving results')
124 | with open(os.path.join(save_directory, 'results.pkl'), 'wb') as f:
125 | pickle.dump(results, f)
126 |
127 |
128 | def sample(nodes, edges, nodesPresent, edgesPresent, args, net, true_nodes, true_edges, true_nodesPresent):
129 | '''
130 | Sample function
131 | Parameters
132 | ==========
133 |
134 | nodes : A tensor of shape obs_length x numNodes x 2
135 | Each row contains (x, y)
136 |
137 | edges : A tensor of shape obs_length x numNodes x numNodes x 2
138 | Each row contains the vector representing the edge
139 | If edge doesn't exist, then the row contains zeros
140 |
141 | nodesPresent : A list of lists, of size obs_length
142 | Each list contains the nodeIDs that are present in the frame
143 |
144 | edgesPresent : A list of lists, of size obs_length
145 | Each list contains tuples of nodeIDs that have edges in the frame
146 |
147 | args : Sampling Arguments
148 |
149 | net : The network
150 |
151 | Returns
152 | =======
153 |
154 | ret_nodes : A tensor of shape (obs_length + pred_length) x numNodes x 2
155 | Contains the true and predicted positions of all the nodes
156 | '''
157 | # Number of nodes
158 | numNodes = nodes.size()[1]
159 |
160 | # Initialize hidden states for the nodes
161 | h_nodes = Variable(torch.zeros(numNodes, net.args.human_node_rnn_size), volatile=True).cuda()
162 | h_edges = Variable(torch.zeros(numNodes * numNodes, net.args.human_human_edge_rnn_size), volatile=True).cuda()
163 | c_nodes = Variable(torch.zeros(numNodes, net.args.human_node_rnn_size), volatile=True).cuda()
164 | c_edges = Variable(torch.zeros(numNodes * numNodes, net.args.human_human_edge_rnn_size), volatile=True).cuda()
165 |
166 | # Propagate the observed length of the trajectory
167 | for tstep in range(args.obs_length-1):
168 | # Forward prop
169 | out_obs, h_nodes, h_edges, c_nodes, c_edges, _ = net(nodes[tstep].view(1, numNodes, 2), edges[tstep].view(1, numNodes*numNodes, 2), [nodesPresent[tstep]], [edgesPresent[tstep]], h_nodes, h_edges, c_nodes, c_edges)
170 | # loss_obs = Gaussian2DLikelihood(out_obs, nodes[tstep+1].view(1, numNodes, 2), [nodesPresent[tstep+1]])
171 |
172 | # Initialize the return data structures
173 | ret_nodes = Variable(torch.zeros(args.obs_length + args.pred_length, numNodes, 2), volatile=True).cuda()
174 | ret_nodes[:args.obs_length, :, :] = nodes.clone()
175 |
176 | ret_edges = Variable(torch.zeros((args.obs_length + args.pred_length), numNodes * numNodes, 2), volatile=True).cuda()
177 | ret_edges[:args.obs_length, :, :] = edges.clone()
178 |
179 | ret_attn = []
180 |
181 | # Propagate the predicted length of trajectory (sampling from previous prediction)
182 | for tstep in range(args.obs_length-1, args.pred_length + args.obs_length-1):
183 | # TODO Not keeping track of nodes leaving the frame (or new nodes entering the frame, which I don't think we can do anyway)
184 | # Forward prop
185 | outputs, h_nodes, h_edges, c_nodes, c_edges, attn_w = net(ret_nodes[tstep].view(1, numNodes, 2), ret_edges[tstep].view(1, numNodes*numNodes, 2),
186 | [nodesPresent[args.obs_length-1]], [edgesPresent[args.obs_length-1]], h_nodes, h_edges, c_nodes, c_edges)
187 | loss_pred = Gaussian2DLikelihoodInference(outputs, true_nodes[tstep + 1].view(1, numNodes, 2), nodesPresent[args.obs_length-1], [true_nodesPresent[tstep + 1]])
188 |
189 |         # Sample the next positions from the predicted bivariate Gaussians
190 | # mux, ... are tensors of shape 1 x numNodes
191 | mux, muy, sx, sy, corr = getCoef(outputs)
192 | next_x, next_y = sample_gaussian_2d(mux.data, muy.data, sx.data, sy.data, corr.data, nodesPresent[args.obs_length-1])
193 |
194 | ret_nodes[tstep + 1, :, 0] = next_x
195 | ret_nodes[tstep + 1, :, 1] = next_y
196 |
197 | # Compute edges
198 | # TODO Currently, assuming edges from the last observed time-step will stay for the entire prediction length
199 | ret_edges[tstep + 1, :, :] = compute_edges(ret_nodes.data, tstep + 1, edgesPresent[args.obs_length-1])
200 |
201 | # Store computed attention weights
202 | ret_attn.append(attn_w[0])
203 |
204 | return ret_nodes, ret_attn
205 |
206 |
207 | if __name__ == '__main__':
208 | main()
209 |
--------------------------------------------------------------------------------
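
sample() maps the 5-dimensional node output to the parameters of a bivariate Gaussian via helper.getCoef and then draws the next position with helper.sample_gaussian_2d (both dumped earlier in this repository). As a reference for the sampling step only, a minimal numpy sketch under the usual (mux, muy, sx, sy, corr) parametrization; the function name and the numpy interface are illustrative, not the helper.py implementation:

import numpy as np

def sample_gaussian_2d_sketch(mux, muy, sx, sy, corr, nodesPresent):
    # Hypothetical re-implementation for illustration only; the real
    # sample_gaussian_2d is defined in srnn/helper.py. Each parameter
    # argument is a 1 x numNodes array of bivariate Gaussian parameters.
    numNodes = mux.shape[1]
    next_x, next_y = np.zeros(numNodes), np.zeros(numNodes)
    for node in nodesPresent:
        mean = [mux[0, node], muy[0, node]]
        off_diag = corr[0, node] * sx[0, node] * sy[0, node]
        cov = [[sx[0, node] ** 2, off_diag],
               [off_diag, sy[0, node] ** 2]]
        next_x[node], next_y[node] = np.random.multivariate_normal(mean, cov)
    return next_x, next_y
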
/srnn/st_graph.py:
--------------------------------------------------------------------------------
1 | '''
2 | ST-graph data structure script for the structural RNN implementation
3 | Takes a batch of sequences and generates corresponding ST-graphs
4 |
5 | Author : Anirudh Vemula
6 | Date : 15th March 2017
7 | '''
8 | import numpy as np
9 | from helper import getVector, getMagnitudeAndDirection
10 |
11 |
12 | class ST_GRAPH():
13 |
14 | def __init__(self, batch_size=50, seq_length=5):
15 | '''
16 | Initializer function for the ST graph class
17 | params:
18 | batch_size : Size of the mini-batch
19 | seq_length : Sequence length to be considered
20 | '''
21 | self.batch_size = batch_size
22 | self.seq_length = seq_length
23 |
24 | self.nodes = [{} for i in range(batch_size)]
25 | self.edges = [{} for i in range(batch_size)]
26 |
27 | def reset(self):
28 | self.nodes = [{} for i in range(self.batch_size)]
29 | self.edges = [{} for i in range(self.batch_size)]
30 |
31 | def readGraph(self, source_batch):
32 | '''
33 | Main function that constructs the ST graph from the batch data
34 | params:
35 | source_batch : List of lists of numpy arrays. Each numpy array corresponds to a frame in the sequence.
36 | '''
37 | for sequence in range(self.batch_size):
38 | # source_seq is a list of numpy arrays
39 | # where each numpy array corresponds to a single frame
40 | source_seq = source_batch[sequence]
41 | for framenum in range(self.seq_length):
42 | # Each frame is a numpy array
43 | # each row in the array is of the form
44 | # pedID, x, y
45 | frame = source_seq[framenum]
46 |
47 | # Add nodes
48 | for ped in range(frame.shape[0]):
49 | pedID = frame[ped, 0]
50 | x = frame[ped, 1]
51 | y = frame[ped, 2]
52 | pos = (x, y)
53 |
54 | if pedID not in self.nodes[sequence]:
55 | node_type = 'H'
56 | node_id = pedID
57 | node_pos_list = {}
58 | node_pos_list[framenum] = pos
59 | self.nodes[sequence][pedID] = ST_NODE(node_type, node_id, node_pos_list)
60 | else:
61 | self.nodes[sequence][pedID].addPosition(pos, framenum)
62 |
63 |                         # Add temporal edge between the node at the current time-step
64 |                         # and the node at the previous time-step (only for pedestrians already seen)
65 |                         edge_id = (pedID, pedID)
66 |                         pos_edge = (self.nodes[sequence][pedID].getPosition(framenum-1), pos)
67 |                         if edge_id not in self.edges[sequence]:
68 |                             edge_type = 'H-H/T'
69 |                             edge_pos_list = {}
70 |                             # ASSUMPTION: Adding temporal edge at the later time-step
71 |                             edge_pos_list[framenum] = pos_edge
72 |                             self.edges[sequence][edge_id] = ST_EDGE(edge_type, edge_id, edge_pos_list)
73 |                         else:
74 |                             self.edges[sequence][edge_id].addPosition(pos_edge, framenum)
75 |
76 | # ASSUMPTION:
77 | # Adding spatial edges between all pairs of pedestrians.
78 | # TODO:
79 | # Can be pruned by considering pedestrians who are close to each other
80 | # Add spatial edges
81 | for ped_in in range(frame.shape[0]):
82 | for ped_out in range(ped_in+1, frame.shape[0]):
83 | pedID_in = frame[ped_in, 0]
84 | pedID_out = frame[ped_out, 0]
85 | pos_in = (frame[ped_in, 1], frame[ped_in, 2])
86 | pos_out = (frame[ped_out, 1], frame[ped_out, 2])
87 | pos = (pos_in, pos_out)
88 | edge_id = (pedID_in, pedID_out)
89 | # ASSUMPTION:
90 | # Assuming that pedIDs always are in increasing order in the input batch data
91 | if edge_id not in self.edges[sequence]:
92 | edge_type = 'H-H/S'
93 | edge_pos_list = {}
94 | edge_pos_list[framenum] = pos
95 | self.edges[sequence][edge_id] = ST_EDGE(edge_type, edge_id, edge_pos_list)
96 | else:
97 | self.edges[sequence][edge_id].addPosition(pos, framenum)
98 |
99 | def printGraph(self):
100 | '''
101 | Print function for the graph
102 | For debugging purposes
103 | '''
104 | for sequence in range(self.batch_size):
105 | nodes = self.nodes[sequence]
106 | edges = self.edges[sequence]
107 |
108 | print('Printing Nodes')
109 | print('===============================')
110 | for node in nodes.values():
111 | node.printNode()
112 | print('--------------')
113 |
114 |             print()
115 | print('Printing Edges')
116 | print('===============================')
117 | for edge in edges.values():
118 | edge.printEdge()
119 | print('--------------')
120 |
121 | def getSequence(self):
122 | '''
123 |         Converts sequence 0 of the batch into node/edge feature tensors and per-frame presence lists
124 | '''
125 | nodes = self.nodes[0]
126 | edges = self.edges[0]
127 |
128 | numNodes = len(nodes.keys())
129 | list_of_nodes = {}
130 |
131 | retNodes = np.zeros((self.seq_length, numNodes, 2))
132 | retEdges = np.zeros((self.seq_length, numNodes*numNodes, 2)) # Diagonal contains temporal edges
133 | retNodePresent = [[] for c in range(self.seq_length)]
134 | retEdgePresent = [[] for c in range(self.seq_length)]
135 |
136 | for i, ped in enumerate(nodes.keys()):
137 | list_of_nodes[ped] = i
138 | pos_list = nodes[ped].node_pos_list
139 | for framenum in range(self.seq_length):
140 | if framenum in pos_list:
141 | retNodePresent[framenum].append(i)
142 | retNodes[framenum, i, :] = list(pos_list[framenum])
143 |
144 | for ped, ped_other in edges.keys():
145 | i, j = list_of_nodes[ped], list_of_nodes[ped_other]
146 | edge = edges[(ped, ped_other)]
147 |
148 | if ped == ped_other:
149 | # Temporal edge
150 | for framenum in range(self.seq_length):
151 | if framenum in edge.edge_pos_list:
152 | retEdgePresent[framenum].append((i, j))
153 | retEdges[framenum, i*(numNodes) + j, :] = getVector(edge.edge_pos_list[framenum])
154 | else:
155 | # Spatial edge
156 | for framenum in range(self.seq_length):
157 | if framenum in edge.edge_pos_list:
158 | retEdgePresent[framenum].append((i, j))
159 | retEdgePresent[framenum].append((j, i))
160 | # the position returned is a tuple of tuples
161 |
162 | retEdges[framenum, i*numNodes + j, :] = getVector(edge.edge_pos_list[framenum])
163 | retEdges[framenum, j*numNodes + i, :] = -np.copy(retEdges[framenum, i*(numNodes) + j, :])
164 |
165 | return retNodes, retEdges, retNodePresent, retEdgePresent
166 |
167 |
168 | class ST_NODE():
169 |
170 | def __init__(self, node_type, node_id, node_pos_list):
171 | '''
172 | Initializer function for the ST node class
173 | params:
174 | node_type : Type of the node (Human or Obstacle)
175 | node_id : Pedestrian ID or the obstacle ID
176 | node_pos_list : Positions of the entity associated with the node in the sequence
177 | '''
178 | self.node_type = node_type
179 | self.node_id = node_id
180 | self.node_pos_list = node_pos_list
181 |
182 | def getPosition(self, index):
183 | '''
184 | Get the position of the node at time-step index in the sequence
185 | params:
186 | index : time-step
187 | '''
188 | assert(index in self.node_pos_list)
189 | return self.node_pos_list[index]
190 |
191 | def getType(self):
192 | '''
193 | Get node type
194 | '''
195 | return self.node_type
196 |
197 | def getID(self):
198 | '''
199 | Get node ID
200 | '''
201 | return self.node_id
202 |
203 | def addPosition(self, pos, index):
204 | '''
205 | Add position to the pos_list at a specific time-step
206 | params:
207 | pos : A tuple (x, y)
208 | index : time-step
209 | '''
210 | assert(index not in self.node_pos_list)
211 | self.node_pos_list[index] = pos
212 |
213 | def printNode(self):
214 | '''
215 | Print function for the node
216 | For debugging purposes
217 | '''
218 | print('Node type:', self.node_type, 'with ID:', self.node_id, 'with positions:', self.node_pos_list.values(), 'at time-steps:', self.node_pos_list.keys())
219 |
220 |
221 | class ST_EDGE():
222 |
223 | def __init__(self, edge_type, edge_id, edge_pos_list):
224 | '''
225 |         Initializer function for the ST edge class
226 | params:
227 | edge_type : Type of the edge (Human-Human or Human-Obstacle)
228 | edge_id : Tuple (or set) of node IDs involved with the edge
229 | edge_pos_list : Positions of the nodes involved with the edge
230 | '''
231 | self.edge_type = edge_type
232 | self.edge_id = edge_id
233 | self.edge_pos_list = edge_pos_list
234 |
235 | def getPositions(self, index):
236 | '''
237 | Get Positions of the nodes at time-step index in the sequence
238 | params:
239 | index : time-step
240 | '''
241 | assert(index in self.edge_pos_list)
242 | return self.edge_pos_list[index]
243 |
244 | def getType(self):
245 | '''
246 | Get edge type
247 | '''
248 | return self.edge_type
249 |
250 | def getID(self):
251 | '''
252 | Get edge ID
253 | '''
254 | return self.edge_id
255 |
256 | def addPosition(self, pos, index):
257 | '''
258 | Add a position to the pos_list at a specific time-step
259 | params:
260 | pos : A tuple (x, y)
261 | index : time-step
262 | '''
263 | assert(index not in self.edge_pos_list)
264 | self.edge_pos_list[index] = pos
265 |
266 | def printEdge(self):
267 | '''
268 | Print function for the edge
269 | For debugging purposes
270 | '''
271 | print('Edge type:', self.edge_type, 'between nodes:', self.edge_id, 'at time-steps:', self.edge_pos_list.keys())
272 |
--------------------------------------------------------------------------------
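
A small worked example of the classes above, assuming it is run from inside the srnn/ directory so that the helper import in st_graph.py resolves. Two synthetic frames in the (pedID, x, y) format that readGraph expects:

import numpy as np
from st_graph import ST_GRAPH

# Pedestrians 1 and 2 are present in both frames
frame0 = np.array([[1., 0.10, 0.20], [2., 0.50, 0.60]])
frame1 = np.array([[1., 0.12, 0.22], [2., 0.48, 0.58]])

g = ST_GRAPH(batch_size=1, seq_length=2)
g.readGraph([[frame0, frame1]])

nodes, edges, nodesPresent, edgesPresent = g.getSequence()
print(nodes.shape)      # (2, 2, 2): seq_length x numNodes x 2
print(edges.shape)      # (2, 4, 2): seq_length x numNodes^2 x 2
print(nodesPresent)     # [[0, 1], [0, 1]]
print(edgesPresent[1])  # spatial pairs (0, 1), (1, 0) plus temporal self-edges (0, 0), (1, 1)
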
/srnn/test/__init__.py:
--------------------------------------------------------------------------------
https://raw.githubusercontent.com/BenJamesbabala/srnn-pytorch/01a48e5ff267bf7773819902fa8077ee6fa9254f/srnn/test/__init__.py
--------------------------------------------------------------------------------
/srnn/test/criterion_test.py:
--------------------------------------------------------------------------------
1 | from ..criterion import Gaussian2DLikelihood
2 | import torch
3 | from torch.autograd import Variable
4 |
5 |
6 | outputs = torch.ones(5, 5, 5)
7 | targets = torch.zeros(5, 5, 2)
8 | nodesPresent = [[0, 1], [1, 2], [1, 2, 3], [3, 4], [3, 4]]
9 |
10 | outputs = Variable(outputs)
11 | targets = Variable(targets)
12 |
13 | loss = Gaussian2DLikelihood(outputs, targets, nodesPresent, 3)  # pred_length last, matching the four-argument usage in train.py
14 | print(loss)
15 |
--------------------------------------------------------------------------------
/srnn/test/graph_visualization_test.py:
--------------------------------------------------------------------------------
1 | # from ..visualize import make_dot  # make_dot is not defined in this version of visualize.py and is unused below
2 | from ..model import SRNN
3 | from ..utils import DataLoader
4 | from ..st_graph import ST_GRAPH
5 | import pickle
6 | import os
7 | import torch
8 | from torch.autograd import Variable
9 |
10 | with open(os.path.join('save', 'config.pkl'), 'rb') as f:
11 | args = pickle.load(f)
12 |
13 | net = SRNN(args).cuda()  # the inputs below are cuda tensors, so the net must be on the GPU too
14 |
15 | dataset = [0]
16 |
17 | dataloader = DataLoader(1, 8, dataset, True)
18 |
19 | dataloader.reset_batch_pointer()
20 |
21 | stgraph = ST_GRAPH(1, 8)
22 |
23 | x, _, _, d = dataloader.next_batch()  # next_batch returns sources, targets, frame IDs and dataset indices
24 | stgraph.readGraph(x)
25 |
26 | nodes, edges, nodesPresent, edgesPresent = stgraph.getSequence()
27 |
28 | # Convert to cuda variables
29 | nodes = Variable(torch.from_numpy(nodes).float()).cuda()
30 | edges = Variable(torch.from_numpy(edges).float()).cuda()
31 |
32 | numNodes = nodes.size()[1]
33 | hidden_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
34 | hidden_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
35 | cell_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
36 | cell_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
37 |
38 | net.zero_grad()
39 |
40 | outputs, _, _, _, _, _ = net(nodes[:args.seq_length], edges[:args.seq_length], nodesPresent[:-1], edgesPresent[:-1], hidden_states_node_RNNs, hidden_states_edge_RNNs, cell_states_node_RNNs, cell_states_edge_RNNs)
41 |
42 | print(outputs)
43 |
--------------------------------------------------------------------------------
/srnn/test/st_graph_test.py:
--------------------------------------------------------------------------------
1 | from ..utils import DataLoader
2 | from ..st_graph import ST_GRAPH
3 |
4 | bsize = 100
5 | slength = 10
6 |
7 | dataloader = DataLoader(batch_size=bsize, seq_length=slength, datasets=[0], forcePreProcess=True)
8 | x, y, _, d = dataloader.next_batch()  # next_batch also returns the frame IDs of each sequence
9 |
10 | graph = ST_GRAPH(batch_size=bsize, seq_length=slength)
11 | graph.readGraph(x)
12 |
13 | # graph.printGraph()
14 |
15 | nodes, edges, nodesPresent, edgesPresent = graph.getSequence()
16 |
--------------------------------------------------------------------------------
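
Since the three test scripts above use relative imports (from ..utils import DataLoader, and so on), they are presumably meant to be run as modules from the repository root, for example:

python -m srnn.test.st_graph_test

criterion_test.py needs no data at all, st_graph_test.py needs the raw csv files under data/, and graph_visualization_test.py additionally expects a trained config.pkl under save/.
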
/srnn/train.py:
--------------------------------------------------------------------------------
1 | '''
2 | Train script for the structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 29th March 2017
7 | '''
8 |
9 | import argparse
10 | import os
11 | import pickle
12 | import time
13 |
14 | import torch
15 | from torch.autograd import Variable
16 |
17 | from utils import DataLoader
18 | from st_graph import ST_GRAPH
19 | from model import SRNN
20 | from criterion import Gaussian2DLikelihood
21 |
22 |
23 | def main():
24 | parser = argparse.ArgumentParser()
25 |
26 | # RNN size
27 | parser.add_argument('--human_node_rnn_size', type=int, default=128,
28 | help='Size of Human Node RNN hidden state')
29 | parser.add_argument('--human_human_edge_rnn_size', type=int, default=256,
30 | help='Size of Human Human Edge RNN hidden state')
31 |
32 | # Input and output size
33 | parser.add_argument('--human_node_input_size', type=int, default=2,
34 | help='Dimension of the node features')
35 | parser.add_argument('--human_human_edge_input_size', type=int, default=2,
36 | help='Dimension of the edge features')
37 | parser.add_argument('--human_node_output_size', type=int, default=5,
38 | help='Dimension of the node output')
39 |
40 | # Embedding size
41 | parser.add_argument('--human_node_embedding_size', type=int, default=64,
42 | help='Embedding size of node features')
43 | parser.add_argument('--human_human_edge_embedding_size', type=int, default=64,
44 | help='Embedding size of edge features')
45 |
46 | # Attention vector dimension
47 | parser.add_argument('--attention_size', type=int, default=64,
48 | help='Attention size')
49 |
50 | # Sequence length
51 | parser.add_argument('--seq_length', type=int, default=20,
52 | help='Sequence length')
53 | parser.add_argument('--pred_length', type=int, default=12,
54 | help='Predicted sequence length')
55 |
56 | # Batch size
57 | parser.add_argument('--batch_size', type=int, default=8,
58 | help='Batch size')
59 |
60 | # Number of epochs
61 | parser.add_argument('--num_epochs', type=int, default=200,
62 | help='number of epochs')
63 |
64 | # Gradient value at which it should be clipped
65 | parser.add_argument('--grad_clip', type=float, default=10.,
66 | help='clip gradients at this value')
67 | # Lambda regularization parameter (L2)
68 | parser.add_argument('--lambda_param', type=float, default=0.00005,
69 | help='L2 regularization parameter')
70 |
71 | # Learning rate parameter
72 | parser.add_argument('--learning_rate', type=float, default=0.001,
73 | help='learning rate')
74 | # Decay rate for the learning rate parameter
75 | parser.add_argument('--decay_rate', type=float, default=0.99,
76 | help='decay rate for the optimizer')
77 |
78 | # Dropout rate
79 | parser.add_argument('--dropout', type=float, default=0,
80 | help='Dropout probability')
81 |
82 | # The leave out dataset
83 | parser.add_argument('--leaveDataset', type=int, default=3,
84 | help='The dataset index to be left out in training')
85 |
86 | args = parser.parse_args()
87 |
88 | train(args)
89 |
90 |
91 | def train(args):
92 | datasets = [i for i in range(5)]
93 | # Remove the leave out dataset from the datasets
94 | datasets.remove(args.leaveDataset)
95 | # datasets = [0]
96 | # args.leaveDataset = 0
97 |
98 | # Construct the DataLoader object
99 | dataloader = DataLoader(args.batch_size, args.seq_length + 1, datasets, forcePreProcess=True)
100 |
101 | # Construct the ST-graph object
102 | stgraph = ST_GRAPH(1, args.seq_length + 1)
103 |
104 | # Log directory
105 | log_directory = 'log/'
106 | log_directory += str(args.leaveDataset)+'/'
107 | log_directory += 'log_attention'
108 |
109 | # Logging file
110 | log_file_curve = open(os.path.join(log_directory, 'log_curve.txt'), 'w')
111 | log_file = open(os.path.join(log_directory, 'val.txt'), 'w')
112 |
113 | # Save directory
114 | save_directory = 'save/'
115 | save_directory += str(args.leaveDataset)+'/'
116 | save_directory += 'save_attention'
117 |
118 |     # Dump the arguments into the configuration file
119 | with open(os.path.join(save_directory, 'config.pkl'), 'wb') as f:
120 | pickle.dump(args, f)
121 |
122 | # Path to store the checkpoint file
123 | def checkpoint_path(x):
124 | return os.path.join(save_directory, 'srnn_model_'+str(x)+'.tar')
125 |
126 | # Initialize net
127 | net = SRNN(args)
128 | net.cuda()
129 |
130 | optimizer = torch.optim.Adam(net.parameters(), lr=args.learning_rate)
131 | # optimizer = torch.optim.RMSprop(net.parameters(), lr=args.learning_rate, momentum=0.0001, centered=True)
132 |
133 | learning_rate = args.learning_rate
134 | print('Training begin')
135 | best_val_loss = 100
136 | best_epoch = 0
137 |
138 | # Training
139 | for epoch in range(args.num_epochs):
140 | dataloader.reset_batch_pointer(valid=False)
141 | loss_epoch = 0
142 |
143 | # For each batch
144 | for batch in range(dataloader.num_batches):
145 | start = time.time()
146 |
147 | # Get batch data
148 | x, _, _, d = dataloader.next_batch(randomUpdate=True)
149 |
150 | # Loss for this batch
151 | loss_batch = 0
152 |
153 | # For each sequence in the batch
154 | for sequence in range(dataloader.batch_size):
155 | # Construct the graph for the current sequence
156 | stgraph.readGraph([x[sequence]])
157 |
158 | nodes, edges, nodesPresent, edgesPresent = stgraph.getSequence()
159 |
160 | # Convert to cuda variables
161 | nodes = Variable(torch.from_numpy(nodes).float()).cuda()
162 | edges = Variable(torch.from_numpy(edges).float()).cuda()
163 |
164 | # Define hidden states
165 | numNodes = nodes.size()[1]
166 | hidden_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
167 | hidden_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
168 |
169 | cell_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
170 | cell_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
171 |
172 | # Zero out the gradients
173 | net.zero_grad()
174 | optimizer.zero_grad()
175 |
176 | # Forward prop
177 | outputs, _, _, _, _, _ = net(nodes[:args.seq_length], edges[:args.seq_length], nodesPresent[:-1], edgesPresent[:-1], hidden_states_node_RNNs, hidden_states_edge_RNNs, cell_states_node_RNNs, cell_states_edge_RNNs)
178 |
179 | # Compute loss
180 | loss = Gaussian2DLikelihood(outputs, nodes[1:], nodesPresent[1:], args.pred_length)
181 | loss_batch += loss.data[0]
182 |
183 | # Compute gradients
184 | loss.backward()
185 |
186 | # Clip gradients
187 | torch.nn.utils.clip_grad_norm(net.parameters(), args.grad_clip)
188 |
189 | # Update parameters
190 | optimizer.step()
191 |
192 | # Reset the stgraph
193 | stgraph.reset()
194 |
195 | end = time.time()
196 | loss_batch = loss_batch / dataloader.batch_size
197 | loss_epoch += loss_batch
198 |
199 | print(
200 | '{}/{} (epoch {}), train_loss = {:.3f}, time/batch = {:.3f}'.format(epoch * dataloader.num_batches + batch,
201 | args.num_epochs * dataloader.num_batches,
202 | epoch,
203 | loss_batch, end - start))
204 |
205 | # Compute loss for the entire epoch
206 | loss_epoch /= dataloader.num_batches
207 | # Log it
208 | log_file_curve.write(str(epoch)+','+str(loss_epoch)+',')
209 |
210 | # Validation
211 | dataloader.reset_batch_pointer(valid=True)
212 | loss_epoch = 0
213 |
214 | for batch in range(dataloader.valid_num_batches):
215 | # Get batch data
216 | x, _, d = dataloader.next_valid_batch(randomUpdate=False)
217 |
218 | # Loss for this batch
219 | loss_batch = 0
220 |
221 | for sequence in range(dataloader.batch_size):
222 | stgraph.readGraph([x[sequence]])
223 |
224 | nodes, edges, nodesPresent, edgesPresent = stgraph.getSequence()
225 |
226 | # Convert to cuda variables
227 | nodes = Variable(torch.from_numpy(nodes).float()).cuda()
228 | edges = Variable(torch.from_numpy(edges).float()).cuda()
229 |
230 | # Define hidden states
231 | numNodes = nodes.size()[1]
232 | hidden_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
233 | hidden_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
234 | cell_states_node_RNNs = Variable(torch.zeros(numNodes, args.human_node_rnn_size)).cuda()
235 | cell_states_edge_RNNs = Variable(torch.zeros(numNodes*numNodes, args.human_human_edge_rnn_size)).cuda()
236 |
237 | outputs, _, _, _, _, _ = net(nodes[:args.seq_length], edges[:args.seq_length], nodesPresent[:-1], edgesPresent[:-1],
238 | hidden_states_node_RNNs, hidden_states_edge_RNNs,
239 | cell_states_node_RNNs, cell_states_edge_RNNs)
240 |
241 | # Compute loss
242 | loss = Gaussian2DLikelihood(outputs, nodes[1:], nodesPresent[1:], args.pred_length)
243 |
244 | loss_batch += loss.data[0]
245 |
246 | # Reset the stgraph
247 | stgraph.reset()
248 |
249 | loss_batch = loss_batch / dataloader.batch_size
250 | loss_epoch += loss_batch
251 |
252 | loss_epoch = loss_epoch / dataloader.valid_num_batches
253 |
254 | # Update best validation loss until now
255 | if loss_epoch < best_val_loss:
256 | best_val_loss = loss_epoch
257 | best_epoch = epoch
258 |
259 | # Record best epoch and best validation loss
260 | print('(epoch {}), valid_loss = {:.3f}'.format(epoch, loss_epoch))
261 | print('Best epoch {}, Best validation loss {}'.format(best_epoch, best_val_loss))
262 | # Log it
263 | log_file_curve.write(str(loss_epoch)+'\n')
264 |
265 | # Save the model after each epoch
266 | print('Saving model')
267 | torch.save({
268 | 'epoch': epoch,
269 | 'state_dict': net.state_dict(),
270 | 'optimizer_state_dict': optimizer.state_dict()
271 | }, checkpoint_path(epoch))
272 |
273 | # Record the best epoch and best validation loss overall
274 | print('Best epoch {}, Best validation loss {}'.format(best_epoch, best_val_loss))
275 | # Log it
276 | log_file.write(str(best_epoch)+','+str(best_val_loss))
277 |
278 | # Close logging files
279 | log_file.close()
280 | log_file_curve.close()
281 |
282 |
283 | if __name__ == '__main__':
284 | main()
285 |
--------------------------------------------------------------------------------
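
The training loss is Gaussian2DLikelihood from srnn/criterion.py (dumped earlier), applied to the 5-dimensional node outputs against the next-step positions. As a reference for what that loss accumulates, a minimal numpy sketch of the negative log-density of one target position under a bivariate Gaussian; illustration only, since the real criterion presumably also masks by nodesPresent and restricts to the last pred_length frames, as the call above suggests:

import numpy as np

def gaussian2d_nll_sketch(mux, muy, sx, sy, corr, x, y):
    # Illustration only; the real loss lives in srnn/criterion.py.
    # Negative log-density of (x, y) under a bivariate Gaussian with
    # means (mux, muy), standard deviations (sx, sy) and correlation corr.
    zx = (x - mux) / sx
    zy = (y - muy) / sy
    z = zx ** 2 + zy ** 2 - 2.0 * corr * zx * zy
    neg_rho = 1.0 - corr ** 2
    log_density = -z / (2.0 * neg_rho) - np.log(2.0 * np.pi * sx * sy * np.sqrt(neg_rho))
    return -log_density

print(gaussian2d_nll_sketch(0.0, 0.0, 1.0, 1.0, 0.0, 0.5, -0.5))
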
/srnn/utils.py:
--------------------------------------------------------------------------------
1 | '''
2 | Utils script for the structural RNN implementation
3 | Handles processing the input and target data in batches and sequences
4 |
5 | Author : Anirudh Vemula
6 | Date : 15th March 2017
7 | '''
8 |
9 | import os
10 | import pickle
11 | import numpy as np
12 | import random
13 |
14 |
15 | class DataLoader():
16 |
17 | def __init__(self, batch_size=50, seq_length=5, datasets=[0, 1, 2, 3, 4], forcePreProcess=False, infer=False):
18 | '''
19 | Initialiser function for the DataLoader class
20 | params:
21 | batch_size : Size of the mini-batch
22 | seq_length : Sequence length to be considered
23 | datasets : The indices of the datasets to use
24 | forcePreProcess : Flag to forcefully preprocess the data again from csv files
25 | '''
26 | # List of data directories where raw data resides
27 | self.data_dirs = ['./data/eth/univ', './data/eth/hotel',
28 | './data/ucy/zara/zara01', './data/ucy/zara/zara02',
29 | './data/ucy/univ']
30 | self.used_data_dirs = [self.data_dirs[x] for x in datasets]
31 | self.test_data_dirs = [self.data_dirs[x] for x in range(5) if x not in datasets]
32 | self.infer = infer
33 |
34 | # Number of datasets
35 | self.numDatasets = len(self.data_dirs)
36 |
37 | # Data directory where the pre-processed pickle file resides
38 | self.data_dir = './data'
39 |
40 | # Store the arguments
41 | self.batch_size = batch_size
42 | self.seq_length = seq_length
43 |
44 | # Validation arguments
45 | self.val_fraction = 0.2
46 |
47 |         # Define the path in which the processed data would be stored
48 | data_file = os.path.join(self.data_dir, "trajectories.cpkl")
49 |
50 | # If the file doesn't exist or forcePreProcess is true
51 | if not(os.path.exists(data_file)) or forcePreProcess:
52 | print("Creating pre-processed data from raw data")
53 | # Preprocess the data from the csv files of the datasets
54 | # Note that this data is processed in frames
55 | self.frame_preprocess(self.used_data_dirs, data_file)
56 |
57 | # Load the processed data from the pickle file
58 | self.load_preprocessed(data_file)
59 | # Reset all the data pointers of the dataloader object
60 | self.reset_batch_pointer(valid=False)
61 | self.reset_batch_pointer(valid=True)
62 |
63 | def frame_preprocess(self, data_dirs, data_file):
64 | '''
65 | Function that will pre-process the pixel_pos.csv files of each dataset
66 | into data with occupancy grid that can be used
67 | params:
68 | data_dirs : List of directories where raw data resides
69 | data_file : The file into which all the pre-processed data needs to be stored
70 | '''
71 | # all_frame_data would be a list of list of numpy arrays corresponding to each dataset
72 | # Each numpy array will correspond to a frame and would be of size (numPeds, 3) each row
73 | # containing pedID, x, y
74 | all_frame_data = []
75 | # Validation frame data
76 | valid_frame_data = []
77 | # frameList_data would be a list of lists corresponding to each dataset
78 | # Each list would contain the frameIds of all the frames in the dataset
79 | frameList_data = []
80 | # numPeds_data would be a list of lists corresponding to each dataset
81 |         # Each list would contain the number of pedestrians in each frame in the dataset
82 | numPeds_data = []
83 | # Index of the current dataset
84 | dataset_index = 0
85 |
86 | # For each dataset
87 | for ind_directory, directory in enumerate(data_dirs):
88 | # define path of the csv file of the current dataset
89 | # file_path = os.path.join(directory, 'pixel_pos.csv')
90 | file_path = os.path.join(directory, 'pixel_pos_interpolate.csv')
91 |
92 | # Load the data from the csv file
93 | data = np.genfromtxt(file_path, delimiter=',')
94 |
95 | # Frame IDs of the frames in the current dataset
96 | frameList = np.unique(data[0, :]).tolist()
97 | numFrames = len(frameList)
98 |
99 | # Add the list of frameIDs to the frameList_data
100 | frameList_data.append(frameList)
101 | # Initialize the list of numPeds for the current dataset
102 | numPeds_data.append([])
103 | # Initialize the list of numpy arrays for the current dataset
104 | all_frame_data.append([])
105 | # Initialize the list of numpy arrays for the current dataset
106 | valid_frame_data.append([])
107 |
108 |             # if directory == './data/eth/univ':
109 |             #     skip = 6
110 |             # else:
111 |             #     skip = 10
112 |             # skip = 3
113 |
114 | skip = 10
115 |
116 | for ind, frame in enumerate(frameList):
117 |
118 | ## NOTE CHANGE
119 | if ind % skip != 0:
120 | # Skip every n frames
121 | continue
122 |
123 |
124 | # Extract all pedestrians in current frame
125 | pedsInFrame = data[:, data[0, :] == frame]
126 |
127 | # Extract peds list
128 | pedsList = pedsInFrame[1, :].tolist()
129 |
130 | # Add number of peds in the current frame to the stored data
131 | numPeds_data[dataset_index].append(len(pedsList))
132 |
133 | # Initialize the row of the numpy array
134 | pedsWithPos = []
135 |
136 | # For each ped in the current frame
137 | for ped in pedsList:
138 | # Extract their x and y positions
139 | current_x = pedsInFrame[3, pedsInFrame[1, :] == ped][0]
140 | current_y = pedsInFrame[2, pedsInFrame[1, :] == ped][0]
141 |
142 | # Add their pedID, x, y to the row of the numpy array
143 | pedsWithPos.append([ped, current_x, current_y])
144 |
145 | if (ind > numFrames * self.val_fraction) or (self.infer):
146 | # At inference time, no validation data
147 | # Add the details of all the peds in the current frame to all_frame_data
148 | all_frame_data[dataset_index].append(np.array(pedsWithPos))
149 | else:
150 | valid_frame_data[dataset_index].append(np.array(pedsWithPos))
151 |
152 | dataset_index += 1
153 |
154 |         # Save the tuple (all_frame_data, frameList_data, numPeds_data, valid_frame_data) in the pickle file
155 | f = open(data_file, "wb")
156 | pickle.dump((all_frame_data, frameList_data, numPeds_data, valid_frame_data), f, protocol=2)
157 | f.close()
158 |
159 | def load_preprocessed(self, data_file):
160 | '''
161 | Function to load the pre-processed data into the DataLoader object
162 | params:
163 | data_file : the path to the pickled data file
164 | '''
165 | # Load data from the pickled file
166 | f = open(data_file, 'rb')
167 | self.raw_data = pickle.load(f)
168 | f.close()
169 | # Get all the data from the pickle file
170 | self.data = self.raw_data[0]
171 | self.frameList = self.raw_data[1]
172 | self.numPedsList = self.raw_data[2]
173 | self.valid_data = self.raw_data[3]
174 | counter = 0
175 | valid_counter = 0
176 |
177 | # For each dataset
178 | for dataset in range(len(self.data)):
179 | # get the frame data for the current dataset
180 | all_frame_data = self.data[dataset]
181 | valid_frame_data = self.valid_data[dataset]
182 | print('Training data from dataset {} : {}'.format(dataset, len(all_frame_data)))
183 | print('Validation data from dataset {} : {}'.format(dataset, len(valid_frame_data)))
184 | # Increment the counter with the number of sequences in the current dataset
185 | counter += int(len(all_frame_data) / (self.seq_length))
186 | valid_counter += int(len(valid_frame_data) / (self.seq_length))
187 |
188 | # Calculate the number of batches
189 | self.num_batches = int(counter/self.batch_size)
190 | self.valid_num_batches = int(valid_counter/self.batch_size)
191 | print('Total number of training batches: {}'.format(self.num_batches * 2))
192 | print('Total number of validation batches: {}'.format(self.valid_num_batches))
193 | # On an average, we need twice the number of batches to cover the data
194 | # due to randomization introduced
195 | self.num_batches = self.num_batches * 2
196 | # self.valid_num_batches = self.valid_num_batches * 2
197 |
198 | def next_batch(self, randomUpdate=True):
199 | '''
200 | Function to get the next batch of points
201 | '''
202 | # Source data
203 | x_batch = []
204 | # Target data
205 | y_batch = []
206 | # Frame data
207 | frame_batch = []
208 | # Dataset data
209 | d = []
210 | # Iteration index
211 | i = 0
212 | while i < self.batch_size:
213 | # Extract the frame data of the current dataset
214 | frame_data = self.data[self.dataset_pointer]
215 | frame_ids = self.frameList[self.dataset_pointer]
216 | # Get the frame pointer for the current dataset
217 | idx = self.frame_pointer
218 |             # If there are at least seq_length frames left in the current dataset
219 | if idx + self.seq_length < len(frame_data):
220 | # All the data in this sequence
221 | # seq_frame_data = frame_data[idx:idx+self.seq_length+1]
222 | seq_source_frame_data = frame_data[idx:idx+self.seq_length]
223 | seq_target_frame_data = frame_data[idx+1:idx+self.seq_length+1]
224 | seq_frame_ids = frame_ids[idx:idx+self.seq_length]
225 |
226 |                 # Append the source and target data of this sequence to the batch
227 | x_batch.append(seq_source_frame_data)
228 | y_batch.append(seq_target_frame_data)
229 | frame_batch.append(seq_frame_ids)
230 |
231 | # advance the frame pointer to a random point
232 | if randomUpdate:
233 | self.frame_pointer += random.randint(1, self.seq_length)
234 | else:
235 | self.frame_pointer += self.seq_length
236 |
237 | d.append(self.dataset_pointer)
238 | i += 1
239 |
240 | else:
241 | # Not enough frames left
242 | # Increment the dataset pointer and set the frame_pointer to zero
243 | self.tick_batch_pointer(valid=False)
244 |
245 | return x_batch, y_batch, frame_batch, d
246 |
247 | def next_valid_batch(self, randomUpdate=True):
248 | '''
249 | Function to get the next Validation batch of points
250 | '''
251 | # Source data
252 | x_batch = []
253 | # Target data
254 | y_batch = []
255 | # Dataset data
256 | d = []
257 | # Iteration index
258 | i = 0
259 | while i < self.batch_size:
260 | # Extract the frame data of the current dataset
261 | frame_data = self.valid_data[self.valid_dataset_pointer]
262 | # Get the frame pointer for the current dataset
263 | idx = self.valid_frame_pointer
264 |             # If there are at least seq_length frames left in the current dataset
265 | if idx + self.seq_length < len(frame_data):
266 | # All the data in this sequence
267 | # seq_frame_data = frame_data[idx:idx+self.seq_length+1]
268 | seq_source_frame_data = frame_data[idx:idx+self.seq_length]
269 | seq_target_frame_data = frame_data[idx+1:idx+self.seq_length+1]
270 |
271 |                 # Append the source and target data of this sequence to the batch
272 | x_batch.append(seq_source_frame_data)
273 | y_batch.append(seq_target_frame_data)
274 |
275 | # advance the frame pointer to a random point
276 | if randomUpdate:
277 | self.valid_frame_pointer += random.randint(1, self.seq_length)
278 | else:
279 | self.valid_frame_pointer += self.seq_length
280 |
281 | d.append(self.valid_dataset_pointer)
282 | i += 1
283 |
284 | else:
285 | # Not enough frames left
286 | # Increment the dataset pointer and set the frame_pointer to zero
287 | self.tick_batch_pointer(valid=True)
288 |
289 | return x_batch, y_batch, d
290 |
291 | def tick_batch_pointer(self, valid=False):
292 | '''
293 | Advance the dataset pointer
294 | '''
295 | if not valid:
296 | # Go to the next dataset
297 | self.dataset_pointer += 1
298 | # Set the frame pointer to zero for the current dataset
299 | self.frame_pointer = 0
300 | # If all datasets are done, then go to the first one again
301 | if self.dataset_pointer >= len(self.data):
302 | self.dataset_pointer = 0
303 | else:
304 | # Go to the next dataset
305 | self.valid_dataset_pointer += 1
306 | # Set the frame pointer to zero for the current dataset
307 | self.valid_frame_pointer = 0
308 | # If all datasets are done, then go to the first one again
309 | if self.valid_dataset_pointer >= len(self.valid_data):
310 | self.valid_dataset_pointer = 0
311 |
312 | def reset_batch_pointer(self, valid=False):
313 | '''
314 | Reset all pointers
315 | '''
316 | if not valid:
317 | # Go to the first frame of the first dataset
318 | self.dataset_pointer = 0
319 | self.frame_pointer = 0
320 | else:
321 | self.valid_dataset_pointer = 0
322 | self.valid_frame_pointer = 0
323 |
--------------------------------------------------------------------------------
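
A short usage sketch of the DataLoader, assuming the same working layout as train.py (invoked as python srnn/train.py from the repository root, so that ./data resolves and utils is importable) and that the pixel_pos_interpolate.csv files are in place:

from utils import DataLoader

# Leave dataset 3 out, as train.py does by default; seq_length is the
# training sequence length plus one, so targets can be shifted by a frame.
loader = DataLoader(batch_size=8, seq_length=21, datasets=[0, 1, 2, 4])

x, y, frames, d = loader.next_batch(randomUpdate=True)
# x, y   : lists of batch_size sequences; each sequence is a list of
#          seq_length numpy arrays of shape (numPedsInFrame, 3)
# frames : frame IDs of each source sequence
# d      : dataset index each sequence was drawn from
print(len(x), len(x[0]), x[0][0].shape)
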
/srnn/visualize.py:
--------------------------------------------------------------------------------
1 | '''
2 | Visualization script for the structural RNN model
3 | introduced in https://arxiv.org/abs/1511.05298
4 |
5 | Author : Anirudh Vemula
6 | Date : 3rd April 2017
7 | '''
8 | import numpy as np
9 | import matplotlib.pyplot as plt
10 | import pickle
11 | from torch.autograd import Variable
12 | import argparse
13 | import seaborn
14 |
15 |
16 | def plot_trajectories(true_trajs, pred_trajs, nodesPresent, obs_length, name, plot_directory, withBackground=False):
17 | '''
18 | Parameters
19 | ==========
20 |
21 | true_trajs : Numpy matrix of shape seq_length x numNodes x 2
22 | Contains the true trajectories of the nodes
23 |
24 | pred_trajs : Numpy matrix of shape seq_length x numNodes x 2
25 | Contains the predicted trajectories of the nodes
26 |
27 | nodesPresent : A list of lists, of size seq_length
28 | Each list contains the nodeIDs present at that time-step
29 |
30 | obs_length : Length of observed trajectory
31 |
32 | name : Name of the plot
33 |     plot_directory : Directory in which to save the plot
34 | withBackground : Include background or not
35 | '''
36 |
37 | traj_length, numNodes, _ = true_trajs.shape
38 | # Initialize figure
39 | # Load the background
40 | # im = plt.imread('plot/background.png')
41 | # if withBackground:
42 | # implot = plt.imshow(im)
43 |
44 | # width_true = im.shape[0]
45 | # height_true = im.shape[1]
46 |
47 | # if withBackground:
48 | # width = width_true
49 | # height = height_true
50 | # else:
51 | width = 1
52 | height = 1
53 |
54 | traj_data = {}
55 | for tstep in range(traj_length):
56 | pred_pos = pred_trajs[tstep, :]
57 | true_pos = true_trajs[tstep, :]
58 |
59 | for ped in range(numNodes):
60 | if ped not in traj_data and tstep < obs_length:
61 | traj_data[ped] = [[], []]
62 |
63 | if ped in nodesPresent[tstep]:
64 | traj_data[ped][0].append(true_pos[ped, :])
65 | traj_data[ped][1].append(pred_pos[ped, :])
66 |
67 | for j in traj_data:
68 | c = np.random.rand(3)
69 | true_traj_ped = traj_data[j][0] # List of [x,y] elements
70 | pred_traj_ped = traj_data[j][1]
71 |
72 | true_x = [(p[0]+1)/2*height for p in true_traj_ped]
73 | true_y = [(p[1]+1)/2*width for p in true_traj_ped]
74 | pred_x = [(p[0]+1)/2*height for p in pred_traj_ped]
75 | pred_y = [(p[1]+1)/2*width for p in pred_traj_ped]
76 |
77 | plt.plot(true_x, true_y, color=c, linestyle='solid', linewidth=2, marker='o')
78 | plt.plot(pred_x, pred_y, color=c, linestyle='dashed', linewidth=2, marker='+')
79 |
80 | if not withBackground:
81 | plt.ylim((1, 0))
82 | plt.xlim((0, 1))
83 |
84 | plt.show()
85 | if withBackground:
86 | plt.savefig('plot_with_background/'+name+'.png')
87 | else:
88 | plt.savefig(plot_directory+'/'+name+'.png')
89 |
90 | plt.gcf().clear()
91 | # plt.close('all')
92 | plt.clf()
93 |
94 |
95 | def main():
96 | parser = argparse.ArgumentParser()
97 |
98 | # Experiments
99 | parser.add_argument('--noedges', action='store_true')
100 | parser.add_argument('--temporal', action='store_true')
101 | parser.add_argument('--temporal_spatial', action='store_true')
102 | parser.add_argument('--attention', action='store_true')
103 |
104 | parser.add_argument('--test_dataset', type=int, default=0,
105 | help='test dataset index')
106 |
107 | # Parse the parameters
108 | args = parser.parse_args()
109 |
110 | # Check experiment tags
111 | if not (args.noedges or args.temporal or args.temporal_spatial or args.attention):
112 |         print('Use one of the experiment tags to select a model')
113 | return
114 |
115 | # Save directory
116 | save_directory = 'save/'
117 | save_directory += str(args.test_dataset) + '/'
118 | plot_directory = 'plot/'
119 | if args.noedges:
120 |         print('No edge RNNs used')
121 | save_directory += 'save_noedges'
122 | plot_directory += 'plot_noedges'
123 | elif args.temporal:
124 |         print('Only temporal edge RNNs used')
125 | save_directory += 'save_temporal'
126 | plot_directory += 'plot_temporal'
127 | elif args.temporal_spatial:
128 |         print('Both temporal and spatial edge RNNs used')
129 | save_directory += 'save_temporal_spatial'
130 | plot_directory += 'plot_temporal_spatial'
131 | else:
132 |         print('Both temporal and spatial edge RNNs used with attention')
133 | save_directory += 'save_attention'
134 | plot_directory += 'plot_attention'
135 |
136 |     with open(save_directory+'/results.pkl', 'rb') as f:
137 |         results = pickle.load(f)
138 |
139 | # print "Enter 0 (or) 1 for without/with background"
140 | # withBackground = int(input())
141 | withBackground = 0
142 |
143 | for i in range(len(results)):
144 |         print(i)
145 | name = 'sequence' + str(i)
146 | plot_trajectories(results[i][0], results[i][1], results[i][2], results[i][3], name, plot_directory, withBackground)
147 |
148 |
149 | if __name__ == '__main__':
150 | main()
151 |
--------------------------------------------------------------------------------
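
Putting the last three scripts together, a plausible end-to-end run from the repository root (directory layout as created by make_directories.sh; flags as defined above):

python srnn/train.py --leaveDataset 3
python srnn/sample.py --test_dataset 3 --epoch 49
python srnn/visualize.py --attention --test_dataset 3

train.py writes checkpoints to save/3/save_attention, sample.py reads the epoch-49 checkpoint from there and dumps results.pkl alongside it, and visualize.py renders each sequence into plot/plot_attention.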