46 |
--------------------------------------------------------------------------------
/discussion.md:
--------------------------------------------------------------------------------
1 | Please add topics/questions you'd like to have discussed during the workshop.
2 |
3 | - Challenges in end-to-end compilation (from "PyTorch to FPGA")
4 | - Meaningful error messages (What does "failed to route" mean for a C++ program?)
5 | - Compilation times (What does `-O0` compilation look like?)
6 |
7 | - Interface design and integration into larger Systems-on-Chip.
8 |
9 | - Status and future of C-based HLS tools
10 | - What is preventing wider adoption?
11 | - Are there any success areas?
12 | - Is C the right abstraction for HLS tools?
13 |
14 | - Discussion on the open-source LLVM front-end of Vitis HLS tool.
15 |
16 | - Future of FPGAs as accelerators vs. other technologies (e.g. GPUs).
17 | - How is HLS helping to popularise FPGAs? What still needs work? What comes after HLS?
18 | - How will technologies such as HBM2 affect FPGA performance/adoption?
19 | - Will "harder" concepts such as multi-FPGA, FPGA networking, and near-memory gain traction?
20 |
21 | - FPGA overlays and encapsulating hardware programmability for software developers.
22 |
23 | - For languages that generate Verilog, what are the challenges involved in taking output from
24 | EDA tools and putting it back into the context of the original language?
25 | - How should failing timing paths be presented?
26 | - What should waveforms look like?
27 |
--------------------------------------------------------------------------------
/blog-proposal.md:
--------------------------------------------------------------------------------
1 | Outlining the state of programming languages for hardware design and
2 | announcing the first workshop on languages, tools, and techniques for accelerator
3 | design (co-located with ASPLOS): https://capra.cs.cornell.edu/latte21/
4 |
5 | There has been renewed interest in applying programming language ideas to
6 | enable productive hardware design: Spatial [PLDI 18], Aetherling [PLDI 20],
7 | Dahlia [PLDI 20], Koika [PLDI 20], Calyx [ASPLOS 21], etc.
8 | Building such domain-specific languages is an exciting area with the potential
9 | for a lot of innovation—from the design of type systems down
10 | to the interfaces for integrating hardware.
11 | This area also has a lot of unsolved challenges that the PL community is
12 | well positioned to solve: giving semantics to programs with physical
13 | constraints, building verified compilation tools, and building new abstractions
14 | that capture the complexity of design spaces for hardware accelerators.
15 |
16 | I will provide a broad overview of the space by drawing on published research as
17 | well as ongoing industry efforts.
18 |
19 | Previous writing for broad audiences:
20 | - Compiling for the Reconfigurable Future: https://rachitnigam.com/post/reconf-future/
21 | - I'm the first author of two of the aforementioned papers (Dahlia and Calyx).
22 |
23 | If possible, I would like the blog post to be cross-published with the SIGARCH
24 | blog: https://www.sigarch.org/blog/
25 |
--------------------------------------------------------------------------------
/.github/workflows/deploy.yml:
--------------------------------------------------------------------------------
1 | name: Zola deploy
2 |
3 | on: [push]
4 |
5 | jobs:
6 | docs:
7 | runs-on: ubuntu-latest
8 | steps:
9 | - uses: actions/checkout@v2
10 | with:
11 | submodules: true # Fetch theme submodules (true OR recursive)
12 | fetch-depth: 0 # Fetch all git history
13 |
14 | - name: Install and Run Zola
15 | run: |
16 | sudo snap install --edge zola
17 | zola --root web/ build
18 |
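# The deploy below runs in two stages: stage 1 rsyncs the freshly built site to
# the CAPRA host, and stage 2 then ssh-es into that host and rsyncs the same
# tree onward to the Cornell courses server (DEPLOY_2_HOST/DEPLOY_2_DEST).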
19 | - name: rsync
20 | env:
21 | DEPLOY_HOST: ${{ secrets.DEPLOY_HOST }}
22 | DEPLOY_PORT: ${{ secrets.DEPLOY_PORT }}
23 | DEPLOY_USER: ${{ secrets.DEPLOY_USER }}
24 | DEPLOY_KEY: ${{ secrets.DEPLOY_KEY }}
25 | DEPLOY_KNOWN_HOSTS: ${{ secrets.DEPLOY_KNOWN_HOSTS }}
26 | DEPLOY_SRC: ./web/public/
27 | DEPLOY_DEST: sync/latte21-www
28 | DEPLOY_2_HOST: courses.cit.cornell.edu
29 | DEPLOY_2_DEST: coursewww/capra.cs.cornell.edu/htdocs/latte21
30 | run: |
31 | echo "$DEPLOY_KEY" > pk
32 | echo "$DEPLOY_KNOWN_HOSTS" > kh
33 | chmod 600 pk
34 | echo "========STAGE 1 SYNC: CAPRA==========="
35 | rsync --compress --recursive --checksum --itemize-changes --delete \
36 | -e "ssh -p \"$DEPLOY_PORT\" -i pk -o 'UserKnownHostsFile kh'" \
37 | $DEPLOY_SRC $DEPLOY_USER@$DEPLOY_HOST:$DEPLOY_DEST
38 | echo "========STAGE 2 SYNC: COURSES==========="
39 | ssh -p $DEPLOY_PORT -i pk -o 'UserKnownHostsFile kh' \
40 | $DEPLOY_USER@$DEPLOY_HOST \
41 | rsync --compress --recursive --checksum --itemize-changes \
42 | --delete -e ssh --no-perms \
43 | $DEPLOY_DEST/ $DEPLOY_2_HOST:$DEPLOY_2_DEST
44 |
--------------------------------------------------------------------------------
/web/static/img/latte.svg:
--------------------------------------------------------------------------------
1 |
--------------------------------------------------------------------------------
/web/content/cfp.md:
--------------------------------------------------------------------------------
1 | +++
2 | template = "page.html"
3 | +++
4 |
5 | ## Important Stuff
6 |
7 | Submit your **2-page position paper** via [HotCRP][]. Important dates:
8 |
9 | - [HotCRP][] opens: **February 17, 2021**
10 | - Paper submission: **February 25, 2021 (11:59pm AOE)**
11 | - Author Notification: **March 18, 2021**
12 | - Workshop: **April 15, 2021**
13 |
14 | ## Call for Participation
15 |
16 | **Motivation.**
17 | Hardware acceleration is a key part of combating the stagnation of hardware performance scaling. Implementing accelerators with state-of-the-art hardware design flows, such as traditional HDLs and current HLS tools, remains a specialized task requiring EE training, proprietary toolchains, and extremely slow compile-edit-run cycles. While traditional approaches *might* be appropriate for developing general-purpose CPUs that will ship millions of units, they are an impediment to popularizing acceleration for the “long tail” of applications that could benefit from special-purpose hardware. With new language designs and new techniques inspired by traditional compilers research, there is an opportunity to turn accelerator construction from a years-long enterprise into a weekend project.
18 |
19 | **Scope.**
20 | LATTE is a venue for discussion, debate, and brainstorming at the intersection of hardware acceleration and programming languages research. The focus is on new languages and tools that aim to let domain specialists, not just hardware experts, produce efficient accelerators. A full range of targets is in scope: ASICs (silicon), FPGAs, CGRAs, or future reconfigurable hardware. A wide variety of research topics is in scope, including but not limited to:
21 |
22 | - Domain-specific languages for accelerator design
23 | - Compilers for optimizing hardware designs
24 | - Verification, testing, and debugging techniques
25 | - Virtualization schemes for specialized & reconfigurable hardware
26 |
27 | LATTE solicits short position papers that need not fit the mold of a traditional publication:
28 |
29 | - Early, in-progress research snapshots
30 | - Experience reports on building or deploying accelerators and the tools involved
31 | - Essays advocating for or against a general approach
32 | - Retrospectives on past efforts on tools, languages, and techniques for accelerator design
33 | - Calls for solutions to open challenges in the area (questions without answers)
34 | - Demonstrations of real systems (to be shown off in a live demo at the workshop)
35 |
36 | ### How to Participate
37 |
38 | The primary goal of the workshop is to enable discussion. It will accept **2-page position papers**.
39 | The workshop will allocate short time slots to the papers, each paired with a discussion following [SNAPL][]'s discussion format:
40 | a “table discussion” in which small breakout groups discuss the paper, followed by plenary Q&A.
41 |
42 | Position paper submissions will undergo peer review by a program committee of interdisciplinary experts working on both high-level (languages, compilers, drivers) and low-level (circuit optimization, interconnect design) problems in the area.
43 |
44 | Papers should use [the formatting guidelines for SIGPLAN conferences][sigplanconf] (the `acmart` format with the `sigplan` two-column option) and not exceed 2 pages, excluding references. Review is single-blind, so please include authors' names on the submitted PDF.
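For concreteness, here is a minimal LaTeX skeleton that matches these guidelines; the title, author fields, and `refs.bib` below are placeholders, and the `sigplan` option of `acmart` selects the two-column SIGPLAN layout:

```latex
\documentclass[sigplan]{acmart}
% Placeholder metadata for a 2-page LATTE position paper.
\title{Your Position Paper Title}
\author{Author Name}
\affiliation{\institution{Your Institution} \city{City} \country{Country}}

\begin{document}
\maketitle

% At most 2 pages of content, excluding references.
\section{Position}
State and defend your position here.

\bibliographystyle{ACM-Reference-Format}
\bibliography{refs} % placeholder bibliography file
\end{document}
```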
45 |
46 | Paper submission is via [HotCRP][].
47 | The accepted papers will not be published in formal proceedings; PDFs will instead appear on the workshop's website.
48 |
49 | [snapl]: http://cs.brown.edu/~sk/Memos/Conference-Discussion-Format/
50 | [hotcrp]: https://latte.cs.cornell.edu/
51 |
--------------------------------------------------------------------------------
/web/content/accepted.csv:
--------------------------------------------------------------------------------
1 | ID,Title,Authors,Thread,Talk,Session
2 | 1,Phism: Polyhedral High-Level Synthesis in MLIR,Ruizhe Zhao and Jianyi Cheng (Imperial College London),https://github.com/cucapra/latte21/discussions/21,https://youtu.be/50UjVlDF1Us,Language Design
3 | 3,High-Level Synthesis Tools should be Proven Correct,Yann Herklotz and John Wickerson (Imperial College London),https://github.com/cucapra/latte21/discussions/12,https://youtu.be/Olhbhq46Amc,Formal
4 | 4,Registerless Hardware Description,Oron Port and Yoav Etsion (Technion - Israel Institute of Technology),https://github.com/cucapra/latte21/discussions/22,https://youtu.be/_kpwAtg7TgI,HDL
5 | 5,ScaleHLS: Achieving Scalable High-Level Synthesis through MLIR,"Hanchen Ye (University of Illinois at Urbana-Champaign); Cong Hao (Georgia Institute of Technology); Hyunmin Jeong, Jack Huang, and Deming Chen (University of Illinois at Urbana-Champaign)",https://github.com/cucapra/latte21/discussions/37,https://youtu.be/BkE-3upeHUE,HLS
6 | 7,Improving HLS with Shared Accelerators: A Retrospective,Parnian Mokri and Mark Hempstead (Tufts University),https://github.com/cucapra/latte21/discussions/31,https://youtu.be/uoRFMrrOqU4,HLS
7 | 8,Elastic Silicon Interconnects: Abstracting Communication in Accelerator Design,John Demme (Microsoft),https://github.com/cucapra/latte21/discussions/29,https://www.youtube.com/watch?v=gjOkGX2E7EY,Interconnect & Memory
8 | 9,Compiler Infrastructure for Specializing Domain-Specific Memory Templates,Stephanie Soldavini and Christian Pilato (Politecnico di Milano),https://github.com/cucapra/latte21/discussions/34,https://youtu.be/PugsMNbdRQE,Interconnect & Memory
9 | 11,"Meta-level issues in Offloading: Scoping, Composition, Development, and their Automation","Andre DeHon, Hans Giesen, Nik Sultana, and Yuanlong Xiao (University of Pennsylvania)",https://github.com/cucapra/latte21/discussions/19,https://www.youtube.com/watch?v=PO6cqg46D4M,Integration
10 | 12,Enabling Cross-Domain Communication: How to Bridge the Gap between AI and HW Engineers,"Michael J. Klaiber, Axel J. Acosta, Ingo Feldner, and Falk Rehm (Robert Bosch GmbH)",https://github.com/cucapra/latte21/discussions/39,https://www.youtube.com/watch?v=IgUMyaA99kg,Industrial & Applications
11 | 15,Building Beyond HLS: Graph Analysis and Others,"Pedro Filipe Silva (Faculty of Engineering, University of Porto); João Bispo and Nuno Paulino (INESC-TEC and Faculty of Engineering, University of Porto)",https://github.com/cucapra/latte21/discussions/32,https://www.youtube.com/watch?v=CHpg09hwt3w,Industrial & Applications
12 | 17,Compile-Time RTL Interpreters,Sahand Kashani and James R. Larus (EPFL),https://github.com/cucapra/latte21/discussions/24,https://youtu.be/OAKiMHiPu1g,HDL
13 | 18,The Enzian Coherent Interconnect (ECI): Opening a coherence protocol to research and applications,"Abishek Ramdas, David Cock, Timothy Roscoe, and Gustavo Alonso (ETH Zurich)",https://github.com/cucapra/latte21/discussions/16,https://youtu.be/D9Sqg-Xzi2k,Interconnect & Memory
14 | 19,What are the Semantics of Hardware?,Gilbert Bernstein (Berkeley); Ross Daly (Stanford); Jonathan Ragan-Kelley (MIT); Pat Hanrahan (Stanford),https://github.com/cucapra/latte21/discussions/28,https://www.youtube.com/watch?v=xiAW7ULIulM,Formal
15 | 20,Towards Higher-Level Synthesis and Co-design with Python,Alexandre Quenon and Vitor Ramos Gomes da Silva (University of Mons),https://github.com/cucapra/latte21/discussions/13,https://www.youtube.com/watch?v=YB0WdSEOXT8,Language Design
16 | 21,Application specific dataflow machine construction for programming FPGAs via Lucent,Nick Brown (EPCC at the University of Edinburgh),https://github.com/cucapra/latte21/discussions/17,https://youtu.be/JePzhA51kWA,Language Design
17 | 23,A Position on Transparent Reconfigurable Systems,"Luís Sousa (Faculty of Engineering, University of Porto); Nuno Paulino and João Canas Ferreira (INESC-TEC and Faculty of Engineering, University of Porto)",https://github.com/cucapra/latte21/discussions/25,https://youtu.be/WNjIbt7Dyi4,Abstractions
18 | 24,Single-Source Hardware-Software Codesign,"Blaise Tine, Hyesoon Kim, and Sudhakar Yalamanchili (Georgia Institute of Technology)",https://github.com/cucapra/latte21/discussions/44,https://youtu.be/WxpZwn6PafY,Integration
19 | 28,Design Decisions in LiveHD for HDLs Compilation,Sheng-Hong Wang and Jose Renau (University of California - Santa Cruz),https://github.com/cucapra/latte21/discussions/15,https://youtu.be/oKjYDUqeSoU,HDL
20 | 30,(Redacted),(Redacted),,,Integration
21 | 31,Faster Coverage Convergence with Automatic Test Parameter Tuning in Constrained Random Verification,"Qijing Huang (Google; UC Berkeley); Hamid Shojaei, Fred Zyda, and Azade Nazi (Google); Shobha Vasudevan (Google; UIUC); Sat Chatterjee and Richard Ho (Google)",https://github.com/cucapra/latte21/discussions/41,https://youtu.be/HbGRaXNwBm8,Industrial & Applications
22 | 32,Generality is the Key Dimension in Accelerator Design,"Jian Weng, Vidushi Dadu, Sihao Liu, and Tony Nowatzki (UCLA)",https://github.com/cucapra/latte21/discussions/11,https://www.youtube.com/watch?v=4RhaY4ekBVE,Abstractions
23 | 33,High-Level Synthesis of Security Properties via Software-Level Abstractions,Christian Pilato (Politecnico di Milano); Francesco Regazzoni (Università della Svizzera italiana),https://github.com/cucapra/latte21/discussions/35,https://youtu.be/Ubs5ag-6jaM,HLS
24 |
--------------------------------------------------------------------------------
/web/content/index.md:
--------------------------------------------------------------------------------
1 | +++
2 | template = "index.html"
3 | +++
4 |
5 | **LATTE '21 has concluded.**
6 | Please feel free to participate in the [discussion threads][github-thread] for
7 | the various papers and other topics.
8 | Hope to see you next year!
9 |
10 | LATTE is an [ASPLOS][] workshop on applying programming language and compiler
11 | techniques to generate hardware accelerators.
12 | The workshop will take place on **April 15, 2021** and will feature 23 papers
13 | along with 2 keynote presentations.
14 | The [call for participation](./cfp) has been archived.
15 | Read [Rachit's blog post][pl-blog] summarizing the work in this area and advertising the workshop.
16 | Please register for it through the [ASPLOS registration website][asplos-registration].
17 |
18 | ## Program
19 |
20 | The LATTE program will consist of sessions of 2-3 papers each, grouped according
21 | to their theme.
22 | Each paper talk will be 6 minutes long and prerecorded.
23 | After the talks, we will spend 6 minutes in small breakout rooms discussing the
24 | session papers.
25 | After that, we will regroup in the main session and dedicate 6 minutes for
26 | plenary questions to the authors.
27 | All talks will be released a few days before the workshop.
28 |
29 | To enable extensive discussions, each paper will have a [Github discussion
30 | thread][github-thread].
31 | Authors will be subscribed to the thread for their paper and will answer questions
32 | during and (hopefully) after the workshop!
33 |
34 | | Time (EST) | Event |
35 | |----------------|-------|
36 | | 10am - 10.15am | Opening & Introductions |
37 | | 10.15am - 11am | [Keynote - Sharad Malik][sharad-position] |
38 | | 11am - 11.30am | [Session 1 - Formal](#formal) |
39 | | 11.30am - 12pm | [Session 2 - Language Design](#language-design) |
40 | | 12pm - 12.15pm | Break |
41 | | 12.15pm - 12.45pm | [Session 3 - HDL](#hdl) |
42 | | 12.45pm - 1.15pm | [Session 4 - Interconnect & Memory](#interconnect-memory) |
43 | | 1.15pm - 2pm | Lunch Break |
44 | | 2pm - 2.30pm | Topic Discussion |
45 | | 2.30pm - 3pm | [Session 5 - Integration](#integration) |
46 | | 3pm - 3.45pm | Keynote - Yakun Sophia Shao |
47 | | 3.45pm - 4.15pm | [Session 6 - HLS](#hls) |
48 | | 4.15pm - 4.30pm | Break |
49 | | 4.30pm - 5pm | [Session 7 - Industrial & Applications](#industrial-applications) |
50 | | 5pm - 5.30pm | [Session 8 - Abstractions](#abstractions) |
51 | | 5.30pm - 6pm | Closing Discussion |
52 |
53 | ## Discussion Topics
54 |
55 | We will have 1-2 short discussion sessions that will feature a debate
56 | among the attendees.
57 | We need your help building a list of controversial topics to serve as grist for
58 | the discussion mill.
59 |
60 | Please submit a sentence or two about an open problem, philosophical question,
61 | or other thought you'd like to see discussed at the workshop.
62 | You can submit as many of these as you like.
63 | We'll use these suggestions to set up a debate during the workshop.
64 |
65 | Add your topic suggestions by editing [this wiki page][topics] on GitHub.
66 |
67 | ## Keynotes
68 |
69 | - **[Sharad Malik][sharad]**: [Hardware-Software Interface Specification for Verification in Accelerator-Rich Platforms][sharad-position]
70 | - **[Yakun Sophia Shao][sophia]**: Efficient and Productive: Holistic Approach to Accelerator Design, Integration, and Scheduling
71 |
72 | ## Accepted Papers
73 |
74 | The LATTE '21 program will feature the following 23 papers, which will be discussed
75 | using [SNAPL][]'s round-table discussion format.
76 | Papers are grouped by theme.
77 | Each session will start with all of the papers in its theme, followed by the
78 | discussions.
79 | The camera-ready version of each paper is linked here.
80 | A few days before the workshop, we will link the prerecorded talk as well as
81 | the [Github discussion][github-thread] for each paper.
82 |
83 | {{ program() }}
84 |
85 |
86 |
87 |
88 | ### Program Committee
89 |
90 | - Thomas Bourgeat, MIT
91 | - Ross Daly, Stanford
92 | - [David Durst](https://davidbdurst.com/), Stanford
93 | - Tobias Grosser, ETH Zürich
94 | - Shunning Jiang, Cornell
95 | - Lana Josipović, EPFL
96 | - Vinod Kathail, Xilinx
97 | - Chris Leary, Google
98 | - Thierry Moreau, OctoML
99 | - [Clément Pit-Claudel](https://pit-claudel.fr/clement/), MIT
100 | - Jose Renau, UCSC
101 | - Hongbo Rong, Intel
102 | - John Wickerson, ICL
103 |
104 |
105 |
106 |
107 |
108 | ### Organizing Committee
109 |
110 | - [Rachit Nigam](https://rachitnigam.com), Cornell University
111 | - [Adrian Sampson](http://adriansampson.net), Cornell University
112 | - Stephen Neuendorffer, Xilinx
113 | - [Zhiru Zhang](https://www.csl.cornell.edu/~zhiruz/), Cornell University
114 |
115 |
116 |
117 |
118 |
119 |
120 | [topics]: https://github.com/cucapra/latte20/edit/main/discussion.md
121 | [snapl]: http://cs.brown.edu/~sk/Memos/Conference-Discussion-Format/
122 | [sigplanconf]: http://www.sigplan.org/Resources/Author/
123 | [hotcrp]: https://latte.cs.cornell.edu/
124 | [asplos]: https://asplos-conference.org
125 | [pl-blog]: https://blog.sigplan.org/2021/02/17/languages-tools-and-techniques-for-accelerator-design/
126 | [sophia]: https://people.eecs.berkeley.edu/~ysshao/index.html
127 | [sharad]: https://www.princeton.edu/~sharad/
128 | [github-thread]: https://github.com/cucapra/latte21/discussions/categories/papers
129 | [sharad-position]: //paper/6.pdf
130 | [asplos-registration]: https://web.cvent.com/event/6259afee-6594-4456-86a0-2a22fbfc47b8/summary
131 |
--------------------------------------------------------------------------------
/web/static/css/main.css:
--------------------------------------------------------------------------------
1 | @charset "UTF-8";
2 | @font-face {
3 | font-family:et-book;
4 | src:url(et-book/et-book-roman-line-figures/et-book-roman-line-figures.eot);
5 | src:url(et-book/et-book-roman-line-figures/et-book-roman-line-figures.eot?#iefix) format("embedded-opentype"),
6 | url(et-book/et-book-roman-line-figures/et-book-roman-line-figures.woff) format("woff"),
7 | url(et-book/et-book-roman-line-figures/et-book-roman-line-figures.ttf) format("truetype"),
8 | url(et-book/et-book-roman-line-figures/et-book-roman-line-figures.svg#etbookromanosf) format("svg");
9 | font-weight:400;
10 | font-style:normal;
11 | font-display:swap
12 | }
13 | @font-face {
14 | font-family:et-book;
15 | src:url(et-book/et-book-display-italic-old-style-figures/et-book-display-italic-old-style-figures.eot);
16 | src:url(et-book/et-book-display-italic-old-style-figures/et-book-display-italic-old-style-figures.eot?#iefix) format("embedded-opentype"),
17 | url(et-book/et-book-display-italic-old-style-figures/et-book-display-italic-old-style-figures.woff) format("woff"),
18 | url(et-book/et-book-display-italic-old-style-figures/et-book-display-italic-old-style-figures.ttf) format("truetype"),
19 | url(et-book/et-book-display-italic-old-style-figures/et-book-display-italic-old-style-figures.svg#etbookromanosf) format("svg");
20 | font-weight:400;
21 | font-style:italic;
22 | font-display:swap
23 | }
24 | @font-face {
25 | font-family:et-book;
26 | src:url(et-book/et-book-bold-line-figures/et-book-bold-line-figures.eot);
27 | src:url(et-book/et-book-bold-line-figures/et-book-bold-line-figures.eot?#iefix) format("embedded-opentype"),
28 | url(et-book/et-book-bold-line-figures/et-book-bold-line-figures.woff) format("woff"),
29 | url(et-book/et-book-bold-line-figures/et-book-bold-line-figures.ttf) format("truetype"),
30 | url(et-book/et-book-bold-line-figures/et-book-bold-line-figures.svg#etbookromanosf) format("svg");
31 | font-weight:700;
32 | font-style:normal;
33 | font-display:swap
34 | }
35 | @font-face {
36 | font-family:et-book-roman-old-style;
37 | src:url(et-book/et-book-roman-old-style-figures/et-book-roman-old-style-figures.eot);
38 | src:url(et-book/et-book-roman-old-style-figures/et-book-roman-old-style-figures.eot?#iefix) format("embedded-opentype"),
39 | url(et-book/et-book-roman-old-style-figures/et-book-roman-old-style-figures.woff) format("woff"),
40 | url(et-book/et-book-roman-old-style-figures/et-book-roman-old-style-figures.ttf) format("truetype"),
41 | url(et-book/et-book-roman-old-style-figures/et-book-roman-old-style-figures.svg#etbookromanosf) format("svg");
42 | font-weight:400;
43 | font-style:normal;
44 | font-display:swap
45 | }
46 |
47 | * {
48 | -webkit-box-sizing: border-box;
49 | -moz-box-sizing: border-box;
50 | box-sizing: border-box;
51 | }
52 |
53 | :before,
54 | :after {
55 | -webkit-box-sizing: border-box;
56 | -moz-box-sizing: border-box;
57 | box-sizing: border-box;
58 | }
59 |
60 | html {
61 | background-color: #F0F1F3;
62 | padding: 2%;
63 | }
64 |
65 | body {
66 | font-weight: normal;
67 | font-size: 16px;
68 | line-height: 1.4;
69 | font-family:et-book,Palatino,"Palatino Linotype","Palatino LT STD","Book Antiqua",Georgia,serif;
70 | color: #242424;
71 | max-width: 800px;
72 | margin: 5%;
73 | margin-top: 0;
74 | }
75 |
76 | section {
77 | background: #fff;
78 | margin-bottom: 1%;
79 | position: relative;
80 | padding: 6% 8%;
81 | }
82 |
83 | hr {
84 | color: #ddd;
85 | height: 1px;
86 | margin: 1em 0;
87 | border-top: solid 1px #ddd;
88 | border-bottom: none;
89 | border-left: 0;
90 | border-right: 0;
91 | }
92 |
93 | h1, h2, h3,
94 | p > strong, .paper strong, td a, td a:hover, td a:focus {
95 | color: #98694F
96 | }
97 |
98 | ul.session {
99 | margin-top: 0.3em;
100 | }
101 |
102 | .paper-links {
103 | margin-top: .3em;
104 | margin-bottom: .8em;
105 | }
106 |
107 | .button a {
108 | text-decoration: none;
109 | border: 1px #98694F solid;
110 | margin-right: 0.5em;
111 | padding-top: 0.2em;
112 | padding-bottom: 0.2em;
113 | padding-left: 0.4em;
114 | padding-right: 0.4em;
115 | border-radius: 0.2rem;
116 | font-size: 0.92em;
117 | color: #98694F;
118 | }
119 |
120 | .button:hover a, .button:focus a {
121 | border: 1px #98694F solid;
122 | background: #98694F;
123 | color: #fff;
124 | transition-property: background;
125 | transition-duration: .2s;
126 | text-decoration: none;
127 | }
128 |
129 | .subtitle {
130 | margin-top: 0;
131 | color: #777;
132 | }
133 |
134 | h1, h2, h3 {
135 | margin-bottom: 0;
136 | }
137 |
138 | a, a:focus, a:hover {
139 | color: rgba(0, 0, 0, .8);
140 | text-decoration: none;
141 | border-bottom: #bbb .08em dotted;
142 | }
143 |
144 | a:focus, a:hover {
145 | background: #fbf3f3;
146 | transition-property: background;
147 | transition-duration: .2s;
148 | }
149 |
150 | table {
151 | margin: auto;
152 | }
153 |
154 | footer {
155 | color: #aaa;
156 | }
157 | footer a, footer a:hover, footer a:focus {
158 | color: #aaa;
159 | }
160 |
161 | header img.logo {
162 | float: right;
163 | height: 5em;
164 | margin-right: 1em;
165 | margin-left: 1em;
166 | }
167 |
168 | .authors {
169 | font-weight: 400;
170 | font-size: 0.9em;
171 | color: #777;
172 | }
173 |
174 | .program li {
175 | margin-bottom: 0.5em;
176 | }
177 |
178 | @media screen and (min-width: 800px) {
179 | .committee {
180 | display: flex;
181 | flex: auto;
182 | }
183 | .committee .organization {
184 | margin-left: 5em;
185 | }
186 | }
187 |
188 | /*************************************************
189 | * Tables
190 | **************************************************/
191 |
192 | table {
193 | width: 100%;
194 | max-width: 100%;
195 | margin-bottom: 1rem;
196 | font-size: 0.93rem;
197 | }
198 |
199 | td {
200 | border: 1px solid #ddd;
201 | padding: 3px;
202 | }
203 |
204 | thead {
205 | border-top: 1px solid #ddd;
206 | }
207 |
208 | /* Table Striped */
209 | /*table > tbody > tr:nth-child(odd) > td,*/
210 | /*table > tbody > tr:nth-child(odd) > th {*/
211 | /*background-color: #e0d2ca;*/
212 | /*}*/
213 |
--------------------------------------------------------------------------------
/camera-ready/Makefile.include:
--------------------------------------------------------------------------------
1 | ###
2 | ### generic GNU make Makefile for .tex -> .pdf.
3 | ### ransford at cs.washington.edu
4 | ### http://github.com/ransford/pdflatex-makefile
5 | ###
6 | ### Recommended usage:
7 | ### 1. echo 'include Makefile.include' > Makefile
8 | ### 2. Optional: Edit the Makefile to override $(TARGET)
9 | ### and anything else (e.g., PDFVIEWER, AFTERALL)
10 | ### 3. $ make snapshot
11 | ### # pass around a draft...
12 | ### 4. $ make distill
13 | ### # submit the camera-ready version with embedded fonts
14 | ###
15 | ### Final result:
16 | ### % cat Makefile
17 | ### TARGET=mypaper
18 | ### PDFVIEWER=open -a 'Adobe Acrobat Professional'
19 | ### AFTERALL=mypostprocessingstep
20 | ### include Makefile.include
21 | ###
22 | ### mypostprocessingstep:
23 | ### # do something...
24 | ###
25 |
26 | PDFLATEX ?= pdflatex -halt-on-error -file-line-error -shell-escape
27 | BIBTEX ?= bibtex
28 | MAKEGLOSSARIES ?= makeglossaries
29 | MAKENOMENCL ?= makeindex
30 |
31 | ## String to find in log to check whether rerun is necessary
32 | RERUN_PATTERN = Rerun to
33 |
34 | ifneq ($(QUIET),)
35 | PDFLATEX += -interaction=batchmode
36 | ERRFILTER := > /dev/null || (egrep ':[[:digit:]]+:' *.log && false)
37 | BIBTEX += -terse
38 | else
39 | PDFLATEX += -interaction=nonstopmode
40 | ERRFILTER=
41 | endif
42 |
43 | ## Action for 'make view'
44 | OS=$(shell uname -s)
45 | ifeq ($(OS),Darwin)
46 | PDFVIEWER ?= open
47 | else
48 | PDFVIEWER ?= xdg-open
49 | endif
50 |
51 | ## Name of the target file, minus .pdf: e.g., TARGET=mypaper causes this
52 | ## Makefile to turn mypaper.tex into mypaper.pdf.
53 | TARGETS += $(TARGET)
54 | TEXTARGETS = $(TARGETS:=.tex)
55 | PDFTARGETS = $(TARGETS:=.pdf)
56 | AUXFILES = $(TARGETS:=.aux)
57 | LOGFILES = $(TARGETS:=.log)
58 |
59 | ## Inkscape SVG file processing:
60 | ifeq ($(shell which inkscape >/dev/null 2>&1 && echo USING_INKSCAPE),USING_INKSCAPE)
61 | FIG_SVG=$(wildcard $(FIGS)/*.svg)
62 | FIG_PDF=$(FIG_SVG:.svg=.pdf)
63 | else
64 | FIG_PDF=
65 | endif
66 |
67 | ## If $(TARGET).tex refers to .bib files like \bibliography{foo,bar}, then
68 | ## $(BIBFILES) will contain foo.bib and bar.bib, and both files will be added as
69 | ## dependencies to $(PDFTARGETS).
70 | ## Effect: updating a .bib file will trigger re-typesetting.
71 | BIBFILES += $(patsubst %,%.bib,\
72 | $(shell grep '^[^%]*\\bibliography{' $(TEXTARGETS) | \
73 | grep -o '\\bibliography{[^}]\+}' | \
74 | sed -e 's/^[^%]*\\bibliography{\([^}]*\)}.*/\1/' \
75 | -e 's/, */ /g'))
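## Example: a line "\bibliography{refs,tools}" in $(TARGET).tex yields BIBFILES
## containing refs.bib and tools.bib (file names here are illustrative).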
76 |
77 | ## Add \input'ed or \include'd files to $(PDFTARGETS) dependencies; ignore
78 | ## .tex extensions.
79 | INCLUDEDTEX = $(patsubst %,%.tex,\
80 | $(shell grep '^[^%]*\\\(input\|include\){' $(TEXTARGETS) | \
81 | grep -o '\\\(input\|include\){[^}]\+}' | \
82 | sed -e 's/^.*{\([^}]*\)}.*/\1/' \
83 | -e 's/\.tex$$//'))
84 |
85 | AUXFILES += $(INCLUDEDTEX:.tex=.aux)
86 |
87 | ## grab a version number from the repository (if any) that stores this.
88 | ## * REVISION is the current revision number (short form, for inclusion in text)
89 | ## * VCSTURD is a file that gets touched after a repo update
90 | SPACE = $(empty) $(empty)
91 | ifeq ($(shell git status >/dev/null 2>&1 && echo USING_GIT),USING_GIT)
92 | ifeq ($(shell git svn info >/dev/null 2>&1 && echo USING_GIT_SVN),USING_GIT_SVN)
93 | # git-svn
94 | ifeq ($(REVISION),)
95 | REVISION := $(shell git svn find-rev git-svn)
96 | endif
97 | VCSTURD := $(subst $(SPACE),\ ,$(shell git rev-parse --git-dir)/refs/remotes/git-svn)
98 | else
99 | # plain git
100 | ifeq ($(REVISION),)
101 | REVISION := $(shell git describe --always HEAD)
102 | TIME := $(shell git show -s --date=format:'%m/%d %H:%M' --pretty='format: %cd')
103 | endif
104 | GIT_BRANCH := $(shell git symbolic-ref HEAD 2>/dev/null)
105 | VCSTURD := $(subst $(SPACE),\ ,$(shell git rev-parse --git-dir)/$(GIT_BRANCH))
106 | endif
107 | else ifeq ($(shell hg root >/dev/null 2>&1 && echo USING_HG),USING_HG)
108 | # mercurial
109 | ifeq ($(REVISION),)
110 | REVISION := $(shell hg id -i)
111 | endif
112 | VCSTURD := $(subst $(SPACE),\ ,$(shell hg root)/.hg/dirstate)
113 | else ifeq ($(shell svn info >/dev/null && echo USING_SVN),USING_SVN)
114 | # subversion
115 | ifeq ($(REVISION),)
116 | REVISION := $(subst :,-,$(shell svnversion -n))
117 | endif
118 | VCSTURD := $(addsuffix /.svn/entries, $(shell svn info | grep 'Root Path' | sed -e 's/\(.*\:\)\(.*\) /\2/'))
119 | endif
120 |
121 | # .PHONY names all targets that aren't filenames
122 | .PHONY: all clean pdf view snapshot distill distclean
123 |
124 | all: pdf $(AFTERALL)
125 |
126 | ifeq ($(shell which inkscape >/dev/null 2>&1 && echo USING_INKSCAPE),USING_INKSCAPE)
127 | $(FIGS)/%.pdf: $(FIGS)/%.svg ## Figures for the manuscript
128 | inkscape -C -z --file=$< --export-pdf=$@ 2> /dev/null
129 | endif
130 |
131 | pdf: $(FIG_PDF) $(PDFTARGETS)
132 |
133 | view: $(PDFTARGETS)
134 | $(PDFVIEWER) $(PDFTARGETS)
135 |
136 | # define a \Revision{} command you can include in your document's preamble.
137 | # especially useful with e.g. draftfooter.sty or fancyhdr.
138 | # usage: \input{revision}
139 | # ... \Revision{}
140 | ifneq ($(REVISION),)
141 | REVDEPS += revision.tex
142 | revision.tex: $(VCSTURD)
143 | printf '\\newcommand{\Revision}'"{$(subst _,\_,$(REVISION)) ($(strip $(TIME)))}" > $@
144 | AUXFILES += revision.aux
145 | endif
146 |
147 | # to generate aux but not pdf from pdflatex, use -draftmode
148 | %.aux: %.tex $(REVDEPS)
149 | $(PDFLATEX) -draftmode $* $(ERRFILTER)
150 |
151 | # specify KEEPAUX=1 if you need to keep auxiliary (.aux) files for some other
152 | # tool (e.g., an autocompleting text editor)
153 | ifneq ($(KEEPAUX),1)
154 | .INTERMEDIATE: $(AUXFILES)
155 | endif
156 |
157 | # introduce BibTeX dependency if we found a \bibliography
158 | ifneq ($(strip $(BIBFILES)),)
159 | BIBDEPS = %.bbl
160 | %.bbl: %.aux $(BIBFILES)
161 | $(BIBTEX) $*
162 | endif
163 |
164 | # introduce makeglossaries dependency if we found \printglossary/ies
165 | HAS_GLOSSARIES = $(shell \
166 | grep '^[^%]*\\printglossar\(ies\|y\)' $(TEXTARGETS) $(INCLUDEDTEX) && \
167 | echo HAS_GLOSSARIES)
168 | ifneq ($(HAS_GLOSSARIES),)
169 | GLSDEPS = %.gls
170 | %.gls: %.aux
171 | $(MAKEGLOSSARIES) $(TARGETS)
172 | endif
173 |
174 | # introduce makenomenclature dependency if we found \printnomenclature
175 | HAS_NOMENCL = $(shell \
176 | grep '^[^%]*\\printnomenclature' $(TEXTARGETS) $(INCLUDEDTEX) && \
177 | echo HAS_NOMENCL)
178 | ifneq ($(HAS_NOMENCL),)
179 | NLSDEPS = %.nls
180 | %.nls: %.nlo
181 | $(MAKENOMENCL) $(TARGETS).nlo -s nomencl.ist -o $(TARGETS).nls
182 | endif
183 |
184 | $(PDFTARGETS): %.pdf: %.tex %.aux $(GLSDEPS) $(BIBDEPS) $(INCLUDEDTEX) $(REVDEPS) $(NLSDEPS) $(EXTRADEPS)
185 | $(PDFLATEX) $* $(ERRFILTER)
186 | ifneq ($(strip $(BIBFILES)),)
187 | @if egrep -q "undefined (references|citations)" $*.log; then \
188 | $(BIBTEX) $* && $(PDFLATEX) $* $(ERRFILTER); fi
189 | endif
190 | @while grep -q "$(RERUN_PATTERN)" $*.log; do \
191 | $(PDFLATEX) $* $(ERRFILTER); done
192 |
193 | DRAFTS := $(PDFTARGETS:.pdf=-$(REVISION).pdf)
194 | $(DRAFTS): %-$(REVISION).pdf: %.pdf
195 | cp $< $@
196 | snapshot: $(DRAFTS)
197 |
198 | %.distilled.pdf: %.pdf
199 | gs -q -dSAFER -dNOPAUSE -dBATCH -sDEVICE=pdfwrite -sOutputFile=$@ \
200 | -dCompatibilityLevel=1.5 -dPDFSETTINGS=/prepress -c .setpdfwrite -f $<
201 | exiftool -overwrite_original -Title="" -Creator="" -CreatorTool="" $@
202 |
203 | distill: $(PDFTARGETS:.pdf=.distilled.pdf)
204 |
205 | distclean: clean
206 | $(RM) $(PDFTARGETS) $(PDFTARGETS:.pdf=.distilled.pdf) $(EXTRADISTCLEAN)
207 |
208 | clean:
209 | $(RM) $(foreach T,$(TARGETS), \
210 | $(T).bbl $(T).bcf $(T).bit $(T).blg \
211 | $(T)-blx.bib $(T).brf $(T).fdb_latexmk \
212 | $(T).fls $(T).glg $(T).glo $(T).gls \
213 | $(T).glsdefs $(T).glx $(T).gxg \
214 | $(T).gxs $(T).idx $(T).ilg $(T).ind \
215 | $(T).ist $(T).loa $(T).lof $(T).lol \
216 | $(T).lot $(T).maf $(T).mtc $(T).nav \
217 | $(T).out $(T).pag $(T).run.xml $(T).snm \
218 | $(T).svn $(T).tdo $(T).tns $(T).toc \
219 | $(T).vtc $(T).url) \
220 | $(REVDEPS) $(AUXFILES) $(LOGFILES) \
221 | $(EXTRACLEAN) $(FIG_PDF)
222 |
--------------------------------------------------------------------------------
/camera-ready/ACM-Reference-Format.bst:
--------------------------------------------------------------------------------
1 | %%% -*-BibTeX-*-
2 | %%% ====================================================================
3 | %%% @BibTeX-style-file{
4 | %%% author = "Nelson H. F. Beebe, Boris Veytsman and Gerald Murray",
5 | %%% version = "2.1",
6 | %%% date = "14 June 2017",
7 | %%% filename = "ACM-Reference-Format.bst",
8 | %%% email = "borisv@lk.net, boris@varphi.com",
9 | %%% codetable = "ISO/ASCII",
10 | %%% keywords = "ACM Transactions bibliography style; BibTeX",
11 | %%% license = "public domain",
12 | %%% supported = "yes",
13 | %%% abstract = "",
14 | %%% }
15 | %%% ====================================================================
16 |
17 | %%% Revision history: see source in git
18 |
19 | ENTRY
20 | { address
21 | advisor
22 | archiveprefix
23 | author
24 | booktitle
25 | chapter
26 | city
27 | date
28 | edition
29 | editor
30 | eprint
31 | eprinttype
32 | eprintclass
33 | howpublished
34 | institution
35 | journal
36 | key
37 | location
38 | month
39 | note
40 | number
41 | organization
42 | pages
43 | primaryclass
44 | publisher
45 | school
46 | series
47 | title
48 | type
49 | volume
50 | year
51 | % New keys recognized
52 | issue % UTAH: used in, e.g., ACM SIGSAM Bulletin and ACM Communications in Computer Algebra
53 | articleno
54 | eid
55 | day % UTAH: needed for newspapers, weeklies, bi-weeklies
56 | doi % UTAH
57 | url % UTAH
58 | bookpages % UTAH
59 | numpages
60 | lastaccessed % UTAH: used only for @Misc{...}
61 | coden % UTAH
62 | isbn % UTAH
63 | isbn-13 % UTAH
64 | issn % UTAH
65 | lccn % UTAH
66 | }
67 | {}
68 | { label.year extra.label sort.year sort.label basic.label.year}
69 |
70 | INTEGERS { output.state before.all mid.sentence after.sentence after.block }
71 |
72 | INTEGERS { show-isbn-10-and-13 } % initialized below in begin.bib
73 |
74 | INTEGERS { nameptr namesleft numnames }
75 |
76 | INTEGERS { multiresult }
77 |
78 | INTEGERS { len }
79 |
80 | INTEGERS { last.extra.num }
81 |
82 | STRINGS { s t t.org u }
83 |
84 | STRINGS { last.label next.extra }
85 |
86 | STRINGS { p1 p2 p3 page.count }
87 |
88 |
89 | FUNCTION { not }
90 | {
91 | { #0 }
92 | { #1 }
93 | if$
94 | }
95 |
96 | FUNCTION { and }
97 | {
98 | 'skip$
99 | { pop$ #0 }
100 | if$
101 | }
102 |
103 | FUNCTION { or }
104 | {
105 | { pop$ #1 }
106 | 'skip$
107 | if$
108 | }
109 |
110 |
111 | FUNCTION { dump.stack.1 }
112 | {
113 | duplicate$ "STACK[top] = [" swap$ * "]" * warning$
114 | }
115 |
116 | FUNCTION { dump.stack.2 }
117 | {
118 | duplicate$ "STACK[top ] = [" swap$ * "]" * warning$
119 | swap$
120 | duplicate$ "STACK[top-1] = [" swap$ * "]" * warning$
121 | swap$
122 | }
123 |
124 | FUNCTION { empty.or.unknown }
125 | {
126 | %% Examine the top stack entry, and push 1 if it is empty, or
127 | %% consists only of whitespace, or is a string beginning with two
128 | %% queries (??), and otherwise, push 0.
129 | %%
130 | %% This function provides a replacement for empty$, with the
131 | %% convenient feature that unknown values marked by two leading
132 | %% queries are treated the same as missing values, and thus, do not
133 | %% appear in the output .bbl file, and yet, their presence in .bib
134 | %% file(s) serves to mark values which are temporarily missing, but
135 | %% are expected to be filled in eventually once more data is
136 | %% obtained. The TeX User Group and BibNet bibliography archives
137 | %% make extensive use of this practice.
138 | %%
139 | %% An empty string cannot serve the same purpose, because just as in
140 | %% statistics data processing, an unknown value is not the same as an
141 | %% empty value.
142 | %%
143 | %% At entry: stack = ... top:[string]
144 | %% At exit: stack = ... top:[0 or 1]
145 |
146 | duplicate$ empty$
147 | { pop$ #1 }
148 | { #1 #2 substring$ "??" = }
149 | if$
150 | }
151 |
152 | FUNCTION { writeln }
153 | {
154 | %% In BibTeX style files, the sequences
155 | %%
156 | %% ... "one" "two" output
157 | %% ... "one" "two" output.xxx
158 | %%
159 | %% ship "one" to the output file, possibly following by punctuation,
160 | %% leaving the stack with
161 | %%
162 | %% ... "two"
163 | %%
164 | %% There is thus a one-string lag in output processing that must be
165 | %% carefully handled to avoid duplicating a string in the output
166 | %% file. Unless otherwise noted, all output.xxx functions leave
167 | %% just one new string on the stack, and that model should be borne
168 | %% in mind when reading or writing function code.
169 | %%
170 | %% BibTeX's asynchronous buffering of output from strings from the
171 | %% stack is confusing because newline$ bypasses the buffer. It
172 | %% would have been so much easier for newline to be a character
173 | %% rather than a state of the output-in-progress.
174 | %%
175 | %% The documentation in btxhak.dvi is WRONG: it says
176 | %%
177 | %% newline$ Writes onto the bbl file what's accumulated in the
178 | %% output buffer. It writes a blank line if and only
179 | %% if the output buffer is empty. Since write$ does
180 | %% reasonable line breaking, you should use this
181 | %% function only when you want a blank line or an
182 | %% explicit line break.
183 | %%
184 | %% write$ Pops the top (string) literal and writes it on the
185 | %% output buffer (which will result in stuff being
186 | %% written onto the bbl file when the buffer fills
187 | %% up).
188 | %%
189 | %% Examination of the BibTeX source code shows that write$ does
190 | %% indeed behave as claimed, but newline$ sends a newline character
191 | %% directly to the output file, leaving the stack unchanged. The
192 | %% first line "Writes onto ... buffer." is therefore wrong.
193 | %%
194 | %% The original BibTeX style files almost always use "write$ newline$"
195 | %% in that order, so it makes sense to hide that pair in a private
196 | %% function like this one, named after a statement in Pascal,
197 | %% the programming language embedded in the BibTeX Web program.
198 |
199 | write$ % output top-of-stack string
200 | newline$ % immediate write of newline (not via stack)
201 | }
202 |
203 | FUNCTION { init.state.consts }
204 | {
205 | #0 'before.all :=
206 | #1 'mid.sentence :=
207 | #2 'after.sentence :=
208 | #3 'after.block :=
209 | }
210 |
211 | FUNCTION { output.nonnull }
212 | { % Stack in: ... R S T Stack out: ... R T File out: S
213 | 's :=
214 | output.state mid.sentence =
215 | {
216 | ", " * write$
217 | }
218 | {
219 | output.state after.block =
220 | {
221 | add.period$ writeln
222 | "\newblock " write$
223 | }
224 | {
225 | output.state before.all =
226 | {
227 | write$
228 | }
229 | {
230 | add.period$ " " * write$
231 | }
232 | if$
233 | }
234 | if$
235 | mid.sentence 'output.state :=
236 | }
237 | if$
238 | s
239 | }
240 |
241 | FUNCTION { output.nonnull.dot.space }
242 | { % Stack in: ... R S T Stack out: ... R T File out: S
243 | 's :=
244 | output.state mid.sentence = % { ". " * write$ }
245 | {
246 | ". " * write$
247 | }
248 | {
249 | output.state after.block =
250 | {
251 | add.period$ writeln "\newblock " write$
252 | }
253 | {
254 | output.state before.all =
255 | {
256 | write$
257 | }
258 | {
259 | add.period$ " " * write$
260 | }
261 | if$
262 | }
263 | if$
264 | mid.sentence 'output.state :=
265 | }
266 | if$
267 | s
268 | }
269 |
270 | FUNCTION { output.nonnull.remove }
271 | { % Stack in: ... R S T Stack out: ... R T File out: S
272 | 's :=
273 | output.state mid.sentence =
274 | {
275 | " " * write$
276 | }
277 | {
278 | output.state after.block =
279 | {
280 | add.period$ writeln "\newblock " write$
281 | }
282 | {
283 | output.state before.all =
284 | {
285 | write$
286 | }
287 | {
288 | add.period$ " " * write$
289 | }
290 | if$
291 | }
292 | if$
293 | mid.sentence 'output.state :=
294 | }
295 | if$
296 | s
297 | }
298 |
299 | FUNCTION { output.nonnull.removenospace }
300 | { % Stack in: ... R S T Stack out: ... R T File out: S
301 | 's :=
302 | output.state mid.sentence =
303 | {
304 | "" * write$
305 | }
306 | {
307 | output.state after.block =
308 | {
309 | add.period$ writeln "\newblock " write$
310 | }
311 | {
312 | output.state before.all =
313 | {
314 | write$
315 | }
316 | {
317 | add.period$ " " * write$
318 | }
319 | if$
320 | }
321 | if$
322 | mid.sentence 'output.state :=
323 | }
324 | if$
325 | s
326 | }
327 |
328 | FUNCTION { output }
329 | { % discard top token if empty, else like output.nonnull
330 | duplicate$ empty.or.unknown
331 | 'pop$
332 | 'output.nonnull
333 | if$
334 | }
335 |
336 | FUNCTION { output.dot.space }
337 | { % discard top token if empty, else like output.nonnull.dot.space
338 | duplicate$ empty.or.unknown
339 | 'pop$
340 | 'output.nonnull.dot.space
341 | if$
342 | }
343 |
344 | FUNCTION { output.removenospace }
345 | { % discard top token if empty, else like output.nonnull.removenospace
346 | duplicate$ empty.or.unknown
347 | 'pop$
348 | 'output.nonnull.removenospace
349 | if$
350 | }
351 |
352 | FUNCTION { output.check }
353 | { % like output, but warn if key name on top-of-stack is not set
354 | 't :=
355 | duplicate$ empty.or.unknown
356 | { pop$ "empty " t * " in " * cite$ * warning$ }
357 | 'output.nonnull
358 | if$
359 | }
360 |
361 | FUNCTION { bibinfo.output.check }
362 | { % like output.check, adding bibinfo field
363 | 't :=
364 | duplicate$ empty.or.unknown
365 | { pop$ "empty " t * " in " * cite$ * warning$ }
366 | { "\bibinfo{" t "}{" * * swap$ * "}" *
367 | output.nonnull }
368 | if$
369 | }
370 |
371 | FUNCTION { output.check.dot.space }
372 | { % like output.dot.space, but warn if key name on top-of-stack is not set
373 | 't :=
374 | duplicate$ empty.or.unknown
375 | { pop$ "empty " t * " in " * cite$ * warning$ }
376 | 'output.nonnull.dot.space
377 | if$
378 | }
379 |
380 | FUNCTION { fin.block }
381 | { % functionally, but not logically, identical to fin.entry
382 | add.period$
383 | writeln
384 | }
385 |
386 | FUNCTION { fin.entry }
387 | {
388 | add.period$
389 | writeln
390 | }
391 |
392 | FUNCTION { new.sentence }
393 | { % update sentence state, with neither output nor stack change
394 | output.state after.block =
395 | 'skip$
396 | {
397 | output.state before.all =
398 | 'skip$
399 | { after.sentence 'output.state := }
400 | if$
401 | }
402 | if$
403 | }
404 |
405 | FUNCTION { fin.sentence }
406 | {
407 | add.period$
408 | write$
409 | new.sentence
410 | ""
411 | }
412 |
413 | FUNCTION { new.block }
414 | {
415 | output.state before.all =
416 | 'skip$
417 | { after.block 'output.state := }
418 | if$
419 | }
420 |
421 | FUNCTION { output.coden } % UTAH
422 | { % output non-empty CODEN as one-line sentence (stack untouched)
423 | coden empty.or.unknown
424 | { }
425 | { "\showCODEN{" coden * "}" * writeln }
426 | if$
427 | }
428 |
429 | %
430 | % Sometimes articleno starts with the word 'Article' or 'Paper'.
431 | % (this is a bug of acmdl, sigh)
432 | % We strip them. We assume eid or articleno is already on stack
433 | %
434 |
435 | FUNCTION { strip.articleno.or.eid }
436 | {
437 | 't :=
438 | t #1 #7 substring$ "Article" =
439 | {t #8 t text.length$ substring$ 't :=}
440 | { }
441 | if$
442 | t #1 #7 substring$ "article" =
443 | {t #8 t text.length$ substring$ 't :=}
444 | { }
445 | if$
446 | t #1 #5 substring$ "Paper" =
447 | {t #6 t text.length$ substring$ 't :=}
448 | { }
449 | if$
450 | t #1 #5 substring$ "paper" =
451 | {t #6 t text.length$ substring$ 't :=}
452 | { }
453 | if$
454 | % Strip any left trailing space or ~
455 | t #1 #1 substring$ " " =
456 | {t #2 t text.length$ substring$ 't :=}
457 | { }
458 | if$
459 | t #1 #1 substring$ "~" =
460 | {t #2 t text.length$ substring$ 't :=}
461 | { }
462 | if$
463 | t
464 | }
465 |
466 |
467 | FUNCTION { format.articleno }
468 | {
469 | articleno empty.or.unknown not eid empty.or.unknown not and
470 | { "Both articleno and eid are defined for " cite$ * warning$ }
471 | 'skip$
472 | if$
473 | articleno empty.or.unknown eid empty.or.unknown and
474 | { "" }
475 | {
476 | numpages empty.or.unknown
477 | { "articleno or eid field, but no numpages field, in "
478 | cite$ * warning$ }
479 | { }
480 | if$
481 | eid empty.or.unknown
482 | { "Article \bibinfo{articleno}{" articleno strip.articleno.or.eid * "}" * }
483 | { "Article \bibinfo{articleno}{" eid strip.articleno.or.eid * "}" * }
484 | if$
485 | }
486 | if$
487 | }
488 |
489 | FUNCTION { format.year }
490 | { % push year string or "[n.\,d.]" onto output stack
491 | %% Because year is a mandatory field, we always force SOMETHING
492 | %% to be output
493 | "\bibinfo{year}{"
494 | year empty.or.unknown
495 | { "[n.\,d.]" }
496 | { year }
497 | if$
498 | * "}" *
499 | }
500 |
501 | FUNCTION { format.day.month }
502 | { % push "day month " or "month " or "" onto output stack
503 | day empty.or.unknown
504 | {
505 | month empty.or.unknown
506 | { "" }
507 | { "\bibinfo{date}{" month * "} " *}
508 | if$
509 | }
510 | {
511 | month empty.or.unknown
512 | { "" }
513 | { "\bibinfo{date}{" day * " " * month * "} " *}
514 | if$
515 | }
516 | if$
517 | }
518 |
519 | FUNCTION { format.day.month.year } % UTAH
520 | { % if month is empty, push "" else push "(MON.)" or "(DD MON.)"
521 | % Needed for frequent periodicals: 2008. ... New York Times C-1, C-2, C-17 (23 Oct.)
522 | % acm-*.bst addition: prefix parenthesized date string with
523 | % ", Article nnn "
524 | articleno empty.or.unknown eid empty.or.unknown and
525 | { "" }
526 | { output.state after.block =
527 | {", " format.articleno * }
528 | { format.articleno }
529 | if$
530 | }
531 | if$
532 | " (" * format.day.month * format.year * ")" *
533 | }
534 |
535 | FUNCTION { output.day.month.year } % UTAH
536 | { % if month is empty value, do nothing; else output stack top and
537 | % leave with new top string "(MON.)" or "(DD MON.)"
538 | % Needed for frequent periodicals: 2008. ... New York Times C-1, C-2, C-17 (23 Oct.)
539 | format.day.month.year
540 | output.nonnull.remove
541 | }
542 |
543 | FUNCTION { strip.doi } % UTAH
544 | { % Strip any Web address prefix to recover the bare DOI, leaving the
545 | % result on the output stack, as recommended by CrossRef DOI
546 | % documentation.
547 | % For example, reduce "http://doi.acm.org/10.1145/1534530.1534545" to
548 | % "10.1145/1534530.1534545". A suitable URL is later typeset and
549 | % displayed as the LAST item in the reference list entry. Publisher Web
550 | % sites wrap this with a suitable link to a real URL to resolve the DOI,
551 | % and the master https://doi.org/ address is preferred, since publisher-
552 | % specific URLs can disappear in response to economic events. All
553 | % journals are encouraged by the DOI authorities to use that typeset
554 | % format and link procedures for uniformity across all publications that
555 | % include DOIs in reference lists.
556 | % The numeric prefix is guaranteed to start with "10.", so we use
557 | % that as a test.
558 | % 2017-02-04 Added stripping of https:// (Boris)
559 | doi #1 #3 substring$ "10." =
560 | { doi }
561 | {
562 | doi 't := % get modifiable copy of DOI
563 |
564 | % Change https:// to http:// to strip both prefixes (BV)
565 |
566 | t #1 #8 substring$ "https://" =
567 | { "http://" t #9 t text.length$ #8 - substring$ * 't := }
568 | { }
569 | if$
570 |
571 | t #1 #7 substring$ "http://" =
572 | {
573 | t #8 t text.length$ #7 - substring$ 't :=
574 |
575 | "INTERNAL STYLE-FILE ERROR" 's :=
576 |
577 | % search for next "/" and assign its suffix to s
578 |
579 | { t text.length$ }
580 | {
581 | t #1 #1 substring$ "/" =
582 | {
583 | % save rest of string as true DOI (should be 10.xxxx/yyyy)
584 | t #2 t text.length$ #1 - substring$ 's :=
585 | "" 't := % empty string t terminates the loop
586 | }
587 | {
588 | % discard first character and continue loop: t <= substring(t,2,last)
589 | t #2 t text.length$ #1 - substring$ 't :=
590 | }
591 | if$
592 | }
593 | while$
594 |
595 | % check for valid DOI (should be 10.xxxx/yyyy)
596 | s #1 #3 substring$ "10." =
597 | { }
598 | { "unrecognized DOI substring " s * " in DOI value [" * doi * "]" * warning$ }
599 | if$
600 |
601 | s % push the stripped DOI on the output stack
602 |
603 | }
604 | {
605 | "unrecognized DOI value [" doi * "]" * warning$
606 | doi % push the unrecognized original DOI on the output stack
607 | }
608 | if$
609 | }
610 | if$
611 | }
612 |
613 | %
614 | % Change by BV: added standard prefix to URL
615 | %
616 | FUNCTION { output.doi } % UTAH
617 | { % output non-empty DOI as one-line sentence (stack untouched)
618 | doi empty.or.unknown
619 | { }
620 | {
621 | %% Use \urldef here for the same reason it is used in output.url,
622 | %% see output.url for further discussion.
623 | "\urldef\tempurl%" writeln
624 | "\url{https://doi.org/" strip.doi * "}" * writeln
625 | "\showDOI{\tempurl}" writeln
626 | }
627 | if$
628 | }
629 |
630 | FUNCTION { output.isbn } % UTAH
631 | { % output non-empty ISBN-10 and/or ISBN-13 as one-line sentences (stack untouched)
632 | show-isbn-10-and-13
633 | {
634 | %% show both 10- and 13-digit ISBNs
635 | isbn empty.or.unknown
636 | { }
637 | {
638 | "\showISBNx{" isbn * "}" * writeln
639 | }
640 | if$
641 | isbn-13 empty.or.unknown
642 | { }
643 | {
644 | "\showISBNxiii{" isbn-13 * "}" * writeln
645 | }
646 | if$
647 | }
648 | {
649 | %% show 10-digit ISBNs only if 13-digit ISBNs not available
650 | isbn-13 empty.or.unknown
651 | {
652 | isbn empty.or.unknown
653 | { }
654 | {
655 | "\showISBNx{" isbn * "}" * writeln
656 | }
657 | if$
658 | }
659 | {
660 | "\showISBNxiii{" isbn-13 * "}" * writeln
661 | }
662 | if$
663 | }
664 | if$
665 | }
666 |
667 | FUNCTION { output.issn } % UTAH
668 | { % output non-empty ISSN as one-line sentence (stack untouched)
669 | issn empty.or.unknown
670 | { }
671 | { "\showISSN{" issn * "}" * writeln }
672 | if$
673 | }
674 |
675 | FUNCTION { output.issue }
676 | { % output non-empty issue number as a one-line sentence (stack untouched)
677 | issue empty.or.unknown
678 | { }
679 | { "Issue " issue * "." * writeln }
680 | if$
681 | }
682 |
683 | FUNCTION { output.lccn } % UTAH
684 | { % return with stack untouched
685 | lccn empty.or.unknown
686 | { }
687 | { "\showLCCN{" lccn * "}" * writeln }
688 | if$
689 | }
690 |
691 | FUNCTION { output.note } % UTAH
692 | { % return with stack empty
693 | note empty.or.unknown
694 | { }
695 | { "\shownote{" note add.period$ * "}" * writeln }
696 | if$
697 | }
698 |
699 | FUNCTION { output.note.check } % UTAH
700 | { % return with stack empty
701 | note empty.or.unknown
702 | { "empty note in " cite$ * warning$ }
703 | { "\shownote{" note add.period$ * "}" * writeln }
704 | if$
705 | }
706 |
707 | FUNCTION { output.eprint } %
708 | { % return with stack empty
709 | eprint empty.or.unknown
710 | { }
711 | { "\showeprint"
712 | archiveprefix empty.or.unknown
713 | { eprinttype empty.or.unknown
714 | { }
715 | { "[" eprinttype "]" * * * }
716 | if$
717 | }
718 | { "[" archiveprefix "l" change.case$ "]" * * * }
719 | if$
720 | "{" eprint "}" * * *
721 | primaryclass empty.or.unknown
722 | { eprintclass empty.or.unknown
723 | { }
724 | { "~[" eprintclass "]" * * * }
725 | if$
726 | }
727 | { "~[" primaryclass "]" * * * }
728 | if$
729 | writeln
730 | }
731 | if$
732 | }
733 |
734 |
735 | %
736 | % Changes by BV 2011/04/15. Do not output
737 | % url if doi is defined
738 | %
739 | FUNCTION { output.url } % UTAH
740 | { % return with stack untouched
741 | % output URL and associated lastaccessed fields
742 | doi empty.or.unknown
743 | {
744 | url empty.or.unknown
745 | { }
746 | {
747 | %% Use \urldef, outside \showURL, so that %nn, #, etc in URLs work
748 | %% correctly. Put the actual URL on its own line to reduce the
749 | %% likelihood of BibTeX's nasty line wrapping after column 79.
750 | %% \url{} can undo this, but if that doesn't work for some reason
751 | %% the .bbl file would have to be repaired manually.
752 | "\urldef\tempurl%" writeln
753 | "\url{" url * "}" * writeln
754 |
755 | "\showURL{%" writeln
756 | lastaccessed empty.or.unknown
757 | { "" }
758 | { "Retrieved " lastaccessed * " from " * }
759 | if$
760 | "\tempurl}" * writeln
761 | }
762 | if$
763 | }
764 | { }
765 | if$
766 | }
767 |
768 | FUNCTION { output.year.check }
769 | { % warn if year empty, output top string and leave " YEAR