├── .gitignore
├── LICENSE
├── README.md
└── stack_usage.py
/.gitignore:
--------------------------------------------------------------------------------
1 | # ply, python artefacts
2 | parser.out
3 | parsetab.*
4 |
--------------------------------------------------------------------------------
/LICENSE:
--------------------------------------------------------------------------------
1 | BSD 3-Clause License
2 |
3 | Copyright (c) 2020, Simon Wright
4 | All rights reserved.
5 |
6 | Redistribution and use in source and binary forms, with or without
7 | modification, are permitted provided that the following conditions are met:
8 |
9 | 1. Redistributions of source code must retain the above copyright notice, this
10 | list of conditions and the following disclaimer.
11 |
12 | 2. Redistributions in binary form must reproduce the above copyright notice,
13 | this list of conditions and the following disclaimer in the documentation
14 | and/or other materials provided with the distribution.
15 |
16 | 3. Neither the name of the copyright holder nor the names of its
17 | contributors may be used to endorse or promote products derived from
18 | this software without specific prior written permission.
19 |
20 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
21 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
22 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
23 | DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
24 | FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
25 | DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
26 | SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
27 | CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
28 | OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
29 | OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
30 |
--------------------------------------------------------------------------------
/README.md:
--------------------------------------------------------------------------------
1 | # Stack Usage #
2 |
3 | One of the more tricky problems when coding for an embedded system is ensuring that each task has enough stack space for its needs.
4 |
5 | Most microcontroller units (MCUs) have limited amounts of RAM. On a general-purpose machine, for example a Linux desktop, it would in theory be possible for a potential stack overflow to be recognised and additional stack space allocated; even if that's not possible, a large stack can be pre-allocated using virtual memory without any immediate cost.
6 |
7 | This can't be done if the MCU only has real memory. Stack overflow may or may not be caught. If you're lucky there'll be enough information left behind for you to use the debugger to find the reason for the resulting crash straight away; if not, it could take a while. Under [FreeRTOS](https://www.freertos.org) you're very likely to get a hard fault to add to the joy (stacks and task control blocks live close together, so running off the end of your stack is likely to trample on the stored state of another task, resulting in that task trying to access invalid memory).
8 |
9 | The Python program `stack_usage.py` is intended to help with this (it's not a panacea, though! If you have [AdaCore](https://www.adacore.com) support, you'll be better off using [GNATstack](https://www.adacore.com/gnatpro/toolsuite/gnatstack)).
10 |
11 | The initial motivation for this work was a hard fault encountered while writing a test program to check that Ada [timing events](http://www.ada-auth.org/standards/rm12_w_tc1/html/RM-D-15.html) work properly (well, usably) with the [FreeRTOS](https://www.freertos.org)-based [Cortex GNAT RTS](https://github.com/simonjwright/cortex-gnat-rts).
12 |
13 | ## Requirements ##
14 |
15 | `stack_usage` has been developed on macOS Mojave using Python 2.7 and 3.7 and [PLY](https://www.dabeaz.com/ply/ply.html) (**P**ython **L**ex and **Y**acc). To install PLY,
16 |
17 | ``` sh
18 | pip install --user ply
19 | ```
20 |
21 | It relies on the information generated by the GCC compiler (FSF GCC 10 or later, GNAT GPL 2015 or later) using the switch `-fcallgraph-info=su,da`.
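
Each internal node in a generated `.ci` file carries a `label` whose fields (symbol, source location, static stack size, dynamic-object count) are separated by literal `\n` sequences. A minimal sketch of the extraction `stack_usage.py` performs, using the tool's actual regular expression on a made-up label:

```python
import re

# matcher used by stack_usage.py for an internal node's label
internal_matcher = re.compile(r'^(\S+)\\n(.*)\\n(\d+).*\\n(\d+).*$')

# hypothetical label text: literal backslash-n separators, not real newlines
label = r'foo\nfoo.adb:10:7\n16 bytes (static)\n0 dynamic objects'

match = internal_matcher.match(label)
source = match.group(2)                # 'foo.adb:10:7'
static_stack = int(match.group(3))     # 16, stack used by foo itself
dynamic_objects = int(match.group(4))  # 0
```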
22 |
23 | ## Use ##
24 |
25 | ```
26 | usage: stack_usage.py [flags] ci-files
27 | flags:
28 | -h, --help: output this message
29 | -s, --save=file: save data in file
30 | -l, --load=file: restore previously saved data
31 | -o, --output=file: file for CSV output (D=stack_usage.csv)
32 | -d, --diagnostics: output diagnostic info on parsing
33 | ```
34 |
35 | Notes:
36 |
37 | * When data is saved, it's in Python's [pickle](https://docs.python.org/2/library/pickle.html) form.
38 | * The diagnostics output is only useful while developing the parsing information.
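
The save/load mechanism is a straightforward pickle round-trip; a minimal sketch of it (a plain dict stands in here for the tool's internal `Graphs` object, since a real saved file can only be unpickled where the classes defined in `stack_usage.py` are importable):

```python
import os
import pickle
import tempfile

# stand-in for the parsed call-graph data
data = {'some.subprogram': 296}

path = os.path.join(tempfile.mkdtemp(), 'saved.p')
with open(path, 'wb') as f:   # what --save does
    pickle.dump(data, f)
with open(path, 'rb') as f:   # what --load does
    restored = pickle.load(f)
```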
39 |
40 | ## Case study ##
41 |
42 | As noted above, the initial motivation for this work was a hard fault encountered while writing a test program to check that Ada [timing events](http://www.ada-auth.org/standards/rm12_w_tc1/html/RM-D-15.html) work properly (well, usably) with the [FreeRTOS](https://www.freertos.org)-based [Cortex GNAT RTS](https://github.com/simonjwright/cortex-gnat-rts).
43 |
44 | Timing events should, per [ARM D.15(25)](http://www.ada-auth.org/standards/rm12_w_tc1/html/RM-D-15.html#p25), "be executed directly by the real-time clock interrupt mechanism". This wasn't done in Cortex GNAT RTS, because of the difficulty of ensuring proper inter-task and inter-interrupt protection (especially tricky on the micro:bit, which uses a Cortex-M0 part); instead, a highest-priority task checks timings and runs handlers when required.
45 |
46 | Handlers are protected procedures, and code for them is generated as for any protected procedure, with the proper locking mechanisms.
47 |
48 | The `Timer` task [was specified as](https://github.com/simonjwright/cortex-gnat-rts/blob/13393a1198f1ff8e50bba54640b98463e88e9181/common/a-rttiev.adb#L112)
49 | ``` ada
50 | task Timer is
51 | pragma Priority (System.Priority'Last);
52 | end Timer;
53 | ```
54 | and implemented (comments elided) as
55 | ``` ada
56 | task body Timer is
57 | Next : Time := Time_First;
58 | Period : constant Time_Span := Milliseconds (5);
59 | begin
60 | loop
61 | Process_Queued_Events (Next_Event_Time => Next);
62 | delay until (if Next = Time_First then Clock + Period else Next);
63 | end loop;
64 | end Timer;
65 | ```
66 | In Cortex GNAT RTS, an Ada task is created as a FreeRTOS task whose initial procedure is a [wrapper](https://github.com/simonjwright/cortex-gnat-rts/blob/master/common/gcc8/s-tarest.adb#L65) which allocates any secondary stack required out of the task's stack (in this case, 10% of the task's default 768 bytes) and invokes a compiler-generated wrapper round the user-written task body.
67 |
68 | The event handlers for this software are specified as
69 | ``` ada
70 | protected LED_Event_Handling is
71 | pragma Interrupt_Priority;
72 | procedure Handle
73 | (Event : in out Ada.Real_Time.Timing_Events.Timing_Event);
74 | private
75 | ...
76 | end LED_Event_Handling;
77 | ```
78 | and are implemented by the compiler as a wrapper, which performs any locking and calls the user-written handler body. This wrapper is called directly by the timer task, so any stack used by the handler body comes out of the timer task's stack.
79 |
80 | It turned out that a lot of the problem was that the standard compilation options used included `-O0` (no optimisation), and this led to the wrapper procedure (see above) using 300 bytes more stack than the 40 bytes it used with `-Og` (which "offer[s] a reasonable level of optimization while maintaining fast compilation and a good debugging experience").
81 |
82 | A further improvement was to eliminate the `Timer` task's secondary stack and bump its main stack:
83 | ``` ada
84 | task Timer
85 | with
86 | Priority => System.Priority'Last,
87 | Storage_Size => 1024,
88 | Secondary_Stack_Size => 0
89 | is
90 | pragma Task_Name ("events_timer");
91 | end Timer;
92 | ```
93 |
94 | At this point, it seemed like a good idea to implement `stack_usage.py`. I also added callgraph code generation as an option to the Cortex GNAT RTS build process.
95 |
96 | To generate raw callgraph info for the micro:bit RTS,
97 | ```
98 | cd ~/cortex-gnat-rts/microbit
99 | make CALLGRAPH=yes clean all install
100 | ```
101 |
102 | To generate raw callgraph info for the `events` test program,
103 | ```
104 | cd ~/cortex-gnat-rts/test-microbit
105 | make clean
106 | gprbuild -p -P testbed events -cargs -Og -fcallgraph-info=su,da
107 | ```
108 |
109 | To process and save the RTS's information,
110 | ```
111 | cd ~/stack_usage
112 | ./stack_usage.py \
113 | --save=microbit.p \
114 | ~/cortex-gnat-rts/microbit/.build/*.ci
115 | ```
116 |
117 | To combine this saved RTS information with the `events` program's data,
118 | ```
119 | ./stack_usage.py \
120 | --load=microbit.p \
121 | --output=events.csv \
122 | ~/cortex-gnat-rts/test-microbit/.build/*.ci
123 | ```
124 |
125 | Looking at `events.csv`, we find that `ada.real_time.timing_events.timerTKVIP` (called during elaboration to create the `Timer` task) uses 264 bytes (from the environment task's stack). The `system.tasking.restricted.stages.wrapper` procedure uses 40 bytes of the `Timer` task's stack, and calls (via a pointer, so invisibly to `stack_usage.py`) `ada.real_time.timing_events.timerTKB`, the task's body, which uses 296 bytes (including 64 bytes used by `ada.real_time.timing_events.process_queued_events`).
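
Since the output is plain CSV with `Caller` and `Depth` columns, it's easy to post-process; for example, a few illustrative lines of Python (the helper name `deepest` is made up) to list the subprograms with the greatest worst-case depth:

```python
import csv

def deepest(csv_path, n=5):
    """Return the n entries with the largest worst-case stack depth."""
    with open(csv_path) as f:
        rows = [(r['Caller'], int(r['Depth'])) for r in csv.DictReader(f)]
    return sorted(rows, key=lambda r: r[1], reverse=True)[:n]
```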
126 |
127 | Again, `stack_usage.py` can't trace into the handler procedure, because it's called via a pointer:
128 | ``` ada
129 | if Handler /= null then
130 | Handler.all (Timing_Event (Next_Event.all));
131 | end if;
132 | ```
133 | so we have to use our knowledge of the actual handlers to find that it's `event_support.led_event_handling.handleN` (312 bytes), wrapped by `event_support.led_event_handling.handleP` (which organises locking) for a total of 328 bytes.
134 |
135 | The total stack usage for the timer task is then predicted to be 40 + 296 + 328 = 664 bytes.
136 |
137 | This is a worst-case value: on [measuring the actual free space](https://github.com/simonjwright/cortex-gnat-rts/wiki/MeasuringStackUsage), the measured peak usage was 424 bytes.
138 |
139 | Why?!
140 |
141 | The worst-case usage for a particular subprogram is the amount it uses for its own purposes (register storage, local variables) plus the maximum used by any of the subprograms it calls. If a particular execution pattern doesn't call that maximal subprogram, then the actual usage will be lower. In this case, one of the most expensive called subprograms was 64-bit scaled arithmetic division, at 192 bytes, called by `Ada.Real_Time.Time_Of` at 248 bytes. I'm assuming that this was never actually invoked.
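
That rule can be sketched in a few lines. The frame sizes below come from this case study, but the call structure is simplified (the pointer call to the handler is wired in by hand, since the tool can't see it), and the real tool additionally caches results and skips self-recursive edges:

```python
# per-subprogram stack needs (bytes), from the case study above
frames = {
    'wrapper': 40,     # system.tasking.restricted.stages.wrapper
    'task_body': 296,  # ada.real_time.timing_events.timerTKB plus callees
    'handler': 328,    # handleP wrapping handleN, reached via a pointer
}
# simplified call graph
calls = {'wrapper': ['task_body'], 'task_body': ['handler'], 'handler': []}

def worst_case(name):
    # own use plus the deepest chain through any callee, recursively
    return frames[name] + max((worst_case(c) for c in calls[name]), default=0)
```

`worst_case('wrapper')` then reproduces the 40 + 296 + 328 = 664 bytes predicted above.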
142 |
143 | ## Restrictions/To Do ##
144 |
145 | ### Ignored subprograms ###
146 |
147 | The tool currently ignores
148 |
149 | * calls to subprograms not compiled with the required options (e.g. `memcmp`)
150 | * dynamic objects (for example, `declare` blocks with variables sized at runtime)
151 | * dispatching calls
152 |
153 | At the very least it should mark the subprograms where there's a potential problem.
154 |
155 | ### Detailed reports ###
156 |
157 | It might be helpful to see details of which subprograms are called by a given subprogram.
158 |
--------------------------------------------------------------------------------
/stack_usage.py:
--------------------------------------------------------------------------------
1 | #!/usr/bin/python
2 | # -*- coding: utf-8 -*-
3 |
4 | # Copyright (C) Simon Wright
5 |
6 | # This package is free software; you can redistribute it and/or
7 | # modify it under terms of the BSD 3-Clause License.
8 |
9 | # Reads .ci files generated by GCC in response to the switch
10 | # -fcallgraph-info=su,da.
11 |
12 | # Reports on the total stack usage by each subprogram and the
13 | # subprograms it calls, recursively.
14 |
15 | # Uses PLY (http://www.dabeaz.com/ply/).
16 |
17 | import csv
18 | import getopt
19 | import os
20 | import pickle
21 | import ply.lex as lex
22 | import ply.yacc as yacc
23 | import re
24 | import sys
25 | import time
26 |
27 | # ----------------------------------------------------------------------
28 | # Object model
29 | # ----------------------------------------------------------------------
30 |
31 | class Graph:
32 | """Contains the interesting content of one .ci file."""
33 | def __init__(self):
34 | # title is the unit filename
35 | self.title = ''
36 | # internal_nodes contains information on the subprograms in
37 | # this unit
38 | self.internal_nodes = []
39 | # external_nodes contains information on subprograms called
40 | # from this unit but external to it
41 | self.external_nodes = []
42 | # edges contains information on the calls made from this unit
43 | self.edges = []
44 | def __str__(self):
45 | result = "graph title: %s" % self.title
46 | for int in self.internal_nodes:
47 | result += "\n" + str(int)
48 | for ext in self.external_nodes:
49 | result += "\n" + str(ext)
50 | for edge in self.edges:
51 | result += "\n" + str(edge)
52 | return result
53 |
54 | class Node(object):
55 | """Effectively abstract: either an InternalNode or an ExternalNode."""
56 | def __init__(self):
57 | # title is the subprograms's symbol, possibly annotated with
58 | # the file name
59 | self.title = ''
60 | # symbol is the subprogram's symbol
61 | self.symbol = ''
62 |         # source is the location line extracted from the node's label
63 | self.source = ''
64 |
65 | class InternalNode(Node):
66 | """A subprogram declared in this unit."""
67 | def __init__(self):
68 | super(InternalNode, self).__init__()
69 | # the amount of stack used in this subprogram
70 | self.static_stack = 0
71 | # the number of dynamic objects (?)
72 | self.dynamic_objects = 0
73 | def __str__(self):
74 | result = "intl: title: %s symbol: %s src: %s stack: %d dobjs: %d" \
75 | % (self.title, self.symbol, self.source,
76 | self.static_stack, self.dynamic_objects)
77 | return result
78 |
79 | class ExternalNode(Node):
80 | """A subprogram called from this unit but not defined here."""
81 | def __init__(self):
82 | super(ExternalNode, self).__init__()
83 | def __str__(self):
84 | result = "extl: title: %s symbol: %s source: %s" \
85 | % (self.title, self.symbol, self.source)
86 | return result
87 |
88 | class Edge:
89 | """Represents a call from a subprogram in this compilation unit. The
90 | target may be local (will be an InternalNode) or not (an ExternalNode).
91 | @source is the calling subprogram.
92 | @target is the called subprogram.
93 | """
94 | def __init__(self):
95 | # source is the calling subprogram
96 | self.source = ''
97 | # target is the called subprogram
98 | self.target = ''
99 | # XXX not sure what this is
100 | self.label = None
101 | def __str__(self):
102 | result = "edge: from: %s to: %s" % (self.source, self.target)
103 | if self.label:
104 | result += ", label: %s" % self.label
105 | else:
106 | result += ", NO LABEL"
107 | return result
108 |
109 | class Graphs:
110 | def __init__(self):
111 | self.graphs = []
112 | # sources is a map {sourcename:(internal_node, stack_used)}
113 | self.sources = {}
114 | # edges is a map {sourcename:[targetname, ...]}
115 | self.edges = {}
116 | # missing contains the symbols that were called but not found
117 | # (probably would be in the external_nodes); used so missing
118 | # symbols only reported once
119 | self.missing = []
120 | def add_ci_file(self, file):
121 | try:
122 | input = open(file, 'r')
123 |         except IOError:
124 | error("couldn't open %s for input.\n" % file)
125 | lexer.input(input.read())
126 | input.close()
127 | g = parser.parse(lexer=lexer, debug=verbosity)
128 | # # in case we need to know whether this is an Ada unit
129 | # ada = g.title
130 | # ada = ada.rsplit('.')[-1]
131 | # ada = ada == 'ads' or ada == 'adb'
132 | self.graphs += [g, ]
133 | for i in g.internal_nodes:
134 | src = i.symbol
135 | if src in self.sources:
136 | warning("duplicate subprogram '%s'" % src)
137 | else:
138 | self.sources[src] = [i, None]
139 | for e in g.edges:
140 | src = e.source
141 | tgt = e.target
142 | if src in self.edges:
143 | if tgt in self.edges[src]:
144 | pass
145 | else:
146 | self.edges[src] += (tgt, )
147 | else:
148 | self.edges[src] = (tgt, )
149 | def _resolve(self, name):
150 | """Calculate the total stack depth required for @name.
151 | If sources[name][1] is None, this means we have to
152 | calculate it as sources[name][0].static_stack + the max of
153 | all the nodes called (recursively).
154 | If not None, we've already done this calculation (it contains
155 | the cached result).
156 | """
157 | if not name in self.sources:
158 | if not name in self.missing:
159 | self.missing += [name, ]
160 | warning("callee '%s' not found" % name)
161 | return 0
162 |         if self.sources[name][1] is None:
163 | if name in self.edges:
164 | called = self.edges[name]
165 |                 stacks = [self._resolve(c) for c in called if c != name]
166 |                 max_called_stack = max(stacks) if stacks else 0
167 | else:
168 | max_called_stack = 0
169 | self.sources[name][1] = self.sources[name][0].static_stack \
170 | + max_called_stack
171 | return self.sources[name][1]
172 | def usage(self):
173 |         """Returns the results, sorted by caller, as a list of
174 |         2-tuples: [(caller, depth), ...]
175 | """
176 | names = self.sources.keys()
177 | return sorted([(n.replace('__', '.'), self._resolve(n)) for n in names],
178 | key=lambda el: el[0])
179 |
180 | # ----------------------------------------------------------------------
181 | # Parser
182 | # ----------------------------------------------------------------------
183 |
184 | def p_start(p):
185 | '''
186 | start \
187 | : GRAPH COLON OPEN_BRACE title graph_contents CLOSE_BRACE
188 | | GRAPH COLON OPEN_BRACE title CLOSE_BRACE
189 | '''
190 | p[0] = Graph()
191 | p[0].title = p[4]
192 | if len(p) > 5:
193 | for c in p[5]:
194 | if isinstance(c, InternalNode):
195 | p[0].internal_nodes += (c, )
196 | elif isinstance(c, ExternalNode):
197 | p[0].external_nodes += (c, )
198 | elif isinstance(c, Edge):
199 | p[0].edges += (c, )
200 |
201 | def p_title(p):
202 | '''
203 | title : TITLE COLON STRING
204 | '''
205 | p[0] = p[3]
206 |
207 | def p_graph_contents(p):
208 | '''
209 | graph_contents \
210 | : graph_item graph_contents
211 | | graph_item
212 | '''
213 | p[0] = (p[1],)
214 | if len(p) == 3:
215 | p[0] += p[2]
216 |
217 | def p_graph_item(p):
218 | '''
219 | graph_item \
220 | : class
221 | | node
222 | | edge
223 | '''
224 | p[0] = p[1]
225 |
226 | def p_class(p):
227 | '''
228 | class \
229 | : CLASS OPEN_BRACE \
230 | CLASSNAME COLON STRING \
231 | LABEL COLON STRING \
232 | PARENT COLON STRING \
233 | VIRTUALS COLON STRING \
234 | CLOSE_BRACE
235 | '''
236 | #p[0] = "class, ignored"
237 |
238 | def p_node(p):
239 | '''
240 | node : NODE COLON OPEN_BRACE title node_content CLOSE_BRACE
241 | '''
242 | p[0] = p[5]
243 | p[0].title = p[4]
244 | p[0].symbol = p[4].rsplit(':')[-1]
245 |
246 | def p_node_content(p):
247 | '''
248 | node_content \
249 | : internal_node
250 | | external_node
251 | '''
252 | p[0] = p[1]
253 |
254 | # Matcher for the 'label' of an internal node (a subprogram in this CI
255 | # file)
256 | internal_matcher = re.compile(r'^(\S+)\\n(.*)\\n(\d+).*\\n(\d+).*$')
257 |
258 | def p_internal_node(p):
259 | '''
260 | internal_node : LABEL COLON STRING
261 | '''
262 | p[0] = InternalNode()
263 | p[0].content = p[3]
264 | match = internal_matcher.match(p[3])
265 | if match:
266 | p[0].source = match.group(2)
267 | p[0].static_stack = int(match.group(3))
268 | p[0].dynamic_objects = int(match.group(4))
269 | else:
270 | warning("failed to match internal '%s'" % p[3])
271 |
272 | # Matcher for the 'label' of an external node (a subprogram not in
273 | # this CI file)
274 | external_matcher = re.compile(r'^(\S+)\\n(.*)$')
275 |
276 | def p_external_node(p):
277 | '''
278 | external_node : LABEL COLON STRING SHAPE COLON ELLIPSE
279 | '''
280 | p[0] = ExternalNode()
281 | p[0].content = p[3]
282 | match = external_matcher.match(p[3])
283 | if match:
284 | p[0].source = match.group(2)
285 | else:
286 | warning("failed to match external '%s'" % p[3])
287 |
288 | def p_edge(p):
289 | '''
290 | edge \
291 | : EDGE COLON OPEN_BRACE \
292 | SOURCENAME COLON STRING \
293 | TARGETNAME COLON STRING \
294 | LABEL COLON STRING \
295 | CLOSE_BRACE
296 | | EDGE COLON OPEN_BRACE \
297 | SOURCENAME COLON STRING \
298 | TARGETNAME COLON STRING \
299 | CLOSE_BRACE
300 | '''
301 | p[0] = Edge()
302 | # for Ada sources, 'sourcename' is sometimes filename:source
303 | p[0].source = p[6].rsplit(':')[-1]
304 | # for targets in this C source file, 'targetname' is
305 | # filename:target
306 | p[0].target = p[9].rsplit(':')[-1]
307 | if len(p) > 11:
308 | p[0].label = p[12]
309 |
310 | def p_error(p):
311 | '''Panic mode recovery.'''
312 | if not p:
313 | warning("That seems to be it.")
314 | return None
315 | text = lexer.lexdata
316 | last_cr = text.rfind('\n', 0, p.lexpos)
317 | if last_cr < 0:
318 | last_cr = 0
319 | column = (p.lexpos - last_cr) - 1
320 | error("Syntax error at %s on line %d:%d" % (p.type, p.lineno, column))
321 |
322 | # ----------------------------------------------------------------------
323 | # Lexer
324 | # ----------------------------------------------------------------------
325 |
326 | # See https://www.dabeaz.com/ply/ply.html#ply_nn21 for management of
327 | # colons within strings using states.
328 |
329 | tokens = (
330 | 'CLASS',
331 | 'CLASSNAME',
332 | 'CLOSE_BRACE',
333 | 'COLON',
334 | 'EDGE',
335 | 'ELLIPSE',
336 | 'GRAPH',
337 | 'LABEL',
338 | 'NODE',
339 | 'OPEN_BRACE',
340 | 'PARENT',
341 | 'SHAPE',
342 | 'SOURCENAME',
343 | 'STRING',
344 | 'TARGETNAME',
345 | 'TITLE',
346 | 'VIRTUALS',
347 | )
348 |
349 | states = (
350 | ('inString', 'exclusive'),
351 | )
352 |
353 | t_CLASS = r'class'
354 | t_CLASSNAME = r'classname'
355 | t_CLOSE_BRACE = r'}'
356 | t_EDGE = r'edge'
357 | t_ELLIPSE = r'ellipse'
358 | t_GRAPH = r'graph'
359 | t_LABEL = r'label'
360 | t_NODE = r'node'
361 | t_OPEN_BRACE = r'{'
362 | t_PARENT = r'parent'
363 | t_SHAPE = r'shape'
364 | t_SOURCENAME = r'sourcename'
365 | t_TARGETNAME = r'targetname'
366 | t_TITLE = r'title'
367 | t_VIRTUALS = r'virtuals'
368 |
369 | # colons outside strings
370 | t_INITIAL_COLON = r':'
371 |
372 | def t_inString(t):
373 | r'"'
374 | t.lexer.string_start = t.lexer.lexpos
375 | t.lexer.begin('inString')
376 |
377 | def t_inString_content(t):
378 | r'[^"]'
379 |
380 | def t_inString_endquote(t):
381 | r'"'
382 | # leave off the end quote
383 | t.value = t.lexer.lexdata[t.lexer.string_start:t.lexer.lexpos - 1]
384 | t.type = 'STRING'
385 | t.lexer.lineno += t.value.count('\n')
386 | t.lexer.begin('INITIAL')
387 | return t
388 |
389 | def t_INITIAL_newline(t):
390 | # Define a rule so we can track skipped line numbers as well as
391 | # significant ones.
392 | r'\n+'
393 | t.lexer.lineno += len(t.value)
394 |
395 | # A string containing ignored characters (space, tab and CR)
396 | t_ANY_ignore = ' \t\r'
397 |
398 | def t_ANY_error(t):
399 | '''Error handling.'''
400 |     warning("\nIllegal character '%s', line %d"
401 |             % (t.value[0], t.lexer.lineno))
402 | t.lexer.skip(1)
403 |
404 | # ----------------------------------------------------------------------
405 | # Main
406 | # ----------------------------------------------------------------------
407 |
408 | def warning(msg):
409 | sys.stderr.write("%s\n" % msg)
410 |
411 | def error(msg):
412 | sys.stderr.write("%s\n" % msg)
413 | sys.exit(1)
414 |
415 | def main():
416 |
417 | def usage():
418 |         sys.stderr.write('usage: stack_usage.py [flags] ci-files\n')
419 | sys.stderr.write('flags:\n')
420 | sys.stderr.write('-h, --help: '
421 | + 'output this message\n')
422 | sys.stderr.write('-s, --save=file: '
423 | + 'save data in file\n')
424 | sys.stderr.write('-l, --load=file: '
425 | + 'restore previously saved data\n')
426 | sys.stderr.write('-o, --output=file: '
427 | + 'file for CSV output (D=stack_usage.csv)\n')
428 | sys.stderr.write('-d, --diagnostics: '
429 | + 'output diagnostic info on parsing\n')
430 |
431 | try:
432 | opts, args = getopt.getopt(
433 | sys.argv[1:],
434 | 'hs:l:o:d',
435 | ('help', 'save=', 'load=', 'output=', 'diagnostics', ))
436 | except getopt.GetoptError:
437 | usage()
438 | sys.exit(1)
439 |
440 | input = sys.stdin
441 | output_file = 'stack_usage.csv'
442 | do_save = False; save_file = ''
443 | do_load = False; load_file = ''
444 | global verbosity; verbosity = False
445 |
446 | for o, v in opts:
447 | if o in ('-h', '--help'):
448 | usage()
449 | sys.exit()
450 |         if o in ('-d', '--diagnostics'):
451 |             verbosity = True
452 | if o in ('-l', '--load'):
453 | do_load = True
454 | load_file = v
455 | if o in ('-s', '--save'):
456 | do_save = True
457 | save_file = v
458 | if o in ('-o', '--output'):
459 | output_file = v
460 |
461 | if len(args) == 0:
462 | usage()
463 | sys.exit(1)
464 |
465 | # create the lexer
466 | global lexer; lexer = lex.lex()
467 | # create the parser (global, for p_error())
468 | global parser; parser = yacc.yacc()
469 |
470 | # check for load of previous run
471 | if do_load:
472 | graphs = pickle.load(open(load_file, "rb"))
473 | else:
474 | graphs = Graphs()
475 |
476 | # parse the input files, collect the data
477 | for f in args:
478 | graphs.add_ci_file(f)
479 |
480 | # save if requested
481 | if do_save:
482 | pickle.dump(graphs, open(save_file, "wb"))
483 |
484 | csv_file = open(output_file, mode='w')
485 | csv_writer = csv.DictWriter(csv_file, ('Caller', 'Depth'))
486 | csv_writer.writeheader()
487 |
488 | for row in graphs.usage():
489 | csv_writer.writerow({'Caller':row[0], 'Depth':row[1]})
490 |
491 | csv_file.close()
492 |
493 | if __name__ == '__main__':
494 | main()
495 |
--------------------------------------------------------------------------------