├── COPYING3 ├── README.md ├── papers ├── hilt2013 │ ├── HILT2013.dvi │ ├── HILT2013.pdf │ ├── HILT2013.ps │ ├── HILT2013.tex │ ├── HILT2013_Bib.tex │ ├── HILT2013_Presentation.pdf │ ├── HILT2013_Ref.bib │ ├── abstract.tex │ ├── acknowledgements.tex │ ├── conclusion.tex │ ├── def.tex │ ├── formalizationwork.tex │ ├── mathpartir.sty │ ├── motivation.tex │ ├── related.tex │ ├── sig-alternate-2013.cls │ ├── splncs03.bst │ ├── sprmindx.sty │ └── submission_confirmation.pdf └── paper2014 │ ├── .texlipse │ ├── 2014paper_main.bbl │ ├── 2014paper_main.bib │ ├── 2014paper_main.tex │ ├── abstract.tex │ ├── conclusion.tex │ ├── definitions.tex │ ├── evaluation.tex │ ├── introduction.tex │ ├── listings.sty │ ├── llncs.cls │ ├── llncsdoc.sty │ ├── lstlangcoq.sty │ ├── lstmisc.sty │ ├── lstpatch.sty │ ├── overview.tex │ ├── related-work.tex │ ├── run-time-checks.tex │ ├── santos.bib │ ├── spark-semantics.tex │ └── splncs.bst ├── spark2014_semantics ├── .gitignore └── src │ ├── CpdtTactics.v │ ├── Makefile │ ├── Makefile.conf │ ├── README.txt │ ├── _CoqProject │ ├── ast.v │ ├── ast_basics.v │ ├── ast_rt.v │ ├── ast_template.v │ ├── ast_util.v │ ├── environment.v │ ├── eval.v │ ├── eval_rt.v │ ├── htmlgen │ ├── languagegen │ ├── list_util.v │ ├── makegen │ ├── rt.v │ ├── rt_counter.v │ ├── rt_gen.v │ ├── rt_gen_consistent.v │ ├── rt_gen_impl.v │ ├── rt_gen_impl_consistent.v │ ├── rt_gen_util.v │ ├── rt_opt.v │ ├── rt_opt_ZArith.v │ ├── rt_opt_compare.v │ ├── rt_opt_consistent.v │ ├── rt_opt_consistent_util.v │ ├── rt_opt_impl.v │ ├── rt_opt_impl_consistent.v │ ├── rt_opt_util.v │ ├── rt_validator.v │ ├── run │ ├── run_check_count │ ├── run_tests │ ├── store_util.v │ ├── symboltable.v │ ├── symboltable_module.v │ ├── values.v │ ├── well_typed.v │ └── well_typed_util.v └── spark83_semantics ├── dynamic.pdf ├── dynamic.ps ├── static.pdf └── static.ps /README.md: -------------------------------------------------------------------------------- 1 | # 1. 
Introduction 2 | 3 | This project aims at developing a semantics of the SPARK language in Coq. It is 4 | based on the Language Reference Manuals of 5 | [Ada](http://www.ada-auth.org/standards/rm12_w_tc1/html/RM-TOC.html) and 6 | [SPARK](http://docs.adacore.com/spark2014-docs/html/lrm/). 7 | 8 | Initial work on that formalization was [published at HILT 9 | 2013](http://www.spark-2014.org/uploads/HILT2013_Presentation.pdf). Only a 10 | small part of SPARK 2014 was formalized, but the architecture put in place 11 | already allowed reading the abstract syntax tree produced by the GNAT compiler 12 | frontend and checking that the equivalent generated abstract syntax tree in Coq 13 | was well-formed and had the desired run-time checks. 14 | 15 | The latest results were [published at SEFM 16 | 2017](http://www.spark-2014.org/uploads/sefm17-spark-formalization.pdf). 17 | 18 | # 2. Goals 19 | 20 | The SPARK toolset aims at giving its users guarantees about the properties 21 | of the software analyzed, be it the absence of run-time errors or more complex 22 | properties. But the SPARK toolset, being itself a complex tool, is not free 23 | of errors. So how do we get confidence in the results of the analysis? The 24 | established means of getting confidence in tools in industry is through a 25 | process sometimes called tool certification, sometimes tool qualification. It 26 | requires describing at various levels of detail (depending on the criticality 27 | of the tool usage) the intended functionality of the tool, and demonstrating 28 | (usually through testing) that the tool correctly implements this 29 | functionality. 30 | 31 | The academic way of obtaining confidence is also called "certification" but it 32 | covers a completely different reality. It requires providing mathematical 33 | evidence, through mechanized proof, that the tool indeed performs a formally 34 | specified functionality.
Examples of that level of certification are the 35 | CompCert compiler and the seL4 operating system kernel. This level of assurance is 36 | very costly to achieve, and as a result not suitable for the majority of tools. 37 | 38 | For SPARK, we have worked with our academic partners from Kansas State 39 | University and Conservatoire National des Arts et Métiers to achieve a middle 40 | ground, establishing mathematical evidence of the correctness of a critical 41 | part of the SPARK toolset. The part on which we focused is the tagging of nodes 42 | requiring run-time checks by the frontend of the SPARK technology. This 43 | frontend is a critical and complex part, which is shared between the formal 44 | verification tool GNATprove and the compiler GNAT. It is responsible for 45 | generating semantically annotated Abstract Syntax Trees of the source code, 46 | with special tags for nodes that require run-time checks. Then, GNATprove 47 | relies on these tags to generate formulas to prove, so a missing tag means a 48 | missing verification. Our interest in getting better assurance on this part of 49 | the technology is not theoretical: this is a part where we have repeatedly had errors 50 | in the past, leading to missing verifications. 51 | 52 | # 3.
Contents 53 | 54 | spark83_semantics - papers on earlier work on formalization of the previous 55 | SPARK version, prior to SPARK 2014 56 | 57 | spark2014_semantics - Coq formalization of a core of SPARK 58 | 59 | papers - LaTeX sources for articles 60 | -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013.dvi: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/papers/hilt2013/HILT2013.dvi -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/papers/hilt2013/HILT2013.pdf -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013.tex: -------------------------------------------------------------------------------- 1 | \documentclass{sig-alternate-2013} 2 | \usepackage{amsmath,amssymb,txfonts} 3 | \usepackage{mathpartir} 4 | \usepackage{paralist} 5 | \usepackage{colortbl} 6 | \usepackage{color} 7 | \usepackage{multicol} 8 | \usepackage{multirow} 9 | \usepackage{hyperref} 10 | \usepackage{graphicx} 11 | \usepackage[boxed,noend,linesnumbered,figure]{algorithm2e} 12 | % \usepackage{subfigure} 13 | \usepackage{listings} 14 | % \usepackage[all]{xy} 15 | 16 | \lstdefinelanguage{Spark}% 17 | {morekeywords={abort,abs,accept,access,all,and,array,at,begin,body,% 18 | case,constant,declare,delay,delta,digits,do,else,elsif,end,entry,% 19 | exception,exit,for,function,generic,goto,if,in,is,limited,loop,% 20 | mod,new,not,null,of,or,others,out,package,pragma,private,% 21 | procedure,raise,range,record,rem,renames,return,reverse,select,% 22 | separate,subtype,task,terminate,then,type,use,when,while,with,% 23 | xor},% 24 | 
sensitive=f,% 25 | morecomment=[l]--,% 26 | morecomment=[l][\itshape \color{blue}]{--\#},% 27 | morestring=[m]",% percent not defined as stringizer so far 28 | morestring=[m]'% 29 | }[keywords,comments,strings]% 30 | 31 | \newif\iftechreport 32 | \techreportfalse 33 | %\techreporttrue 34 | \newcommand{\iftr}[1]{\iftechreport#1\fi} 35 | \newcommand{\ifntr}[1]{\iftechreport\else#1\fi} 36 | 37 | \renewcommand{\topfraction}{.99} 38 | \renewcommand{\textfraction}{.01} 39 | \renewcommand{\floatpagefraction}{.99} 40 | \renewcommand{\floatsep}{.1cm} 41 | \renewcommand{\textfloatsep}{.25cm} 42 | 43 | %\textheight=9in % could be \textheight=9.2in 44 | %\topmargin=-0.5in 45 | %\oddsidemargin=0in 46 | %\evensidemargin=0in 47 | %\textwidth=6.5in 48 | %\textheight=8in 49 | %\topmargin=-.2in 50 | %\addtolength{\textheight}{.52in} 51 | 52 | \newcommand{\mytitle}{Towards The Formalization of SPARK 2014 Semantics With 53 | Explicit Run-time Checks Using Coq} 54 | 55 | \input{def.tex} 56 | % 57 | \begin{document} 58 | \conferenceinfo{HILT'13,} {November 10--14, 2013, Pittsburgh, PA, USA.} 59 | \CopyrightYear{2013} 60 | \crdata{978-1-4503-2466-3/13/11} 61 | \clubpenalty=10000 62 | \widowpenalty = 10000 63 | 64 | \title{\mytitle\thanks{}} 65 | \renewcommand\footnotemark{} 66 | \renewcommand\footnoterule{} 67 | 68 | \numberofauthors{2} 69 | 70 | \author{ 71 | \alignauthor 72 | Pierre Courtieu,Maria Virginia Aponte, Tristan Crolard \\ 73 | \affaddr{Conservatoire National des Arts et Metiers} \\ 74 | \email{Pierre.Courtieu@cnam.fr, maria-virginia.aponte\_garcia@cnam.fr, tristan.crolard@cnam.fr} 75 | % 76 | \alignauthor 77 | Zhi Zhang, Robby, Jason Belt, John~Hatcliff \\ 78 | \affaddr{Kansas State University} \\ 79 | \email{zhangzhi@ksu.edu, robby@ksu.edu, belt@ksu.edu, hatcliff@ksu.edu} 80 | % 81 | \and 82 | \alignauthor 83 | Jerome Guitton \\ 84 | \affaddr{AdaCore}\\ 85 | \email{guitton@adacore.com} 86 | % 87 | \alignauthor 88 | Trevor Jennings \\ 89 | \affaddr{Altran}\\ 90 | 
\email{trevor.jennings@altran.com} 91 | } 92 | \date{\today} 93 | 94 | \maketitle 95 | \input{abstract.tex} 96 | \category{D.2.4}{Software Engineering}{Software/Program Verification}, 97 | Formal Methods, Correctness Proofs 98 | \category{F.4.1}{Mathematical Logic and Formal Language}{Mathematical Logic} 99 | [mechanical theorem proving] 100 | 101 | \terms{Reliability, Security, Verification} 102 | 103 | \keywords{SPARK, Coq Proof Assistant, Formal Semantics, Machine-Verified Proof} 104 | 105 | \input{motivation} 106 | %\vspace{2cm} 107 | \input{formalizationwork} 108 | \input{related} 109 | \input{conclusion} 110 | \input{acknowledgements} 111 | 112 | \bibliographystyle{abbrv} 113 | %\bibliography{HILT2013_Ref} 114 | \input{HILT2013_Bib} 115 | 116 | \end{document} 117 | -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013_Bib.tex: -------------------------------------------------------------------------------- 1 | \begin{thebibliography}{10} 2 | 3 | \bibitem{Ada:URL} 4 | {Ada} reference manual. 5 | \newblock \url http://www.ada-auth.org/standards/ada12.html. 6 | 7 | \bibitem{Gnatprove:URL} 8 | Adacore {Gnatprove} tool. 9 | \newblock \url http://www.open-do.org/projects/hi-lite/gnatprove/. 10 | 11 | \bibitem{Hi-Lite:URL} 12 | Adacore {Hi-Lite} project. 13 | \newblock \url http://www.open-do.org/projects/hi-lite/. 14 | 15 | \bibitem{Jago:URL} 16 | {Jago} translation tool. 17 | \newblock \url https://github.com/sireum/bakar/tree/master/sireum-bakar-jago. 18 | 19 | \bibitem{Sireum:URL} 20 | {Sireum} software analysis platform. 21 | \newblock \url http://www.sireum.org. 22 | 23 | \bibitem{Formalization:URL} 24 | Source code for {SPARK} 2014 language subset formalization. 25 | \newblock \url 26 | https://github.com/sireum/bakar/tree/master/sireum-bakar-formalization. 27 | 28 | \bibitem{Leroy:09} 29 | X.~Leroy. 30 | \newblock Formal verification of a realistic compiler. 
31 | \newblock {\em Communications of the ACM}, 52(7):107--115, 2009. 32 | 33 | \bibitem{Marsh:Book94} 34 | W.~Marsh. 35 | \newblock Formal semantics of {SPARK} - static semantics, Oct 1994. 36 | 37 | \bibitem{Neil:Book94} 38 | I.~O'Neill. 39 | \newblock Formal semantics of {SPARK} - dynamic semantics, Oct 1994. 40 | 41 | \bibitem{RTCA:DO-178} 42 | {RTCA DO-178}. 43 | \newblock Software considerations in airborne systems and equipment, 2011. 44 | 45 | \bibitem{RTCA:DO-333} 46 | {RTCA DO-333}. 47 | \newblock Formal methods supplement to do-178c and do-278a, 2011. 48 | 49 | \end{thebibliography} 50 | -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013_Presentation.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/papers/hilt2013/HILT2013_Presentation.pdf -------------------------------------------------------------------------------- /papers/hilt2013/HILT2013_Ref.bib: -------------------------------------------------------------------------------- 1 | @misc{RTCA:DO-178, 2 | author = {{RTCA DO-178}}, 3 | publisher = {RTCA and EUROCAE}, 4 | title = {Software Considerations in Airborne Systems and Equipment}, 5 | year = {2011} 6 | } 7 | @misc{RTCA:DO-333, 8 | author = {{RTCA DO-333}}, 9 | publisher = {RTCA and EUROCAE}, 10 | title = {Formal Methods Supplement to DO-178C and DO-278A}, 11 | year = {2011} 12 | } 13 | @misc{Hi-Lite:URL, 14 | howpublished = {\url http://www.open-do.org/projects/hi-lite/}, 15 | title = {AdaCore {Hi-Lite} Project} 16 | } 17 | @misc{Ada:URL, 18 | howpublished = {\url http://www.ada-auth.org/standards/ada12.html}, 19 | title = {{Ada} Reference Manual} 20 | } 21 | @misc{Gnatprove:URL, 22 | howpublished = {\url http://www.open-do.org/projects/hi-lite/gnatprove/}, 23 | title = {AdaCore {Gnatprove} Tool} 24 | } 25 | @article{Leroy:09, 26 | author = {Xavier Leroy}, 27 
| journal = {Communications of the ACM}, 28 | number = {7}, 29 | pages = {107--115}, 30 | title = {Formal Verification of a Realistic Compiler}, 31 | volume = {52}, 32 | year = {2009} 33 | } 34 | @misc{Sireum:URL, 35 | howpublished = {\url http://www.sireum.org}, 36 | title = {{Sireum} Software Analysis Platform} 37 | } 38 | @misc{Jago:URL, 39 | howpublished = {\url https://github.com/sireum/bakar/tree/master/sireum-bakar-jago}, 40 | title = {{Jago} Translation Tool} 41 | } 42 | @misc{Formalization:URL, 43 | howpublished = {\url https://github.com/sireum/bakar/tree/master/sireum-bakar-formalization}, 44 | title = {Source Code for {SPARK} 2014 Language Subset Formalization} 45 | } 46 | @misc{Marsh:Book94, 47 | author = {William Marsh}, 48 | title = {Formal Semantics of {SPARK} - Static Semantics}, 49 | month = {Oct}, 50 | year = {1994} 51 | } 52 | @misc{Neil:Book94, 53 | author = {Ian O'Neill}, 54 | title = {Formal Semantics of {SPARK} - Dynamic Semantics}, 55 | month = {Oct}, 56 | year = {1994} 57 | } -------------------------------------------------------------------------------- /papers/hilt2013/abstract.tex: -------------------------------------------------------------------------------- 1 | \begin{abstract} 2 | We present the first steps of a broad effort to develop a formal 3 | representation of SPARK 2014 suitable for supporting machine-verified 4 | static analyses and translations. In our initial work, we have developed 5 | technology for translating the GNAT compiler's abstract syntax trees 6 | into the Coq proof assistant, and we have formalized in Coq the dynamic 7 | semantics for a toy subset of the SPARK 2014 language. 8 | SPARK 2014 programs must ensure the absence of certain run-time errors 9 | (for example, those arising while performing division by zero, accessing 10 | non existing array cells, overflow on integer computation). 
The main 11 | novelty in our semantics is the encoding of (a small part of) the run-time 12 | checks performed by the compiler to ensure that any well-formed terminating 13 | SPARK program does not lead to erroneous execution. 14 | This and other results are mechanically proved using the Coq proof assistant. 15 | The modeling of on-the-fly run-time checks within the semantics lays the 16 | foundation for future work on mechanical reasoning about SPARK 2014 17 | program correctness (in the particular area of robustness) and for 18 | studying the correctness of compiler optimizations concerning run-time 19 | checks, among others. 20 | \end{abstract} 21 | 22 | 23 | -------------------------------------------------------------------------------- /papers/hilt2013/acknowledgements.tex: -------------------------------------------------------------------------------- 1 | \section*{Acknowledgements} 2 | The authors would like to thank Emmanuel Polonowski and Yannick Moy for their valuable comments and 3 | suggestions on this work. 4 | 5 | -------------------------------------------------------------------------------- /papers/hilt2013/conclusion.tex: -------------------------------------------------------------------------------- 1 | \section{Conclusion and future work} 2 | We have implemented a prototype tool chain from a SPARK 2014 language subset to 3 | Coq, together with its semantic formalization and proofs in Coq. Experiments running the certified 4 | interpreter show that our formalized SPARK semantics can capture the desired run-time behavior. 5 | This is encouraging; however, there is still a lot of work to be done. 6 | 7 | Our next step is to prove the correctness of optimizations that remove 8 | useless run-time checks. 9 | Our interpreted semantics are parameterized by the set of run-time 10 | checks to be performed. These semantics may be called with an 11 | incomplete set of run-time checks, and can evaluate in that case to an 12 | erroneous execution.
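A minimal Coq sketch of such check-parameterized evaluation (purely illustrative identifiers, not those of the actual development) could look like:

```coq
(* Hypothetical sketch: evaluation parameterized by which run-time
   checks to perform. With the division check enabled, dividing by
   zero is reported as a run-time error; with it disabled, evaluation
   proceeds unchecked. Names here are illustrative only. *)
Require Import ZArith.

Inductive result : Type :=
  | Normal : Z -> result
  | Run_Time_Error : result.

Record check_set := { do_division_check : bool }.

Definition eval_div (cks : check_set) (v1 v2 : Z) : result :=
  if cks.(do_division_check) && (v2 =? 0)%Z
  then Run_Time_Error
  else Normal (v1 / v2)%Z.
```

Proving an optimization correct then amounts to showing that evaluation with a reduced check set agrees with evaluation under the full reference check set.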
Future work could be to formalize some of the 13 | optimizations actually performed by the GNAT compiler, which remove such useless 14 | run-time checks. The idea would be to prove 15 | these optimizations correct, namely to prove that executions 16 | with $\mathit{fewer}$ run-time checks behave exactly like those following the 17 | reference semantics, which systematically perform $\mathit{all}$ the checks. 18 | 19 | In addition, we are also interested in adding more SPARK language features, 20 | such as procedure calls, pre/post aspects and loop invariants, to expand our 21 | current SPARK subset and make it more practical. This work on formalizing the SPARK semantics 22 | also paves the way for our further work on machine-verified proofs of correctness of 23 | SPARK static analysis and translation tools. -------------------------------------------------------------------------------- /papers/hilt2013/def.tex: -------------------------------------------------------------------------------- 1 | \def\doframeit#1{\vbox{% 2 | \hrule height\fboxrule 3 | \hbox{% 4 | \vrule width\fboxrule \kern\fboxsep 5 | \vbox{\kern\fboxsep #1\kern\fboxsep }% 6 | \kern\fboxsep \vrule width\fboxrule }% 7 | \hrule height\fboxrule }} 8 | 9 | \def\frameit{\smallskip \advance \linewidth by -7.5pt \setbox0=\vbox \bgroup 10 | \strut \ignorespaces } 11 | \def\frameit{\smallskip \advance \linewidth by -7.5pt \setbox0=\vbox \bgroup 12 | \strut \ignorespaces } 13 | \def\endframeit{\ifhmode \par \nointerlineskip \fi \egroup 14 | \doframeit{\box0}} 15 | 16 | \newenvironment{smallitemize}{\setlength{\topsep}{0.0 truein} 17 | \begin{itemize} 18 | \setlength{\leftmargin}{.25 truein} 19 | \setlength{\parsep}{0.0 truein} 20 | \setlength{\parskip}{0.0 truein} 21 | \setlength{\itemsep}{0.0 truein}}{\end{itemize}} 22 | 23 | \newenvironment{smalldescription}{\setlength{\topsep}{0.0 truein} 24 | \begin{description} 25 | \setlength{\leftmargin}{.25 truein} 26 | \setlength{\parsep}{0.0 truein} 27 |
\setlength{\parskip}{0.0 truein} 28 | \setlength{\itemsep}{0.0 truein}}{\end{description}} 29 | 30 | \newenvironment{smallenumerate}{\setlength{\topsep}{0.0 truein} 31 | \begin{enumerate} 32 | \setlength{\leftmargin}{0.25 truein} 33 | \setlength{\parsep}{0.0 truein} 34 | \setlength{\parskip}{0.0 truein} 35 | \setlength{\itemsep}{0.0 truein}}{\end{enumerate}} 36 | 37 | %% 38 | \newcommand{\spark}{{S{\sc park}}} 39 | \newcommand{\sparkada}{{S{\sc park}/\-A{\sc da}}} 40 | 41 | %% General formatting and Latin abbreviations 42 | \newcommand{\fig}{Figure~} 43 | \newcommand{\sect}{Section~} 44 | \newcommand{\comment}[1]{{\sf ({#1})}} 45 | \newcommand{\ie}{{\em i.e.}} 46 | \newcommand{\etc}{{\em etc.}} 47 | \newcommand{\wrt}{{\em wrt.}} 48 | \newcommand{\eg}{{\em e.g.}} 49 | \newcommand{\etal}{{\em et al.}} 50 | \newcommand{\ignore}[1]{} 51 | 52 | %% Inline formating for SPARK code 53 | \newcommand{\sparkcode}[1]{\textsf{#1}} 54 | \newcommand{\sparkkeyword}[1]{\textbf{#1}} 55 | \newcommand{\invars}{\mbox{IN}} 56 | \newcommand{\outvars}{\mbox{OUT}} 57 | 58 | %% list 59 | \newcommand{\cons}[2]{#1\!::\!#2} 60 | \newcommand{\append}[2]{#2 +\!\!\!+\,[#1]} 61 | \newcommand{\mklist}[1]{[#1]} 62 | \newcommand{\emptylist}{[]} 63 | \newcommand{\emptymap}{\emptyset} 64 | 65 | %% tuple 66 | \newcommand{\pair}[2]{\langle #1,#2\rangle} 67 | \newcommand{\triple}[3]{\langle#1,#2,#3\rangle} 68 | \newcommand{\quadruple}[4]{\langle#1,#2,#3,#4\rangle} 69 | \newcommand{\quintuple}[5]{\langle#1,#2,#3,#4,#5\rangle} 70 | \newcommand{\hextuple}[6]{\langle#1,#2,#3,#4,#5,#6\rangle} 71 | 72 | %% syntax 73 | \newcommand{\prog}{\mbox{\it prog}} 74 | \newcommand{\decl}{\mbox{\it decl}} 75 | \newcommand{\blk}{\mbox{\it blk}} 76 | \newcommand{\cmd}{\mbox{\it cmd}} 77 | \newcommand{\jmp}{\mbox{\it jmp}} 78 | 79 | %% semantic domains 80 | \newcommand{\typed}{\textbf{Type}} 81 | \newcommand{\constd}{\textbf{Const}} 82 | \newcommand{\bvald}{\textbf{BValue}} 83 | \newcommand{\arrd}{\textbf{Arr}} 84 | 
\newcommand{\fieldd}{\textbf{Field}} 85 | \newcommand{\recd}{\textbf{Rec}} 86 | % \newcommand{\tarrd}{\textbf{TArr}} 87 | \newcommand{\symd}{\textbf{SymVal}} 88 | \newcommand{\arefd}{\textbf{ARef}} 89 | \newcommand{\rrefd}{\textbf{RRef}} 90 | \newcommand{\refd}{\textbf{Ref}} 91 | \newcommand{\trefd}{\textbf{TRef}} 92 | \newcommand{\vald}{\textbf{Value}} 93 | \newcommand{\substd}{\textbf{Subst}} 94 | \newcommand{\locd}{\textbf{Label}} 95 | \newcommand{\gvard}{\textbf{Var}} 96 | \newcommand{\lvard}{\textbf{LocalVar}} 97 | \newcommand{\globald}{\textbf{Store}} 98 | \newcommand{\stated}{\textbf{State}} 99 | \newcommand{\heapd}{\textbf{Heap}} 100 | \newcommand{\theapd}{\textbf{THeap}} 101 | \newcommand{\pcd}{\textbf{PathCond}} 102 | 103 | %% state 104 | \newcommand{\statev}{s} 105 | \newcommand{\globalv}{\sigma} 106 | \newcommand{\heapv}{h} 107 | \newcommand{\theapv}{t} 108 | \newcommand{\labelv}{lbl} 109 | \newcommand{\substv}{\mu} 110 | \newcommand{\storev}{\sigma} 111 | \newcommand{\pcv}{\phi} 112 | \newcommand{\hstate}[5]{\quintuple{#1}{#2}{#3}{#4}{#5}} 113 | \newcommand{\sstate}[6]{\hextuple{#1}{#2}{#3}{#4}{#5}{#6}} 114 | \newcommand{\state}[3]{\triple{#1}{#2}{#3}} 115 | \newcommand{\heap}[1]{#1_{\heapv}} 116 | \newcommand{\theap}[1]{#1_{\theapv}} 117 | \newcommand{\pc}[1]{#1_{\pcv}} 118 | 119 | %% state lookups 120 | \newcommand{\slookup}[3]{#2_{#1}(#3)} 121 | \newcommand{\glookup}[2]{\slookup{\globalv}{#1}{#2}} 122 | \newcommand{\hlookup}[2]{\slookup{\heapv}{#1}{#2}} 123 | \newcommand{\tlookup}[2]{\slookup{\theapv}{#1}{#2}} 124 | \newcommand{\llookup}[1]{#1_{\labelv}} 125 | \newcommand{\olookup}[2]{\slookup{\storev}{#1}{#2}} 126 | \newcommand{\plookup}[2]{\slookup{\pcv}{#1}{#2}} 127 | \newcommand{\ulookup}[1]{{#1}_{\substv}} 128 | 129 | %% state updates 130 | \newcommand{\supdate}[4]{#1[#2]^{#3}_{#4}} 131 | \newcommand{\gupdate}[3]{\supdate{#1}{#2}{#3}{\globalv}} 132 | \newcommand{\hupdate}[3]{\supdate{#1}{#2}{#3}{\heapv}} 133 | 
\newcommand{\tupdate}[3]{\supdate{#1}{#2}{#3}{\theapv}} 134 | \newcommand{\lupdate}[3]{\supdate{#1}{#2}{#3}{\labelv}} 135 | \newcommand{\oupdate}[3]{\supdate{#1}{#2}{#3}{\storev}} 136 | \newcommand{\pupdate}[3]{\supdate{#1}{#2}{#3}{\pcv}} 137 | \newcommand{\uupdate}[3]{\supdate{#1}{#2}{#3}{\substv}} 138 | 139 | %% meta-vars 140 | \newcommand{\gvarv}{x} 141 | % \newcommand{\lvarv}{l} 142 | 143 | %% exp 144 | \newcommand{\expr}{e} 145 | \newcommand{\exprSub}[1]{\expr_{#1}} 146 | \newcommand{\expro}{\exprSub{0}} 147 | \newcommand{\expri}{\exprSub{1}} 148 | \newcommand{\exprii}{\exprSub{2}} 149 | \newcommand{\expriii}{\exprSub{3}} 150 | \newcommand{\expriv}{\exprSub{4}} 151 | \newcommand{\unexp}[2]{#1\,#2} 152 | \newcommand{\binexp}[3]{#1\,#2\,#3} 153 | \newcommand{\unop}{\ominus} 154 | \newcommand{\binop}{\oplus} 155 | \newcommand{\old}[1]{\unexp{\raise.17ex\hbox{$\scriptstyle\sim$}}{#1}} 156 | \newcommand{\aaexp}[2]{#1\sparkcode{[}#2\sparkcode{]}} 157 | \newcommand{\raexp}[2]{#1\sparkcode{.}#2} 158 | \newcommand{\aaupdt}[3]{#1\sparkcode{[}{#2}_{\!\!\!1} 159 | \Rightarrow {#3}_{\!\!\!\!\!1}~|~\ldots~|~{#2}_{\!\!\!n} \Rightarrow 160 | {#3}_{\!\!\!\!\!n}\sparkcode{]}} 161 | \newcommand{\raupdt}[3]{#1\sparkcode{[}{#2}_{1} 162 | \Rightarrow {#3}_{1}~|~\ldots~|~{#2}_{n} \Rightarrow 163 | {#3}_{n}\sparkcode{]}} 164 | 165 | %% command 166 | \newcommand{\assign}[2]{#1\,\sparkcode{:=}\,#2} 167 | \newcommand{\assert}[1]{\sparkcode{assert}\,#1} 168 | \newcommand{\assume}[1]{\sparkcode{assume}\,#1} 169 | \newcommand{\lhs}{\mbox{\it lhs}} 170 | 171 | %% jump 172 | \newcommand{\goto}[1]{\sparkcode{goto}\,#1} 173 | \newcommand{\ifelse}[3]{\sparkcode{if}\,#1\, 174 | \sparkcode{then goto}\,#2\,\sparkcode{else goto}\,#3} 175 | 176 | %% type 177 | \newcommand{\typev}{\tau} 178 | 179 | %% values 180 | \newcommand{\true}{\mbox{\sc{True}}} 181 | \newcommand{\false}{\mbox{\sc{False}}} 182 | \newcommand{\valv}{v} 183 | \newcommand{\valvv}{u} 184 | %\newcommand{\valvvv}{w} 185 | 
\newcommand{\indexv}{\iota} 186 | \newcommand{\indexvv}{\iota'} 187 | \newcommand{\indexvvv}{\iota''} 188 | \newcommand{\constv}{c} 189 | \newcommand{\constvv}{d} 190 | \newcommand{\constvvv}{b} 191 | \newcommand{\symv}{\alpha} 192 | \newcommand{\symvv}{\beta} 193 | \newcommand{\symvvv}{\gamma} 194 | \newcommand{\arrv}{a} 195 | \newcommand{\recv}{r} 196 | \newcommand{\refv}{p} 197 | \newcommand{\fieldv}{f} 198 | \newcommand{\tarrv}{\rho} 199 | \newcommand{\trecv}{\rho} 200 | \newcommand{\arrmv}{m} 201 | \newcommand{\recmv}{w} 202 | \newcommand{\tarrmv}{\omega} 203 | \newcommand{\trecmv}{\varpi} 204 | \newcommand{\locv}{l} 205 | \newcommand{\locvv}{l'} 206 | \newcommand{\lheapv}{\eta} 207 | \newcommand{\boundv}{\rho} 208 | \newcommand{\lboundv}{\boundv_{\textrm{lo}}} 209 | \newcommand{\hboundv}{\boundv_{\textrm{hi}}} 210 | \newcommand{\pathv}{\pi} 211 | \newcommand{\reflv}{rs} 212 | 213 | %% misc 214 | \newcommand{\fields}[1]{\mbox{\it fields}(#1)} 215 | \newcommand{\type}[1]{\mbox{\it type}(#1)} 216 | \newcommand{\timestamp}[1]{\mbox{\it timestamp}(#1)} 217 | % \newcommand{\unify}[4]{\mbox{\it unify}(#1,\,#2,\,#3,\,#4)} 218 | % \newcommand{\tunify}[3]{\mbox{\it unify}(#1,\,#2,\,#3)} 219 | \newcommand{\fst}[1]{\mbox{\it fst}(#1)} 220 | \newcommand{\snd}[1]{\mbox{\it snd}(#1)} 221 | \newcommand{\init}[1]{\mbox{\it init}(#1)} 222 | \newcommand{\block}[1]{\mbox{\it block}(#1)} 223 | \newcommand{\pset}[1]{\mathcal{P}(#1)} 224 | \newcommand{\prop}[5]{\mbox{\it propagate}(#1,\,\maps{#2}{#3},\,#4,\,#5)} 225 | \newcommand{\psubst}[4]{\mbox{\it subst}(#1,\,#2,\,#4/#3)} 226 | \newcommand{\tsubst}[2]{\mbox{\it substitute}(#1,\,#2)} 227 | \newcommand{\paupdate}[3]{\mbox{\it update}(#1,\,#2,\,#3)} 228 | \newcommand{\paassign}[3]{\mbox{\it assign}(#1,\,#2,\,#3)} 229 | % \newcommand{\insta}[3]{\delta(#1,\,#2,\,#3)} 230 | \newcommand{\pfun}[2]{#1\rightharpoonup #2} 231 | \newcommand{\mat}[2]{\mbox{\it materialize}(#1,\,#2)} 232 | \newcommand{\rep}[2]{\mbox{\it rep}(#1,\,#2)} 233 | 
\newcommand{\dom}[1]{\mbox{\it dom}(#1)} 234 | \newcommand{\ebounds}[1]{\mbox{\it bnds}(#1)} 235 | \newcommand{\elemtype}[1]{\mbox{\it elemtype}(#1)} 236 | \newcommand{\fieldtype}[2]{\mbox{\it fieldtype}(#1,\,#2)} 237 | \newcommand{\sadd}[2]{\mkset{#2}\cup #1} 238 | \newcommand{\sdel}[2]{#1-\mkset{#2}} 239 | \newcommand{\defeqop}{\triangleq} 240 | \newcommand{\defeq}[2]{#1\defeqop #2} 241 | \newcommand{\maps}[2]{#1\mapsto #2} 242 | \newcommand{\mlookup}[2]{{#1}(#2)} 243 | \newcommand{\mupdate}[3]{{#1}[\maps{#2}{#3}]} 244 | \newcommand{\mkset}[1]{\{\,#1\,\}} 245 | \newcommand{\mksetc}[2]{\mkset{#1\,|\,\,#2}} 246 | \newcommand{\ruleIdPrefix}{} 247 | \newcommand{\ruleWhere}[1] 248 | { \vspace{.2cm} 249 | 250 | \ensuremath{ 251 | \begin{array}{lll} 252 | \textrm{where} & #1 253 | \end{array} 254 | } 255 | } 256 | 257 | %% array 258 | \newcommand{\alup}[2]{({\mbox{\it alkup}}~{#1}~{#2})} 259 | \newcommand{\aupt}[3]{({\mbox{\it aupdt}}~{#1}~{#2}~{#3})} 260 | \newcommand{\rlup}[2]{({\mbox{\it rlkup}}~{#1}~{#2})} 261 | \newcommand{\rupt}[3]{({\mbox{\it rupdt}}~{#1}~{#2}~{#3})} 262 | \newcommand{\aibound}[2]{_{#1}^{#2}} 263 | 264 | %% evals & tran 265 | \newcommand{\eeval}[4]{#1\vdash #2 \rightarrow \langle #3,#4\rangle} 266 | \newcommand{\eeeval}[5]{#1\vdash #2 \rightarrow \langle #3,#4,#5\rangle} 267 | \newcommand{\ceval}[3]{#1\vdash #2 \Rightarrow #3} 268 | \newcommand{\tran}[2]{#1 \rightsquigarrow #2} 269 | \newcommand{\trans}[2]{#1 \rightsquigarrow^+ #2} 270 | \newcommand{\hsa}{\hspace{0.5cm}} 271 | 272 | %% compare 273 | \newcommand{\compareN}{\mbox{\it compare}} 274 | \newcommand{\compareHN}{\mbox{\it compareHelper}} 275 | \newcommand{\compare}[3]{\compareN({#1},\,{#2},\,{#3})} 276 | \newcommand{\compareH}[3]{\compareHN({#1},\,{#2},\,{#3})} 277 | \newcommand{\comp}[2]{{#1}\simeq{#2}} 278 | 279 | %% Paragraphs 280 | \newcommand{\mypara}[1]{\vspace*{2mm}\noindent{\bf {#1}:}} 281 | \newcommand{\myfpara}[1]{\noindent{\bf {#1}:}} 
-------------------------------------------------------------------------------- /papers/hilt2013/formalizationwork.tex: -------------------------------------------------------------------------------- 1 | \section{Formalization And Proof} 2 | 3 | \subsection{SPARK Translation Toolchain} 4 | On the long path towards the definition of a complete semantics for 5 | SPARK 2014, a very important step is to build a tool chain allowing 6 | experimentation with the behavior of these semantics on real SPARK 7 | 2014 programs. In the front end of this tool chain, as part of the 8 | Sireum analysis framework \cite{Sireum:URL}, we have developed a tool called Jago 9 | \cite{Jago:URL} that translates the XML representation of the GNAT compiler's 10 | ASTs into a Scala-based 11 | representation in Sireum. This open-source framework enables one to build 12 | code translators and analysis tools for SPARK 2014 in Scala. Scala's 13 | blending of functional %of 14 | and object-oriented programming styles has proven 15 | quite useful in other contexts for syntax tree manipulation and analysis. 16 | Integrated into Jago is a translation of GNAT ASTs into Coq-based representations. 17 | In the backend of the tool chain, a certified interpreter 18 | encoding run-time checks for the Coq AST has been developed within Coq. 19 | 20 | \subsection{Formalizing Language Semantics} 21 | A major difference between SPARK and other programming languages is that 22 | its %semantics requires its language implementation to detect all run-time errors to 23 | %ensure program correct execution. 24 | informal specification as given by the Ada reference manual requires the 25 | compiler to detect specific run-time error cases in order to enforce 26 | program robustness.
27 | %With the Coq AST generated from SPARK by Jago, 28 | We are currently formalizing in Coq a SPARK semantics that includes run-time checks, 29 | and working towards adding more language features, such as procedure calls, in our current 30 | formalization framework. 31 | At this early stage, our formal semantics consider only a small subset 32 | of SPARK 2014, and only a small subset of run-time errors. They perform the 33 | appropriate run-time checks as specified by the SPARK and Ada 34 | reference manuals. We call these semantics the $\mathit{reference\ 35 | semantics}$, and we have implemented an interpreter certified with respect to them. 36 | Thus, our reference semantics can both be manually reviewed by SPARK 37 | experts, and be used to experiment on real source code programs. 38 | For those who are interested, we have posted the source code of our 39 | formalization on GitHub \cite{Formalization:URL}. 40 | 41 | \subsection{Program Correctness Proof} 42 | In Coq, we have formalized a well-formed SPARK program as a well-typed, well-defined 43 | and well-checked program. 44 | A well-typed program has all its language constructs consistent with the 45 | typing rules and all its variables used with the correct in/out modes with respect to their reading 46 | and writing permissions. 47 | A well-defined program is a program with all its used variables initialized. 48 | A well-checked program is a program with the appropriate checks inserted at the correct places 49 | in the AST. 50 | We have proved that all well-formed SPARK programs in our formalized language subset 51 | execute as we expect and never exhibit undefined behavior.
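As a rough illustration, the shape of such a correctness result in Coq might be stated along these lines (a sketch only; every identifier below is hypothetical, not taken from the actual development):

```coq
(* Hypothetical sketch of the well-formedness/soundness statement.
   All identifiers are illustrative, not those of the real development. *)
Theorem well_formed_execution_safe :
  forall (p : program) (s s' : state),
    well_typed p ->       (* constructs respect the typing rules            *)
    well_defined p ->     (* every used variable is initialized             *)
    well_checked p ->     (* required checks sit at the right AST nodes     *)
    eval_program p s s' ->
    s' <> Abnormal_State. (* execution never exhibits undefined behavior    *)
Admitted.
```

The three premises mirror the well-typed, well-defined and well-checked conditions described above.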
-------------------------------------------------------------------------------- /papers/hilt2013/mathpartir.sty: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/papers/hilt2013/mathpartir.sty -------------------------------------------------------------------------------- /papers/hilt2013/motivation.tex: -------------------------------------------------------------------------------- 1 | \section{Motivation} 2 | %\subsection{Using Formal Semantics for Certification} 3 | %First, certification process of SPARK technology can be stressed by the use of 4 | %formal semantics. 5 | We believe that the certification process of SPARK technology can be 6 | strengthened by the use of formal semantics. 7 | Indeed, the software certification process as required by the DO-178C \cite{RTCA:DO-178} 8 | standard allows formal verification to replace some forms of testing. 9 | This is one of the goals pursued by the SPARK toolchain resulting from 10 | the Hi-Lite project \cite{Hi-Lite:URL}. On the other hand, the DO-333 11 | supplement \cite{RTCA:DO-333} (the formal methods supplement to DO-178C) recommends that 12 | when using formal methods ``all assumptions related to each formal 13 | analysis should be described and justified''. As any formal static 14 | analysis must rely on the behavior of the language being analyzed, a 15 | precise and unambiguous definition of the semantics of this language 16 | clearly becomes a requirement in the certification process. 17 | 18 | %\subsection{} 19 | %Second, enforce the theoretical foundation of the GNATprove 20 | %toolchain. 21 | We also aim to strengthen the theoretical foundation of the GNATprove 22 | toolchain. 23 | The Ada reference manual \cite{Ada:URL} introduces the notion of $\mathit{errors}$.
24 | These correspond to error situations that must be detected 25 | at run time as well as erroneous executions that need not be 26 | detected. In Ada, the former are detected by run-time checks 27 | (RTCs) inserted by the compiler. Both must be guaranteed never to 28 | occur during the process of proving SPARK (or Ada) subprograms within 29 | the GNATprove toolchain \cite{Gnatprove:URL}. This can be ensured either by static 30 | analysis or by generating verification conditions (VCs) showing that 31 | the corresponding error situations never occur at that point in the 32 | subprogram. The generated VCs must be discharged in order to prove the 33 | subprogram. Tools within the GNATprove toolchain strongly rely on the 34 | completeness of this VC generation process. Our semantics, built on 35 | top of a proof assistant, opens the possibility of formally (and 36 | mechanically) verifying (to some extent) this completeness. In practice, 37 | since VCs are actually generated from the RTCs generated by the 38 | compiler, this completeness verification amounts to analyzing the RTCs 39 | inserted by the compiler in the abstract syntax tree produced by the 40 | GNAT compiler. 41 | 42 | %\subsection{} 43 | %Third, formal proofs of analyzers and certified compilers. 44 | Finally, one of our long-term goals is to provide infrastructure that can be 45 | leveraged in a variety of ways to support machine-verified proofs of 46 | correctness of SPARK 2014 static analysis and 47 | translations. To this end, we will build a translation framework 48 | from SPARK 2014 to Coq, which puts in place crucial infrastructure 49 | necessary for supporting formal proofs of SPARK analysis. Together 50 | with the formal semantics of SPARK, it provides the potential to 51 | connect to the CompCert \cite{Leroy:09} certified compiler framework.
-------------------------------------------------------------------------------- /papers/hilt2013/related.tex: -------------------------------------------------------------------------------- 1 | \section{Related Work} 2 | Formal semantics were previously defined for SPARK Ada 83 in \cite{Marsh:Book94,Neil:Book94}. 3 | This definition includes both the static and the dynamic semantics of 4 | the language and relies on a precise notation inspired by the Z notation. 5 | Formalizing the full SPARK subset was clearly a challenging task and 6 | the result is indeed quite impressive: more than 500 pages were 7 | required for the complete set of rules. However, these semantics 8 | were not executable (they were given only on paper) and no tool was used 9 | to check the soundness of the definition. Moreover, no property was 10 | proved using these semantics, and more importantly, run-time checks 11 | were only considered as side conditions on semantic rules without 12 | being formally described. -------------------------------------------------------------------------------- /papers/hilt2013/sprmindx.sty: -------------------------------------------------------------------------------- 1 | delim_0 "\\idxquad " 2 | delim_1 "\\idxquad " 3 | delim_2 "\\idxquad " 4 | delim_n ",\\," 5 | -------------------------------------------------------------------------------- /papers/hilt2013/submission_confirmation.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/papers/hilt2013/submission_confirmation.pdf -------------------------------------------------------------------------------- /papers/paper2014/.texlipse: -------------------------------------------------------------------------------- 1 | #TeXlipse project settings 2 | #Mon Aug 04 10:02:20 CEST 2014 3 | markTmpDer=true 4 | builderNum=2 5 | outputDir= 6 | makeIndSty= 7 | bibrefDir= 8 | outputFormat=pdf 9
| tempDir=tmp 10 | mainTexFile=2014paper_main.tex 11 | outputFile=2014paper.pdf 12 | langSpell=en 13 | markDer=true 14 | srcDir= 15 | -------------------------------------------------------------------------------- /papers/paper2014/2014paper_main.bbl: -------------------------------------------------------------------------------- 1 | \begin{thebibliography}{1} 2 | 3 | \bibitem{sparkbook2012} 4 | Barnes, J.: 5 | \newblock {SPARK: The Proven Approach to High Integrity Software}. 6 | \newblock Altran Praxis (2012) 7 | 8 | \bibitem{oneill2012} 9 | O'Neill, I.: 10 | \newblock {SPARK} -- a language and tool-set for high-integrity software 11 | development. 12 | \newblock In: Industrial Use of Formal Methods: Formal Verification. 13 | \newblock Wiley (2012) 14 | 15 | \bibitem{ada2012rationale} 16 | Barnes, J.: 17 | \newblock {Ada 2012 Rationale}. 18 | \newblock (2012) 19 | 20 | \bibitem{ieeesoftware2013} 21 | Moy, Y., Ledinot, E., Delseny, H., Wiels, V., Monate, B.: 22 | \newblock Testing or formal verification: {DO-178C} alternatives and 23 | industrial experience. 24 | \newblock {IEEE} Software special issue on Safety-Critical Software Systems 25 | (2013) 26 | 27 | \end{thebibliography} 28 | -------------------------------------------------------------------------------- /papers/paper2014/2014paper_main.bib: -------------------------------------------------------------------------------- 1 | @Book{sparkbook2012, 2 | author = {J. Barnes}, 3 | title = {{SPARK: The Proven Approach to High Integrity Software}}, 4 | isbn = {9780957290501}, 5 | publisher = {Altran Praxis}, 6 | year = {2012} 7 | } 8 | 9 | @InCollection{oneill2012, 10 | author = {I. O'Neill}, 11 | booktitle = {Industrial Use of Formal Methods: Formal Verification}, 12 | title = {{SPARK} -- A Language and Tool-Set for High-Integrity Software Development}, 13 | isbn = {9781848213630}, 14 | publisher = {Wiley}, 15 | year = {2012} 16 | } 17 | 18 | @ARTICLE{ada2012rationale, 19 | author = {J. 
Barnes}, 20 | title = {{Ada 2012 Rationale}}, 21 | publisher = {AdaCore}, 22 | year = {2012} 23 | } 24 | 25 | @inproceedings{hilt2012, 26 | author = {Johannes Kanig and 27 | Edmond Schonberg and 28 | Claire Dross}, 29 | title = {{Hi-Lite}: the Convergence of Compiler Technology and Program 30 | Verification}, 31 | booktitle = {HILT}, 32 | year = {2012}, 33 | pages = {27-34}, 34 | ee = {http://doi.acm.org/10.1145/2402676.2402690}, 35 | crossref = {DBLP:conf/sigada/2012}, 36 | bibsource = {DBLP, http://dblp.uni-trier.de} 37 | } 38 | 39 | @standard{do178c, 40 | author = {RTCA}, 41 | organization = {Radio Technical Commission for Aeronautics (RTCA)}, 42 | title = {{DO-178C}: Software Considerations in Airborne Systems and Equipment 43 | Certification}, 44 | year = 2011 45 | } 46 | 47 | @article{ieeesoftware2013, 48 | author = {Yannick Moy and Emmanuel Ledinot and Hervé Delseny and Virginie Wiels and Benjamin Monate}, 49 | title = {Testing or Formal Verification: {DO­178C} Alternatives and Industrial Experience}, 50 | journal = {{IEEE} Software special issue on Safety­Critical Software Systems}, 51 | month = may, 52 | year = 2013 53 | } 54 | -------------------------------------------------------------------------------- /papers/paper2014/2014paper_main.tex: -------------------------------------------------------------------------------- 1 | \documentclass{llncs} 2 | %\usepackage{latex8} 3 | \usepackage{times} 4 | \usepackage{listings} 5 | \usepackage{amssymb,latexsym} 6 | \usepackage{times} 7 | \usepackage{graphicx} 8 | \usepackage{colortbl} 9 | \usepackage{color} 10 | \usepackage{multicol} 11 | \usepackage{multirow} 12 | %\usepackage{setspace} % can be used to controll space between lines 13 | 14 | %package for coq language in listings 15 | \usepackage{lstlangcoq} 16 | 17 | \usepackage{url} 18 | 19 | % remove to turn off page numbering 20 | \pagestyle{plain} 21 | 22 | \input{definitions} 23 | 24 | % \def\sharedaffiliation{% 25 | % \end{tabular} 26 | % 
\begin{tabular}{c}} 27 | 28 | %% space between bottom of "top" floats and text 29 | \addtolength{\textfloatsep}{-.5cm} 30 | 31 | %% allow more text on float pages 32 | \renewcommand{\topfraction}{.99} 33 | \renewcommand{\textfraction}{.001} 34 | % \renewcommand{\floatpagefraction}{.99} 35 | 36 | %\textheight=7.7in % could be \textheight=9.2in 37 | %\topmargin=-.2in 38 | %\textheight=8in 39 | 40 | %\topmargin=-.2in 41 | %\addtolength{\textheight}{.52in} 42 | 43 | 44 | %% define conditional for tech report 45 | 46 | \newif\iftechreport 47 | \techreportfalse 48 | 49 | %------------------------------------------------------------------------- 50 | % take the % away on next line to produce the final camera-ready version 51 | %\pagestyle{plain} 52 | 53 | %------------------------------------------------------------------------- 54 | \begin{document} 55 | 56 | \title{Who Will Check the SPARK Checkers?} 57 | 58 | % \numberofauthors{3} 59 | 60 | \author{Authors} 61 | %\author{P. Courtieu\inst{1}, V. APonte\inst{1}, T. Crolard\inst{1}, 62 | %Zhi Zhang\inst{2}, Robby\inst{2},\\ 63 | % J. Belt\inst{2}, J. Hatcliff\inst{2}, J. Guitton\inst{3}, T. 
Jennings\inst{4}} 64 | 65 | \institute{ 66 | Kansas State University 67 | \and 68 | CNAM 69 | \and 70 | AdaCore 71 | } 72 | 73 | \maketitle 74 | 75 | %\thispagestyle{empty} 76 | 77 | % \setlength{\parskip}{8pt plus 1pt minus 1pt} 78 | 79 | \input{abstract} 80 | \input{introduction} 81 | \input{overview} 82 | \input{spark-semantics} 83 | \input{run-time-checks} 84 | \input{evaluation} 85 | \input{related-work} 86 | \input{conclusion} 87 | 88 | \bibliographystyle{splncs} 89 | % \bibliographystyle{nature} 90 | % \bibliographystyle{abbrv} 91 | \bibliography{2014paper_main} 92 | 93 | \end{document} 94 | -------------------------------------------------------------------------------- /papers/paper2014/abstract.tex: -------------------------------------------------------------------------------- 1 | \begin{abstract} 2 | \vspace*{-.5cm} 3 | %\hspace{.5cm} 4 | SPARK 2014 is a subset of the Ada 2012 programming language amenable to formal 5 | verification, supported by the GNAT compilation toolchain. It is possible to 6 | use both industrial and academic tools that analyze SPARK 2014 programs to 7 | guarantee that a program does not raise run-time errors and respects its 8 | specification, mostly expressed as subprogram contracts. These analyses at 9 | source code level are valid for the final executable only if it can be shown 10 | that tools used for both compilation and verification comply with a common 11 | deterministic programming language semantics. 12 | 13 | In this paper, we present our work towards a formal semantics for SPARK 2014 14 | that is suitable for checking that both the compiler and analyzers correctly 15 | interpret critical parts of the program semantics. In particular, we are 16 | interested in checking that the compiler and analyzers agree on a correct 17 | placement of language defined checks, in order for a proved program to execute 18 | without run-time errors. 
19 | 20 | To reach this goal, we share a common program representation between the 21 | compiler and analyzers, which we translate into the Coq proof assistant. We 22 | have formalized in Coq the dynamic semantics for a core subset of SPARK 2014, 23 | including the modeling of run-time checks, which is used to mechanically prove 24 | that checks are correctly placed in the intermediate representation. We have 25 | also proved the correctness of optimizations that eliminate some checks. 26 | \end{abstract} 27 | -------------------------------------------------------------------------------- /papers/paper2014/conclusion.tex: -------------------------------------------------------------------------------- 1 | % \vspace*{-.3cm} 2 | \section{Conclusions and Future Work} 3 | \label{sec:conclusion} 4 | % \vspace*{-.3cm} 5 | 6 | 7 | 8 | %\subsection{} 9 | %Third, formal proofs of analyzers and certified compilers. 10 | One of our long-term goals is to provide infrastructure that can be 11 | leveraged in a variety of ways to support machine-verified proofs of 12 | correctness of SPARK 2014 static analysis and 13 | translations. To this end, we will build a translation framework 14 | from SPARK 2014 to Coq, which puts in place crucial infrastructure 15 | necessary for supporting formal proofs of SPARK analysis. Together 16 | with the formal semantics of SPARK, it provides the potential to 17 | connect to the CompCert \cite{Leroy:09} certified compiler framework.
18 | -------------------------------------------------------------------------------- /papers/paper2014/evaluation.tex: -------------------------------------------------------------------------------- 1 | \section{Evaluation} 2 | \subsection{Run-Time Checks Generator Function} 3 | 4 | The check generator function for expressions: 5 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 6 | Function compile2_flagged_exp_f (st: symboltable) 7 | (checkflags: check_flags) 8 | (e: expression): expression_x 9 | \end{lstlisting} 10 | 11 | \subsection{Application To SPARK 2014 Programs} 12 | -------------------------------------------------------------------------------- /papers/paper2014/introduction.tex: -------------------------------------------------------------------------------- 1 | \section{Introduction} 2 | 3 | \subsection{Background} 4 | 5 | SPARK is a subset of the Ada programming language targeted at safety- and 6 | security-critical applications. SPARK builds on the strengths of Ada for 7 | creating highly reliable and long-lived software. SPARK restrictions ensure 8 | that the behavior of a SPARK program is unambiguously defined, and simple 9 | enough that formal verification tools can perform an automatic diagnosis of 10 | conformance between a program specification and its implementation. The SPARK 11 | language and toolset for formal verification have been applied over many years 12 | to on-board aircraft systems, control systems, cryptographic systems, and rail 13 | systems~\cite{sparkbook2012,oneill2012}. The latest version, SPARK 2014, builds 14 | on the new specification features added in Ada 2012~\cite{ada2012rationale}, so 15 | formal specifications are now understood by the usual development tools and can 16 | be executed. The definition of the language subset is motivated by the 17 | simplicity and feasibility of formal analysis and the need for an unambiguous 18 | semantics.
19 | 20 | Static analysis tools are available that provide flow analysis, symbolic 21 | execution and proof of SPARK programs. The academic tool Bakar 22 | Kiasan~\footnote{\url{http://www.sireum.org/index.html}} developed by Kansas 23 | State University allows symbolically executing a SPARK program with or without 24 | contracts, to detect possible run-time errors and contract violations, and in 25 | some cases also to prove that no such errors can occur. The industrial tool 26 | GNATprove~\footnote{\url{http://www.adacore.com/sparkpro}} co-developed by 27 | Altran and AdaCore performs flow analysis to check correct access to data in 28 | the program (correct access to global variables as specified in data and 29 | information flow contracts and correct access to initialized data) and proof to 30 | demonstrate that the program is free from run-time errors and that the 31 | specified contracts are correctly implemented. 32 | 33 | \subsection{Motivations} 34 | 35 | A major reason for using SPARK for developing critical software is the ability 36 | to prove statically that no run-time error, such as arithmetic overflow, buffer 37 | overflow and division-by-zero, can occur. Besides the additional confidence in 38 | the software that this result brings, it can be used in some certification 39 | domains to lower the verification effort in some other areas like testing. For 40 | example, the most recent version DO-178C of the avionics certification standard 41 | allows using either tests or proofs as acceptable verification 42 | methods~\cite{ieeesoftware2013}. It is also commonly used as an argument to 43 | justify the suppression of run-time checks in the final executable, typically 44 | to increase execution speed. 45 | 46 | The reasoning above relies on the fact that a program proved free of run-time 47 | errors cannot fail a run-time check during execution. This depends on the 48 | correctness of the compiler and analyzers used.
Although correctness is not 49 | typically proved for tools used in practice on industrial projects, a special 50 | certification activity aims at giving sufficient confidence that the tools 51 | behave correctly, a process known as tool qualification. 52 | 53 | A critical element for the qualification of both the GNAT compiler and the 54 | GNATprove analyzer that we develop at AdaCore is that they correctly interpret 55 | the semantics of SPARK with respect to the placement of checks. The compiler 56 | works by first producing a semantically analyzed abstract syntax tree (AST) of 57 | the program, decorated with flags that point to locations where run-time checks 58 | should be inserted. This AST is then expanded into a lower level representation 59 | with explicit run-time checking code. As the analysis of GNATprove is based on 60 | the same AST used for compilation (using the same compilation 61 | \textit{frontend}), we initially chose to depend on check flags for proof. But 62 | we quickly realized that some flags were missing in the AST, corresponding to 63 | cases where the compiler expansion phase did not depend on these flags for 64 | correct code generation. We temporarily switched to ignoring check flags 65 | completely, which proved impractical for several reasons: it duplicates the 66 | work already done in the compiler in all analyzers (ours, and as mentioned 67 | below those of our partners), which is both costly and error-prone, and unless 68 | analyzers re-implement the optimizations of the compiler, it leads to useless 69 | checks during analysis, which increases analysis time and memory footprint. For 70 | these reasons, we have since rebased analysis on check flags, after having 71 | precisely identified which check flags were supposed to be set by the frontend, 72 | and modified the frontend to implement this requirement.
73 | 74 | Since that last change, the compiler and analyzers all share the AST produced 75 | by the frontend, decorated with check flags that point to locations where 76 | run-time checks should be inserted. But we have discovered on various occasions 77 | cases where flags were missing, which led to corrections of the frontend. The 78 | last such occasion was the recent implementation of a Tetris game in SPARK for a 79 | demo at a customer gathering: after proving that the program was free of 80 | run-time errors, the first test on the actual board stopped unexpectedly due to 81 | a range check failing during execution. There was indeed a possible check 82 | failure in the code (later corrected) on a new attribute recently introduced in 83 | SPARK, which was not detected during proof because the corresponding check flag 84 | was not set by the frontend. 85 | 86 | Thus, it is of critical importance to be able to guarantee that all check flags 87 | are set on the AST produced by the frontend, as defined in the SPARK language 88 | semantics. This was the main motivation for the work described here. 89 | 90 | \subsection{Contributions} 91 | The major contributions of this paper are: 92 | \begin{itemize} 93 | \item The formalization of the dynamic semantics for a core subset of the SPARK 94 | 2014 language in the Coq proof assistant, including scalar subtypes and 95 | derived types, array types, record types, calls, and locally defined 96 | subprograms. 97 | \item The formalization of a subset of the run-time checks: overflow checks, 98 | range checks, array index checks and division by zero checks; and the proof 99 | of their completeness: if language-defined checks do not fail, a program 100 | cannot go wrong according to its formal semantics. 101 | \item The proof that existing frontend optimizations that suppress checks only 102 | do so for checks that cannot fail.
103 | \item The implementation of a tool translating SPARK programs into Coq, and 104 | automatically checking whether the check flags in the AST produced by the 105 | GNAT frontend are sufficient to guard against the program going wrong. 106 | \item Experiments where the above tool was applied to small SPARK programs, 107 | showing that the check flags in the AST were indeed as expected in the 108 | language semantics. 109 | \end{itemize} 110 | -------------------------------------------------------------------------------- /papers/paper2014/llncsdoc.sty: -------------------------------------------------------------------------------- 1 | % This is LLNCSDOC.STY the modification of the 2 | % LLNCS class file for the documentation of 3 | % the class itself. 4 | % 5 | \def\AmS{{\protect\usefont{OMS}{cmsy}{m}{n}% 6 | A\kern-.1667em\lower.5ex\hbox{M}\kern-.125emS}} 7 | \def\AmSTeX{{\protect\AmS-\protect\TeX}} 8 | % 9 | \def\ps@myheadings{\let\@mkboth\@gobbletwo 10 | \def\@oddhead{\hbox{}\hfil\small\rm\rightmark 11 | \qquad\thepage}% 12 | \def\@oddfoot{}\def\@evenhead{\small\rm\thepage\qquad 13 | \leftmark\hfil}% 14 | \def\@evenfoot{}\def\sectionmark##1{}\def\subsectionmark##1{}} 15 | \ps@myheadings 16 | % 17 | \setcounter{tocdepth}{2} 18 | % 19 | \renewcommand{\labelitemi}{--} 20 | \newenvironment{alpherate}% 21 | {\renewcommand{\labelenumi}{\alph{enumi})}\begin{enumerate}}% 22 | {\end{enumerate}\renewcommand{\labelenumi}{enumi}} 23 | % 24 | \def\bibauthoryear{\begingroup 25 | \def\thebibliography##1{\section*{References}% 26 | \small\list{}{\settowidth\labelwidth{}\leftmargin\parindent 27 | \itemindent=-\parindent 28 | \labelsep=\z@ 29 | \usecounter{enumi}}% 30 | \def\newblock{\hskip .11em plus .33em minus -.07em}% 31 | \sloppy 32 | \sfcode`\.=1000\relax}% 33 | \def\@cite##1{##1}% 34 | \def\@lbibitem[##1]##2{\item[]\if@filesw 35 | {\def\protect####1{\string ####1\space}\immediate 36 | \write\@auxout{\string\bibcite{##2}{##1}}}\fi\ignorespaces}% 37 | 
\begin{thebibliography}{} 38 | \bibitem[1982]{clar:eke3} Clarke, F., Ekeland, I.: Nonlinear 39 | oscillations and boundary-value problems for Hamiltonian systems. 40 | Arch. Rat. Mech. Anal. {\bf 78} (1982) 315--333 41 | \end{thebibliography} 42 | \endgroup} 43 | -------------------------------------------------------------------------------- /papers/paper2014/lstlangcoq.sty: -------------------------------------------------------------------------------- 1 | %% 2 | %% Coq definition (c) 2001 Guillaume Dufay 3 | %% 4 | %% with some modifications by J. Charles (2005) 5 | %% and hacks by Andrew Appel (2011) 6 | %% 7 | \lstdefinelanguage{Coq}% 8 | {morekeywords={Variable,Inductive,CoInductive,Fixpoint,CoFixpoint,% 9 | Definition,Lemma,Theorem,Axiom,Local,Save,Grammar,Syntax,Intro,% 10 | Trivial,Qed,Intros,Symmetry,Simpl,Rewrite,Apply,Elim,Assumption,% 11 | Left,Cut,Case,Auto,Unfold,Exact,Right,Hypothesis,Pattern,Destruct,% 12 | Constructor,Defined,Fix,Record,Proof,Induction,Hints,Exists,let,in,% 13 | Parameter,Split,Red,Reflexivity,Transitivity,if,then,else,Opaque,% 14 | Transparent,Inversion,Absurd,Generalize,Mutual,Cases,of,end,Analyze,% 15 | AutoRewrite,Functional,Scheme,params,Refine,using,Discriminate,Try,% 16 | Require,Load,Import,Scope,Set,Open,Section,End,match,with,Ltac, %, exists, forall 17 | Declare,Instance 18 | },% 19 | sensitive, % 20 | morecomment=[n]{(*}{*)},% 21 | morestring=[d]",% 22 | literate={=>}{{$\Rightarrow$\ }}1 {>->}{{$\rightarrowtail$}}2{->}{{$\to$\ }}1 23 | {<->}{$\leftrightarrow$}1 24 | {forall}{$\forall\,$}1 25 | {forallb}{forallb}7 26 | {exists}{$\exists\,$}1 27 | {existsb}{existsb}7 28 | {existsv}{existsv}7 29 | {\/\\}{{$\wedge$\ }}1 30 | {|-}{{$\vdash$\ }}1 31 | {\\\/}{{$\vee$\ }}1 32 | {~}{{$\sim$}}1 33 | %{<>}{{$\neq$}}1 indeed... no. 
34 | }[keywords,comments,strings]% 35 | -------------------------------------------------------------------------------- /papers/paper2014/overview.tex: -------------------------------------------------------------------------------- 1 | \section{Overview} 2 | On the long path towards the definition of complete semantics for SPARK 2014, a 3 | very important step is to build a tool chain allowing its application to formal 4 | verification and proof of real SPARK 2014 programs. It also makes it possible to 5 | integrate our SPARK formalization work into the GnatProve toolchain to serve as 6 | a soundness verification tool for the run-time check flags generated by the Gnat front 7 | end. 8 | 9 | \paragraph{SPARK Translation Toolchain} 10 | In the front end of this tool chain, Gnat2XML, developed by AdaCore, translates 11 | SPARK programs to a fully resolved XML abstract syntax tree (AST) representation 12 | with an accompanying XML schema. As part of the Sireum analysis framework [5], 13 | we have further developed a tool called Jago [4] that translates the XML 14 | representation of the GNAT compiler's ASTs into a Scala-based representation in 15 | Sireum. This open-source framework enables one to build code translators and 16 | analysis tools for SPARK 2014 in Scala. Scala's blending of functional and 17 | object-oriented programming styles has proven quite useful in other contexts for 18 | syntax tree manipulation and analysis. Integrated into Jago are two kinds of 19 | translations: (1) a type translation that translates the Gnat2XML-generated XML schema to 20 | inductive type definitions in Coq; (2) a program translation that translates the 21 | Gnat2XML-generated XML AST representation into Coq-based representations.
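For illustration, the type translation produces Coq inductive definitions in roughly the following style (an illustrative excerpt only; the constructor names and arities are simplified, not the exact generated definitions):

\begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small]
(* Sketch of a generated inductive AST type; each node
   carries an AST number linking it back to the source. *)
Inductive expression: Type :=
  | Literal: astnum -> literal -> expression
  | Name: astnum -> name -> expression
  | Binary_Operation: astnum -> binary_operator ->
      expression -> expression -> expression
  | Unary_Operation: astnum -> unary_operator ->
      expression -> expression.
\end{lstlisting}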
22 | 23 | \paragraph{Formalization and Proof in Coq} 24 | With the Coq inductive type definitions for the SPARK AST syntax produced by the Jago type 25 | translator, a formal semantics encoding run-time checks for SPARK has 26 | been developed within Coq, which is referred to as the SPARK reference semantics. 27 | In addition, a formal semantics for the SPARK AST extended with run-time check flags 28 | is defined, where run-time checks are performed only if the appropriate check 29 | flags are set for the operations. An AST translator from a SPARK AST to a 30 | run-time-check-flagged AST is provided and proved correct with respect to the 31 | SPARK reference semantics. 32 | 33 | \paragraph{Run-Time Checks Verification} 34 | In the GnatProve tool chain, the run-time check flags set by the GNAT frontend on the SPARK 35 | AST trigger the corresponding check verification in its back end by 36 | applying formal verification methods. It happens that some formally proved SPARK 37 | programs run into run-time errors because of missing or misplaced run-time check 38 | flags in the SPARK AST. For example, AdaCore developed a small Tetris 39 | program in SPARK for a demonstration of SPARK at the GNAT Pro Industrial Day. But 40 | the program ran into a constraint error, even though it had been proved to be 41 | free of run-time errors, because of a missing range check on an expression 42 | inside a 'Update attribute reference. 43 | To verify the completeness of the run-time check flags in the SPARK AST as required by 44 | the SPARK semantics, a formally proved run-time check generator and verification 45 | tool have been developed. For ease of use of run-time check verification, any incorrect 46 | checks on the SPARK AST are mapped back to the SPARK source code. 47 | 48 | \paragraph{Run-Time Checks Optimization} 49 | Instead of putting a full list of run-time check flags in SPARK ASTs, the GNAT 50 | frontend is smart enough to remove unnecessary run-time checks by 51 | performing some simple optimizations.
To verify the correctness of these check 52 | optimizations, we have formalized some simple check optimization rules in Coq 53 | and have proved their soundness with respect to the SPARK reference semantics. 54 | 55 | -------------------------------------------------------------------------------- /papers/paper2014/related-work.tex: -------------------------------------------------------------------------------- 1 | \section{Related Work} 2 | \label{sec:related-work} 3 | % \vspace*{-.3cm} 4 | Formal semantics were previously defined for SPARK Ada 83 in \cite{Marsh:Book94,Neil:Book94}. 5 | This definition includes both the static and the dynamic semantics of 6 | the language and relies on a precise notation inspired by the Z notation. 7 | Formalizing the full SPARK subset was clearly a challenging task and 8 | the result is indeed quite impressive: more than 500 pages were 9 | required for the complete set of rules. However, these semantics 10 | were not executable (they were given only on paper) and no tool was used 11 | to check the soundness of the definition. Moreover, no property was 12 | proved using these semantics, and more importantly, run-time checks 13 | were only considered as side conditions on semantic rules without 14 | being formally described. 15 | 16 | 17 | -------------------------------------------------------------------------------- /papers/paper2014/run-time-checks.tex: -------------------------------------------------------------------------------- 1 | \section{Run-Time Checks Verification} 2 | In the SPARK GNATProve tool chain, formal verification of the absence of run-time 3 | errors in a SPARK program relies on the run-time check flags that are initially 4 | generated and inserted into the SPARK AST by the GNAT front end. The GNAT front end is 5 | expected to place the correct run-time check flags during its static semantic 6 | analysis of the SPARK AST.
However, the GNAT front end itself is not 7 | formally verified, which is why run-time check flags 8 | are sometimes missing or misplaced in the SPARK AST. This leads to the problem that a SPARK 9 | program proved to be free of run-time errors can still run into a run-time 10 | error because of the missing or incorrect checks placed by the GNAT front 11 | end. So it is meaningful to formally verify the run-time check flags in 12 | order to make the GNATProve tool chain sound. 13 | 14 | To verify the run-time check flags, a check-flag-annotated SPARK language 15 | and its corresponding semantics are formally defined, where each expression and 16 | subexpression node is annotated with a set of run-time check flags. Then a 17 | run-time check generator simulating the GNAT front end is defined and formally 18 | proved correct with respect to the SPARK reference semantics. It formalizes the 19 | run-time check generation procedure and generates run-time check flags according 20 | to the SPARK semantics by transforming a SPARK program into a 21 | check-flag-annotated SPARK program. Finally, the run-time check flags generated 22 | by the GNAT front end are verified against our formally proved run-time check flags. 23 | 24 | \subsection{Check-Flag-Annotated SPARK Language} 25 | \subsubsection{Syntax} 26 | The check-flag-annotated SPARK language is the same as the SPARK language except 27 | that each expression node is annotated with a set of run-time check flags 28 | \textit{check\_flags}, which denotes explicitly what kinds of run-time checks 29 | need to be verified during evaluation of the annotated expression. 30 | 31 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\scriptsize] 32 | Inductive expr_x: Type := 33 | | Name_X: astnum -> name_x -> check_flags -> expr_x 34 | | ... 35 | with name_x: Type := 36 | | Indexed_Component_X: astnum -> astnum -> idnum -> expr_x -> check_flags -> name_x 37 | | ...
38 | \end{lstlisting} 39 | 40 | \subsubsection{Semantics} 41 | The SPARK reference semantics is formalized with run-time checks always 42 | performed, following the run-time check requirements of the SPARK reference 43 | manual. In the check-flag-annotated SPARK semantics, by contrast, run-time checks are 44 | performed only if the corresponding check flags are set on the 45 | annotated expression node. For example, in the following name evaluation for a 46 | check-flag-annotated indexed component, a range check is required only if the 47 | \textit{Do\_Range\_Check} flag is set on the index expression \textit{e}; 48 | otherwise, the index value \textit{i} is used directly for array indexing without 49 | going through the range check procedure \textit{do\_range\_check}. 50 | 51 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\scriptsize] 52 | Inductive eval_name_x: symboltable_x -> stack -> name_x -> Return value -> Prop := 53 | | Eval_Indexed_Component_X: forall e cks1 cks2 st s i x_ast_num t l u x a v ast_num, 54 | exp_check_flags e = cks1 ++ Do_Range_Check :: cks2 -> 55 | eval_expr_x st s (update_check_flags e (cks1++cks2)) (Normal (BasicV (Int i))) -> 56 | fetch_exp_type_x x_ast_num st = Some (Array_Type t) -> 57 | extract_array_index_range_x st t (Range_X l u) -> 58 | do_range_check i l u Success -> 59 | fetchG x s = Some (AggregateV (ArrayV a)) -> 60 | array_select a i = Some v -> 61 | eval_name_x st s (Indexed_Component_X ast_num x_ast_num x e nil) (Normal (BasicV v)) 62 | | ... 63 | \end{lstlisting} 64 | 65 | \subsection{Run-Time Checks Generator} 66 | \subsubsection{Check Generator} 67 | The run-time check generator translates a SPARK program into a 68 | check-flag-annotated SPARK program, generating run-time check flags according 69 | to the run-time checking rules required by the SPARK reference manual and inserting 70 | these check flags at the corresponding AST nodes.
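As an intuition for how such a generator behaves, here is a small Python sketch. It is an illustrative model only: the function names and the tuple-based AST are invented for this example and are not part of the Coq development. It walks an expression tree and attaches the flags demanded by each operator, together with any flags imposed by the surrounding context:

```python
# Illustrative model of a run-time-check flag generator: walk an
# expression tree and annotate every node with the checks its operator
# requires, plus any checks imposed by the surrounding context.

OVERFLOW_OPS = {"+", "-", "*", "/"}   # may overflow the base type
DIVISION_OPS = {"/", "mod", "rem"}    # need a divide-by-zero check

def flags_for_op(op):
    """Checks required by the operator itself."""
    flags = set()
    if op in OVERFLOW_OPS:
        flags.add("Do_Overflow_Check")
    if op in DIVISION_OPS:
        flags.add("Do_Division_Check")
    return flags

def annotate(expr, context_flags=frozenset()):
    """expr is ("lit", n) or ("binop", op, e1, e2); return the same
    tree with a set of check flags attached to every node."""
    if expr[0] == "lit":
        return ("lit", expr[1], set(context_flags))
    _, op, e1, e2 = expr
    return ("binop", op, annotate(e1), annotate(e2),
            flags_for_op(op) | set(context_flags))

# 10 / (2 + 1) used as an array index: the context adds a range check.
tree = annotate(("binop", "/", ("lit", 10),
                 ("binop", "+", ("lit", 2), ("lit", 1))),
                context_flags={"Do_Range_Check"})
print(sorted(tree[4]))
# ['Do_Division_Check', 'Do_Overflow_Check', 'Do_Range_Check']
```

The real generator is the Coq relation \textit{compile2\_flagged\_exp} described next; the sketch only conveys the flag-placement idea.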
In the expression check generator 71 | \textit{compile2\_flagged\_exp}, the \textit{check\_flags} argument denotes the run-time 72 | checks on the expression required by its context, such as a range check for an 73 | expression used as the index of an indexed component; the other check flags are 74 | generated according to the operation performed by the expression. 75 | 76 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 77 | Inductive compile2_flagged_exp: symboltable -> check_flags -> 78 | expression -> expression_x -> Prop 79 | \end{lstlisting} 80 | 81 | \subsubsection{Soundness Proof} 82 | The run-time check generator is proved sound with respect to the SPARK reference 83 | semantics and the check-flag-annotated SPARK semantics. For an expression 84 | \textit{e}, if it evaluates to some value \textit{v} in state \textit{s} under the 85 | SPARK reference semantic evaluator \textit{eval\_expr}, and \textit{e'} is the 86 | check-flag-annotated expression generated from \textit{e} by the expression check 87 | generator \textit{compile2\_flagged\_exp}, then \textit{e'} evaluates 88 | to the same value \textit{v} under the check-flag-annotated SPARK semantic evaluator 89 | \textit{eval\_expr\_x}. A similar soundness proof has been done for the statement 90 | check generator. 91 | 92 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 93 | Lemma expression_checks_soundness: forall e e' st st' s v, 94 | eval_expr st s e v -> 95 | compile2_flagged_exp st nil e e' -> 96 | compile2_flagged_symbol_table st st' -> 97 | eval_expr_x st' s e' v. 98 | 99 | Lemma statement_checks_soundness: forall st s stmt s' stmt' st', 100 | eval_stmt st s stmt s' -> 101 | compile2_flagged_stmt st stmt stmt' -> 102 | compile2_flagged_symbol_table st st' -> 103 | eval_stmt_x st' s stmt' s'.
104 | \end{lstlisting} 105 | 106 | \subsection{Run-Time Checks Optimization} 107 | \subsubsection{Optimization Strategy} 108 | We have formalized some simple but helpful optimizations for literal operations, 109 | removing those checks that can obviously be discharged at compilation time; 110 | this is also the optimization strategy taken by the GNAT front end. 111 | 112 | \subsubsection{Soundness Proof} 113 | To prove the correctness of these optimizations, we prove that SPARK 114 | program executions with optimized run-time checks behave exactly the same as 115 | those following the SPARK reference semantics, which systematically performs all 116 | the checks. 117 | 118 | \subsection{Run-Time Checks Verification} 119 | One of the major goals of our formalization work for the SPARK language is 120 | to verify the completeness of the run-time check flags produced by the GNAT front 121 | end. This is done by comparing the GNAT-generated run-time check flags with the 122 | expected ones produced by our check generator, which is formally verified with respect to the 123 | SPARK semantics. The run-time check flags are considered correct if they are a 124 | superset of the expected ones required by the SPARK reference manual. 125 | 126 | 127 | 128 | 129 | 130 | 131 | 132 | 133 | 134 | 135 | 136 | -------------------------------------------------------------------------------- /papers/paper2014/spark-semantics.tex: -------------------------------------------------------------------------------- 1 | \section{SPARK Formalization} 2 | \subsection{Representation of SPARK 2014 Subset} 3 | Towards the formalization of SPARK 2014, we have formalized a significant subset 4 | of it, including arrays, records (non-nested) and procedure calls, plus some 5 | interesting SPARK (Ada) language structures, such as nested procedures and 6 | subtypes.
7 | 8 | The SPARK 2014 subset language is represented as inductive type definitions in Coq, 9 | and each constructor of the type definitions is annotated with a unique AST 10 | number. The AST numbers are useful in the subsequent semantic formalization and 11 | run-time check verification for the SPARK language. They can be used to record the 12 | type of each expression and subexpression, to track back to the SPARK source 13 | program when a run-time error is detected, and to locate the position in the 14 | SPARK source program where the run-time check flags inserted by the GNAT front end 15 | are incorrect. 16 | 17 | Here, we list some of the SPARK language structures and show how we formalize them 18 | in Coq. An expression (\textit{expr}) can be a literal, a unary expression, a binary 19 | expression or a name, and each expression is annotated with an AST number 20 | (\textit{astnum}), which is represented by a natural number. A 21 | \textit{name} can be an identifier, an indexed component or a selected component. 22 | An indexed component is constructed with the constructor 23 | \textit{Indexed\_Component}: its first \textit{astnum} denotes the indexed 24 | component itself, its second \textit{astnum} denotes the prefix expression 25 | represented by \textit{idnum}, and \textit{expr} is the index expression. 26 | 27 | %\begin{quote} 28 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 29 | Inductive expr: Type := 30 | | Name: astnum -> name -> expr 31 | | ... 32 | with name: Type := 33 | | Identifier: astnum -> idnum -> name 34 | | Indexed_Component: astnum -> astnum -> idnum -> expr -> name 35 | | Selected_Component: astnum -> astnum -> idnum -> idnum -> name.
36 | \end{lstlisting} 37 | 38 | For a procedure \textit{Call} in the statement type \textit{stmt}, the first 39 | \textit{astnum} is the AST number of the procedure call statement, and the 40 | second \textit{astnum} is the AST number of the called procedure, represented by 41 | \textit{procnum} and followed by a list of arguments of type \textit{list expr}. 42 | 43 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 44 | Inductive stmt: Type := 45 | | Assignment: astnum -> name -> expr -> stmt 46 | | Call: astnum -> astnum -> procnum -> list expr -> stmt 47 | | ... 48 | \end{lstlisting} 49 | 50 | Range-constrained scalar types are useful in SPARK programs, and they can be 51 | declared with either a subtype declaration, a derived type definition, or an integer 52 | type definition. 53 | Constructor \textit{Subtype} declares a new subtype named \textit{typenum} 54 | of some previously declared \textit{type} with a new \textit{range} 55 | constraint (e.g. subtype T is Integer range 1 .. 10). 56 | Constructor \textit{Derived\_Type} defines a new derived type named 57 | \textit{typenum} with all its characteristics derived from its parent 58 | \textit{type} plus an additional \textit{range} constraint (e.g. type U is new 59 | Integer range 1 .. 10). 60 | Constructor \textit{Integer\_Type} defines a new integer type named 61 | \textit{typenum} with an additional \textit{range} constraint (e.g. type W is 62 | range 0 .. 10). 63 | \textit{Array\_Type} and \textit{Record\_Type} are constructors for declaring 64 | new array and record types.
65 | 66 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 67 | Inductive type_decl: Type := 68 | | Subtype: astnum -> typenum -> type -> range -> type_decl 69 | | Derived_Type: astnum -> typenum -> type -> range -> type_decl 70 | | Integer_Type: astnum -> typenum -> range -> type_decl 71 | | Array_Type: astnum -> typenum -> type -> type -> type_decl 72 | | Record_Type: astnum -> typenum -> list (idnum*type) -> type_decl. 73 | \end{lstlisting} 74 | 75 | 76 | \subsection{Run-Time Check Flags} 77 | In SPARK, various run-time check flags are inserted into the SPARK AST by the GNAT 78 | front end during its static semantic analysis. These check flags later 79 | trigger the corresponding run-time checks in the GNAT back end, which are 80 | discharged by formally verifying their verification conditions with the GNATProve 81 | tool chain. This is how SPARK guarantees the absence of run-time errors when 82 | developing high-integrity (both safety-critical and security-critical) 83 | software. 84 | 85 | For the current SPARK 2014 subset that we are working on, all possible run-time 86 | checks required by the language semantics are shown in the following list; 87 | they are enforced on expression nodes. 88 | \begin{itemize} 89 | \item 90 | Do\_Overflow\_Check: This flag is set on an operator whose operation may 91 | cause an overflow, such as the binary operators \textit{(+, -, *, /)}, the unary operator 92 | \textit{(-)}, and a type conversion from one base type to another when the value 93 | of the source base type may fall outside the domain of the target base type. 94 | \item 95 | Do\_Division\_Check: This flag is set on division operators, such as 96 | \textit{(/, mod, rem)}, to indicate a divide-by-zero check.
97 | \item 98 | Do\_Range\_Check: This flag is set on an expression that appears in a 99 | context where a range check is required, such as the right-hand side of an 100 | assignment, the subscript expression of an indexed component, the argument expressions 101 | of a procedure call, and the initialization expression of an object 102 | declaration. 103 | \end{itemize} 104 | 105 | \subsection{Semantic Formalization With Run-Time Checks} 106 | A major semantic difference between SPARK and other programming languages is 107 | that verification of the absence of run-time errors is required by the semantics 108 | of the language itself. 109 | Therefore, in our semantic formalization of the SPARK language, run-time checks are an 110 | essential ingredient, and they are always performed at the appropriate points during 111 | semantic evaluation. The program is terminated with a run-time 112 | error message as soon as any of its run-time checks fails during program 113 | evaluation. 114 | 115 | \subsubsection{Value} 116 | In the SPARK semantics, the result of an expression evaluation is 117 | either a normal value (a basic or aggregate value) or a run-time error 118 | detected during expression evaluation. 119 | Similarly, a well-formed SPARK program should either terminate in a 120 | normal state or stop with a run-time error, which is detected and 121 | raised during program execution. 122 | 123 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 124 | Inductive Return (A: Type): Type := 125 | | Normal: A -> Return A 126 | | Run_Time_Error: error_type -> Return A. 127 | \end{lstlisting} 128 | 129 | \subsubsection{Run-Time Check Evaluation} 130 | A significant subset of the SPARK run-time checks is formalized in Coq, 131 | including overflow checks, division checks and range checks.
Overflow checks are 132 | performed to ensure that the result of a given arithmetic operation is within the 133 | bounds of the base type, division checks are performed to prevent division by 134 | zero, and range checks are performed to ensure that the value of an 135 | expression is within the bounds of its target type with respect to the context where 136 | it appears. A small fragment of the overflow check formalization in Coq is: 137 | %\begin{quote} 138 | \begin{lstlisting}[escapechar = \#, language=coq, basicstyle=\small] 139 | Inductive overflow_check_bin:binop -> value -> value -> status -> Prop := 140 | | Do_Overflow_Check_On_Binops: forall op v1 v2 v, 141 | op = Plus #$\vee$# op = Minus #$\vee$# op = Multiply #$\vee$# op = Divide -> 142 | Val.binary_operation op v1 v2 = Some (BasicV (Int v)) -> 143 | (Zge_bool v min_signed) && (Zle_bool v max_signed) = true -> 144 | overflow_check_bin op v1 v2 Success 145 | | ... 146 | \end{lstlisting} 147 | %\end{quote} 148 | For now we model only the 32-bit signed integers of SPARK programs, where the Coq integer 149 | type (Z) is used to represent the integer values, bounded between \textit{min\_signed} 150 | and \textit{max\_signed}. This integer range constraint is enforced through 151 | the above overflow check semantics when we define the semantics of the 152 | language. As we can see, overflow checks are required only for the binary operators 153 | \textit{(+, -, *, /)} among the set of binary operators in our formalized SPARK 154 | subset. The check returns either \textit{Success} or \textit{Exception} with an 155 | overflow signal. 156 | 157 | \subsubsection{Expression Evaluation} 158 | In an expression evaluation, for an arithmetic operation, run-time checks are 159 | always performed according to the checking rules required for the arithmetic 160 | operators by the SPARK reference manual; a run-time error is returned whenever a 161 | check fails, otherwise a normal operation result is returned.
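For intuition, the overflow check above can be mirrored by a small Python function. This is an illustrative model only, not the Coq definition: compute the mathematical result with unbounded integers, as Coq does with Z, then compare it against the 32-bit bounds.

```python
# Illustrative model of overflow_check_bin for 32-bit signed integers.
# min_signed / max_signed are the bounds used in the Coq development.
MIN_SIGNED = -2**31         # -2147483648
MAX_SIGNED = 2**31 - 1      #  2147483647

def overflow_check_bin(op, v1, v2):
    """Return 'Success' if op(v1, v2) fits the 32-bit signed range,
    and 'Overflow' otherwise. The result is computed in unbounded
    integers first, mirroring Coq's Z."""
    ops = {
        "+": lambda a, b: a + b,
        "-": lambda a, b: a - b,
        "*": lambda a, b: a * b,
        # Ada's '/' truncates toward zero, unlike Python's floor '//'.
        "/": lambda a, b: a // b if (a < 0) == (b < 0) else -(-a // b),
    }
    v = ops[op](v1, v2)
    return "Success" if MIN_SIGNED <= v <= MAX_SIGNED else "Overflow"

print(overflow_check_bin("+", MAX_SIGNED, 1))   # Overflow
print(overflow_check_bin("/", MIN_SIGNED, -1))  # Overflow (2**31 is out of range)
print(overflow_check_bin("*", 46341, 46341))    # Overflow (46341**2 > 2**31 - 1)
```

Note that Integer'First / (-1) is the one division that overflows, which is why \textit{Divide} appears in the overflow rule alongside \textit{Plus}, \textit{Minus} and \textit{Multiply}.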
Further checks on 162 | the normal result value may be required depending on the context where the 163 | expression appears. For example, a range check must be performed on 164 | the value of the index expression before it can be used as the index of an 165 | indexed component. 166 | 167 | The following is a snippet of how the expression evaluation is formalized in Coq 168 | with run-time checks enforced during its semantic evaluation. For a binary 169 | expression (Binop ast\_num op e1 e2), if both e1 and e2 evaluate to 170 | normal values, then all run-time checks required for the operator 171 | \textit{op} are performed, e.g. an overflow check for \textit{+} and both an overflow 172 | check and a division check for \textit{/}, and a normal binary operation result is 173 | returned when the checks succeed. In the name evaluation for an indexed component, an 174 | additional range check is performed according to the index type 175 | of the array, which is fetched from a preconstructed symbol table. 176 | 177 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 178 | Inductive eval_expr:symboltable -> stack -> expr -> Return value -> Prop := 179 | | Eval_Binop: forall st s e1 v1 e2 v2 ast_num op v, 180 | eval_expr st s e1 (Normal v1) -> 181 | eval_expr st s e2 (Normal v2) -> 182 | do_run_time_check_on_binop op v1 v2 Success -> 183 | Val.binary_operation op v1 v2 = Some v -> 184 | eval_expr st s (Binop ast_num op e1 e2) (Normal v) 185 | | ...
186 | with eval_name: symboltable -> stack -> name -> Return value -> Prop := 187 | | Eval_Indexed_Component_RTE: forall st s e msg ast_num x_ast_num x, 188 | eval_expr st s e (Run_Time_Error msg) -> 189 | eval_name st s (Indexed_Component ast_num x_ast_num x e) 190 | (Run_Time_Error msg) 191 | | Eval_Indexed_Component: forall st s e i x_ast_num t l u x a v ast_num, 192 | eval_expr st s e (Normal (BasicV (Int i))) -> 193 | fetch_exp_type x_ast_num st = Some (Array_Type t) -> 194 | extract_array_index_range st t (Range l u) -> 195 | do_range_check i l u Success -> 196 | fetchG x s = Some (AggregateV (ArrayV a)) -> 197 | array_select a i = Some v -> 198 | eval_name st s (Indexed_Component ast_num x_ast_num x e) 199 | (Normal (BasicV v)) 200 | | ... 201 | \end{lstlisting} 202 | 203 | \subsubsection{Statement Evaluation} 204 | In the context of statement evaluation, range checks are enforced 205 | for both assignments and procedure calls. For an 206 | assignment, a range check on its right-hand side expression is 207 | enforced if the target type of its left-hand side is a range-constrained 208 | type. 209 | For procedure calls, range checks may be required for both input 210 | arguments and output parameters if the types of the input parameters and output 211 | arguments are range-constrained types. 212 | 213 | For a normal assignment evaluation, we first evaluate the right-hand side 214 | expression \textit{e}; if it returns a normal value, we fetch the type of 215 | the left-hand side name \textit{x} and perform a range check before updating 216 | its value, if that type is a range-constrained type.
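These steps can also be read operationally. The following Python sketch is an illustrative model only (the dict-based state and type table are invented for this example); the authoritative definition is the Coq rule \textit{Eval\_Assignment} shown next:

```python
# Illustrative operational reading of the assignment rule: range-check
# the right-hand side value against the target type's bounds (if any),
# then update the store. State and type table are simple dicts here.

class RunTimeError(Exception):
    """Raised when a run-time check fails."""

def do_range_check(v, low, high):
    if not (low <= v <= high):
        raise RunTimeError(f"range check failed: {v} not in {low}..{high}")

def eval_assignment(state, types, x, rhs_value):
    """Assign rhs_value to name x, range-checking it first when x's
    declared type carries a range constraint."""
    t = types[x]                                # fetch the target type
    if "range" in t:                            # extract_subtype_range
        low, high = t["range"]
        do_range_check(rhs_value, low, high)    # do_range_check
    new_state = dict(state)                     # storeUpdate
    new_state[x] = rhs_value
    return new_state

types = {"n": {"range": (1, 10)}}               # subtype n_type is Integer range 1 .. 10
print(eval_assignment({"n": 1}, types, "n", 7)["n"])  # 7
```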
217 | 218 | \begin{lstlisting}[escapechar=\#, language=coq, basicstyle=\small] 219 | Inductive eval_stmt:symboltable -> stack -> stmt -> Return stack -> Prop := 220 | | Eval_Assignment: forall st s e v x t l u s1 ast_num, 221 | eval_expr st s e (Normal (BasicV (Int v))) -> 222 | fetch_exp_type (name_astnum x) st = Some t -> 223 | extract_subtype_range st t (Range l u) -> 224 | do_range_check v l u Success -> 225 | storeUpdate st s x (BasicV (Int v)) s1 -> 226 | eval_stmt st s (Assignment ast_num x e) s1 227 | | ... 228 | \end{lstlisting} 229 | 230 | \subsubsection{Declaration Evaluation} 231 | For an object declaration, a range check is required for the value of its 232 | initialization expression if its declared type is a range-constrained type. 233 | Type declarations and procedure declarations have no effect on the final 234 | stack. 235 | 236 | 237 | 238 | 239 | 240 | 241 | 242 | 243 | 244 | 245 | 246 | -------------------------------------------------------------------------------- /spark2014_semantics/.gitignore: -------------------------------------------------------------------------------- 1 | *.vo 2 | *.vos 3 | *.vok 4 | *.glob 5 | *.v.d 6 | *.native 7 | *.cmx 8 | *.cmi 9 | .Makefile.d 10 | .lia.cache 11 | -------------------------------------------------------------------------------- /spark2014_semantics/src/CpdtTactics.v: -------------------------------------------------------------------------------- 1 | (** 2 | ~~~~~~~~~~~ 3 | BSD LICENSE 4 | ~~~~~~~~~~~ 5 | 6 | Copyright (c) 2006-2013, Adam Chlipala 7 | All rights reserved. 8 | 9 | Redistribution and use in source and binary forms, with or without 10 | modification, are permitted provided that the following conditions are met: 11 | 12 | - Redistributions of source code must retain the above copyright notice, 13 | this list of conditions and the following disclaimer.
14 | - Redistributions in binary form must reproduce the above copyright notice, 15 | this list of conditions and the following disclaimer in the documentation 16 | and/or other materials provided with the distribution. 17 | - The names of contributors may not be used to endorse or promote products 18 | derived from this software without specific prior written permission. 19 | 20 | THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" 21 | AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE 22 | IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE 23 | ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE 24 | LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR 25 | CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF 26 | SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS 27 | INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN 28 | CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) 29 | ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE 30 | POSSIBILITY OF SUCH DAMAGE. 31 | *) 32 | 33 | Require Import Eqdep List. 34 | 35 | Require Export Lia. 36 | 37 | Set Implicit Arguments. 38 | 39 | 40 | (** A version of [injection] that does some standard simplifications afterward: clear the hypothesis in question, bring the new facts above the double line, and attempt substitution for known variables. *) 41 | Ltac inject H := injection H; clear H; intros; try subst. 42 | 43 | (** Try calling tactic function [f] on all hypotheses, keeping the first application that doesn't fail. *) 44 | Ltac appHyps f := 45 | match goal with 46 | | [ H : _ |- _ ] => f H 47 | end. 48 | 49 | (** Succeed iff [x] is in the list [ls], represented with left-associated nested tuples. 
*) 50 | Ltac inList x ls := 51 | match ls with 52 | | x => idtac 53 | | (_, x) => idtac 54 | | (?LS, _) => inList x LS 55 | end. 56 | 57 | (** Try calling tactic function [f] on every element of tupled list [ls], keeping the first call not to fail. *) 58 | Ltac app f ls := 59 | match ls with 60 | | (?LS, ?X) => f X || app f LS || fail 1 61 | | _ => f ls 62 | end. 63 | 64 | (** Run [f] on every element of [ls], not just the first that doesn't fail. *) 65 | Ltac all f ls := 66 | match ls with 67 | | (?LS, ?X) => f X; all f LS 68 | | (_, _) => fail 1 69 | | _ => f ls 70 | end. 71 | 72 | (** Workhorse tactic to simplify hypotheses for a variety of proofs. 73 | Argument [invOne] is a tuple-list of predicates for which we always do inversion automatically. *) 74 | Ltac simplHyp invOne := 75 | (** Helper function to do inversion on certain hypotheses, where [H] is the hypothesis and [F] its head symbol *) 76 | let invert H F := 77 | (** We only proceed for those predicates in [invOne]. *) 78 | inList F invOne; 79 | (** This case covers an inversion that succeeds immediately, meaning no constructors of [F] applied. *) 80 | (inversion H; fail) 81 | (** Otherwise, we only proceed if inversion eliminates all but one constructor case. *) 82 | || (inversion H; [idtac]; clear H; try subst) in 83 | 84 | match goal with 85 | (** Eliminate all existential hypotheses. *) 86 | | [ H : ex _ |- _ ] => destruct H 87 | 88 | (** Find opportunities to take advantage of injectivity of data constructors, for several different arities. *) 89 | | [ H : ?F ?X = ?F ?Y |- ?G ] => 90 | (** This first branch of the [||] fails the whole attempt iff the arguments of the constructor applications are already easy to prove equal. *) 91 | (assert (X = Y); [ assumption | fail 1 ]) 92 | (** If we pass that filter, then we use injection on [H] and do some simplification as in [inject]. 
93 | The odd-looking check of the goal form is to avoid cases where [injection] gives a more complex result because of dependent typing, which we aren't equipped to handle here. *) 94 | || (injection H; 95 | match goal with 96 | | [ |- X = Y -> G ] => 97 | try clear H; intros; try subst 98 | end) 99 | | [ H : ?F ?X ?U = ?F ?Y ?V |- ?G ] => 100 | (assert (X = Y); [ assumption 101 | | assert (U = V); [ assumption | fail 1 ] ]) 102 | || (injection H; 103 | match goal with 104 | | [ |- U = V -> X = Y -> G ] => 105 | try clear H; intros; try subst 106 | end) 107 | 108 | (** Consider some different arities of a predicate [F] in a hypothesis that we might want to invert. *) 109 | | [ H : ?F _ |- _ ] => invert H F 110 | | [ H : ?F _ _ |- _ ] => invert H F 111 | | [ H : ?F _ _ _ |- _ ] => invert H F 112 | | [ H : ?F _ _ _ _ |- _ ] => invert H F 113 | | [ H : ?F _ _ _ _ _ |- _ ] => invert H F 114 | 115 | (** Use an (axiom-dependent!) inversion principle for dependent pairs, from the standard library. *) 116 | | [ H : existT _ ?T _ = existT _ ?T _ |- _ ] => generalize (inj_pair2 _ _ _ _ _ H); clear H 117 | 118 | (** If we're not ready to use that principle yet, try the standard inversion, which often enables the previous rule. *) 119 | | [ H : existT _ _ _ = existT _ _ _ |- _ ] => inversion H; clear H 120 | 121 | (** Similar logic to the cases for constructor injectivity above, but specialized to [Some], since the above cases won't deal with polymorphic constructors. *) 122 | | [ H : Some _ = Some _ |- _ ] => injection H; clear H 123 | end. 124 | 125 | (** Find some hypothesis to rewrite with, ensuring that [auto] proves all of the extra subgoals added by [rewrite]. *) 126 | Ltac rewriteHyp := 127 | match goal with 128 | | [ H : _ |- _ ] => rewrite H by solve [ auto ] 129 | end. 130 | 131 | (** Combine [autorewrite] with automatic hypothesis rewrites. *) 132 | Ltac rewriterP := repeat (rewriteHyp; autorewrite with core in *). 
133 | Ltac rewriter := autorewrite with core in *; rewriterP. 134 | 135 | (** This one is just so darned useful, let's add it as a hint here. *) 136 | Hint Rewrite app_ass. 137 | 138 | (** Devious marker predicate to use for encoding state within proof goals *) 139 | Definition done (T : Type) (x : T) := True. 140 | 141 | (** Try a new instantiation of a universally quantified fact, proved by [e]. 142 | [trace] is an accumulator recording which instantiations we choose. *) 143 | Ltac inster e trace := 144 | (** Does [e] have any quantifiers left? *) 145 | match type of e with 146 | | forall x : _, _ => 147 | (** Yes, so let's pick the first context variable of the right type. *) 148 | match goal with 149 | | [ H : _ |- _ ] => 150 | inster (e H) (trace, H) 151 | | _ => fail 2 152 | end 153 | | _ => 154 | (** No more quantifiers, so now we check if the trace we computed was already used. *) 155 | match trace with 156 | | (_, _) => 157 | (** We only reach this case if the trace is nonempty, ensuring that [inster] fails if no progress can be made. *) 158 | match goal with 159 | | [ H : done (trace, _) |- _ ] => 160 | (** Uh oh, found a record of this trace in the context! Abort to backtrack to try another trace. *) 161 | fail 1 162 | | _ => 163 | (** What is the type of the proof [e] now? *) 164 | let T := type of e in 165 | match type of T with 166 | | Prop => 167 | (** [e] should be thought of as a proof, so let's add it to the context, and also add a new marker hypothesis recording our choice of trace. *) 168 | generalize e; intro; 169 | assert (done (trace, tt)) by constructor 170 | | _ => 171 | (** [e] is something beside a proof. Better make sure no element of our current trace was generated by a previous call to [inster], or we might get stuck in an infinite loop! (We store previous [inster] terms in second positions of tuples used as arguments to [done] in hypotheses. Proofs instantiated by [inster] merely use [tt] in such positions.) 
*) 172 | all ltac:(fun X => 173 | match goal with 174 | | [ H : done (_, X) |- _ ] => fail 1 175 | | _ => idtac 176 | end) trace; 177 | (** Pick a new name for our new instantiation. *) 178 | let i := fresh "i" in (pose (i := e); 179 | assert (done (trace, i)) by constructor) 180 | end 181 | end 182 | end 183 | end. 184 | 185 | (** After a round of application with the above, we will have a lot of junk [done] markers to clean up; hence this tactic. *) 186 | Ltac un_done := 187 | repeat match goal with 188 | | [ H : done _ |- _ ] => clear H 189 | end. 190 | 191 | Require Import JMeq. 192 | 193 | (** A more parameterized version of the famous [crush]. Extra arguments are: 194 | - A tuple-list of lemmas we try [inster]-ing 195 | - A tuple-list of predicates we try inversion for *) 196 | Ltac crush' lemmas invOne := 197 | (** A useful combination of standard automation *) 198 | let sintuition := simpl in *; intuition; try subst; 199 | repeat (simplHyp invOne; intuition; try subst); try congruence in 200 | 201 | (** A fancier version of [rewriter] from above, which uses [crush'] to discharge side conditions *) 202 | let rewriter := autorewrite with core in *; 203 | repeat (match goal with 204 | | [ H : ?P |- _ ] => 205 | match P with 206 | | context[JMeq] => fail 1 (** JMeq is too fancy to deal with here. *) 207 | | _ => rewrite H by crush' lemmas invOne 208 | end 209 | end; autorewrite with core in *) in 210 | 211 | (** Now the main sequence of heuristics: *) 212 | (sintuition; rewriter; 213 | match lemmas with 214 | | false => idtac (** No lemmas? Nothing to do here *) 215 | | _ => 216 | (** Try a loop of instantiating lemmas... *) 217 | repeat ((app ltac:(fun L => inster L L) lemmas 218 | (** ...or instantiating hypotheses... *) 219 | || appHyps ltac:(fun L => inster L L)); 220 | (** ...and then simplifying hypotheses. 
*) 221 | repeat (simplHyp invOne; intuition)); un_done 222 | end; 223 | sintuition; rewriter; sintuition; 224 | (** End with a last attempt to prove an arithmetic fact with [omega], or prove any sort of fact in a context that is contradictory by reasoning that [omega] can do. *) 225 | try lia). (*; try (elimtype False; lia)).*) 226 | 227 | (** [crush] instantiates [crush'] with the simplest possible parameters. *) 228 | Ltac crush := crush' false fail. 229 | 230 | (** Wrap Program's [dependent destruction] in a slightly more pleasant form *) 231 | 232 | Require Import Program.Equality. 233 | 234 | (** Run [dependent destruction] on [E] and look for opportunities to simplify the result. 235 | The weird introduction of [x] helps get around limitations of [dependent destruction], in terms of which sorts of arguments it will accept (e.g., variables bound to hypotheses within Ltac [match]es). *) 236 | Ltac dep_destruct E := 237 | let x := fresh "x" in 238 | remember E as x; simpl in x; dependent destruction x; 239 | try match goal with 240 | | [ H : _ = E |- _ ] => try rewrite <- H in *; clear H 241 | end. 242 | 243 | (** Nuke all hypotheses that we can get away with, without invalidating the goal statement. *) 244 | Ltac clear_all := 245 | repeat match goal with 246 | | [ H : _ |- _ ] => clear H 247 | end. 248 | 249 | (** Instantiate a quantifier in a hypothesis [H] with value [v], or, if [v] doesn't have the right type, with a new unification variable. 250 | Also prove the lefthand sides of any implications that this exposes, simplifying [H] to leave out those implications. 
*) 251 | Ltac guess v H := 252 | repeat match type of H with 253 | | forall x : ?T, _ => 254 | match type of T with 255 | | Prop => 256 | (let H' := fresh "H'" in 257 | assert (H' : T); [ 258 | solve [ eauto 6 ] 259 | | specialize (H H'); clear H' ]) 260 | || fail 1 261 | | _ => 262 | specialize (H v) 263 | || let x := fresh "x" in 264 | evar (x : T); 265 | let x' := eval unfold x in x in 266 | clear x; specialize (H x') 267 | end 268 | end. 269 | 270 | (** Version of [guess] that leaves the original [H] intact *) 271 | Ltac guessKeep v H := 272 | let H' := fresh "H'" in 273 | generalize H; intro H'; guess v H'. 274 | 275 | Create HintDb X_Lib. 276 | 277 | Ltac smack := try autorewrite with X_Lib in *; crush; eauto. 278 | -------------------------------------------------------------------------------- /spark2014_semantics/src/Makefile.conf: -------------------------------------------------------------------------------- 1 | # This configuration file was generated by running: 2 | # coq_makefile -f _CoqProject -o Makefile 3 | 4 | 5 | ############################################################################### 6 | # # 7 | # Project files. # 8 | # # 9 | ############################################################################### 10 | 11 | COQMF_VFILES = CpdtTactics.v list_util.v rt.v ast_basics.v ast.v ast_rt.v ast_util.v values.v environment.v store_util.v symboltable_module.v symboltable.v eval.v eval_rt.v rt_gen.v rt_gen_util.v rt_gen_impl.v rt_gen_impl_consistent.v rt_gen_consistent.v rt_opt_ZArith.v rt_opt.v rt_opt_impl.v rt_opt_impl_consistent.v rt_opt_util.v well_typed.v well_typed_util.v rt_opt_consistent_util.v rt_opt_consistent.v rt_counter.v rt_validator.v 12 | COQMF_MLIFILES = 13 | COQMF_MLFILES = 14 | COQMF_MLGFILES = 15 | COQMF_MLPACKFILES = 16 | COQMF_MLLIBFILES = 17 | COQMF_CMDLINE_VFILES = 18 | 19 | ############################################################################### 20 | # # 21 | # Path directives (-I, -R, -Q). 
# 22 | # # 23 | ############################################################################### 24 | 25 | COQMF_OCAMLLIBS = 26 | COQMF_SRC_SUBDIRS = 27 | COQMF_COQLIBS = -R . spark 28 | COQMF_COQLIBS_NOML = -R . spark 29 | COQMF_CMDLINE_COQLIBS = 30 | 31 | ############################################################################### 32 | # # 33 | # Coq configuration. # 34 | # # 35 | ############################################################################### 36 | 37 | COQMF_LOCAL=1 38 | COQMF_COQLIB=/home/courtieu/coq/8.12// 39 | COQMF_DOCDIR=/home/courtieu/coq/8.12/doc/ 40 | COQMF_OCAMLFIND=/home/courtieu/.opam/4.10.0/bin/ocamlfind 41 | COQMF_CAMLFLAGS=-thread -rectypes -w +a-4-9-27-41-42-44-45-48-58-67 -safe-string -strict-sequence 42 | COQMF_WARN=-warn-error +a-3 43 | COQMF_HASNATDYNLINK=true 44 | COQMF_COQ_SRC_SUBDIRS=config lib clib kernel library engine pretyping interp gramlib gramlib/.pack parsing proofs tactics toplevel printing ide stm vernac plugins/btauto plugins/cc plugins/derive plugins/extraction plugins/firstorder plugins/funind plugins/ltac plugins/micromega plugins/nsatz plugins/omega plugins/rtauto plugins/setoid_ring plugins/ssr plugins/ssrmatching plugins/ssrsearch plugins/syntax 45 | COQMF_WINDRIVE= 46 | 47 | ############################################################################### 48 | # # 49 | # Extra variables. # 50 | # # 51 | ############################################################################### 52 | 53 | COQMF_OTHERFLAGS = 54 | COQMF_INSTALLCOQDOCROOT = spark 55 | -------------------------------------------------------------------------------- /spark2014_semantics/src/README.txt: -------------------------------------------------------------------------------- 1 | Formalization of SPARK 2014 in the Coq Proof Assistant 2 | 3 | =========== 4 | Notes: 5 | =========== 6 | it is compiled with Coq 8.5; 7 | running ./makegen will generate the Makefile; 8 | within the Makefile, replace the following two definitions: 9 | COQLIBS?=\ 10 | -R "." 
Top\ 11 | -I "." 12 | COQDOCLIBS?=\ 13 | -R "." Top 14 | 15 | with: 16 | 17 | COQLIBS?=-I "." 18 | COQDOCLIBS?= 19 | 20 | ============ 21 | Overview 22 | ============ 23 | In the Coq proof assistant, we formalize the semantics of a 24 | subset of the SPARK 2014 language, including arrays, records and nested 25 | procedures. We also define a formal semantics for the 26 | run-time-check-flagged SPARK 2014 language, and prove that 27 | the run-time check flags generated by our formalized run-time check 28 | generator are correct with respect to the SPARK formal semantics. 29 | The formally verified run-time check flags can then be used to 30 | validate the run-time check flags generated by the GNAT front end. 31 | Finally, we formalize some simple optimizations for run-time 32 | checks and prove that any SPARK program evaluates to the same 33 | result before and after the run-time check optimizations. 34 | 35 | 36 | ============ 37 | Coq Source Code 38 | ============ 39 | 40 | - CpdtTactics.v 41 | it provides the tactic called "smack", which builds on the "crush" 42 | tactic implemented by Adam Chlipala in his "Certified Programming 43 | with Dependent Types" book; the "smack" tactic is powerful and can 44 | save a lot of time and proof effort during Coq proofs; 45 | 46 | - LibTactics.v 47 | it contains a set of tactics that extends the set of built-in 48 | tactics provided with the standard distribution of Coq. 49 | It intends to overcome a number of limitations of the 50 | standard set of tactics, and thereby to help users write 51 | shorter and more robust scripts; 52 | 53 | - more_list.v 54 | it provides some list operations, e.g. 
list split, and 55 | corresponding lemmas; 56 | 57 | - language_basics.v 58 | - language_template.v 59 | - language.v 60 | - language_flagged.v 61 | * language_basics.v defines some basic data types and operators 62 | and literals that are used to define the SPARK 2014 AST syntax; 63 | * language_template.v defines a template for the SPARK 2014 AST 64 | syntax; running the script "languagegen" will generate 65 | language.v and language_flagged.v automatically; 66 | * language.v defines the SPARK 2014 AST syntax, which is used 67 | to formalize the SPARK 2014 reference semantics; 68 | * language_flagged.v is the same as language.v except that each 69 | AST node is extended with a set of run-time check flags; this 70 | check-flagged language is used to represent the SPARK ASTs 71 | that are generated by the GNAT front end, which inserts run-time 72 | check flags into the SPARK AST during its static semantic analysis; 73 | * both language_basics.v and language_template.v can be generated 74 | automatically by our Sireum Jago translation tool developed at 75 | K-State, which translates the gnat2xml-generated Ada XML Schema 76 | to (inductive) type definitions in Coq; 77 | (Sireum Jago is illustrated in the following sections) 78 | 79 | - checks.v 80 | - checks_generator.v 81 | - checks_comparison.v 82 | * checks.v defines a subset of run-time check flags that are 83 | required by our formalized SPARK 2014 subset, including 84 | division check, overflow check and range check; it also 85 | defines some comparison functions for run-time check flags; 86 | * checks_generator.v defines the run-time check flag generator 87 | according to the run-time check generation rules defined 88 | in the Ada/SPARK reference manual; the check generator is formalized 89 | both as inductive check generation rules and as a functional check 90 | generation program, which are proved to be semantically equivalent; 91 | * checks_comparison.v defines comparison functions for two 92 | run-time check-flagged SPARK ASTs, and
returns all AST nodes 93 | whose run-time check flags mismatch; furthermore, 94 | they can also be mapped back to the SPARK source code to denote 95 | where the run-time checks are missing or misplaced; 96 | this can be used to compare the GNAT-generated SPARK AST against 97 | our formally verified check-flagged SPARK AST, and report any 98 | missing or misplaced run-time check flags inserted by the GNAT front 99 | end; 100 | 101 | - values.v 102 | it defines values and corresponding operations that are used to define the 103 | SPARK semantics; 104 | 105 | - environment.v 106 | it defines the store and stack as the memory model for evaluating SPARK 107 | programs, including some basic store and stack operations and 108 | useful lemmas; 109 | 110 | - symboltable_module.v 111 | - symboltable.v 112 | * symboltable_module.v defines a template for the symbol table and 113 | its lookup and update operations; the symbol table stores 114 | procedure declarations and type declarations; 115 | * symboltable.v defines the symbol table and its corresponding 116 | operations for SPARK programs (in language.v) and the symbol 117 | table for check-flagged SPARK programs (in language_flagged.v); 118 | 119 | - semantics.v 120 | - semantics_flagged.v 121 | * semantics.v defines the formal semantics for the SPARK 2014 language 122 | (defined in language.v), which is used as the SPARK reference semantics; 123 | * semantics_flagged.v defines the formal semantics for the check-flagged 124 | SPARK 2014 language (defined in language_flagged.v); 125 | 126 | - checks_soundness.v 127 | it proves that the run-time check flags generated by the check generator 128 | are correct with respect to the SPARK reference semantics; so they 129 | can be used to verify the correctness of the run-time check flags 130 | generated by the GNAT front end; 131 | 132 | - checks_optimization.v 133 | it implements some simple run-time check optimizations for linear operations; 134 | 135 | - well_typed.v 136 | - well_check_flagged.v 137 | * 
well_typed.v defines the well-typed stack, where the value of 138 | each variable in the stack should be a value of the type of the 139 | variable itself, and it defines the well-typedness of the SPARK 140 | program, where each AST node should be well-typed; 141 | * well_check_flagged.v specifies that the check-flagged SPARK 142 | program should be annotated with correct run-time check flags; 143 | * both well_typed.v and well_check_flagged.v are used as 144 | constraints on SPARK programs to prove the soundness of 145 | the run-time check optimizations; 146 | 147 | - checks_optimization_soundness.v 148 | * it proves that the evaluation of a SPARK program is the 149 | same before and after the run-time check optimizations; 150 | 151 | 152 | ==================== 153 | Compile/Document Coq Files 154 | ==================== 155 | 156 | - makegen 157 | it can be edited and executed (by running "makegen" in a terminal) 158 | to generate the Makefile; then run "make" in a terminal to compile 159 | the above Coq source files; 160 | 161 | - htmlgen 162 | it can be edited and executed (by running "htmlgen" in a terminal) 163 | to generate HTML files for the above Coq source files; 164 | 165 | ================= 166 | Sireum Jago Translators 167 | ================= 168 | - Sireum Jago performs two kinds of translation: 169 | * type translation, which translates the gnat2xml-generated XML Schema 170 | to (inductive) type definitions in Coq 171 | * program translation, which translates SPARK programs from 172 | the gnat2xml-generated XML AST to the Coq AST; 173 | 174 | 175 | ================= 176 | Getting Started 177 | ================= 178 | 179 | 1. Download and install Sireum (stable version) following the instructions 180 | at: http://www.sireum.org/software.html 181 | 182 | 2. Run Sireum to test SPARK Type and Program Translation 183 | 184 | In a terminal: 185 | 2.1. 
command: "sireum bakar type -t Coq" 186 | will translate the SPARK XML Schema into SPARK type definitions in Coq; 187 | 188 | 2.2. command: "sireum bakar program -p Coq []" 189 | will translate the SPARK program into a Coq AST; 190 | 191 | 3. command "./test_demo coq_ast_source_file_name target_file_name.v" 192 | will verify the run-time check flags generated by the GNAT front 193 | end against our formally verified run-time check flags and 194 | report any mismatching run-time check flags and map them back 195 | to the SPARK source programs; 196 | * test_demo is a script program (put under the same directory as 197 | the other Coq source files) that does the run-time check 198 | comparison and maps the errors back to the SPARK source code; 199 | * coq_ast_source_file_name is a file containing the Coq AST generated 200 | in step 2.2; 201 | * target_file_name can be any target Coq file name; 202 | 203 | 204 | ======================== 205 | Limitations of the SPARK 2014 Subset 206 | ======================== 207 | 208 | - no support for functions and function calls yet 209 | - no support for the "return" statement 210 | - no support for the "or_else_short_circuit" or "and_then_short_circuit" operators 211 | - no support for strings 212 | - ... 213 | 214 | 215 | 216 | 217 | 218 | 219 | 220 | 221 | 222 | 223 | 224 | 225 | 226 | 227 | 228 | 229 | 230 | 231 | 232 | 233 | 234 | 235 | 236 | 237 | 238 | 239 | 240 | 241 | 242 | 243 | 244 | -------------------------------------------------------------------------------- /spark2014_semantics/src/_CoqProject: -------------------------------------------------------------------------------- 1 | -R . 
spark 2 | CpdtTactics.v 3 | list_util.v 4 | rt.v 5 | ast_basics.v 6 | ast.v 7 | ast_rt.v 8 | ast_util.v 9 | values.v 10 | environment.v 11 | store_util.v 12 | symboltable_module.v 13 | symboltable.v 14 | eval.v 15 | eval_rt.v 16 | rt_gen.v 17 | rt_gen_util.v 18 | rt_gen_impl.v 19 | rt_gen_impl_consistent.v 20 | rt_gen_consistent.v 21 | rt_opt_ZArith.v 22 | rt_opt.v 23 | rt_opt_impl.v 24 | rt_opt_impl_consistent.v 25 | rt_opt_util.v 26 | well_typed.v 27 | well_typed_util.v 28 | rt_opt_consistent_util.v 29 | rt_opt_consistent.v 30 | rt_counter.v 31 | rt_validator.v 32 | 33 | -------------------------------------------------------------------------------- /spark2014_semantics/src/ast.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export ast_basics. 13 | 14 | (** This file can be auto-generated from language_template.v by running languagegen in a terminal *) 15 | 16 | (** * SPARK Subset Language *) 17 | 18 | (** We use the Ada terminology to define the terms of this subset 19 | language, which makes it easy for Ada (SPARK) users to read; 20 | besides, we also indicate the reference chapter (for example, 3.5.3) 21 | in the Ada 2012 RM, and formalize the language in roughly the same 22 | order as it is defined in the Ada 2012 RM; 23 | *) 24 | 25 | (* Ada 2012 RM, Chapter 3. 
Declaration and Types *) 26 | 27 | (** ** Expression *) 28 | (* Chapter 4 *) 29 | 30 | Inductive exp: Type := 31 | | Literal: astnum -> literal -> exp (* 4.2 *) 32 | | Name: astnum -> name -> exp (* 4.1 *) 33 | | BinOp: astnum -> binary_operator -> exp -> exp -> exp (* 4.5.3 and 4.5.5 *) 34 | | UnOp: astnum -> unary_operator -> exp -> exp (* 4.5.4 *) 35 | 36 | (** in IndexedComponent, the first astnum is the ast number for the indexed component, 37 | and the second astnum is the ast number for the array represented by name; 38 | in SelectedComponent, the first astnum is the ast number for the record field, 39 | and the second astnum is the ast number for the record represented by name; 40 | *) 41 | with name: Type := (* 4.1 *) 42 | | Identifier: astnum -> idnum -> name (* 4.1 *) 43 | | IndexedComponent: astnum -> name -> exp -> name (* 4.1.1 *) 44 | | SelectedComponent: astnum -> name -> idnum -> name (* 4.1.3 *). 45 | 46 | (** Induction scheme for exp and name *) 47 | (** 48 | Scheme exp_ind := Induction for exp Sort Prop 49 | with name_ind := Induction for name Sort Prop. 50 | *) 51 | 52 | (** ** Statement *) 53 | (* Chapter 5 *) 54 | (* Sequence is not a statement in Ada, it's a shortcut for now; 55 | check flags can be easily added if they are needed later; 56 | *) 57 | Inductive stmt: Type := 58 | | Null: stmt (* 5.1 *) 59 | | Assign: astnum -> name -> exp -> stmt (* 5.2 *) 60 | | If: astnum -> exp -> stmt -> stmt -> stmt (* 5.3 *) 61 | | While: astnum -> exp -> stmt -> stmt (* 5.5 *) 62 | | Call: astnum -> astnum -> procnum -> list exp -> stmt (* 6.4 *) (* the second astnum for the called procedure *) 63 | | Seq: astnum -> stmt -> stmt -> stmt (* 5.1 *). 64 | 65 | (** it's used for subtype declarations: 66 | - subtype declaration, e.g. subtype MyInt is Integer range 0 .. 5; 67 | - derived type declaration, e.g. type MyInt is new Integer range 1 .. 100; 68 | - integer type declaration, e.g. type MyInt is range 1 .. 
10; 69 | *) 70 | Inductive range: Type := Range (l: Z) (u: Z). (* 3.5 *) 71 | 72 | (** ** Type Declaration *) 73 | Inductive typeDecl: Type := (* 3.2.1 *) 74 | | SubtypeDecl: 75 | astnum -> typenum (*subtype name*) -> type -> range -> typeDecl (* 3.2.2 *) 76 | | DerivedTypeDecl: 77 | astnum -> typenum (*derived type name*) -> type -> range -> typeDecl (* 3.4 *) 78 | | IntegerTypeDecl: 79 | astnum -> typenum (*integer type name*) -> range -> typeDecl (* 3.5.4 *) 80 | | ArrayTypeDecl: (* Constrained_Array_Definition, non-nested one-dimensional array *) 81 | astnum -> typenum (*array type name*) -> type (*index subtype mark*) -> type (*component type*) -> typeDecl (* 3.6 *) 82 | | RecordTypeDecl: 83 | astnum -> typenum (*record type name*) -> list (idnum * type (*field type*)) -> typeDecl (* 3.8 *). 84 | 85 | (* 3.3.1 *) 86 | Record objDecl: Type := mkobjDecl{ 87 | declaration_astnum: astnum; 88 | object_name: idnum; 89 | object_nominal_subtype: type; 90 | initialization_exp: option (exp) 91 | }. 92 | 93 | (* 6.1 (15/3) *) 94 | Record paramSpec: Type := mkparamSpec{ 95 | parameter_astnum: astnum; 96 | parameter_name: idnum; 97 | parameter_subtype_mark: type; 98 | parameter_mode: mode 99 | (* parameter_default_exp: option (exp) *) 100 | }. 101 | 102 | (** ** Declaration *) 103 | (* Mutual records/inductives are not allowed in Coq, so we build a record by hand. *) 104 | Inductive decl: Type := (* 3.1 *) 105 | | NullDecl: decl 106 | | TypeDecl: astnum -> typeDecl -> decl (* 3.2.1 *) 107 | | ObjDecl: astnum -> objDecl -> decl (* 3.3.1 *) 108 | | ProcBodyDecl: astnum -> procBodyDecl -> decl (* 6.1 *) 109 | | SeqDecl: astnum -> decl -> decl -> decl (* it's introduced for easy proof *) 110 | 111 | (** ** Procedure *) 112 | with procBodyDecl: Type := 113 | mkprocBodyDecl 114 | (procedure_astnum: astnum) 115 | (procedure_name: procnum) 116 | (procedure_parameter_profile: list paramSpec) 117 | (procedure_declarative_part: decl) 118 | (procedure_statements: stmt). 
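To make the AST constructors above concrete, here is a hypothetical sketch of how the SPARK statement "X := X + 1" could be encoded as a term of the `stmt` type; the AST numbers 1..6 and the idnum 1 for X are arbitrary choices, not values produced by the actual front end.

```coq
(* Sketch: "X := X + 1" as an ast.v term; AST numbers and the idnum are arbitrary. *)
Definition example_assign : stmt :=
  Assign 1 (Identifier 2 1)
           (BinOp 3 Plus
                  (Name 4 (Identifier 5 1))
                  (Literal 6 (Integer_Literal 1%Z))).
```

Every subterm carries its own `astnum`, which is what later lets check flags and errors be mapped back to individual source locations.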
119 | 120 | (** * Program *) 121 | (** A program is composed of a sequence of (1) type declarations, (2) variable declarations 122 | (3) procedure declarations, where 'main' is the main procedure (with empty parameters) working 123 | as the entry point of the whole program *) 124 | 125 | Record program : Type := mkprogram{ 126 | decls: decl; 127 | main: procnum 128 | }. 129 | 130 | (** * Auxiliary Functions *) 131 | 132 | Section AuxiliaryFunctions. 133 | 134 | Definition procedure_statements pb := 135 | match pb with 136 | | mkprocBodyDecl _ _ _ _ x => x 137 | end. 138 | 139 | Definition procedure_declarative_part pb := 140 | match pb with 141 | | mkprocBodyDecl _ _ _ x _ => x 142 | end. 143 | 144 | Definition procedure_parameter_profile pb := 145 | match pb with 146 | | mkprocBodyDecl _ _ x _ _ => x 147 | end. 148 | 149 | Definition procedure_name pb := 150 | match pb with 151 | | mkprocBodyDecl _ x _ _ _ => x 152 | end. 153 | 154 | Definition type_name td := 155 | match td with 156 | | SubtypeDecl _ tn _ _ => tn 157 | | DerivedTypeDecl _ tn _ _ => tn 158 | | IntegerTypeDecl _ tn _ => tn 159 | | ArrayTypeDecl _ tn _ _ => tn 160 | | RecordTypeDecl _ tn _ => tn 161 | end. 162 | 163 | Definition subtype_range (t: typeDecl): option range := 164 | match t with 165 | | SubtypeDecl ast_num tn t r => Some r 166 | | DerivedTypeDecl ast_num tn t r => Some r 167 | | IntegerTypeDecl ast_num tn r => Some r 168 | | _ => None 169 | end. 170 | 171 | Definition expression_astnum e := 172 | match e with 173 | | Literal ast_num l => ast_num 174 | | Name ast_num n => ast_num 175 | | BinOp ast_num bop e1 e2 => ast_num 176 | | UnOp ast_num uop e => ast_num 177 | end. 178 | 179 | Definition name_astnum n := 180 | match n with 181 | | Identifier ast_num x => ast_num 182 | | IndexedComponent ast_num x e => ast_num 183 | | SelectedComponent ast_num x f => ast_num 184 | end. 185 | 186 | End AuxiliaryFunctions. 
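The auxiliary functions above are simple projections, so they can be sanity-checked by computation; the following examples are illustrative sketches with arbitrary ast/type numbers (1, 7, 6), not part of the original development.

```coq
(* Sketch: subtype_range projects the range out of a range-constrained declaration. *)
Example subtype_range_ex :
  subtype_range (SubtypeDecl 1 7 Integer (Range 0%Z 5%Z)) = Some (Range 0%Z 5%Z).
Proof. reflexivity. Qed.

(* Sketch: expression_astnum returns the top-level AST number of an expression. *)
Example expression_astnum_ex :
  expression_astnum (Literal 6 (Integer_Literal 1%Z)) = 6.
Proof. reflexivity. Qed.
```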
187 | 188 | 189 | 190 | 191 | 192 | 193 | -------------------------------------------------------------------------------- /spark2014_semantics/src/ast_basics.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export ZArith. 13 | Require Export Coq.Lists.List. 14 | Require Export Coq.Bool.Bool. 15 | Require Export CpdtTactics. 16 | Require Export list_util. 17 | 18 | (** This file defines some basic data types and operations used for 19 | the formalization of the SPARK 2014 language; 20 | *) 21 | 22 | (** * Identifiers *) 23 | 24 | (** A distinct AST number labels each AST node; it is not part 25 | of the SPARK 2014 language, but it is introduced to facilitate the 26 | formalization of the language and error debugging. For example, the 27 | AST number can be used to map error information back to the SPARK 28 | source code, and it can also be used to map each AST node to its type 29 | information; 30 | *) 31 | 32 | Definition astnum := nat. 33 | 34 | (** In CompCert, non-negative values are used to represent identifiers; 35 | we take the same approach and represent identifiers/names as natural numbers: 36 | - idnum: represents declared variables; 37 | - procnum: represents declared procedure names; 38 | - pkgnum: represents package names; 39 | - typenum: represents declared type names; 40 | *) 41 | 42 | Definition idnum := nat. 43 | 44 | Definition procnum := nat. 45 | 46 | Definition pkgnum := nat. 47 | 48 | Definition typenum := nat. 49 | 50 | Definition arrindex := Z. (* array index *) 51 | 52 | (* 53 | Record type_table: Type := mktype_table{ 54 | tt_exptype_table: list (astnum * typenum); 55 | tt_typename_table: list (typenum * (typeuri * option typedeclnum)) 
57 | *) 58 | 59 | (** * Data Types *) 60 | (** in SPARK, data types can be boolean, integer, subtype, array/record 61 | types and others; typenum denotes the declared subtype names or array/record 62 | type names; 63 | 64 | In SPARK, a subtype can be declared in the following ways: 65 | - subtype declaration, e.g. subtype MyInt is Integer range 0 .. 5; 66 | - derived type declaration, e.g. type MyInt is new Integer range 1 .. 100; 67 | - integer type declaration, e.g. type MyInt is range 1 .. 10; 68 | 69 | Note: for now we only consider the 32-bit signed integer type for 70 | our SPARK subset language, and model it with Integer; actually, 71 | SPARK has various integer types; we can extend our types by 72 | adding more SPARK types here and adding their corresponding value 73 | definitions in values.v; 74 | *) 75 | 76 | Inductive type: Type := 77 | | Boolean (* 3.5.3 *) 78 | | Integer (* 3.5.4 *) 79 | | Subtype (t: typenum) (* 3.2.2 *) (* t: declared subtype name *) 80 | | Derived_Type (t: typenum) (* 3.4 *) 81 | | Integer_Type (t: typenum) (* 3.5.4 *) 82 | | Array_Type (t: typenum) (* 3.6 *) (* t: declared array type name *) 83 | | Record_Type (t: typenum) (* 3.8 *) (* t: declared record type name *). 84 | 85 | (** * In/Out Mode *) 86 | 87 | (* 6.2 *) 88 | Inductive mode: Type := 89 | | In 90 | | Out 91 | | In_Out. 92 | 93 | (** * Operators *) 94 | 95 | (** unary and binary operators *) 96 | Inductive unary_operator: Type := 97 | (* | Unary_Plus: unary_operator *) 98 | | Unary_Minus: unary_operator 99 | | Not: unary_operator. 
100 | 101 | Inductive binary_operator: Type := 102 | | And: binary_operator 103 | | Or: binary_operator 104 | | Equal: binary_operator 105 | | Not_Equal: binary_operator 106 | | Less_Than: binary_operator 107 | | Less_Than_Or_Equal: binary_operator 108 | | Greater_Than: binary_operator 109 | | Greater_Than_Or_Equal: binary_operator 110 | | Plus: binary_operator 111 | | Minus: binary_operator 112 | | Multiply: binary_operator 113 | | Divide: binary_operator 114 | | Modulus: binary_operator. 115 | 116 | (** * Literals *) 117 | 118 | Inductive literal: Type := 119 | | Integer_Literal: Z -> literal (* 2.4 *) 120 | | Boolean_Literal: bool -> literal (* 3.5.3 *). 121 | 122 | (** * Auxiliary Functions *) 123 | 124 | Section LB_AuxiliaryFunctions. 125 | 126 | (** it will be used to determine whether to insert a range check *) 127 | Definition is_range_constrainted_type (t: type): bool := 128 | match t with 129 | | Subtype _ => true 130 | | Derived_Type _ => true 131 | | Integer_Type _ => true 132 | | _ => false 133 | end. 134 | 135 | Definition subtype_num (t: type): option typenum := 136 | match t with 137 | | Subtype tn => Some tn 138 | | Derived_Type tn => Some tn 139 | | Integer_Type tn => Some tn 140 | | _ => None 141 | end. 142 | 143 | Definition beq_type (t1 t2: type): bool := 144 | match t1, t2 with 145 | | Boolean, Boolean => true 146 | | Integer, Integer => true 147 | | Subtype t1', Subtype t2' => beq_nat t1' t2' 148 | | Derived_Type t1', Derived_Type t2' => beq_nat t1' t2' 149 | | Integer_Type t1', Integer_Type t2' => beq_nat t1' t2' 150 | | Array_Type t1', Array_Type t2' => beq_nat t1' t2' 151 | | Record_Type t1', Record_Type t2' => beq_nat t1' t2' 152 | | _, _ => false 153 | end. 154 | 155 | Definition is_in_mode (x: mode): bool := 156 | match x with 157 | | In => true 158 | | _ => false 159 | end. 160 | 161 | Definition is_out_mode (x: mode): bool := 162 | match x with 163 | | Out => true 164 | | _ => false 165 | end. 
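As a quick illustration of these boolean helpers, the following examples (a sketch added here, with the typenum 3 chosen arbitrarily) check `beq_type` and the mode predicates by computation.

```coq
(* Sketch: beq_type compares constructors and, where present, typenums. *)
Example beq_type_subtype_ex : beq_type (Subtype 3) (Subtype 3) = true.
Proof. reflexivity. Qed.

Example beq_type_kind_ex : beq_type Integer Boolean = false.
Proof. reflexivity. Qed.

(* Sketch: the mode predicates recognize exactly their own constructor. *)
Example mode_predicates_ex : is_in_mode In && is_out_mode Out = true.
Proof. reflexivity. Qed.
```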
166 | 167 | Definition is_in_out_mode (x: mode): bool := 168 | match x with 169 | | In_Out => true 170 | | _ => false 171 | end. 172 | 173 | Lemma beq_type_refl: forall t, 174 | true = beq_type t t. 175 | Proof. 176 | intros; destruct t; 177 | smack; apply beq_nat_refl. 178 | Qed. 179 | 180 | Lemma beq_type_eq: forall t1 t2, 181 | true = beq_type t1 t2 -> 182 | t1 = t2. 183 | Proof. 184 | intros; destruct t1, t2; smack; 185 | specialize (beq_nat_eq _ _ H); smack. 186 | Qed. 187 | 188 | End LB_AuxiliaryFunctions. 189 | -------------------------------------------------------------------------------- /spark2014_semantics/src/ast_rt.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export ast. 13 | Require Export rt. 14 | 15 | (** This file can be auto-generated from language_template.v by running languagegen in a terminal *) 16 | 17 | (** * SPARK Subset Language with Run-Time Check Decorations *) 18 | 19 | (** We use the Ada terminology to define the terms of this subset 20 | language, which makes it easy for Ada (SPARK) users to read; 21 | besides, we also indicate the reference chapter (for example, 3.5.3) 22 | in the Ada 2012 RM, and formalize the language in roughly the same 23 | order as it is defined in the Ada 2012 RM; 24 | *) 25 | 26 | (* Ada 2012 RM, Chapter 3. 
Declaration and Types *) 27 | 28 | (** ** Expression + RT *) 29 | (* Chapter 4 *) 30 | 31 | Inductive expRT: Type := 32 | | LiteralRT: astnum -> literal -> interior_checks -> exterior_checks -> expRT (* 4.2 *) 33 | | NameRT: astnum -> nameRT -> expRT (* 4.1 *) 34 | | BinOpRT: astnum -> binary_operator -> expRT -> expRT -> interior_checks -> exterior_checks -> expRT (* 4.5.3 and 4.5.5 *) 35 | | UnOpRT: astnum -> unary_operator -> expRT -> interior_checks -> exterior_checks -> expRT (* 4.5.4 *) 36 | 37 | (** in IndexedComponentRT, the first astnum is the ast number for the indexed component, 38 | and the second astnum is the ast number for array represented by nameRT; 39 | in SelectedComponentRT, the first astnum is the ast number for the record field, 40 | and second astnum is the ast number for record represented by nameRT; 41 | *) 42 | with nameRT: Type := (* 4.1 *) 43 | | IdentifierRT: astnum -> idnum -> exterior_checks -> nameRT (* 4.1 *) 44 | | IndexedComponentRT: astnum -> nameRT -> expRT -> exterior_checks -> nameRT (* 4.1.1 *) 45 | | SelectedComponentRT: astnum -> nameRT -> idnum -> exterior_checks -> nameRT (* 4.1.3 *). 46 | 47 | (** Induction scheme for expRT and nameRT *) 48 | (** 49 | Scheme expRT_ind := Induction for expRT Sort Prop 50 | with nameRT_ind := Induction for nameRT Sort Prop. 51 | *) 52 | 53 | (** ** Statement + RT *) 54 | (* Chapter 5 *) 55 | (* Sequence is not a statement in Ada, it's a shortcut for now; 56 | check flags can be easily added if they are needed later; 57 | *) 58 | Inductive stmtRT: Type := 59 | | NullRT: stmtRT (* 5.1 *) 60 | | AssignRT: astnum -> nameRT -> expRT -> stmtRT (* 5.2 *) 61 | | IfRT: astnum -> expRT -> stmtRT -> stmtRT -> stmtRT (* 5.3 *) 62 | | WhileRT: astnum -> expRT -> stmtRT -> stmtRT (* 5.5 *) 63 | | CallRT: astnum -> astnum -> procnum -> list expRT -> stmtRT (* 6.4 *) (* the second astnum for the called procedure *) 64 | | SeqRT: astnum -> stmtRT -> stmtRT -> stmtRT (* 5.1 *). 
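To show how the RT decorations attach to the plain AST, here is a sketch of the decorated counterpart of the expression "X + 1". It assumes that `interior_checks` and `exterior_checks` (defined in rt.v, not shown in this excerpt) are list types, so that `nil` denotes "no check flags recorded"; the AST numbers and the idnum for X are arbitrary.

```coq
(* Sketch only: RT-decorated "X + 1", assuming the check-flag types are
   lists so that nil means no checks; all numbers are arbitrary. *)
Definition example_expRT : expRT :=
  BinOpRT 3 Plus
          (NameRT 4 (IdentifierRT 5 1 nil))
          (LiteralRT 6 (Integer_Literal 1%Z) nil nil)
          nil nil.
```

In the actual development the GNAT front end, and the verified generator in rt_gen.v, populate these slots with concrete flags such as overflow checks.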
65 | 66 | (** it's used for subtype declarations: 67 | - subtype declaration, e.g. subtype MyInt is Integer range 0 .. 5; 68 | - derived type declaration, e.g. type MyInt is new Integer range 1 .. 100; 69 | - integer type declaration, e.g. type MyInt is range 1 .. 10; 70 | *) 71 | Inductive rangeRT: Type := RangeRT (l: Z) (u: Z). (* 3.5 *) 72 | 73 | (** ** Type Declaration + RT *) 74 | Inductive typeDeclRT: Type := (* 3.2.1 *) 75 | | SubtypeDeclRT: 76 | astnum -> typenum (*subtype name*) -> type -> rangeRT -> typeDeclRT (* 3.2.2 *) 77 | | DerivedTypeDeclRT: 78 | astnum -> typenum (*derived type name*) -> type -> rangeRT -> typeDeclRT (* 3.4 *) 79 | | IntegerTypeDeclRT: 80 | astnum -> typenum (*integer type name*) -> rangeRT -> typeDeclRT (* 3.5.4 *) 81 | | ArrayTypeDeclRT: (* Constrained_Array_Definition, non-nested one-dimensional array *) 82 | astnum -> typenum (*array type name*) -> type (*index subtype mark*) -> type (*component type*) -> typeDeclRT (* 3.6 *) 83 | | RecordTypeDeclRT: 84 | astnum -> typenum (*record type name*) -> list (idnum * type (*field type*)) -> typeDeclRT (* 3.8 *). 85 | 86 | (* 3.3.1 *) 87 | Record objDeclRT: Type := mkobjDeclRT{ 88 | declaration_astnum_rt: astnum; 89 | object_nameRT: idnum; 90 | object_nominal_subtype_rt: type; 91 | initialization_expRT: option (expRT) 92 | }. 93 | 94 | (* 6.1 (15/3) *) 95 | Record paramSpecRT: Type := mkparamSpecRT{ 96 | parameter_astnum_rt: astnum; 97 | parameter_nameRT: idnum; 98 | parameter_subtype_mark_rt: type; 99 | parameter_mode_rt: mode 100 | (* parameter_default_expRT: option (expRT) *) 101 | }. 102 | 103 | (** ** Declaration + RT *) 104 | (* Mutual records/inductives are not allowed in Coq, so we build a record by hand. 
*) 105 | Inductive declRT: Type := (* 3.1 *) 106 | | NullDeclRT: declRT 107 | | TypeDeclRT: astnum -> typeDeclRT -> declRT (* 3.2.1 *) 108 | | ObjDeclRT: astnum -> objDeclRT -> declRT (* 3.3.1 *) 109 | | ProcBodyDeclRT: astnum -> procBodyDeclRT -> declRT (* 6.1 *) 110 | | SeqDeclRT: astnum -> declRT -> declRT -> declRT (* it's introduced for easy proof *) 111 | 112 | (** ** Procedure + RT *) 113 | with procBodyDeclRT: Type := 114 | mkprocBodyDeclRT 115 | (procedure_astnum_rt: astnum) 116 | (procedurNameRT: procnum) 117 | (procedure_parameter_profile_rt: list paramSpecRT) 118 | (procedure_declarative_part_rt: declRT) 119 | (procedure_statements_rt: stmtRT). 120 | 121 | 122 | (** ** Program + RT *) 123 | 124 | Record programRT : Type := mkprogramRT{ 125 | declsRT: declRT; 126 | mainRT: procnum 127 | }. 128 | 129 | (** * Auxiliary Functions *) 130 | 131 | Section AuxiliaryFunctions_RT. 132 | 133 | Definition procedure_statements_rt pb := 134 | match pb with 135 | | mkprocBodyDeclRT _ _ _ _ x => x 136 | end. 137 | 138 | Definition procedure_declarative_part_rt pb := 139 | match pb with 140 | | mkprocBodyDeclRT _ _ _ x _ => x 141 | end. 142 | 143 | Definition procedure_parameter_profile_rt pb := 144 | match pb with 145 | | mkprocBodyDeclRT _ _ x _ _ => x 146 | end. 147 | 148 | Definition procedur_name_rt pb := 149 | match pb with 150 | | mkprocBodyDeclRT _ x _ _ _ => x 151 | end. 152 | 153 | Definition type_name_rt td := 154 | match td with 155 | | SubtypeDeclRT _ tn _ _ => tn 156 | | DerivedTypeDeclRT _ tn _ _ => tn 157 | | IntegerTypeDeclRT _ tn _ => tn 158 | | ArrayTypeDeclRT _ tn _ _ => tn 159 | | RecordTypeDeclRT _ tn _ => tn 160 | end. 161 | 162 | Definition subtype_range_rt (t: typeDeclRT): option rangeRT := 163 | match t with 164 | | SubtypeDeclRT ast_num tn t r => Some r 165 | | DerivedTypeDeclRT ast_num tn t r => Some r 166 | | IntegerTypeDeclRT ast_num tn r => Some r 167 | | _ => None 168 | end. 
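As with the plain-AST projections, `subtype_range_rt` can be sanity-checked by computation; the example below is an illustrative sketch with arbitrary ast/type numbers (1, 9).

```coq
(* Sketch: subtype_range_rt extracts the range from an integer type
   declaration such as "type MyInt is range 1 .. 10". *)
Example subtype_range_rt_ex :
  subtype_range_rt (IntegerTypeDeclRT 1 9 (RangeRT 1%Z 10%Z))
  = Some (RangeRT 1%Z 10%Z).
Proof. reflexivity. Qed.
```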
169 | 170 | Definition expression_astnum_rt e := 171 | match e with 172 | | LiteralRT ast_num l in_checks ex_checks => ast_num 173 | | NameRT ast_num n => ast_num 174 | | BinOpRT ast_num bop e1 e2 in_checks ex_checks => ast_num 175 | | UnOpRT ast_num uop e in_checks ex_checks => ast_num 176 | end. 177 | 178 | Definition name_astnum_rt n := 179 | match n with 180 | | IdentifierRT ast_num x ex_checks => ast_num 181 | | IndexedComponentRT ast_num x e ex_checks => ast_num 182 | | SelectedComponentRT ast_num x f ex_checks => ast_num 183 | end. 184 | 185 | End AuxiliaryFunctions_RT. 186 | 187 | 188 | 189 | 190 | 191 | 192 | 193 | 194 | 195 | 196 | 197 | -------------------------------------------------------------------------------- /spark2014_semantics/src/ast_template.v: -------------------------------------------------------------------------------- 1 | (*_1_Require Export language_basics._1_*) 2 | (*_2_Require Export language._2_*) 3 | (*_2_Require Export checks._2_*) 4 | 5 | (** NOTICE *) 6 | 7 | (** * SPARK Subset Language *) 8 | 9 | (** We use the Ada terminology to define the terms of this subset 10 | language, which makes it easy for Ada (SPARK) users to read; 11 | besides, we also indicate the reference chapter (for example, 3.5.3) 12 | in the Ada 2012 RM, and formalize the language in roughly the same 13 | order as it is defined in the Ada 2012 RM; 14 | *) 15 | 16 | (* Ada 2012 RM, Chapter 3. 
Declarations and Types *)
17 | 
18 | (** ** Expressions *)
19 | (* Chapter 4 *)
20 | 
21 | Inductive expression_xx: Type := 
22 |   | E_Literal_XX: astnum -> literal (*e_checks*)-> expression_xx (* 4.2 *)
23 |   | E_Name_XX: astnum -> name_xx -> expression_xx (* 4.1 *)
24 |   | E_Binary_Operation_XX: astnum -> binary_operator -> expression_xx -> expression_xx (*e_checks*)-> expression_xx (* 4.5.3 and 4.5.5 *)
25 |   | E_Unary_Operation_XX: astnum -> unary_operator -> expression_xx (*e_checks*)-> expression_xx (* 4.5.4 *)
26 | 
27 | (** in E_Indexed_Component_XX, the first astnum is the ast number for the indexed component,
28 |     and the second astnum is the ast number for the array represented by name_xx;
29 |     in E_Selected_Component_XX, the first astnum is the ast number for the record field,
30 |     and the second astnum is the ast number for the record represented by name_xx;
31 | *)
32 | with name_xx: Type := (* 4.1 *)
33 |   | E_Identifier_XX: astnum -> idnum (*n_checks*)-> name_xx (* 4.1 *)
34 |   | E_Indexed_Component_XX: astnum -> name_xx -> expression_xx (*n_checks*)-> name_xx (* 4.1.1 *)
35 |   | E_Selected_Component_XX: astnum -> name_xx -> idnum (*n_checks*)-> name_xx (* 4.1.3 *).
36 | 
37 | (** Induction scheme for expression_xx and name_xx *)
38 | (**
39 |    Scheme expression_xx_ind := Induction for expression_xx Sort Prop
40 |    with name_xx_ind := Induction for name_xx Sort Prop.
41 | *)
42 | 
43 | (** ** Statements *)
44 | (* Chapter 5 *)
45 | (* Sequence is not a statement in Ada; it's a shortcut for now.
46 |    Check flags can easily be added later if they are needed.
47 | *)
48 | Inductive statement_xx: Type := 
49 |   | S_Null_XX: statement_xx (* 5.1 *)
50 |   | S_Assignment_XX: astnum -> name_xx -> expression_xx -> statement_xx (* 5.2 *)
51 |   | S_If_XX: astnum -> expression_xx -> statement_xx -> statement_xx -> statement_xx (* 5.3 *)
52 |   | S_While_Loop_XX: astnum -> expression_xx -> statement_xx -> statement_xx (* 5.5 *)
53 |   | S_Procedure_Call_XX: astnum -> astnum -> procnum -> list expression_xx -> statement_xx (* 6.4 *) (* the second astnum is for the called procedure *)
54 |   | S_Sequence_XX: astnum -> statement_xx -> statement_xx -> statement_xx (* 5.1 *).
55 | 
56 | (** a range is used in the following type declarations:
57 |     - subtype declaration, e.g. subtype MyInt is Integer range 0 .. 5;
58 |     - derived type declaration, e.g. type MyInt is new Integer range 1 .. 100;
59 |     - integer type declaration, e.g. type MyInt is range 1 .. 10;
60 | *)
61 | Inductive range_xx: Type := Range_XX (l: Z) (u: Z).
(* 3.5 *)
62 | 
63 | (** ** Type Declarations *)
64 | Inductive type_declaration_xx: Type := (* 3.2.1 *)
65 |   | Subtype_Declaration_XX:
66 |       astnum -> typenum (*subtype name*) -> type -> range_xx -> type_declaration_xx (* 3.2.2 *)
67 |   | Derived_Type_Declaration_XX:
68 |       astnum -> typenum (*derived type name*) -> type -> range_xx -> type_declaration_xx (* 3.4 *)
69 |   | Integer_Type_Declaration_XX:
70 |       astnum -> typenum (*integer type name*) -> range_xx -> type_declaration_xx (* 3.5.4 *)
71 |   | Array_Type_Declaration_XX: (* Constrained_Array_Definition, non-nested one-dimensional array *)
72 |       astnum -> typenum (*array type name*) -> type (*index subtype mark*) -> type (*component type*) -> type_declaration_xx (* 3.6 *)
73 |   | Record_Type_Declaration_XX:
74 |       astnum -> typenum (*record type name*) -> list (idnum * type (*field type*)) -> type_declaration_xx (* 3.8 *).
75 | 
76 | (* 3.3.1 *)
77 | Record object_declaration_xx: Type := mkobject_declaration_xx{
78 |   declaration_astnum_xx: astnum;
79 |   object_name_xx: idnum;
80 |   object_nominal_subtype_xx: type;
81 |   initialization_expression_xx: option (expression_xx)
82 | }.
83 | 
84 | (* 6.1 (15/3) *)
85 | Record parameter_specification_xx: Type := mkparameter_specification_xx{
86 |   parameter_astnum_xx: astnum;
87 |   parameter_name_xx: idnum;
88 |   parameter_subtype_mark_xx: type;
89 |   parameter_mode_xx: mode
90 |   (* parameter_default_expression_xx: option (expression_xx) *)
91 | }.
92 | 
93 | (** ** Declarations *)
94 | (* Mutually recursive records and inductives are not allowed in Coq, so we build the record by hand.
*) 95 | Inductive declaration_xx: Type := (* 3.1 *) 96 | | D_Null_Declaration_XX: declaration_xx 97 | | D_Type_Declaration_XX: astnum -> type_declaration_xx -> declaration_xx (* 3.2.1 *) 98 | | D_Object_Declaration_XX: astnum -> object_declaration_xx -> declaration_xx (* 3.3.1 *) 99 | | D_Procedure_Body_XX: astnum -> procedure_body_xx -> declaration_xx (* 6.1 *) 100 | | D_Seq_Declaration_XX: astnum -> declaration_xx -> declaration_xx -> declaration_xx (* it's introduced for easy proof *) 101 | 102 | with procedure_body_xx: Type := 103 | mkprocedure_body_xx 104 | (procedure_astnum_xx: astnum) 105 | (procedure_name_xx: procnum) 106 | (procedure_parameter_profile_xx: list parameter_specification_xx) 107 | (procedure_declarative_part_xx: declaration_xx) 108 | (procedure_statements_xx: statement_xx). 109 | 110 | 111 | (** ** Auxiliary Functions *) 112 | 113 | Section AuxiliaryFunctions_XX. 114 | 115 | Definition procedure_statements_xx pb := 116 | match pb with 117 | | mkprocedure_body_xx _ _ _ _ x => x 118 | end. 119 | 120 | Definition procedure_declarative_part_xx pb := 121 | match pb with 122 | | mkprocedure_body_xx _ _ _ x _ => x 123 | end. 124 | 125 | Definition procedure_parameter_profile_xx pb := 126 | match pb with 127 | | mkprocedure_body_xx _ _ x _ _ => x 128 | end. 129 | 130 | Definition procedure_name_xx pb := 131 | match pb with 132 | | mkprocedure_body_xx _ x _ _ _ => x 133 | end. 134 | 135 | Definition type_name_xx td := 136 | match td with 137 | | Subtype_Declaration_XX _ tn _ _ => tn 138 | | Derived_Type_Declaration_XX _ tn _ _ => tn 139 | | Integer_Type_Declaration_XX _ tn _ => tn 140 | | Array_Type_Declaration_XX _ tn _ _ => tn 141 | | Record_Type_Declaration_XX _ tn _ => tn 142 | end. 
143 | 144 | Definition subtype_range_xx (t: type_declaration_xx): option range_xx := 145 | match t with 146 | | Subtype_Declaration_XX ast_num tn t r => Some r 147 | | Derived_Type_Declaration_XX ast_num tn t r => Some r 148 | | Integer_Type_Declaration_XX ast_num tn r => Some r 149 | | _ => None 150 | end. 151 | 152 | Definition expression_astnum_xx e := 153 | match e with 154 | | E_Literal_XX ast_num l (*e_checkflags*)=> ast_num 155 | | E_Name_XX ast_num n => ast_num 156 | | E_Binary_Operation_XX ast_num bop e1 e2 (*e_checkflags*)=> ast_num 157 | | E_Unary_Operation_XX ast_num uop e (*e_checkflags*)=> ast_num 158 | end. 159 | 160 | Definition name_astnum_xx n := 161 | match n with 162 | | E_Identifier_XX ast_num x (*n_checkflags*)=> ast_num 163 | | E_Indexed_Component_XX ast_num x e (*n_checkflags*)=> ast_num 164 | | E_Selected_Component_XX ast_num x f (*n_checkflags*)=> ast_num 165 | end. 166 | 167 | End AuxiliaryFunctions_XX. 168 | 169 | 170 | 171 | 172 | 173 | 174 | 175 | 176 | 177 | 178 | 179 | -------------------------------------------------------------------------------- /spark2014_semantics/src/ast_util.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export ast_rt. 13 | 14 | (** * Auxiliary Functions for AST *) 15 | Section AuxiliaryFunctions_For_AST. 16 | 17 | End AuxiliaryFunctions_For_AST. 18 | 19 | 20 | (** * Auxiliary Functions for AST_RT *) 21 | 22 | Section AuxiliaryFunctions_For_AST_RT. 23 | 24 | (** Check Flags Extraction Functions *) 25 | 26 | Function name_exterior_checks (n: nameRT): exterior_checks := 27 | match n with 28 | | IdentifierRT ast_num x ex_cks => ex_cks 29 | | IndexedComponentRT ast_num x e ex_cks => ex_cks 30 | | SelectedComponentRT ast_num x f ex_cks => ex_cks 31 | end. 
32 | 33 | Function exp_exterior_checks (e: expRT): exterior_checks := 34 | match e with 35 | | LiteralRT ast_num l in_cks ex_cks => ex_cks 36 | | NameRT ast_num n => name_exterior_checks n 37 | | BinOpRT ast_num op e1 e2 in_cks ex_cks => ex_cks 38 | | UnOpRT ast_num op e in_cks ex_cks => ex_cks 39 | end. 40 | 41 | Definition update_exterior_checks_name n checks := 42 | match n with 43 | | IdentifierRT ast_num x ex_checks => IdentifierRT ast_num x checks 44 | | IndexedComponentRT ast_num x e ex_checks => IndexedComponentRT ast_num x e checks 45 | | SelectedComponentRT ast_num x f ex_checks => SelectedComponentRT ast_num x f checks 46 | end. 47 | 48 | Definition update_exterior_checks_exp e checks := 49 | match e with 50 | | LiteralRT ast_num l in_checks ex_checks => LiteralRT ast_num l in_checks checks 51 | | NameRT ast_num n => 52 | let n' := update_exterior_checks_name n checks in 53 | NameRT ast_num n' 54 | | BinOpRT ast_num bop e1 e2 in_checks ex_checks => BinOpRT ast_num bop e1 e2 in_checks checks 55 | | UnOpRT ast_num uop e in_checks ex_checks => UnOpRT ast_num uop e in_checks checks 56 | end. 57 | 58 | Lemma exp_updated_exterior_checks: forall e cks, 59 | exp_exterior_checks (update_exterior_checks_exp e cks) = cks. 60 | Proof. 61 | intros; destruct e; smack; 62 | destruct n; smack. 63 | Qed. 64 | 65 | Lemma name_updated_exterior_checks: forall n cks, 66 | name_exterior_checks (update_exterior_checks_name n cks) = cks. 67 | Proof. 68 | intros; destruct n; smack. 69 | Qed. 70 | 71 | Lemma exp_exterior_checks_refl: forall e, 72 | update_exterior_checks_exp e (exp_exterior_checks e) = e. 73 | Proof. 74 | destruct e; smack. 75 | destruct n; simpl; smack. 76 | Qed. 77 | 78 | Lemma name_exterior_checks_refl: forall n, 79 | update_exterior_checks_name n (name_exterior_checks n) = n. 80 | Proof. 81 | destruct n; smack. 82 | Qed. 
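
The two `_refl` lemmas above state that extracting the exterior checks and re-installing them is the identity, and `exp_updated_exterior_checks` states that extraction after an update yields exactly the new checks. As an illustration only (not part of the Coq development), the same round-trip properties can be checked in Python on a cut-down model of `expRT`/`nameRT`:

```python
from dataclasses import dataclass, replace

# Cut-down, illustrative Python model of nameRT/expRT: a name carries its
# own exterior checks, while a NameRT expression delegates to the name it
# wraps, exactly as in the Coq definitions above.
@dataclass
class IdentifierRT:
    astnum: int
    var: int
    ex_checks: list

@dataclass
class LiteralRT:
    astnum: int
    value: int
    in_checks: list
    ex_checks: list

@dataclass
class NameRT:
    astnum: int
    name: IdentifierRT

def exp_exterior_checks(e):
    # NameRT has no check field of its own: look inside the name
    return e.name.ex_checks if isinstance(e, NameRT) else e.ex_checks

def update_exterior_checks_exp(e, checks):
    if isinstance(e, NameRT):
        return NameRT(e.astnum, replace(e.name, ex_checks=checks))
    return replace(e, ex_checks=checks)

lit = LiteralRT(1, 42, ["OverflowCheck"], [])
nm = NameRT(2, IdentifierRT(3, 7, ["RangeCheck"]))
for e in (lit, nm):
    # exp_updated_exterior_checks: extraction after update gives the new checks
    assert exp_exterior_checks(update_exterior_checks_exp(e, ["X"])) == ["X"]
    # exp_exterior_checks_refl: re-installing the current checks is the identity
    assert update_exterior_checks_exp(e, exp_exterior_checks(e)) == e
```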
83 | 
84 | Lemma update_exterior_checks_exp_astnum_eq: forall e cks,
85 |   expression_astnum_rt (update_exterior_checks_exp e cks) = expression_astnum_rt e.
86 | Proof.
87 |   intros;
88 |   destruct e; smack.
89 | Qed.
90 | 
91 | Lemma update_exterior_checks_name_astnum_eq: forall n cks,
92 |   name_astnum_rt (update_exterior_checks_name n cks) = name_astnum_rt n.
93 | Proof.
94 |   intros;
95 |   destruct n; smack.
96 | Qed.
97 | 
98 | End AuxiliaryFunctions_For_AST_RT.
99 | 
100 | 
101 | 
102 | 
103 | 
104 | 
105 | 
106 | 
107 | 
108 | 
109 | 
110 | 
--------------------------------------------------------------------------------
/spark2014_semantics/src/environment.v:
--------------------------------------------------------------------------------
1 | (**
2 | _AUTHOR_
3 | 
4 | <<
5 | Zhi Zhang
6 | Department of Computer and Information Sciences
7 | Kansas State University
8 | zhangzhi@ksu.edu
9 | >>
10 | *)
11 | 
12 | Require Import FunInd.
13 | Require Export values.
14 | 
15 | (** Any valid variable x has an in/out mode, a type, and a value
16 |     (either defined or undefined). The in/out mode and type are
17 |     invariant once the variable is declared, and they are used only
18 |     at compile time, so we keep this invariant information in a
19 |     symbol table called _symtb_. The value of a variable changes
20 |     as the program executes and is used in run-time evaluation,
21 |     so we keep this information in a store called _store_.
22 | *)
23 | 
24 | (** * Store *)
25 | (** a map from a variable, represented as a natural number,
26 |     to its value;
27 | *)
28 | 
29 | Module Type ENTRY.
30 |   Parameter T:Type.
31 | End ENTRY.
32 | 
33 | Module STORE(V:ENTRY).
34 | 
35 | Notation V:=V.T.
36 | 
37 | Definition store : Type := list (idnum * V).
38 | 
39 | (** ** Store Operation *)
40 | (** check whether variable x has already been declared *)
41 | Function resides (x : idnum) (s : store) := 
42 |   match s with 
43 |   | (y, v) :: s' =>
44 |     if beq_nat x y then true else resides x s' 
45 |   | nil => false
46 |   end.
47 | 
48 | (** fetch the value of x that has already been initialized in the store *)
49 | Function fetches (x : idnum) (s : store): option V := 
50 |   match s with 
51 |   | (y, v) :: s' =>
52 |     if beq_nat x y then Some v
53 |     else fetches x s' 
54 |   | nil => None
55 |   end.
56 | 
57 | (** [cuts_to x s] returns the pair (s',s'') where s = s' ++ s'' and s''
58 |     starts with the first occurrence of [x] in [s]. If no occurrence
59 |     of [x] exists in [s] then (s,nil) is returned. *)
60 | Function cuts_to (x : idnum) (s : store): store*store := 
61 |   match s with 
62 |   | (y, v) :: s' =>
63 |     (if beq_nat x y then (nil,s)
64 |      else let (l1,l2) := cuts_to x s' in
65 |           (((y, v)::l1) , l2))
66 |   | nil => (nil, nil)
67 |   end.
68 | 
69 | (** update the latest binding for x *)
70 | Function updates (s: store) (x : idnum) (v: V): option store := 
71 |   match s with 
72 |   | (y, v') :: s' =>
73 |     if beq_nat x y then
74 |       Some ((y,v)::s')
75 |     else
76 |       match updates s' x v with
77 |       | Some s'' => Some ((y,v')::s'')
78 |       | None => None
79 |       end
80 |   | nil => None
81 |   end.
82 | 
83 | (** ** Lemmas About Store Operation *)
84 | 
85 | Lemma updates_length: forall s x v s',
86 |   updates s x v = Some s' ->
87 |   List.length s = List.length s'.
88 | Proof.
89 |   intros s x v.
90 |   functional induction updates s x v;simpl
91 |   ; intros updateds heq; inversion heq;clear heq
92 |   ; subst;simpl;auto.
93 | Qed.
94 | 
95 | 
96 | (** * State *)
97 | 
98 | (* The global state is a stack of stores. One store per procedure
99 |    currently running. *)
100 | Definition scope_level := nat. (* the scope level of the declared procedure to be called *)
101 | 
102 | Definition frame := prod scope_level store.
103 | 
104 | Definition level_of (f: frame) := fst f.
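
As a sanity check on the definitions above, the behavior of `fetches`, `updates` and `cuts_to` on an association list can be mimicked in Python (an illustrative sketch, not part of the development; the names mirror the Coq functions):

```python
# A store is an association list of (idnum, value) pairs, newest binding first.

def fetches(x, s):
    # return the value of the first (latest) binding of x, or None
    for (y, v) in s:
        if x == y:
            return v
    return None

def updates(s, x, v):
    # replace the first binding of x, preserving the length; None if unbound
    for i, (y, _) in enumerate(s):
        if x == y:
            return s[:i] + [(y, v)] + s[i + 1:]
    return None

def cuts_to(x, s):
    # split s at the first occurrence of x; (s, []) if x does not occur
    for i, (y, _) in enumerate(s):
        if x == y:
            return s[:i], s[i:]
    return s, []

s = [(1, 10), (2, 20), (1, 30)]
assert fetches(1, s) == 10                        # the latest binding wins
assert updates(s, 2, 99) == [(1, 10), (2, 99), (1, 30)]
assert updates(s, 5, 0) is None                   # unbound variable
s1, s2 = cuts_to(2, s)
assert s1 + s2 == s and s2[0][0] == 2             # split at first occurrence
```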
105 | 106 | Definition store_of (f: frame) := snd f. 107 | 108 | Definition state := list frame. 109 | 110 | 111 | 112 | Definition reside (x: idnum) (f: frame) := resides x (store_of f). 113 | 114 | Definition fetch (x: idnum) (f: frame) := fetches x (store_of f). 115 | 116 | Definition cut_to (x: idnum) (f: frame) := cuts_to x (store_of f). 117 | 118 | Function update (f: frame) (x: idnum) (v: V): option frame := 119 | match updates (store_of f) x v with 120 | | Some s => Some (level_of f, s) 121 | | None => None 122 | end. 123 | 124 | Definition push (f: frame) (x: idnum) (v: V) := (level_of f, (x, v) :: (store_of f)). 125 | 126 | Definition newFrame (n: scope_level): frame := (n, nil). 127 | 128 | 129 | (** ** State Operation *) 130 | 131 | Definition pushG x v (s: state) := 132 | match s with 133 | | nil => None 134 | | f :: s' => Some ((push f x v) :: s') 135 | end. 136 | 137 | Function fetchG (x : idnum) (s : state) := 138 | match s with 139 | | f :: s' => 140 | match fetch x f with 141 | | Some v => Some v 142 | | None => fetchG x s' 143 | end 144 | | nil => None 145 | end. 146 | 147 | Function updateG (s: state) (x: idnum) (v: V): option state := 148 | match s with 149 | | f :: s' => 150 | match update f x v with 151 | | Some f' => Some (f' :: s') 152 | | None => match (updateG s' x v) with 153 | | Some s'' => Some (f :: s'') 154 | | None => None 155 | end 156 | end 157 | | nil => None 158 | end. 159 | 160 | Function resideG (x : idnum) (s : state) := 161 | match s with 162 | | f :: s' => 163 | if reside x f then 164 | true 165 | else 166 | resideG x s' 167 | | nil => false 168 | end. 169 | 170 | 171 | 172 | (* These three functions are used by Compcert compilation *) 173 | Function frameG (x : idnum) (s : state): option frame := 174 | match s with 175 | | f :: s' => if reside x f then Some f 176 | else frameG x s' 177 | | nil => None 178 | end. 
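
The stack discipline of `fetchG`/`updateG` — search frames from the innermost outwards and act on the first frame that binds the variable — can be illustrated with a small Python sketch (names mirror the Coq functions; this simplified model is not part of the development):

```python
# A frame is (scope_level, store); the state is a list of frames,
# innermost frame first.

def fetchG(x, state):
    # look for x frame by frame, innermost first
    for (_lvl, store) in state:
        for (y, v) in store:
            if x == y:
                return v
    return None

def updateG(state, x, v):
    # update x in the innermost frame that binds it; None if unbound
    for i, (lvl, store) in enumerate(state):
        for j, (y, _) in enumerate(store):
            if x == y:
                new_store = store[:j] + [(y, v)] + store[j + 1:]
                return state[:i] + [(lvl, new_store)] + state[i + 1:]
    return None

st = [(1, [(10, "inner")]), (0, [(10, "outer"), (11, "global")])]
assert fetchG(10, st) == "inner"     # the innermost binding shadows the outer one
assert fetchG(11, st) == "global"
st2 = updateG(st, 11, "changed")
assert fetchG(11, st2) == "changed"
assert fetchG(10, st2) == "inner"    # other bindings are untouched
```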
179 | 
180 | Definition pop_frame (s:state) : option frame :=
181 |   match s with
182 |   | nil => None
183 |   | cons f _ => Some f
184 |   end.
185 | 
186 | Definition level_of_top (s:state): option scope_level :=
187 |   match pop_frame s with
188 |   | Some (lvl,_) => Some lvl
189 |   | None => None
190 |   end.
191 | 
192 | 
193 | (** [cut_until s n s' s''] means cutting the state s at the first frame
194 |     whose corresponding procedure's nested declaration level is less
195 |     than n: s' is the prefix of s whose frames all have a nested
196 |     declaration level greater than or equal to n, s'' is the remaining
197 |     suffix of s starting at that first frame (if any),
198 |     and s = s' ++ s'';
199 | *)
200 | Inductive cut_until: state -> scope_level -> state -> state -> Prop :=
201 |   | Cut_Until_Nil: forall n,
202 |       cut_until nil n nil nil
203 |   | Cut_Until_Head: forall f n s,
204 |       (level_of f) < n ->
205 |       cut_until (f :: s) n nil (f :: s)
206 |   | Cut_Until_Tail: forall f n s s' s'',
207 |       ~ (level_of f < n) ->
208 |       cut_until s n s' s'' ->
209 |       cut_until (f :: s) n (f :: s') s''.
210 | 
211 | Lemma cut_until_uniqueness: forall s n intact_s' s' intact_s'' s'',
212 |   cut_until s n intact_s' s' ->
213 |   cut_until s n intact_s'' s'' ->
214 |   intact_s' = intact_s'' /\ s' = s''.
215 | Proof.
216 |   intros s n intact_s' s' intact_s'' s'' H; revert intact_s'' s''.
217 |   induction H; intros;
218 |   match goal with
219 |   | [H: cut_until nil _ _ _ |- _] => inversion H
220 |   | [H: cut_until (?f :: ?s) _ _ _ |- _] => inversion H
221 |   end; smack;
222 |   specialize (IHcut_until _ _ H8); smack.
223 | Qed.
224 | 
225 | Lemma cut_until_spec1:
226 |   forall s pb_lvl s' s'' ,
227 |     cut_until s pb_lvl s' s''
228 |     -> s' ++ s'' = s.
229 | Proof.
230 |   intros s pb_lvl s' s'' H.
231 |   induction H;auto.
232 |   simpl.
233 |   rewrite IHcut_until.
234 |   reflexivity.
235 | Qed.
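
A Python sketch of the function computed by the `cut_until` relation may make the split clearer (illustration only; note that frames after the first low-level frame may again have high levels — only the prefix is guaranteed to consist entirely of frames with level >= n):

```python
def cut_until(state, n):
    # split the state at the first frame whose scope level is below n;
    # a frame is modeled as a (level, store) pair
    for i, (lvl, _store) in enumerate(state):
        if lvl < n:
            return state[:i], state[i:]
    return state, []

st = [(3, []), (2, []), (1, []), (2, [])]
prefix, rest = cut_until(st, 2)
assert prefix == [(3, []), (2, [])]      # every frame in the prefix has level >= 2
assert rest == [(1, []), (2, [])]        # starts at the first frame with level < 2
assert prefix + rest == st               # the property proved as cut_until_spec1
assert cut_until(st, 0) == (st, [])      # no frame below level 0
```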
236 | 237 | Inductive stack_eq_length : state -> state -> Prop := 238 | | eqnil: stack_eq_length nil nil 239 | | eqncons: forall s s' f f', 240 | stack_eq_length s s' -> 241 | List.length (store_of f) = List.length (store_of f') -> 242 | stack_eq_length (f :: s) (f' :: s'). 243 | 244 | (** ** Lemmas About State Operation *) 245 | 246 | Lemma stack_eq_length_refl: forall s s', 247 | s = s' -> 248 | stack_eq_length s s'. 249 | Proof. 250 | intros s. 251 | induction s;intros s' heq. 252 | - subst. 253 | constructor. 254 | - subst. 255 | constructor. 256 | + apply IHs. 257 | reflexivity. 258 | + reflexivity. 259 | Qed. 260 | 261 | Require Import Setoid. 262 | Require Import Morphisms. 263 | 264 | Lemma stack_eq_length_refl2: reflexive _ stack_eq_length. 265 | Proof. 266 | hnf. 267 | intros x. 268 | apply stack_eq_length_refl. 269 | reflexivity. 270 | Qed. 271 | 272 | Lemma stack_eq_length_sym: forall s s', 273 | stack_eq_length s s' -> 274 | stack_eq_length s' s. 275 | Proof. 276 | intros s. 277 | induction s;intros s' heq. 278 | - inversion heq. 279 | constructor. 280 | - inversion heq. 281 | constructor. 282 | + apply IHs. 283 | assumption. 284 | + symmetry. 285 | assumption. 286 | Qed. 287 | 288 | Lemma stack_eq_length_trans: forall s' s s'', 289 | stack_eq_length s s' -> 290 | stack_eq_length s' s'' -> 291 | stack_eq_length s s''. 292 | Proof. 293 | intros s'. 294 | induction s';intros s s'' heq1 heq2 295 | ; try now (inversion heq1; inversion heq2;subst;constructor). 296 | inversion heq1. 297 | inversion heq2. 298 | subst. 299 | constructor. 300 | + apply IHs' ;assumption. 301 | + transitivity (List.length (store_of a));auto. 302 | Qed. 303 | 304 | Lemma stack_eq_length_trans2: transitive _ stack_eq_length. 305 | Proof. 306 | hnf. 307 | intros x y z H H0. 308 | apply stack_eq_length_trans with (s':= y);auto. 309 | Qed. 
310 | 311 | Add Parametric Relation: state stack_eq_length 312 | reflexivity proved by stack_eq_length_refl2 313 | symmetry proved by stack_eq_length_sym 314 | transitivity proved by stack_eq_length_trans2 315 | as stack_eq_length_equiv_rel. 316 | 317 | Add Parametric Morphism: (@List.app frame) 318 | with signature stack_eq_length ==> stack_eq_length ==> stack_eq_length 319 | as app_morph_stack_eq_length. 320 | Proof. 321 | intros x y H. 322 | induction H;simpl;intros. 323 | - assumption. 324 | - constructor 2. 325 | + apply IHstack_eq_length. 326 | assumption. 327 | + assumption. 328 | Qed. 329 | 330 | Lemma updateG_length: forall s x v s', 331 | updateG s x v = Some s' -> 332 | List.length s = List.length s'. 333 | Proof. 334 | intros s x v. 335 | functional induction updateG s x v;simpl 336 | ; intros updateds heq; inversion heq;clear heq 337 | ; subst;simpl;auto. 338 | Qed. 339 | 340 | End STORE. 341 | 342 | -------------------------------------------------------------------------------- /spark2014_semantics/src/htmlgen: -------------------------------------------------------------------------------- 1 | coqdoc CpdtTactics.v \ 2 | list_util.v \ 3 | ast_basics.v \ 4 | ast.v \ 5 | ast_rt.v \ 6 | ast_util.v \ 7 | rt.v \ 8 | values.v \ 9 | environment.v \ 10 | symboltable_module.v \ 11 | symboltable.v \ 12 | eval.v \ 13 | eval_rt.v \ 14 | rt_gen.v \ 15 | rt_gen_impl.v \ 16 | rt_gen_impl_consistent.v \ 17 | rt_gen_consistent.v \ 18 | rt_gen_util.v \ 19 | rt_opt_ZArith.v \ 20 | rt_opt.v \ 21 | rt_opt_impl.v \ 22 | rt_opt_impl_consistent.v \ 23 | rt_opt_util.v \ 24 | well_typed.v \ 25 | well_typed_util.v \ 26 | rt_opt_consistent_util.v \ 27 | rt_opt_consistent.v \ 28 | rt_counter.v \ 29 | rt_validator.v \ 30 | -toc --no-lib-name 31 | -------------------------------------------------------------------------------- /spark2014_semantics/src/languagegen: -------------------------------------------------------------------------------- 1 | sed 
's/(\*_1_//g;s/_1_\*)//g;/_2_\*)$/d;s/_xx//g;s/_XX//g;s/(\*e_checks\*)//g;s/(\*n_checks\*)//g;s/(\*e_checkflags\*)//g;s/(\*n_checkflags\*)//g;s/NOTICE/This file can be auto-generated from ast_template.v by running languagegen in terminal/g' ast_template.v > ast.v 2 | sed 's/(\*_2_//g;s/_2_\*)//g;/_1_\*)$/d;s/_xx/_x/g;s/_XX/_X/g;s/(\*e_checks\*)/-> interior_checks -> exterior_checks /g;s/(\*n_checks\*)/-> exterior_checks /g;s/(\*e_checkflags\*)/in_checks ex_checks /g;s/(\*n_checkflags\*)/ex_checks /g;s/NOTICE/This file can be auto-generated from ast_template.v by running languagegen in terminal/g' ast_template.v > ast_rt.v 3 | -------------------------------------------------------------------------------- /spark2014_semantics/src/list_util.v: -------------------------------------------------------------------------------- 1 | Require Import List FunInd. 2 | 3 | Lemma length_invnil : forall A (l:list A), length l = 0 -> l = nil. 4 | Proof. 5 | intros A l H. 6 | destruct l;auto;simpl in *. 7 | inversion H. 8 | Qed. 9 | 10 | Lemma length_invcons : forall A n (l:list A), length l = S n -> exists x l', l = x::l'. 11 | Proof. 12 | intros A n l H. 13 | destruct l;auto;simpl in *. 14 | inversion H. 15 | exists a. 16 | exists l. 17 | reflexivity. 18 | Qed. 19 | 20 | (** * List Split Operations *) 21 | 22 | Function split1 A n (l:list A) {struct n} := 23 | match l,n with 24 | | _,O => Some(nil,l) 25 | | e::l' , S n' => 26 | match split1 A n' l' with 27 | | Some(l1,l2) => Some (e::l1,l2) 28 | | None => None 29 | end 30 | | nil, S _ => None 31 | end. 32 | 33 | Function split2 {A} n m (l:list A) := 34 | match split1 A n l with 35 | | None => None 36 | | Some (l1,l2') => 37 | match split1 A m l2' with 38 | | None => None 39 | | Some (l2,l3) => Some (l1,(l2,l3)) 40 | end 41 | end. 
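
For illustration (not part of the development), `split1`/`split2` compute the obvious prefix splits of a list. A Python sketch of the same functions, together with the correctness property proved below as `split2_correct`:

```python
def split1(n, l):
    # take the first n elements, or None if l is too short
    if len(l) < n:
        return None
    return l[:n], l[n:]

def split2(n, m, l):
    # split l into l1 ++ l2 ++ l3 with len(l1) == n and len(l2) == m
    r = split1(n, l)
    if r is None:
        return None
    l1, rest = r
    r = split1(m, rest)
    if r is None:
        return None
    l2, l3 = r
    return l1, (l2, l3)

l = [0, 1, 2, 3, 4, 5]
l1, (l2, l3) = split2(2, 3, l)
assert (l1, l2, l3) == ([0, 1], [2, 3, 4], [5])   # split2_correct
assert l1 + l2 + l3 == l
assert split2(4, 3, l) is None                     # length < n + m
```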
42 | 43 | (** * List Split Lemmas *) 44 | 45 | Lemma split2_equation1 : 46 | forall A n m e (l l1 l2 l3: list A), 47 | split2 n m l = Some (l1,(l2,l3)) 48 | -> split2 (S n) m (e::l) = Some (e::l1 , (l2, l3)). 49 | Proof. 50 | intros A n m e l l1 l2 l3 H. 51 | unfold split2. 52 | simpl. 53 | functional induction (split2 n m l);try discriminate. 54 | inversion H. 55 | rewrite e0. 56 | rewrite e1. 57 | subst. 58 | reflexivity. 59 | Qed. 60 | 61 | Lemma split2_equation2 : 62 | forall A m e (l l1 l2 l3: list A), 63 | split2 0 m l = Some (l1,(l2,l3)) 64 | -> split2 0 (S m) (e::l) = Some (nil , (e::l2, l3)). 65 | Proof. 66 | intros A m e l l1 l2 l3 H. 67 | unfold split2. 68 | simpl. 69 | unfold split2 in H. 70 | simpl in *. 71 | destruct l. 72 | - destruct (split1 A m nil). 73 | + destruct p. 74 | inversion H. 75 | reflexivity. 76 | + inversion H. 77 | - destruct (split1 A m (a :: l)). 78 | + destruct p. 79 | inversion H. 80 | reflexivity. 81 | + inversion H. 82 | Qed. 83 | 84 | Lemma split2_equation3 : 85 | forall A n m e (l l1 l2 l3: list A), 86 | split2 n m l = None 87 | -> split2 (S n) m (e::l) = None. 88 | Proof. 89 | intros A n m e l l1 l2 l3 H. 90 | unfold split2. 91 | simpl. 92 | unfold split2 in H. 93 | destruct (split1 A n l). 94 | - destruct p. 95 | destruct (split1 A m l4). 96 | + destruct p. 97 | inversion H. 98 | + reflexivity. 99 | - reflexivity. 100 | Qed. 101 | 102 | 103 | Lemma split1_correct : 104 | forall A n (l l2 l1:list A), 105 | (split1 _ n l = Some (l1,l2)) 106 | -> List.length l1 = n /\ (l = l1 ++ l2). 107 | Proof. 108 | intros A n l. 109 | functional induction split1 A n l;simpl;intros. 110 | - inversion H. 111 | clear H. 112 | subst. 113 | simpl. 114 | split; reflexivity. 115 | - inversion H. 116 | clear H. 117 | subst. 118 | specialize (IHo _ _ e2). 119 | destruct IHo as [IHo1 IHo2]. 120 | subst. 121 | split;reflexivity. 122 | - inversion H. 123 | - inversion H. 124 | Qed. 
125 | 126 | 127 | Lemma split1_correct_eq : 128 | forall A n (l l2 l1:list A), 129 | (split1 _ n l = Some (l1,l2)) 130 | -> (l = l1 ++ l2). 131 | Proof. 132 | intros A n l l2 l1 H. 133 | apply (split1_correct A n l l2 l1) in H. 134 | destruct H;assumption. 135 | Qed. 136 | 137 | Lemma split1_correct_length : 138 | forall A n (l l2 l1:list A), 139 | (split1 _ n l = Some (l1,l2)) 140 | -> List.length l1 = n. 141 | Proof. 142 | intros A n l l2 l1 H. 143 | apply (split1_correct A n l l2 l1) in H. 144 | destruct H;assumption. 145 | Qed. 146 | 147 | 148 | Lemma split1_complete : 149 | forall A n (l l2 l1:list A), 150 | List.length l1 = n 151 | -> (l = l1 ++ l2) 152 | -> (split1 _ n l = Some (l1,l2)). 153 | Proof. 154 | intros A n l. 155 | functional induction split1 A n l;simpl;intros. 156 | - rewrite (length_invnil _ _ H) in *. 157 | simpl in *. 158 | subst. 159 | reflexivity. 160 | - destruct (length_invcons _ _ _ H) as [e' l'']. 161 | destruct l''. 162 | subst. 163 | simpl in *. 164 | inversion H; clear H. 165 | inversion H0. clear H0. 166 | subst. 167 | specialize (IHo l0 x refl_equal refl_equal). 168 | rewrite e2 in IHo. 169 | inversion IHo. 170 | reflexivity. 171 | - destruct l1. 172 | + simpl in *. 173 | discriminate. 174 | + simpl in *. 175 | inversion H0. 176 | specialize (IHo l2 l1). 177 | rewrite IHo in e2;try discriminate. 178 | * inversion H. 179 | reflexivity. 180 | * assumption. 181 | - apply length_invcons in H. 182 | decompose [ex] H. clear H. 183 | subst. 184 | simpl in *. 185 | inversion H0. 186 | Qed. 187 | 188 | Lemma split2_correct : 189 | forall A n m (l l1 l2 l3:list A), 190 | split2 n m l = Some (l1,(l2,l3)) 191 | -> (l = l1 ++ l2 ++ l3) /\ List.length l1 = n /\ List.length l2 = m. 192 | Proof. 193 | intros A n m l l1 l2 l3 H. 194 | unfold split2 in H. 195 | destruct (split1 A n l) eqn:heq. 196 | - destruct p. 197 | apply split1_correct in heq. 198 | destruct heq as [heq1 heq2]. 199 | subst l. 200 | destruct (split1 A m l4) eqn:heq'. 
201 | + destruct p. 202 | inversion H;clear H;subst. 203 | apply split1_correct in heq'. 204 | destruct heq' as [heq'1 heq'2]. 205 | subst;auto. 206 | + inversion H. 207 | - inversion H. 208 | Qed. 209 | 210 | Require Import Lia. 211 | 212 | Lemma split2_length_ok : 213 | forall A (l:list A) n m, 214 | List.length l >= n+m 215 | -> split2 n m l <> None. 216 | Proof. 217 | intros A l. 218 | induction l;simpl;intros n m h. 219 | - assert (n=0) by lia. 220 | assert (m=0) by lia. 221 | subst. 222 | unfold split2. 223 | simpl. 224 | discriminate. 225 | - destruct n. 226 | + destruct m. 227 | * unfold split2. 228 | simpl. 229 | discriminate. 230 | * intro abs. 231 | assert (h':length l >= 0 + m) by lia. 232 | generalize (IHl 0 m h'). 233 | intro IHl'. 234 | unfold split2 in abs,IHl'. 235 | simpl in *. 236 | { destruct l;simpl in *. 237 | -destruct (split1 A m nil);simpl in *. 238 | + destruct p. 239 | inversion abs. 240 | + contradiction. 241 | - destruct (split1 A m (a0 :: l));simpl in *. 242 | + destruct p. 243 | inversion abs. 244 | + contradiction. } 245 | + assert (h':length l >= n + m) by lia. 246 | generalize (IHl n m h'). 247 | intro IHl'. 248 | unfold split2 in IHl'. 249 | unfold split2. 250 | simpl in *. 251 | destruct (split1 A n l);simpl in *. 252 | * destruct p. 253 | { destruct (split1 A m l1). 254 | - destruct p. 255 | discriminate. 256 | - assumption. } 257 | * assumption. 258 | Qed. 259 | 260 | Lemma app_same_length_eq : 261 | forall A (l l' m m': list A), 262 | length l = length l' 263 | -> l++m = l'++m' -> l = l'/\ m=m'. 264 | Proof. 265 | intros A l. 266 | induction l;simpl in *;intros. 267 | - symmetry in H. 268 | apply length_invnil in H. 269 | subst. 270 | split;auto. 271 | - destruct l'. 272 | + simpl in H. 273 | inversion H. 274 | + specialize (IHl l' m m'). 275 | simpl in *. 276 | inversion H0. 277 | inversion H. 278 | specialize (IHl H4 H3). 279 | destruct IHl. 280 | subst. 281 | split;auto. 282 | Qed. 
283 | 284 | Lemma app_same_length_eq2 : 285 | forall A (l l' m m': list A), 286 | length m = length m' 287 | -> l++m = l'++m' -> l = l'/\ m=m'. 288 | Proof. 289 | intros. 290 | assert (H1: length (l ++ m) = length (l' ++ m')). 291 | rewrite H0; auto. 292 | assert (H2: length l = length l'). 293 | rewrite app_length in H1. rewrite app_length in H1. 294 | rewrite H in H1; lia. 295 | specialize (app_same_length_eq _ _ _ _ _ H2 H0); auto. 296 | Qed. 297 | 298 | Lemma split2_complete : 299 | forall A n m (l l1 l2 l3:list A), 300 | l = l1 ++ l2 ++ l3 301 | -> List.length l1 = n 302 | -> List.length l2 = m 303 | -> split2 n m l = Some (l1,(l2,l3)). 304 | Proof. 305 | intros A n m l l1 l2 l3 H H0 H1. 306 | destruct (split2 n m l) eqn:heq. 307 | - destruct p. 308 | destruct p. 309 | apply split2_correct in heq. 310 | decompose [and] heq. 311 | clear heq. 312 | subst. 313 | symmetry in H4. 314 | destruct (app_same_length_eq _ _ _ _ _ H4 H2). 315 | subst. 316 | symmetry in H5. 317 | destruct (app_same_length_eq _ _ _ _ _ H5 H0). 318 | subst. 319 | reflexivity. 320 | - destruct (split2_length_ok _ l n m). 321 | + rewrite H. 322 | repeat rewrite app_length. 323 | lia. 324 | + assumption. 325 | Qed. 326 | 327 | -------------------------------------------------------------------------------- /spark2014_semantics/src/makegen: -------------------------------------------------------------------------------- 1 | coq_makefile -f _CoqProject -o Makefile 2 | 3 | -------------------------------------------------------------------------------- /spark2014_semantics/src/rt.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Import FunInd Coq.Bool.Bool. 13 | Require Import Coq.Lists.List. 
14 | 
15 | (** * Run-Time Checks *)
16 | (** a subset of run-time checks to be verified *)
17 | (**
18 |    - Do_Division_Check
19 | 
20 |      Check that the second operand of the division, mod or rem
21 |      operator is not zero.
22 | 
23 |    - Do_Overflow_Check
24 | 
25 |      Check that the result of the given arithmetic operation is within
26 |      the bounds of the base type.
27 | 
28 |    - Do_Range_Check
29 | 
30 |      Check that the given value is within the range of the expected scalar
31 |      subtype.
32 | 
33 |    - Do_Range_Check_On_Return
34 | 
35 |      For a procedure call, if a range check is required for an input
36 |      variable during the copy_in phase, then the range check flag is set
37 |      on the corresponding input argument; if a range check is required for
38 |      an output variable during the copy_out phase, the flag belongs on the
39 |      output parameter rather than the output argument, but it is unreasonable
40 |      to set a range check on a formal parameter of the called procedure, as
41 |      it may be called in different contexts with no range check requirement.
42 |      So we introduce a new check flag Do_Range_Check_On_Return for the output
43 |      parameters, but it is set on the output arguments.
44 | 
45 |    - Undefined_Check
46 | 
47 |      Any run-time check extracted from gnat2xml other than Do_Division_Check,
48 |      Do_Overflow_Check and Do_Range_Check is represented by Undefined_Check.
49 | *)
50 | 
51 | (**
52 |    Do_Range_Check (Flag9-Sem) (reference: sinfo.ads)
53 | 
54 |    This flag is set on an expression which appears in a context where a
55 |    range check is required. The target type is clear from the context.
56 |    The contexts in which this flag can appear are the following:
57 | 
58 |    - Right side of an assignment. In this case the target type is
59 |      taken from the left side of the assignment, which is referenced
60 |      by the Name of the N_Assignment_Statement node.
61 | 
62 |    - Subscript expressions in an indexed component.
In this case the 63 | target type is determined from the type of the array, which is 64 | referenced by the Prefix of the N_Indexed_Component node. 65 | 66 | - Argument expression for a parameter, appearing either directly in 67 | the Parameter_Associations list of a call or as the Expression of an 68 | N_Parameter_Association node that appears in this list. In either 69 | case, the check is against the type of the formal. Note that the 70 | flag is relevant only in IN and IN OUT parameters, and will be 71 | ignored for OUT parameters, where no check is required in the call, 72 | and if a check is required on the return, it is generated explicitly 73 | with a type conversion. 74 | 75 | - Initialization expression for the initial value in an object 76 | declaration. In this case the Do_Range_Check flag is set on 77 | the initialization expression, and the check is against the 78 | range of the type of the object being declared. This includes the 79 | cases of expressions providing default discriminant values, and 80 | expressions used to initialize record components. 81 | 82 | - The expression of a type conversion. In this case the range check is 83 | against the target type of the conversion. See also the use of 84 | Do_Overflow_Check on a type conversion. The distinction is that the 85 | overflow check protects against a value that is outside the range of 86 | the target base type, whereas a range check checks that the 87 | resulting value (which is a value of the base type of the target 88 | type), satisfies the range constraint of the target type. 89 | *) 90 | 91 | 92 | (** checks that need to be verified at run time: *) 93 | 94 | Inductive check_flag: Type := 95 | | DivCheck : check_flag 96 | | OverflowCheck : check_flag 97 | | RangeCheck : check_flag 98 | | RangeCheckOnReturn : check_flag 99 | | UndefinedCheck : check_flag.
100 | 101 | 102 | (** For an expression or statement, there may exist a list of checks 103 | enforced on it; for example, for a division expression, both 104 | division-by-zero and overflow checks need to be performed. 105 | *) 106 | Definition check_flags := list check_flag. 107 | 108 | (** For an expression e used as the index of an array, e.g. a(e), the overflow check is called 109 | an interior run-time check for e, while the range check on the value of e, enforced by the 110 | array, is called an exterior run-time check. In our formalization of the SPARK semantics, 111 | we distinguish these two kinds of checks, as they are performed at different stages: 112 | one is checked when e is evaluated, and the other is checked when the 113 | value of e is used as the index of array a. 114 | *) 115 | Definition interior_checks := check_flags. 116 | 117 | Definition exterior_checks := check_flags. 118 | 119 | (** * Run-Time Checks Subset *) 120 | (** these functions will be used to verify the run-time check flags 121 | that are generated by the GNAT front end against the expected 122 | run-time check flags required by the semantics of the SPARK 123 | programming language; 124 | *) 125 | 126 | Function beq_check_flag (ck1 ck2: check_flag): bool := 127 | match ck1, ck2 with 128 | | DivCheck, DivCheck => true 129 | | OverflowCheck, OverflowCheck => true 130 | | RangeCheck, RangeCheck => true 131 | | RangeCheckOnReturn, RangeCheckOnReturn => true 132 | | UndefinedCheck, UndefinedCheck => true 133 | | _, _ => false 134 | end. 135 | 136 | Function element_of (a: check_flag) (ls: list check_flag): bool := 137 | match ls with 138 | | nil => false 139 | | (a' :: ls') => 140 | if beq_check_flag a a' then 141 | true 142 | else 143 | element_of a ls' 144 | end.
145 | 146 | Function subset_of (cks1 cks2: check_flags): bool := 147 | match cks1 with 148 | | nil => true 149 | | ck :: cks1' => 150 | if element_of ck cks2 then 151 | subset_of cks1' cks2 152 | else 153 | false 154 | end. 155 | 156 | Function beq_check_flags (cks1 cks2: check_flags): bool := 157 | (subset_of cks1 cks2) && (subset_of cks2 cks1). 158 | -------------------------------------------------------------------------------- /spark2014_semantics/src/rt_counter.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | Require Export rt_gen. 12 | 13 | (** * Count Number of Run-Time Checks *) 14 | 15 | Section Check_Count. 16 | 17 | Record cks_infor_t: Type := cks_infor { 18 | num_of_cks : nat; 19 | division_cks: nat; 20 | overflow_cks: nat; 21 | range_cks : nat 22 | }. 23 | 24 | Function conj_cks_infor (x1 x2 : cks_infor_t): cks_infor_t := 25 | match x1, x2 with 26 | | cks_infor cksNum d o r, cks_infor cksNum1 d1 o1 r1 => 27 | cks_infor (cksNum + cksNum1) (d + d1) (o + o1) (r + r1) 28 | end. 29 | 30 | Function compute_cks_infor (cks: check_flags) : cks_infor_t := 31 | match cks with 32 | | nil => cks_infor 0 0 0 0 33 | | ck :: cks' => 34 | let r := compute_cks_infor cks' in 35 | match ck with 36 | | DivCheck => 37 | cks_infor (r.(num_of_cks) + 1) (r.(division_cks) + 1) r.(overflow_cks) r.(range_cks) 38 | | OverflowCheck => 39 | cks_infor (r.(num_of_cks) + 1) r.(division_cks) (r.(overflow_cks) + 1) r.(range_cks) 40 | | RangeCheck => 41 | cks_infor (r.(num_of_cks) + 1) r.(division_cks) r.(overflow_cks) (r.(range_cks) + 1) 42 | | RangeCheckOnReturn => 43 | cks_infor (r.(num_of_cks) + 1) r.(division_cks) r.(overflow_cks) (r.(range_cks) + 1) 44 | | _ => r 45 | end 46 | end. 
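The comparison functions `subset_of` and `beq_check_flags` above treat check-flag lists as finite sets: equality holds exactly when each list is included in the other, so flag order and multiplicity are ignored. As a minimal illustrative sketch (not part of the Coq development), the same logic in Python, with the flag names mirroring the Coq constructors:

```python
def subset_of(cks1, cks2):
    """True iff every flag in cks1 also occurs in cks2 (mirrors subset_of)."""
    return all(ck in cks2 for ck in cks1)

def beq_check_flags(cks1, cks2):
    """Set-style equality via mutual inclusion (mirrors beq_check_flags)."""
    return subset_of(cks1, cks2) and subset_of(cks2, cks1)

# Order is irrelevant to this comparison:
print(beq_check_flags(["DivCheck", "OverflowCheck"],
                      ["OverflowCheck", "DivCheck"]))  # True
```

This set-style comparison is what lets the validator accept the GNAT front end's flags regardless of the order in which they were emitted.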
47 | 48 | (** ** Check Counter for Expression *) 49 | 50 | Function count_exp_check_flags (e: expRT): cks_infor_t := 51 | match e with 52 | | LiteralRT n l in_cks ex_cks => 53 | compute_cks_infor (in_cks ++ ex_cks) 54 | | NameRT n nm => 55 | count_name_check_flags nm 56 | | BinOpRT n op e1 e2 in_cks ex_cks => 57 | conj_cks_infor (compute_cks_infor (in_cks ++ ex_cks)) 58 | (conj_cks_infor 59 | (count_exp_check_flags e1) 60 | (count_exp_check_flags e2) 61 | ) 62 | | UnOpRT n op e in_cks ex_cks => 63 | conj_cks_infor (compute_cks_infor (in_cks ++ ex_cks)) 64 | (count_exp_check_flags e) 65 | end 66 | 67 | (** ** Check Counter for Name *) 68 | 69 | with count_name_check_flags (n: nameRT): cks_infor_t := 70 | match n with 71 | | IdentifierRT n x ex_cks => 72 | compute_cks_infor ex_cks 73 | | IndexedComponentRT n x e ex_cks => 74 | conj_cks_infor (compute_cks_infor ex_cks) 75 | (conj_cks_infor 76 | (count_name_check_flags x) 77 | (count_exp_check_flags e) 78 | ) 79 | | SelectedComponentRT n x f ex_cks => 80 | conj_cks_infor (compute_cks_infor ex_cks) 81 | (count_name_check_flags x) 82 | end. 83 | 84 | Function count_args_check_flags (le: list expRT): cks_infor_t := 85 | match le with 86 | | nil => cks_infor 0 0 0 0 87 | | (e1 :: le1') => 88 | conj_cks_infor (count_exp_check_flags e1) 89 | (count_args_check_flags le1') 90 | end. 
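The counters above just fold `conj_cks_infor` (componentwise addition of the four fields) over the AST, adding `compute_cks_infor` of each node's own interior and exterior flags. An illustrative Python sketch of the same recursion on a simplified `(tag, in_cks, ex_cks, children)` tuple tree — the node shape is hypothetical, chosen only to mirror the Coq traversal:

```python
from collections import Counter

COUNTED = ("DivCheck", "OverflowCheck", "RangeCheck", "RangeCheckOnReturn")

def compute_cks_infor(cks):
    """Count the flags of interest in one flag list (mirrors compute_cks_infor)."""
    c = Counter()
    for ck in cks:
        if ck in COUNTED:
            c["num_of_cks"] += 1
        if ck == "DivCheck":
            c["division_cks"] += 1
        elif ck == "OverflowCheck":
            c["overflow_cks"] += 1
        elif ck in ("RangeCheck", "RangeCheckOnReturn"):
            c["range_cks"] += 1   # both range flavors land in range_cks
    return c

def count_exp(e):
    """Fold the counters over the tree; conj_cks_infor is componentwise +."""
    _tag, in_cks, ex_cks, children = e
    total = compute_cks_infor(in_cks + ex_cks)
    for child in children:
        total += count_exp(child)
    return total

# x / y with division and overflow checks on the operator node,
# and a range check enforced on the second operand:
div = ("BinOp", ["DivCheck", "OverflowCheck"], [],
       [("Name", [], [], []), ("Name", [], ["RangeCheck"], [])])
print(count_exp(div)["num_of_cks"])  # 3
```

Note that, as in the Coq `compute_cks_infor`, a `RangeCheckOnReturn` is tallied into the same `range_cks` counter as an ordinary `RangeCheck`, and `UndefinedCheck` is not counted at all.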
91 | 92 | 93 | (** ** Check Counter for Statement *) 94 | 95 | Function count_stmt_check_flags (c: stmtRT): cks_infor_t := 96 | match c with 97 | | NullRT => cks_infor 0 0 0 0 98 | | AssignRT n x e => 99 | conj_cks_infor (count_name_check_flags x) 100 | (count_exp_check_flags e) 101 | | IfRT n e c1 c2 => 102 | conj_cks_infor (count_exp_check_flags e) 103 | (conj_cks_infor 104 | (count_stmt_check_flags c1) 105 | (count_stmt_check_flags c2) 106 | ) 107 | | WhileRT n e c => 108 | conj_cks_infor (count_exp_check_flags e) 109 | (count_stmt_check_flags c) 110 | | CallRT n pn p args => 111 | (count_args_check_flags args) 112 | | SeqRT n c1 c2 => 113 | conj_cks_infor (count_stmt_check_flags c1) 114 | (count_stmt_check_flags c2) 115 | end. 116 | 117 | Function count_type_decl_check_flags (t: typeDeclRT): cks_infor_t := 118 | match t with 119 | | SubtypeDeclRT n tn t (RangeRT l u) => 120 | cks_infor 0 0 0 0 121 | | DerivedTypeDeclRT n tn t (RangeRT l u) => 122 | cks_infor 0 0 0 0 123 | | IntegerTypeDeclRT n tn (RangeRT l u) => 124 | cks_infor 0 0 0 0 125 | | ArrayTypeDeclRT n tn tm t => 126 | cks_infor 0 0 0 0 127 | | RecordTypeDeclRT n tn fs => 128 | cks_infor 0 0 0 0 129 | end. 130 | 131 | Function count_object_decl_check_flags (o: objDeclRT): cks_infor_t := 132 | match o with 133 | | mkobjDeclRT n x t None => 134 | cks_infor 0 0 0 0 135 | | mkobjDeclRT n x t (Some e) => 136 | count_exp_check_flags e 137 | end. 138 | 139 | Function count_object_decls_check_flags (lo: list objDeclRT): cks_infor_t := 140 | match lo with 141 | | nil => cks_infor 0 0 0 0 142 | | o1 :: lo1' => 143 | conj_cks_infor (count_object_decl_check_flags o1) 144 | (count_object_decls_check_flags lo1') 145 | end. 146 | 147 | Function count_param_spec_check_flags (param: paramSpecRT): cks_infor_t := 148 | match param with 149 | | mkparamSpecRT n x m t => 150 | cks_infor 0 0 0 0 151 | end. 
152 | 153 | Function count_param_specs_check_flags (lparam: list paramSpecRT): cks_infor_t := 154 | match lparam with 155 | | nil => cks_infor 0 0 0 0 156 | | param1 :: lparam1' => 157 | conj_cks_infor (count_param_spec_check_flags param1) 158 | (count_param_specs_check_flags lparam1') 159 | end. 160 | 161 | (** ** Check Counter for Declaration *) 162 | 163 | Function count_declaration_check_flags (d: declRT): cks_infor_t := 164 | match d with 165 | | NullDeclRT => cks_infor 0 0 0 0 166 | | TypeDeclRT n t => 167 | count_type_decl_check_flags t 168 | | ObjDeclRT n o => 169 | count_object_decl_check_flags o 170 | | ProcBodyDeclRT n p => 171 | count_procedure_body_check_flags p 172 | | SeqDeclRT n d1 d2 => 173 | conj_cks_infor (count_declaration_check_flags d1) (count_declaration_check_flags d2) 174 | end 175 | 176 | (** ** Check Counter for Procedure *) 177 | with count_procedure_body_check_flags (p: procBodyDeclRT): cks_infor_t := 178 | match p with 179 | | mkprocBodyDeclRT n p params decls stmt => 180 | conj_cks_infor (count_param_specs_check_flags params) 181 | (conj_cks_infor 182 | (count_declaration_check_flags decls) 183 | (count_stmt_check_flags stmt) 184 | ) 185 | end. 186 | 187 | (** ** Check Counter for Program *) 188 | Function count_program_check_flags (p: programRT): cks_infor_t := 189 | count_declaration_check_flags (p.(declsRT)). 190 | 191 | Definition count_option_program_check_flags (x: option programRT): cks_infor_t := 192 | match x with 193 | | Some p => count_program_check_flags p 194 | | None => cks_infor 0 0 0 0 195 | end. 196 | 197 | End Check_Count. 
198 | 199 | 200 | 201 | 202 | 203 | 204 | 205 | 206 | 207 | -------------------------------------------------------------------------------- /spark2014_semantics/src/rt_gen_impl.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | (* *************************************************************** 13 | Function Version 14 | *************************************************************** *) 15 | 16 | Require Export rt_gen. 17 | 18 | 19 | (** * toExpRTImpl *) 20 | 21 | Function toExpRTImpl (st: symTab) (e: exp): expRT := 22 | match e with 23 | | Literal n (Boolean_Literal b) => 24 | LiteralRT n (Boolean_Literal b) nil nil 25 | | Literal n (Integer_Literal v) => 26 | match (Zle_bool min_signed v) && (Zle_bool v max_signed) with 27 | | true => LiteralRT n (Integer_Literal v) nil nil (* optimization *) 28 | | false => LiteralRT n (Integer_Literal v) (OverflowCheck :: nil) nil 29 | end 30 | | Name n nm => 31 | let nRT := toNameRTImpl st nm in 32 | NameRT n nRT 33 | | BinOp n op e1 e2 => 34 | let e1RT := toExpRTImpl st e1 in 35 | let e2RT := toExpRTImpl st e2 in 36 | match op with 37 | | Plus => BinOpRT n op e1RT e2RT (OverflowCheck :: nil) nil 38 | | Minus => BinOpRT n op e1RT e2RT (OverflowCheck :: nil) nil 39 | | Multiply => BinOpRT n op e1RT e2RT (OverflowCheck :: nil) nil 40 | | Divide => BinOpRT n op e1RT e2RT (DivCheck :: OverflowCheck :: nil) nil 41 | | Modulus => BinOpRT n op e1RT e2RT (DivCheck :: nil) nil 42 | | _ => BinOpRT n op e1RT e2RT nil nil 43 | end 44 | | UnOp n op e => 45 | let eRT := toExpRTImpl st e in 46 | match op with 47 | | Unary_Minus => UnOpRT n op eRT (OverflowCheck :: nil) nil 48 | | _ => UnOpRT n op eRT nil nil 49 | end 50 | end 51 | 52 | (** * toNameRTImpl *) 53 | with toNameRTImpl (st: symTab) (n: name): nameRT := 54 | match n with 55 | | 
Identifier n x => 56 | IdentifierRT n x nil 57 | | IndexedComponent n x e => 58 | let xRT := toNameRTImpl st x in 59 | let eRT := toExpRTImpl st e in 60 | IndexedComponentRT n xRT (update_exterior_checks_exp eRT (RangeCheck :: nil)) nil 61 | | SelectedComponent n x f => 62 | let xRT := toNameRTImpl st x in 63 | SelectedComponentRT n xRT f nil 64 | end. 65 | 66 | (** * toArgsRTImpl *) 67 | Function toArgsRTImpl (st: symTab) (params: list paramSpec) (args: list exp): option (list expRT) := 68 | match params, args with 69 | | nil, nil => Some nil 70 | | param :: lparam, arg :: larg => 71 | match (toArgsRTImpl st lparam larg) with 72 | | Some largRT => 73 | match param.(parameter_mode) with 74 | | In => 75 | let argRT := toExpRTImpl st arg in 76 | if is_range_constrainted_type (param.(parameter_subtype_mark)) then 77 | Some ((update_exterior_checks_exp argRT (RangeCheck :: nil)) :: largRT) 78 | else 79 | Some (argRT :: largRT) 80 | | Out => 81 | match arg with 82 | | Name n nm => 83 | match fetch_exp_type n st with 84 | | Some t => 85 | let nRT := toNameRTImpl st nm in 86 | if is_range_constrainted_type t then 87 | Some ((NameRT n (update_exterior_checks_name nRT (RangeCheckOnReturn :: nil))) :: largRT) 88 | else 89 | Some ((NameRT n nRT) :: largRT) 90 | | None => None 91 | end 92 | | _ => None 93 | end 94 | | In_Out => 95 | match arg with 96 | | Name n nm => 97 | match fetch_exp_type n st with 98 | | Some t => 99 | let nRT := toNameRTImpl st nm in 100 | if is_range_constrainted_type (param.(parameter_subtype_mark)) then 101 | if is_range_constrainted_type t then 102 | Some ((NameRT n (update_exterior_checks_name nRT (RangeCheck :: RangeCheckOnReturn :: nil))) :: largRT) 103 | else 104 | Some ((NameRT n (update_exterior_checks_name nRT (RangeCheck :: nil))) :: largRT) 105 | else 106 | if is_range_constrainted_type t then 107 | Some ((NameRT n (update_exterior_checks_name nRT (RangeCheckOnReturn :: nil))) :: largRT) 108 | else 109 | Some ((NameRT n nRT) :: largRT) 110 | 
| _ => None 111 | end 112 | | _ => None 113 | end 114 | end 115 | | _ => None 116 | end 117 | | _, _ => None 118 | end. 119 | 120 | (** * toStmtRTImpl *) 121 | Function toStmtRTImpl (st: symTab) (c: stmt): option stmtRT := 122 | match c with 123 | | Null => Some NullRT 124 | | Assign n x e => 125 | let xRT := toNameRTImpl st x in 126 | let eRT := toExpRTImpl st e in 127 | match fetch_exp_type (name_astnum x) st with 128 | | Some t => 129 | if is_range_constrainted_type t then 130 | Some (AssignRT n xRT (update_exterior_checks_exp eRT (RangeCheck :: nil))) 131 | else 132 | Some (AssignRT n xRT eRT) 133 | | None => None 134 | end 135 | | If n e c1 c2 => 136 | let eRT := toExpRTImpl st e in 137 | let c1' := toStmtRTImpl st c1 in 138 | let c2' := toStmtRTImpl st c2 in 139 | match (c1', c2') with 140 | | (Some c1RT, Some c2RT) => 141 | Some (IfRT n eRT c1RT c2RT) 142 | | _ => None 143 | end 144 | | While n e c => 145 | let eRT := toExpRTImpl st e in 146 | let c' := toStmtRTImpl st c in 147 | match c' with 148 | | Some cRT => 149 | Some (WhileRT n eRT cRT) 150 | | _ => None 151 | end 152 | | Call n pn p args => 153 | match fetch_proc p st with 154 | | Some (n0, pb) => 155 | match toArgsRTImpl st (procedure_parameter_profile pb) args with 156 | | Some argsRT => Some (CallRT n pn p argsRT) 157 | | None => None 158 | end 159 | | None => None 160 | end 161 | | Seq n c1 c2 => 162 | let c1' := toStmtRTImpl st c1 in 163 | let c2' := toStmtRTImpl st c2 in 164 | match (c1', c2') with 165 | | (Some c1RT, Some c2RT) => 166 | Some (SeqRT n c1RT c2RT) 167 | | _ => None 168 | end 169 | end. 
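The check-generation pass above is driven by a small table: in `toExpRTImpl`, `Plus`, `Minus` and `Multiply` get an overflow check, `Divide` gets both a division and an overflow check, `Modulus` only a division check, and an integer literal already within the base-type bounds needs no check at all (the `(* optimization *)` case). A hedged Python sketch of just that flag-assignment table — the bounds assume the 32-bit `min_signed`/`max_signed` of the development, and the code is illustrative, not the implementation:

```python
MIN_SIGNED, MAX_SIGNED = -2**31, 2**31 - 1  # assumed 32-bit base type

# Interior checks required by each binary operator (mirrors toExpRTImpl).
BINOP_CHECKS = {
    "Plus":     ["OverflowCheck"],
    "Minus":    ["OverflowCheck"],
    "Multiply": ["OverflowCheck"],
    "Divide":   ["DivCheck", "OverflowCheck"],
    "Modulus":  ["DivCheck"],
}

def binop_checks(op):
    """Relational and logical operators fall through to the empty list."""
    return BINOP_CHECKS.get(op, [])

def literal_checks(v):
    """An in-range integer literal needs no overflow check."""
    return [] if MIN_SIGNED <= v <= MAX_SIGNED else ["OverflowCheck"]

print(binop_checks("Divide"))   # ['DivCheck', 'OverflowCheck']
print(literal_checks(5))        # []
```

Range checks, by contrast, are not in this table: they are exterior checks, attached by the context (array indexing, assignment, argument passing) via `update_exterior_checks_exp`.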
170 | 171 | Function toTypeDeclRTImpl (t: typeDecl): typeDeclRT := 172 | match t with 173 | | SubtypeDecl n tn t (Range l u) => 174 | SubtypeDeclRT n tn t (RangeRT l u) 175 | | DerivedTypeDecl n tn t (Range l u) => 176 | DerivedTypeDeclRT n tn t (RangeRT l u) 177 | | IntegerTypeDecl n tn (Range l u) => 178 | IntegerTypeDeclRT n tn (RangeRT l u) 179 | | ArrayTypeDecl n tn tm t => (* tn: array type name, tm: index type mark, t: component type *) 180 | ArrayTypeDeclRT n tn tm t 181 | | RecordTypeDecl n tn fs => (* tn: record type name, fs: list of field types *) 182 | RecordTypeDeclRT n tn fs 183 | end. 184 | 185 | (* Fails due to a bug in Coq 8.8. *) 186 | Fail Function toObjDeclRTImpl (st: symTab) (o: objDecl): objDeclRT := 187 | match o with 188 | | mkobjDecl n x t None => 189 | mkobjDeclRT n x t None 190 | | mkobjDecl n x t (Some e) => 191 | let eRT := toExpRTImpl st e in 192 | if is_range_constrainted_type t then 193 | mkobjDeclRT n x t (Some (update_exterior_checks_exp eRT (RangeCheck :: nil))) 194 | else 195 | mkobjDeclRT n x t (Some eRT) 196 | end. 197 | 198 | (* Temporary replacement for Coq 8.8. Remove when the original above ceases to fail. *) 199 | Function toObjDeclRTImpl (st: symTab) (o: objDecl): objDeclRT := 200 | match o with 201 | | mkobjDecl n x t None => 202 | mkobjDeclRT n x t None 203 | | mkobjDecl n x t (Some e) => 204 | if is_range_constrainted_type t then 205 | mkobjDeclRT n x t (Some (update_exterior_checks_exp (toExpRTImpl st e) (RangeCheck :: nil))) 206 | else 207 | mkobjDeclRT n x t (Some (toExpRTImpl st e)) 208 | end. 209 | 210 | Function toObjDeclsRTImpl (st: symTab) (lo: list objDecl): list objDeclRT := 211 | match lo with 212 | | nil => nil 213 | | o :: lo' => 214 | let oRT := toObjDeclRTImpl st o in 215 | let loRT := toObjDeclsRTImpl st lo' in 216 | oRT :: loRT 217 | end. 
218 | 219 | Function toParamSpecRTImpl (param: paramSpec): paramSpecRT := 220 | match param with 221 | | mkparamSpec n x t m => 222 | mkparamSpecRT n x t m 223 | end. 224 | 225 | Function toParamSpecsRTImpl (lparam: list paramSpec): list paramSpecRT := 226 | match lparam with 227 | | nil => nil 228 | | param :: lparam' => 229 | let paramRT := toParamSpecRTImpl param in 230 | let lparamRT := toParamSpecsRTImpl lparam' in 231 | paramRT :: lparamRT 232 | end. 233 | 234 | (** * toDeclRTImpl *) 235 | Function toDeclRTImpl (st: symTab) (d: decl): option declRT := 236 | match d with 237 | | NullDecl => Some NullDeclRT 238 | | TypeDecl n t => 239 | let tRT := toTypeDeclRTImpl t in 240 | Some (TypeDeclRT n tRT) 241 | | ObjDecl n o => 242 | let oRT := toObjDeclRTImpl st o in 243 | Some (ObjDeclRT n oRT) 244 | | ProcBodyDecl n p => 245 | match toProcBodyDeclRTImpl st p with 246 | | Some pRT => Some (ProcBodyDeclRT n pRT) 247 | | None => None 248 | end 249 | | SeqDecl n d1 d2 => 250 | let d1' := toDeclRTImpl st d1 in 251 | let d2' := toDeclRTImpl st d2 in 252 | match (d1', d2') with 253 | | (Some d1RT, Some d2RT) => Some (SeqDeclRT n d1RT d2RT) 254 | | _ => None 255 | end 256 | end 257 | 258 | (** * toProcBodyDeclRTImpl *) 259 | with toProcBodyDeclRTImpl (st: symTab) (p: procBodyDecl): option procBodyDeclRT := 260 | match p with 261 | | mkprocBodyDecl n p params decls stmt => 262 | let paramsRT := toParamSpecsRTImpl params in 263 | let decls' := toDeclRTImpl st decls in 264 | let stmt' := toStmtRTImpl st stmt in 265 | match (decls', stmt') with 266 | | (Some declsRT, Some stmtRT) => Some (mkprocBodyDeclRT n p paramsRT declsRT stmtRT) 267 | | _ => None 268 | end 269 | end. 270 | 271 | (** * toProgramRTImpl *) 272 | Function toProgramRTImpl (st: symTab) (p: program): option programRT := 273 | match toDeclRTImpl st p.(decls) with 274 | | Some declsRT => Some (mkprogramRT declsRT p.(main)) 275 | | None => None 276 | end. 
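The translation functions return an option so that a failure anywhere — an unknown procedure, a non-name `Out` argument, a missing type — aborts the whole run: `toProgramRTImpl` succeeds only if `toDeclRTImpl` succeeds on every declaration. A small Python sketch of this `None`-propagation pattern; the declaration shapes here are hypothetical stand-ins for the Coq constructors:

```python
def to_decl_rt(d):
    """Translate one declaration; None signals an untranslatable input."""
    kind, payload = d
    if kind == "Null":
        return ("NullRT", None)
    if kind == "Seq":
        d1, d2 = payload
        r1, r2 = to_decl_rt(d1), to_decl_rt(d2)
        if r1 is None or r2 is None:   # a failure on either side aborts
            return None
        return ("SeqRT", (r1, r2))
    return None                         # unknown declaration kind

def to_program_rt(decls):
    """Mirror of toProgramRTImpl: succeed iff the declarations translate."""
    rt = to_decl_rt(decls)
    return None if rt is None else {"declsRT": rt}

ok = to_program_rt(("Seq", (("Null", None), ("Null", None))))
bad = to_program_rt(("Seq", (("Null", None), ("Mystery", None))))
print(ok is not None, bad is None)  # True True
```

In the Coq code the same effect is achieved by matching each intermediate `option` result and returning `None` on the failing branch.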
277 | -------------------------------------------------------------------------------- /spark2014_semantics/src/rt_gen_util.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | (** * Helper Lemmas for RT-GEN *) 13 | 14 | Require Import CpdtTactics. 15 | Require Export rt_gen. 16 | 17 | Lemma exp_ast_num_eq: forall st e e', 18 | toExpRT st e e' -> 19 | expression_astnum e = expression_astnum_rt e'. 20 | Proof. 21 | intros; inversion H; smack. 22 | Qed. 23 | 24 | Lemma name_ast_num_eq: forall st n n', 25 | toNameRT st n n' -> 26 | name_astnum n = name_astnum_rt n'. 27 | Proof. 28 | intros; inversion H; smack. 29 | Qed. 30 | 31 | Lemma exp_exterior_checks_beq_nil: forall st e e', 32 | toExpRT st e e' -> 33 | exp_exterior_checks e' = nil. 34 | Proof. 35 | intros; 36 | inversion H; smack; 37 | inversion H; subst; 38 | destruct n; 39 | match goal with 40 | | [H: toNameRT _ _ _ |- _] => inversion H; smack 41 | end. 42 | Qed. 43 | 44 | Lemma name_exterior_checks_beq_nil: forall st n n', 45 | toNameRT st n n' -> 46 | name_exterior_checks n' = nil. 47 | Proof. 48 | intros; 49 | inversion H; smack. 50 | Qed. 51 | 52 | Lemma procedure_components_rel: forall st p p', 53 | toProcBodyDeclRT st p p' -> 54 | toParamSpecsRT (procedure_parameter_profile p) (procedure_parameter_profile_rt p') /\ 55 | toDeclRT st (procedure_declarative_part p) (procedure_declarative_part_rt p') /\ 56 | toStmtRT st (procedure_statements p) (procedure_statements_rt p'). 57 | Proof. 58 | intros. 59 | inversion H; smack. 60 | Qed. 
61 | 62 | 63 | (** * SymTabRT Generator *) 64 | 65 | Inductive toProcDeclRTMap: symTab -> 66 | list (idnum * (level * procBodyDecl)) -> 67 | list (idnum * (level * procBodyDeclRT)) -> Prop := 68 | | ToProcDeclMapNull: forall st, 69 | toProcDeclRTMap st nil nil 70 | | ToProcDeclMap: forall st pb pb' pl pl' p l, 71 | toProcBodyDeclRT st pb pb' -> 72 | toProcDeclRTMap st pl pl' -> 73 | toProcDeclRTMap st ((p, (l, pb)) :: pl) ((p, (l, pb')) :: pl'). 74 | 75 | Inductive toTypeDeclRTMap: list (idnum * typeDecl) -> list (idnum * typeDeclRT) -> Prop := 76 | | ToTypeDeclMapNull: 77 | toTypeDeclRTMap nil nil 78 | | ToTypeDeclMap: forall t t' tl tl' x, 79 | toTypeDeclRT t t' -> 80 | toTypeDeclRTMap tl tl' -> 81 | toTypeDeclRTMap ((x, t) :: tl) ((x, t') :: tl'). 82 | 83 | Inductive toSymTabRT: symTab -> symTabRT -> Prop := 84 | | ToSymTab: forall p p' t t' x e srcloc nametable, 85 | toProcDeclRTMap (mkSymTab x p t e srcloc nametable) p p' -> 86 | toTypeDeclRTMap t t' -> 87 | toSymTabRT (mkSymTab x p t e srcloc nametable) (mkSymTabRT x p' t' e srcloc nametable). 88 | 89 | 90 | (** ** Help Lemmas for SymTabRT *) 91 | 92 | Lemma procedure_declaration_rel: forall st pm pm' p n pb, 93 | toProcDeclRTMap st pm pm' -> 94 | Symbol_Table_Module.SymTable_Procs.fetches p pm = Some (n, pb) -> 95 | exists pb', 96 | Symbol_Table_Module_RT.SymTable_Procs.fetches p pm' = Some (n, pb') /\ 97 | toProcBodyDeclRT st pb pb'. 98 | Proof. 99 | intros st pm pm' p n pb H; revert p n pb. 100 | induction H; intros. 101 | - inversion H; inversion H0; auto. 102 | - unfold Symbol_Table_Module.SymTable_Procs.fetches in H1. 103 | unfold Symbol_Table_Module_RT.SymTable_Procs.fetches. 104 | remember (beq_nat p0 p) as Beq. 105 | destruct Beq. 106 | + smack. 107 | + specialize (IHtoProcDeclRTMap _ _ _ H1). 108 | auto. 109 | Qed. 
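`procedure_declaration_rel` states that translating the procedure map entry-by-entry commutes with lookup: if `fetches` finds `(n, pb)` in the original map, the translated map yields `(n, pb')` at the same key, with `pb'` the translation of `pb`. The property can be illustrated in Python with association lists; `translate` here is a hypothetical stand-in for `toProcBodyDeclRT`:

```python
def fetches(key, assoc):
    """First-match lookup in an association list (mirrors SymTable fetches)."""
    for k, v in assoc:
        if k == key:
            return v
    return None

def translate_map(assoc, translate):
    """Translate every value, keeping keys and order (as toProcDeclRTMap does)."""
    return [(k, translate(v)) for k, v in assoc]

translate = lambda body: body + "_RT"   # hypothetical body translation
pm = [(1, "proc_p"), (2, "proc_q")]
pm_rt = translate_map(pm, translate)

# Lookup commutes with translation:
print(fetches(2, pm_rt) == translate(fetches(2, pm)))  # True
```

The backward lemma is the symmetric statement: a successful lookup in the translated map implies a successful lookup of a related body in the original.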
110 | 111 | Lemma procedure_declaration_rel_backward: forall st pm pm' p n pb, 112 | toProcDeclRTMap st pm pm' -> 113 | Symbol_Table_Module_RT.SymTable_Procs.fetches p pm' = Some (n, pb) -> 114 | exists pb', 115 | Symbol_Table_Module.SymTable_Procs.fetches p pm = Some (n, pb') /\ 116 | toProcBodyDeclRT st pb' pb. 117 | Proof. 118 | intros st pm pm' p n pb H; revert p n pb. 119 | induction H; intros. 120 | - inversion H; inversion H0; auto. 121 | - unfold Symbol_Table_Module_RT.SymTable_Procs.fetches in H1. 122 | unfold Symbol_Table_Module.SymTable_Procs.fetches. 123 | remember (beq_nat p0 p) as Beq. 124 | destruct Beq. 125 | + smack. 126 | + specialize (IHtoProcDeclRTMap _ _ _ H1); 127 | auto. 128 | Qed. 129 | 130 | Lemma symbol_table_procedure_rel: forall st st' p n pb, 131 | toSymTabRT st st' -> 132 | fetch_proc p st = Some (n, pb) -> 133 | exists pb', 134 | fetch_proc_rt p st' = Some (n, pb') /\ 135 | toProcBodyDeclRT st pb pb'. 136 | Proof. 137 | intros. 138 | inversion H; subst; clear H. 139 | unfold fetch_proc_rt; 140 | unfold fetch_proc in H0; 141 | simpl in *. 142 | specialize (procedure_declaration_rel _ _ _ _ _ _ H1 H0); 143 | auto. 144 | Qed. 145 | 146 | Lemma symbol_table_procedure_rel_backward: forall st st' p n pb, 147 | toSymTabRT st st' -> 148 | fetch_proc_rt p st' = Some (n, pb) -> 149 | exists pb', 150 | fetch_proc p st = Some (n, pb') /\ 151 | toProcBodyDeclRT st pb' pb. 152 | Proof. 153 | intros. 154 | inversion H; subst; clear H. 155 | unfold fetch_proc; 156 | unfold fetch_proc_rt in H0; 157 | simpl in *. 158 | specialize (procedure_declaration_rel_backward _ _ _ _ _ _ H1 H0); 159 | auto. 160 | Qed. 161 | 162 | Lemma symbol_table_var_rel: forall st st' x, 163 | toSymTabRT st st' -> 164 | fetch_var x st = fetch_var_rt x st'. 165 | Proof. 166 | intros. 167 | inversion H; smack. 168 | Qed. 
169 | 170 | Lemma type_declaration_rel: forall tm tm' t td, 171 | toTypeDeclRTMap tm tm' -> 172 | Symbol_Table_Module.SymTable_Types.fetches t tm = Some td -> 173 | exists td', 174 | Symbol_Table_Module_RT.SymTable_Types.fetches t tm' = Some td' /\ 175 | toTypeDeclRT td td'. 176 | Proof. 177 | intros tm tm' t td H; revert t td. 178 | induction H; smack. 179 | destruct (beq_nat t0 x). 180 | - inversion H; smack. 181 | - apply IHtoTypeDeclRTMap; auto. 182 | Qed. 183 | 184 | Lemma type_declaration_rel_backward: forall tm tm' t td, 185 | toTypeDeclRTMap tm tm' -> 186 | Symbol_Table_Module_RT.SymTable_Types.fetches t tm' = Some td -> 187 | exists td', 188 | Symbol_Table_Module.SymTable_Types.fetches t tm = Some td' /\ 189 | toTypeDeclRT td' td. 190 | Proof. 191 | intros tm tm' t td H; revert t td. 192 | induction H; smack. 193 | destruct (beq_nat t0 x). 194 | - inversion H; smack. 195 | - apply IHtoTypeDeclRTMap; auto. 196 | Qed. 197 | 198 | Lemma symbol_table_type_rel: forall st st' t td, 199 | toSymTabRT st st' -> 200 | fetch_type t st = Some td -> 201 | exists td', 202 | fetch_type_rt t st' = Some td' /\ toTypeDeclRT td td'. 203 | Proof. 204 | intros. 205 | inversion H; smack. 206 | unfold fetch_type, Symbol_Table_Module.fetch_type in H0; 207 | unfold fetch_type_rt, Symbol_Table_Module_RT.fetch_type; smack. 208 | apply type_declaration_rel with (tm := t0); auto. 209 | Qed. 210 | 211 | Lemma symbol_table_type_rel_backward: forall st st' t td, 212 | toSymTabRT st st' -> 213 | fetch_type_rt t st' = Some td -> 214 | exists td', 215 | fetch_type t st = Some td' /\ toTypeDeclRT td' td. 216 | Proof. 217 | intros. 218 | inversion H; smack. 219 | unfold fetch_type, Symbol_Table_Module_RT.fetch_type in H0; 220 | unfold fetch_type_rt, Symbol_Table_Module.fetch_type; smack. 221 | apply type_declaration_rel_backward with (tm' := t'); auto. 222 | Qed. 
223 | 224 | Lemma symbol_table_exp_type_rel: forall st st' e t, 225 | toSymTabRT st st' -> 226 | fetch_exp_type e st = t -> 227 | fetch_exp_type_rt e st' = t. 228 | Proof. 229 | intros. 230 | inversion H; smack. 231 | Qed. 232 | 233 | Lemma symbol_table_exp_type_eq: forall st st' e, 234 | toSymTabRT st st' -> 235 | fetch_exp_type e st = fetch_exp_type_rt e st'. 236 | Proof. 237 | intros. 238 | inversion H; smack. 239 | Qed. 240 | 241 | Lemma subtype_range_rel: forall st st' t l u, 242 | toSymTabRT st st' -> 243 | extract_subtype_range st t (Range l u) -> 244 | extract_subtype_range_rt st' t (RangeRT l u). 245 | Proof. 246 | intros. 247 | inversion H0; clear H0; smack. 248 | specialize (symbol_table_type_rel _ _ _ _ H H6); clear H6; smack. 249 | apply Extract_Range_RT with (tn := tn) (td := x); smack. 250 | destruct td; inversion H7; subst; 251 | inversion H2; auto. 252 | Qed. 253 | 254 | Lemma subtype_range_rel_backward: forall st st' t l u, 255 | toSymTabRT st st' -> 256 | extract_subtype_range_rt st' t (RangeRT l u) -> 257 | extract_subtype_range st t (Range l u). 258 | Proof. 259 | intros. 260 | inversion H0; clear H0; smack. 261 | specialize (symbol_table_type_rel_backward _ _ _ _ H H6); clear H6; smack. 262 | apply Extract_Range with (tn := tn) (td := x); smack. 263 | destruct td; inversion H7; subst; 264 | inversion H2; auto. 265 | Qed. 266 | 267 | Lemma index_range_rel: forall st st' t l u, 268 | toSymTabRT st st' -> 269 | extract_array_index_range st t (Range l u) -> 270 | extract_array_index_range_rt st' t (RangeRT l u). 271 | Proof. 272 | intros. 273 | inversion H0; clear H0; smack. 274 | specialize (symbol_table_type_rel _ _ _ _ H H3); clear H3; smack. 275 | specialize (symbol_table_type_rel _ _ _ _ H H7); clear H7; smack. 276 | apply Extract_Index_Range_RT with (a_ast_num := a_ast_num) (tn := tn) (tm := tm) 277 | (typ := typ) (tn' := tn') (td := x0); smack. 278 | inversion H2; auto. 279 | destruct td; inversion H8; subst; 280 | inversion H5; auto. 
281 | Qed. 282 | 283 | Lemma index_range_rel_backward: forall st st' t l u, 284 | toSymTabRT st st' -> 285 | extract_array_index_range_rt st' t (RangeRT l u) -> 286 | extract_array_index_range st t (Range l u). 287 | Proof. 288 | intros. 289 | inversion H0; clear H0; smack. 290 | specialize (symbol_table_type_rel_backward _ _ _ _ H H3); clear H3; smack. 291 | specialize (symbol_table_type_rel_backward _ _ _ _ H H7); clear H7; smack. 292 | apply Extract_Index_Range with (a_ast_num := a_ast_num) (tn := tn) (tm := tm) 293 | (typ := typ) (tn' := tn') (td := x0); smack. 294 | inversion H2; auto. 295 | destruct td; inversion H8; subst; 296 | inversion H5; auto. 297 | Qed. 298 | -------------------------------------------------------------------------------- /spark2014_semantics/src/run: -------------------------------------------------------------------------------- 1 | if [ ! $# == 2 ]; then 2 | echo "Usage: $0 coq_ast_source_file_name any_file_name.v" 3 | exit 4 | fi 5 | 6 | INPUT_FILENAME="$1" 7 | OUTPUT_FILENAME="$2" 8 | SUFFIX=".v" 9 | GNAT_AST="AST_Tree" 10 | GNAT_AST_X="AST_TreeRT" 11 | EXPECTED_COMPLETE_AST="Expected_Complete_Ast" 12 | EXPECTED_OPTIMIZED_AST="Expected_Optimized_Ast" 13 | SYMBOL_TABLE="Symbol_Table" 14 | SYMBOL_TABLE_X="Symbol_TableRT" 15 | 16 | if [[ ! -f $INPUT_FILENAME ]]; then 17 | echo "Warning: File ($INPUT_FILENAME) does not exist !" 
18 | exit 19 | fi 20 | 21 | if [[ $OUTPUT_FILENAME != *".v" ]]; then 22 | echo "Warning: File ($OUTPUT_FILENAME) should end with \".v\"" 23 | exit 24 | fi 25 | 26 | > $2 # empty the file 27 | echo -e "Require Export rt_counter.\n" >> $2 28 | #echo -e "Require Export rt_opt_compare.\n" >> $2 # -e interprets \n as a newline 29 | echo -e "Require Export rt_validator.\n" >> $2 # -e interprets \n as a newline 30 | echo "" >> $2 # insert newline 31 | echo -e "Require Export rt_gen_impl.\n" >> $2 32 | echo -e "Require Export rt_opt_impl.\n" >> $2 33 | 34 | cat $1 >> $2 35 | 36 | ############################################################################### 37 | # GNAT_AST_X: run-time checks of GNAT compiler 38 | # EXPECTED_COMPLETE_AST: run-time checks generated by our checks_generator 39 | # EXPECTED_OPTIMIZED_AST: run-time checks optimized by our checks_optimization 40 | # it should hold that: 41 | # EXPECTED_OPTIMIZED_AST <= GNAT_AST_X <= EXPECTED_COMPLETE_AST 42 | ############################################################################### 43 | 44 | echo "" >> $2 45 | 46 | echo -e "Definition $EXPECTED_COMPLETE_AST := toProgramRTImpl $SYMBOL_TABLE $GNAT_AST.\n" >> $2 47 | echo -e "Definition $EXPECTED_OPTIMIZED_AST := optOProgramImpl $SYMBOL_TABLE_X $EXPECTED_COMPLETE_AST.\n" >> $2 48 | echo -e "Definition Return_Msgs := program_checks_validator $EXPECTED_OPTIMIZED_AST $GNAT_AST_X $EXPECTED_COMPLETE_AST.\n" >> $2 49 | #echo -e "Definition Return_Msgs := checks_optimization_compare $EXPECTED_OPTIMIZED_AST $GNAT_AST_X $EXPECTED_COMPLETE_AST.\n" >> $2 50 | echo -e "Definition Result := map_to_source_location $SYMBOL_TABLE Return_Msgs.\n" >> $2 51 | echo "Eval compute in Result."
>> $2 52 | echo -e "Eval compute in (count_option_program_check_flags $EXPECTED_OPTIMIZED_AST).\n" >> $2 53 | echo -e "Eval compute in (count_program_check_flags $GNAT_AST_X).\n" >> $2 54 | echo -e "Eval compute in (count_option_program_check_flags $EXPECTED_COMPLETE_AST).\n" >> $2 55 | 56 | eval "coqc $OUTPUT_FILENAME" # run the test 57 | 58 | 59 | 60 | 61 | -------------------------------------------------------------------------------- /spark2014_semantics/src/run_check_count: -------------------------------------------------------------------------------- 1 | if [ ! $# == 1 ]; then 2 | echo "Usage: $0 filename" 3 | exit 4 | fi 5 | 6 | INPUT_FILENAME="$1" 7 | 8 | egrep -o "((Do_\S+))" $INPUT_FILENAME | cut -d " " -f 2 | wc -l 9 | 10 | # egrep -o "(\s(rs\S+))" data.txt | cut -d " " -f 2 > newfile.txt 11 | # \s matches a leading whitespace character 12 | # (rs\S+) then matches a string that starts with "rs" followed by non-whitespace characters 13 | # The results still contain that leading whitespace, which we don't want, 14 | # so we "cut" it out before the content is written to the new file. 15 | -------------------------------------------------------------------------------- /spark2014_semantics/src/run_tests: -------------------------------------------------------------------------------- 1 | if [ !
$# == 1 ]; then 2 | echo "Usage: $0 output_file_name" 3 | exit 4 | fi 5 | 6 | OUTPUT_FILENAME="$1" 7 | 8 | echo "jago_acats_c53007a" >> $1 9 | run ./tests/jago_acats_c53007a.smf.v ./tests/jago_acats_c53007a.v >> $1 10 | echo "" >> $1 # insert newline 11 | echo "" >> $1 # insert newline 12 | 13 | # > $1 # empty the file 14 | echo "jago_acats_c55c02b" >> $1 15 | run ./tests/jago_acats_c55c02b.smf.v ./tests/jago_acats_c55c02b.v >> $1 16 | echo "" >> $1 # insert newline 17 | echo "" >> $1 # insert newline 18 | 19 | echo "jago_array_record_package" >> $1 20 | run ./tests/jago_array_record_package.smf.v ./tests/jago_array_record_package.v >> $1 21 | echo "" >> $1 # insert newline 22 | echo "" >> $1 # insert newline 23 | 24 | echo "jago_array_subtype_index" >> $1 25 | run ./tests/jago_array_subtype_index.smf.v ./tests/jago_array_subtype_index.v >> $1 26 | echo "" >> $1 # insert newline 27 | echo "" >> $1 # insert newline 28 | 29 | echo "jago_arrayrecord" >> $1 30 | run ./tests/jago_arrayrecord.smf.v ./tests/jago_arrayrecord.v >> $1 31 | echo "" >> $1 # insert newline 32 | echo "" >> $1 # insert newline 33 | 34 | echo "jago_assign_subtype_var" >> $1 35 | run ./tests/jago_assign_subtype_var.smf.v ./tests/jago_assign_subtype_var.v >> $1 36 | echo "" >> $1 # insert newline 37 | echo "" >> $1 # insert newline 38 | 39 | echo "jago_binary_search" >> $1 40 | run ./tests/jago_binary_search.smf.v ./tests/jago_binary_search.v >> $1 41 | echo "" >> $1 # insert newline 42 | echo "" >> $1 # insert newline 43 | 44 | echo "jago_binary_search_test" >> $1 45 | run ./tests/jago_binary_search_test.smf.v ./tests/jago_binary_search_test.v >> $1 46 | echo "" >> $1 # insert newline 47 | echo "" >> $1 # insert newline 48 | 49 | echo "jago_bounded_in_out" >> $1 50 | run ./tests/jago_bounded_in_out.smf.v ./tests/jago_bounded_in_out.v >> $1 51 | echo "" >> $1 # insert newline 52 | echo "" >> $1 # insert newline 53 | 54 | echo "jago_dependence_test_suite_01" >> $1 55 | run 
./tests/jago_dependence_test_suite_01.smf.v ./tests/jago_dependence_test_suite_01.v >> $1 56 | echo "" >> $1 # insert newline 57 | echo "" >> $1 # insert newline 58 | 59 | echo "jago_dependence_test_suite_02" >> $1 60 | run ./tests/jago_dependence_test_suite_02.smf.v ./tests/jago_dependence_test_suite_02.v >> $1 61 | echo "" >> $1 # insert newline 62 | echo "" >> $1 # insert newline 63 | 64 | echo "jago_division_by_non_zero" >> $1 65 | run ./tests/jago_division_by_non_zero.smf.v ./tests/jago_division_by_non_zero.v >> $1 66 | echo "" >> $1 # insert newline 67 | echo "" >> $1 # insert newline 68 | 69 | echo "jago_example" >> $1 70 | run ./tests/jago_example.smf.v ./tests/jago_example.v >> $1 71 | echo "" >> $1 # insert newline 72 | echo "" >> $1 # insert newline 73 | 74 | echo "jago_factorial" >> $1 75 | run ./tests/jago_factorial.smf.v ./tests/jago_factorial.v >> $1 76 | echo "" >> $1 # insert newline 77 | echo "" >> $1 # insert newline 78 | 79 | echo "jago_faultintegrator" >> $1 80 | run ./tests/jago_faultintegrator.smf.v ./tests/jago_faultintegrator.v >> $1 81 | echo "" >> $1 # insert newline 82 | echo "" >> $1 # insert newline 83 | 84 | echo "jago_gcd" >> $1 85 | run ./tests/jago_gcd.smf.v ./tests/jago_gcd.v >> $1 86 | echo "" >> $1 # insert newline 87 | echo "" >> $1 # insert newline 88 | 89 | echo "jago_gnatprove_test_bool" >> $1 90 | run ./tests/jago_gnatprove_test_bool.smf.v ./tests/jago_gnatprove_test_bool.v >> $1 91 | echo "" >> $1 # insert newline 92 | echo "" >> $1 # insert newline 93 | 94 | echo "jago_linear_div" >> $1 95 | run ./tests/jago_linear_div.smf.v ./tests/jago_linear_div.v >> $1 96 | echo "" >> $1 # insert newline 97 | echo "" >> $1 # insert newline 98 | 99 | echo "jago_max" >> $1 100 | run ./tests/jago_max.smf.v ./tests/jago_max.v >> $1 101 | echo "" >> $1 # insert newline 102 | echo "" >> $1 # insert newline 103 | 104 | echo "jago_min" >> $1 105 | run ./tests/jago_min.smf.v ./tests/jago_min.v >> $1 106 | echo "" >> $1 # insert newline 107 | 
echo "" >> $1 # insert newline 108 | 109 | echo "jago_modulus" >> $1 110 | run ./tests/jago_modulus.smf.v ./tests/jago_modulus.v >> $1 111 | echo "" >> $1 # insert newline 112 | echo "" >> $1 # insert newline 113 | 114 | echo "jago_odd" >> $1 115 | run ./tests/jago_odd.smf.v ./tests/jago_odd.v >> $1 116 | echo "" >> $1 # insert newline 117 | echo "" >> $1 # insert newline 118 | 119 | echo "jago_p_simple_call" >> $1 120 | run ./tests/jago_p_simple_call.smf.v ./tests/jago_p_simple_call.v >> $1 121 | echo "" >> $1 # insert newline 122 | echo "" >> $1 # insert newline 123 | 124 | echo "jago_p_simple_call_two" >> $1 125 | run ./tests/jago_p_simple_call_two.smf.v ./tests/jago_p_simple_call_two.v >> $1 126 | echo "" >> $1 # insert newline 127 | echo "" >> $1 # insert newline 128 | 129 | echo "jago_packagedemo_a" >> $1 130 | run ./tests/jago_packagedemo_a.smf.v ./tests/jago_packagedemo_a.v >> $1 131 | echo "" >> $1 # insert newline 132 | echo "" >> $1 # insert newline 133 | 134 | echo "jago_prime" >> $1 135 | run ./tests/jago_prime.smf.v ./tests/jago_prime.v >> $1 136 | echo "" >> $1 # insert newline 137 | echo "" >> $1 # insert newline 138 | 139 | echo "jago_proceduretest01" >> $1 140 | run ./tests/jago_proceduretest01.smf.v ./tests/jago_proceduretest01.v >> $1 141 | echo "" >> $1 # insert newline 142 | echo "" >> $1 # insert newline 143 | 144 | echo "jago_quantifiertest" >> $1 145 | run ./tests/jago_quantifiertest.smf.v ./tests/jago_quantifiertest.v >> $1 146 | echo "" >> $1 # insert newline 147 | echo "" >> $1 # insert newline 148 | 149 | echo "jago_recordtest01" >> $1 150 | run ./tests/jago_recordtest01.smf.v ./tests/jago_recordtest01.v >> $1 151 | echo "" >> $1 # insert newline 152 | echo "" >> $1 # insert newline 153 | 154 | echo "jago_recursive_proc_pkg" >> $1 155 | run ./tests/jago_recursive_proc_pkg.smf.v ./tests/jago_recursive_proc_pkg.v >> $1 156 | echo "" >> $1 # insert newline 157 | echo "" >> $1 # insert newline 158 | 159 | echo "jago_sort" >> $1 160 | run 
./tests/jago_sort.smf.v ./tests/jago_sort.v >> $1 161 | echo "" >> $1 # insert newline 162 | echo "" >> $1 # insert newline 163 | 164 | echo "jago_test_case_10" >> $1 165 | run ./tests/jago_test_case_10.smf.v ./tests/jago_test_case_10.v >> $1 166 | echo "" >> $1 # insert newline 167 | echo "" >> $1 # insert newline 168 | 169 | echo "jago_the_stack" >> $1 170 | run ./tests/jago_the_stack.smf.v ./tests/jago_the_stack.v >> $1 171 | echo "" >> $1 # insert newline 172 | echo "" >> $1 # insert newline 173 | 174 | echo "jago_the_stack_praxis" >> $1 175 | run ./tests/jago_the_stack_praxis.smf.v ./tests/jago_the_stack_praxis.v >> $1 176 | echo "" >> $1 # insert newline 177 | echo "" >> $1 # insert newline 178 | 179 | echo "jago_two_way_sort" >> $1 180 | run ./tests/jago_two_way_sort.smf.v ./tests/jago_two_way_sort.v >> $1 181 | echo "" >> $1 # insert newline 182 | echo "" >> $1 # insert newline 183 | 184 | echo "jago_tetris" >> $1 185 | run ./tests/jago_tetris_functional.smf.v ./tests/jago_tetris_functional.v >> $1 186 | echo "" >> $1 # insert newline 187 | echo "" >> $1 # insert newline 188 | 189 | echo "jago_skein" >> $1 190 | run ./tests/jago_skein.smf.v ./tests/jago_skein.v >> $1 191 | echo "" >> $1 # insert newline 192 | echo "" >> $1 # insert newline 193 | -------------------------------------------------------------------------------- /spark2014_semantics/src/symboltable.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export symboltable_module. 13 | Require Export ast_rt. 14 | 15 | (** * Symbol Table *) 16 | 17 | (** it's used to map back to the source location once an error is detected in ast tree *) 18 | Record source_location := sloc{ 19 | line : nat; 20 | col : nat; 21 | endline: nat; 22 | endcol : nat 23 | }. 
24 | 25 | (** symbol table for normal SPARK program *) 26 | Module Symbol_Table_Elements <: SymTable_Element. 27 | Definition Procedure_Decl := procBodyDecl. 28 | Definition Type_Decl := typeDecl. 29 | Definition Source_Location := source_location. 30 | End Symbol_Table_Elements. 31 | 32 | (** symbol table for SPARK program with run-time check decorations *) 33 | Module Symbol_Table_Elements_RT <: SymTable_Element. 34 | Definition Procedure_Decl := procBodyDeclRT. 35 | Definition Type_Decl := typeDeclRT. 36 | Definition Source_Location := source_location. 37 | End Symbol_Table_Elements_RT. 38 | 39 | Module Symbol_Table_Module := SymbolTableM (Symbol_Table_Elements). 40 | 41 | Module Symbol_Table_Module_RT := SymbolTableM (Symbol_Table_Elements_RT). 42 | 43 | (** ** SymTab *) 44 | Definition symTab := Symbol_Table_Module.symboltable. 45 | Definition mkSymTab := Symbol_Table_Module.mkSymbolTable. 46 | Definition proc_decl := Symbol_Table_Module.proc_decl. 47 | Definition type_decl := Symbol_Table_Module.type_decl. 48 | 49 | (** ** SymTabRT *) 50 | Definition symTabRT := Symbol_Table_Module_RT.symboltable. 51 | Definition mkSymTabRT := Symbol_Table_Module_RT.mkSymbolTable. 52 | Definition proc_decl_rt := Symbol_Table_Module_RT.proc_decl. 53 | Definition type_decl_rt := Symbol_Table_Module_RT.type_decl. 54 | 55 | Definition level := Symbol_Table_Module.level. 56 | 57 | (** * Symbol Table Operation *) 58 | 59 | (** ** Symbol Table Operation for AST *) 60 | (** name table and symbol table operations for program (AST) *) 61 | 62 | Definition reside_nametable_vars := Symbol_Table_Module.reside_nametable_vars. 63 | Definition reside_nametable_procs := Symbol_Table_Module.reside_nametable_procs. 64 | Definition reside_nametable_pkgs := Symbol_Table_Module.reside_nametable_pkgs. 65 | Definition reside_nametable_types := Symbol_Table_Module.reside_nametable_types. 66 | Definition fetch_var_name := Symbol_Table_Module.fetch_var_name. 
67 | Definition fetch_proc_name := Symbol_Table_Module.fetch_proc_name. 68 | Definition fetch_pkg_name := Symbol_Table_Module.fetch_pkg_name. 69 | Definition fetch_type_name := Symbol_Table_Module.fetch_type_name. 70 | 71 | Definition reside_symtable_vars := Symbol_Table_Module.reside_symtable_vars. 72 | Definition reside_symtable_procs := Symbol_Table_Module.reside_symtable_procs. 73 | Definition reside_symtable_types := Symbol_Table_Module.reside_symtable_types. 74 | Definition reside_symtable_exps := Symbol_Table_Module.reside_symtable_exps. 75 | Definition reside_symtable_sloc := Symbol_Table_Module.reside_symtable_sloc. 76 | Definition fetch_var := Symbol_Table_Module.fetch_var. 77 | Definition fetch_proc := Symbol_Table_Module.fetch_proc. 78 | Definition fetch_type := Symbol_Table_Module.fetch_type. 79 | Definition fetch_exp_type := Symbol_Table_Module.fetch_exp_type. 80 | Definition fetch_sloc := Symbol_Table_Module.fetch_sloc. 81 | Definition update_vars := Symbol_Table_Module.update_vars. 82 | Definition update_procs := Symbol_Table_Module.update_procs. 83 | Definition update_types := Symbol_Table_Module.update_types. 84 | Definition update_exps := Symbol_Table_Module.update_exps. 85 | Definition update_sloc := Symbol_Table_Module.update_sloc. 86 | 87 | (** ** Symbol Table Operation for AST_RT *) 88 | (** name table and symbol table operations for program with run-time check decorations (AST_RT) *) 89 | 90 | Definition reside_nametable_vars_rt := Symbol_Table_Module_RT.reside_nametable_vars. 91 | Definition reside_nametable_procs_rt := Symbol_Table_Module_RT.reside_nametable_procs. 92 | Definition reside_nametable_pkgs_rt := Symbol_Table_Module_RT.reside_nametable_pkgs. 93 | Definition reside_nametable_types_rt := Symbol_Table_Module_RT.reside_nametable_types. 94 | Definition fetch_var_name_rt := Symbol_Table_Module_RT.fetch_var_name. 95 | Definition fetch_proc_name_rt := Symbol_Table_Module_RT.fetch_proc_name. 
96 | Definition fetch_pkg_name_rt := Symbol_Table_Module_RT.fetch_pkg_name. 97 | Definition fetch_type_name_rt := Symbol_Table_Module_RT.fetch_type_name. 98 | 99 | Definition reside_symtable_vars_rt := Symbol_Table_Module_RT.reside_symtable_vars. 100 | Definition reside_symtable_procs_rt := Symbol_Table_Module_RT.reside_symtable_procs. 101 | Definition reside_symtable_types_rt := Symbol_Table_Module_RT.reside_symtable_types. 102 | Definition reside_symtable_exps_rt := Symbol_Table_Module_RT.reside_symtable_exps. 103 | Definition reside_symtable_sloc_rt := Symbol_Table_Module_RT.reside_symtable_sloc. 104 | Definition fetch_var_rt := Symbol_Table_Module_RT.fetch_var. 105 | Definition fetch_proc_rt := Symbol_Table_Module_RT.fetch_proc. 106 | Definition fetch_type_rt := Symbol_Table_Module_RT.fetch_type. 107 | Definition fetch_exp_type_rt := Symbol_Table_Module_RT.fetch_exp_type. 108 | Definition fetch_sloc_rt := Symbol_Table_Module_RT.fetch_sloc. 109 | Definition update_vars_rt := Symbol_Table_Module_RT.update_vars. 110 | Definition update_procs_rt := Symbol_Table_Module_RT.update_procs. 111 | Definition update_types_rt := Symbol_Table_Module_RT.update_types. 112 | Definition update_exps_rt := Symbol_Table_Module_RT.update_exps. 113 | Definition update_sloc_rt := Symbol_Table_Module_RT.update_sloc. 114 | 115 | 116 | 117 | Definition fetch_array_index_type (st: symTab) (array_ast_num: astnum): option type := 118 | match fetch_exp_type array_ast_num st with 119 | | Some (Array_Type t) => 120 | match fetch_type t st with 121 | | Some (ArrayTypeDecl ast_num tn indexSubtypeMark componentType) => 122 | Some indexSubtypeMark 123 | | _ => None 124 | end 125 | | _ => None 126 | end. 
127 | 128 | Definition fetch_array_index_type_rt (st: symTabRT) (array_ast_num: astnum): option type := 129 | match fetch_exp_type_rt array_ast_num st with 130 | | Some (Array_Type t) => 131 | match fetch_type_rt t st with 132 | | Some (ArrayTypeDeclRT ast_num tn indexSubtypeMark componentType) => 133 | Some indexSubtypeMark 134 | | _ => None 135 | end 136 | | _ => None 137 | end. 138 | 139 | Inductive extract_subtype_range: symTab -> type -> range -> Prop := 140 | | Extract_Range: forall t tn st td l u, 141 | subtype_num t = Some tn -> 142 | fetch_type tn st = Some td -> 143 | subtype_range td = Some (Range l u) -> 144 | extract_subtype_range st t (Range l u). 145 | 146 | (* tm is a subtype_mark *) 147 | Inductive extract_array_index_range: symTab -> typenum -> range -> Prop := 148 | | Extract_Index_Range: forall t st a_ast_num tn tm typ tn' td l u, 149 | fetch_type t st = Some (ArrayTypeDecl a_ast_num tn tm typ) -> 150 | subtype_num tm = Some tn' -> 151 | fetch_type tn' st = Some td -> 152 | subtype_range td = Some (Range l u) -> 153 | extract_array_index_range st t (Range l u). 154 | 155 | 156 | Inductive extract_subtype_range_rt: symTabRT -> type -> rangeRT -> Prop := 157 | | Extract_Range_RT: forall t tn st td l u, 158 | subtype_num t = Some tn -> 159 | fetch_type_rt tn st = Some td -> 160 | subtype_range_rt td = Some (RangeRT l u) -> 161 | extract_subtype_range_rt st t (RangeRT l u). 162 | 163 | 164 | (* tm is a subtype_mark *) 165 | Inductive extract_array_index_range_rt: symTabRT -> typenum -> rangeRT -> Prop := 166 | | Extract_Index_Range_RT: forall t st a_ast_num tn tm typ tn' td l u, 167 | fetch_type_rt t st = Some (ArrayTypeDeclRT a_ast_num tn tm typ) -> 168 | subtype_num tm = Some tn' -> 169 | fetch_type_rt tn' st = Some td -> 170 | subtype_range_rt td = Some (RangeRT l u) -> 171 | extract_array_index_range_rt st t (RangeRT l u). 
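The four relations above all follow the same lookup chain: resolve a (sub)type mark to a type id with `subtype_num`, fetch its declaration from the symbol table, then read off the declared range. A small Python sketch of that chain (the data shapes and the three helper functions are toy stand-ins, not the Coq definitions):

```python
# Toy stand-ins for the Coq development: data shapes and helpers are assumptions.
def subtype_num(t):
    """Resolve a (sub)type mark to the id of its type declaration, if any."""
    return t.get("type_id")

def fetch_type(tn, st):
    """fetch_type tn st: look up a type declaration in the symbol table."""
    return st["types"].get(tn)

def subtype_range(td):
    """The declared range (l, u) of a type declaration, or None."""
    return td.get("range")

def extract_subtype_range(st, t):
    """Mirrors Extract_Range: a range exists only if every lookup succeeds."""
    tn = subtype_num(t)
    if tn is None:
        return None
    td = fetch_type(tn, st)
    if td is None:
        return None
    return subtype_range(td)
```

In the relational version a failing premise simply makes the conclusion underivable; the functional sketch returns `None` as soon as any step fails.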
172 | 173 | (** * Help Lemmas *) 174 | Lemma extract_array_index_range_rt_unique: forall st a l u l' u', 175 | extract_array_index_range_rt st a (RangeRT l u) -> 176 | extract_array_index_range_rt st a (RangeRT l' u') -> 177 | l = l' /\ u = u'. 178 | Proof. 179 | intros. 180 | inversion H; inversion H0; subst. 181 | repeat progress match goal with 182 | | [H1: ?x = _, 183 | H2: ?x = _ |- _] => rewrite H1 in H2; clear H1; inversion H2; subst 184 | end; auto. 185 | Qed. 186 | 187 | Ltac apply_extract_array_index_range_rt_unique := 188 | match goal with 189 | | [H1: extract_array_index_range_rt _ ?a (RangeRT ?l ?u), 190 | H2: extract_array_index_range_rt _ ?a (RangeRT ?l' ?u') |- _] => 191 | specialize (extract_array_index_range_rt_unique _ _ _ _ _ _ H1 H2); 192 | let HZ := fresh "HZ" in intros HZ; inversion HZ 193 | end. 194 | 195 | Lemma extract_subtype_range_unique: forall st t l u l' u', 196 | extract_subtype_range_rt st t (RangeRT l u) -> 197 | extract_subtype_range_rt st t (RangeRT l' u') -> 198 | l = l' /\ u = u'. 199 | Proof. 200 | intros; 201 | inversion H; inversion H0; smack. 202 | Qed. 203 | 204 | Ltac apply_extract_subtype_range_unique := 205 | match goal with 206 | | [H1: extract_subtype_range_rt _ ?t _, 207 | H2: extract_subtype_range_rt _ ?t _ |- _] => 208 | specialize (extract_subtype_range_unique _ _ _ _ _ _ H1 H2); 209 | let HZ := fresh "HZ" in intros HZ; destruct HZ; subst 210 | end. 211 | 212 | -------------------------------------------------------------------------------- /spark2014_semantics/src/symboltable_module.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export environment. 13 | Require Export Coq.Strings.String. 
14 | 15 | (** * Symbol Table Element *) 16 | 17 | Record nametable := mkNameTable{ 18 | varNames: list (idnum * (string * string)); 19 | procNames: list (procnum * (string * string)); 20 | pkgNames: list (pkgnum * (string * string)); 21 | typeNames: list (typenum * (string * string)) 22 | }. 23 | 24 | 25 | Module Type SymTable_Element. 26 | 27 | Parameter Procedure_Decl: Type. 28 | 29 | Parameter Type_Decl: Type. 30 | 31 | Parameter Source_Location: Type. 32 | 33 | End SymTable_Element. 34 | 35 | (** * Symbol Table Module *) 36 | 37 | Module SymbolTableM (S: SymTable_Element). 38 | 39 | Definition level := nat. 40 | 41 | Definition proc_decl := S.Procedure_Decl. 42 | 43 | Definition type_decl := S.Type_Decl. 44 | 45 | Definition source_location := S.Source_Location. 46 | 47 | (** ** Symbol Table Structure *) 48 | (** components of the symbol table: 49 | - vars : map each variable to its declared type and in/out mode; 50 | - procs: map each procedure name to its procedure declaration and 51 | the nesting level of its procedure declaration; 52 | - types: map each type name to its type declaration; 53 | - exps : record the type of each expression AST node; 54 | - sloc : map each AST node back to its source location in the SPARK source code; 55 | - names: map a name id, represented as a natural number, to a pair of (name string, unique name string), 56 | e.g. 1 -> ("x", "ada://variable/package_name/procedure_name_where_X_is_declared/X+12:24") 57 | *) 58 | 59 | Record symboltable := mkSymbolTable{ 60 | vars: list (idnum * (mode * type)); 61 | procs: list (procnum * (level * proc_decl)); 62 | types: list (typenum * type_decl); 63 | exps: list (astnum * type); 64 | sloc: list (astnum * source_location); (* used to map back to the source location once an error is detected in the AST *) 65 | names: nametable 66 | }. 67 | 68 | (** name table entry *) 69 | Module Entry_Name <: ENTRY. 70 | Definition T := prod string string. 71 | End Entry_Name.
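Concretely, the symbol table is a bundle of association lists keyed by numeric ids, with `names` mapping each id back to its source-level name and unique name. A rough Python analogue (the field layout mirrors the record; all sample entries are invented for illustration):

```python
# Rough Python analogue of the symboltable record; all entries are invented.
symboltable = {
    "vars":  {1: ("In", "Integer")},         # idnum   -> (mode, type)
    "procs": {2: (0, "<procBodyDecl>")},     # procnum -> (level, declaration)
    "types": {3: "<typeDecl>"},              # typenum -> type declaration
    "exps":  {4: "Integer"},                 # astnum  -> type of the expression
    "sloc":  {4: (12, 24, 12, 30)},          # astnum  -> source location
    "names": {1: ("X", "ada://variable/P/X+12:24")},  # id -> (name, unique name)
}

def fetch_var(x, st):
    """Like fetch_var: Python's None plays the role of Coq's option None."""
    return st["vars"].get(x)
```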
72 | 73 | (** symbol table entry *) 74 | Module Entry_Type <: ENTRY. 75 | Definition T := prod mode type. 76 | End Entry_Type. 77 | 78 | Module Entry_Procedure_Decl <: ENTRY. 79 | Definition T := prod level proc_decl. 80 | End Entry_Procedure_Decl. 81 | 82 | Module Entry_Type_Decl <: ENTRY. 83 | Definition T := type_decl. 84 | End Entry_Type_Decl. 85 | 86 | Module Entry_Exp_Type <: ENTRY. 87 | Definition T := type. 88 | End Entry_Exp_Type. 89 | 90 | Module Entry_Sloc <: ENTRY. 91 | Definition T := source_location. 92 | End Entry_Sloc. 93 | 94 | (** name table module *) 95 | Module Names := STORE(Entry_Name). 96 | 97 | (** symbol table module *) 98 | Module SymTable_Vars := STORE(Entry_Type). 99 | Module SymTable_Procs := STORE(Entry_Procedure_Decl). 100 | Module SymTable_Types := STORE(Entry_Type_Decl). 101 | Module SymTable_Exps := STORE(Entry_Exp_Type). 102 | Module SymTable_Sloc := STORE(Entry_Sloc). 103 | 104 | (** ** Name Table Operation *) 105 | Function reside_nametable_vars (x: idnum) (nt: nametable) := Names.resides x nt.(varNames). 106 | Function reside_nametable_procs (x: procnum) (nt: nametable) := Names.resides x nt.(procNames). 107 | Function reside_nametable_pkgs (x: pkgnum) (nt: nametable) := Names.resides x nt.(pkgNames). 108 | Function reside_nametable_types (x: typenum) (nt: nametable) := Names.resides x nt.(typeNames). 109 | 110 | Function fetch_var_name (x: idnum) (nt: nametable) := Names.fetches x nt.(varNames). 111 | Function fetch_proc_name (x: procnum) (nt: nametable) := Names.fetches x nt.(procNames). 112 | Function fetch_pkg_name (x: pkgnum) (nt: nametable) := Names.fetches x nt.(pkgNames). 113 | Function fetch_type_name (x: typenum) (nt: nametable) := Names.fetches x nt.(typeNames). 114 | 115 | 116 | (** ** Symbol Table Operation *) 117 | 118 | Function reside_symtable_vars (x: idnum) (st: symboltable) := SymTable_Vars.resides x st.(vars). 
119 | Function reside_symtable_procs (x: procnum) (st: symboltable) := SymTable_Procs.resides x st.(procs). 120 | Function reside_symtable_types (x: typenum) (st: symboltable) := SymTable_Types.resides x st.(types). 121 | Function reside_symtable_exps (x: astnum) (st: symboltable) := SymTable_Exps.resides x st.(exps). 122 | Function reside_symtable_sloc (x: astnum) (st: symboltable) := SymTable_Sloc.resides x st.(sloc). 123 | 124 | Function fetch_var (x: idnum) (st: symboltable) := SymTable_Vars.fetches x st.(vars). 125 | Function fetch_proc (x: procnum) (st: symboltable) := SymTable_Procs.fetches x st.(procs). 126 | Function fetch_type (x: typenum) (st: symboltable) := SymTable_Types.fetches x st.(types). 127 | Function fetch_exp_type (x: astnum) (st: symboltable) := SymTable_Exps.fetches x st.(exps). 128 | Function fetch_sloc (x: astnum) (st: symboltable) := SymTable_Sloc.fetches x st.(sloc). 129 | 130 | Function update_store {V} (s: list (idnum * V)) (i: idnum) (v: V): list (idnum * V) := 131 | match s with 132 | | (i1, v1) :: s1 => 133 | if beq_nat i1 i then 134 | (i1, v) :: s1 135 | else 136 | (i1, v1) :: (update_store s1 i v) 137 | | nil => (i, v) :: nil 138 | end. 139 | 140 | Arguments update_store {V} _ _ _. 141 | 142 | Function update_vars (st: symboltable) (x: idnum) (mt: mode * type): symboltable := 143 | mkSymbolTable (update_store st.(vars) x mt) st.(procs) st.(types) st.(exps) st.(sloc) st.(names). 144 | 145 | Function update_procs (st: symboltable) (pid: procnum) (p: level * proc_decl): symboltable := 146 | mkSymbolTable st.(vars) (update_store st.(procs) pid p) st.(types) st.(exps) st.(sloc) st.(names). 147 | 148 | Function update_types (st: symboltable) (tid: typenum) (td: type_decl): symboltable := 149 | mkSymbolTable st.(vars) st.(procs) (update_store st.(types) tid td) st.(exps) st.(sloc) st.(names). 
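`update_store` implements replace-or-append on an association list: the first binding of the key is replaced if present, otherwise a new binding is appended at the end, so an earlier binding shadows later ones. A direct Python transcription of that recursion:

```python
def update_store(s, i, v):
    """Replace the first binding of key i in s, or append (i, v) if absent."""
    if not s:
        return [(i, v)]
    (i1, v1), rest = s[0], s[1:]
    if i1 == i:
        return [(i1, v)] + rest                    # replace first match only
    return [(i1, v1)] + update_store(rest, i, v)   # keep head, recurse on tail
```

Each `update_*` wrapper then rebuilds the record with exactly one field passed through `update_store`.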
150 | 151 | Function update_exps (st: symboltable) (ast_num: astnum) (t: type): symboltable := 152 | mkSymbolTable st.(vars) st.(procs) st.(types) (update_store st.(exps) ast_num t) st.(sloc) st.(names). 153 | 154 | Function update_sloc (st: symboltable) (ast_num: astnum) (sl: source_location): symboltable := 155 | mkSymbolTable st.(vars) st.(procs) st.(types) st.(exps) (update_store st.(sloc) ast_num sl) st.(names). 156 | 157 | Function update_names (st: symboltable) (names: nametable): symboltable := 158 | mkSymbolTable st.(vars) st.(procs) st.(types) st.(exps) st.(sloc) names. 159 | 160 | End SymbolTableM. 161 | 162 | 163 | 164 | 165 | 166 | 167 | 168 | 169 | 170 | 171 | 172 | 173 | 174 | 175 | 176 | 177 | 178 | 179 | 180 | 181 | 182 | 183 | 184 | 185 | 186 | 187 | 188 | 189 | 190 | 191 | 192 | 193 | 194 | 195 | 196 | 197 | 198 | 199 | 200 | 201 | 202 | 203 | 204 | -------------------------------------------------------------------------------- /spark2014_semantics/src/values.v: -------------------------------------------------------------------------------- 1 | (** 2 | _AUTHOR_ 3 | 4 | << 5 | Zhi Zhang 6 | Department of Computer and Information Sciences 7 | Kansas State University 8 | zhangzhi@ksu.edu 9 | >> 10 | *) 11 | 12 | Require Export ast_basics. 13 | 14 | (** * Run-Time Error *) 15 | Inductive errorType: Type := 16 | | DivByZero 17 | | OverflowError 18 | | RangeError. 19 | 20 | (** * Return Value / State *) 21 | (** Statement and expression evaluation returns one of the following results: 22 | - normal result; 23 | - run-time errors, which are required to be detected at run time, 24 | for example, overflow check and division by zero check; 25 | - unterminated state caused by infinite loop (only for functional semantics); 26 | - abnormal state, which includes compile time errors 27 | (for example, type checks failure and undefined variables), 28 | bounded errors and erroneous execution. 
29 | In the future, the abnormal state can be refined into these 30 | more precise categories (1.1.5); 31 | *) 32 | 33 | (* return value for expression/statement evaluation, defined as an inductive type *) 34 | Inductive Ret (A:Type): Type := 35 | | OK: A -> Ret A 36 | | RTE: errorType -> Ret A. 37 | 38 | 39 | Arguments OK [A] _. 40 | Arguments RTE [A] _. 41 | 42 | (** the range of the 32-bit (signed/unsigned) integer type: 43 | - modulus : 2^32 ; 44 | - half_modulus : 2^31 ; 45 | - max_unsigned : 2^32 - 1 ; 46 | - max_signed : 2^31 - 1 ; 47 | - min_signed : -2^31 ; 48 | *) 49 | Definition wordsize: nat := 32. 50 | Definition modulus : Z := two_power_nat wordsize. 51 | Definition half_modulus : Z := Z.div modulus 2. 52 | Definition max_unsigned : Z := Z.sub modulus 1. 53 | Definition max_signed : Z := Z.sub half_modulus 1. 54 | Definition min_signed : Z := Z.opp half_modulus. 55 | 56 | (** * Value *) 57 | 58 | (** Type of basic values *) 59 | 60 | (** Type of stored values in the store *) 61 | Inductive value: Type := 62 | | Undefined 63 | | Int (n : Z) 64 | | Bool (n : bool) 65 | | ArrayV (a : list (arrindex * value)) (* in SPARK, an array index may start from a negative number *) 66 | | RecordV (r : list (idnum * value)). 67 | 68 | Definition Division_Error: Ret value := RTE DivByZero. 69 | Definition Overflow_Error: Ret value := RTE OverflowError. 70 | Definition Range_Error: Ret value := RTE RangeError. 71 | 72 | 73 | (** * Bound of Value *) 74 | 75 | (** a value is represented by a range: for a variable, if its initial value is undefined, 76 | or it is a parameter, or its value is dynamically determined, then we use the range 77 | of its type as its value, e.g.
x: Integer; its value is (Interval Integer'First Integer'Last), 78 | x: Integer := 1; its value is (Interval 1 1); 79 | for a boolean value, it doesn't matter whether it is true or false, so we just use Boolval to 80 | represent a boolean value; 81 | *) 82 | 83 | Inductive bound : Type := 84 | | Interval (l : Z) (u: Z) 85 | | Boolval 86 | | Aggregate. 87 | 88 | Definition int32_bound : bound := (Interval min_signed max_signed). 89 | 90 | (** check whether a value falls within a bound *) 91 | Inductive in_bound: Z -> bound -> bool -> Prop := 92 | | IB_True: forall v l u, 93 | (Zle_bool l v) && (Zle_bool v u) = true -> 94 | in_bound v (Interval l u) true 95 | | IB_False: forall v l u, 96 | (Zle_bool l v) && (Zle_bool v u) = false -> 97 | in_bound v (Interval l u) false. 98 | 99 | (** * Value Operation *) 100 | Module Math. 101 | 102 | (** ** Arithmetic Operation *) 103 | 104 | Definition add (v1 v2: value): option value := 105 | match v1, v2 with 106 | | Int v1', Int v2' => Some (Int (v1' + v2')) 107 | | _, _ => None 108 | end. 109 | 110 | Definition sub (v1 v2: value): option value := 111 | match v1, v2 with 112 | | Int v1', Int v2' => Some (Int (v1' - v2')) 113 | | _, _ => None 114 | end. 115 | 116 | Definition mul (v1 v2: value): option value := 117 | match v1, v2 with 118 | | Int v1', Int v2' => Some (Int (v1' * v2')) 119 | | _, _ => None 120 | end. 121 | 122 | 123 | (** map Ada operators to the corresponding Coq operators: 124 | - div -> Z.quot 125 | - rem -> Z.rem 126 | - mod -> Z.modulo 127 | 128 | (Note: Ada "mod" has the following formula in Why: 129 | - if y > 0 then EuclideanDivision.mod x y else EuclideanDivision.mod x y + y) 130 | *) 131 | 132 | Definition div (v1 v2: value): option value := 133 | match v1, v2 with 134 | | Int v1', Int v2' => Some (Int (Z.quot v1' v2')) 135 | | _, _ => None 136 | end.
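The 32-bit bounds used by `int32_bound` follow directly from `wordsize`, and `in_bound` is a closed-interval test on them. A quick Python check of the arithmetic, computed the same way as the definitions in values.v:

```python
# Recompute the 32-bit bounds exactly as values.v defines them.
wordsize = 32
modulus = 2 ** wordsize          # two_power_nat wordsize
half_modulus = modulus // 2
max_unsigned = modulus - 1
max_signed = half_modulus - 1
min_signed = -half_modulus

def in_bound(v, l, u):
    """Closed-interval membership, as in the in_bound relation."""
    return l <= v <= u
```

`in_bound(max_signed + 1, min_signed, max_signed)` being false is exactly the situation an overflow check flags.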
137 | 138 | Definition rem (v1 v2: value): option value := 139 | match v1, v2 with 140 | | Int v1', Int v2' => Some (Int (Z.rem v1' v2')) 141 | | _, _ => None 142 | end. 143 | 144 | (* the keyword "mod" cannot be redefined here, so we use "mod'" *) 145 | Definition mod' (v1 v2: value): option value := 146 | match v1, v2 with 147 | | Int v1', Int v2' => Some (Int (Z.modulo v1' v2')) 148 | | _, _ => None 149 | end. 150 | 151 | (** ** Logic Operation *) 152 | Definition and (v1 v2: value): option value := 153 | match v1, v2 with 154 | | Bool v1', Bool v2' => Some (Bool (andb v1' v2')) 155 | | _, _ => None 156 | end. 157 | 158 | Definition or (v1 v2: value): option value := 159 | match v1, v2 with 160 | | Bool v1', Bool v2' => Some (Bool (orb v1' v2')) 161 | | _, _ => None 162 | end. 163 | 164 | (** ** Relational Operation *) 165 | Definition eq (v1 v2: value): option value := 166 | match v1, v2 with 167 | | Int v1', Int v2' => Some (Bool (Zeq_bool v1' v2')) 168 | | _, _ => None 169 | end. 170 | 171 | Definition ne (v1 v2: value): option value := 172 | match v1, v2 with 173 | | Int v1', Int v2' => Some (Bool (Zneq_bool v1' v2')) 174 | | _, _ => None 175 | end. 176 | 177 | Definition gt (v1 v2: value): option value := 178 | match v1, v2 with 179 | | Int v1', Int v2' => Some (Bool (Zgt_bool v1' v2')) 180 | | _, _ => None 181 | end. 182 | 183 | Definition ge (v1 v2: value): option value := 184 | match v1, v2 with 185 | | Int v1', Int v2' => Some (Bool (Zge_bool v1' v2')) 186 | | _, _ => None 187 | end. 188 | 189 | Definition lt (v1 v2: value): option value := 190 | match v1, v2 with 191 | | Int v1', Int v2' => Some (Bool (Zlt_bool v1' v2')) 192 | | _, _ => None 193 | end. 194 | 195 | Definition le (v1 v2: value): option value := 196 | match v1, v2 with 197 | | Int v1', Int v2' => Some (Bool (Zle_bool v1' v2')) 198 | | _, _ => None 199 | end.
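The three division-like operators only differ on negative operands: `Z.quot`/`Z.rem` truncate toward zero (Ada's `/` and `rem`), while `Z.modulo` follows the sign of the divisor (Ada's `mod`). A Python sketch of the three semantics (Python's built-in `//` and `%` are floor-based, so truncation has to be spelled out):

```python
def quot(a, b):
    """Truncated division, like Z.quot (Ada's "/")."""
    q = abs(a) // abs(b)
    return q if (a < 0) == (b < 0) else -q

def rem(a, b):
    """Remainder paired with quot, like Z.rem (Ada's "rem")."""
    return a - b * quot(a, b)

def mod_(a, b):
    """Floor modulo, like Z.modulo (Ada's "mod"); Python's % already is."""
    return a % b
```

For example, `rem(-7, 2)` is `-1` while `mod_(-7, 2)` is `1`.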
200 | 201 | (** Unary Operations *) 202 | Definition unary_not (v: value): option value := 203 | match v with 204 | | Bool v' => Some (Bool (negb v')) 205 | | _ => None 206 | end. 207 | 208 | Definition unary_plus (v: value): option value := 209 | match v with 210 | | Int v' => Some v 211 | | _ => None 212 | end. 213 | 214 | Definition unary_minus (v: value): option value := 215 | match v with 216 | | Int v' => Some (Int (Z.opp v')) 217 | | _ => None 218 | end. 219 | 220 | 221 | (** * Binary Operation *) 222 | Definition binary_operation (op: binary_operator) (v1: value) (v2: value): option value := 223 | match op with 224 | | Equal => eq v1 v2 225 | | Not_Equal => ne v1 v2 226 | | Greater_Than => gt v1 v2 227 | | Greater_Than_Or_Equal => ge v1 v2 228 | | Less_Than => lt v1 v2 229 | | Less_Than_Or_Equal => le v1 v2 230 | | And => and v1 v2 231 | | Or => or v1 v2 232 | | Plus => add v1 v2 233 | | Minus => sub v1 v2 234 | | Multiply => mul v1 v2 235 | | Divide => div v1 v2 236 | | Modulus => mod' v1 v2 237 | end. 238 | 239 | (** * Unary Operation *) 240 | Definition unary_operation (op: unary_operator) (v: value): option value := 241 | match op with 242 | | Not => unary_not v 243 | (* | Unary_Plus => unary_plus v *) 244 | | Unary_Minus => unary_minus v 245 | end. 246 | 247 | End Math. 
248 | 249 | 250 | -------------------------------------------------------------------------------- /spark83_semantics/dynamic.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/spark83_semantics/dynamic.pdf -------------------------------------------------------------------------------- /spark83_semantics/static.pdf: -------------------------------------------------------------------------------- https://raw.githubusercontent.com/AdaCore/sparkformal/51ed67be1b1d80f7f2681237dfbf4ee7add395d6/spark83_semantics/static.pdf --------------------------------------------------------------------------------