├── .gitignore ├── LICENSE ├── README.md ├── index.js ├── packages.dhall ├── spago.dhall ├── src ├── Check │ ├── Core.purs │ ├── Core │ │ └── Traversal.purs │ ├── Elaborate.purs │ ├── Environment.purs │ ├── Error.purs │ ├── Monad.purs │ ├── Solver.purs │ └── Unify.purs ├── Eval │ └── Normalize.purs ├── Main.purs ├── Print.purs ├── Print │ └── Precedence.purs └── Syntax │ ├── Lexer.purs │ ├── Parser.purs │ ├── Token.purs │ └── Tree.purs └── test └── test.flub /.gitignore: -------------------------------------------------------------------------------- 1 | /bower_components/ 2 | /node_modules/ 3 | /.pulp-cache/ 4 | /output/ 5 | /generated-docs/ 6 | /.psc-package/ 7 | /.psc* 8 | /.purs* 9 | /.psa* 10 | /.spago 11 | -------------------------------------------------------------------------------- /LICENSE: -------------------------------------------------------------------------------- 1 | The MIT License (MIT) 2 | 3 | Copyright (c) 2021 Nathan Faubion 4 | 5 | Permission is hereby granted, free of charge, to any person obtaining a copy of 6 | this software and associated documentation files (the "Software"), to deal in 7 | the Software without restriction, including without limitation the rights to 8 | use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of 9 | the Software, and to permit persons to whom the Software is furnished to do so, 10 | subject to the following conditions: 11 | 12 | The above copyright notice and this permission notice shall be included in all 13 | copies or substantial portions of the Software. 14 | 15 | THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 16 | IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS 17 | FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR 18 | COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER 19 | IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN 20 | CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 21 | -------------------------------------------------------------------------------- /README.md: -------------------------------------------------------------------------------- 1 | # flub-example-compiler 2 | 3 | This project is an example compiler frontend for a straightforward, 4 | statically-typed, functional language called Flub (functional Blub). It 5 | should illustrate an architecture for "real-world" language demands (ranges, 6 | error reporting, actual versus expected, etc.) while still being readable 7 | over a weekend. 8 | 9 | Flub is syntactically a mashup between ML, Scala, and Haskell. It supports 10 | parametric polymorphism, higher-kinded types, type-in-type, recursive-let, 11 | and not much else. 12 | 13 | ## Running the compiler 14 | 15 | First, run the build: 16 | ```sh 17 | spago build 18 | ``` 19 | 20 | Output formatted core: 21 | 22 | ```sh 23 | node index.js example.flub 24 | ``` 25 | 26 | Normalize the "module" (the last declared expression): 27 | 28 | ```sh 29 | node index.js example.flub -n 30 | ``` 31 | 32 | ## Walkthrough 33 | 34 | ### Lexing 35 | 36 | * [Syntax.Token](./src/Syntax/Token.purs) 37 | * [Syntax.Lexer](./src/Syntax/Lexer.purs) 38 | 39 | The lexer parses the non-recursive syntax (tokens). Lexical parsing is 40 | generally very straightforward, but is where the majority of parsing time is 41 | spent. 
Separating the language parser from the lexer, while optimizing the 42 | lexer, is a great way to improve overall parsing performance. 43 | 44 | This lexer is not optimized, but illustrates how one can implement a lazy 45 | token stream with accurate source annotations for tokens. Lazy token streams 46 | have the nice property of only parsing on demand, while also sharing work if 47 | the language parser must backtrack. 48 | 49 | ### Parsing 50 | 51 | * [Syntax.Tree](./src/Syntax/Tree.purs) 52 | * [Syntax.Parser](./src/Syntax/Parser.purs) 53 | 54 | The syntax tree is the concrete language syntax. Every token is represented 55 | in the tree, and every token is annotated with positions, comments, and 56 | whitespace. This means our original source is fully represented in this tree, 57 | and can be printed back out exactly. This is a nice property since it means 58 | we can implement transformations on input syntax while potentially retaining 59 | the original formatting. 60 | 61 | ### Elaboration 62 | 63 | * [Check.Core](./src/Check/Core.purs) 64 | * [Check.Elaborate](./src/Check/Elaborate.purs) 65 | 66 | The elaborator takes the concrete syntax and transforms it into an internal 67 | core language (known as [System Fω](https://en.wikipedia.org/wiki/System_F)). 68 | In Core, all polymorphism is represented as explicit type-abstractions and 69 | type-applications. Additionally, every term and argument is assigned an 70 | explicit type. Unification variables are used to track unknown types, which 71 | we later solve as part of type-checking. 72 | 73 | While elaborating into Core, we also emit type-checking constraints, which 74 | assert equalities between types. At specific points, we will then invoke the 75 | solver, which processes these constraints and yields a substitution 76 | (solutions) for our unknowns. Applying the substitution will yield Core 77 | without any unknowns. 78 | 79 | ### Solving 80 | 81 | * [Check.Solver](./src/Check/Solver.purs) 82 | * [Check.Unify](./src/Check/Unify.purs) 83 | 84 | The solver takes pending equality constraints from the elaborator and 85 | attempts to solve them one-by-one through a process called unification. 86 | Unification walks structurally over the two operands, and when a unification 87 | variable meets some other type, records that type as its solution. Types that are not 88 | equivalent result in an error. 89 | 90 | ### Normalization 91 | 92 | * [Eval.Normalize](./src/Eval/Normalize.purs) 93 | 94 | Normalization reduces a term until it can't be reduced anymore. Generally, 95 | System F is strongly normalizing (always reduces in finite time), but because 96 | we allow recursive bindings, it's possible for normalization to loop forever. 97 | 98 | In this implementation, normalization is done through a process called 99 | normalization-by-evaluation. Terms are evaluated against their closure 100 | environment, and then syntax is reified from that environment. 101 | 102 | ### Printing 103 | 104 | * [Print](./src/Print.purs) 105 | * [Print.Precedence](./src/Print/Precedence.purs) 106 | 107 | It's always nice to see our output in a human-readable manner. The most 108 | confusing part about pretty-printing is handling fixity and precedence such 109 | that parentheses are inserted in the appropriate places. 110 | 111 | There are many ways to tackle this problem, but one of the more 112 | straightforward ways is through an intermediate data structure that annotates 113 | syntax with its fixity.
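For example, a minimal sketch of that idea (this is not the actual `Print.Precedence` module; the names and precedence levels are made up for illustration) could look like:

```purescript
module FixityExample where

import Prelude

-- A printable tree where every operator node records the precedence at which
-- it binds.
data Doc
  = Atom String
  | Operator Int Doc String Doc

-- Render a node, parenthesizing it whenever it binds more loosely than the
-- surrounding context requires. The operator is treated as right-associative
-- (like `->`), so the left operand is printed in a tighter context.
print :: Int -> Doc -> String
print contextPrec = case _ of
  Atom str ->
    str
  Operator prec lhs op rhs ->
    let
      inner = print (prec + 1) lhs <> " " <> op <> " " <> print prec rhs
    in
      if prec < contextPrec then "(" <> inner <> ")" else inner

-- print 0 arrowExample == "(Int -> Int) -> Int"
arrowExample :: Doc
arrowExample =
  Operator 1 (Operator 1 (Atom "Int") "->" (Atom "Int")) "->" (Atom "Int")
```

Whatever representation the real printer uses, the decision is the same: compare a node's own fixity against the precedence its context requires, and insert parentheses only when the node binds more loosely.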
114 | -------------------------------------------------------------------------------- /index.js: -------------------------------------------------------------------------------- 1 | require("./output/Main").main(); 2 | -------------------------------------------------------------------------------- /packages.dhall: -------------------------------------------------------------------------------- 1 | {- 2 | Welcome to your new Dhall package-set! 3 | 4 | Below are instructions for how to edit this file for most use 5 | cases, so that you don't need to know Dhall to use it. 6 | 7 | ## Warning: Don't Move This Top-Level Comment! 8 | 9 | Due to how `dhall format` currently works, this comment's 10 | instructions cannot appear near corresponding sections below 11 | because `dhall format` will delete the comment. However, 12 | it will not delete a top-level comment like this one. 13 | 14 | ## Use Cases 15 | 16 | Most will want to do one or both of these options: 17 | 1. Override/Patch a package's dependency 18 | 2. Add a package not already in the default package set 19 | 20 | This file will continue to work whether you use one or both options. 21 | Instructions for each option are explained below. 22 | 23 | ### Overriding/Patching a package 24 | 25 | Purpose: 26 | - Change a package's dependency to a newer/older release than the 27 | default package set's release 28 | - Use your own modified version of some dependency that may 29 | include new API, changed API, removed API by 30 | using your custom git repo of the library rather than 31 | the package set's repo 32 | 33 | Syntax: 34 | Replace the overrides' "{=}" (an empty record) with the following idea 35 | The "//" or "⫽" means "merge these two records and 36 | when they have the same value, use the one on the right:" 37 | ------------------------------- 38 | let overrides = 39 | { packageName = 40 | upstream.packageName // { updateEntity1 = "new value", updateEntity2 = "new value" } 41 | , packageName = 42 | upstream.packageName // { version = "v4.0.0" } 43 | , packageName = 44 | upstream.packageName // { repo = "https://www.example.com/path/to/new/repo.git" } 45 | } 46 | ------------------------------- 47 | 48 | Example: 49 | ------------------------------- 50 | let overrides = 51 | { halogen = 52 | upstream.halogen // { version = "master" } 53 | , halogen-vdom = 54 | upstream.halogen-vdom // { version = "v4.0.0" } 55 | } 56 | ------------------------------- 57 | 58 | ### Additions 59 | 60 | Purpose: 61 | - Add packages that aren't already included in the default package set 62 | 63 | Syntax: 64 | Replace the additions' "{=}" (an empty record) with the following idea: 65 | ------------------------------- 66 | let additions = 67 | { package-name = 68 | { dependencies = 69 | [ "dependency1" 70 | , "dependency2" 71 | ] 72 | , repo = 73 | "https://example.com/path/to/git/repo.git" 74 | , version = 75 | "tag ('v4.0.0') or branch ('master')" 76 | } 77 | , package-name = 78 | { dependencies = 79 | [ "dependency1" 80 | , "dependency2" 81 | ] 82 | , repo = 83 | "https://example.com/path/to/git/repo.git" 84 | , version = 85 | "tag ('v4.0.0') or branch ('master')" 86 | } 87 | , etc. 
88 | } 89 | ------------------------------- 90 | 91 | Example: 92 | ------------------------------- 93 | let additions = 94 | { benchotron = 95 | { dependencies = 96 | [ "arrays" 97 | , "exists" 98 | , "profunctor" 99 | , "strings" 100 | , "quickcheck" 101 | , "lcg" 102 | , "transformers" 103 | , "foldable-traversable" 104 | , "exceptions" 105 | , "node-fs" 106 | , "node-buffer" 107 | , "node-readline" 108 | , "datetime" 109 | , "now" 110 | ] 111 | , repo = 112 | "https://github.com/hdgarrood/purescript-benchotron.git" 113 | , version = 114 | "v7.0.0" 115 | } 116 | } 117 | ------------------------------- 118 | -} 119 | 120 | 121 | let upstream = 122 | https://github.com/purescript/package-sets/releases/download/psc-0.13.8-20200708/packages.dhall sha256:df5b0f1ae92d4401404344f4fb2a7a3089612c9f30066dcddf9eaea4fe780e29 123 | 124 | let overrides = {=} 125 | 126 | let additions = 127 | { dodo-printer = 128 | { dependencies = 129 | [ "ansi" 130 | , "foldable-traversable" 131 | , "lists" 132 | , "maybe" 133 | , "strings" 134 | ] 135 | , repo = 136 | "https://github.com/natefaubion/purescript-dodo-printer.git" 137 | , version = 138 | "v1.0.6" 139 | } 140 | } 141 | 142 | in upstream // overrides // additions 143 | -------------------------------------------------------------------------------- /spago.dhall: -------------------------------------------------------------------------------- 1 | {- 2 | Welcome to a Spago project! 3 | You can edit this file as you like. 4 | -} 5 | { name = "my-project" 6 | , dependencies = 7 | [ "arrays" 8 | , "console" 9 | , "control" 10 | , "debug" 11 | , "dodo-printer" 12 | , "effect" 13 | , "free" 14 | , "lists" 15 | , "maybe" 16 | , "node-fs-aff" 17 | , "node-process" 18 | , "ordered-collections" 19 | , "parsing" 20 | , "psci-support" 21 | , "run" 22 | , "string-parsers" 23 | , "transformers" 24 | ] 25 | , packages = ./packages.dhall 26 | , sources = [ "src/**/*.purs", "test/**/*.purs" ] 27 | } 28 | -------------------------------------------------------------------------------- /src/Check/Core.purs: -------------------------------------------------------------------------------- 1 | module Check.Core where 2 | 3 | import Prelude 4 | 5 | import Data.List.Types (NonEmptyList) 6 | import Data.Newtype (class Newtype) 7 | import Data.Tuple (Tuple) 8 | import Syntax.Token (Range) 9 | 10 | newtype ScopeLevel = ScopeLevel Int 11 | 12 | derive instance newtypeScopeLevel :: Newtype ScopeLevel _ 13 | derive newtype instance eqScopeLevel :: Eq ScopeLevel 14 | derive newtype instance ordScopeLevel :: Ord ScopeLevel 15 | 16 | data Scoped a = Scoped ScopeLevel a 17 | 18 | derive instance eqScoped :: Eq a => Eq (Scoped a) 19 | derive instance ordScoped :: Ord a => Ord (Scoped a) 20 | 21 | newtype Identifier = Identifier String 22 | 23 | derive instance newtypeIdentifier :: Newtype Identifier _ 24 | derive instance eqIdentifier :: Eq Identifier 25 | derive instance ordIdentifier :: Ord Identifier 26 | 27 | newtype Uni = Uni Int 28 | 29 | derive instance newtypeUni :: Newtype Uni _ 30 | derive instance eqUni :: Eq Uni 31 | derive instance ordUni :: Ord Uni 32 | 33 | data Module 34 | = Module (NonEmptyList Decl) 35 | 36 | data Decl 37 | = DeclLet Let 38 | 39 | data Expr 40 | = ExprLit Ty Range Lit 41 | | ExprVar Ty Range (Scoped Identifier) 42 | | ExprLet Ty Range Let Expr 43 | | ExprApp Ty Range Expr Expr 44 | | ExprAbs Ty Range Binding Expr 45 | | ExprTyApp Ty Range Expr Ty 46 | | ExprTyAbs Ty Range Binding Expr 47 | 48 | data Let 49 | = LetOne (Scoped Identifier) Expr 50 | | 
LetRec (NonEmptyList (Tuple (Scoped Identifier) Expr)) 51 | 52 | data Ty 53 | = TyType Range 54 | | TyArrow Range Ty Ty 55 | | TyForall Range Binding Ty 56 | | TyVar Ty Range (Scoped Identifier) 57 | | TyApp Ty Range Ty Ty 58 | | TyUni Ty Range (Scoped Uni) 59 | 60 | data Binding 61 | = Binding Ty Range (Scoped Identifier) 62 | 63 | data Lit 64 | = LitInt Int 65 | | LitString String 66 | 67 | data Constraint = 68 | ConEquals Range { expect :: Ty, actual :: Ty } 69 | 70 | typeOfExpr :: Expr -> Ty 71 | typeOfExpr = case _ of 72 | ExprLit ty _ _ -> ty 73 | ExprVar ty _ _ -> ty 74 | ExprLet ty _ _ _ -> ty 75 | ExprApp ty _ _ _ -> ty 76 | ExprAbs ty _ _ _ -> ty 77 | ExprTyApp ty _ _ _ -> ty 78 | ExprTyAbs ty _ _ _ -> ty 79 | 80 | rangeOfExpr :: Expr -> Range 81 | rangeOfExpr = case _ of 82 | ExprLit _ range _ -> range 83 | ExprVar _ range _ -> range 84 | ExprLet _ range _ _ -> range 85 | ExprApp _ range _ _ -> range 86 | ExprAbs _ range _ _ -> range 87 | ExprTyApp _ range _ _ -> range 88 | ExprTyAbs _ range _ _ -> range 89 | 90 | typeOfType :: Ty -> Ty 91 | typeOfType = case _ of 92 | TyType range -> TyType range 93 | TyArrow range _ _ -> TyType range 94 | TyForall range _ _ -> TyType range 95 | TyVar ty _ _ -> ty 96 | TyApp ty _ _ _ -> ty 97 | TyUni ty _ _ -> ty 98 | 99 | rangeOfType :: Ty -> Range 100 | rangeOfType = case _ of 101 | TyType range -> range 102 | TyArrow range _ _ -> range 103 | TyForall range _ _ -> range 104 | TyVar _ range _ -> range 105 | TyApp _ range _ _ -> range 106 | TyUni _ range _ -> range 107 | -------------------------------------------------------------------------------- /src/Check/Core/Traversal.purs: -------------------------------------------------------------------------------- 1 | module Check.Core.Traversal 2 | ( rewriteExprBottomUpM 3 | , rewriteExprTopDownM 4 | , rewriteExprWithContextM 5 | , rewriteExprBottomUp 6 | , rewriteExprTopDown 7 | , rewriteExprWithContext 8 | , foldMapExpr 9 | , rewriteTypeBottomUpM 10 | , rewriteTypeTopDownM 11 | , rewriteTypeWithContextM 12 | , rewriteTypeBottomUp 13 | , rewriteTypeTopDown 14 | , rewriteTypeWithContext 15 | , foldMapType 16 | , substituteTypeVar 17 | , unknownsInType 18 | ) where 19 | 20 | import Prelude 21 | 22 | import Check.Core (Binding(..), Expr(..), Identifier, Let(..), Scoped, Ty(..), Uni) 23 | import Control.Monad.Free (runFree) 24 | import Control.Monad.Reader (ReaderT(..), runReaderT) 25 | import Data.Const (Const(..)) 26 | import Data.Functor.Compose (Compose(..)) 27 | import Data.Identity (Identity(..)) 28 | import Data.Newtype (un) 29 | import Data.Traversable (traverse) 30 | import Data.Tuple (Tuple, uncurry) 31 | 32 | type Traversal a = forall f. Applicative f => (a -> f a) -> a -> f a 33 | type MonadicTraversal a = forall m. Monad m => (a -> m a) -> a -> m a 34 | type MonadicTraversalWithContext a = forall c m. Monad m => (c -> a -> m (Tuple c a)) -> c -> a -> m a 35 | type MonoidalTraversal a = forall m. Monoid m => (a -> m) -> a -> m 36 | type PureTraversal a = (a -> a) -> a -> a 37 | type PureTraversalWithContext a = forall c. 
(c -> a -> Tuple c a) -> c -> a -> a 38 | 39 | traverseExpr1 :: Traversal Expr 40 | traverseExpr1 k = case _ of 41 | ExprLet ty range lets expr -> ExprLet ty range <$> goLet lets <*> k expr 42 | ExprApp ty range expr1 expr2 -> ExprApp ty range <$> k expr1 <*> k expr2 43 | ExprAbs ty range binding expr -> ExprAbs ty range binding <$> k expr 44 | ExprTyApp ty range expr arg -> ExprTyApp ty range <$> k expr <*> pure arg 45 | ExprTyAbs ty range binding expr -> ExprTyAbs ty range binding <$> k expr 46 | expr -> pure expr 47 | where 48 | goLet = case _ of 49 | LetOne ident expr -> LetOne ident <$> k expr 50 | LetRec lets -> LetRec <$> traverse (traverse k) lets 51 | 52 | traverseType1 :: Traversal Ty 53 | traverseType1 k = case _ of 54 | TyArrow range ty1 ty2 -> TyArrow range <$> k ty1 <*> k ty2 55 | TyForall range binding ty -> TyForall range <$> goBinding binding <*> k ty 56 | TyApp ty range ty1 ty2 -> TyApp ty range <$> k ty1 <*> k ty2 57 | ty -> pure ty 58 | where 59 | goBinding (Binding ty range var) = 60 | (\newTy -> Binding newTy range var) <$> k ty 61 | 62 | bottomUpTraversal :: Traversal ~> MonadicTraversal 63 | bottomUpTraversal traversal k = go 64 | where go a = k =<< traversal go a 65 | 66 | topDownTraversal :: Traversal ~> MonadicTraversal 67 | topDownTraversal traversal k = go 68 | where go a = k a >>= traversal go 69 | 70 | topDownTraversalWithContext :: Traversal ~> MonadicTraversalWithContext 71 | topDownTraversalWithContext traversal k = flip (runReaderT <<< go) 72 | where go a = ReaderT \ctx -> k ctx a >>= uncurry (flip (runReaderT <<< traversal go)) 73 | 74 | monoidalTraversal :: Traversal ~> MonoidalTraversal 75 | monoidalTraversal traversal k = un Const <<< runFree (un Identity) <<< un Compose <<< go 76 | where go a = Compose (pure (Const (k a))) <*> traversal go a 77 | 78 | purely :: MonadicTraversal ~> PureTraversal 79 | purely traversal k = runFree (un Identity) <<< traversal (pure <<< k) 80 | 81 | purelyWithContext :: MonadicTraversalWithContext ~> PureTraversalWithContext 82 | purelyWithContext traversal k c = runFree (un Identity) <<< traversal (\c' a' -> pure (k c' a')) c 83 | 84 | rewriteExprBottomUpM :: MonadicTraversal Expr 85 | rewriteExprBottomUpM = bottomUpTraversal traverseExpr1 86 | 87 | rewriteExprTopDownM :: MonadicTraversal Expr 88 | rewriteExprTopDownM = topDownTraversal traverseExpr1 89 | 90 | rewriteExprWithContextM :: MonadicTraversalWithContext Expr 91 | rewriteExprWithContextM = topDownTraversalWithContext traverseExpr1 92 | 93 | rewriteExprBottomUp :: PureTraversal Expr 94 | rewriteExprBottomUp = purely rewriteExprBottomUpM 95 | 96 | rewriteExprTopDown :: PureTraversal Expr 97 | rewriteExprTopDown = purely rewriteExprTopDownM 98 | 99 | rewriteExprWithContext :: PureTraversalWithContext Expr 100 | rewriteExprWithContext = purelyWithContext rewriteExprWithContextM 101 | 102 | foldMapExpr :: MonoidalTraversal Expr 103 | foldMapExpr = monoidalTraversal traverseExpr1 104 | 105 | rewriteTypeBottomUpM :: MonadicTraversal Ty 106 | rewriteTypeBottomUpM = bottomUpTraversal traverseType1 107 | 108 | rewriteTypeTopDownM :: MonadicTraversal Ty 109 | rewriteTypeTopDownM = topDownTraversal traverseType1 110 | 111 | rewriteTypeWithContextM :: MonadicTraversalWithContext Ty 112 | rewriteTypeWithContextM = topDownTraversalWithContext traverseType1 113 | 114 | rewriteTypeBottomUp :: PureTraversal Ty 115 | rewriteTypeBottomUp = purely rewriteTypeBottomUpM 116 | 117 | rewriteTypeTopDown :: PureTraversal Ty 118 | rewriteTypeTopDown = purely rewriteTypeTopDownM 119 | 120 
| rewriteTypeWithContext :: PureTraversalWithContext Ty 121 | rewriteTypeWithContext = purelyWithContext rewriteTypeWithContextM 122 | 123 | foldMapType :: MonoidalTraversal Ty 124 | foldMapType = monoidalTraversal traverseType1 125 | 126 | substituteTypeVar :: Scoped Identifier -> Ty -> Ty -> Ty 127 | substituteTypeVar var ty = rewriteTypeBottomUp case _ of 128 | TyVar _ _ var' | var == var' -> ty 129 | other -> other 130 | 131 | unknownsInType :: Ty -> Array (Scoped Uni) 132 | unknownsInType = foldMapType case _ of 133 | TyUni _ _ u -> [u] 134 | _ -> [] 135 | -------------------------------------------------------------------------------- /src/Check/Elaborate.purs: -------------------------------------------------------------------------------- 1 | module Check.Elaborate where 2 | 3 | import Prelude 4 | 5 | import Check.Core as Core 6 | import Check.Core.Traversal (substituteTypeVar) 7 | import Check.Environment (BindingData, BindingNamespace(..), UniSource(..), primEnvironment, primTyInt, primTyString) 8 | import Check.Error (CheckError(..)) 9 | import Check.Monad (Check, addToEnvironment, currentScope, emit, freshUni, resetSubstitution, scoped) 10 | import Check.Solver (Substitution, solve) 11 | import Check.Solver as Solver 12 | import Control.Monad.Error.Class (throwError) 13 | import Control.Monad.State (get) 14 | import Data.Array.NonEmpty as NonEmptyArray 15 | import Data.Foldable (foldr, for_) 16 | import Data.List as List 17 | import Data.List.NonEmpty as NonEmptyList 18 | import Data.List.Types (List, NonEmptyList) 19 | import Data.Map as Map 20 | import Data.Maybe (Maybe(..), isNothing) 21 | import Data.Traversable (for) 22 | import Data.Tuple (Tuple(..), fst, snd) 23 | import Syntax.Token (range1) 24 | import Syntax.Tree as Syntax 25 | 26 | elaborateModule :: Syntax.Module -> Check Core.Module 27 | elaborateModule (Syntax.Module (Syntax.Delimited hd tl _) _) = do 28 | addToEnvironment primEnvironment 29 | scoped do 30 | let decls = NonEmptyList.cons' hd $ snd <$> tl 31 | Core.Module <$> for decls \decl -> 32 | elaborateDecl decl <* resetSubstitution 33 | 34 | elaborateDecl :: Syntax.Decl -> Check Core.Decl 35 | elaborateDecl = case _ of 36 | Syntax.DeclLet _ letDecl -> do 37 | Tuple letDeclElab letDeclEnv <- checkLet letDecl 38 | sub <- solve 39 | for_ (NonEmptyArray.fromArray (unsolvedInSubstitution sub)) do 40 | throwError <<< pure <<< UnknownTypesInDecl (Syntax.rangeOfLet letDecl) 41 | let letDeclSub = Solver.applySubstitutionToLet sub letDeclElab 42 | let letDeclEnvSub = map (Solver.applySubstitutionToEnvironment sub) <$> letDeclEnv 43 | addToEnvironment letDeclEnvSub 44 | pure $ Core.DeclLet letDeclSub 45 | where 46 | unsolvedInSubstitution :: Substitution -> Array (Core.Scoped Core.Uni) 47 | unsolvedInSubstitution = 48 | _.univars 49 | >>> Map.filter (_.solution >>> isNothing) 50 | >>> Map.toUnfoldable 51 | >>> map (\(Tuple u { scopeLevel }) -> Core.Scoped scopeLevel u) 52 | 53 | inferExpr :: Syntax.Expr -> Check Core.Expr 54 | inferExpr = case _ of 55 | Syntax.ExprLit lit -> 56 | inferLit lit 57 | Syntax.ExprVar ident@(Syntax.Identifier tok var) -> do 58 | { environment } <- get 59 | case Map.lookup (Core.Identifier var) environment of 60 | Just { namespace: BindingExprVar, scopeLevel, type: ty } -> 61 | pure $ Core.ExprVar ty (range1 tok) $ Core.Scoped scopeLevel $ Core.Identifier var 62 | Just { range } -> 63 | throwError $ pure $ TypeVariableInExpr range ident 64 | Nothing -> 65 | throwError $ pure $ UnknownIdentifier ident 66 | Syntax.ExprApp fn args -> do 67 | let 
Tuple fn' args' = flattenExprApp fn args 68 | fnElab <- inferExpr fn' 69 | checkExprApp fnElab args' 70 | Syntax.ExprFun tok args _ expr -> do 71 | argEnv <- for args inferArgument 72 | addToEnvironment argEnv 73 | exprElab <- inferExpr expr 74 | let 75 | toExprAbs exprElab' (Tuple var { range, scopeLevel, type: varTy }) = do 76 | let absType = Core.TyArrow range varTy (Core.typeOfExpr exprElab') 77 | let absRange = range <> Core.rangeOfExpr exprElab' 78 | let binding = Core.Binding varTy range (Core.Scoped scopeLevel var) 79 | -- TODO: Not sure about this range 80 | Core.ExprAbs absType absRange binding exprElab' 81 | pure $ foldr (flip toExprAbs) exprElab argEnv 82 | Syntax.ExprLet tok letGroup _ expr -> 83 | scoped do 84 | Tuple letGroupElab letEnvBindings <- checkLet letGroup 85 | addToEnvironment letEnvBindings 86 | exprElab <- inferExpr expr 87 | let range = range1 tok <> Core.rangeOfExpr exprElab 88 | pure $ Core.ExprLet (Core.typeOfExpr exprElab) range letGroupElab exprElab 89 | Syntax.ExprTyped expr _ ty -> do 90 | tyElab <- checkType ty $ Core.TyType $ Syntax.rangeOfType ty 91 | checkExpr expr tyElab 92 | Syntax.ExprParens (Syntax.Wrapped _ expr _) -> 93 | inferExpr expr 94 | 95 | checkLet :: Syntax.Let -> Check (Tuple Core.Let (NonEmptyList (Tuple Core.Identifier BindingData))) 96 | checkLet = case _ of 97 | Syntax.LetOne (Syntax.LetDefinition (Syntax.LetBinding ident@(Syntax.Identifier identTok var) tyVarBindings argBindings mbResultTy) _ expr) -> do 98 | scopeLevel <- currentScope 99 | scoped do 100 | tyVarEnv <- inferTypeVariables tyVarBindings 101 | addToEnvironment tyVarEnv 102 | scoped do 103 | letVarEnv <- for argBindings inferArgument 104 | addToEnvironment letVarEnv 105 | resultTy <- case mbResultTy of 106 | Just (Syntax.Typed _ resultTy) -> do 107 | let range = range1 identTok -- TODO: This is not a good range. 108 | checkType resultTy $ Core.TyType range 109 | Nothing -> do 110 | let range = range1 identTok 111 | let kind = Core.TyType range 112 | freshUni (UniLetResultType ident) kind range 113 | exprElab <- checkExpr expr resultTy 114 | let 115 | letExprElab = elaborateLetExpr (tyVarEnv <> letVarEnv) exprElab 116 | letElab = Core.LetOne (Core.Scoped scopeLevel (Core.Identifier var)) letExprElab 117 | bindings = NonEmptyList.singleton $ Tuple (Core.Identifier var) 118 | { namespace: BindingExprVar 119 | , range: range1 identTok 120 | , scopeLevel 121 | , type: Core.typeOfExpr letExprElab 122 | } 123 | pure $ Tuple letElab bindings 124 | Syntax.LetRec definitions -> do 125 | scopeLevel <- currentScope 126 | scoped do 127 | definitionInfos <- for definitions \(Tuple tok (Syntax.LetDefinition (Syntax.LetBinding ident@(Syntax.Identifier identTok var) tyVarBindings argBindings mbResultTy) _ expr)) -> do 128 | Tuple arguments resultTy <- scoped do 129 | tyVarEnv <- inferTypeVariables tyVarBindings 130 | addToEnvironment tyVarEnv 131 | scoped do 132 | letVarEnv <- for argBindings inferArgument 133 | addToEnvironment letVarEnv 134 | resultTy <- case mbResultTy of 135 | Just (Syntax.Typed _ resultTy) -> do 136 | let range = range1 identTok -- TODO: This is not a good range. 
137 | checkType resultTy $ Core.TyType range 138 | Nothing -> do 139 | let range = range1 identTok 140 | let kind = Core.TyType range 141 | freshUni (UniLetResultType ident) kind range 142 | let bindingEnv = tyVarEnv <> letVarEnv 143 | pure $ Tuple (tyVarEnv <> letVarEnv) resultTy 144 | let 145 | definitionRange = range1 tok <> Syntax.rangeOfExpr expr 146 | definitionTy = elaborateLetType arguments resultTy 147 | binding = Tuple (Core.Identifier var) 148 | { namespace: BindingExprVar 149 | , range: definitionRange 150 | , scopeLevel 151 | , type: definitionTy 152 | } 153 | pure 154 | { binding 155 | , range: definitionRange 156 | , type: resultTy 157 | , expr 158 | , arguments 159 | } 160 | let definitionBindings = _.binding <$> definitionInfos 161 | addToEnvironment definitionBindings 162 | definitionElabs <- for definitionInfos \def -> 163 | scoped do 164 | addToEnvironment def.arguments 165 | exprElab <- checkExpr def.expr def.type 166 | let definitionElab = elaborateLetExpr def.arguments exprElab 167 | pure $ Tuple (Core.Scoped scopeLevel (fst def.binding)) definitionElab 168 | pure $ Tuple (Core.LetRec definitionElabs) definitionBindings 169 | where 170 | elaborateLetExpr :: List (Tuple Core.Identifier BindingData) -> Core.Expr -> Core.Expr 171 | elaborateLetExpr = flip $ foldr \(Tuple var { namespace, range, scopeLevel, type: ty }) expr -> 172 | case namespace of 173 | BindingExprVar -> do 174 | let exprTy = Core.typeOfExpr expr 175 | let binding = Core.Binding ty range (Core.Scoped scopeLevel var) 176 | Core.ExprAbs (Core.TyArrow (range <> Core.rangeOfType exprTy) ty exprTy) 177 | (range <> Core.rangeOfExpr expr) 178 | binding expr 179 | BindingTyVar -> do 180 | let exprTy = Core.typeOfExpr expr 181 | let binding = Core.Binding ty range (Core.Scoped scopeLevel var) 182 | Core.ExprTyAbs 183 | (Core.TyForall (range <> Core.rangeOfType exprTy) 184 | binding exprTy) 185 | (range <> Core.rangeOfExpr expr) binding expr 186 | 187 | elaborateLetType :: List (Tuple Core.Identifier BindingData) -> Core.Ty -> Core.Ty 188 | elaborateLetType = flip $ foldr \(Tuple var { namespace, range, scopeLevel, type: ty }) innerTy -> 189 | case namespace of 190 | BindingExprVar -> 191 | Core.TyArrow (range <> Core.rangeOfType innerTy) ty innerTy 192 | BindingTyVar -> do 193 | let binding = Core.Binding ty range (Core.Scoped scopeLevel var) 194 | Core.TyForall (range <> Core.rangeOfType innerTy) binding innerTy 195 | 196 | inferTypeVariables :: Maybe (Syntax.Wrapped (Syntax.Delimited Syntax.TypeArgument)) -> Check (List (Tuple Core.Identifier BindingData)) 197 | inferTypeVariables tyVarBindings = do 198 | let 199 | tyVars = case tyVarBindings of 200 | Just (Syntax.Wrapped _ (Syntax.Delimited hd tl _) _) -> 201 | List.Cons hd $ snd <$> tl 202 | _ -> 203 | List.Nil 204 | scopeLevel <- currentScope 205 | for tyVars \(Syntax.TypeArgument ident@(Syntax.Identifier tok var) mbKind) -> do 206 | let range = range1 tok 207 | kind <- case mbKind of 208 | Just (Syntax.Typed _ ty) -> 209 | checkType ty $ Core.TyType $ Syntax.rangeOfType ty 210 | Nothing -> 211 | freshUni (UniTypeArgumentType ident) (Core.TyType range) range 212 | pure $ Tuple (Core.Identifier var) 213 | { namespace: BindingTyVar 214 | , range 215 | , scopeLevel 216 | , type: kind 217 | } 218 | 219 | inferArgument :: Syntax.Argument -> Check (Tuple Core.Identifier BindingData) 220 | inferArgument = case _ of 221 | Syntax.InferredArgument ident@(Syntax.Identifier tok var) -> do 222 | let range = range1 tok 223 | let kind = Core.TyType range 224 | tyElab <- 
freshUni (UniArgumentType ident) kind range 225 | scopeLevel <- currentScope 226 | pure $ Tuple (Core.Identifier var) 227 | { namespace: BindingExprVar 228 | , range 229 | , scopeLevel 230 | , type: tyElab 231 | } 232 | Syntax.TypedArgument (Syntax.Wrapped _ (Syntax.TypedIdentifier (Syntax.Identifier tok var) _ ty) _) -> do 233 | let range = range1 tok 234 | tyElab <- checkType ty $ Core.TyType range 235 | scopeLevel <- currentScope 236 | pure $ Tuple (Core.Identifier var) 237 | { namespace: BindingExprVar 238 | , range 239 | , scopeLevel 240 | , type: tyElab 241 | } 242 | 243 | inferLit :: Syntax.Lit -> Check Core.Expr 244 | inferLit = case _ of 245 | Syntax.LitInt tok value -> do 246 | let range = range1 tok 247 | pure $ Core.ExprLit (primTyInt range) range $ Core.LitInt value 248 | Syntax.LitString tok value -> do 249 | let range = range1 tok 250 | pure $ Core.ExprLit (primTyString range) range $ Core.LitString value 251 | 252 | checkExpr :: Syntax.Expr -> Core.Ty -> Check Core.Expr 253 | checkExpr expr ty = do 254 | exprElab <- instantiateExpr =<< inferExpr expr 255 | emit $ Core.ConEquals (Core.rangeOfExpr exprElab) { actual: Core.typeOfExpr exprElab, expect: ty } 256 | pure exprElab 257 | 258 | checkExprApp :: Core.Expr -> NonEmptyList Syntax.Expr -> Check Core.Expr 259 | checkExprApp expr args = do 260 | instantiatedExpr <- instantiateExpr expr 261 | NonEmptyList.foldM checkArg instantiatedExpr args 262 | where 263 | checkArg :: Core.Expr -> Syntax.Expr -> Check Core.Expr 264 | checkArg fn arg = case Core.typeOfExpr fn of 265 | Core.TyArrow range argTy resultTy -> do 266 | argElab <- checkExpr arg argTy 267 | let appRange = Core.rangeOfExpr fn <> Core.rangeOfExpr argElab 268 | pure $ Core.ExprApp resultTy appRange fn argElab 269 | ty -> do 270 | let tyRange = Core.rangeOfType ty 271 | let tyKind = Core.TyType tyRange 272 | argElab <- inferExpr arg 273 | let exprRange = Core.rangeOfExpr fn 274 | let appRange = exprRange <> Core.rangeOfExpr argElab 275 | resultTy <- freshUni (UniAppResultType appRange) tyKind tyRange 276 | let fnTy = Core.TyArrow appRange (Core.typeOfExpr argElab) resultTy 277 | emit $ Core.ConEquals exprRange { actual: ty, expect: fnTy } 278 | pure $ Core.ExprApp resultTy appRange fn argElab 279 | 280 | instantiateExpr :: Core.Expr -> Check Core.Expr 281 | instantiateExpr = applyTypeArgs 282 | where 283 | applyTypeArgs :: Core.Expr -> Check Core.Expr 284 | applyTypeArgs expr = case Core.typeOfExpr expr of 285 | Core.TyForall _ (Core.Binding kind varRange var) resultTy -> do 286 | let range = Core.rangeOfExpr expr 287 | varTy <- freshUni (UniInstantiation range var) kind varRange 288 | let instantiatedTy = substituteTypeVar var varTy resultTy 289 | applyTypeArgs $ Core.ExprTyApp instantiatedTy range expr varTy 290 | _ -> 291 | pure expr 292 | 293 | checkType :: Syntax.Ty -> Core.Ty -> Check Core.Ty 294 | checkType ty kind = do 295 | elabTy <- inferType ty 296 | emit $ Core.ConEquals (Core.rangeOfType elabTy) 297 | { actual: Core.typeOfType elabTy 298 | , expect: kind 299 | } 300 | pure elabTy 301 | 302 | inferType :: Syntax.Ty -> Check Core.Ty 303 | inferType = case _ of 304 | Syntax.TyType tok -> 305 | pure $ Core.TyType $ range1 tok 306 | Syntax.TyArrow arg tok result -> do 307 | argElab <- checkType arg $ Core.TyType $ Syntax.rangeOfType arg 308 | resultElab <- checkType result $ Core.TyType $ Syntax.rangeOfType result 309 | let range = Core.rangeOfType argElab <> Core.rangeOfType resultElab 310 | pure $ Core.TyArrow range argElab resultElab 311 | Syntax.TyVar 
ident@(Syntax.Identifier tok var) -> do 312 | { environment } <- get 313 | case Map.lookup (Core.Identifier var) environment of 314 | Just { namespace: BindingTyVar, scopeLevel, type: varTy } -> 315 | pure $ Core.TyVar varTy (range1 tok) $ Core.Scoped scopeLevel $ Core.Identifier var 316 | Just { range, scopeLevel } -> 317 | throwError $ pure $ ExprVariableInType range ident 318 | Nothing -> 319 | throwError $ pure $ UnknownIdentifier ident 320 | Syntax.TyApp ctor args -> do 321 | let Tuple ctor' args' = flattenTypeApp ctor args 322 | ctorElab <- inferType ctor' 323 | checkTypeApp ctorElab args' 324 | Syntax.TyTyped ty _ kind -> do 325 | kindElab <- checkType kind $ Core.TyType $ Syntax.rangeOfType kind 326 | checkType ty kindElab 327 | Syntax.TyParens (Syntax.Wrapped _ ty _) -> 328 | inferType ty 329 | 330 | checkTypeApp :: Core.Ty -> NonEmptyList (Syntax.Ty) -> Check Core.Ty 331 | checkTypeApp = NonEmptyList.foldM \ctor arg -> 332 | case Core.typeOfType ctor of 333 | Core.TyArrow range argTy resultTy -> do 334 | argElab <- checkType arg argTy 335 | let appRange = range <> Core.rangeOfType argElab 336 | pure $ Core.TyApp resultTy appRange ctor argElab 337 | ty -> do 338 | let tyRange = Core.rangeOfType ty 339 | let tyKind = Core.TyType tyRange 340 | argElab <- inferType arg 341 | let appRange = Core.rangeOfType ctor <> Core.rangeOfType argElab 342 | resultTy <- freshUni (UniAppResultType appRange) tyKind tyRange 343 | let ctorTy = Core.TyArrow tyRange (Core.typeOfType argElab) resultTy 344 | emit $ Core.ConEquals tyRange { actual: ty, expect: ctorTy } 345 | pure $ Core.TyApp resultTy appRange ctor argElab 346 | 347 | flattenExprApp :: Syntax.Expr -> NonEmptyList Syntax.Expr -> Tuple Syntax.Expr (NonEmptyList Syntax.Expr) 348 | flattenExprApp = case _, _ of 349 | Syntax.ExprParens (Syntax.Wrapped _ fn _), args -> 350 | flattenExprApp fn args 351 | Syntax.ExprApp fn args', args -> 352 | flattenExprApp fn (args' <> args) 353 | fn, args -> 354 | Tuple fn args 355 | 356 | flattenTypeApp :: Syntax.Ty -> NonEmptyList Syntax.Ty -> Tuple Syntax.Ty (NonEmptyList Syntax.Ty) 357 | flattenTypeApp = case _, _ of 358 | Syntax.TyParens (Syntax.Wrapped _ ctor _), args -> 359 | flattenTypeApp ctor args 360 | Syntax.TyApp ctor args', args -> 361 | flattenTypeApp ctor (args' <> args) 362 | ctor, args -> 363 | Tuple ctor args 364 | -------------------------------------------------------------------------------- /src/Check/Environment.purs: -------------------------------------------------------------------------------- 1 | module Check.Environment where 2 | 3 | import Check.Core as Core 4 | import Data.Array.NonEmpty (NonEmptyArray) 5 | import Data.Map (Map) 6 | import Data.Maybe (Maybe) 7 | import Data.Tuple (Tuple(..)) 8 | import Syntax.Token (Range, emptyRange) 9 | import Syntax.Tree as Syntax 10 | 11 | type BindingMap = Map Core.Identifier BindingData 12 | 13 | data BindingNamespace 14 | = BindingExprVar 15 | | BindingTyVar 16 | 17 | type BindingData = 18 | { namespace :: BindingNamespace 19 | , range :: Range 20 | , scopeLevel :: Core.ScopeLevel 21 | , type :: Core.Ty 22 | } 23 | 24 | type UniMap = Map Core.Uni UniData 25 | 26 | data UniSource 27 | = UniLetResultType Syntax.Identifier 28 | | UniTypeArgumentType Syntax.Identifier 29 | | UniArgumentType Syntax.Identifier 30 | | UniInstantiation Range (Core.Scoped Core.Identifier) 31 | | UniAppResultType Range 32 | | UniTypeAppResultType Range 33 | 34 | type UniData = 35 | { range :: Range 36 | , scopeLevel :: Core.ScopeLevel 37 | , solution :: Maybe 
Core.Ty 38 | , sources :: NonEmptyArray UniSource 39 | , type :: Core.Ty 40 | } 41 | 42 | primInt :: Core.Identifier 43 | primInt = Core.Identifier "Int" 44 | 45 | primString :: Core.Identifier 46 | primString = Core.Identifier "String" 47 | 48 | primTyInt :: Range -> Core.Ty 49 | primTyInt = primTyVar primInt 50 | 51 | primTyString :: Range -> Core.Ty 52 | primTyString = primTyVar primString 53 | 54 | primTyVar :: Core.Identifier -> Range -> Core.Ty 55 | primTyVar ident range = Core.TyVar (Core.TyType range) range (Core.Scoped (Core.ScopeLevel 0) ident) 56 | 57 | primEnvironment :: Array (Tuple Core.Identifier BindingData) 58 | primEnvironment = 59 | [ Tuple primInt primBindingData 60 | , Tuple primString primBindingData 61 | ] 62 | where 63 | primBindingData :: BindingData 64 | primBindingData = 65 | { namespace: BindingTyVar 66 | , range: emptyRange 67 | , scopeLevel: Core.ScopeLevel 0 68 | , type: Core.TyType emptyRange 69 | } 70 | -------------------------------------------------------------------------------- /src/Check/Error.purs: -------------------------------------------------------------------------------- 1 | module Check.Error where 2 | 3 | import Check.Core as Core 4 | import Check.Unify (UnifyError) 5 | import Data.Array.NonEmpty (NonEmptyArray) 6 | import Syntax.Token (Range) 7 | import Syntax.Tree as Syntax 8 | 9 | data CheckError 10 | = TypeVariableInExpr Range Syntax.Identifier 11 | | ExprVariableInType Range Syntax.Identifier 12 | | UnknownIdentifier Syntax.Identifier 13 | | UnknownTypesInDecl Range (NonEmptyArray (Core.Scoped Core.Uni)) 14 | | UnificationError UnifyError 15 | -------------------------------------------------------------------------------- /src/Check/Monad.purs: -------------------------------------------------------------------------------- 1 | module Check.Monad where 2 | 3 | import Prelude 4 | 5 | import Check.Core as Core 6 | import Check.Environment (BindingMap, UniMap, UniSource, BindingData) 7 | import Check.Error (CheckError) 8 | import Control.Monad.Except (ExceptT, runExceptT) 9 | import Control.Monad.Free (Free, runFree) 10 | import Control.Monad.State (StateT, gets, modify_, runStateT, state) 11 | import Data.Array.NonEmpty (NonEmptyArray) 12 | import Data.CatQueue (CatQueue) 13 | import Data.CatQueue as CatQueue 14 | import Data.Either (Either) 15 | import Data.Foldable (class Foldable) 16 | import Data.Identity (Identity(..)) 17 | import Data.Map as Map 18 | import Data.Maybe (Maybe(..)) 19 | import Data.Newtype (over, un) 20 | import Data.Tuple (Tuple(..)) 21 | import Syntax.Token (Range) 22 | 23 | type Check = ExceptT (NonEmptyArray CheckError) (StateT CheckState (Free Identity)) 24 | 25 | runCheck :: forall a. CheckState -> Check a -> Tuple (Either (NonEmptyArray CheckError) a) CheckState 26 | runCheck state = 27 | runExceptT 28 | >>> flip runStateT state 29 | >>> runFree (un Identity) 30 | 31 | type CheckState = 32 | { fresh :: Int 33 | , constraints :: CatQueue Core.Constraint 34 | , environment :: BindingMap 35 | , univars :: UniMap 36 | , scopeLevel :: Core.ScopeLevel 37 | } 38 | 39 | emptyState :: CheckState 40 | emptyState = 41 | { fresh: 0 42 | , constraints: CatQueue.empty 43 | , environment: Map.empty 44 | , univars: Map.empty 45 | , scopeLevel: Core.ScopeLevel 0 46 | } 47 | 48 | -- | Allocates a fresh unification variable with a given kind. 
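-- | The variable is created at the current scope level and registered in the
-- | `univars` map with no solution yet; the returned `TyUni` node carries the
-- | supplied kind and range, and the `UniSource` records why the unknown was
-- | introduced so it can be reported in errors.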
49 | freshUni :: UniSource -> Core.Ty -> Range -> Check Core.Ty 50 | freshUni source ty range = state \st -> do 51 | let 52 | uni = Core.Uni st.fresh 53 | uniData = 54 | { range 55 | , scopeLevel: st.scopeLevel 56 | , solution: Nothing 57 | , sources: pure source 58 | , type: ty 59 | } 60 | Tuple (Core.TyUni ty range (Core.Scoped st.scopeLevel uni)) $ st 61 | { fresh = st.fresh + 1 62 | , univars = Map.insert uni uniData st.univars 63 | } 64 | 65 | -- | Resets the unification environment. This is done after every top-level 66 | -- | declaration. Type-checking top-level declarations is self contained, and 67 | -- | free unification variables should either be generalized or result in an 68 | -- | error. 69 | resetSubstitution :: Check Unit 70 | resetSubstitution = modify_ \st -> st { univars = Map.empty :: _ } 71 | 72 | -- | Yields the current scope level. 73 | currentScope :: Check Core.ScopeLevel 74 | currentScope = gets _.scopeLevel 75 | 76 | -- | Swaps the scope level for another, returning the current scope. 77 | swapScope :: Core.ScopeLevel -> Check Core.ScopeLevel 78 | swapScope scope = state \st -> Tuple st.scopeLevel $ st { scopeLevel = scope } 79 | 80 | -- | Emits a constraint, which should be solved by the constraint solver 81 | -- | sometime in the future. 82 | emit :: Core.Constraint -> Check Unit 83 | emit con = modify_ \st -> st { constraints = CatQueue.snoc st.constraints con} 84 | 85 | -- | Increments the scope level for a given block. 86 | scoped :: forall a. Check a -> Check a 87 | scoped m = join $ state \st -> do 88 | let scopeLevel' = over Core.ScopeLevel (_ + 1) st.scopeLevel 89 | let restore = _ { environment = st.environment, scopeLevel = st.scopeLevel } 90 | Tuple (m <* modify_ restore) $ st { scopeLevel = scopeLevel' } 91 | 92 | -- | Adds bindings to the type-checking environment. 93 | addToEnvironment :: forall f. 
Foldable f => f (Tuple Core.Identifier BindingData) -> Check Unit 94 | addToEnvironment bindings = modify_ \st -> 95 | st { environment = Map.fromFoldable bindings <> st.environment } 96 | -------------------------------------------------------------------------------- /src/Check/Solver.purs: -------------------------------------------------------------------------------- 1 | module Check.Solver where 2 | 3 | import Prelude 4 | 5 | import Check.Core as Core 6 | import Check.Core.Traversal (rewriteExprTopDown, rewriteTypeTopDown) 7 | import Check.Environment (BindingData, UniData) 8 | import Check.Error (CheckError(..)) 9 | import Check.Monad (Check) 10 | import Check.Unify (UnifyError, unify) 11 | import Control.Monad.Error.Class (throwError) 12 | import Control.Monad.State (gets, modify_, state) 13 | import Data.Array.NonEmpty as NonEmptyArray 14 | import Data.CatQueue as CatQueue 15 | import Data.Either (Either(..)) 16 | import Data.Foldable (for_) 17 | import Data.Map (Map) 18 | import Data.Map as Map 19 | import Data.Maybe (Maybe(..)) 20 | import Data.Tuple (Tuple(..)) 21 | 22 | type ConstraintProgress = 23 | { unifyErrors :: Array UnifyError 24 | } 25 | 26 | type Substitution = 27 | { univars :: Map Core.Uni UniData 28 | } 29 | 30 | solveConstraint :: Core.Constraint -> Check ConstraintProgress 31 | solveConstraint = case _ of 32 | Core.ConEquals range tys -> do 33 | univars <- gets _.univars 34 | case unify univars range tys of 35 | Left errs -> 36 | pure { unifyErrors: NonEmptyArray.toArray errs } 37 | Right newUnivars -> do 38 | modify_ _ { univars = newUnivars } 39 | pure { unifyErrors: [] } 40 | 41 | solve :: Check Substitution 42 | solve = go [] 43 | where 44 | go :: Array UnifyError -> Check Substitution 45 | go unifyErrors = 46 | next >>= case _ of 47 | Just con -> do 48 | progress <- solveConstraint con 49 | go (unifyErrors <> progress.unifyErrors) 50 | Nothing -> do 51 | for_ (NonEmptyArray.fromFoldable unifyErrors) do 52 | throwError <<< map UnificationError 53 | { univars: _ } <$> gets _.univars 54 | 55 | next :: Check (Maybe Core.Constraint) 56 | next = state \st -> 57 | case CatQueue.uncons st.constraints of 58 | Just (Tuple head tail) -> Tuple (Just head) $ st { constraints = tail } 59 | Nothing -> Tuple Nothing st 60 | 61 | applySubstitutionToExpr :: Substitution -> Core.Expr -> Core.Expr 62 | applySubstitutionToExpr sub = rewriteExprTopDown case _ of 63 | Core.ExprLit ty range lit -> 64 | Core.ExprLit (applySubstitutionToType sub ty) range lit 65 | Core.ExprVar ty range ident -> 66 | Core.ExprVar (applySubstitutionToType sub ty) range ident 67 | Core.ExprLet ty range letGroup expr -> 68 | Core.ExprLet (applySubstitutionToType sub ty) range letGroup expr 69 | Core.ExprApp ty range expr1 expr2 -> 70 | Core.ExprApp (applySubstitutionToType sub ty) range expr1 expr2 71 | Core.ExprAbs ty range binding expr -> 72 | Core.ExprAbs (applySubstitutionToType sub ty) range (applySubstitutionToBinding sub binding) expr 73 | Core.ExprTyApp ty range expr appTy -> 74 | Core.ExprTyApp (applySubstitutionToType sub ty) range expr (applySubstitutionToType sub appTy) 75 | Core.ExprTyAbs ty range binding expr -> 76 | Core.ExprTyAbs (applySubstitutionToType sub ty) range (applySubstitutionToBinding sub binding) expr 77 | 78 | applySubstitutionToBinding :: Substitution -> Core.Binding -> Core.Binding 79 | applySubstitutionToBinding sub (Core.Binding ty range ident) = 80 | Core.Binding (applySubstitutionToType sub ty) range ident 81 | 82 | applySubstitutionToLet :: Substitution -> Core.Let -> 
Core.Let 83 | applySubstitutionToLet sub = case _ of 84 | Core.LetOne ident expr -> 85 | Core.LetOne ident (applySubstitutionToExpr sub expr) 86 | Core.LetRec defs -> 87 | Core.LetRec (map (applySubstitutionToExpr sub) <$> defs) 88 | 89 | applySubstitutionToType :: Substitution -> Core.Ty -> Core.Ty 90 | applySubstitutionToType sub = rewriteTypeTopDown case _ of 91 | ty@(Core.TyType _) -> ty 92 | ty@(Core.TyArrow _ _ _) -> ty 93 | Core.TyForall range binding ty -> 94 | Core.TyForall range (applySubstitutionToBinding sub binding) ty 95 | Core.TyVar ty range ident -> 96 | Core.TyVar (applySubstitutionToType sub ty) range ident 97 | Core.TyApp ty range ty1 ty2 -> 98 | Core.TyApp (applySubstitutionToType sub ty) range ty1 ty2 99 | Core.TyUni ty range uni@(Core.Scoped scope u) 100 | | Just { solution: Just tySub } <- Map.lookup u sub.univars -> 101 | tySub 102 | | otherwise -> 103 | Core.TyUni (applySubstitutionToType sub ty) range uni 104 | 105 | applySubstitutionToEnvironment :: Substitution -> BindingData -> BindingData 106 | applySubstitutionToEnvironment sub binding = 107 | binding { type = applySubstitutionToType sub binding.type } 108 | -------------------------------------------------------------------------------- /src/Check/Unify.purs: -------------------------------------------------------------------------------- 1 | module Check.Unify 2 | ( UnifyCheck 3 | , UnifyError(..) 4 | , unify 5 | ) where 6 | 7 | import Prelude 8 | 9 | import Check.Core as Core 10 | import Check.Core.Traversal (rewriteTypeTopDownM) 11 | import Check.Environment (UniData, UniMap, UniSource) 12 | import Control.Monad.Except (lift) 13 | import Control.Monad.Free (Free, runFree) 14 | import Control.Monad.Maybe.Trans (runMaybeT) 15 | import Control.Monad.State (StateT, execStateT, gets, modify_, state) 16 | import Control.MonadZero (guard) 17 | import Data.Array as Array 18 | import Data.Array.NonEmpty (NonEmptyArray) 19 | import Data.Array.NonEmpty as NonEmptyArray 20 | import Data.Either (Either(..)) 21 | import Data.Identity (Identity(..)) 22 | import Data.List (List) 23 | import Data.List as List 24 | import Data.Map as Map 25 | import Data.Maybe (Maybe(..)) 26 | import Data.Newtype (un, unwrap) 27 | import Data.Tuple (Tuple(..)) 28 | import Partial.Unsafe (unsafeCrashWith) 29 | import Syntax.Token (Range) 30 | 31 | type UnifyState = 32 | { univars :: UniMap 33 | , errors :: Array UnifyError 34 | , scopeLevel :: Core.ScopeLevel 35 | , check :: UnifyCheck 36 | , range :: Range 37 | } 38 | 39 | type UnifyCheck = 40 | { expect :: Core.Ty 41 | , actual :: Core.Ty 42 | } 43 | 44 | data UnifyError 45 | = UnifyInfiniteType (Core.Scoped Core.Uni) Core.Ty UnifyCheck 46 | -- UnifyFailure contains an optional trace. A unification failure will have 47 | -- the specific types that failed, but we want to preserve the context of the 48 | -- original larger constraint. 49 | | UnifyFailure Range UnifyCheck (Maybe UnifyCheck) 50 | 51 | type Unify = StateT UnifyState (Free Identity) 52 | 53 | -- There can be multiple errors when unifying two types. For example: 54 | -- (String -> Int) ~ (Foo -> Bar) 55 | -- Has two errors, where String does not unify with Foo, and Int does not 56 | -- unify with Bar. Instead of stopping as soon as we find an error, we collect 57 | -- all errors among unification branches. 
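-- On success, the result is the input `UniMap` extended with any solutions
-- discovered while unifying; the caller is expected to commit it back to the
-- checker state (see `solveConstraint` in Check.Solver).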
58 | unify :: UniMap -> Range -> UnifyCheck -> Either (NonEmptyArray UnifyError) UniMap 59 | unify univars range check = 60 | case NonEmptyArray.fromArray result.errors of 61 | Just errors -> Left errors 62 | Nothing -> Right result.univars 63 | where 64 | result = 65 | unify' false check.expect check.actual 66 | # flip execStateT { univars, errors: [], range, check, scopeLevel: Core.ScopeLevel 0 } 67 | # runFree (un Identity) 68 | 69 | -- We track whether we are performing a "nested" unification (unifying children 70 | -- of the parent constraint) so that we don't add a redundant unification trace 71 | -- in the error where the trace is identical to the constraint that fails. An 72 | -- alternative would be to track a full trace with a list or use a Maybe. 73 | unify' :: Boolean -> Core.Ty -> Core.Ty -> Unify Unit 74 | unify' nested expect actual = do 75 | -- We must deref both types so we don't accidentally solve a unification 76 | -- variable that has already been solved. We always want to make sure we are 77 | -- unifying against solutions if they exist. 78 | expect' <- uniDeref expect 79 | actual' <- uniDeref actual 80 | case expect', actual' of 81 | Core.TyType _, Core.TyType _ -> 82 | pure unit 83 | Core.TyVar _ _ a, Core.TyVar _ _ b | a == b -> 84 | pure unit 85 | Core.TyUni _ _ (Core.Scoped _ a), Core.TyUni _ _ (Core.Scoped _ b) | a == b -> 86 | pure unit 87 | Core.TyApp _ _ a1 b1, Core.TyApp _ _ a2 b2 -> do 88 | unify' true a1 a2 89 | unify' true b1 b2 90 | Core.TyArrow _ a1 b1, Core.TyArrow _ a2 b2 -> do 91 | unify' true a1 a2 92 | unify' true b1 b2 93 | -- When unifying two unification variables, we compare their scope levels to 94 | -- not unnecessarily promote one. The one with the larger scope level (more 95 | -- nested) gets solved to the one with the smaller scope level. 96 | ty1@(Core.TyUni _ _ u1@(Core.Scoped s1 _)), ty2@(Core.TyUni _ _ u2@(Core.Scoped s2 _)) 97 | | s1 < s2 -> uniSolve u2 ty1 98 | | otherwise -> uniSolve u1 ty2 99 | Core.TyUni _ _ u, ty -> 100 | uniSolve u ty 101 | ty, Core.TyUni _ _ u -> 102 | uniSolve u ty 103 | ty1, ty2 -> modify_ \st -> do 104 | let trace = if nested then Just st.check else Nothing 105 | let error = UnifyFailure st.range { expect: expect', actual: actual' } trace 106 | st { errors = Array.snoc st.errors error } 107 | 108 | -- When solving a unification variable, we need to do some bookkeeping to make 109 | -- sure we have a valid solution. This is primarily handled by 110 | -- `uniCheckOccurences`, but we also take care of propagating sources for use 111 | -- in errors. 112 | uniSolve :: Core.Scoped Core.Uni -> Core.Ty -> Unify Unit 113 | uniSolve scu@(Core.Scoped initScope u) initTy = do 114 | uniData <- state \st -> Tuple (unsafeUniLookup u st.univars) $ st { scopeLevel = initScope } 115 | mbTy <- uniCheckOccurrences u initTy 116 | case mbTy of 117 | -- If we get no result of uniCheckOccurences, our solution would have 118 | -- resulted in an infinite type. 119 | Nothing -> modify_ \st -> do 120 | let error = UnifyInfiniteType scu initTy st.check 121 | st { errors = Array.snoc st.errors error } 122 | Just ty -> do 123 | newScope <- gets _.scopeLevel 124 | let newUniData = uniData { scopeLevel = newScope, solution = Just ty } 125 | modify_ \st -> st { univars = Map.insert u newUniData $ uniPropagate uniData.sources ty st.univars } 126 | unify' true (Core.typeOfType ty) uniData.type 127 | 128 | -- A solution is invalid if it mentions the unification variable we are solving. 129 | -- This is the source of an "infinite type" error. 
If a solution mentions the 130 | -- variable we are solving, then applying this solution would loop forever. In 131 | -- literature, this is called the "occurs check", by which we check if the 132 | -- variable occurs in the solution. This is done by traversing the solution 133 | -- top-down while dereferencing unification variables. 134 | uniCheckOccurrences :: Core.Uni -> Core.Ty -> Unify (Maybe Core.Ty) 135 | uniCheckOccurrences u = runMaybeT <<< rewriteTypeTopDownM \ty -> do 136 | ty' <- lift $ uniDeref ty 137 | case ty' of 138 | Core.TyUni _ _ (Core.Scoped scope' u') -> do 139 | -- Make sure that the unification variable is not the variable we are 140 | -- currently solving. This will short-circuit and return Nothing. 141 | -- Otherwise we will get a rewritten type. 142 | guard (u /= u') 143 | scope <- gets _.scopeLevel 144 | -- TODO: This promotion is wrong. We should promote the scope of other 145 | -- variables we find, not the one we are solving, because it's solved! 146 | when (scope' < scope) do 147 | modify_ _ { scopeLevel = scope' } 148 | pure ty' 149 | _ -> 150 | pure ty' 151 | 152 | -- Propagates sources for unification variables. When we solve a unification 153 | -- variable with another unification variable, the solution should inherit its 154 | -- sources. This lets us track all the places that unification variables are 155 | -- references for error messages. 156 | uniPropagate :: NonEmptyArray UniSource -> Core.Ty -> UniMap -> UniMap 157 | uniPropagate sources = case _ of 158 | Core.TyUni _ _ (Core.Scoped _ u) -> 159 | Map.update (\uniData -> Just (uniData { sources = uniData.sources <> sources })) u 160 | _ -> 161 | identity 162 | 163 | -- An invariant is that when we are unifying, any unification variables should 164 | -- be in our UniMap. If it's not, something has gone very wrong, so we just 165 | -- crash in the case that it is not found, or the case that it has already been 166 | -- solved. 167 | unsafeUniLookup :: Core.Uni -> UniMap -> UniData 168 | unsafeUniLookup u univars = case Map.lookup u univars of 169 | Just uniData@{ solution: Nothing } -> 170 | uniData 171 | Just _ -> 172 | unsafeCrashWith $ "[solveUnivar] Unification variable ?T" <> show (unwrap u) <> " already solved." 173 | Nothing -> 174 | unsafeCrashWith $ "[solveUnivar] Unification variable ?T" <> show (unwrap u) <> " not in scope." 175 | 176 | type UniPath = List (Tuple Core.Uni UniData) 177 | 178 | -- Much of the time, a type may reference a unification variable that has been 179 | -- solved. Dereferencing the unification variable involves following a path of 180 | -- solutions (which may themselves be unification variables with solutions) until 181 | -- we reach a type that is not a solved unification variable. During this process 182 | -- we compress the path to avoid redundant lookups later. That is, if a solution 183 | -- has multiple layers of indirection, we should go ahead and just store the 184 | -- shortest paths to the solution. Path compression is only an optimization and 185 | -- is not needed for soundness. 
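-- For example, if ?T1 is solved to ?T2, ?T2 to ?T3, and ?T3 to Int, then
-- dereferencing ?T1 also rewrites the stored solutions of ?T1 and ?T2 to point
-- directly at Int, so subsequent lookups take a single step.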
186 | uniDeref :: Core.Ty -> Unify Core.Ty 187 | uniDeref = case _ of 188 | ty@(Core.TyUni _ _ (Core.Scoped _ u)) -> 189 | state \st -> case Map.lookup u st.univars of 190 | Just uniData@{ solution: Just solTy } -> do 191 | st { univars = _ } <$> goPath st.univars (pure (Tuple u uniData)) solTy 192 | _ -> 193 | Tuple ty st 194 | ty -> 195 | pure ty 196 | where 197 | goPath :: UniMap -> UniPath -> Core.Ty -> Tuple Core.Ty UniMap 198 | goPath univars path = case _ of 199 | -- Keep following solved unification variables until we reach something 200 | -- that is not solved, or is not a unification variable. 201 | Core.TyUni _ _ (Core.Scoped _ u) 202 | | Just uniData@{ solution: Just ty } <- Map.lookup u univars -> 203 | goPath univars (List.Cons (Tuple u uniData) path) ty 204 | ty -> case path of 205 | List.Cons _ ps -> 206 | Tuple ty (goCompress ty univars ps) 207 | _ -> 208 | Tuple ty univars 209 | 210 | -- Walk back over the path, inserting the solution for all of the unification 211 | -- variables in the path. 212 | goCompress :: Core.Ty -> UniMap -> UniPath -> UniMap 213 | goCompress ty us = case _ of 214 | List.Cons (Tuple u uniData) ps -> 215 | goCompress ty (Map.insert u (uniData { solution = Just ty }) us) ps 216 | List.Nil -> 217 | us 218 | -------------------------------------------------------------------------------- /src/Eval/Normalize.purs: -------------------------------------------------------------------------------- 1 | module Eval.Normalize where 2 | 3 | import Prelude 4 | 5 | import Check.Core as Core 6 | import Check.Core.Traversal (rewriteTypeWithContext) 7 | import Data.Foldable (foldl, foldr) 8 | import Data.List.NonEmpty as NonEmptyList 9 | import Data.List.Types (NonEmptyList) 10 | import Data.Map (Map) 11 | import Data.Map as Map 12 | import Data.Maybe (Maybe(..), fromJust) 13 | import Data.Tuple (Tuple(..), fst, uncurry) 14 | import Partial.Unsafe (unsafePartial) 15 | 16 | type EvalEnvironment = 17 | { exprs :: Map (Core.Scoped Core.Identifier) EvalExpr 18 | , types :: Map (Core.Scoped Core.Identifier) Core.Ty 19 | } 20 | 21 | insertExpr :: Core.Scoped Core.Identifier -> EvalExpr -> EvalEnvironment -> EvalEnvironment 22 | insertExpr ident expr env = env { exprs = Map.insert ident expr env.exprs } 23 | 24 | insertType :: Core.Scoped Core.Identifier -> Core.Ty -> EvalEnvironment -> EvalEnvironment 25 | insertType ident ty env = env { types = Map.insert ident ty env.types } 26 | 27 | data EvalExpr 28 | = EvalClosure EvalEnvironment Core.Expr 29 | | EvalRec EvalEnvironment (NonEmptyList (Tuple (Core.Scoped Core.Identifier) Core.Expr)) Core.Expr 30 | 31 | normalizeModule :: Core.Module -> Core.Expr 32 | normalizeModule mod@(Core.Module decls) = do 33 | let 34 | env = evalModule mod 35 | ident = case NonEmptyList.last decls of 36 | Core.DeclLet (Core.LetOne id _) -> id 37 | Core.DeclLet (Core.LetRec letGroup) -> fst $ NonEmptyList.last letGroup 38 | unsafePartial 39 | $ uncurry reifyExpr 40 | $ uncurry evalExpr 41 | $ unrollExpr 42 | $ fromJust 43 | $ Map.lookup ident env.exprs 44 | 45 | evalModule :: Core.Module -> EvalEnvironment 46 | evalModule (Core.Module decls) = foldl evalDecl mempty decls 47 | 48 | evalDecl :: EvalEnvironment -> Core.Decl -> EvalEnvironment 49 | evalDecl env = case _ of 50 | Core.DeclLet declLet -> 51 | evalLet env declLet 52 | 53 | evalLet :: EvalEnvironment -> Core.Let -> EvalEnvironment 54 | evalLet env = case _ of 55 | Core.LetOne ident expr -> 56 | insertExpr ident (EvalClosure env expr) env 57 | Core.LetRec exprs -> 58 | foldr (\(Tuple 
ident expr) -> insertExpr ident (EvalRec env exprs expr)) env exprs 59 | 60 | evalExpr :: EvalEnvironment -> Core.Expr -> Tuple EvalEnvironment Core.Expr 61 | evalExpr = go 62 | where 63 | go env = case _ of 64 | expr@(Core.ExprVar _ _ ident) -> 65 | case Map.lookup ident env.exprs of 66 | Just exprSub | Tuple env' expr' <- unrollExpr exprSub -> 67 | go env' expr' 68 | _ -> 69 | Tuple env expr 70 | Core.ExprLet _ _ letGroup expr' -> 71 | go (evalLet env letGroup) expr' 72 | expr@(Core.ExprApp _ _ expr1 expr2) -> 73 | case evalExpr env expr1 of 74 | Tuple env' (Core.ExprAbs _ _ (Core.Binding _ _ ident) body) -> 75 | go (insertExpr ident (EvalClosure env expr2) env') body 76 | _ -> 77 | Tuple env expr 78 | expr@(Core.ExprTyApp _ _ expr1 ty) -> 79 | case evalExpr env expr1 of 80 | Tuple env' (Core.ExprTyAbs _ _ (Core.Binding _ _ ident) body) -> 81 | go (insertType ident ty env') body 82 | _ -> 83 | Tuple env expr 84 | expr -> 85 | Tuple env expr 86 | 87 | unrollExpr :: EvalExpr -> Tuple EvalEnvironment Core.Expr 88 | unrollExpr = case _ of 89 | EvalClosure env expr -> 90 | Tuple env expr 91 | EvalRec env recEnv expr -> 92 | Tuple env $ Core.ExprLet (Core.typeOfExpr expr) (Core.rangeOfExpr expr) (Core.LetRec recEnv) expr 93 | 94 | type SubstEnvironment = 95 | { used :: Map Core.Identifier Int 96 | , env :: EvalEnvironment 97 | } 98 | 99 | reifyExpr :: EvalEnvironment -> Core.Expr -> Core.Expr 100 | reifyExpr = go <<< { used: mempty, env: _ } 101 | where 102 | go :: SubstEnvironment -> Core.Expr -> Core.Expr 103 | go { used, env } = case _ of 104 | Core.ExprVar ty range ident -> 105 | case Map.lookup ident env.exprs of 106 | Just expr | Tuple env' expr' <- unrollExpr expr -> 107 | go { used, env: env' } expr' 108 | Nothing -> 109 | Core.ExprVar (substituteType env ty) range ident 110 | Core.ExprApp ty range expr1 expr2 -> do 111 | let expr1' = go { used, env } expr1 112 | let Tuple env' expr2' = evalExpr env expr2 113 | let expr2'' = go { used, env: env' } expr2' 114 | Core.ExprApp (substituteType env ty) range expr1' expr2'' 115 | expr@(Core.ExprTyApp ty range expr1 ty2) -> do 116 | let expr1' = go { used, env } expr1 117 | Core.ExprTyApp (substituteType env ty) range expr1 (substituteType env ty2) 118 | Core.ExprAbs ty1 range1 (Core.Binding ty2 range2 ident) body -> do 119 | let Tuple ident' used' = freshName used ident 120 | let env' = env { exprs = Map.insert ident (EvalClosure mempty (Core.ExprVar ty2 range2 ident')) env.exprs } 121 | let Tuple env'' body' = evalExpr env' body 122 | let body'' = go { used: used', env: env'' } body' 123 | Core.ExprAbs (substituteType env ty1) range1 (Core.Binding (substituteType env ty2) range2 ident') body 124 | Core.ExprTyAbs ty1 range1 (Core.Binding ty2 range2 ident) body -> do 125 | let Tuple ident' used' = freshName used ident 126 | let env' = env { types = Map.insert ident (Core.TyVar ty2 range2 ident') env.types } 127 | let Tuple env'' body' = evalExpr env' body 128 | let body'' = go { used: used', env: env'' } body' 129 | Core.ExprTyAbs (substituteType env ty2) range1 (Core.Binding (substituteType env ty2) range2 ident') body'' 130 | expr -> 131 | expr 132 | 133 | freshName 134 | :: Map Core.Identifier Int 135 | -> Core.Scoped Core.Identifier 136 | -> Tuple (Core.Scoped Core.Identifier) (Map Core.Identifier Int) 137 | freshName used scident@(Core.Scoped sc ident@(Core.Identifier var)) = 138 | case Map.lookup ident used of 139 | Nothing -> 140 | Tuple scident (Map.insert ident 0 used) 141 | Just n -> 142 | Tuple (Core.Scoped sc (Core.Identifier (var 
<> "'" <> show n))) (Map.insert ident (n + 1) used) 143 | 144 | substituteType :: EvalEnvironment -> Core.Ty -> Core.Ty 145 | substituteType = rewriteTypeWithContext \env -> case _ of 146 | ty@(Core.TyVar _ _ ident) -> 147 | case Map.lookup ident env.types of 148 | Just tySub -> 149 | Tuple env tySub 150 | Nothing -> 151 | Tuple env ty 152 | ty@(Core.TyForall _ (Core.Binding _ _ ident) _) -> do 153 | Tuple (env { types = Map.delete ident env.types }) ty 154 | ty -> 155 | Tuple env ty 156 | -------------------------------------------------------------------------------- /src/Main.purs: -------------------------------------------------------------------------------- 1 | module Main where 2 | 3 | import Prelude 4 | 5 | import Check.Elaborate (elaborateModule) 6 | import Check.Monad (emptyState, runCheck) 7 | import Data.Array (any) 8 | import Data.Array as Array 9 | import Data.Either (Either(..)) 10 | import Data.Foldable (for_) 11 | import Data.Tuple (Tuple(..)) 12 | import Dodo (print, twoSpaces) 13 | import Dodo.Ansi (ansiGraphics) 14 | import Effect (Effect) 15 | import Effect.Aff (launchAff_) 16 | import Effect.Class (liftEffect) 17 | import Effect.Class.Console as Console 18 | import Eval.Normalize (normalizeModule) 19 | import Node.Encoding as Encoding 20 | import Node.FS.Aff as FS 21 | import Node.Process (argv) 22 | import Print (printError, printExpr, printModule, printParseError) 23 | import Syntax.Parser (parseModule, runParser) 24 | import Syntax.Token (Position(..)) 25 | import Text.Parsing.Parser (ParseError(..)) 26 | import Text.Parsing.Parser.Pos as Pos 27 | 28 | main :: Effect Unit 29 | main = launchAff_ do 30 | args <- liftEffect argv 31 | let filePath = Array.take 1 (Array.drop 2 args) 32 | case filePath of 33 | [fp] -> do 34 | source <- FS.readTextFile Encoding.UTF8 fp 35 | case runParser source parseModule of 36 | Left (ParseError err (Pos.Position { column, line })) -> 37 | Console.log $ printDoc $ printParseError err (Position line column) 38 | Right mod -> do 39 | let Tuple res st = runCheck emptyState (elaborateModule mod) 40 | case res of 41 | Left errs -> 42 | for_ errs \err -> 43 | Console.log $ printDoc $ printError st err 44 | Right coreMod 45 | | any (eq "-n") args -> 46 | Console.log $ printDoc $ printExpr $ normalizeModule coreMod 47 | | otherwise -> 48 | Console.log $ printDoc $ printModule coreMod 49 | _ -> 50 | Console.error "File path needed" 51 | where 52 | printDoc = 53 | print ansiGraphics twoSpaces 54 | -------------------------------------------------------------------------------- /src/Print.purs: -------------------------------------------------------------------------------- 1 | module Print where 2 | 3 | import Prelude 4 | 5 | import Ansi.Codes (Color(..), GraphicsParam) 6 | import Check.Core as Core 7 | import Check.Core.Traversal (unknownsInType) 8 | import Check.Environment (UniSource(..), UniData) 9 | import Check.Error (CheckError(..)) 10 | import Check.Monad (CheckState) 11 | import Check.Solver (Substitution, applySubstitutionToType) 12 | import Check.Unify (UnifyError(..)) 13 | import Data.Array as Array 14 | import Data.Array.NonEmpty as NonEmptyArray 15 | import Data.Foldable (fold, foldMap) 16 | import Data.List as List 17 | import Data.Map as Map 18 | import Data.Tuple (Tuple(..)) 19 | import Dodo (Doc, alignCurrentColumn, enclose, flexGroup, indent, lines, spaceBreak, text, words, (<+>)) 20 | import Dodo.Ansi (bold, dim, foreground) 21 | import Print.Precedence (Assoc(..), Prec, binaryop, parensLeft, parensRight, prec, term, unPrec) 
22 | import Syntax.Token (Position(..), Range(..), emptyRange, range1) 23 | import Syntax.Tree as Syntax 24 | 25 | type PrettyDoc = Doc GraphicsParam 26 | 27 | printParseError :: String -> Position -> PrettyDoc 28 | printParseError err pos = 29 | lines 30 | [ bold $ words 31 | [ text "Failed to parse input" 32 | , foreground Yellow (enclose (text "[") (text "]") $ printPosition pos) <> text ":" 33 | ] 34 | , indent $ text err 35 | ] 36 | 37 | printError :: CheckState -> CheckError -> PrettyDoc 38 | printError st = case _ of 39 | UnificationError (UnifyFailure range fail mbTrace) -> do 40 | let expect' = applySubstitutionToType substitution fail.expect 41 | let actual' = applySubstitutionToType substitution fail.actual 42 | let unknowns = Array.nub $ unknownsInCheck fail <> foldMap unknownsInCheck mbTrace 43 | lines 44 | [ bold $ text "Types do not match " <> printRange range <> text ":" 45 | , indent $ lines 46 | [ text "Actual type " <> printRange (Core.rangeOfType actual') <> text ":" 47 | , indent $ printType actual' 48 | , text "does not match expected type " <> printRange (Core.rangeOfType expect') <> text ":" 49 | , indent $ printType expect' 50 | , mbTrace # foldMap \trace -> do 51 | let expectTrace = applySubstitutionToType substitution trace.expect 52 | let actualTrace = applySubstitutionToType substitution trace.actual 53 | lines 54 | [ text "While checking that actual type " <> printRange (Core.rangeOfType actualTrace) <> text ":" 55 | , indent $ printType actualTrace 56 | , text "matches expected type " <> printRange (Core.rangeOfType expectTrace) <> text ":" 57 | , indent $ printType expectTrace 58 | ] 59 | , printUnivars substitution unknowns 60 | ] 61 | ] 62 | UnificationError (UnifyInfiniteType scu@(Core.Scoped _ u) ty trace) -> do 63 | let sub = substitution { univars = Map.delete u substitution.univars } 64 | let ty' = applySubstitutionToType sub ty 65 | let unknowns = Array.nub $ unknownsInType ty' 66 | let expectTrace = applySubstitutionToType sub trace.expect 67 | let actualTrace = applySubstitutionToType sub trace.actual 68 | lines 69 | [ bold $ text "Infinite type required " <> printRange (Core.rangeOfType ty') <> text ":" 70 | , indent $ lines 71 | [ text "The unknown type " <> printUni scu <> text " would need to equal:" 72 | , indent $ printType ty' 73 | , text "which references itself, resulting in an infinitely expanding type." 
74 | , text "While checking that actual type " <> printRange (Core.rangeOfType actualTrace) <> text ":" 75 | , indent $ printType actualTrace 76 | , text "matches expected type " <> printRange (Core.rangeOfType expectTrace) <> text ":" 77 | , indent $ printType expectTrace 78 | , printUnivars substitution unknowns 79 | ] 80 | ] 81 | TypeVariableInExpr range (Syntax.Identifier tok ident) -> 82 | lines 83 | [ bold $ text "Type variable used as an expression " <> printRange (range1 tok) <> text ":" 84 | , indent $ lines 85 | [ text ident 86 | , text "bound at " <> printRange range 87 | ] 88 | ] 89 | ExprVariableInType range (Syntax.Identifier tok ident) -> 90 | lines 91 | [ bold $ text "Variable used as a type " <> printRange (range1 tok) <> text ":" 92 | , indent $ lines 93 | [ text ident 94 | , text "bound at " <> printRange range 95 | ] 96 | ] 97 | UnknownIdentifier (Syntax.Identifier tok ident) -> 98 | lines 99 | [ bold $ text "Unknown identifier " <> printRange (range1 tok) <> text ":" 100 | , indent $ text ident 101 | ] 102 | UnknownTypesInDecl range unknowns -> 103 | lines 104 | [ bold $ text "Unknown types in declaration " <> printRange range <> text ":" 105 | , indent $ lines 106 | [ printUnivars substitution $ NonEmptyArray.toArray unknowns 107 | , text "Unknowns must be annotated with a specific type or generalized with a type variable." 108 | ] 109 | ] 110 | where 111 | substitution = 112 | { univars: st.univars } 113 | 114 | unknownsInCheck check = do 115 | let expect' = applySubstitutionToType substitution check.expect 116 | let actual' = applySubstitutionToType substitution check.actual 117 | unknownsInType expect' <> unknownsInType actual' 118 | 119 | printUnivars :: Substitution -> Array (Core.Scoped Core.Uni) -> PrettyDoc 120 | printUnivars { univars } unis = 121 | unis 122 | # Array.mapMaybe (\scu@(Core.Scoped _ u) -> Tuple scu <$> Map.lookup u univars) 123 | # map printUniSources 124 | # lines 125 | where 126 | printUniSources :: Tuple (Core.Scoped Core.Uni) UniData -> PrettyDoc 127 | printUniSources (Tuple u { sources }) = do 128 | let 129 | better = NonEmptyArray.filter isBetterSource sources 130 | purpose' 131 | | Array.null better = NonEmptyArray.toArray sources 132 | | otherwise = better 133 | purpose' 134 | # map (printUniSourceItem u) 135 | # lines 136 | 137 | printUniSourceItem :: Core.Scoped Core.Uni -> UniSource -> PrettyDoc 138 | printUniSourceItem u p = 139 | words 140 | [ text "*" 141 | , printUni u 142 | , text "is an unknown" 143 | , printUniSource p 144 | ] 145 | 146 | isBetterSource :: UniSource -> Boolean 147 | isBetterSource = case _ of 148 | UniLetResultType _ -> true 149 | UniTypeArgumentType _ -> true 150 | UniArgumentType _ -> true 151 | _ -> false 152 | 153 | printUniSource :: UniSource -> PrettyDoc 154 | printUniSource = case _ of 155 | UniLetResultType (Syntax.Identifier tok var) -> 156 | words 157 | [ text "result type for" 158 | , text var 159 | , printRange (range1 tok) 160 | ] 161 | UniTypeArgumentType (Syntax.Identifier tok var) -> 162 | words 163 | [ text "type for type argument" 164 | , text var 165 | , printRange (range1 tok) 166 | ] 167 | UniArgumentType (Syntax.Identifier tok var) -> 168 | words 169 | [ text "type for argument" 170 | , text var 171 | , printRange (range1 tok) 172 | ] 173 | UniInstantiation range (Core.Scoped _ (Core.Identifier var)) -> 174 | words 175 | [ text "instantiation for type variable" 176 | , text var 177 | , printRange range 178 | ] 179 | UniAppResultType range -> 180 | words 181 | [ text "result type for 
function application" 182 | , printRange range 183 | ] 184 | UniTypeAppResultType range -> 185 | words 186 | [ text "result type for type application" 187 | , printRange range 188 | ] 189 | 190 | printModule :: Core.Module -> PrettyDoc 191 | printModule (Core.Module decls) = lines $ printDecl <$> decls 192 | 193 | printDecl :: Core.Decl -> PrettyDoc 194 | printDecl = case _ of 195 | Core.DeclLet (Core.LetOne (Core.Scoped _ (Core.Identifier var)) letExpr) -> 196 | lines 197 | [ words 198 | [ text "let" 199 | , text var 200 | , text ":" 201 | , alignCurrentColumn $ flexGroup $ printType (Core.typeOfExpr letExpr) 202 | , text "=" 203 | ] 204 | , indent (flexGroup (printExpr letExpr)) <> text ";" 205 | ] 206 | Core.DeclLet (Core.LetRec defs) -> do 207 | let 208 | recs = 209 | defs # map \(Tuple (Core.Scoped _ (Core.Identifier var)) letExpr) -> 210 | lines 211 | [ words 212 | [ text "rec" 213 | , text var 214 | , text ":" 215 | , printType (Core.typeOfExpr letExpr) 216 | , text "=" 217 | ] 218 | , indent $ flexGroup $ printExpr letExpr 219 | ] 220 | lines 221 | [ text "let" 222 | , indent (lines recs) <> text ";" 223 | ] 224 | 225 | printExpr :: Core.Expr -> PrettyDoc 226 | printExpr expr = unPrec (printExprPrec expr) 227 | 228 | printExprPrec :: Core.Expr -> Prec Int PrettyDoc 229 | printExprPrec = go 230 | where 231 | go = case _ of 232 | Core.ExprLit _ _ (Core.LitInt value) -> 233 | term $ text $ show value 234 | Core.ExprLit _ _ (Core.LitString value) -> 235 | term $ text $ show value 236 | Core.ExprVar _ _ (Core.Scoped _ (Core.Identifier var)) -> 237 | term $ text var 238 | Core.ExprLet _ _ (Core.LetOne (Core.Scoped _ (Core.Identifier var)) letExpr) expr -> do 239 | prec bottom $ lines 240 | [ words 241 | [ text "let" 242 | , text var 243 | , text ":" 244 | , printType (Core.typeOfExpr letExpr) 245 | , text "=" 246 | ] 247 | , indent (flexGroup (printExpr letExpr)) <> text ";" 248 | , printExpr expr 249 | ] 250 | Core.ExprLet _ _ (Core.LetRec defs) expr -> do 251 | let 252 | recs = 253 | defs # map \(Tuple (Core.Scoped _ (Core.Identifier var)) letExpr) -> 254 | lines 255 | [ words 256 | [ text "rec" 257 | , text var 258 | , text ":" 259 | , printType $ Core.typeOfExpr letExpr 260 | , text "=" 261 | ] 262 | , indent $ flexGroup $ printExpr letExpr 263 | ] 264 | prec bottom $ lines 265 | [ text "let" 266 | , indent (lines recs) <> text ";" 267 | , printExpr expr 268 | ] 269 | (Core.ExprApp _ _ expr1 expr2) -> do 270 | let 271 | -- We don't need to print parens around abtractions if it is the last 272 | -- argument, so we special case them and wrap them with `term`. 
273 | lastArg = case expr2 of 274 | Core.ExprAbs _ _ _ _ -> term $ printExpr expr2 275 | Core.ExprTyAbs _ _ _ _ -> term $ printExpr expr2 276 | _ -> printExprPrec expr2 277 | prec top $ goApp (pure lastArg) expr1 278 | expr@(Core.ExprTyApp _ _ _ _) -> do 279 | prec top $ goApp mempty expr 280 | Core.ExprAbs _ _ (Core.Binding ty _ (Core.Scoped _ (Core.Identifier var))) expr -> do 281 | let arg = words [ text var, text ":", printType ty ] 282 | prec bottom $ fold 283 | [ words 284 | [ text "\\" <> enclose (text "(") (text ")") arg 285 | , text "->" 286 | ] 287 | , spaceBreak 288 | , printExpr expr 289 | ] 290 | Core.ExprTyAbs _ _ (Core.Binding ty _ (Core.Scoped _ (Core.Identifier var))) expr -> do 291 | let arg = words [ text var, text ":", printType ty ] 292 | prec bottom $ fold 293 | [ words 294 | [ text "\\" <> enclose (text "[") (text "]") arg 295 | , text "->" 296 | ] 297 | , spaceBreak 298 | , printExpr expr 299 | ] 300 | 301 | goApp args = case _ of 302 | Core.ExprApp _ _ e1 e2 -> 303 | goApp (List.Cons (printExprPrec e2) args) e1 304 | Core.ExprTyApp _ _ e1 ty1 -> do 305 | let tyApp = term $ enclose (text "[") (text "]") $ printType ty1 306 | goApp (List.Cons tyApp args) e1 307 | e1 -> do 308 | let app expr = spaceBreak <> indent (parensRight AssocLeft top expr) 309 | parensLeft AssocLeft top (printExprPrec e1) <> flexGroup (foldMap app args) 310 | 311 | printType :: Core.Ty -> PrettyDoc 312 | printType ty = unPrec (printTypePrec ty) 313 | 314 | printTypePrec :: Core.Ty -> Prec Int PrettyDoc 315 | printTypePrec = go 316 | where 317 | go = case _ of 318 | Core.TyType _ -> 319 | term $ text "type" 320 | Core.TyVar _ _ (Core.Scoped _ (Core.Identifier var)) -> 321 | term $ text var 322 | Core.TyUni _ _ u -> 323 | term $ printUni u 324 | Core.TyApp _ _ ty1 ty2 -> 325 | binaryop AssocLeft top (<+>) (go ty1) (go ty2) 326 | Core.TyArrow _ ty1 ty2 -> do 327 | let op a b = a <+> text "->" <+> b 328 | binaryop AssocRight bottom op (go ty1) (go ty2) 329 | Core.TyForall _ (Core.Binding kind _ (Core.Scoped _ (Core.Identifier var))) ty -> 330 | prec bottom $ fold 331 | [ enclose (text "[") (text "]") $ words 332 | [ text var 333 | , text ":" 334 | , printType kind 335 | ] 336 | , spaceBreak 337 | , printType ty 338 | ] 339 | 340 | printUni :: Core.Scoped Core.Uni -> PrettyDoc 341 | printUni (Core.Scoped _ (Core.Uni u)) = foreground Blue $ text $ "?T" <> show u 342 | 343 | printPosition :: Position -> PrettyDoc 344 | printPosition (Position line column) = 345 | fold 346 | [ text $ show (line + 1) 347 | , text ":" 348 | , text $ show (column + 1) 349 | ] 350 | 351 | printRange :: Range -> PrettyDoc 352 | printRange range@(Range pos1 pos2) 353 | | range == emptyRange = 354 | dim $ text "[internal]" 355 | | otherwise = 356 | foreground Yellow $ enclose (text "[") (text "]") $ words 357 | [ printPosition pos1 358 | , text "-" 359 | , printPosition pos2 360 | ] 361 | -------------------------------------------------------------------------------- /src/Print/Precedence.purs: -------------------------------------------------------------------------------- 1 | module Print.Precedence where 2 | 3 | import Prelude 4 | 5 | import Dodo (Doc, alignCurrentColumn, enclose, text) 6 | 7 | data Assoc = AssocLeft | AssocRight | NoAssoc 8 | 9 | data Fixity p = Fixity Assoc p | NoFixity 10 | 11 | data Prec p a = Prec (Fixity p) a 12 | 13 | unPrec :: forall p a. Prec p a -> a 14 | unPrec (Prec _ a) = a 15 | 16 | binaryop 17 | :: forall p a 18 | . 
Ord p 19 | => Assoc 20 | -> p 21 | -> (Doc a -> Doc a -> Doc a) 22 | -> Prec p (Doc a) 23 | -> Prec p (Doc a) 24 | -> Prec p (Doc a) 25 | binaryop assoc1 p1 op a b = 26 | Prec (Fixity assoc1 p1) $ op lhs rhs 27 | where 28 | lhs = parensLeft assoc1 p1 a 29 | rhs = parensRight assoc1 p1 b 30 | 31 | parens :: forall a. Doc a -> Doc a 32 | parens = enclose (text "(") (text ")") <<< alignCurrentColumn 33 | 34 | parensLeft :: forall p a. Ord p => Assoc -> p -> Prec p (Doc a) -> Doc a 35 | parensLeft assoc1 p1 (Prec f doc) = case assoc1, f of 36 | AssocLeft, Fixity AssocLeft p2 37 | | p2 < p1 -> parens doc 38 | | otherwise -> doc 39 | _, Fixity _ p2 40 | | p2 <= p1 -> parens doc 41 | _, _ -> doc 42 | 43 | parensRight :: forall p a. Ord p => Assoc -> p -> Prec p (Doc a) -> Doc a 44 | parensRight assoc1 p1 (Prec f doc) = case assoc1, f of 45 | AssocRight, Fixity AssocRight p2 46 | | p2 < p1 -> parens doc 47 | | otherwise -> doc 48 | _, Fixity _ p2 49 | | p2 <= p1 -> parens doc 50 | _, _ -> doc 51 | 52 | term :: forall p a. a -> Prec p a 53 | term = Prec NoFixity 54 | 55 | prec :: forall p a. p -> a -> Prec p a 56 | prec = Prec <<< Fixity NoAssoc 57 | -------------------------------------------------------------------------------- /src/Syntax/Lexer.purs: -------------------------------------------------------------------------------- 1 | module Syntax.Lexer 2 | ( TokenStream(..) 3 | , TokenStep(..) 4 | , lex 5 | ) where 6 | 7 | import Prelude 8 | 9 | import Control.Plus ((<|>)) 10 | import Data.Array as Array 11 | import Data.Either (Either(..)) 12 | import Data.Foldable (foldl, oneOf) 13 | import Data.Int as Int 14 | import Data.Lazy (Lazy) 15 | import Data.Lazy as Lazy 16 | import Data.Maybe (Maybe(..), maybe) 17 | import Data.String (Pattern(..), split) 18 | import Data.String as String 19 | import Data.String.CodeUnits as SCU 20 | import Partial.Unsafe (unsafeCrashWith) 21 | import Syntax.Token (Comment(..), Position(..), PositionedToken(..), Range(..), Token(..)) 22 | import Text.Parsing.StringParser (ParseError(..), Parser(..), PosString, fail) 23 | import Text.Parsing.StringParser.CodePoints (regex) 24 | 25 | -- A TokenStream is a lazy cons-list of tokens, terminating with either an 26 | -- end-of-file node, or with an error while parsing a token. This lets us parse 27 | -- on-demand, and avoid lexing an entire document just to potentially fail in 28 | -- the language parser. We can also backtrack in the language parser without 29 | -- having to re-lex any tokens. 30 | newtype TokenStream = TokenStream (Lazy TokenStep) 31 | 32 | data TokenStep 33 | = TokenError Position String 34 | -- There may be trailing comments in a module, so we emit those as well as the 35 | -- final position in the document. 36 | | TokenEOF (Array Comment) Position 37 | | TokenCons PositionedToken Position TokenStream 38 | 39 | -- Lexing proceeds by: 40 | -- * Parsing a single token 41 | -- * Parsing trailing comments (comments up until the next token or newline) 42 | -- * Parsing leading comments for the _next_ token. 43 | -- * Emitting a PositiongedToken and a Position for the start of the next token. 44 | lex :: String -> TokenStream 45 | lex = init <<< { str: _, pos: 0 } 46 | where 47 | -- We need to prime the token stream by parsing the initial leading comments 48 | -- for the first token, at which point we can proceed with the main parsing 49 | -- loop. 
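  --
  -- As a rough illustration of how comments get attached (hypothetical input,
  -- not a test case): given
  --
  --   -- intro
  --   let x = 1 -- note
  --
  -- "-- intro" is parsed up front and becomes a leading comment of the `let`
  -- token, while "-- note" ends up in the trailing comments of the `1` token.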
50 | init :: PosString -> TokenStream 51 | init str = TokenStream $ Lazy.defer \_ -> do 52 | let Parser k = leadingComments 53 | case k str of 54 | Left _ -> 55 | -- Since leading comments are completely optional, it's not possible 56 | -- for the parser to fail. An alternative is to always parse non-empty 57 | -- comments, treating failure as empty comments. 58 | unsafeCrashWith "Leading comments can't fail." 59 | Right { result: leading, suffix } -> do 60 | let TokenStream next = go (foldl bumpComment (Position 0 0) leading) leading suffix 61 | Lazy.force next 62 | 63 | go :: Position -> Array Comment -> PosString -> TokenStream 64 | go startPos leading str = TokenStream $ Lazy.defer \_ -> 65 | if str.pos == SCU.length str.str then 66 | TokenEOF leading startPos 67 | else do 68 | let Parser k = token' 69 | case k str of 70 | Left { pos, error: ParseError error } -> 71 | -- The parser position may potentially be in the middle of a token, 72 | -- depending on how the token parser is implemented. This takes the 73 | -- difference and adjusts the position accordingly. 74 | TokenError (bumpMultiline startPos (String.take pos str.str)) error 75 | Right { result, suffix } -> do 76 | let 77 | endPos = bumpToken startPos result.token 78 | nextStart = foldl bumpComment (foldl bumpComment endPos result.trailing) result.nextLeading 79 | posToken = PositionedToken leading (Range startPos endPos) result.token result.trailing 80 | TokenCons posToken nextStart (go nextStart result.nextLeading suffix) 81 | 82 | token' :: Parser { token :: Token, trailing :: Array Comment, nextLeading :: Array Comment } 83 | token' = 84 | { token: _, trailing: _, nextLeading: _ } 85 | <$> token 86 | <*> trailingComments 87 | <*> leadingComments 88 | 89 | bumpToken :: Position -> Token -> Position 90 | bumpToken pos@(Position line col) = case _ of 91 | Let -> Position line (col + 3) 92 | Rec -> Position line (col + 3) 93 | Ty -> Position line (col + 4) 94 | Equals -> Position line (col + 1) 95 | Comma -> Position line (col + 1) 96 | Colon -> Position line (col + 1) 97 | Semicolon -> Position line (col + 1) 98 | Lambda -> Position line (col + 1) 99 | Arrow -> Position line (col + 2) 100 | ParenLeft -> Position line (col + 1) 101 | ParenRight -> Position line (col + 1) 102 | SquareLeft -> Position line (col + 1) 103 | SquareRight -> Position line (col + 1) 104 | Identifier ident -> Position line (col + String.length ident) 105 | Int raw _ -> Position line (col + String.length raw) 106 | String raw _ -> bumpMultiline pos raw 107 | 108 | bumpComment :: Position -> Comment -> Position 109 | bumpComment pos@(Position line col) = case _ of 110 | Comment comm -> Position line (col + String.length comm) 111 | Spaces str -> bumpMultiline pos str 112 | 113 | bumpMultiline :: Position -> String -> Position 114 | bumpMultiline (Position line col) str = Position newLine newCol 115 | where 116 | lines = split (Pattern "\n") str 117 | lastLine = maybe 0 String.length $ Array.last lines 118 | newLine = line + Array.length lines - 1 119 | newCol 120 | | line == newLine = col + lastLine 121 | | otherwise = lastLine 122 | 123 | leadingComments :: Parser (Array Comment) 124 | leadingComments = Array.many $ leadingSpace <|> lineComment 125 | 126 | trailingComments :: Parser (Array Comment) 127 | trailingComments = Array.many $ trailingSpace <|> lineComment 128 | 129 | leadingSpace :: Parser Comment 130 | leadingSpace = Spaces <$> regex "(\\s|\\n)+" 131 | 132 | trailingSpace :: Parser Comment 133 | trailingSpace = Spaces <$> regex "\\s+" 134 | 
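  -- A quick informal sanity check for `bumpMultiline` above (hand-evaluated
  -- examples, not executed tests):
  --
  --   bumpMultiline (Position 2 5) "abc"    == Position 2 8
  --   bumpMultiline (Position 2 5) "ab\ncd" == Position 3 2
  --
  -- i.e. single-line text only advances the column, while text containing a
  -- newline moves to the last line and restarts the column at the length of
  -- that final line.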
135 | lineComment :: Parser Comment 136 | lineComment = Comment <$> regex "--[^\\n]*" 137 | 138 | -- This is not necessarily an efficient token parser, but it is straightforward. 139 | -- Tokens can effectively be parsed with basic regular expressions since they are 140 | -- not recursive. This is a place to aggressively optimize with a hand-written 141 | -- parser if we need more overall parser performance. 142 | token :: Parser Token 143 | token = oneOf 144 | [ Equals <$ regex "=" 145 | , Comma <$ regex "," 146 | , Colon <$ regex ":" 147 | , Semicolon <$ regex ";" 148 | , Lambda <$ regex "\\\\" 149 | , Arrow <$ regex "->" 150 | , ParenLeft <$ regex "\\(" 151 | , ParenRight <$ regex "\\)" 152 | , SquareLeft <$ regex "\\[" 153 | , SquareRight <$ regex "\\]" 154 | , identifierOrKeyword <$> regex "[_a-zA-Z][_a-zA-Z0-9]*" 155 | , int 156 | , string 157 | ] <|> fail "Unexpected input" 158 | where 159 | int = do 160 | raw <- regex "[0-9]+" 161 | case Int.fromString raw of 162 | Just n -> pure $ Int raw n 163 | Nothing -> fail "Invalid integer" 164 | 165 | string = do 166 | raw <- regex "\"(\\\\\"|[^\"])*\"" 167 | pure $ String raw $ SCU.drop 1 $ SCU.dropRight 1 raw 168 | 169 | identifierOrKeyword = case _ of 170 | "let" -> Let 171 | "rec" -> Rec 172 | "type" -> Ty 173 | ident -> Identifier ident 174 | -------------------------------------------------------------------------------- /src/Syntax/Parser.purs: -------------------------------------------------------------------------------- 1 | module Syntax.Parser 2 | ( Parser 3 | , runParser 4 | , parseModule 5 | , parseDecl 6 | , parseExpr 7 | , parseType 8 | ) where 9 | 10 | import Prelude 11 | 12 | import Control.Alternative ((<|>)) 13 | import Control.Lazy (defer) 14 | import Control.Monad.Error.Class (catchError, throwError) 15 | import Control.Monad.Free (Free, runFree) 16 | import Control.Monad.State (gets, put) 17 | import Data.Either (Either) 18 | import Data.Identity (Identity(..)) 19 | import Data.Lazy as Lazy 20 | import Data.List (List) 21 | import Data.List as List 22 | import Data.List.NonEmpty as NonEmptyList 23 | import Data.List.Types (NonEmptyList) 24 | import Data.Maybe (Maybe(..)) 25 | import Data.Newtype (un) 26 | import Data.String as String 27 | import Data.Tuple (Tuple(..)) 28 | import Syntax.Lexer (TokenStep(..), TokenStream(..), lex) 29 | import Syntax.Token (Comment, Position(..), PositionedToken(..), Range(..), Token(..), printTokenName) 30 | import Syntax.Token as Token 31 | import Syntax.Tree (Argument(..), Decl(..), Delimited(..), Expr(..), LetBinding(..), Lit(..), Module(..), Ty(..), TypeArgument(..), Typed(..), TypedIdentifier(..), Wrapped(..)) 32 | import Syntax.Tree as Tree 33 | import Text.Parsing.Parser (ParseError, ParseState(..), ParserT, failWithPosition) 34 | import Text.Parsing.Parser as Parser 35 | import Text.Parsing.Parser.Combinators (lookAhead, optionMaybe) 36 | import Text.Parsing.Parser.Pos as Pos 37 | 38 | -- Free Identity is just to act as a trampoline so we don't stack overflow in 39 | -- a JS runtime while parsing large documents. 40 | type Parser = ParserT TokenStream (Free Identity) 41 | 42 | runParser :: forall a. 
String -> Parser a -> Either ParseError a 43 | runParser s = runFree (un Identity) <<< Parser.runParserT (lex s) 44 | 45 | eof :: (Token -> String) -> Parser (Array Comment) 46 | eof mkError = do 47 | TokenStream stream <- gets \(ParseState stream _ _) -> stream 48 | case Lazy.force stream of 49 | TokenError _ error -> Parser.fail error 50 | TokenEOF leading _ -> pure leading 51 | TokenCons (PositionedToken _ _ tok _) _ _ -> do 52 | let expected = mkError tok 53 | if String.null expected then 54 | Parser.fail $ "Unexpected " <> printTokenName tok <> "; expected EOF" 55 | else 56 | Parser.fail $ "Unexpected " <> printTokenName tok <> "; expected " <> expected 57 | 58 | expectMap :: forall a. (PositionedToken -> Maybe a) -> Parser a 59 | expectMap pred = do 60 | TokenStream stream <- gets \(ParseState stream _ _) -> stream 61 | case Lazy.force stream of 62 | TokenError _ error -> Parser.fail error 63 | TokenEOF _ _ -> Parser.fail "Unexpected EOF" 64 | TokenCons ptok@(PositionedToken _ (Range pos _) tok _) nextPos next -> 65 | case pred ptok of 66 | Nothing -> 67 | failWithPosition 68 | ("Unexpected " <> printTokenName tok) 69 | (toParserPosition pos) 70 | Just a -> do 71 | put $ ParseState next (toParserPosition nextPos) true 72 | pure a 73 | 74 | -- `Text.Parsing.Parser` does not track expected alternatives, so we are doing 75 | -- it manually by munging the error string. This is not pretty, and it doesn't 76 | -- track optional branches, but it works well enough. This does not account for 77 | -- `empty` however, so we should not use `oneOf`. A different workaround might 78 | -- be to put in another state layer to track it ourselves, or to implement the 79 | -- feature upstream. 80 | withExpected :: forall a. (Maybe Token -> String) -> Parser a -> Parser a 81 | withExpected mkError = flip catchError appendError 82 | where 83 | appendError perr@(Parser.ParseError err pos) = do 84 | mbTok <- lookAhead $ optionMaybe $ expect (const true) 85 | let expected = mkError (Token.token <$> mbTok) 86 | if String.null expected then 87 | throwError perr 88 | else 89 | case String.lastIndexOf (String.Pattern "; expected ") err of 90 | Just ix1 -> 91 | case String.lastIndexOf (String.Pattern expected) err of 92 | Just ix2 | ix2 > ix1 -> 93 | throwError perr 94 | _ -> do 95 | let err' = err <> ", " <> expected 96 | throwError $ Parser.ParseError err' pos 97 | _ -> do 98 | let err' = err <> "; expected " <> expected 99 | throwError $ Parser.ParseError err' pos 100 | 101 | toParserPosition :: Position -> Pos.Position 102 | toParserPosition (Position line column) = Pos.Position { line, column } 103 | 104 | expect :: (Token -> Boolean) -> Parser PositionedToken 105 | expect pred = expectMap \ptok@(PositionedToken _ _ tok _) -> 106 | if pred tok then Just ptok else Nothing 107 | 108 | token :: Token -> Parser PositionedToken 109 | token tok = withExpected (\_ -> printTokenName tok) $ expect (eq tok) 110 | 111 | delimited :: forall a. Token -> Parser a -> Parser (Delimited a) 112 | delimited delim p = go List.Nil =<< p 113 | where 114 | -- Why not use List.many? The naive Applicative implementation of this 115 | -- parser would be: 116 | -- ```purescript 117 | -- Delimited 118 | -- <$> p 119 | -- <*> List.many (try (Tuple <$> token delim <*> p)) 120 | -- <*> optionMaybe (token delim) 121 | -- ``` 122 | -- But the necessary `try` (necessary due to supporting a trailing 123 | -- delimiter) is very problematic. 
Introducing `try` means our parser will 124 | -- backtrack to this point if there is _any_ failure in `p`. This is much 125 | -- too far, and will lead to extremely poor errors. Any failure in this term 126 | -- parser will result in trying to close off this result and continuing. We 127 | -- want parse failures in `p` to be reported instead. 128 | go :: List (Tuple PositionedToken a) -> a -> Parser (Delimited a) 129 | go stk head = 130 | optionMaybe (token delim) >>= case _ of 131 | Nothing -> pure $ done stk head Nothing 132 | Just tok -> 133 | optionMaybe p >>= case _ of 134 | Nothing -> pure $ done stk head (Just tok) 135 | Just next -> go (List.Cons (Tuple tok next) stk) head 136 | 137 | done :: List (Tuple PositionedToken a) -> a -> Maybe PositionedToken -> Delimited a 138 | done stk head last = Delimited head (List.reverse stk) last 139 | 140 | wrapped :: forall a. Token -> Parser a -> Token -> Parser (Wrapped a) 141 | wrapped l p r = Wrapped <$> token l <*> p <*> token r 142 | 143 | many1 :: forall a. Parser a -> Parser (NonEmptyList a) 144 | many1 p = NonEmptyList.cons' <$> p <*> List.many p 145 | 146 | parseModule :: Parser Module 147 | parseModule = Module <$> delimited Semicolon parseDecl <*> (eof eofTokenError) 148 | where 149 | -- It is easy to miss a semicolon separator between lets. We special case the 150 | -- next token on `Let` to suggest inserting a semicolon. 151 | eofTokenError = case _ of 152 | Let -> printTokenName Semicolon 153 | _ -> printTokenName Let 154 | 155 | parseDecl :: Parser Decl 156 | parseDecl = DeclLet <$> token Let <*> parseLet 157 | 158 | -- All parsers for `Let` must be deferred, because `Let` contains an expression, 159 | -- which can contain other `Let`s. Parser cycles must be evaluated on demand. 160 | parseLet :: Parser Tree.Let 161 | parseLet = defer \_ -> 162 | Tree.LetOne <$> parseLetDefinition 163 | <|> Tree.LetRec <$> many1 (Tuple <$> token Rec <*> parseLetDefinition) 164 | 165 | parseLetDefinition :: Parser Tree.LetDefinition 166 | parseLetDefinition = defer \_ -> 167 | Tree.LetDefinition 168 | <$> parseLetBinding 169 | <*> token Equals 170 | <*> parseExpr 171 | 172 | parseLetBinding :: Parser LetBinding 173 | parseLetBinding = defer \_ -> 174 | LetBinding 175 | <$> parseIdentifier 176 | <*> optionMaybe parseQuantifier 177 | <*> List.many parseArgument 178 | <*> optionMaybe parseTyped 179 | where 180 | parseQuantifier = 181 | wrapped SquareLeft (delimited Comma parseTypeArgument) SquareRight 182 | 183 | parseArgument :: Parser Argument 184 | parseArgument = parseInferredArgument <|> parseTypedArgument 185 | where 186 | parseInferredArgument = 187 | InferredArgument 188 | <$> parseIdentifier 189 | 190 | parseTypedArgument = 191 | TypedArgument 192 | <$> wrapped ParenLeft parseTypedIdentifier ParenRight 193 | 194 | parseTypedIdentifier = 195 | TypedIdentifier 196 | <$> parseIdentifier 197 | <*> token Colon 198 | <*> parseType 199 | 200 | parseTypeArgument :: Parser TypeArgument 201 | parseTypeArgument = TypeArgument <$> parseIdentifier <*> optionMaybe parseTyped 202 | 203 | -- All parsers for `Expr` must be deferred since it is recursive. 204 | parseExpr :: Parser Expr 205 | parseExpr = defer \_ -> do 206 | expr <- parseExpr1 207 | ExprTyped expr <$> token Colon <*> parseType 208 | <|> pure expr 209 | 210 | parseExpr1 :: Parser Expr 211 | parseExpr1 = defer \_ -> 212 | parseExprLet <|> parseExpr2 213 | where 214 | -- Staging `ExprLet` before `ExprApp` means that you cannot use `let` as 215 | -- an argument without wrapping it in parens. 
This is done to yield better 216 | -- errors. Since this syntax is not whitespace sensitive, and semicolons are 217 | -- used as a separator, we can get odd errors in a top-level declaration if 218 | -- the semicolon is omitted. For example, if we allowed let as an argument: 219 | -- ``` 220 | -- let foo = 42 221 | -- let bar = 12; 222 | -- ``` 223 | -- This would parse as: 224 | -- ``` 225 | -- let foo = 42 (let bar = 12;) 226 | -- ``` 227 | -- Yielding an "Expected expression" error _after_ the `bar` declaration. By 228 | -- not allowing let as an argument, the error is presented after `foo`, which 229 | -- is what we want. 230 | parseExprLet = defer \_ -> do 231 | ExprLet 232 | <$> token Let 233 | <*> parseLet 234 | <*> token Semicolon 235 | <*> parseExpr 236 | 237 | parseExpr2 :: Parser Expr 238 | parseExpr2 = defer \_ -> do 239 | expr <- parseExpr3 240 | ExprApp expr <$> many1 parseExpr3 241 | <|> pure expr 242 | 243 | parseExpr3 :: Parser Expr 244 | parseExpr3 = defer \_ -> 245 | parseExprFun <|> parseExprAtom 246 | where 247 | parseExprFun = defer \_ -> do 248 | ExprFun 249 | <$> token Lambda 250 | <*> many1 parseArgument 251 | <*> token Arrow 252 | <*> parseExpr 253 | 254 | parseExprAtom :: Parser Expr 255 | parseExprAtom = withExpected (const "expression") $ defer \_ -> 256 | ExprLit <$> parseLit 257 | <|> ExprVar <$> parseIdentifier 258 | <|> ExprParens <$> wrapped ParenLeft parseExpr ParenRight 259 | 260 | -- All parsers for `Ty` must be deferred since it is recursive. 261 | parseType :: Parser Ty 262 | parseType = defer \_ -> do 263 | ty <- parseType1 264 | TyTyped ty <$> token Colon <*> parseType1 265 | <|> pure ty 266 | 267 | parseType1 :: Parser Ty 268 | parseType1 = defer \_ -> do 269 | ty <- parseType2 270 | TyArrow ty <$> token Arrow <*> parseType 271 | <|> pure ty 272 | 273 | parseType2 :: Parser Ty 274 | parseType2 = defer \_ -> do 275 | atom <- parseTypeAtom 276 | TyApp atom <$> many1 parseTypeAtom 277 | <|> pure atom 278 | 279 | parseTypeAtom :: Parser Ty 280 | parseTypeAtom = withExpected (const "type") $ defer \_ -> 281 | TyType <$> token Ty 282 | <|> TyVar <$> parseIdentifier 283 | <|> TyParens <$> wrapped ParenLeft parseType ParenRight 284 | 285 | parseTyped :: Parser Typed 286 | parseTyped = Typed <$> token Colon <*> parseType 287 | 288 | parseLit :: Parser Lit 289 | parseLit = parseInt <|> parseString 290 | where 291 | parseInt = expectMap \ptok@(PositionedToken _ _ tok _) -> 292 | case tok of 293 | Int _ value -> Just $ LitInt ptok value 294 | _ -> Nothing 295 | 296 | parseString = expectMap \ptok@(PositionedToken _ _ tok _) -> 297 | case tok of 298 | String _ value -> Just $ LitString ptok value 299 | _ -> Nothing 300 | 301 | parseIdentifier :: Parser Tree.Identifier 302 | parseIdentifier = expectMap \ptok@(PositionedToken _ _ tok _) -> 303 | case tok of 304 | Identifier ident -> Just $ Tree.Identifier ptok ident 305 | _ -> Nothing 306 | -------------------------------------------------------------------------------- /src/Syntax/Token.purs: -------------------------------------------------------------------------------- 1 | module Syntax.Token where 2 | 3 | import Prelude 4 | 5 | data Token 6 | = Let 7 | | Rec 8 | | Ty 9 | | Equals 10 | | Comma 11 | | Colon 12 | | Semicolon 13 | | Lambda 14 | | Arrow 15 | | ParenLeft 16 | | ParenRight 17 | | SquareLeft 18 | | SquareRight 19 | | Identifier String 20 | | Int String Int 21 | | String String String 22 | 23 | derive instance eqToken :: Eq Token 24 | 25 | data Position = Position Int Int 26 | 27 | derive instance 
eqPosition :: Eq Position 28 | derive instance ordPosition :: Ord Position 29 | 30 | data Range = Range Position Position 31 | 32 | derive instance eqRange :: Eq Range 33 | derive instance ordRange :: Ord Range 34 | 35 | instance semigroupRange :: Semigroup Range where 36 | append (Range start1 end1) (Range start2 end2) 37 | | start1 < start2 = Range start1 end2 38 | | otherwise = Range start2 end2 39 | 40 | data Comment 41 | = Comment String 42 | | Spaces String 43 | 44 | derive instance eqComment :: Eq Comment 45 | 46 | data PositionedToken 47 | = PositionedToken (Array Comment) Range Token (Array Comment) 48 | 49 | token :: PositionedToken -> Token 50 | token (PositionedToken _ _ tok _) = tok 51 | 52 | range1 :: PositionedToken -> Range 53 | range1 (PositionedToken _ range _ _) = range 54 | 55 | range2 :: PositionedToken -> PositionedToken -> Range 56 | range2 (PositionedToken _ start _ _) (PositionedToken _ end _ _) = start <> end 57 | 58 | emptyRange :: Range 59 | emptyRange = Range (Position 0 0) (Position 0 0) 60 | 61 | printTokenName :: Token -> String 62 | printTokenName = case _ of 63 | Let -> "'let'" 64 | Rec -> "'rec'" 65 | Ty -> "'type'" 66 | Equals -> "'='" 67 | Comma -> "','" 68 | Colon -> "':'" 69 | Semicolon -> "';'" 70 | Lambda -> "'\\'" 71 | Arrow -> "'->'" 72 | ParenLeft -> "'('" 73 | ParenRight -> "')'" 74 | SquareLeft -> "'['" 75 | SquareRight -> "']'" 76 | Identifier _ -> "identifier" 77 | Int _ _ -> "int literal" 78 | String _ _ -> "string literal" 79 | -------------------------------------------------------------------------------- /src/Syntax/Tree.purs: -------------------------------------------------------------------------------- 1 | module Syntax.Tree where 2 | 3 | import Prelude 4 | 5 | import Data.List as List 6 | import Data.List.NonEmpty as NonEmptyList 7 | import Data.List.Types (List, NonEmptyList) 8 | import Data.Maybe (Maybe(..)) 9 | import Data.Tuple (Tuple, fst, snd) 10 | import Syntax.Token (Comment, PositionedToken, Range, range1, range2) 11 | 12 | data Module = 13 | Module (Delimited Decl) (Array Comment) 14 | 15 | data Wrapped a = 16 | Wrapped PositionedToken a PositionedToken 17 | 18 | data Delimited a = 19 | Delimited a (List (Tuple PositionedToken a)) (Maybe PositionedToken) 20 | 21 | data Identifier = 22 | Identifier PositionedToken String 23 | 24 | data Decl = 25 | DeclLet PositionedToken Let 26 | 27 | data Let 28 | = LetOne LetDefinition 29 | | LetRec (NonEmptyList (Tuple PositionedToken LetDefinition)) 30 | 31 | data LetDefinition = 32 | LetDefinition LetBinding PositionedToken Expr 33 | 34 | data LetBinding = 35 | LetBinding Identifier (Maybe (Wrapped (Delimited TypeArgument))) (List Argument) (Maybe Typed) 36 | 37 | data Argument 38 | = InferredArgument Identifier 39 | | TypedArgument (Wrapped TypedIdentifier) 40 | 41 | data TypeArgument = 42 | TypeArgument Identifier (Maybe Typed) 43 | 44 | data TypedIdentifier = 45 | TypedIdentifier Identifier PositionedToken Ty 46 | 47 | data Expr 48 | = ExprLit Lit 49 | | ExprVar Identifier 50 | | ExprApp Expr (NonEmptyList Expr) 51 | | ExprFun PositionedToken (NonEmptyList Argument) PositionedToken Expr 52 | | ExprLet PositionedToken Let PositionedToken Expr 53 | | ExprTyped Expr PositionedToken Ty 54 | | ExprParens (Wrapped Expr) 55 | 56 | data Lit 57 | = LitInt PositionedToken Int 58 | | LitString PositionedToken String 59 | 60 | data Ty 61 | = TyType PositionedToken 62 | | TyVar Identifier 63 | | TyApp Ty (NonEmptyList Ty) 64 | | TyArrow Ty PositionedToken Ty 65 | | TyTyped Ty PositionedToken Ty 
66 | | TyParens (Wrapped Ty) 67 | 68 | data Typed = 69 | Typed PositionedToken Ty 70 | 71 | rangeOfLet :: Let -> Range 72 | rangeOfLet = case _ of 73 | LetOne def -> 74 | rangeOfLetDefinition def 75 | LetRec defs -> 76 | range1 (fst (NonEmptyList.head defs)) 77 | <> rangeOfLetDefinition (snd (NonEmptyList.last defs)) 78 | 79 | rangeOfLetDefinition :: LetDefinition -> Range 80 | rangeOfLetDefinition (LetDefinition binding _ expr) = 81 | rangeOfLetBinding binding <> rangeOfExpr expr 82 | 83 | rangeOfLetBinding :: LetBinding -> Range 84 | rangeOfLetBinding (LetBinding (Identifier tok _) tyArgs args returnTy) 85 | | Just (Typed _ ty) <- returnTy = range1 tok <> rangeOfType ty 86 | | Just arg <- List.last args = range1 tok <> rangeOfArgument arg 87 | | Just (Wrapped _ _ tok2) <- tyArgs = range1 tok <> range1 tok2 88 | | otherwise = range1 tok 89 | 90 | rangeOfArgument :: Argument -> Range 91 | rangeOfArgument = case _ of 92 | InferredArgument (Identifier tok _) -> range1 tok 93 | TypedArgument (Wrapped tok1 _ tok2) -> range1 tok1 <> range1 tok2 94 | 95 | rangeOfExpr :: Expr -> Range 96 | rangeOfExpr = case _ of 97 | ExprLit lit -> rangeOfLit lit 98 | ExprVar (Identifier tok _) -> range1 tok 99 | ExprApp fn args -> rangeOfExpr fn <> rangeOfExpr (NonEmptyList.last args) 100 | ExprFun tok _ _ expr -> range1 tok <> rangeOfExpr expr 101 | ExprLet tok _ _ expr -> range1 tok <> rangeOfExpr expr 102 | ExprTyped expr _ ty -> rangeOfExpr expr <> rangeOfType ty 103 | ExprParens (Wrapped tok1 _ tok2) -> range2 tok1 tok2 104 | 105 | rangeOfLit :: Lit -> Range 106 | rangeOfLit = case _ of 107 | LitInt tok _ -> range1 tok 108 | LitString tok _ -> range1 tok 109 | 110 | rangeOfType :: Ty -> Range 111 | rangeOfType = case _ of 112 | TyType tok -> range1 tok 113 | TyVar (Identifier tok _) -> range1 tok 114 | TyApp ctor args -> rangeOfType ctor <> rangeOfType (NonEmptyList.last args) 115 | TyArrow arg _ result -> rangeOfType arg <> rangeOfType result 116 | TyTyped ty1 _ ty2 -> rangeOfType ty1 <> rangeOfType ty2 117 | TyParens (Wrapped tok1 _ tok2) -> range2 tok1 tok2 118 | -------------------------------------------------------------------------------- /test/test.flub: -------------------------------------------------------------------------------- 1 | let apply[A, B] (f : A -> B) (a : A) : B = f a; 2 | let const[A, B] (a : A) (b : B) : A = a; 3 | let id[A] (a : A) : A = a; 4 | let compose[A, B, C] (f : B -> C) (g : A -> B) : A -> C = \a -> f (g a); 5 | let flip[A, B, C] (f : A -> B -> C) : B -> A -> C = \b a -> f a b; 6 | 7 | let and[A] (l : A -> A -> A) (r : A -> A -> A) (t : A) (f : A) : A = l (r t f) f; 8 | let or[A] (l : A -> A -> A) (r : A -> A -> A) (t : A) (f : A) : A = l t (r t f); 9 | 10 | let true[A] (t : A) (f : A): A = t; 11 | let false[A] (t : A) (f : A): A = f; 12 | 13 | let pair[Fst, Snd, Res] (a : Fst) (b : Snd) : (Fst -> Snd -> Res) -> Res = \k -> k a b; 14 | let fst[Fst, Snd] (p : (Fst -> Snd -> Fst) -> Fst) : Fst = p const; 15 | let snd[Fst, Snd] (p : (Fst -> Snd -> Snd) -> Snd) : Snd = p \a b -> b; 16 | 17 | let isNil[Elem, List, Bool] (list : (Elem -> List -> Bool) -> Bool -> Bool) (true : Bool) (false : Bool) : Bool = 18 | list (\_ _ -> false) true; 19 | 20 | let headOrElse[Elem, List] (else : Elem) (list : (Elem -> List -> Elem) -> Elem -> Elem) : Elem = 21 | list const else; 22 | 23 | let myList[List, A] (cons : (A -> A -> A) -> List -> List) (nil : List) : List = 24 | cons false (cons false (cons false nil)); 25 | 26 | let test = myList or false "ok" "not-ok"; 27 | 28 | -- let rec goa (a 
: Int) : Int = gob a 29 | -- rec gob (b : Int) : Int = goa b; 30 | --------------------------------------------------------------------------------