Refactoring Metalua into two separate LuaRocks modules
diff --git a/README-compiler.md b/README-compiler.md
new file mode 100644
index 0000000..b2679cd
--- /dev/null
+++ b/README-compiler.md
@@ -0,0 +1,104 @@
+Metalua Compiler
+================
+
+## Metalua compiler
+
+This module `metalua-compiler` depends on `metalua-parser`. Its main
+feature is to compile ASTs into Lua 5.1 bytecode, so that they can be
+turned into bytecode files and executable functions. This opens the
+following possibilities:
+
+* compiler objects generated with `require 'metalua.compiler'.new()`
+  support methods `:xxx_to_function()` and `:xxx_to_bytecode()`;
+
+* Compile-time meta-programming: use of `-{...}` splices in source
+  code, to generate code during compilation;
+
+* Some syntax extensions, such as structural pattern matching and
+  list comprehensions;
+
+* Some AST manipulation facilities such as `treequery`, which are
+  implemented with Metalua syntax extensions.
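+
+For instance, assuming the `metalua-compiler` rock is installed,
+compiling a source string down to an executable function uses one of
+the `:xxx_to_yyy` methods mentioned above:
+
+    local mlc = require 'metalua.compiler'.new()
+    local f = mlc :src_to_function [[ return 1 + 2 ]]
+    print(f()) -- 3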
+
+## What's new in Metalua 0.7
+
+This is a major overhaul of the compiler's architecture. Some of the
+most noteworthy changes are:
+
+* No more installation or bootstrap script. Some Metalua source files
+  have been rewritten in plain Lua, and module sources have been
+  refactored, so that if you just drop the `metalua` folder somewhere
+  in your `LUA_PATH`, it works.
+
+* The compiler can be cut in two parts:
+
+  * a parser which generates ASTs out of Lua sources, and should be
+    either portable or easily ported to Lua 5.2;
+
+  * a compiler, which can turn sources and ASTs into executable
+    Lua 5.1 bytecode and run it. It also supports compile-time
+    meta-programming, i.e. code included between `-{ ... }` is
+    executed during compilation, and the ASTs it produces are
+    included in the resulting bytecode.
+
+* Both parts are packaged as separate LuaRocks, `metalua-parser` and
+  `metalua-compiler` respectively, so that you can install the former
+  without the latter.
+
+* The parser is not a unique object anymore. Instead,
+  `require "metalua.compiler".new()` returns a different compiler
+  instance every time it's called. Compiler instances can be reused on
+  as many source files as wanted, but extending one instance's grammar
+  doesn't affect other compiler instances.
+
+* The bundled standard library has been shed. There are too many
+  standard libraries in Lua, and none of them is standard enough;
+  offering yet another one, coupled to a specific compiler, could
+  only add to the confusion.
+
+* Many syntax extensions have been removed: they either were arguably
+  more code samples than actual production-ready tools, or relied too
+  heavily on the removed runtime standard libraries.
+
+* The remaining libraries and samples are:
+
+  * `metalua.compiler` converts sources into ASTs, bytecode,
+    functions, and ASTs back into sources.
+
+  * `metalua` compiles and/or executes files from the command line,
+    can start an interactive REPL session.
+
+  * `metalua.loader` adds a package loader which makes it possible to
+    use modules written in Metalua, even from a plain Lua program.
+
+  * `metalua.treequery` is an advanced DSL for searching ASTs in
+    a smart way, e.g. "_search `return` statements which return a
+    `local` variable but aren't in a nested `function`_".
+
+  * `metalua.extension.comprehension` is a language extension which
+    supports list comprehensions
+    (`even = { i for i=1, 100 if i%2==0 }`) and improved loops
+    (`for i=1, 10 for j=1,10 if i~=j do print(i,j) end`).
+
+  * `metalua.extension.match` is a language extension which offers
+    Haskell/ML structural pattern matching
+    (``match AST with `Function{ args, body } -> ... | `Number{ 0 } -> ... end``).
+
+  * **TODO: Move basic extensions into a separate module.**
+
+* To remove the compilation speed penalty associated with
+  meta-programming, when the environment variable `LUA_MCACHE` or the
+  Lua variable `package.mcache` is defined and LuaFileSystem is
+  available, the results of Metalua source compilations are cached.
+  Unless the source file is more recent than the latest cached
+  bytecode file, the latter is loaded instead of the former.
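+
+A minimal plain-Lua sketch of that policy, assuming LuaFileSystem is
+available (`srcfile` and `cachefile` are hypothetical paths; the
+actual implementation may differ):
+
+    local lfs = require 'lfs'
+    local src_time   = lfs.attributes(srcfile,   'modification')
+    local cache_time = lfs.attributes(cachefile, 'modification')
+    if cache_time and src_time and cache_time >= src_time then
+        -- cached bytecode is still fresh: load it instead of recompiling
+        chunk = loadstring(io.open(cachefile, 'rb') :read '*a')
+    end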
+
+* The LuaRocks package for the full compiler lists dependencies on
+  Readline, LuaFileSystem, and Alt-Getopt. Those projects are
+  optional, but having them automatically installed by LuaRocks offers
+  a better user experience.
+
+* The license has changed from MIT to a dual MIT + EPL license. This
+  was done in order to provide the IP guarantees expected by the
+  Eclipse Foundation, so that Metalua can be included in Eclipse's
+  [Lua Development Tools](http://www.eclipse.org/koneki/ldt/).
diff --git a/README-parser.md b/README-parser.md
new file mode 100644
index 0000000..d5edacc
--- /dev/null
+++ b/README-parser.md
@@ -0,0 +1,175 @@
+Metalua Parser
+==============
+
+`metalua-parser` is a subset of the Metalua compiler, which turns
+valid Lua source files and strings into abstract syntax trees
+(AST). This README includes a description of this AST format. People
+interested in Lua code analysis and generation are encouraged to
+produce and/or consume this format to represent ASTs.
+
+It has been designed for Lua 5.1. It hasn't been tested against
+Lua 5.2, but should be easily ported.
+
+## Usage
+
+Module `metalua.compiler` has a `new()` function, which returns a
+compiler instance. This instance has a set of methods of the form
+`:xxx_to_yyy(input)`, where `xxx` and `yyy` must be one of the
+following:
+
+* `srcfile` the name of a Lua source file;
+* `src` a string containing the Lua source of a list of statements;
+* `lexstream` a lexical tokens stream;
+* `ast` an abstract syntax tree;
+* `bytecode` a chunk of Lua bytecode that can be loaded in a Lua 5.1
+  VM (not available if you only installed the parser);
+* `function` an executable Lua function.
+
+Compiling into bytecode or executable functions requires the whole
+Metalua compiler, not only the parser. The most frequently used
+functions are `:src_to_ast(source_string)` and
+`:srcfile_to_ast("path/to/source/file.lua")`.
+
+    mlc = require 'metalua.compiler'.new()
+    ast = mlc :src_to_ast[[ return 123 ]]
+
+A compiler instance can be reused as much as you want; it's only
+interesting to work with more than one compiler instance when you
+start extending their grammars.
+
+## Abstract Syntax Trees definition
+
+### Notation
+
+Trees are written below with some Metalua syntax sugar, which
+increases their readability. The backquote symbol introduces a `tag`,
+i.e. a string stored in the `"tag"` field of a table:
+
+* `` `Foo{ 1, 2, 3 }`` is a shortcut for `{tag="Foo", 1, 2, 3}`;
+* `` `Foo`` is a shortcut for `{tag="Foo"}`;
+* `` `Foo 123`` is a shortcut for `` `Foo{ 123 }``, and therefore
+  `{tag="Foo", 123 }`; the expression after the tag must be a literal
+  number or string.
+
+When using a Metalua interpreter or compiler, the backtick syntax is
+supported and can be used directly. Metalua's pretty-printing helpers
+also try to use backtick syntax whenever applicable.
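+
+From plain Lua, where the backtick sugar isn't available, the same
+trees can be built directly as tables; the desugared equivalents of
+the three shortcuts above are:
+
+    local foo3 = { tag="Foo", 1, 2, 3 }  -- `Foo{ 1, 2, 3 }
+    local foo0 = { tag="Foo" }           -- `Foo
+    local foo1 = { tag="Foo", 123 }      -- `Foo 123
+    assert(foo3.tag == "Foo" and foo3[2] == 2)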
+
+### Tree elements
+
+Tree elements are mainly categorized into statements `stat`,
+expressions `expr` and lists of statements `block`. Auxiliary
+definitions include function application/method invocation `apply`,
+which is both a valid statement and expression, and the expressions
+admissible on the left-hand side of an assignment statement, `lhs`.
+
+    block: { stat* }
+
+    stat:
+      `Do{ stat* }
+    | `Set{ {lhs+} {expr+} }                    -- lhs1, lhs2... = e1, e2...
+    | `While{ expr block }                      -- while e do b end
+    | `Repeat{ block expr }                     -- repeat b until e
+    | `If{ (expr block)+ block? }               -- if e1 then b1 [elseif e2 then b2] ... [else bn] end
+    | `Fornum{ ident expr expr expr? block }    -- for ident = e, e[, e] do b end
+    | `Forin{ {ident+} {expr+} block }          -- for i1, i2... in e1, e2... do b end
+    | `Local{ {ident+} {expr+}? }               -- local i1, i2... = e1, e2...
+    | `Localrec{ ident expr }                   -- only used for 'local function'
+    | `Goto{ <string> }                         -- goto str
+    | `Label{ <string> }                        -- ::str::
+    | `Return{ <expr*> }                        -- return e1, e2...
+    | `Break                                    -- break
+    | apply
+
+    expr:
+      `Nil  |  `Dots  |  `True  |  `False
+    | `Number{ <number> }
+    | `String{ <string> }
+    | `Function{ { `Id{ <string> }* `Dots? } block }
+    | `Table{ ( `Pair{ expr expr } | expr )* }
+    | `Op{ opid expr expr? }
+    | `Paren{ expr }       -- significant to cut multiple values returns
+    | apply
+    | lhs
+
+    apply:
+      `Call{ expr expr* }
+    | `Invoke{ expr `String{ <string> } expr* }
+
+    lhs: `Id{ <string> } | `Index{ expr expr }
+
+    opid: 'add'   | 'sub'   | 'mul'   | 'div'
+        | 'mod'   | 'pow'   | 'concat'| 'eq'
+        | 'lt'    | 'le'    | 'and'   | 'or'
+        | 'not'   | 'len'   | 'unm'
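+
+As an illustration of the grammar above, here is the AST of the
+statement `local x = 1 + 2`, hand-built as plain Lua tables (in
+backtick notation it would read
+`` `Local{ { `Id "x" }, { `Op{ "add", `Number 1, `Number 2 } } }``):
+
+    local ast = { tag='Local',
+                  { { tag='Id', "x" } },       -- variable list
+                  { { tag='Op', "add",         -- expression list: 1 + 2
+                      { tag='Number', 1 },
+                      { tag='Number', 2 } } } }
+    assert(ast[2][1][1] == "add")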
+
+### Meta-data (lineinfo)
+
+ASTs also embed some metadata which maps them back to their source
+representation. This information is stored in a `"lineinfo"` field
+in each tree node, which points to the range of characters in the
+source string which represents it, and to the content of any comment
+that appears immediately before or after that node.
+
+Lineinfo objects have two fields, `"first"` and `"last"`, describing
+respectively the beginning and the end of the subtree in the
+sources. For instance, the sub-node `` `Number{ 123 }`` produced by parsing
+`[[return 123]]` will have `lineinfo.first` describing offset 8, and
+`lineinfo.last` describing offset 10:
+
+
+    > mlc = require 'metalua.compiler'.new()
+    > ast = mlc :src_to_ast "return 123 -- comment"
+    > print(ast[1][1].lineinfo)
+    <?|L1|C8-10|K8-10|C>
+    >
+
+A lineinfo keeps track of character offsets relative to the beginning
+of the source string/file ("K8-10" above), line numbers ("L1" above; a
+lineinfo spanning several lines would read something like "L1-10"),
+columns, i.e. offsets within the line ("C8-10" above), and a filename if
+available (the "?" mark above indicates that we have no file name, as
+the AST comes from a string). The final "|C>" indicates that there's a
+comment immediately after the node; an initial "<C|" would have meant
+that there was a comment immediately before the node.
+
+Positions represent either the end of a token and the beginning of an
+inter-token space (`"last"` fields) or the beginning of a token, and
+the end of an inter-token space (`"first"` fields). Inter-token spaces
+might be empty. They can also contain comments, which might be useful
+to link with surrounding tokens and AST subtrees.
+
+Positions are chained with their "dual" one: a position at the
+beginning of an inter-token space keeps a reference to the position at
+the end of that inter-token space in its `"facing"` field, and
+conversely, end-of-inter-token positions keep track of the inter-token
+space's beginning, also in `"facing"`. An inter-token space can be
+empty, e.g. in `"2+2"`, in which case `lineinfo==lineinfo.facing`.
+
+Comments are also kept in the `"comments"` field. If present, this
+field contains a list of comments, with a `"lineinfo"` field
+describing the span between the first and last comment. Each comment
+is represented by a list of one string, with a `"lineinfo"` describing
+the span of this comment only. Consecutive lines of `--` comments are
+considered as one comment: `"-- foo\n-- bar\n"` parses as one comment
+whose text is `"foo\nbar"`, whereas `"-- foo\n\n-- bar\n"` parses as
+two comments `"foo"` and `"bar"`.
+
+So for instance, if `f` is the AST of a function and I want to
+retrieve the comment before the function, I'd do:
+
+    f_comment = f.lineinfo.first.comments[1][1]
+
+The information in lineinfo positions, i.e. in each `"first"` and
+`"last"` field, is held in the following fields:
+
+* `"source"` the filename (optional);
+* `"offset"` the 1-based offset relative to the beginning of the string/file;
+* `"line"` the 1-based line number;
+* `"column"` the 1-based offset within the line;
+* `"facing"` the position at the opposite end of the inter-token space;
+* `"comments"` the comments in the associated inter-token space (optional);
+* `"id"` an arbitrary number, which uniquely identifies an inter-token
+  space within a given tokens stream.
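+
+As a sketch of how those fields combine, the width in characters of
+the token span covered by a node can be computed from the two offsets.
+The lineinfo table below is hand-built to mimic the `return 123`
+example above, not produced by the parser:
+
+    local li = { first = { offset = 8,  line = 1, column = 8  },
+                 last  = { offset = 10, line = 1, column = 10 } }
+    local width = li.last.offset - li.first.offset + 1
+    print(width) -- 3, the width of the token "123"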
+
diff --git a/README.md b/README.md
new file mode 100644
index 0000000..dcd10a3
--- /dev/null
+++ b/README.md
@@ -0,0 +1,13 @@
+Metalua
+=======
+
+Metalua is a Lua code analysis tool, as well as a compiler for a
+superset of Lua 5.1 supporting Compile-Time Meta-Programming. It's
+separated into two LuaRocks, `metalua-parser` and
+`metalua-compiler`. The documentation of each rock can be found in
+`README-parser.md` and `README-compiler.md`.
+
+All the code in Metalua is released under dual licenses:
+
+* MIT public license (same as Lua);
+* EPL public license (same as Eclipse).
diff --git a/metalua-compiler-0.7.2-1.rockspec b/metalua-compiler-0.7.2-1.rockspec
new file mode 100644
index 0000000..7c12bea
--- /dev/null
+++ b/metalua-compiler-0.7.2-1.rockspec
@@ -0,0 +1,60 @@
+--*-lua-*--
+package = "metalua-compiler"
+version = "0.7.2-1"
+source = {
+   url = "git://git.eclipse.org/gitroot/koneki/org.eclipse.koneki.metalua.git",
+   tag = "v0.7.2",
+}
+
+description = {
+    summary = "Metalua's compiler: converting (Meta)lua source strings and files into executable Lua 5.1 bytecode",
+   detailed = [[
+           This is the Metalua compiler, packaged as a rock, depending
+           on the separate metalua-parser AST generating library. It
+           compiles a superset of Lua 5.1 into bytecode, which can
+           then be loaded and executed by a Lua 5.1 VM. It also allows
+           dumping ASTs back into Lua source files.
+   ]],
+   homepage = "http://git.eclipse.org/c/koneki/org.eclipse.koneki.metalua.git",
+   license = "EPL + MIT"
+}
+
+dependencies = {
+    "lua ~> 5.1",              -- Lua 5.2 bytecode not supported
+    "checks >= 1.0",           -- Argument type checking
+    "luafilesystem >= 1.6.2",  -- Cached compilation based on file timestamps
+    "readline >= 1.3",         -- Better REPL experience
+    "metalua-parser == 0.7.2", -- AST production
+}
+
+build = {
+    type="builtin",
+    modules={
+        ["metalua.compiler.bytecode"] = "metalua/compiler/bytecode.lua",
+        ["metalua.compiler.globals"] = "metalua/compiler/globals.lua",
+        ["metalua.compiler.bytecode.compile"] = "metalua/compiler/bytecode/compile.lua",
+        ["metalua.compiler.bytecode.lcode"] = "metalua/compiler/bytecode/lcode.lua",
+        ["metalua.compiler.bytecode.lopcodes"] = "metalua/compiler/bytecode/lopcodes.lua",
+        ["metalua.compiler.bytecode.ldump"] = "metalua/compiler/bytecode/ldump.lua",
+        ["metalua.loader"] = "metalua/loader.lua",
+    },
+    install={lua={
+        ["metalua.treequery"] = "metalua/treequery.mlua",
+        ["metalua.compiler.ast_to_src"] = "metalua/compiler/ast_to_src.mlua",
+        ["metalua.treequery.walk"] = "metalua/treequery/walk.mlua",
+        ["metalua.extension.match"] = "metalua/extension/match.mlua",
+        ["metalua.extension.comprehension"] = "metalua/extension/comprehension.mlua",
+        ["metalua.extension.log"] = "metalua/extension/log.mlua",
+        ["metalua.repl"] = "metalua/repl.mlua",
+    }}
+}
+
+--[==[-- Generate file lists
+for _, ext in ipairs{ 'lua', 'mlua' } do
+    for filename in io.popen("find metalua -name '*."..ext.."'") :lines() do
+        local modname = filename :gsub ('/', '.') :gsub ('%.'..ext..'$', '')
+        print((' '):rep(8)..'["' .. modname .. '"] = "' ..  filename .. '",')
+    end
+    print""
+end
+--]==]--
diff --git a/metalua-parser-0.7.2-1.rockspec b/metalua-parser-0.7.2-1.rockspec
new file mode 100644
index 0000000..77718cf
--- /dev/null
+++ b/metalua-parser-0.7.2-1.rockspec
@@ -0,0 +1,42 @@
+--*-lua-*--
+package = "metalua-parser"
+version = "0.7.2-1"
+source = {
+   url = "git://git.eclipse.org/gitroot/koneki/org.eclipse.koneki.metalua.git",
+   tag = "v0.7.2",
+}
+description = {
+   summary = "Metalua's parser: converting Lua source strings and files into AST",
+   detailed = [[
+           This is a subset of the full Metalua compiler. It defines and generates an AST
+           format for Lua programs, which offers a nice level of abstraction to reason about
+           and manipulate Lua programs.
+   ]],
+   homepage = "http://git.eclipse.org/c/koneki/org.eclipse.koneki.metalua.git",
+   license = "EPL + MIT"
+}
+dependencies = {
+    "lua ~> 5.1",
+    "checks >= 1.0",
+}
+build = {
+    type="builtin",
+    modules={
+        ["metalua.grammar.generator"] = "metalua/grammar/generator.lua",
+        ["metalua.grammar.lexer"] = "metalua/grammar/lexer.lua",
+        ["metalua.compiler.parser"] = "metalua/compiler/parser.lua",
+        ["metalua.compiler.parser.common"] = "metalua/compiler/parser/common.lua",
+        ["metalua.compiler.parser.table"] = "metalua/compiler/parser/table.lua",
+        ["metalua.compiler.parser.ext"] = "metalua/compiler/parser/ext.lua",
+        ["metalua.compiler.parser.annot.generator"] = "metalua/compiler/parser/annot/generator.lua",
+        ["metalua.compiler.parser.annot.grammar"] = "metalua/compiler/parser/annot/grammar.lua",
+        ["metalua.compiler.parser.stat"] = "metalua/compiler/parser/stat.lua",
+        ["metalua.compiler.parser.misc"] = "metalua/compiler/parser/misc.lua",
+        ["metalua.compiler.parser.lexer"] = "metalua/compiler/parser/lexer.lua",
+        ["metalua.compiler.parser.meta"] = "metalua/compiler/parser/meta.lua",
+        ["metalua.compiler.parser.expr"] = "metalua/compiler/parser/expr.lua",
+        ["metalua.compiler"] = "metalua/compiler.lua",
+        ["metalua.pprint"] = "metalua/pprint.lua",
+    }
+}
+
diff --git a/metalua.lua b/metalua.lua
index 7c97af3..4641380 100644
--- a/metalua.lua
+++ b/metalua.lua
@@ -17,77 +17,46 @@
 --
 -------------------------------------------------------------------------------
 
+-- Survive lack of checks
+if not pcall(require, 'checks') then function package.preload.checks() function checks() end end end
+
 -- Main file for the metalua executable
-require 'metalua.base'
+require 'metalua.loader' -- load *.mlua files
 require 'metalua.compiler.globals' -- metalua-aware loadstring, dofile etc.
 
-local mlc    = require 'metalua.compiler'
-local clopts = require 'metalua.clopts'
+local alt_getopt = require 'alt_getopt'
+local pp  = require 'metalua.pprint'
+local mlc = require 'metalua.compiler'
 
 local M = { }
 
 local AST_COMPILE_ERROR_NUMBER        = -1
 local RUNTIME_ERROR_NUMBER            = -3
-local BYTECODE_SYNTHESE_ERROR_NUMBER  = -100
 
-local chunks  = { }
-local runargs = { }
+local alt_getopt_options = "f:l:e:o:xivaASbs"
 
-local function acc_chunk(kind)
-    return function(x)
-        table.insert (chunks, { tag=kind, x })
-    end
-end
+local long_opts = {
+    file='f',
+    library='l',
+    literal='e',
+    output='o',
+    run='x',
+    interactive='i',
+    verbose='v',
+    ['print-ast']='a',
+    ['print-ast-lineinfo']='A',
+    ['print-src']='S',
+    ['meta-bugs']='b',
+    ['sharp-bang']='s',
+}
 
-M.cmdline_parser = clopts {
-   -- Chunk loading
-   {  short = 'f', long = 'file', type = 'string', action = acc_chunk 'File',
-      usage = 'load a file to compile and/or run'
-   },
-   {  short = 'l', long = 'library', type = 'string', action = acc_chunk 'Library',
-      usage = 'load a libary from the standard paths'
-   },
-   {  short = 'e', long = 'literal', type = 'string', action = acc_chunk 'Literal',
-      usage = 'load a literal piece of source code'
-   },
-   -- What to do with chunks
-   {  short = 'o', long = 'output', type = 'string',
-      usage = 'set the target name of the next compiled file'
-   },
-   {  short = 'x', long = 'run', type = 'boolean',
-      usage = 'execute the compiled file instead of saving it (unless -o is also used)'
-   },
-   {  short = 'i', long = 'interactive', type = 'boolean',
-      usage = 'run an interactive loop after having run other files'
-   },
-   -- Advanced stuff
-   {  short = 'v', long = 'verbose', type = 'boolean',
-      usage = 'verbose mode'
-   },
-   {  short = 'a', long = 'print-ast',  type = 'boolean',
-      usage = 'print the AST resulting from file compilation'
-   },
-   {  short = 'A', long = 'print-ast-lineinfo',  type = 'boolean',
-      usage = 'print the AST resulting from file compilation, including lineinfo data'
-   },
-   {  short = 'S', long = 'print-src',  type = 'boolean',
-      usage = 'print the AST resulting from file compilation, as re-gerenerated sources'
-   },
-   {  short = 'b', long = 'metabugs', type = 'boolean',
-      usage = 'show syntax errors as compile-time execution errors'
-   },
-   {  short = 's', long = 'sharpbang', type = 'string',
-      usage = 'set a first line to add to compiled file, typically "#!/bin/env mlr"'
-   },
-   {  long  = 'no-base-lib', type = 'boolean',
-      usage = "prevent the automatic requirement of metalua base lib"
-   },
-   {  long  = '', short = 'p', type = '*',
-      action= function (newargs) runargs=table.icat(runargs, newargs) end,
-      usage = "pass all remaining arguments to the program"
-   },
+local chunk_options = {
+    library=1,
+    file=1,
+    literal=1
+}
 
-usage=[[
+local usage=[[
 
 Compile and/or execute metalua programs. Parameters passed to the
 compiler should be prefixed with an option flag, hinting what must be
@@ -106,7 +75,25 @@
   executed by default, unless a --run flag forces it to. Conversely,
   if no --output target is specified, the code is run unless ++run
   forbids it.
-]]}
+]]
+
+function M.cmdline_parser(...)
+    local argv = {...}
+    local opts, optind, optarg =
+        alt_getopt.get_ordered_opts({...}, alt_getopt_options, long_opts)
+    --pp.printf("argv=%s; opts=%s, ending at %i, with optarg=%s",
+    --          argv, opts, optind, optarg)
+    local s2l = { } -- short to long option names conversion table
+    for long, short in pairs(long_opts) do s2l[short]=long end
+    local cfg = { chunks = { } }
+    for i, short in pairs(opts) do
+        local long = s2l[short]
+        if chunk_options[long] then table.insert(cfg.chunks, { tag=long, optarg[i] })
+        else cfg[long] = optarg[i] or true end
+    end
+    cfg.params = { select(optind, ...) }
+    return cfg
+end
 
 function M.main (...)
 
@@ -117,26 +104,26 @@
    -------------------------------------------------------------------
    local function verb_print (fmt, ...)
       if cfg.verbose then
-         return printf ("[ "..fmt.." ]", ...)
+         return pp.printf ("[ "..fmt.." ]", ...)
       end
    end
 
    if cfg.verbose then
-      verb_print("raw options: %s", table.tostring(cfg))
+      verb_print("raw options: %s", cfg)
    end
 
    -------------------------------------------------------------------
    -- If there's no chunk but there are params, interpret the first
    -- param as a file name.
-   if #chunks==0 and cfg.params then
+   if not next(cfg.chunks) and next(cfg.params) then
       local the_file = table.remove(cfg.params, 1)
       verb_print("Param %q considered as a source file", the_file)
-      chunks = { {tag='File', the_file } }
+      cfg.chunks = { { tag='file', the_file } }
    end
 
    -------------------------------------------------------------------
    -- If nothing to do, run REPL loop
-   if #chunks==0 and cfg.interactive==nil then
+   if not next(cfg.chunks) and not cfg.interactive then
       verb_print "Nothing to compile nor run, force interactive loop"
       cfg.interactive=true
    end
@@ -145,7 +132,7 @@
    -------------------------------------------------------------------
    -- Run if asked to, or if no --output has been given
    -- if cfg.run==false it's been *forced* to false, don't override.
-   if cfg.run==nil and not cfg.output then
+   if not cfg.run and not cfg.output then
       verb_print("No output file specified; I'll run the program")
       cfg.run = true
    end
@@ -155,41 +142,44 @@
    -------------------------------------------------------------------
    -- Get ASTs from sources
 
-   local last_file
-   for _, x in pairs(chunks) do
+   local last_file_idx
+   for i, x in ipairs(cfg.chunks) do
       local compiler = mlc.new()
       local tag, val = x.tag, x[1]
-      verb_print("Compiling %s", table.tostring(x))
+      verb_print("Compiling %s", x)
       local st, ast
-      if tag=='Library' then
+      if tag=='library' then
           ast = { tag='Call',
                   {tag='Id', "require" },
                   {tag='String', val } }
-      elseif tag=='Literal' then ast = compiler :src_to_ast(val)
-      elseif tag=='File' then
+      elseif tag=='literal' then ast = compiler :src_to_ast(val)
+      elseif tag=='file' then
          ast = compiler :srcfile_to_ast(val)
          -- Isolate each file in a separate fenv
-         ast = { tag='Call', 
-                 { tag='Function', { { tag='Dots'} }, ast }, 
+         ast = { tag='Call',
+                 { tag='Function', { { tag='Dots'} }, ast },
                  { tag='Dots' } }
-         ast.source  = '@'..val -- TODO [EVE]
-         code.source = '@'..val -- TODO [EVE]
-         last_file = ast
+         ast.source  = '@'..val
+         code.source = '@'..val
+         last_file_idx = i
       else
           error ("Bad option "..tag)
       end
       local valid = true -- TODO: check AST's correctness
       if not valid then
-         printf ("Cannot compile %s:\n%s", table.tostring(x), ast or "no msg")
+         pp.printf ("Cannot compile %s:\n%s", x, ast or "no msg")
          os.exit (AST_COMPILE_ERROR_NUMBER)
       end
       ast.origin = x
       table.insert(code, ast)
    end
    -- The last file returns the whole chunk's result
-   if last_file then
-       local c = table.shallow_copy(last_file)
-       table.override(last_file, {tag='Return', source = c.source, c })
+   if last_file_idx then
+       -- transform  +{ (function(...) -{ast} end)(...) }
+       -- into   +{ return (function(...) -{ast} end)(...) }
+       local prv_ast = code[last_file_idx]
+       local new_ast = { tag='Return', prv_ast }
+       code[last_file_idx] = new_ast
    end
 
    -- Further uses of compiler won't involve AST transformations:
@@ -197,16 +187,17 @@
    -- TODO: reuse last instance if possible.
    local compiler = mlc.new()
 
-
    -------------------------------------------------------------------
    -- AST printing
    if cfg['print-ast'] or cfg['print-ast-lineinfo'] then
       verb_print "Resulting AST:"
       for _, x in ipairs(code) do
-         printf("--- AST From %s: ---", table.tostring(x.source, 'nohash'))
+         pp.printf("--- AST From %s: ---", x.source)
          if x.origin and x.origin.tag=='File' then x=x[1][1][2][1] end
-         if cfg['print-ast-lineinfo'] then table.print(x, 80, "indent1")
-         else table.print(x, 80, 'nohash') end
+         local pp_cfg = cfg['print-ast-lineinfo']
+             and { line_max=1, fix_indent=1, metalua_tag=1 }
+             or  { line_max=1, metalua_tag=1, hide_hash=1  }
+         pp.print(x, 80, pp_cfg)
       end
    end
 
@@ -223,17 +214,6 @@
 
    -- TODO: canonize/check AST
 
-   -------------------------------------------------------------------
-   -- Insert base lib loader
-   if cfg['no-base-lib'] then
-      verb_print "Prevent insertion of command \"require 'metalua.base'\""
-   else
-       local req_runtime = { tag='Call',
-                             {tag='Id', "require"},
-                             {tag='String', "metalua.base"} }
-       table.insert(code, 1, req_runtime)
-   end
-
    local bytecode = compiler :ast_to_bytecode (code)
    code = nil
 
@@ -269,11 +249,10 @@
       bytecode = nil
       -- FIXME: isolate execution in a ring
       -- FIXME: check for failures
-      runargs = table.icat(cfg.params or { }, runargs)
       local function print_traceback (errmsg)
          return errmsg .. '\n' .. debug.traceback ('',2) .. '\n'
       end
-      local function g() return f(unpack (runargs)) end
+      local function g() return f(unpack (cfg.params)) end
       local st, msg = xpcall(g, print_traceback)
       if not st then
          io.stderr:write(msg)
@@ -292,9 +271,4 @@
 
 end
 
--- If the lib is being loaded, the sentinel token is currently
--- put as a placeholder in its package.loaded entry.
-local called_as_a_lib = type(package.loaded.metalua)=='userdata'
-
-if not called_as_a_lib then M.main(...)
-else return M end
\ No newline at end of file
+return M.main(...)
diff --git a/metalua/base.lua b/metalua/base.lua
deleted file mode 100644
index b0f0e36..0000000
--- a/metalua/base.lua
+++ /dev/null
@@ -1,80 +0,0 @@
---------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
---------------------------------------------------------------------------------
-
-----------------------------------------------------------------------
-----------------------------------------------------------------------
---
--- Base library extension
---
-----------------------------------------------------------------------
-----------------------------------------------------------------------
-
-require 'checks'
-
-function o (...)
-   local args = {...}
-   local function g (...)
-      local result = {...}
-      for i=#args, 1, -1 do result = {args[i](unpack(result))} end
-      return unpack (result)
-   end
-   return g
-end
-
-function id (...) return ... end
-function const (k) return function () return k end end
-
-function printf(...) return print(string.format(...)) end
-function eprintf(...) 
-   io.stderr:write(string.format(...).."\n") 
-end
-
-function ivalues (x)
-   checks('table')
-   local i = 1
-   local function iterator ()
-      local r = x[i]; i=i+1; return r
-   end
-   return iterator
-end
-
-
-function values (x)
-   checks('table')
-   local function iterator (state)
-      local it
-      state.content, it = next(state.list, state.content)
-      return it
-   end
-   return iterator, { list = x }
-end
-
-function keys (x)
-   checks('table')
-   local function iterator (state)
-      local it = next(state.list, state.content)
-      state.content = it
-      return it
-   end
-   return iterator, { list = x }
-end
-
-require 'metalua.table'
-require 'metalua.string'
-require 'metalua.package'
\ No newline at end of file
diff --git a/verbose_require.lua b/metalua/bytecode.lua
similarity index 73%
rename from verbose_require.lua
rename to metalua/bytecode.lua
index 7d0af79..b3afbdb 100644
--- a/verbose_require.lua
+++ b/metalua/bytecode.lua
@@ -17,14 +17,13 @@
 --
 --------------------------------------------------------------------------------
 
-do
-   local xrequire, n, ind = require, 0, "| "
-   function require (x)
-      print(ind:rep(n).."/ require: "..x)
-      n=n+1
-      local y = xrequire(x)
-      n=n-1
-      print(ind:rep(n).."\\_")
-      return y
-   end
-end
+local compile = require 'metalua.compiler.bytecode.compile'
+local ldump   = require 'metalua.compiler.bytecode.ldump'
+
+local M = { }
+
+M.ast_to_proto      = compile.ast_to_proto
+M.proto_to_bytecode = ldump.dump_string
+M.proto_to_file     = ldump.dump_file
+
+return M
\ No newline at end of file
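
The new `metalua/bytecode.lua` façade above simply re-exports the compiler back-end's entry points. Assuming the full `metalua-compiler` rock is installed, it can be used roughly as follows; this is an illustrative sketch, not documented API usage, and it targets Lua 5.1 (`loadstring`):

```lua
-- Hypothetical usage of the metalua/bytecode.lua façade above (Lua 5.1).
local mlc      = require 'metalua.compiler'.new()
local bytecode = require 'metalua.bytecode'

-- Parse a source string into an AST, then lower it step by step.
local ast   = mlc :src_to_ast ("return 1 + 2")
local proto = bytecode.ast_to_proto (ast, "@example")  -- Yueliang Proto structure
local dump  = bytecode.proto_to_bytecode (proto)       -- string.dump()-style blob
local f     = assert (loadstring (dump))               -- back to a live function
print (f ())
```

The same conversions are reachable through the compiler object's `:src_to_bytecode()` shortcut; the façade is useful when you already hold an AST or a Proto.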
diff --git a/metalua/clopts.mlua b/metalua/clopts.mlua
deleted file mode 100644
index 7ea6c4f..0000000
--- a/metalua/clopts.mlua
+++ /dev/null
@@ -1,223 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
---------------------------------------------------------------------------------
--- Command Line OPTionS handler
--- ============================
---
--- This lib generates parsers for command-line options. It encourages
--- the following of some common idioms: I'm pissed off by Unix tools
--- which sometimes will let you concatenate single letters options,
--- sometimes won't, will prefix long name options with simple dashes
--- instead of doubles, etc.
---
---------------------------------------------------------------------------------
-
--- TODO:
--- * add a generic way to unparse options ('grab everything')
--- * doc
--- * when a short options that takes a param isn't the last element of a series
---   of shorts, take the remaining of the sequence as that param, e.g. -Ifoo
--- * let unset strings/numbers with +
--- * add a ++ long counterpart to +
---
-
--{ extension ('match',...) }
-
-local function clopts(cfg)
-   local short, long, param_func = { }, { }
-   local legal_types = table.transpose{ 
-      'boolean','string','number','string*','number*','nil', '*' }
-
-   -----------------------------------------------------------------------------
-   -- Fill short and long name indexes, and check its validity
-   -----------------------------------------------------------------------------
-   for _, x in ipairs(cfg) do
-      local xtype = type(x)
-      if xtype=='table' then
-         if not x.type then x.type='nil' end
-         if not legal_types[x.type] then error ("Invalid type name "..x.type) end
-         if x.short then
-            if short[x.short] then error ("multiple definitions for option "..x.short) 
-            else short[x.short] = x end
-         end
-         if x.long then
-            if long[x.long] then error ("multiple definitions for option "..x.long) 
-            else long[x.long] = x end
-         end
-      elseif xtype=='function' then
-         if param_func then error "multiple parameters handler in clopts"
-         else param_func=x end
-      end
-   end
-
-   -----------------------------------------------------------------------------
-   -- Print a help message, summarizing how to use the command line
-   -----------------------------------------------------------------------------
-   local function print_usage(msg)
-      if msg then print(msg,'\n') end
-      print(cfg.usage or "Options:\n")
-      for _, x in pairs(cfg) do
-         if type(x) == 'table' then
-            local opts = { }
-            if x.type=='boolean' then 
-               if x.short then opts = { '-'..x.short..'/+'..x.short } end
-               if x.long  then table.insert (opts, '--'..x.long..'/++'..x.long) end
-            else
-               if x.short then opts = { '-'..x.short..' <'..x.type..'>' } end
-               if x.long  then table.insert (opts,  '--'..x.long..' <'..x.type..'>' ) end
-            end
-            printf("  %s: %s", table.concat(opts,', '), x.usage or '<undocumented>')
-         end
-      end
-      print''
-   end
-
-   -- Unless overridden, -h and --help display the help msg
-   local default_help = { action = | | print_usage() or os.exit(0);
-                          long='help';short='h';type='nil'}
-   if not short.h   then short.h   = default_help end
-   if not long.help then long.help = default_help end
-
-   -----------------------------------------------------------------------------
-   -- Helper function for options parsing. Execute the attached action and/or
-   -- register the config in cfg.
-   --
-   -- * cfg  is the table which registers the options
-   -- * dict the name->config entry hash table that describes options
-   -- * flag is the prefix '-', '--' or '+'
-   -- * opt  is the option name
-   -- * i    the current index in the arguments list
-   -- * args is the arguments list
-   -----------------------------------------------------------------------------
-   local function actionate(cfg, dict, flag, opt, i, args)
-      local entry = dict[opt]
-      if not entry then print_usage ("invalid option "..flag..opt); return false; end
-      local etype, name = entry.type, entry.name or entry.long or entry.short
-      match etype with
-      | 'string' | 'number' | 'string*' | 'number*' -> 
-         if flag=='+' or flag=='++' then 
-            print_usage ("flag "..flag.." is reserved for boolean options, not for "..opt)
-            return false
-         end
-         local arg = args[i+1]
-         if not arg then 
-            print_usage ("missing parameter for option "..flag..opt)
-            return false
-         end
-         if etype:strmatch '^number' then 
-            arg = tonumber(arg)
-            if not arg then 
-               print_usage ("option "..flag..opt.." expects a number argument")
-            end
-         end
-         if etype:strmatch '%*$' then 
-            if not cfg[name] then cfg[name]={ } end
-            table.insert(cfg[name], arg)
-         else cfg[name] = arg end
-         if entry.action then entry.action(arg) end
-         return i+2
-      | 'boolean' -> 
-         local arg = flag=='-' or flag=='--'
-         cfg[name] = arg
-         if entry.action then entry.action(arg) end
-         return i+1
-      | 'nil' -> 
-         cfg[name] = true;
-         if entry.action then entry.action() end
-         return i+1
-      | '*' -> 
-         local arg = table.isub(args, i+1, #args)
-         cfg[name] = arg
-         if entry.action then entry.action(arg) end
-         return #args+1
-      |  _ -> assert( false, 'undetected bad type for clopts action')
-      end
-   end
-
-   -----------------------------------------------------------------------------
-   -- Parse a list of commands: the resulting function
-   -----------------------------------------------------------------------------
-   local function parse(...)
-      local cfg = { }
-      if not ... then return cfg end
-      local args = type(...)=='table' and ... or {...}
-      local i, i_max = 1, #args
-      while i <= i_max do         
-         local arg, flag, opt, opts = args[i]
-         --printf('beginning of loop: i=%i/%i, arg=%q', i, i_max, arg)
-         if arg=='-' then
-            i=actionate (cfg, short, '-', '', i, args)
-            -{ `Goto 'continue' }
-         end
-
-         -----------------------------------------------------------------------
-         -- double dash option
-         -----------------------------------------------------------------------
-         flag, opt = arg:strmatch "^(%-%-)(.*)"
-         if opt then
-            i=actionate (cfg, long, flag, opt, i, args)
-            -{ `Goto 'continue' }
-         end
-
-         -----------------------------------------------------------------------
-         -- double plus option
-         -----------------------------------------------------------------------
-         flag, opt = arg:strmatch "^(%+%+)(.*)"
-         if opt then
-            i=actionate (cfg, long, flag, opt, i, args)
-            -{ `Goto 'continue' }
-         end
-
-         -----------------------------------------------------------------------
-         -- single plus or single dash series of short options
-         -----------------------------------------------------------------------
-         flag, opts = arg:strmatch "^([+-])(.+)"
-         if opts then 
-            local j_max, i2 = opts:len()
-            for j = 1, j_max do
-               opt = opts:sub(j,j)
-               --printf ('parsing short opt %q', opt)               
-               i2 = actionate (cfg, short, flag, opt, i, args)
-               if i2 ~= i+1 and j < j_max then 
-                  error ('short option '..opt..' needs a param of type '..short[opt])
-               end               
-            end
-            i=i2 
-            -{ `Goto 'continue' }
-         end
-
-         -----------------------------------------------------------------------
-         -- handler for non-option parameter
-         -----------------------------------------------------------------------         
-         if param_func then param_func(args[i]) end
-         if cfg.params then table.insert(cfg.params, args[i])
-         else cfg.params = { args[i] } end
-         i=i+1
-
-         -{ `Label 'continue' }
-         if not i then return false end
-      end -- </while>
-      return cfg
-   end
-
-   return parse
-end
-
-return clopts
diff --git a/metalua/compiler.lua b/metalua/compiler.lua
index c54ed52..21355f0 100644
--- a/metalua/compiler.lua
+++ b/metalua/compiler.lua
@@ -1,4 +1,4 @@
---------------------------------------------------------------------------------
+---------------------------------------------------------------------------
 -- Copyright (c) 2006-2013 Fabien Fleutot and others.
 --
 -- All rights reserved.
@@ -25,21 +25,20 @@
 --
 -- Supported formats are:
 --
--- * luafile:    the name of a file containing sources.
--- * luastring:  these sources as a single string.
+-- * srcfile:    the name of a file containing sources.
+-- * src:        these sources as a single string.
 -- * lexstream:  a stream of lexemes.
 -- * ast:        an abstract syntax tree.
 -- * proto:      a (Yueliang) structure containing a high-level
 --               representation of bytecode. Largely based on the
 --               Proto structure in Lua's VM
--- * luacstring: a string dump of the function, as taken by
+-- * bytecode:   a string dump of the function, as taken by
 --               loadstring() and produced by string.dump().
 -- * function:   an executable lua function in RAM.
 --
 --------------------------------------------------------------------------------
 
 require 'checks'
-require 'metalua.table'
 
 local M  = { }
 
@@ -61,7 +60,25 @@
 	bytecode   = { 'string', '?string' },
 }
 
-M.order = table.transpose(M.sequence)
+if true then
+    -- if defined, runs on every newly-generated AST
+    function M.check_ast(ast)
+        local function rec(x, n, parent)
+            if not x.lineinfo and parent.lineinfo then
+                local pp = require 'metalua.pprint'
+                pp.printf("WARNING: Missing lineinfo in child #%s `%s{...} of node at %s",
+                          n, x.tag or '', tostring(parent.lineinfo))
+            end
+            for i, child in ipairs(x) do
+                if type(child)=='table' then rec(child, i, x) end
+            end
+        end
+        rec(ast, -1, { })
+    end
+end
+
+
+M.order= { }; for a,b in pairs(M.sequence) do M.order[b]=a end
 
 local CONV = { } -- conversion metatable __index
 
@@ -86,19 +103,30 @@
 	checks('metalua.compiler', 'lexer.stream', '?string')
 	local r = self.parser.chunk(lx)
 	r.source = name
+    if M.check_ast then M.check_ast (r) end
 	return r, name
 end
 
+local bytecode_compiler = nil -- cache to avoid repeated `pcall(require(...))`
+local function get_bytecode_compiler()
+    if bytecode_compiler then return bytecode_compiler else
+        local status, result = pcall(require, 'metalua.compiler.bytecode')
+        if status then
+            bytecode_compiler = result
+            return result
+        elseif string.match(result, "not found") then
+            error "Compilation only available with full Metalua"
+        else error (result) end
+    end
+end
+
 function CONV :ast_to_proto(ast, name)
 	checks('metalua.compiler', 'table', '?string')
-	--table.print(ast, 'nohash', 1) io.flush()
-	local f = require 'metalua.compiler.bytecode.compile'.ast_to_proto
-	return f(ast, name), name
+    return get_bytecode_compiler().ast_to_proto(ast, name), name
 end
 
 function CONV :proto_to_bytecode(proto, name)
-	local bc = require 'metalua.compiler.bytecode'
-	return bc.proto_to_bytecode(proto), name
+    return get_bytecode_compiler().proto_to_bytecode(proto), name
 end
 
 function CONV :bytecode_to_function(bc, name)
@@ -137,13 +165,13 @@
 function CONV :function_to_bytecode(...) return string.dump(...) end
 
 function CONV :ast_to_src(...)
-	require 'metalua.package' -- ast_to_string isn't written in plain lua
+	require 'metalua.loader' -- ast_to_string isn't written in plain lua
 	return require 'metalua.compiler.ast_to_src' (...)
 end
 
 local MT = { __index=CONV, __type='metalua.compiler' }
 
-function M.new() 
+function M.new()
 	local parser = require 'metalua.compiler.parser' .new()
 	local self = { parser = parser }
 	setmetatable(self, MT)
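
The cached `pcall(require, ...)` idiom introduced by `get_bytecode_compiler` above is a general pattern for optional dependencies: try the module once, remember the result, and distinguish "module not installed" from an error raised while loading it. A generic sketch (the module name and error message below are placeholders):

```lua
-- Generic "optional dependency" loader, modeled on get_bytecode_compiler above.
local function optional_loader (mod_name, missing_msg)
    local cached  -- memoized module table, so pcall(require) runs at most once
    return function ()
        if cached then return cached end
        local ok, result = pcall (require, mod_name)
        if ok then
            cached = result
            return result
        elseif result :match "not found" then
            -- The module is absent: report a friendly, domain-specific message.
            error (missing_msg or (mod_name .. " is not installed"))
        else
            -- The module exists but failed while loading: re-raise unchanged.
            error (result)
        end
    end
end

-- Usage mirroring the diff above:
local get_bc = optional_loader ('metalua.compiler.bytecode',
                                "Compilation only available with full Metalua")
```

This is what lets `metalua-parser` ship without the bytecode back-end: AST-producing conversions work everywhere, and only `:ast_to_proto()` and later stages demand the `metalua-compiler` rock.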
diff --git a/metalua/compiler/ast_to_src.mlua b/metalua/compiler/ast_to_src.mlua
index 3eec4a5..65283c6 100644
--- a/metalua/compiler/ast_to_src.mlua
+++ b/metalua/compiler/ast_to_src.mlua
@@ -22,6 +22,8 @@
 local M = { }
 M.__index = M
 
+local pp=require 'metalua.pprint'
+
 --------------------------------------------------------------------------------
 -- Instantiate a new AST->source synthesizer
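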
 --------------------------------------------------------------------------------
@@ -84,18 +86,20 @@
 --------------------------------------------------------------------------------
 -- Keywords, which are illegal as identifiers.
 --------------------------------------------------------------------------------
-local keywords = table.transpose {
+local keywords_list = {
    "and",    "break",   "do",    "else",   "elseif",
    "end",    "false",   "for",   "function", "if",
    "in",     "local",   "nil",   "not",    "or",
    "repeat", "return",  "then",  "true",   "until",
    "while" }
+local keywords = { }
+for _, kw in pairs(keywords_list) do keywords[kw]=true end
 
 --------------------------------------------------------------------------------
 -- Return true iff string `id' is a legal identifier name.
 --------------------------------------------------------------------------------
 local function is_ident (id)
-   return id:strmatch "^[%a_][%w_]*$" and not keywords[id]
+    return string['match'](id, "^[%a_][%w_]*$") and not keywords[id]
 end
 
 --------------------------------------------------------------------------------
@@ -119,7 +123,7 @@
 local op_preprec = {
    { "or", "and" },
    { "lt", "le", "eq", "ne" },
-   { "concat" }, 
+   { "concat" },
    { "add", "sub" },
    { "mul", "div", "mod" },
    { "unary", "not", "len" },
@@ -168,7 +172,7 @@
       else -- No appropriate method, fall back to splice dumping.
            -- This cannot happen in a plain Lua AST.
          self:acc " -{ "
-         self:acc (table.tostring (node, "nohash"), 80)
+         self:acc (pp.tostring (node, {metalua_tag=1, hide_hash=1}), 80)
          self:acc " }"
       end
    end
@@ -180,13 +184,13 @@
 -- it can be a string or a synth method.
 -- `start' is an optional number (default == 1), indicating which is the
 -- first element of list to be converted, so that we can skip the beginning
--- of a list. 
+-- of a list.
 --------------------------------------------------------------------------------
 function M:list (list, sep, start)
    for i = start or 1, # list do
       self:node (list[i])
       if list[i + 1] then
-         if not sep then 
+         if not sep then
          elseif type (sep) == "function" then sep (self)
          elseif type (sep) == "string"   then self:acc (sep)
          else   error "Invalid list separator" end
@@ -210,7 +214,7 @@
 -- There are several things that could be refactored into common subroutines
 -- here: statement blocks dumping, function dumping...
 -- However, given their small size and linear execution
--- (they basically perform series of :acc(), :node(), :list(), 
+-- (they basically perform series of :acc(), :node(), :list(),
 -- :nl(), :nlindent() and :nldedent() calls), it seems more readable
 -- to avoid multiplication of such tiny functions.
 --
@@ -229,8 +233,8 @@
 
 function M:Set (node)
    match node with
-   | `Set{ { `Index{ lhs, `String{ method } } }, 
-           { `Function{ { `Id "self", ... } == params, body } } } 
+   | `Set{ { `Index{ lhs, `String{ method } } },
+           { `Function{ { `Id "self", ... } == params, body } } }
          if is_idx_stack (lhs) and is_ident (method) ->
       -- ``function foo:bar(...) ... end'' --
       self:acc      "function "
@@ -257,7 +261,7 @@
       self:nldedent ()
       self:acc      "end"
 
-   | `Set{ { `Id{ lhs1name } == lhs1, ... } == lhs, rhs } 
+   | `Set{ { `Id{ lhs1name } == lhs1, ... } == lhs, rhs }
          if not is_ident (lhs1name) ->
       -- ``foo, ... = ...'' when foo is *not* a valid identifier.
       -- In that case, the spliced 1st variable must get parentheses,
@@ -284,7 +288,7 @@
       for i=1,n do
           local ell, a = lhs[i], annot[i]
           self:node (ell)
-          if a then 
+          if a then
               self:acc ' #'
               self:node(a)
           end
@@ -326,7 +330,7 @@
       self:nldedent ()
    end
    -- odd number of children --> last one is an `else' clause --
-   if #node%2 == 1 then 
+   if #node%2 == 1 then
       self:acc      "else"
       self:nlindent ()
       self:list     (node[#node], self.nl)
@@ -407,7 +411,7 @@
       self:nldedent ()
       self:acc      "end"
 
-   | _ -> 
+   | _ ->
       -- Other localrec are unprintable ==> splice them --
           -- This cannot happen in a plain Lua AST. --
       self:acc "-{ "
@@ -509,7 +513,7 @@
             self:acc  "] = "
             self:node (value)
 
-         | _ -> 
+         | _ ->
             -- ``value''. --
             self:node (elem)
          end
@@ -527,7 +531,7 @@
    -- Transform ``not (a == b)'' into ``a ~= b''. --
    match node with
    | `Op{ "not", `Op{ "eq", _a, _b } }
-   | `Op{ "not", `Paren{ `Op{ "eq", _a, _b } } } ->  
+   | `Op{ "not", `Paren{ `Op{ "eq", _a, _b } } } ->
       op, a, b = "ne", _a, _b
    | _ ->
    end
@@ -586,11 +590,11 @@
    self:acc  (paren_table and ")")
 
    match key with
-   | `String{ field } if is_ident (field) -> 
+   | `String{ field } if is_ident (field) ->
       -- ``table.key''. --
       self:acc "."
       self:acc (field)
-   | _ -> 
+   | _ ->
       -- ``table [key]''. --
       self:acc   "["
       self:node (key)
@@ -606,7 +610,7 @@
       self:acc    "-{`Id "
       self:String (node, name)
       self:acc    "}"
-   end 
+   end
 end
 
 
diff --git a/metalua/compiler/bytecode/compile.lua b/metalua/compiler/bytecode/compile.lua
index 7b9116c..011517f 100644
--- a/metalua/compiler/bytecode/compile.lua
+++ b/metalua/compiler/bytecode/compile.lua
@@ -25,6 +25,8 @@
 --
 ---------------------------------------------------------------------
 
+local pp = require 'metalua.pprint'
+
 local luaK = require 'metalua.compiler.bytecode.lcode'
 local luaP = require 'metalua.compiler.bytecode.lopcodes'
 
@@ -256,7 +258,7 @@
   f.sizeupvalues = f.nups
   assert (fs.bl == nil)
   if next(fs.forward_gotos) then
-     local x = table.tostring(fs.forward_gotos)
+     local x = pp.tostring(fs.forward_gotos)
      error ("Unresolved goto: "..x)
   end
 end
@@ -882,7 +884,7 @@
       local legal = { VLOCAL=1, VUPVAL=1, VGLOBAL=1, VINDEXED=1 }
       --printv(lhs)
       if not legal [lhs.v.k] then 
-         error ("Bad lhs expr: "..table.tostring(ast_lhs)) 
+         error ("Bad lhs expr: "..pp.tostring(ast_lhs)) 
       end
       if nvars < #ast_lhs then -- this is not the last lhs
          local nv = { v = { }, prev = lhs }
@@ -1032,7 +1034,7 @@
 
 function expr.expr (fs, ast, v)
    if type(ast) ~= "table" then 
-      error ("Expr AST expected, got "..table.tostring(ast)) end
+      error ("Expr AST expected, got "..pp.tostring(ast)) end
 
    if ast.lineinfo then fs.lastline = ast.lineinfo.last.line end
 
@@ -1040,7 +1042,8 @@
    local parser = expr[ast.tag]
    if parser then parser (fs, ast, v)
    elseif not ast.tag then 
-      error ("No tag in expression "..table.tostring(ast, 'nohash', 80))
+       error ("No tag in expression "..
+              pp.tostring(ast, {line_max=80, hide_hash=1, metalua_tag=1}))
    else 
       error ("No parser for node `"..ast.tag) end
    --debugf (" - /Expression `%s", ast.tag)
@@ -1154,7 +1157,7 @@
 function expr.Index (fs, ast, v)
    if #ast ~= 2 then
       print"\n\nBAD INDEX AST:"
-      table.print(ast)
+      pp.print(ast)
       error "generalized indexes not implemented" end
 
    if ast.lineinfo then fs.lastline = ast.lineinfo.last.line end
diff --git a/metalua/compiler/bytecode/lcode.lua b/metalua/compiler/bytecode/lcode.lua
index 8e9e973..ede1a1c 100644
--- a/metalua/compiler/bytecode/lcode.lua
+++ b/metalua/compiler/bytecode/lcode.lua
@@ -162,7 +162,7 @@
   local offset = dest - (pc + 1)
   assert(dest ~= self.NO_JUMP)
   if math.abs(offset) > luaP.MAXARG_sBx then
-    luaX:syntaxerror(fs.ls, "control structure too long")
+    error("control structure too long")
   end
   luaP:SETARG_sBx(jmp, offset)
 end
@@ -317,7 +317,7 @@
   local newstack = fs.freereg + n
   if newstack > fs.f.maxstacksize then
     if newstack >= luaK.MAXSTACK then
-      luaX:syntaxerror(fs.ls, "function or expression too complex")
+      error("function or expression too complex")
     end
     fs.f.maxstacksize = newstack
   end
diff --git a/metalua/compiler/bytecode/ldump.lua b/metalua/compiler/bytecode/ldump.lua
index a5161ea..6ac7617 100644
--- a/metalua/compiler/bytecode/ldump.lua
+++ b/metalua/compiler/bytecode/ldump.lua
@@ -442,7 +442,7 @@
    local file = io.open (filename, "wb")
    file:write (buff.data)
    io.close(file)
-   if UNIX_SHARPBANG then os.execute ("chmod a+x "..filename) end
+   --if UNIX_SHARPBANG then os.execute ("chmod a+x "..filename) end
 end
 
 return M
diff --git a/metalua/compiler/globals.lua b/metalua/compiler/globals.lua
index e10b6cb..d5f7459 100644
--- a/metalua/compiler/globals.lua
+++ b/metalua/compiler/globals.lua
@@ -55,13 +55,14 @@
 end
 
 function M.load(f, name)
+   local acc = { }
    while true do
       local x = f()
       if not x then break end
       assert(type(x)=='string', "function passed to load() must return strings")
       table.insert(acc, x)
    end
-   return M.loadstring(table.concat(x))
+   return M.loadstring(table.concat(acc))
 end
 
 function M.dostring(src)
diff --git a/metalua/compiler/indent.lua b/metalua/compiler/indent.lua
deleted file mode 100644
index 0eb8bdd..0000000
--- a/metalua/compiler/indent.lua
+++ /dev/null
@@ -1,255 +0,0 @@
---------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Sierra Wireless - API and implementation
---
---------------------------------------------------------------------------------
-
--------------------------------------------------------------------------------
--- Copyright (c) 2011, 2012 Sierra Wireless and others.
--- All rights reserved. This program and the accompanying materials
--- are made available under the terms of the Eclipse Public License v1.0
--- which accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- Contributors:
---     Sierra Wireless - initial API and implementation
--------------------------------------------------------------------------------
-
----
--- Uses Metalua capabilities to indent code and provide source code offset
--- semantic depth
---
--- @module luaformatter
-local M = {}
-local mlc = require 'metalua.compiler'
-local math = require 'math'
-local walk = require 'metalua.walk'
-
----
---  calculate all ident level
--- @param Source code to analyze
--- @return #table {linenumber = identationlevel}
--- @usage local depth = format.indentLevel("local var")
-local function getindentlevel(source,indenttable)
-
-	local function getfirstline(node)
-		-- Regular node
-		local offsets = node[1].lineinfo
-		local first
-		local offset
-		-- Consider previous comments as part of current chunk
-		-- WARNING: This is NOT the default in Metalua
-		if offsets.first.comments then
-			first = offsets.first.comments.lineinfo.first.line
-			offset = offsets.first.comments.lineinfo.first.offset
-		else
-			first = offsets.first.line
-			offset = offsets.first.offset
-		end
-		return first, offset
-	end
-
-	local function getlastline(node)
-		-- Regular node
-		local offsets = node[#node].lineinfo
-		local last
-		-- Same for block end comments
-		if offsets.last.comments then
-			last = offsets.last.comments.lineinfo.last.line
-		else
-			last = offsets.last.line
-		end
-		return last
-	end
-
-	--
-	-- Define AST walker
-	--
-	local linetodepth = { 0 }
-	local walker = {
-		block = { },
-		expr  = { },
-		depth = 0,     -- Current depth while walking
-	}
-
-	function walker.block.down(node, parent,...)
-		--ignore empty node
-		if #node == 0 then
-			return end
-		-- get first line of the block
-		local startline,startoffset = getfirstline(node)
-		local endline = getlastline(node)
-		-- If the block doesn't start with a new line, don't indent the first line
-		if not source:sub(1,startoffset-1):find("[\r\n]%s*$") then
-			startline = startline + 1
-		end
-		for i=startline, endline do
-			linetodepth[i]=walker.depth
-		end
-		walker.depth = walker.depth + 1
-	end
-
-	function walker.block.up(node, ...)
-		if #node == 0 then
-			return end
-		walker.depth = walker.depth - 1
-	end
-
-	function walker.expr.down(node, parent, ...)
-		if indenttable and node.tag == 'Table' then
-			if #node == 0 then
-				return end
-			local startline,startoffset = getfirstline(node)
-			local endline = getlastline(node)
-			if source:sub(1,startoffset-1):find("[\r\n]%s*$") then
-				for i=startline, endline do
-					linetodepth[i]=walker.depth
-				end
-			else
-				for i=startline+1, endline do
-					linetodepth[i]=walker.depth
-				end
-			end
-			walker.depth = walker.depth + 1
-		elseif node.tag =='String' then
-			local firstline = node.lineinfo.first.line
-			local lastline = node.lineinfo.last.line
-			for i=firstline+1, lastline do
-				linetodepth[i]=false
-			end
-		end
-	end
-
-	function walker.expr.up(node, parent, ...)
-		if indenttable and node.tag == 'Table' then
-			if #node == 0 then
-				return end
-			walker.depth = walker.depth - 1
-		end
-	end
-
-	-- Walk through AST to build linetodepth
-	local ast = mlc.src_to_ast(source)
-	mlc.check_ast(ast)
-    walk.block(walker, ast)
-	return linetodepth
-end
-
----
--- Trim white spaces before and after given string
---
--- @usage local trimmedstr = trim('          foo')
--- @param #string string to trim
--- @return #string string trimmed
-local function trim(string)
-	local pattern = "^(%s*)(.*)"
-	local _, strip =  string:match(pattern)
-	if not strip then return string end
-	local restrip
-	_, restrip = strip:reverse():match(pattern)
-	return restrip and restrip:reverse() or strip
-end
-
----
--- Indent Lua Source Code.
--- @function [parent=#luaformatter] indentCode
--- @param source source code to format
--- @param delimiter line delimiter to use, usually '\n' or '\r\n'
--- @param indentTable boolean: whether table content must be indented
--- @param tab either a string representing a number of indentation, or the number
---   of spaces taken by a tab (often 8 or 4)
--- @param indentationSize if given, an indentation of depth `n` shifts the code
---   `indentationSize * n` chars to the right, with a mix of chars and spaces.
---   `tab` must then be a number
--- @return #string formatted code
--- @usage indentCode('local var', '\n', true, '\t')
--- @usage indentCode('local var', '\n', true, --[[tabulationSize]]4, --[[indentationSize]]2)
-function M.indentcode(source, delimiter, indenttable, tab, indentationSize)
-    checks('string', 'string', '?', 'number|string', '?numer')
-
-    -- function: generates a string which moves `depth` indentation levels from the left.
-	local tabulation
-    if indentationSize then
-        local tabSize = assert(tonumber(tab))
-		-- When tabulation size and indentation size are given,
-        -- tabulate with a mix of tabs and spaces
-		tabulation = function(depth)
-			local range      = depth * indentationSize
-			local tabCount   = math.floor(range / tabSize)
-			local spaceCount = range % tabSize
-			return string.rep('\t', tabCount) .. string.rep(' ', spaceCount)
-		end
-    else
-        if type(tab)=='number' then tab = string.rep(' ', tab) end
-		tabulation = function (depth) return tab :rep (depth) end
-	end
-
-	-- Delimiter position table: positions[x] is the offset of the first character
-    -- of the n-th delimiter in the source
-	local positions = { 1-#delimiter }
-	local a, b = nil, 0
-	repeat
-        a, b = source :find (delimiter, b+1, true)
-        if a then table.insert (positions, a) end
-    until not a
-
-	-- Don't try to indent a single line!
-	if #positions < 2 then return source end
-
-	-- calculate the line number -> indentation correspondence table
-	local linetodepth = getindentlevel(source,indenttable)
-
-	-- Concatenate string with right identation
-	local indented = { }
-	for  position=1, #positions do
-		-- Extract source code line
-		local offset = positions[position]
-		-- Get the interval between two positions
-		local rawline
-		if positions[position + 1] then
-			rawline = source:sub(offset + delimiterLength, positions[position + 1] -1)
-		else
-			-- From current prosition to end of line
-			rawline = source:sub(offset + delimiterLength)
-		end
-
-		-- Trim white spaces
-		local indentcount = linetodepth[position]
-		if not indentcount then
-			indented[#indented+1] = rawline
-		else
-			local line = trim(rawline)
-			-- Append right indentation
-			-- Indent only when there is code on the line
-			if line:len() > 0 then
-				-- Compute next real depth related offset
-				-- As is offset is pointing a white space before first statement of block,
-				-- We will work with parent node depth
-				indented[#indented+1] = tabulation( indentcount)
-				-- Append timmed source code
-				indented[#indented+1] = line
-			end
-		end
-		-- Append carriage return
-		-- While on last character append carriage return only if at end of original source
-		if position < #positions or source:sub(source:len()-delimiterLength, source:len()) == delimiter then
-			indented[#indented+1] = delimiter
-		end
-	end
-	return table.concat(indented)
-end
-
-return M
diff --git a/metalua/compiler/parser.lua b/metalua/compiler/parser.lua
index 3809c25..74997ae 100644
--- a/metalua/compiler/parser.lua
+++ b/metalua/compiler/parser.lua
@@ -21,41 +21,21 @@
 
 local MT = { __type='metalua.compiler.parser' }
 
+local MODULE_REL_NAMES = { "annot.grammar", "expr", "meta", "misc",
+                           "stat", "table", "ext" }
+
 local function new()
-    local mod_names = { "common", "expr", "lexer", "meta", "misc", "stat", "table", "ext", "annot" }
-
-    for name, _ in pairs(package.loaded) do
-        local x = name :match '^metalua.compiler.parser.(.*)'
-        if x then
-            local found = false
-            for _, y in pairs(mod_names) do
-                if x==y then found=true; break end
-            end
-            --if found then print (">> found "..x)
-            --else print(">> not found: "..x) end
+    local M = {
+        lexer = require "metalua.compiler.parser.lexer" ();
+        extensions = { } }
+    for _, rel_name in ipairs(MODULE_REL_NAMES) do
+        local abs_name = "metalua.compiler.parser."..rel_name
+        local extender = require (abs_name)
+        if not M.extensions[abs_name] then
+            if type (extender) == 'function' then extender(M) end
+            M.extensions[abs_name] = extender
         end
     end
-
-    -- Unload parser modules
-    for _, mod_name in ipairs(mod_names) do
-        package.loaded["metalua.compiler.parser."..mod_name] = nil
-    end
-
-    local M = require 'metalua.compiler.parser.common'
-
-    for _, mod_name in ipairs(mod_names) do
-        -- TODO: expose sub-modules as nested tables? 
-        -- Not sure: it might be confusing, will clash with API names, e.g. for expr
-        local mod = require ("metalua.compiler.parser."..mod_name)
-        assert (type (mod) == 'table')
-        for api_name, val in pairs(mod) do
-            assert(not M[api_name])
-            M[api_name] = val
-        end
-    end
-
-    -- TODO: remove or make somehow optional the 'ext' module
-
     return setmetatable(M, MT)
 end
 
diff --git a/metalua/compiler/parser/annot.lua b/metalua/compiler/parser/annot.lua
deleted file mode 100644
index aae9b92..0000000
--- a/metalua/compiler/parser/annot.lua
+++ /dev/null
@@ -1,138 +0,0 @@
---------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
---------------------------------------------------------------------------------
-
-local gg    = require 'metalua.grammar.generator'
-local misc  = require 'metalua.compiler.parser.misc'
-local mlp   = require 'metalua.compiler.parser.common'
-local lexer = require 'metalua.compiler.parser.lexer'
-local M     = { }
-
-lexer.lexer :add '->'
-
-function M.tid(lx)
-    local w = lx :next()
-    local t = w.tag
-    if t=='Keyword' and w[1] :match '^[%a_][%w_]*$' or w.tag=='Id' then
-        return {tag='TId'; lineinfo=w.lineinfo; w[1]}
-    else return gg.parse_error (lx, 'tid expected') end
-end
-
-local function expr(...) return mlp.expr(...) end
-
-local function te(...) return M.te(...) end
-
-local field_types = { var='TVar'; const='TConst';
-                      currently='TCurrently'; field='TField' }
-
-function M.tf(lx)
-    local tk = lx:next()
-    local w = tk[1]
-    local tag = field_types[w]
-    if not tag then error ('Invalid field type '..w)
-    elseif tag=='TField' then return {tag='TField'} else
-        local te = M.te(lx)
-        return {tag=tag; te}
-    end
-end
-
-local tebar_content = gg.list{
-    name        = 'tebar content',
-    primary     = te,
-    separators  = { ",", ";" },
-    terminators = ")" }
-
-M.tebar = gg.multisequence{
-    name = 'annot.tebar',
-    --{ '*', builder = 'TDynbar' }, -- maybe not user-available
-    { '(', tebar_content, ')',
-      builder = function(x) return x[1] end },
-    { te }
-}
-
-M.te = gg.multisequence{
-    name = 'annot.te',
-    { M.tid, builder=function(x) return x[1] end },
-    { '*', builder = 'TDyn' },
-    { "[",
-      gg.list{
-          primary = gg.sequence{
-              expr, "=", M.tf,
-              builder = 'TPair'
-          },
-          separators  = { ",", ";" },
-          terminators = { "]", "|" } },
-      gg.onkeyword{ "|", M.tf },
-      "]",
-      builder = function(x)
-                    local fields, other = unpack(x)
-                    return { tag='TTable', other or {tag='TField'}, fields }
-                end
-    }, -- "[ ... ]"
-    { '(', tebar_content, ')', '->', '(', tebar_content, ')',
-      builder = function(x)
-                    local p, r = unpack(x)
-                    return {tag='TFunction', p, r }
-                end } }
-
-
-M.ts = gg.multisequence{
-    name = 'annot.ts',
-    { 'return', tebar_content, builder='TReturn' },
-    { M.tid, builder = function(x)
-                           if x[1][1]=='pass' then return {tag='TPass'}
-                           else error "Bad statement type" end
-                       end } }
-
-
--- TODO: add parsers for statements:
--- #return tebar
--- #alias = te
--- #ell = tf
-
-M.stat_annot = gg.sequence{
-    gg.list{ primary=M.tid, separators='.' },
-    '=',
-    M.annot,
-    builder = 'Annot' }
-
-function M.opt(primary, a_type)
-    checks('table|function', 'string')
-    return gg.sequence{
-        primary,
-        gg.onkeyword{ "#", assert(M[a_type]) },
-        builder = function(x)
-                      local t, annot = unpack(x)
-                      return annot and { tag='Annot', t, annot } or t
-                  end }
-end
-
--- split a list of "foo" and "`Annot{foo, annot}" into a list of "foo"
--- and a list of "annot".
--- No annot list is returned if none of the elements were annotated.
-function M.split(lst)
-    local x, a, some = { }, { }, false
-    for i, p in ipairs(lst) do
-        if p.tag=='Annot' then
-            some, x[i], a[i] = true, unpack(p)
-        else x[i] = p end
-    end
-    if some then return x, a else return lst end
-end
-
-return M
\ No newline at end of file
diff --git a/metalua/compiler/parser/annot/generator.lua b/metalua/compiler/parser/annot/generator.lua
new file mode 100644
index 0000000..a8fcd62
--- /dev/null
+++ b/metalua/compiler/parser/annot/generator.lua
@@ -0,0 +1,48 @@
+--------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+--------------------------------------------------------------------------------
+
+require 'checks'
+local gg = require 'metalua.grammar.generator'
+local M  = { }
+
+function M.opt(mlc, primary, a_type)
+    checks('table', 'table|function', 'string')
+    return gg.sequence{
+        primary,
+        gg.onkeyword{ "#", function() return assert(mlc.annot[a_type]) end },
+        builder = function(x)
+            local t, annot = unpack(x)
+            return annot and { tag='Annot', t, annot } or t
+        end }
+end
+
+-- split a list of "foo" and "`Annot{foo, annot}" into a list of "foo"
+-- and a list of "annot".
+-- No annot list is returned if none of the elements were annotated.
+function M.split(lst)
+    local x, a, some = { }, { }, false
+    for i, p in ipairs(lst) do
+        if p.tag=='Annot' then
+            some, x[i], a[i] = true, unpack(p)
+        else x[i] = p end
+    end
+    if some then return x, a else return lst end
+end
+
+return M
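As the comment above says, `M.split` separates a mixed list into plain elements and their annotations. A small usage sketch; the literal AST tables mirror the `Annot` shape built by `M.opt`:

```lua
local gen = require 'metalua.compiler.parser.annot.generator'

local lst = {
    { tag='Id', 'x' },                                          -- plain
    { tag='Annot', { tag='Id', 'y' }, { tag='TId', 'number' } } -- annotated
}
local ids, annots = gen.split(lst)
-- ids:    { `Id "x", `Id "y" }  (annotations stripped)
-- annots: sparse list with annots[2] = `TId "number"
-- When no element is annotated, split returns the input list unchanged,
-- with no second result.
```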
diff --git a/metalua/compiler/parser/annot/grammar.lua b/metalua/compiler/parser/annot/grammar.lua
new file mode 100644
index 0000000..7ce3ec4
--- /dev/null
+++ b/metalua/compiler/parser/annot/grammar.lua
@@ -0,0 +1,112 @@
+--------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+--------------------------------------------------------------------------------
+
+local gg    = require 'metalua.grammar.generator'
+
+return function(M)
+    local _M = gg.future(M)
+    M.lexer :add '->'
+    local A = { }
+    local _A = gg.future(A)
+    M.annot = A
+
+    -- Type identifier: Lua keywords such as `"nil"` are allowed.
+    function M.annot.tid(lx)
+        local w = lx :next()
+        local t = w.tag
+        if t=='Keyword' and w[1] :match '^[%a_][%w_]*$' or w.tag=='Id'
+        then return {tag='TId'; lineinfo=w.lineinfo; w[1]}
+        else return gg.parse_error (lx, 'tid expected') end
+    end
+
+    local field_types = { var='TVar'; const='TConst';
+                          currently='TCurrently'; field='TField' }
+
+    -- TODO check lineinfo
+    function M.annot.tf(lx)
+        local tk = lx:next()
+        local w = tk[1]
+        local tag = field_types[w]
+        if not tag then error ('Invalid field type '..w)
+        elseif tag=='TField' then return {tag='TField'} else
+            local te = M.te(lx)
+            return {tag=tag; te}
+        end
+    end
+
+    M.annot.tebar_content = gg.list{
+        name        = 'tebar content',
+        primary     = _A.te,
+        separators  = { ",", ";" },
+        terminators = ")" }
+
+    M.annot.tebar = gg.multisequence{
+        name = 'annot.tebar',
+        --{ '*', builder = 'TDynbar' }, -- maybe not user-available
+        { '(', _A.tebar_content, ')',
+          builder = function(x) return x[1] end },
+        { _A.te }
+    }
+
+    M.annot.te = gg.multisequence{
+        name = 'annot.te',
+        { _A.tid, builder=function(x) return x[1] end },
+        { '*', builder = 'TDyn' },
+        { "[",
+          gg.list{
+              primary = gg.sequence{
+                  _M.expr, "=", _A.tf,
+                  builder = 'TPair'
+              },
+              separators  = { ",", ";" },
+              terminators = { "]", "|" } },
+          gg.onkeyword{ "|", _A.tf },
+          "]",
+          builder = function(x)
+              local fields, other = unpack(x)
+              return { tag='TTable', other or {tag='TField'}, fields }
+          end }, -- "[ ... ]"
+        { '(', _A.tebar_content, ')', '->', '(', _A.tebar_content, ')',
+          builder = function(x)
+               local p, r = unpack(x)
+               return {tag='TFunction', p, r }
+           end } }
+
+    M.annot.ts = gg.multisequence{
+        name = 'annot.ts',
+        { 'return', _A.tebar_content, builder='TReturn' },
+        { _A.tid, builder = function(x)
+              if x[1][1]=='pass' then return {tag='TPass'}
+              else error "Bad statement type" end
+          end } }
+
+-- TODO: add parsers for statements:
+-- #return tebar
+-- #alias = te
+-- #ell = tf
+--[[
+    M.annot.stat_annot = gg.sequence{
+        gg.list{ primary=_A.tid, separators='.' },
+        '=',
+        XXX??,
+        builder = 'Annot' }
+--]]
+
+    return M.annot
+end
\ No newline at end of file
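Both `_M` and `_A` above come from `gg.future`, which lets a grammar rule reference a parser that is only defined later on the same table (e.g. `tebar_content` refers to `_A.te` before `A.te` exists). A hypothetical, self-contained stand-in for the idea, not the actual `gg.future` implementation:

```lua
-- Minimal stand-in for gg.future: a proxy whose fields resolve against
-- the target table at call time, so forward references work.
local function future(M)
    return setmetatable({ }, { __index = function(_, name)
        return function(...) return M[name](...) end
    end })
end

local M  = { }
local _M = future(M)
local early = _M.answer          -- captured before M.answer exists
function M.answer(x) return x * 2 end
print(early(21))                 --> 42
```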
diff --git a/metalua/compiler/parser/expr.lua b/metalua/compiler/parser/expr.lua
index b277f4e..8ce4677 100644
--- a/metalua/compiler/parser/expr.lua
+++ b/metalua/compiler/parser/expr.lua
@@ -26,183 +26,181 @@
 --
 -------------------------------------------------------------------------------
 
-local gg  = require 'metalua.grammar.generator'
-local mlp = require 'metalua.compiler.parser.common'
-local M   = { }
+local pp    = require 'metalua.pprint'
+local gg    = require 'metalua.grammar.generator'
+local annot = require 'metalua.compiler.parser.annot.generator'
 
-local mlp_table = require 'metalua.compiler.parser.table'
-local mlp_meta  = require 'metalua.compiler.parser.meta'
-local mlp_misc  = require 'metalua.compiler.parser.misc'
-local annot     = require 'metalua.compiler.parser.annot'
+return function(M)
+    local _M = gg.future(M)
+    local _table = gg.future(M, 'table')
+    local _meta  = gg.future(M, 'meta') -- TODO move to ext?
+    local _annot = gg.future(M, 'annot') -- TODO move to annot
 
--- Delayed dependencies toward externally-defined parsers
-local function block (lx) return mlp.block (lx) end
-local function stat (lx) return mlp.stat (lx)  end
+    --------------------------------------------------------------------------------
+    -- Non-empty expression list. It isn't used here, but it's handy to
+    -- expose to users.
+    --------------------------------------------------------------------------------
+    M.expr_list = gg.list{ primary=_M.expr, separators="," }
 
--- For recursive definitions
-local function expr (lx) return M.expr (lx) end
+    --------------------------------------------------------------------------------
+    -- Helpers for function applications / method applications
+    --------------------------------------------------------------------------------
+    M.func_args_content = gg.list{
+        name        = "function arguments",
+        primary     = _M.expr,
+        separators  = ",",
+        terminators = ")" }
 
-local id = mlp_misc.id
+    -- Used to parse methods
+    M.method_args = gg.multisequence{
+        name = "function argument(s)",
+        { "{",  _table.content, "}" },
+        { "(",  _M.func_args_content, ")", builder = unpack },
+        { "+{", _meta.quote_content, "}" },
+        -- TODO lineinfo?
+        function(lx) local r = M.opt_string(lx); return r and {r} or { } end }
+
+    --------------------------------------------------------------------------------
+    -- [func_val] parses a function, from the opening parameter parenthesis
+    -- to the "end" keyword included. Used for anonymous functions as well
+    -- as function declaration statements (both local and global).
+    --
+    -- It's wrapped in a [_func_val] eta-expansion, so that when the expr
+    -- parser uses the latter, it will notice updates to the [func_val]
+    -- definition.
+    --------------------------------------------------------------------------------
+    M.func_params_content = gg.list{
+        name="function parameters",
+        gg.multisequence{ { "...", builder = "Dots" }, annot.opt(M, _M.id, 'te') },
+        separators  = ",", terminators = {")", "|"} }
+
+    -- TODO move to annot
+    M.func_val = gg.sequence{
+        name = "function body",
+        "(", _M.func_params_content, ")", _M.block, "end",
+        builder = function(x)
+             local params, body = unpack(x)
+             local annots, some = { }, false
+             for i, p in ipairs(params) do
+                 if p.tag=='Annot' then
+                     params[i], annots[i], some = p[1], p[2], true
+                 else annots[i] = false end
+             end
+             if some then return { tag='Function', params, body, annots }
+             else  return { tag='Function', params, body } end
+         end }
+
+    local func_val = function(lx) return M.func_val(lx) end
+
+    --------------------------------------------------------------------------------
+    -- Default parser for primary expressions
+    --------------------------------------------------------------------------------
+    function M.id_or_literal (lx)
+        local a = lx:next()
+        if a.tag~="Id" and a.tag~="String" and a.tag~="Number" then
+            local msg
+            if a.tag=='Eof' then
+                msg = "End of file reached when an expression was expected"
+            elseif a.tag=='Keyword' then
+                msg = "An expression was expected, and `"..a[1]..
+                    "' can't start an expression"
+            else
+                msg = "Unexpected expr token " .. pp.tostring (a)
+            end
+            gg.parse_error (lx, msg)
+        end
+        return a
+    end
 
 
---------------------------------------------------------------------------------
--- Non-empty expression list. Actually, this isn't used here, but that's
--- handy to give to users.
---------------------------------------------------------------------------------
-M.expr_list = gg.list{ primary=expr, separators="," }
+    --------------------------------------------------------------------------------
+    -- Builder generator for operators. Wouldn't be worth it if "|x|" notation
+    -- were allowed, but then Lua 5.1 wouldn't compile it.
+    --------------------------------------------------------------------------------
 
---------------------------------------------------------------------------------
--- Helpers for function applications / method applications
---------------------------------------------------------------------------------
-M.func_args_content = gg.list{
-    name        = "function arguments",
-    primary     = expr,
-    separators  = ",",
-    terminators = ")" }
+    -- opf1 = |op| |_,a| `Op{ op, a }
+    local function opf1 (op) return
+        function (_,a) return { tag="Op", op, a } end end
 
--- Used to parse methods
-M.method_args = gg.multisequence{
-   name = "function argument(s)",
-   { "{",  mlp_table.content, "}" },
-   { "(",  M.func_args_content, ")", builder = unpack },
-   { "+{", mlp_meta.quote_content, "}" },
-   function(lx) local r = mlp.opt_string(lx); return r and {r} or { } end }
+    -- opf2 = |op| |a,_,b| `Op{ op, a, b }
+    local function opf2 (op) return
+        function (a,_,b) return { tag="Op", op, a, b } end end
 
---------------------------------------------------------------------------------
--- [func_val] parses a function, from opening parameters parenthese to
--- "end" keyword included. Used for anonymous functions as well as
--- function declaration statements (both local and global).
---
--- It's wrapped in a [_func_val] eta expansion, so that when expr
--- parser uses the latter, they will notice updates of [func_val]
--- definitions.
---------------------------------------------------------------------------------
-M.func_params_content = gg.list{ name="function parameters",
-   gg.multisequence{ { "...", builder = "Dots" }, annot.opt(id, 'te') },
-   separators  = ",", terminators = {")", "|"} }
+    -- opf2r = |op| |a,_,b| `Op{ op, b, a } -- (args reversed)
+    local function opf2r (op) return
+        function (a,_,b) return { tag="Op", op, b, a } end end
 
-M.func_val = gg.sequence{ name="function body",
-   "(", M.func_params_content, ")", block, "end",
-   builder = function(x)
-       local params, body = unpack(x)
-       local annots, some = { }, false
-       for i, p in ipairs(params) do
-           if p.tag=='Annot' then
-               params[i], annots[i], some = p[1], p[2], true
-           else annots[i] = false end
-       end
-       if some then return { tag='Function', params, body, annots }
-       else  return { tag='Function', params, body } end
-   end }
-
-local func_val = function(lx) return M.func_val(lx) end
-
---------------------------------------------------------------------------------
--- Default parser for primary expressions
---------------------------------------------------------------------------------
-function M.id_or_literal (lx)
-   local a = lx:next()
-   if a.tag~="Id" and a.tag~="String" and a.tag~="Number" then
-      local msg
-      if a.tag=='Eof' then
-         msg = "End of file reached when an expression was expected"
-      elseif a.tag=='Keyword' then
-         msg = "An expression was expected, and `"..a[1]..
-            "' can't start an expression"
-      else
-         msg = "Unexpected expr token " .. table.tostring (a, 'nohash')
-      end
-      gg.parse_error (lx, msg)
-   end
-   return a
-end
+    local function op_ne(a, _, b)
+        -- This version allows removing the "ne" operator from the AST definition.
+        -- However, it doesn't always produce the exact same bytecode as Lua 5.1.
+        return { tag="Op", "not",
+                 { tag="Op", "eq", a, b, lineinfo= {
+                       first = a.lineinfo.first, last = b.lineinfo.last } } }
+    end
 
 
---------------------------------------------------------------------------------
--- Builder generator for operators. Wouldn't be worth it if "|x|" notation
--- were allowed, but then lua 5.1 wouldn't compile it
---------------------------------------------------------------------------------
+    --------------------------------------------------------------------------------
+    --
+    -- complete expression
+    --
+    --------------------------------------------------------------------------------
 
--- opf1 = |op| |_,a| `Op{ op, a }
-local function opf1 (op) return
-   function (_,a) return { tag="Op", op, a } end end
+    -- FIXME: set line number. In [expr] transformers probably
+    M.expr = gg.expr {
+        name = "expression",
+        primary = gg.multisequence{
+            name = "expr primary",
+            { "(", _M.expr, ")",               builder = "Paren" },
+            { "function", _M.func_val,         builder = unpack },
+            { "-{", _meta.splice_content, "}", builder = unpack },
+            { "+{", _meta.quote_content, "}",  builder = unpack },
+            { "nil",                           builder = "Nil" },
+            { "true",                          builder = "True" },
+            { "false",                         builder = "False" },
+            { "...",                           builder = "Dots" },
+            { "{", _table.content, "}",        builder = unpack },
+            _M.id_or_literal },
 
--- opf2 = |op| |a,_,b| `Op{ op, a, b }
-local function opf2 (op) return
-   function (a,_,b) return { tag="Op", op, a, b } end end
+        infix = {
+            name = "expr infix op",
+            { "+",  prec = 60, builder = opf2 "add"  },
+            { "-",  prec = 60, builder = opf2 "sub"  },
+            { "*",  prec = 70, builder = opf2 "mul"  },
+            { "/",  prec = 70, builder = opf2 "div"  },
+            { "%",  prec = 70, builder = opf2 "mod"  },
+            { "^",  prec = 90, builder = opf2 "pow",    assoc = "right" },
+            { "..", prec = 40, builder = opf2 "concat", assoc = "right" },
+            { "==", prec = 30, builder = opf2 "eq"  },
+            { "~=", prec = 30, builder = op_ne  },
+            { "<",  prec = 30, builder = opf2 "lt"  },
+            { "<=", prec = 30, builder = opf2 "le"  },
+            { ">",  prec = 30, builder = opf2r "lt"  },
+            { ">=", prec = 30, builder = opf2r "le"  },
+            { "and",prec = 20, builder = opf2 "and" },
+            { "or", prec = 10, builder = opf2 "or"  } },
 
--- opf2r = |op| |a,_,b| `Op{ op, b, a } -- (args reversed)
-local function opf2r (op) return
-   function (a,_,b) return { tag="Op", op, b, a } end end
+        prefix = {
+            name = "expr prefix op",
+            { "not", prec = 80, builder = opf1 "not" },
+            { "#",   prec = 80, builder = opf1 "len" },
+            { "-",   prec = 80, builder = opf1 "unm" } },
 
-local function op_ne(a, _, b)
-    -- This version allows to remove the "ne" operator from the AST definition.
-    -- However, it doesn't always produce the exact same bytecode as Lua 5.1.
-    return { tag="Op", "not",
-             { tag="Op", "eq", a, b, lineinfo= {
-                   first = a.lineinfo.first, last = b.lineinfo.last } } }
-end
-
-
---------------------------------------------------------------------------------
---
--- complete expression
---
---------------------------------------------------------------------------------
-
--- FIXME: set line number. In [expr] transformers probably
-
-M.expr = gg.expr { name = "expression",
-
-   primary = gg.multisequence{ name="expr primary",
-      { "(", expr, ")",                     builder = "Paren" },
-      { "function", func_val,               builder = unpack },
-      { "-{", mlp_meta.splice_content, "}", builder = unpack },
-      { "+{", mlp_meta.quote_content, "}",  builder = unpack },
-      { "nil",                              builder = "Nil" },
-      { "true",                             builder = "True" },
-      { "false",                            builder = "False" },
-      { "...",                              builder = "Dots" },
-      mlp_table.table,
-      M.id_or_literal },
-
-   infix = { name="expr infix op",
-      { "+",  prec = 60, builder = opf2 "add"  },
-      { "-",  prec = 60, builder = opf2 "sub"  },
-      { "*",  prec = 70, builder = opf2 "mul"  },
-      { "/",  prec = 70, builder = opf2 "div"  },
-      { "%",  prec = 70, builder = opf2 "mod"  },
-      { "^",  prec = 90, builder = opf2 "pow",    assoc = "right" },
-      { "..", prec = 40, builder = opf2 "concat", assoc = "right" },
-      { "==", prec = 30, builder = opf2 "eq"  },
-      { "~=", prec = 30, builder = op_ne  },
-      { "<",  prec = 30, builder = opf2 "lt"  },
-      { "<=", prec = 30, builder = opf2 "le"  },
-      { ">",  prec = 30, builder = opf2r "lt"  },
-      { ">=", prec = 30, builder = opf2r "le"  },
-      { "and",prec = 20, builder = opf2 "and" },
-      { "or", prec = 10, builder = opf2 "or"  } },
-
-   prefix = { name="expr prefix op",
-      { "not", prec = 80, builder = opf1 "not" },
-      { "#",   prec = 80, builder = opf1 "len" },
-      { "-",   prec = 80, builder = opf1 "unm" } },
-
-   suffix = { name="expr suffix op",
-      { "[", expr, "]", builder = function (tab, idx)
-         return {tag="Index", tab, idx[1]} end},
-      { ".", id, builder = function (tab, field)
-         return {tag="Index", tab, mlp_misc.id2string(field[1])} end },
-      { "(", M.func_args_content, ")", builder = function(f, args)
-         return {tag="Call", f, unpack(args[1])} end },
-      { "{", mlp_table.content, "}", builder = function (f, arg)
-         return {tag="Call", f, arg[1]} end},
-      { ":", id, M.method_args, builder = function (obj, post)
-         local m_name, args = unpack(post)
-         return {tag="Invoke", obj, mlp_misc.id2string(m_name), unpack(args)} end},
-      { "+{", mlp_meta.quote_content, "}", builder = function (f, arg)
-         return {tag="Call", f,  arg[1] } end },
-      default = { name="opt_string_arg", parse = mlp_misc.opt_string, builder = function(f, arg)
-         return {tag="Call", f, arg } end } } }
-
-return M
\ No newline at end of file
+        suffix = {
+            name = "expr suffix op",
+            { "[", _M.expr, "]", builder = function (tab, idx)
+              return {tag="Index", tab, idx[1]} end},
+            { ".", _M.id, builder = function (tab, field)
+              return {tag="Index", tab, _M.id2string(field[1])} end },
+            { "(", _M.func_args_content, ")", builder = function(f, args)
+              return {tag="Call", f, unpack(args[1])} end },
+            { "{", _table.content, "}", builder = function (f, arg)
+              return {tag="Call", f, arg[1]} end},
+            { ":", _M.id, _M.method_args, builder = function (obj, post)
+              local m_name, args = unpack(post)
+              return {tag="Invoke", obj, _M.id2string(m_name), unpack(args)} end},
+            { "+{", _meta.quote_content, "}", builder = function (f, arg)
+              return {tag="Call", f,  arg[1] } end },
+            default = { name="opt_string_arg", parse = _M.opt_string, builder = function(f, arg)
+              return {tag="Call", f, arg } end } } }
+    return M
+end
\ No newline at end of file
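The operator builders above normalize comparisons so the AST only needs `lt`, `le`, and `eq`. A sketch of the shapes they produce:

```lua
-- "a > b"  is parsed with opf2r "lt", reversing the operands:
--   { tag="Op", "lt", `Id "b", `Id "a" }
-- "a ~= b" is parsed with op_ne, composing "not" and "eq":
--   { tag="Op", "not", { tag="Op", "eq", `Id "a", `Id "b" } }
-- "a ^ b ^ c" is right-associative (assoc = "right", prec = 90):
--   { tag="Op", "pow", `Id "a", { tag="Op", "pow", `Id "b", `Id "c" } }
```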
diff --git a/metalua/compiler/parser/ext.lua b/metalua/compiler/parser/ext.lua
index 9e4c776..4e9d395 100644
--- a/metalua/compiler/parser/ext.lua
+++ b/metalua/compiler/parser/ext.lua
@@ -24,103 +24,73 @@
 --------------------------------------------------------------------------------
 
 local gg        = require 'metalua.grammar.generator'
-local mlp       = require 'metalua.compiler.parser.common'
-local mlp_lexer = require 'metalua.compiler.parser.lexer'
-local mlp_expr  = require 'metalua.compiler.parser.expr'
-local mlp_stat  = require 'metalua.compiler.parser.stat'
-local mlp_misc  = require 'metalua.compiler.parser.misc'
 
-local expr = mlp_expr.expr
+return function(M)
 
-local M = { }
+    local _M = gg.future(M)
 
---------------------------------------------------------------------------------
--- Algebraic Datatypes
---------------------------------------------------------------------------------
-local function adt (lx)
-   local node = mlp_misc.id (lx)
-   local tagval = node[1]
-   local tagkey = {tag="Pair", {tag="String", "tag"}, {tag="String", tagval} }
-   if lx:peek().tag == "String" or lx:peek().tag == "Number" then
-      return { tag="Table", tagkey, lx:next() }
-   elseif lx:is_keyword (lx:peek(), "{") then
-      local x = mlp.table (lx)
-      table.insert (x, 1, tagkey)
-      return x
-   else return { tag="Table", tagkey } end
-end
+    ---------------------------------------------------------------------------
+    -- Algebraic Datatypes
+    ----------------------------------------------------------------------------
+    local function adt (lx)
+        local node = _M.id (lx)
+        local tagval = node[1]
+        -- tagkey = `Pair{ `String "tag", `String{ -{tagval} } }
+        local tagkey = { tag="Pair", {tag="String", "tag"}, {tag="String", tagval} }
+        if lx:peek().tag == "String" or lx:peek().tag == "Number" then
+            -- TODO support boolean literals
+            return { tag="Table", tagkey, lx:next() }
+        elseif lx:is_keyword (lx:peek(), "{") then
+            local x = M.table.table (lx)
+            table.insert (x, 1, tagkey)
+            return x
+        else return { tag="Table", tagkey } end
+    end
 
-M.adt = gg.sequence{ "`", adt, builder = unpack }
+    M.adt = gg.sequence{ "`", adt, builder = unpack }
 
-expr.primary :add(M.adt)
+    M.expr.primary :add(M.adt)
 
---------------------------------------------------------------------------------
--- Anonymous lambda
---------------------------------------------------------------------------------
-M.lambda_expr = gg.sequence{
-   "|", mlp_expr.func_params_content, "|", expr,
-   builder = function (x)
-      local li = x[2].lineinfo
-      return { tag="Function", x[1],
-               { {tag="Return", x[2], lineinfo=li }, lineinfo=li } }
-   end }
+    ----------------------------------------------------------------------------
+    -- Anonymous lambda
+    ----------------------------------------------------------------------------
+    M.lambda_expr = gg.sequence{
+        "|", _M.func_params_content, "|", _M.expr,
+        builder = function (x)
+            local li = x[2].lineinfo
+            return { tag="Function", x[1],
+                     { {tag="Return", x[2], lineinfo=li }, lineinfo=li } }
+        end }
 
--- In an earlier version, lambda_expr took an expr_list rather than an expr
--- after the 2nd bar. However, it happened to be much more of a burden than an
--- help, So finally I disabled it. If you want to return several results,
--- use the long syntax.
---------------------------------------------------------------------------------
--- local lambda_expr = gg.sequence{
---    "|", func_params_content, "|", expr_list,
---    builder= function (x)
---       return {tag="Function", x[1], { {tag="Return", unpack(x[2]) } } } end }
+    M.expr.primary :add (M.lambda_expr)
 
-expr.primary :add (M.lambda_expr)
+    --------------------------------------------------------------------------------
+    -- Allows writing "a `f` b" instead of "f(a, b)". Taken from Haskell.
+    --------------------------------------------------------------------------------
+    function M.expr_in_backquotes (lx) return M.expr(lx, 35) end -- 35=limited precedence
+    M.expr.infix :add{ name = "infix function",
+        "`", _M.expr_in_backquotes, "`", prec = 35, assoc="left",
+        builder = function(a, op, b) return {tag="Call", op[1], a, b} end }
 
---------------------------------------------------------------------------------
--- Allows to write "a `f` b" instead of "f(a, b)". Taken from Haskell.
--- This is not part of Lua 5.1 syntax, so it's added to the expression
--- afterwards, so that it's easier to disable.
---------------------------------------------------------------------------------
-function M.expr_in_backquotes (lx) return expr(lx, 35) end
+    --------------------------------------------------------------------------------
+    -- C-style op+assignments
+    -- TODO: no protection against side-effects in LHS vars.
+    --------------------------------------------------------------------------------
+    local function op_assign(kw, op)
+        local function rhs(a, b) return { tag="Op", op, a, b } end
+        local function f(a,b)
+            if #a ~= #b then gg.parse_error "asymmetric operator+assignment" end
+            local right = { }
+            local r = { tag="Set", a, right }
+            for i=1, #a do right[i] = { tag="Op", op, a[i], b[i] } end
+            return r
+        end
+        M.lexer :add (kw)
+        M.assignments[kw] = f
+    end
 
-expr.infix :add{ name = "infix function",
-   "`", M.expr_in_backquotes, "`", prec = 35, assoc="left",
-   builder = function(a, op, b) return {tag="Call", op[1], a, b} end }
+    local ops = { add='+='; sub='-='; mul='*='; div='/=' }
+    for ast_op_name, keyword in pairs(ops) do op_assign(keyword, ast_op_name) end
 
-
---------------------------------------------------------------------------------
--- table.override assignment
---------------------------------------------------------------------------------
-
-mlp_lexer.lexer:add "<-"
-mlp_stat.stat.assignments["<-"] = function (a, b)
-   assert( #a==1 and #b==1, "No multi-args for '<-'")
-   return { tag="Call", { tag="Index", { tag="Id", "table" },
-                                       { tag="String", "override" } },
-                        a[1], b[1]}
-end
-
---------------------------------------------------------------------------------
--- C-style op+assignments
--- TODO: no protection against side-effects in LHS vars.
---------------------------------------------------------------------------------
-local function op_assign(kw, op)
-   local function rhs(a, b)
-      return { tag="Op", op, a, b }
-   end
-   local function f(a,b)
-       if #a ~= #b then gg.parse_error "assymetric operator+assignment" end
-       local right = { }
-       local r = { tag="Set", a, right }
-       for i=1, #a do right[i] = { tag="Op", op, a[i], b[i] } end
-       return r
-   end
-   mlp.lexer:add (kw)
-   mlp.stat.assignments[kw] = f
-end
-
-local ops = { add='+='; sub='-='; mul='*='; div='/=' }
-for ast_op_name, keyword in pairs(ops) do op_assign(keyword, ast_op_name) end
-
-return M
\ No newline at end of file
+    return M
+end
\ No newline at end of file
diff --git a/metalua/compiler/parser/lexer.lua b/metalua/compiler/parser/lexer.lua
index 7f93f8a..2b5ff7e 100644
--- a/metalua/compiler/parser/lexer.lua
+++ b/metalua/compiler/parser/lexer.lua
@@ -1,5 +1,5 @@
 --------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
+-- Copyright (c) 2006-2014 Fabien Fleutot and others.
 --
 -- All rights reserved.
 --
@@ -18,32 +18,26 @@
 --------------------------------------------------------------------------------
 
 ----------------------------------------------------------------------
--- (Meta)lua-specific lexer, derived from the generic lexer.
-----------------------------------------------------------------------
---
--- Copyright (c) 2006-2012, Fabien Fleutot <metalua@gmail.com>.
---
--- This software is released under the MIT Licence, see licence.txt
--- for details.
---
+-- Generate a new lua-specific lexer, derived from the generic lexer.
 ----------------------------------------------------------------------
 
 local generic_lexer = require 'metalua.grammar.lexer'
-local M = { }
 
-M.lexer = generic_lexer.lexer :clone()
+return function()
+    local lexer = generic_lexer.lexer :clone()
 
-local keywords = {
-    "and", "break", "do", "else", "elseif",
-    "end", "false", "for", "function",
-    "goto", -- Lua5.2
-    "if",
-    "in", "local", "nil", "not", "or", "repeat",
-    "return", "then", "true", "until", "while",
-    "...", "..", "==", ">=", "<=", "~=",
-    "::", -- Lua5,2
-    "+{", "-{" }
- 
-for _, w in ipairs(keywords) do M.lexer :add (w) end
+    local keywords = {
+        "and", "break", "do", "else", "elseif",
+        "end", "false", "for", "function",
+        "goto", -- Lua5.2
+        "if",
+        "in", "local", "nil", "not", "or", "repeat",
+        "return", "then", "true", "until", "while",
+        "...", "..", "==", ">=", "<=", "~=",
+        "::", -- Lua5,2
+        "+{", "-{" } -- Metalua
 
-return M
+    for _, w in ipairs(keywords) do lexer :add (w) end
+
+    return lexer
+end
\ No newline at end of file
diff --git a/metalua/compiler/parser/meta.lua b/metalua/compiler/parser/meta.lua
index 396df85..71eb3c3 100644
--- a/metalua/compiler/parser/meta.lua
+++ b/metalua/compiler/parser/meta.lua
@@ -1,5 +1,5 @@
 -------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
+-- Copyright (c) 2006-2014 Fabien Fleutot and others.
 --
 -- All rights reserved.
 --
@@ -17,120 +17,122 @@
 --
 -------------------------------------------------------------------------------
 
--------------------------------------------------------------------------------
---
--- Summary: Meta-operations: AST quasi-quoting and splicing
---
--------------------------------------------------------------------------------
-
---------------------------------------------------------------------------------
---
--- Exported API:
--- * [mlp.splice_content()]
--- * [mlp.quote_content()]
---
---------------------------------------------------------------------------------
+-- Compile-time metaprogramming features: splicing ASTs generated during compilation,
+-- AST quasi-quoting helpers.
 
 local gg       = require 'metalua.grammar.generator'
-local mlp      = require 'metalua.compiler.parser.common'
-local M        = { }
 
---------------------------------------------------------------------------------
--- External splicing: compile an AST into a chunk, load and evaluate
--- that chunk, and replace the chunk by its result (which must also be
--- an AST).
---------------------------------------------------------------------------------
+return function(M)
+    local _M = gg.future(M)
+    M.meta={ }
+    local _MM = gg.future(M.meta)
 
-function M.splice (ast)
-    -- TODO: should there be one mlc per splice, or per parser instance?
-    local mlc = require 'metalua.compiler'.new()
-    local f = mlc :ast_to_function(ast, '=splice')
-    local result=f(mlp)
-    return result
-end
+    --------------------------------------------------------------------------------
+    -- External splicing: compile an AST into a chunk, load and evaluate
+    -- that chunk, and replace the chunk by its result (which must also be
+    -- an AST).
+    --------------------------------------------------------------------------------
 
---------------------------------------------------------------------------------
--- Going from an AST to an AST representing that AST
--- the only key being lifted in this version is ["tag"]
---------------------------------------------------------------------------------
-function M.quote (t)
-   --print("QUOTING:", table.tostring(t, 60,'nohash'))
-   local cases = { }
-   function cases.table (t)
-      local mt = { tag = "Table" }
-      --table.insert (mt, { tag = "Pair", quote "quote", { tag = "True" } })
-      if t.tag == "Splice" then
-         assert (#t==1, "Invalid splice")
-         local sp = t[1]
-         return sp
-      elseif t.tag then
-         table.insert (mt, { tag="Pair", M.quote "tag", M.quote(t.tag) })
-      end
-      for _, v in ipairs (t) do
-         table.insert (mt, M.quote(v))
-      end
-      return mt
-   end
-   function cases.number (t) return { tag = "Number", t, quote = true } end
-   function cases.string (t) return { tag = "String", t, quote = true } end
-   function cases.boolean (t) return { tag = t and "True" or "False", t, quote = true } end
-   local f = cases [type(t)]
-   if f then return f(t) else error ("Cannot quote an AST containing "..tostring(t)) end
-end
+    -- TODO: that's not part of the parser
+    function M.meta.eval (ast)
+        -- TODO: should there be one mlc per splice, or per parser instance?
+        local mlc = require 'metalua.compiler'.new()
+        local f = mlc :ast_to_function (ast, '=splice')
+        local result=f(M) -- splices act on the current parser
+        return result
+    end
 
---------------------------------------------------------------------------------
--- when this variable is false, code inside [-{...}] is compiled and
--- avaluated immediately. When it's true (supposedly when we're
--- parsing data inside a quasiquote), [-{foo}] is replaced by
--- [`Splice{foo}], which will be unpacked by [quote()].
---------------------------------------------------------------------------------
-M.in_a_quote = false
+    ----------------------------------------------------------------------------
+    -- Going from an AST to an AST representing that AST
+    -- the only hash-part key being lifted is `"tag"`.
+    -- Doesn't lift subtrees protected inside a `Splice{ ... }.
+    -- e.g. changes `Foo{ 123 } into
+    -- `Table{ `Pair{ `String "tag", `String "Foo" }, `Number 123 }
+    ----------------------------------------------------------------------------
+    local function lift (t)
+        --print("QUOTING:", table.tostring(t, 60,'nohash'))
+        local cases = { }
+        function cases.table (t)
+            local mt = { tag = "Table" }
+            --table.insert (mt, { tag = "Pair", quote "quote", { tag = "True" } })
+            if t.tag == "Splice" then
+                assert (#t==1, "Invalid splice")
+                local sp = t[1]
+                return sp
+            elseif t.tag then
+                table.insert (mt, { tag="Pair", lift "tag", lift(t.tag) })
+            end
+            for _, v in ipairs (t) do
+                table.insert (mt, lift(v))
+            end
+            return mt
+        end
+        function cases.number  (t) return { tag = "Number", t, quote = true } end
+        function cases.string  (t) return { tag = "String", t, quote = true } end
+        function cases.boolean (t) return { tag = t and "True" or "False", t, quote = true } end
+        local f = cases [type(t)]
+        if f then return f(t) else error ("Cannot quote an AST containing "..tostring(t)) end
+    end
+    M.meta.lift = lift
 
---------------------------------------------------------------------------------
--- Parse the inside of a "-{ ... }"
---------------------------------------------------------------------------------
-function M.splice_content (lx)
-   local parser_name = "expr"
-   if lx:is_keyword (lx:peek(2), ":") then
-      local a = lx:next()
-      lx:next() -- skip ":"
-      assert (a.tag=="Id", "Invalid splice parser name")
-      parser_name = a[1]
-  end
-  local parser = require 'metalua.compiler.parser'.new()
-  local ast = parser [parser_name](lx)
-   if M.in_a_quote then
-      --printf("SPLICE_IN_QUOTE:\n%s", _G.table.tostring(ast, "nohash", 60))
-      return { tag="Splice", ast }
-   else
-      if parser_name == "expr" then ast = { { tag="Return", ast } }
-      elseif parser_name == "stat"  then ast = { ast }
-      elseif parser_name ~= "block" then
-         error ("splice content must be an expr, stat or block") end
-      --printf("EXEC THIS SPLICE:\n%s", _G.table.tostring(ast, "nohash", 60))
-      return M.splice (ast)
-   end
-end
+    --------------------------------------------------------------------------------
+    -- When this variable is false, code inside [-{...}] is compiled and
+    -- evaluated immediately. When it's true (supposedly when we're
+    -- parsing data inside a quasiquote), [-{foo}] is replaced by
+    -- [`Splice{foo}], which will be unpacked by [lift()].
+    --------------------------------------------------------------------------------
+    local in_a_quote = false
 
---------------------------------------------------------------------------------
--- Parse the inside of a "+{ ... }"
---------------------------------------------------------------------------------
-function M.quote_content (lx)
-   local parser
-   if lx:is_keyword (lx:peek(2), ":") then -- +{parser: content }
-      parser = mlp[mlp.id(lx)[1]]
-      lx:next()
-   else -- +{ content }
-      parser = mlp.expr
-   end
+    --------------------------------------------------------------------------------
+    -- Parse the inside of a "-{ ... }"
+    --------------------------------------------------------------------------------
+    function M.meta.splice_content (lx)
+        local parser_name = "expr"
+        if lx:is_keyword (lx:peek(2), ":") then
+            local a = lx:next()
+            lx:next() -- skip ":"
+            assert (a.tag=="Id", "Invalid splice parser name")
+            parser_name = a[1]
+        end
+        -- TODO FIXME running a new parser with the old lexer?!
+        local parser = require 'metalua.compiler.parser'.new()
+        local ast = parser [parser_name](lx)
+        if in_a_quote then -- only prevent quotation in this subtree
+            --printf("SPLICE_IN_QUOTE:\n%s", _G.table.tostring(ast, "nohash", 60))
+            return { tag="Splice", ast }
+        else -- convert into a block, eval, replace with the result
+            if parser_name == "expr" then ast = { { tag="Return", ast } }
+            elseif parser_name == "stat"  then ast = { ast }
+            elseif parser_name ~= "block" then
+                error ("splice content must be an expr, stat or block") end
+            --printf("EXEC THIS SPLICE:\n%s", _G.table.tostring(ast, "nohash", 60))
+            return M.meta.eval (ast)
+        end
+    end
 
-   local prev_iq = M.in_a_quote
-   M.in_a_quote = true
-   --print("IN_A_QUOTE")
-   local content = parser (lx)
-   local q_content = M.quote (content)
-   M.in_a_quote = prev_iq
-   return q_content
-end
+    M.meta.splice = gg.sequence{ "-{", _MM.splice_content, "}", builder=unpack }
 
-return M
+    --------------------------------------------------------------------------------
+    -- Parse the inside of a "+{ ... }"
+    --------------------------------------------------------------------------------
+    function M.meta.quote_content (lx)
+        local parser
+        if lx:is_keyword (lx:peek(2), ":") then -- +{parser: content }
+            local parser_name = M.id(lx)[1]
+            parser = M[parser_name]
+            lx:next() -- skip ":"
+        else -- +{ content }
+            parser = M.expr
+        end
+
+        local prev_iq = in_a_quote
+        in_a_quote = true
+        --print("IN_A_QUOTE")
+        local content = parser (lx)
+        local q_content = M.meta.lift (content)
+        in_a_quote = prev_iq
+        return q_content
+    end
+
+    return M
+end
\ No newline at end of file
diff --git a/metalua/compiler/parser/misc.lua b/metalua/compiler/parser/misc.lua
index 27bd5d3..a24b006 100644
--- a/metalua/compiler/parser/misc.lua
+++ b/metalua/compiler/parser/misc.lua
@@ -37,136 +37,139 @@
 --------------------------------------------------------------------------------
 
 local gg       = require 'metalua.grammar.generator'
-local mlp      = require 'metalua.compiler.parser.common'
-local mlp_meta = require 'metalua.compiler.parser.meta'
-local M        = { }
 
+-- TODO: replace splice-aware versions with naive ones, move extensions into ./meta
 
-local splice = gg.sequence{ "-{", mlp_meta.splice_content, "}", builder=unpack }
+return function(M)
+    local _M = gg.future(M)
 
---------------------------------------------------------------------------------
--- returns a function that takes the [n]th element of a table.
--- if [tag] is provided, then this element is expected to be a
--- table, and this table receives a "tag" field whose value is
--- set to [tag].
---
--- The primary purpose of this is to generate builders for
--- grammar generators. It has little purpose in metalua, as lambda has
--- a lightweight syntax.
---------------------------------------------------------------------------------
+--[[ metaprog-free versions:
+    function M.id(lx)
+        if lx:peek().tag~='Id' then gg.parse_error(lx, "Identifier expected")
+        else return lx:next() end
+    end
 
-function M.fget (n, tag) 
-   assert (type (n) == "number")
-   if tag then
-      assert (type (tag) == "string")
-      return function (x) 
-         assert (type (x[n]) == "table")       
-         return {tag=tag, unpack(x[n])} end 
-   else
-      return function (x) return x[n] end 
-   end
-end
+    function M.opt_id(lx)
+        if lx:peek().tag=='Id' then return lx:next() else return false end
+    end
 
+    function M.string(lx)
+        if lx:peek().tag~='String' then gg.parse_error(lx, "String expected")
+        else return lx:next() end
+    end
 
---------------------------------------------------------------------------------
--- Try to read an identifier (possibly as a splice), or return [false] if no
--- id is found.
---------------------------------------------------------------------------------
-function M.opt_id (lx)
-   local a = lx:peek();
-   if lx:is_keyword (a, "-{") then
-       local v = splice(lx)
-       if v.tag ~= "Id" and v.tag ~= "Splice" then
-           gg.parse_error(lx, "Bad id splice")
-       end
-       return v
-   elseif a.tag == "Id" then return lx:next()
-   else return false end
-end
+    function M.opt_string(lx)
+        if lx:peek().tag=='String' then return lx:next() else return false end
+    end
 
---------------------------------------------------------------------------------
--- Mandatory reading of an id: causes an error if it can't read one.
---------------------------------------------------------------------------------
-function M.id (lx)
-   return M.opt_id (lx) or gg.parse_error(lx,"Identifier expected")
-end
+    --------------------------------------------------------------------------------
+    -- Converts an identifier into a string. Hopefully one day it'll handle
+    -- splices gracefully, but that proves quite tricky.
+    --------------------------------------------------------------------------------
+    function M.id2string (id)
+        if id.tag == "Id" then id.tag = "String"; return id
+        else error ("Identifier expected: "..table.tostring(id, 'nohash')) end
+    end
+--]]
 
---------------------------------------------------------------------------------
--- Common helper function
---------------------------------------------------------------------------------
-M.id_list = gg.list { primary = M.id, separators = "," }
+    --------------------------------------------------------------------------------
+    -- Try to read an identifier (possibly as a splice), or return [false] if no
+    -- id is found.
+    --------------------------------------------------------------------------------
+    function M.opt_id (lx)
+        local a = lx:peek();
+        if lx:is_keyword (a, "-{") then
+            local v = M.meta.splice(lx)
+            if v.tag ~= "Id" and v.tag ~= "Splice" then
+                gg.parse_error(lx, "Bad id splice")
+            end
+            return v
+        elseif a.tag == "Id" then return lx:next()
+        else return false end
+    end
 
---------------------------------------------------------------------------------
--- Converts an identifier into a string. Hopefully one day it'll handle
--- splices gracefully, but that proves quite tricky.
---------------------------------------------------------------------------------
-function M.id2string (id)
-   --print("id2string:", disp.ast(id))
-   if id.tag == "Id" then id.tag = "String"; return id
-   elseif id.tag == "Splice" then
-      assert (mlp_meta.in_a_quote, "can't do id2string on an outermost splice")
-      error ("id2string on splice not implemented")
-      -- Evaluating id[1] will produce `Id{ xxx },
-      -- and we want it to produce `String{ xxx }
-      -- Morally, this is what I want:
-      -- return `String{ `Index{ `Splice{ id[1] }, `Number 1 } }
-      -- That is, without sugar:
-      return {tag="String",  {tag="Index", {tag="Splice", id[1] },
-                                           {tag="Number", 1 } } }
-   else error ("Identifier expected: "..table.tostring(id, 'nohash')) end
-end
+    --------------------------------------------------------------------------------
+    -- Mandatory reading of an id: causes an error if it can't read one.
+    --------------------------------------------------------------------------------
+    function M.id (lx)
+        return M.opt_id (lx) or gg.parse_error(lx,"Identifier expected")
+    end
 
---------------------------------------------------------------------------------
--- Read a string, possibly spliced, or return an error if it can't
---------------------------------------------------------------------------------
-function M.string (lx)
-   local a = lx:peek()
-   if lx:is_keyword (a, "-{") then
-      local v = splice(lx)
-      if v.tag ~= "String" and v.tag ~= "Splice" then
-         gg.parse_error(lx,"Bad string splice")
-      end
-      return v
-   elseif a.tag == "String" then return lx:next()
-   else error "String expected" end
-end
+    --------------------------------------------------------------------------------
+    -- Common helper function
+    --------------------------------------------------------------------------------
+    M.id_list = gg.list { primary = _M.id, separators = "," }
 
---------------------------------------------------------------------------------
--- Try to read a string, or return false if it can't. No splice allowed.
---------------------------------------------------------------------------------
-function M.opt_string (lx)
-   return lx:peek().tag == "String" and lx:next()
-end
-   
---------------------------------------------------------------------------------
--- Chunk reader: block + Eof
---------------------------------------------------------------------------------
-function M.skip_initial_sharp_comment (lx)
-   -- Dirty hack: I'm happily fondling lexer's private parts
-   -- FIXME: redundant with lexer:newstream()
-   lx :sync()
-   local i = lx.src:match ("^#.-\n()", lx.i)
-   if i then
-      lx.i = i
-      lx.column_offset = i
-      lx.line = lx.line and lx.line + 1 or 1
-   end
-end
+    --------------------------------------------------------------------------------
+    -- Converts an identifier into a string. Hopefully one day it'll handle
+    -- splices gracefully, but that proves quite tricky.
+    --------------------------------------------------------------------------------
+    function M.id2string (id)
+        --print("id2string:", disp.ast(id))
+        if id.tag == "Id" then id.tag = "String"; return id
+        elseif id.tag == "Splice" then
+            error ("id2string on splice not implemented")
+            -- Evaluating id[1] will produce `Id{ xxx },
+            -- and we want it to produce `String{ xxx }.
+            -- The following is the plain notation of:
+            -- +{ `String{ `Index{ `Splice{ -{id[1]} }, `Number 1 } } }
+            return { tag="String",  { tag="Index", { tag="Splice", id[1] },
+                                      { tag="Number", 1 } } }
+        else error ("Identifier expected: "..table.tostring(id, 'nohash')) end
+    end
 
-local function chunk (lx)
-   if lx:peek().tag == 'Eof' then
-       return { } -- handle empty files
-   else
-      M.skip_initial_sharp_comment (lx)
-      local chunk = mlp.block (lx)
-      if lx:peek().tag ~= "Eof" then
-          gg.parse_error(lx, "End-of-file expected")
-      end
-      return chunk
-   end
-end
+    --------------------------------------------------------------------------------
+    -- Read a string, possibly spliced, or return an error if it can't
+    --------------------------------------------------------------------------------
+    function M.string (lx)
+        local a = lx:peek()
+        if lx:is_keyword (a, "-{") then
+            local v = M.meta.splice(lx)
+            if v.tag ~= "String" and v.tag ~= "Splice" then
+                gg.parse_error(lx,"Bad string splice")
+            end
+            return v
+        elseif a.tag == "String" then return lx:next()
+        else error "String expected" end
+    end
 
--- chunk is wrapped in a sequence so that it has a "transformer" field.
-M.chunk = gg.sequence { chunk, builder = unpack }
+    --------------------------------------------------------------------------------
+    -- Try to read a string, or return false if it can't. No splice allowed.
+    --------------------------------------------------------------------------------
+    function M.opt_string (lx)
+        return lx:peek().tag == "String" and lx:next()
+    end
 
-return M
\ No newline at end of file
+    --------------------------------------------------------------------------------
+    -- Chunk reader: block + Eof
+    --------------------------------------------------------------------------------
+    function M.skip_initial_sharp_comment (lx)
+        -- Dirty hack: I'm happily fondling lexer's private parts
+        -- FIXME: redundant with lexer:newstream()
+        lx :sync()
+        local i = lx.src:match ("^#.-\n()", lx.i)
+        if i then
+            lx.i = i
+            lx.column_offset = i
+            lx.line = lx.line and lx.line + 1 or 1
+        end
+    end
+
+    local function chunk (lx)
+        if lx:peek().tag == 'Eof' then
+            return { } -- handle empty files
+        else
+            M.skip_initial_sharp_comment (lx)
+            local chunk = M.block (lx)
+            if lx:peek().tag ~= "Eof" then
+                gg.parse_error(lx, "End-of-file expected")
+            end
+            return chunk
+        end
+    end
+
+    -- chunk is wrapped in a sequence so that it has a "transformer" field.
+    M.chunk = gg.sequence { chunk, builder = unpack }
+
+    return M
+end
\ No newline at end of file
diff --git a/metalua/compiler/parser/stat.lua b/metalua/compiler/parser/stat.lua
index 846f56d..5d5e3a9 100644
--- a/metalua/compiler/parser/stat.lua
+++ b/metalua/compiler/parser/stat.lua
@@ -1,4 +1,4 @@
--------------------------------------------------------------------------------
+------------------------------------------------------------------------------
 -- Copyright (c) 2006-2013 Fabien Fleutot and others.
 --
 -- All rights reserved.
@@ -34,19 +34,9 @@
 -------------------------------------------------------------------------------
 
 local lexer    = require 'metalua.grammar.lexer'
-local mlp      = require 'metalua.compiler.parser.common'
-local mlp_misc = require 'metalua.compiler.parser.misc'
-local mlp_meta = require 'metalua.compiler.parser.meta'
-local annot    = require 'metalua.compiler.parser.annot'
 local gg       = require 'metalua.grammar.generator'
-local M        = { }
 
---------------------------------------------------------------------------------
--- eta-expansions to break circular dependency
---------------------------------------------------------------------------------
-local expr      = function (lx) return mlp.expr     (lx) end
-local func_val  = function (lx) return mlp.func_val (lx) end
-local expr_list = function (lx) return mlp.expr_list(lx) end
+local annot = require 'metalua.compiler.parser.annot.generator'
 
 --------------------------------------------------------------------------------
 -- List of all keywords that indicate the end of a statement block. Users are
@@ -54,233 +44,236 @@
 --------------------------------------------------------------------------------
 
 
-M.block_terminators = { "else", "elseif", "end", "until", ")", "}", "]" }
+return function(M)
+    local _M = gg.future(M)
 
--- FIXME: this must be handled from within GG!!!
-function M.block_terminators :add(x)
-   if type (x) == "table" then for _, y in ipairs(x) do self :add (y) end
-   else table.insert (self, x) end
-end
+    M.block_terminators = { "else", "elseif", "end", "until", ")", "}", "]" }
 
---------------------------------------------------------------------------------
--- list of statements, possibly followed by semicolons
---------------------------------------------------------------------------------
-M.block = gg.list {
-   name        = "statements block",
-   terminators = M.block_terminators,
-   primary     = function (lx)
-      -- FIXME use gg.optkeyword()
-      local x = mlp.stat (lx)
-      if lx:is_keyword (lx:peek(), ";") then lx:next() end
-      return x
-   end }
-
---------------------------------------------------------------------------------
--- Helper function for "return <expr_list>" parsing.
--- Called when parsing return statements.
--- The specific test for initial ";" is because it's not a block terminator,
--- so without it gg.list would choke on "return ;" statements.
--- We don't make a modified copy of block_terminators because this list
--- is sometimes modified at runtime, and the return parser would get out of
--- sync if it was relying on a copy.
---------------------------------------------------------------------------------
-local return_expr_list_parser = gg.multisequence{
-   { ";" , builder = function() return { } end },
-   default = gg.list {
-      expr, separators = ",", terminators = M.block_terminators } }
-
-
-local for_vars_list = gg.list{
-    name        = "for variables list",
-    primary     = mlp_misc.id,
-    separators  = ",",
-    terminators = "in" }
-
---------------------------------------------------------------------------------
--- for header, between [for] and [do] (exclusive).
--- Return the `Forxxx{...} AST, without the body element (the last one).
---------------------------------------------------------------------------------
-function M.for_header (lx)
-    local vars = mlp.id_list(lx)
-    if lx :is_keyword (lx:peek(), "=") then
-        if #vars ~= 1 then
-            gg.parse_error (lx, "numeric for only accepts one variable")
-        end
-        lx:next() -- skip "="
-        local exprs = mlp.expr_list (lx)
-        if #exprs < 2 or #exprs > 3 then
-            gg.parse_error (lx, "numeric for requires 2 or 3 boundaries")
-        end
-        return { tag="Fornum", vars[1], unpack (exprs) }
-    else
-        if not lx :is_keyword (lx :next(), "in") then
-            gg.parse_error (lx, '"=" or "in" expected in for loop')
-        end
-        local exprs = mlp.expr_list (lx)
-        return { tag="Forin", vars, exprs }
+    -- FIXME: this must be handled from within GG!!!
+    -- FIXME: there's no :add method in the list anyway. Added by gg.list?!
+    function M.block_terminators :add(x)
+        if type (x) == "table" then for _, y in ipairs(x) do self :add (y) end
+        else table.insert (self, x) end
     end
-end
 
---------------------------------------------------------------------------------
--- Function def parser helper: id ( . id ) *
---------------------------------------------------------------------------------
-local function fn_builder (list)
-   local acc = list[1]
-   local first = acc.lineinfo.first
-   for i = 2, #list do
-       local index = mlp.id2string(list[i])
-       local li = lexer.new_lineinfo(first, index.lineinfo.last)
-       acc = { tag="Index", acc, index, lineinfo=li }
-   end
-   return acc
-end
-local func_name = gg.list{ mlp_misc.id, separators = ".", builder = fn_builder }
+    ----------------------------------------------------------------------------
+    -- list of statements, possibly followed by semicolons
+    ----------------------------------------------------------------------------
+    M.block = gg.list {
+        name        = "statements block",
+        terminators = M.block_terminators,
+        primary     = function (lx)
+            -- FIXME use gg.optkeyword()
+            local x = M.stat (lx)
+            if lx:is_keyword (lx:peek(), ";") then lx:next() end
+            return x
+        end }
 
---------------------------------------------------------------------------------
--- Function def parser helper: ( : id )?
---------------------------------------------------------------------------------
-local method_name = gg.onkeyword{ name = "method invocation", ":", mlp_misc.id,
-   transformers = { function(x) return x and x.tag=='Id' and mlp_misc.id2string(x) end } }
-
---------------------------------------------------------------------------------
--- Function def builder
---------------------------------------------------------------------------------
-local function funcdef_builder(x)
-
-   local name, method, func = unpack(x)
-
-   if method then
-      name = { tag="Index", name, method, lineinfo = {
-         first = name.lineinfo.first,
-         last  = method.lineinfo.last } }
-      table.insert (func[1], 1, {tag="Id", "self"})
-   end
-   local r = { tag="Set", {name}, {func} }
-   r[1].lineinfo = name.lineinfo
-   r[2].lineinfo = func.lineinfo
-   return r
-end
+    ----------------------------------------------------------------------------
+    -- Helper function for "return <expr_list>" parsing.
+    -- Called when parsing return statements.
+    -- The specific test for initial ";" is because it's not a block terminator,
+    -- so without it gg.list would choke on "return ;" statements.
+    -- We don't make a modified copy of block_terminators because this list
+    -- is sometimes modified at runtime, and the return parser would get out of
+    -- sync if it was relying on a copy.
+    ----------------------------------------------------------------------------
+    local return_expr_list_parser = gg.multisequence{
+        { ";" , builder = function() return { } end },
+        default = gg.list {
+            _M.expr, separators = ",", terminators = M.block_terminators } }
 
 
---------------------------------------------------------------------------------
--- if statement builder
---------------------------------------------------------------------------------
-local function if_builder (x)
-   local cond_block_pairs, else_block, r = x[1], x[2], {tag="If"}
-   local n_pairs = #cond_block_pairs
-   for i = 1, n_pairs do
-       local cond, block = unpack(cond_block_pairs[i])
-       r[2*i-1], r[2*i] = cond, block
-   end
-   if else_block then table.insert(r, #r+1, else_block) end
-   return r
-end 
+    local for_vars_list = gg.list{
+        name        = "for variables list",
+        primary     = _M.id,
+        separators  = ",",
+        terminators = "in" }
 
---------------------------------------------------------------------------------
--- produce a list of (expr,block) pairs
---------------------------------------------------------------------------------
-local elseifs_parser = gg.list {
-   gg.sequence { expr, "then", M.block , name='elseif parser' },
-   separators  = "elseif",
-   terminators = { "else", "end" }
-}
+    ----------------------------------------------------------------------------
+    -- for header, between [for] and [do] (exclusive).
+    -- Return the `Forxxx{...} AST, without the body element (the last one).
+    ----------------------------------------------------------------------------
+    function M.for_header (lx)
+        local vars = M.id_list(lx)
+        if lx :is_keyword (lx:peek(), "=") then
+            if #vars ~= 1 then
+                gg.parse_error (lx, "numeric for only accepts one variable")
+            end
+            lx:next() -- skip "="
+            local exprs = M.expr_list (lx)
+            if #exprs < 2 or #exprs > 3 then
+                gg.parse_error (lx, "numeric for requires 2 or 3 boundaries")
+            end
+            return { tag="Fornum", vars[1], unpack (exprs) }
+        else
+            if not lx :is_keyword (lx :next(), "in") then
+                gg.parse_error (lx, '"=" or "in" expected in for loop')
+            end
+            local exprs = M.expr_list (lx)
+            return { tag="Forin", vars, exprs }
+        end
+    end
 
-local annot_expr = gg.sequence {
-    expr,
-    gg.onkeyword{ "#", annot.tf },
-    builder = function(x) 
-                  local e, a = unpack(x)
-                  if a then return { tag='Annot', e, a }
-                  else return e end
-              end }
+    ----------------------------------------------------------------------------
+    -- Function def parser helper: id ( . id ) *
+    ----------------------------------------------------------------------------
+    local function fn_builder (list)
+        local acc = list[1]
+        local first = acc.lineinfo.first
+        for i = 2, #list do
+            local index = M.id2string(list[i])
+            local li = lexer.new_lineinfo(first, index.lineinfo.last)
+            acc = { tag="Index", acc, index, lineinfo=li }
+        end
+        return acc
+    end
+    local func_name = gg.list{ _M.id, separators = ".", builder = fn_builder }
 
-local annot_expr_list = gg.list {
-    primary = annot.opt(expr, 'tf'), separators = ',' }
+    ----------------------------------------------------------------------------
+    -- Function def parser helper: ( : id )?
+    ----------------------------------------------------------------------------
+    local method_name = gg.onkeyword{ name = "method invocation", ":", _M.id,
+        transformers = { function(x) return x and x.tag=='Id' and M.id2string(x) end } }
 
---------------------------------------------------------------------------------
--- assignments and calls: statements that don't start with a keyword
---------------------------------------------------------------------------------
-local function assign_or_call_stat_parser (lx)
-   local e = annot_expr_list (lx)
-   local a = lx:is_keyword(lx:peek())
-   local op = a and mlp.stat.assignments[a]
-   -- TODO: refactor annotations
-   if op then
-      --FIXME: check that [e] is a LHS
-      lx :next()
-      local annots
-      e, annots = annot.split(e)
-      local v = mlp.expr_list (lx)
-      if type(op)=="string" then return { tag=op, e, v, annots }
-      else return op (e, v) end
-   else
-      assert (#e > 0)
-      if #e > 1 then
-         gg.parse_error (lx,
-            "comma is not a valid statement separator; statement can be "..
-            "separated by semicolons, or not separated at all") end
-      if e[1].tag ~= "Call" and e[1].tag ~= "Invoke" then
-         local typename
-         if e[1].tag == 'Id' then
-            typename = '("'..e[1][1]..'") is an identifier'
-         elseif e[1].tag == 'Op' then
-            typename = "is an arithmetic operation"
-         else typename = "is of type '"..(e[1].tag or "<list>").."'" end
+    ----------------------------------------------------------------------------
+    -- Function def builder
+    ----------------------------------------------------------------------------
+    local function funcdef_builder(x)
+        local name, method, func = unpack(x)
+        if method then
+            name = { tag="Index", name, method,
+                     lineinfo = {
+                         first = name.lineinfo.first,
+                         last  = method.lineinfo.last } }
+            table.insert (func[1], 1, {tag="Id", "self"})
+        end
+        local r = { tag="Set", {name}, {func} }
+        r[1].lineinfo = name.lineinfo
+        r[2].lineinfo = func.lineinfo
+        return r
+    end
 
-         gg.parse_error (lx,
-                         "This expression %s; "..
-                         "a statement was expected, and only function and method call "..
-                         "expressions can be used as statements", typename);
-      end
-      return e[1]
-   end
-end
 
-M.local_stat_parser = gg.multisequence{
-    -- local function <name> <func_val>
-    { "function", mlp_misc.id, func_val, builder =
-      function(x)
-          local vars = { x[1], lineinfo = x[1].lineinfo }
-          local vals = { x[2], lineinfo = x[2].lineinfo }
-          return { tag="Localrec", vars, vals }
-      end },
-    -- local <id_list> ( = <expr_list> )?
-    default = gg.sequence{
-        gg.list{
-            primary = annot.opt(mlp_misc.id, 'tf'),
-            separators = ',' },
-        gg.onkeyword{ "=", expr_list },
+    ----------------------------------------------------------------------------
+    -- if statement builder
+    ----------------------------------------------------------------------------
+    local function if_builder (x)
+        local cond_block_pairs, else_block, r = x[1], x[2], {tag="If"}
+        local n_pairs = #cond_block_pairs
+        for i = 1, n_pairs do
+            local cond, block = unpack(cond_block_pairs[i])
+            r[2*i-1], r[2*i] = cond, block
+        end
+        if else_block then table.insert(r, #r+1, else_block) end
+        return r
+    end
+
+    --------------------------------------------------------------------------------
+    -- produce a list of (expr,block) pairs
+    --------------------------------------------------------------------------------
+    local elseifs_parser = gg.list {
+        gg.sequence { _M.expr, "then", _M.block , name='elseif parser' },
+        separators  = "elseif",
+        terminators = { "else", "end" }
+    }
+
+    local annot_expr = gg.sequence {
+        _M.expr,
+        gg.onkeyword{ "#", gg.future(M, 'annot').tf },
         builder = function(x)
-            local annotated_left, right = unpack(x)
-            local left, annotations = annot.split(annotated_left)
-            return {tag="Local", left, right or { }, annotations }
-        end } }
+            local e, a = unpack(x)
+            if a then return { tag='Annot', e, a }
+            else return e end
+        end }
 
---------------------------------------------------------------------------------
--- statement
---------------------------------------------------------------------------------
-M.stat = gg.multisequence {
-   name = "statement",
-   { "do", M.block, "end", builder =
-      function (x) return { tag="Do", unpack (x[1]) } end },
-   { "for", M.for_header, "do", M.block, "end", builder =
-      function (x) x[1][#x[1]+1] = x[2]; return x[1] end },
-   { "function", func_name, method_name, func_val, builder=funcdef_builder },
-   { "while", expr, "do", M.block, "end", builder = "While" },
-   { "repeat", M.block, "until", expr, builder = "Repeat" },
-   { "local", M.local_stat_parser, builder = unpack },
-   { "return", return_expr_list_parser, builder =
-     function(x) x[1].tag='Return'; return x[1] end },
-   { "break", builder = function() return { tag="Break" } end },
-   { "-{", mlp_meta.splice_content, "}", builder = unpack },
-   { "if", gg.nonempty(elseifs_parser), gg.onkeyword{ "else", M.block }, "end",
-     builder = if_builder },
-   default = assign_or_call_stat_parser }
+    local annot_expr_list = gg.list {
+        primary = annot.opt(M, _M.expr, 'tf'), separators = ',' }
 
-M.stat.assignments = {
-   ["="] = "Set"
-}
+    ------------------------------------------------------------------------
+    -- assignments and calls: statements that don't start with a keyword
+    ------------------------------------------------------------------------
+    local function assign_or_call_stat_parser (lx)
+        local e = annot_expr_list (lx)
+        local a = lx:is_keyword(lx:peek())
+        local op = a and M.assignments[a]
+        -- TODO: refactor annotations
+        if op then
+            --FIXME: check that [e] is a LHS
+            lx :next()
+            local annots
+            e, annots = annot.split(e)
+            local v = M.expr_list (lx)
+            if type(op)=="string" then return { tag=op, e, v, annots }
+            else return op (e, v) end
+        else
+            assert (#e > 0)
+            if #e > 1 then
+                gg.parse_error (lx,
+                    "comma is not a valid statement separator; statement can be "..
+                    "separated by semicolons, or not separated at all")
+            elseif e[1].tag ~= "Call" and e[1].tag ~= "Invoke" then
+                local typename
+                if e[1].tag == 'Id' then
+                    typename = '("'..e[1][1]..'") is an identifier'
+                elseif e[1].tag == 'Op' then
+                    typename = "is an arithmetic operation"
+                else typename = "is of type '"..(e[1].tag or "<list>").."'" end
+                gg.parse_error (lx,
+                     "This expression %s; "..
+                     "a statement was expected, and only function and method call "..
+                     "expressions can be used as statements", typename);
+            end
+            return e[1]
+        end
+    end
 
-function M.stat.assignments:add(k, v) self[k] = v end
+    M.local_stat_parser = gg.multisequence{
+        -- local function <name> <func_val>
+        { "function", _M.id, _M.func_val, builder =
+          function(x)
+              local vars = { x[1], lineinfo = x[1].lineinfo }
+              local vals = { x[2], lineinfo = x[2].lineinfo }
+              return { tag="Localrec", vars, vals }
+          end },
+        -- local <id_list> ( = <expr_list> )?
+        default = gg.sequence{
+            gg.list{
+                primary = annot.opt(M, _M.id, 'tf'),
+                separators = ',' },
+            gg.onkeyword{ "=", _M.expr_list },
+            builder = function(x)
+                 local annotated_left, right = unpack(x)
+                 local left, annotations = annot.split(annotated_left)
+                 return {tag="Local", left, right or { }, annotations }
+             end } }
 
-return M
\ No newline at end of file
+    ------------------------------------------------------------------------
+    -- statement
+    ------------------------------------------------------------------------
+    M.stat = gg.multisequence {
+        name = "statement",
+        { "do", _M.block, "end", builder =
+          function (x) return { tag="Do", unpack (x[1]) } end },
+        { "for", _M.for_header, "do", _M.block, "end", builder =
+          function (x) x[1][#x[1]+1] = x[2]; return x[1] end },
+        { "function", func_name, method_name, _M.func_val, builder=funcdef_builder },
+        { "while", _M.expr, "do", _M.block, "end", builder = "While" },
+        { "repeat", _M.block, "until", _M.expr, builder = "Repeat" },
+        { "local", _M.local_stat_parser, builder = unpack },
+        { "return", return_expr_list_parser, builder =
+          function(x) x[1].tag='Return'; return x[1] end },
+        { "break", builder = function() return { tag="Break" } end },
+        { "-{", gg.future(M, 'meta').splice_content, "}", builder = unpack },
+        { "if", gg.nonempty(elseifs_parser), gg.onkeyword{ "else", M.block }, "end",
+          builder = if_builder },
+        default = assign_or_call_stat_parser }
+
+    M.assignments = {
+        ["="] = "Set"
+    }
+
+    function M.assignments:add(k, v) self[k] = v end
+
+    return M
+end
\ No newline at end of file
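The `return function(M) ... return M end` wrapper introduced above is what makes the parser "not a unique object anymore": each sub-module receives a parser table `M`, registers its rules on it, and hands it back. A minimal sketch of the pattern in plain Lua (the module body here is hypothetical, not Metalua's actual code):

```lua
-- Minimal sketch of the "module as function(M)" pattern used above:
-- the sub-module mutates only the M it is given, so two parser
-- instances built this way never share state.
local function stat_module(M)
    function M.break_stat() return { tag = "Break" } end
    return M
end

local parser_a = stat_module({ })
local parser_b = stat_module({ })
print(parser_a.break_stat().tag)  -- "Break"
print(parser_a == parser_b)       -- false: independent instances
```

This is why no bootstrap step is needed: requiring a sub-module has no global side effect until it is applied to a concrete `M`.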
diff --git a/metalua/compiler/parser/table.lua b/metalua/compiler/parser/table.lua
index 373a192..11102d9 100644
--- a/metalua/compiler/parser/table.lua
+++ b/metalua/compiler/parser/table.lua
@@ -20,9 +20,9 @@
 --------------------------------------------------------------------------------
 --
 -- Exported API:
--- * [M.bracket_field()]
--- * [M.field()]
--- * [M.content()]
+-- * [M.table_bracket_field()]
+-- * [M.table_field()]
+-- * [M.table_content()]
 -- * [M.table()]
 --
 -- KNOWN BUG: doesn't handle final ";" or "," before final "}"
@@ -30,54 +30,48 @@
 --------------------------------------------------------------------------------
 
 local gg  = require 'metalua.grammar.generator'
-local mlp = require 'metalua.compiler.parser.common'
 
-local M = { }
+return function(M)
 
---------------------------------------------------------------------------------
--- eta expansion to break circular dependencies:
---------------------------------------------------------------------------------
-local function _expr (lx) return mlp.expr(lx) end
+    M.table = { }
+    local _table = gg.future(M.table)
+    local _expr  = gg.future(M).expr
 
---------------------------------------------------------------------------------
--- [[key] = value] table field definition
---------------------------------------------------------------------------------
-M.bracket_field = gg.sequence{ "[", _expr, "]", "=", _expr, builder = "Pair" }
+    --------------------------------------------------------------------------------
+    -- `[key] = value` table field definition
+    --------------------------------------------------------------------------------
+    M.table.bracket_pair = gg.sequence{ "[", _expr, "]", "=", _expr, builder = "Pair" }
 
---------------------------------------------------------------------------------
--- [id = value] or [value] table field definition;
--- [[key]=val] are delegated to [bracket_field()]
---------------------------------------------------------------------------------
-function M.field (lx)
-   if lx :is_keyword (lx :peek(), "[") then return M.bracket_field (lx) end
-   local e = _expr (lx)
-   if lx :is_keyword (lx :peek(), "=") then
-      lx :next(); -- skip the "="
-      -- Allowing only the right type of key, here `Id
-      local etag = e.tag
-      if etag ~= 'Id' then
-         gg.parse_error(lx, 'Identifier expected, got %s.', etag)
-      end
-      local key = mlp.id2string(e)
-      local val = _expr(lx)
-      local r = { tag="Pair", key, val }
-      r.lineinfo = { first = key.lineinfo.first, last = val.lineinfo.last }
-      return r
-   else return e end
-end
+    --------------------------------------------------------------------------------
+    -- table element parser: list value, `id = value` pair or `[key] = value` pair.
+    --------------------------------------------------------------------------------
+    function M.table.element (lx)
+        if lx :is_keyword (lx :peek(), "[") then return M.table.bracket_pair(lx) end
+        local e = M.expr (lx)
+        if not lx :is_keyword (lx :peek(), "=") then return e end
+        lx :next(); -- skip the "="
+        local key = M.id2string(e) -- will fail on non-identifiers
+        local val = M.expr(lx)
+        local r = { tag="Pair", key, val }
+        r.lineinfo = { first = key.lineinfo.first, last = val.lineinfo.last }
+        return r
+    end
 
---------------------------------------------------------------------------------
--- table constructor, without enclosing braces; returns a full table object
---------------------------------------------------------------------------------
-M.content  = gg.list {
-   primary     =  function(...) return M.field(...) end,
-   separators  = { ",", ";" },
-   terminators = "}",
-   builder     = "Table" }
+    -----------------------------------------------------------------------------
+    -- table constructor, without enclosing braces; returns a full table object
+    -----------------------------------------------------------------------------
+    M.table.content  = gg.list {
+        -- eta expansion to allow patching the element definition
+        primary     =  _table.element,
+        separators  = { ",", ";" },
+        terminators = "}",
+        builder     = "Table" }
 
---------------------------------------------------------------------------------
--- complete table constructor including [{...}]
---------------------------------------------------------------------------------
-M.table = gg.sequence{ "{", function(...) return M.content(...) end, "}", builder = unpack }
+    --------------------------------------------------------------------------------
+    -- complete table constructor including [{...}]
+    --------------------------------------------------------------------------------
+    -- TODO beware, stat and expr use only table.content, this can't be patched.
+    M.table.table = gg.sequence{ "{", _table.content, "}", builder = unpack }
 
-return M
+    return M
+end
\ No newline at end of file
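The `gg.future(M)` calls above replace the old eta-expansion closures such as `function(lx) return mlp.expr(lx) end`. The real implementation lives in `metalua/grammar/generator.lua`, which is not part of this diff; the following is only a plausible sketch of the late-binding idea:

```lua
-- Hypothetical sketch of a late-binding proxy like gg.future(M):
-- accessing a field returns a function that looks up module[field]
-- only when called, so M.expr may be defined *after* the proxy
-- reference is taken -- breaking circular module dependencies.
local function future(module)
    return setmetatable({ }, {
        __index = function(_, field)
            return function(...) return module[field](...) end
        end })
end

local M  = { }
local _M = future(M)
local deferred = _M.expr          -- M.expr does not exist yet
function M.expr(x) return x * 2 end
print(deferred(21))               -- resolved at call time
```

Under this reading, `_table.element` in the list above stays patchable: extensions can overwrite `M.table.element` and the already-built `gg.list` picks up the new definition.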
diff --git a/metalua/extension/clist.mlua b/metalua/extension/clist.mlua
deleted file mode 100644
index 14cfde8..0000000
--- a/metalua/extension/clist.mlua
+++ /dev/null
@@ -1,164 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
---
--- This extension implements list comprehensions, similar to Haskell and
--- Python syntax, to easily describe lists.
---
--- * x[a ... b] is the list { x[a], x[a+1], ..., x[b] }
--- * { f()..., b } contains all the elements returned by f(), then b
---   (allows to expand list fields other than the last one)
--- * list comprehensions a la python, with "for" and "if" suffixes:
---   {i+10*j for i=1,3 for j=1,3 if i~=j} is { 21, 31, 12, 32, 13, 23 }
---
--------------------------------------------------------------------------------
-
--{ extension ("match", ...) }
-
-local gg  = require 'metalua.grammar.generator'
-local mlp = require 'metalua.compiler.parser'
-local mlp_table = require 'metalua.compiler.parser.table'
-
-local function dots_builder (x) return `Dots{ x } end
-
-local function for_builder (x, h)
-   match x with
-   | `Comp{ _, acc } -> table.insert (acc, h[1]); return x
-   | `Pair{ _, _ }   -> error "No explicit key in a for list generator"
-   |  _              -> return `Comp{ x, {h[1]} }
-   end
-end
-
-local function if_builder (x, p)
-   match x with
-   | `Comp{ _, acc } -> table.insert (acc, `If{ p[1] }); return x
-   | `Pair{ _, _ }   -> error "No explicit key in a list guard"
-   |  _              -> return `Comp{ x, p[1] }
-   end
-end
-
-local function comp_builder(core, list, no_unpack)
-   -- [ti] = temp var holding table.insert
-   -- [v]  = variable holding the table being built
-   -- [r]  = the core of the list being built
-   local ti, v, r = mlp.gensym "table_insert", mlp.gensym "table"
-
-   -----------------------------------------------------------------------------
-   -- 1 - Build the loop's core: if it has suffix "...", every elements of the
-   --     multi-return must be inserted, hence the extra [for] loop.
-   -----------------------------------------------------------------------------
-   match core with
-   | `Dots{ x } -> 
-      local w = mlp.gensym()
-      r = +{stat: for _, -{w} in pairs( -{x} ) do -{ `Call{ ti, v, w } } end }
-   | `Pair{ k, w } ->
-      r = `Set{ { `Index{ v, k } }, { w } }
-   |  _ -> r = `Call{ ti, v, core }
-   end
-
-   -----------------------------------------------------------------------------
-   -- 2 - Stack the if and for control structures, from outside to inside.
-   --     This is done in a destructive way for the elements of [list].
-   -----------------------------------------------------------------------------
-   for i = #list, 1, -1 do
-      table.insert (list[i], {r})
-      r = list[i]
-   end
-   if no_unpack then
-      return `Stat{ { `Local{ {ti, v}, { +{table.insert}, `Table} }, r }, v }
-   else
-      return +{ function() 
-                   local -{ti}, -{v} = table.insert, { }
-                   -{r}; return unpack(-{v}) 
-                end () }
-   end
-end
-
-local function table_content_builder (list)
-   match list with
-   | { `Comp{ y, acc } } -> return comp_builder( y, acc, "no unpack")
-   | _ ->
-      local tables = { `Table }
-      local ctable = tables[1]
-      local function flush() ctable=`Table; table.insert(tables, ctable) end
-      for _, x in pairs(list) do
-         match x with
-         | `Comp{ y, acc } -> table.insert(ctable, comp_builder(y, acc)); flush()
-         | `Dots{ y }      -> table.insert(ctable, y); flush()
-         |  _              -> table.insert (ctable, x); 
-         end
-      end
-      match tables with
-      | { x } | { x, { } } -> return x
-      | _ ->
-         if #tables[#tables]==0 then table.remove(tables) end --suppress empty table      
-         return `Call{ +{table.cat}, unpack(tables) }
-      end
-   end
-end
-
-mlp_table.field = gg.expr{ name="table cell",
-   primary = mlp_table.field,
-   suffix  = { name="table cell suffix",
-      { "...",                 builder = dots_builder },
-      { "for", mlp.for_header, builder = for_builder  },
-      { "if",  mlp.expr,       builder = if_builder   } } }
-
-mlp_table.content.builder = table_content_builder
-
---[[
-mlp.stat:add{ "for", gg.expr {
-      primary = for_header,
-      suffix = {
-         { "for", mlp.for_header, builder = for_builder  },
-         { "if",  mlp.expr,       builder = if_builder   } } }, 
-   "do", mlp.block, "end", builder = for_stat_builder }
---]]
-
---------------------------------------------------------------------------------
--- Back-end for improved index operator.
---------------------------------------------------------------------------------
-local function index_builder(a, suffix)
-   match suffix[1] with
-   -- Single index, no range: keep the native semantics
-   | { { e, false } } -> return `Index{ a, e }
-   -- Either a range, or multiple indexes, or both
-   | ranges ->
-      local r = `Call{ +{table.isub}, a }
-      local function acc (x,y) table.insert (r,x); table.insert (r,y) end
-      for _, seq in ipairs (ranges) do
-         match seq with
-         | { e, false } -> acc(e,e)
-         | { e, f }     -> acc(e,f)
-         end
-      end
-      return r
-   end
-end
-
---------------------------------------------------------------------------------
--- Improved "[...]" index operator:
---  * support for multi-indexes ("foo[bar, gnat]")
---  * support for ranges ("foo[bar ... gnat]")
---------------------------------------------------------------------------------
-mlp.expr.suffix:del '['
-mlp.expr.suffix:add{ name="table index/range",
-   "[", gg.list{
-      gg.sequence { mlp.expr, gg.onkeyword{ "...", mlp.expr } } , 
-      separators = { ",", ";" } }, 
-   "]", builder = index_builder }
diff --git a/metalua/extension/comprehension.mlua b/metalua/extension/comprehension.mlua
new file mode 100644
index 0000000..8917b9a
--- /dev/null
+++ b/metalua/extension/comprehension.mlua
@@ -0,0 +1,282 @@
+-------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+-------------------------------------------------------------------------------
+--
+-- This extension implements list comprehensions, with a syntax similar to
+-- Haskell's and Python's, to concisely describe lists.
+--
+-- * x[a ... b] is the list { x[a], x[a+1], ..., x[b] }
+-- * { f()..., b } contains all the elements returned by f(), then b
+--   (allows expanding list fields other than the last one)
+-- * list comprehensions a la python, with "for" and "if" suffixes:
+--   {i+10*j for i=1,3 for j=1,3 if i~=j} is { 21, 31, 12, 32, 13, 23 }
+--
+-------------------------------------------------------------------------------
+
+-{ extension ("match", ...) }
+
+local SUPPORT_IMPROVED_LOOPS   = true
+local SUPPORT_IMPROVED_INDEXES = false -- depends on deprecated table.isub
+local SUPPORT_CONTINUE         = true
+local SUPPORT_COMP_LISTS       = true
+
+assert (SUPPORT_IMPROVED_LOOPS or not SUPPORT_CONTINUE,
+        "Can't support 'continue' without improved loop headers")
+
+local gg  = require 'metalua.grammar.generator'
+local Q   = require 'metalua.treequery'
+
+local function dots_list_suffix_builder (x) return `DotsSuffix{ x } end
+
+local function for_list_suffix_builder (list_element, suffix)
+    local new_header = suffix[1]
+    match list_element with
+    | `Comp{ _, acc } -> table.insert (acc, new_header); return list_element
+    |  _ -> return `Comp{ list_element, { new_header } }
+    end
+end
+
+local function if_list_suffix_builder (list_element, suffix)
+    local new_header = `If{ suffix[1] }
+    match list_element with
+    | `Comp{ _, acc } -> table.insert (acc, new_header); return list_element
+    |  _ -> return `Comp{ list_element, { new_header } }
+    end
+end
+
+-- Builds a statement which adds a table element to a table `t`, using,
+-- where insertion is needed, an alias `tinsert` for `table.insert`.
+-- @param core the part around which the loops are built.
+--   either `DotsSuffix{ expr }, `Pair{ key, value } or a plain expression
+-- @param list comprehension suffixes, in the order in which they appear:
+--   either `Forin{ ... }, `Fornum{ ... } or `If{ ... }. In each case, the
+--   last child node, the body, is still missing.
+-- @param t a variable containing the table to fill
+-- @param tinsert a variable containing `table.insert`.
+--
+-- @return a statement which fills the empty table `t` with the denoted element
+local function comp_list_builder(core, list, t, tinsert)
+    local filler
+    -- 1 - Build the loop's core: if it has suffix "...", every element of the
+    --     multi-return must be inserted, hence the extra [for] loop.
+    match core with
+    | `DotsSuffix{ element } ->
+        local x = gg.gensym()
+        filler = +{stat: for _, -{x} in pairs{ -{element} } do (-{tinsert})(-{t}, -{x}) end }
+    | `Pair{ key, value } ->
+        --filler = +{ -{t}[-{key}] = -{value} }
+        filler = `Set{ { `Index{ t, key } }, { value } }
+    |  _ -> filler = +{ (-{tinsert})(-{t}, -{core}) }
+    end
+
+    -- 2 - Stack the `if` and `for` control structures, from outside to inside.
+    --     This is done in a destructive way for the elements of [list].
+    for i = #list, 1, -1 do
+        table.insert (list[i], {filler})
+        filler = list[i]
+    end
+
+    return filler
+end
+
+local function table_content_builder (list)
+    local special = false -- Does the table need a special builder?
+    for _, element in ipairs(list) do
+        local etag = element.tag
+        if etag=='Comp' or etag=='DotsSuffix' then special=true; break end
+    end
+    if not special then list.tag='Table'; return list end
+
+    local t, tinsert = gg.gensym 'table', gg.gensym 'table_insert'
+    local filler_block = { +{stat: local -{t}, -{tinsert} = { }, table.insert } }
+    for _, element in ipairs(list) do
+        local filler
+        match element with
+        | `Comp{ core, comp } -> filler = comp_list_builder(core, comp, t, tinsert)
+        | _ -> filler = comp_list_builder(element, { }, t, tinsert)
+        end
+        table.insert(filler_block, filler)
+    end
+    return `Stat{ filler_block, t }
+end
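As an illustration, a comprehension such as `{ i*i for i=1,10 }` is turned by `table_content_builder` into a `` `Stat{ filler_block, t } `` block-expression. A plain-Lua sketch of the result (variable names are gensym-generated in practice, and the `Stat` node is a block-expression rather than a literal closure):

```lua
-- Rough plain-Lua equivalent of the generated code:
local squares = (function()
    local t, tinsert = { }, table.insert   -- gensym'd in the real output
    for i = 1, 10 do tinsert(t, i*i) end   -- one loop per "for" suffix
    return t
end)()
```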
+
+
+--------------------------------------------------------------------------------
+-- Back-end for improved index operator.
+local function index_builder(a, suffix)
+   match suffix[1] with
+   -- Single index, no range: keep the native semantics
+   | { { e, false } } -> return `Index{ a, e }
+   -- Either a range, or multiple indexes, or both
+   | ranges ->
+      local r = `Call{ +{table.isub}, a }
+      local function acc (x,y) table.insert (r,x); table.insert (r,y) end
+      for _, seq in ipairs (ranges) do
+         match seq with
+         | { e, false } -> acc(e,e)
+         | { e, f }     -> acc(e,f)
+         end
+      end
+      return r
+   end
+end
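A sketch of what `index_builder` produces for each bracket form (note that `table.isub` is assumed to be provided by the Metalua runtime, since the builder splices `+{table.isub}`):

```lua
-- With the extension loaded, index suffixes compile roughly as:
--   foo[2]        -- single index, no range -> native `Index{ foo, 2 }
--   foo[2 ... 4]  -- range                  -> table.isub(foo, 2, 4)
--   foo[2, 7]     -- multiple indexes       -> table.isub(foo, 2, 2, 7, 7)
-- i.e. each plain index e contributes the pair (e, e), each range (e, f).
```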
+
+-------------------------------------------------------------------
+-- Find continue statements in a loop body, change them into goto
+-- end-of-body.
+local function transform_continue_statements(body)
+   local continue_statements = Q(body)
+       :if_unknown() -- tolerate unknown 'Continue' statements
+       :not_under ('Forin', 'Fornum', 'While', 'Repeat')
+       :filter ('Continue')
+       :list()
+   if next(continue_statements) then
+       local continue_label = gg.gensym 'continue' [1]
+       table.insert(body, `Label{ continue_label })
+       for _, statement in ipairs(continue_statements) do
+           statement.tag = 'Goto'
+           statement[1] = continue_label
+       end
+       return true
+   else return false end
+end
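A sketch of the rewriting performed above, with hypothetical helpers `skip` and `work`; only `continue` statements belonging to this loop (not to nested loops, thanks to `:not_under`) are turned into gotos, and the label is appended at the end of the body:

```lua
-- source, with SUPPORT_CONTINUE enabled:
--     for i = 1, 10 do
--         if skip(i) then continue end
--         work(i)
--     end
-- becomes (label name is gensym-generated):
for i = 1, 10 do
    if skip(i) then goto continue end
    work(i)
    ::continue::
end
```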
+
+-------------------------------------------------------------------------------
+-- Back-end for loops with a multi-element header
+local function loop_builder(x)
+   local first, elements, body = unpack(x)
+
+   -- Change continue statements into gotos.
+   if SUPPORT_CONTINUE then transform_continue_statements(body) end
+
+   -------------------------------------------------------------------
+   -- If it's a regular loop, don't bloat the code
+   if not next(elements) then
+      table.insert(first, body)
+      return first
+   end
+
+   -------------------------------------------------------------------
+   -- There's no reason to treat the first element in a special way
+   table.insert(elements, 1, first)
+
+   -------------------------------------------------------------------
+   -- Change breaks into gotos that escape all loops at once.
+   local exit_label = nil
+   local function break_to_goto(break_node)
+       if not exit_label then exit_label = gg.gensym 'break' [1] end
+       break_node = break_node or { }
+       break_node.tag = 'Goto'
+       break_node[1] = exit_label
+       return break_node
+   end
+   Q(body)
+       :not_under('Function', 'Forin', 'Fornum', 'While', 'Repeat')
+       :filter('Break')
+       :foreach (break_to_goto)
+
+   -------------------------------------------------------------------
+   -- Compile all header elements, from last to first.
+   -- invariant: `body` is a block (not a statement)
+   local result = body
+   for i = #elements, 1, -1 do
+      local e = elements[i]
+      match e with
+      | `If{ cond }    ->
+         result = { `If{ cond, result } }
+      | `Until{ cond } ->
+         result = +{block: if -{cond} then -{break_to_goto()} else -{result} end }
+      | `While{ cond } ->
+         if i==1 then result = { `While{ cond, result } } -- top-level while
+         else result = +{block: if -{cond} then -{result} else -{break_to_goto()} end } end
+      | `Forin{ ... } | `Fornum{ ... } ->
+         table.insert (e, result); result={e}
+      | _ -> require 'metalua.pprint'.printf("Bad loop header element %s", e)
+      end
+   end
+
+
+   -------------------------------------------------------------------
+   -- If some breaks had to be changed into gotos, insert the label
+   if exit_label then result = { result, `Label{ exit_label } } end
+
+   return result
+end
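A sketch of how `loop_builder` stacks a compound header, from the last element to the first (`if` wraps the body, `for` headers receive the body as their last child):

```lua
-- source, with SUPPORT_IMPROVED_LOOPS enabled (f is a placeholder):
--     for i = 1, 3 for j = 1, 3 if i ~= j do f(i, j) end
-- compiles roughly to:
for i = 1, 3 do
    for j = 1, 3 do
        if i ~= j then f(i, j) end
    end
end
```

A `while`/`until` element in a non-head position instead compiles to an `if`/`goto` pair escaping all the stacked loops at once.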
+
+
+--------------------------------------------------------------------------------
+-- Improved "[...]" index operator:
+--  * support for multi-indexes ("foo[bar, gnat]")
+--  * support for ranges ("foo[bar ... gnat]")
+--------------------------------------------------------------------------------
+local function extend(M)
+
+    local _M = gg.future(M)
+
+    if SUPPORT_COMP_LISTS then
+        -- support for "for" / "if" comprehension suffixes in literal tables
+        local original_table_element = M.table.element
+        M.table.element = gg.expr{ name="table cell",
+                                   primary = original_table_element,
+                                   suffix  = { name="table cell suffix",
+                                               { "...",                builder = dots_list_suffix_builder },
+                                               { "for", _M.for_header, builder = for_list_suffix_builder  },
+                                               { "if",  _M.expr,       builder = if_list_suffix_builder   } } }
+        M.table.content.builder = table_content_builder
+    end
+
+    if SUPPORT_IMPROVED_INDEXES then
+        -- Support for ranges and multiple indices in bracket suffixes
+        M.expr.suffix:del '['
+        M.expr.suffix:add{ name="table index/range",
+                           "[", gg.list{
+                               gg.sequence { _M.expr, gg.onkeyword{ "...", _M.expr } } ,
+                               separators = { ",", ";" } },
+                           "]", builder = index_builder }
+    end
+
+    if SUPPORT_IMPROVED_LOOPS then
+        local original_for_header = M.for_header
+        M.stat :del  'for'
+        M.stat :del  'while'
+
+        M.loop_suffix = gg.multisequence{
+            { 'while',  _M.expr, builder = |x| `Until{ `Op{ 'not', x[1] } } },
+            { 'until',  _M.expr, builder = |x| `Until{ x[1] } },
+            { 'if',     _M.expr, builder = |x| `If{ x[1] } },
+            { 'for',    original_for_header, builder = |x| x[1] } }
+
+        M.loop_suffix_list = gg.list{ _M.loop_suffix, terminators='do' }
+
+        M.stat :add{
+            'for', original_for_header, _M.loop_suffix_list, 'do', _M.block, 'end',
+            builder = loop_builder }
+
+        M.stat :add{
+            'while', _M.expr, _M.loop_suffix_list, 'do', _M.block, 'end',
+            builder = |x| loop_builder{ `While{x[1]}, x[2], x[3] } }
+    end
+
+    if SUPPORT_CONTINUE then
+        M.lexer :add 'continue'
+        M.stat :add{ 'continue', builder='Continue' }
+    end
+end
+
+return extend
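Since the module returns `extend`, a parser instance can opt into these extensions explicitly. A hypothetical usage sketch (the module path and the parser-instantiation API are assumptions, not shown in this patch):

```lua
-- Hypothetical: apply this extension to a fresh parser instance.
local extend = require 'metalua.extension.comprehension' -- hypothetical path
local mlp = require 'metalua.compiler.parser'.new()      -- assumed API
extend(mlp)  -- registers comprehension, index and loop syntax on mlp only
```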
diff --git a/metalua/extension/log.mlua b/metalua/extension/log.mlua
deleted file mode 100644
index c9323a9..0000000
--- a/metalua/extension/log.mlua
+++ /dev/null
@@ -1,60 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
-require 'metalua.dollar'
-
--{ extension ('match', ...) }
-
-function dollar.log(...)
-   local args   = {...}
-   local ti     = table.insert
-   local code   = { }
-   local nohash = false
-   local width  = 80
-
-   local i=1
-   if args[i].tag=='String' then
-      ti(code, +{print(" [LOG] "..-{args[1]})})
-      i += 1
-   end
-
-   local xtra_args, names, vals = { }, { }, { }
-   for i=i, #args do
-      match args[i] with
-      | +{ 'nohash' } -> nohash = true
-      | `Number{ w }  -> width = w
-      --| `String{...} | `Number{...} -> ti (xtra_args, args[i])
-      | `Id{n} -> ti (names, n); ti (vals, args[i])
-      | x      -> ti (names, table.tostring(x, 'nohash')); ti (vals, x)
-      end
-   end
-
-   for i=1, #names do
-      local msg = string.format(" [LOG] %s = ", names[i])
-      local printer = `Call{ +{table.tostring},
-                              vals[i],
-                              `Number{ width },
-                              `Number{ #msg  } }
-      if nohash then ti(printer, +{'nohash'}) end
-      ti (code, `Call{ +{printf}, +{"%s%s"}, `String{ msg }, printer })
-   end
-   return code
-end
-
-return function() end
diff --git a/metalua/extension/match.mlua b/metalua/extension/match.mlua
index d7c7762..8561e05 100644
--- a/metalua/extension/match.mlua
+++ b/metalua/extension/match.mlua
@@ -78,9 +78,9 @@
 -- TODO: cfg.ntmp isn't reset as often as it could. I'm not even sure
 --       the corresponding locals are declared.
 
-module ('spmatch', package.seeall)
 
 local gg  = require 'metalua.grammar.generator'
+local pp  = require 'metalua.pprint'
 
 ----------------------------------------------------------------------
 -- This would have been best done through library 'metalua.walk',
@@ -88,47 +88,46 @@
 -- It replaces all instances of `...' in `ast' with `term', unless
 -- it appears in a function.
 ----------------------------------------------------------------------
-function replace_dots (ast, term)
-    local function rec (x)
-        if type(x) == 'table' then
-            if x.tag=='Dots' then
+local function replace_dots (ast, term)
+    local function rec (node)
+        for i, child in ipairs(node) do
+            if type(child)~="table" then -- pass
+            elseif child.tag=='Dots' then
                 if term=='ambiguous' then
                     error ("You can't use `...' on the right of a match case when it appears "..
                            "more than once on the left")
-                else
-                    x <- term
-                end
-            elseif x.tag=='Function' then return nil
-            else for _, y in ipairs (x) do rec (y) end end
+                else node[i] = term end
+            elseif child.tag=='Function' then return nil
+            else rec(child) end
         end
     end
-    return rec (ast)
+    return rec(ast)
 end
 
-tmpvar_base = gg.gensym 'submatch.' [1]
+local tmpvar_base = gg.gensym 'submatch.' [1]
 
-function next_tmpvar(cfg)
+local function next_tmpvar(cfg)
    assert (cfg.ntmp, "No cfg.ntmp imbrication level in the match compiler")
    cfg.ntmp = cfg.ntmp+1
    return `Id{ tmpvar_base .. cfg.ntmp }
 end
 
 -- Code accumulators
-acc_stat = |x,cfg| table.insert (cfg.code, x)
-acc_test = |x,cfg| acc_stat(+{stat: if -{x} then -{`Goto{cfg.on_failure}} end}, cfg)
+local acc_stat = |x,cfg| table.insert (cfg.code, x)
+local acc_test = |x,cfg| acc_stat(+{stat: if -{x} then -{`Goto{cfg.on_failure}} end}, cfg)
 -- lhs :: `Id{ string }
 -- rhs :: expr
-function acc_assign (lhs, rhs, cfg)
+local function acc_assign (lhs, rhs, cfg)
    assert(lhs.tag=='Id')
    cfg.locals[lhs[1]] = true
    acc_stat (`Set{ {lhs}, {rhs} }, cfg)
 end
 
-literal_tags = table.transpose{ 'String', 'Number', 'True', 'False', 'Nil' }
+local literal_tags = { String=1, Number=1, True=1, False=1, Nil=1 }
 
 -- pattern :: `Id{ string }
 -- term    :: expr
-function id_pattern_element_builder (pattern, term, cfg)
+local function id_pattern_element_builder (pattern, term, cfg)
    assert (pattern.tag == "Id")
    if pattern[1] == "_" then
       -- "_" is used as a dummy var ==> no assignment, no == checking
@@ -143,74 +142,12 @@
    end
 end
 
--- Concatenate code in [cfg.code], that will jump to label
--- [cfg.on_failure] if [pattern] doesn't match [term]. [pattern]
--- should be an identifier, or at least cheap to compute and
--- side-effects free.
---
--- pattern :: pattern_element
--- term    :: expr
-function pattern_element_builder (pattern, term, cfg)
-   if literal_tags[pattern.tag] then
-      acc_test (+{ -{term} ~= -{pattern} }, cfg)
-   elseif "Id" == pattern.tag then
-      id_pattern_element_builder (pattern, term, cfg)
-   elseif "Op" == pattern.tag and "div" == pattern[1] then
-      regexp_pattern_element_builder (pattern, term, cfg)
-   elseif "Op" == pattern.tag and "eq" == pattern[1] then
-      eq_pattern_element_builder (pattern, term, cfg)
-   elseif "Table" == pattern.tag then
-      table_pattern_element_builder (pattern, term, cfg)
-   else
-      error ("Invalid pattern: "..table.tostring(pattern, "nohash"))
-   end
-end
-
-function eq_pattern_element_builder (pattern, term, cfg)
-   local _, pat1, pat2 = unpack (pattern)
-   local ntmp_save = cfg.ntmp
-   pattern_element_builder (pat1, term, cfg)
-   cfg.ntmp = ntmp_save
-   pattern_element_builder (pat2, term, cfg)
-end
-
--- pattern :: `Op{ 'div', string, list{`Id string} or `Id{ string }}
--- term    :: expr
-function regexp_pattern_element_builder (pattern, term, cfg)
-   local op, regexp, sub_pattern = unpack(pattern)
-
-   -- Sanity checks --
-   assert (op=='div', "Don't know what to do with that op in a pattern")
-   assert (regexp.tag=="String",
-           "Left hand side operand for '/' in a pattern must be "..
-           "a literal string representing a regular expression")
-   if sub_pattern.tag=="Table" then
-      for _, x in ipairs(sub_pattern) do
-	 assert (x.tag=="Id" or x.tag=='Dots',
-		 "Right hand side operand for '/' in a pattern must be "..
-		 "a list of identifiers")
-      end
-   else
-      assert (sub_pattern.tag=="Id",
-	      "Right hand side operand for '/' in a pattern must be "..
-              "an identifier or a list of identifiers")
-   end
-
-   -- Regexp patterns can only match strings
-   acc_test (+{ type(-{term}) ~= 'string' }, cfg)
-   -- put all captures in a list
-   local capt_list  = +{ { string.strmatch(-{term}, -{regexp}) } }
-   -- save them in a var_n for recursive decomposition
-   local v2 = next_tmpvar(cfg)
-   acc_stat (+{stat: local -{v2} = -{capt_list} }, cfg)
-   -- was capture successful?
-   acc_test (+{ not next (-{v2}) }, cfg)
-   pattern_element_builder (sub_pattern, v2, cfg)
-end
+-- mutually recursive with table_pattern_element_builder
+local pattern_element_builder
 
 -- pattern :: pattern and `Table{ }
 -- term    :: expr
-function table_pattern_element_builder (pattern, term, cfg)
+local function table_pattern_element_builder (pattern, term, cfg)
    local seen_dots, len = false, 0
    acc_test (+{ type( -{term} ) ~= "table" }, cfg)
    for i = 1, #pattern do
@@ -254,9 +191,80 @@
    end
 end
 
+-- mutually recursive with pattern_element_builder
+local eq_pattern_element_builder, regexp_pattern_element_builder
+
+-- Accumulate code in [cfg.code] that will jump to label
+-- [cfg.on_failure] if [pattern] doesn't match [term]. [pattern]
+-- should be an identifier, or at least cheap to compute and
+-- side-effects free.
+--
+-- pattern :: pattern_element
+-- term    :: expr
+function pattern_element_builder (pattern, term, cfg)
+   if literal_tags[pattern.tag] then
+      acc_test (+{ -{term} ~= -{pattern} }, cfg)
+   elseif "Id" == pattern.tag then
+      id_pattern_element_builder (pattern, term, cfg)
+   elseif "Op" == pattern.tag and "div" == pattern[1] then
+      regexp_pattern_element_builder (pattern, term, cfg)
+   elseif "Op" == pattern.tag and "eq" == pattern[1] then
+      eq_pattern_element_builder (pattern, term, cfg)
+   elseif "Table" == pattern.tag then
+      table_pattern_element_builder (pattern, term, cfg)
+   else
+      error ("Invalid pattern at "..
+             tostring(pattern.lineinfo)..
+             ": "..pp.tostring(pattern, {hide_hash=true}))
+   end
+end
+
+function eq_pattern_element_builder (pattern, term, cfg)
+   local _, pat1, pat2 = unpack (pattern)
+   local ntmp_save = cfg.ntmp
+   pattern_element_builder (pat1, term, cfg)
+   cfg.ntmp = ntmp_save
+   pattern_element_builder (pat2, term, cfg)
+end
+
+-- pattern :: `Op{ 'div', string, list{`Id string} or `Id{ string }}
+-- term    :: expr
+function regexp_pattern_element_builder (pattern, term, cfg)
+   local op, regexp, sub_pattern = unpack(pattern)
+
+   -- Sanity checks --
+   assert (op=='div', "Don't know what to do with that op in a pattern")
+   assert (regexp.tag=="String",
+           "Left hand side operand for '/' in a pattern must be "..
+           "a literal string representing a regular expression")
+   if sub_pattern.tag=="Table" then
+      for _, x in ipairs(sub_pattern) do
+	 assert (x.tag=="Id" or x.tag=='Dots',
+		 "Right hand side operand for '/' in a pattern must be "..
+		 "a list of identifiers")
+      end
+   else
+      assert (sub_pattern.tag=="Id",
+	      "Right hand side operand for '/' in a pattern must be "..
+              "an identifier or a list of identifiers")
+   end
+
+   -- Regexp patterns can only match strings
+   acc_test (+{ type(-{term}) ~= 'string' }, cfg)
+   -- put all captures in a list
+   local capt_list  = +{ { string.strmatch(-{term}, -{regexp}) } }
+   -- save them in a var_n for recursive decomposition
+   local v2 = next_tmpvar(cfg)
+   acc_stat (+{stat: local -{v2} = -{capt_list} }, cfg)
+   -- was capture successful?
+   acc_test (+{ not next (-{v2}) }, cfg)
+   pattern_element_builder (sub_pattern, v2, cfg)
+end
+
+
 -- Jumps to [cfg.on_failure] if pattern_seq doesn't match
 -- term_seq.
-function pattern_seq_builder (pattern_seq, term_seq, cfg)
+local function pattern_seq_builder (pattern_seq, term_seq, cfg)
    if #pattern_seq ~= #term_seq then error ("Bad seq arity") end
    cfg.locals = { } -- reset bound variables between alternatives
    for i=1, #pattern_seq do
@@ -275,7 +283,7 @@
 --   goto after_success
 --   label on_failure_i
 --------------------------------------------------
-function case_builder (case, term_seq, cfg)
+local function case_builder (case, term_seq, cfg)
    local patterns_group, guard, block = unpack(case)
    local on_success = gg.gensym 'on_success' [1]
    for i = 1, #patterns_group do
@@ -299,7 +307,7 @@
    acc_stat (`Label{cfg.on_failure}, cfg)
 end
 
-function match_builder (x)
+local function match_builder (x)
    local term_seq, cases = unpack(x)
    local cfg = {
       code          = `Do{ },
@@ -349,36 +357,40 @@
          end
       end
       acc_stat(case_cfg.code, cfg)
-   end
-   acc_stat(+{error 'mismatch'}, cfg)
-   acc_stat(`Label{cfg.after_success}, cfg)
-   return cfg.code
+   end
+   local li = `String{tostring(cases.lineinfo)}
+   acc_stat(+{error('mismatch at '..-{li})}, cfg)
+   acc_stat(`Label{cfg.after_success}, cfg)
+   return cfg.code
 end
 
 ----------------------------------------------------------------------
 -- Syntactical front-end
 ----------------------------------------------------------------------
 
-local function extend(mlp)
-    checks('metalua.compiler.parser')
-    mlp.lexer:add{ "match", "with", "->" }
-    mlp.block.terminators:add "|"
+local function extend(M)
 
-    match_cases_list_parser = gg.list{ name = "match cases list",
+    local _M = gg.future(M)
+
+    checks('metalua.compiler.parser')
+    M.lexer:add{ "match", "with", "->" }
+    M.block.terminators:add "|"
+
+    local match_cases_list_parser = gg.list{ name = "match cases list",
         gg.sequence{ name = "match case",
                      gg.list{ name  = "match case patterns list",
-                              primary     = mlp.expr_list,
+                              primary     = _M.expr_list,
                               separators  = "|",
                               terminators = { "->", "if" } },
-                     gg.onkeyword{ "if", mlp.expr, consume = true },
+                     gg.onkeyword{ "if", _M.expr, consume = true },
                      "->",
-                     mlp.block },
+                     _M.block },
         separators  = "|",
         terminators = "end" }
 
-    mlp.stat:add{ name = "match statement",
+    M.stat:add{ name = "match statement",
                   "match",
-                  mlp.expr_list,
+                  _M.expr_list,
                   "with", gg.optkeyword "|",
                   match_cases_list_parser,
                   "end",
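The grammar registered above accepts source of the following shape (a sketch; it requires this extension to be loaded, and `x` is a placeholder):

```lua
-- Structural pattern matching as parsed by the front-end above:
match x with
| `Pair{ a, b }                 -> print(a, b)
| `Id{ name } if name ~= '_'    -> print(name)   -- optional "if" guard
| _                             -> print 'other'
end
```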
diff --git a/metalua/extension/xloop.mlua b/metalua/extension/xloop.mlua
deleted file mode 100644
index b5aaed9..0000000
--- a/metalua/extension/xloop.mlua
+++ /dev/null
@@ -1,141 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
--- Loop control syntax extensions
---
--- * Allows to compound loop headers together, e.g. write:
---      for i=1,10 for j=1,10 do f(i,j) end
---   instead of:
---      for i=1,10 do for j=1,10 do f(i,j) end end
---   loop headers are "for/=" and "for/in"
---
--- * while <condition> in a loop header will break the loop(s)
---   as soon as condition stops being satisfied.
---
--- * until <condition> in a loop header will break the loop(s)
---   as soon as condition is satisfied.
---
--- * if <condition> in a loop header will skip an iteration
---   if the condition is not satisfied.
---
--- * unless <condition> in a loop header will skip an iteration
---   if the condition is satisfied.
---
--- TODO: document ordering matters, e.g. how 
--- "for i in x() if cond(i) for j in y()" is parsed.
---
--------------------------------------------------------------------------------
-
--{ extension ('match', ...) }
-
-require 'metalua.walk'
-
-local gg  = require 'metalua.grammar.generator'
-local mlp = require 'metalua.compiler.parser'
-
-----------------------------------------------------------------------
--- Back-end:
-----------------------------------------------------------------------
-
--- Parse additional elements in a loop
-loop_element = gg.multisequence{
-   { 'while',  mlp.expr, builder = |x| `Until{ `Op{ 'not', x[1] } } },
-   { 'until',  mlp.expr, builder = |x| `Until{ x[1] } },
-   { 'if',     mlp.expr, builder = |x| `If{ x[1] } },
-   { 'unless', mlp.expr, builder = |x| `If{ `Op{ 'not', x[1] } } },
-   { 'for',    mlp.for_header, builder = |x| x[1] } }
-
--- Recompose the loop
-function xloop_builder(x)
-   local first, elements, body = unpack(x)
-
-   -------------------------------------------------------------------
-   -- If it's a regular loop, don't bloat the code
-   -------------------------------------------------------------------
-   if not next(elements) then
-      table.insert(first, body)
-      return first
-   end
-
-   -------------------------------------------------------------------
-   -- There's no reason to treat the first element in a special way
-   -------------------------------------------------------------------
-   table.insert(elements, 1, first)
-
-   -------------------------------------------------------------------
-   -- if a header or a break must be able to exit the loops, ti will
-   -- set exit_label and use it (a regular break wouldn't be enough,
-   -- as it couldn't escape several nested loops.)
-   -------------------------------------------------------------------
-   local exit_label
-   local function exit()
-      if not exit_label then exit_label = mlp.gensym 'break' [1] end
-      return `Goto{ exit_label }
-   end
-
-   -------------------------------------------------------------------
-   -- Compile all headers elements, from last to first
-   -------------------------------------------------------------------
-   for i = #elements, 1, -1 do
-      local e = elements[i]
-      match e with
-      | `If{ cond }    ->
-         body = `If{ cond, {body} }
-      | `Until{ cond } ->
-         body = +{stat: if -{cond} then -{exit()} else -{body} end }
-      | `Forin{ ... } | `Fornum{ ... } ->
-         table.insert (e, {body}); body=e
-      end
-   end
-
-   -------------------------------------------------------------------
-   -- Change breaks into gotos that escape all loops at once.
-   -------------------------------------------------------------------
-   local cfg = { stat = { }, expr = { } }
-   function cfg.stat.down(x)
-      match x with
-      | `Break -> x <- exit()
-      | `Forin{ ... } | `Fornum{ ... } | `While{ ... } | `Repeat{ ... } ->
-         return 'break'
-      | _ -> -- pass
-      end
-   end
-   function cfg.expr.down(x) if x.tag=='Function' then return 'break' end end
-   walk.stat(cfg, body)
-
-   if exit_label then body = { body, `Label{ exit_label } } end
-   return body
-end
-
-----------------------------------------------------------------------
--- Front-end:
-----------------------------------------------------------------------
-
-mlp.lexer:add 'unless'
-mlp.stat:del  'for'
-mlp.stat:del  'while'
-
-loop_element_list = gg.list{ loop_element, terminators='do' }
-
-mlp.stat:add{
-   'for', mlp.for_header, loop_element_list, 'do', mlp.block, 'end',
-   builder = xloop_builder }
-
-mlp.stat:add{
-   'while', mlp.expr, loop_element_list, 'do', mlp.block, 'end',
-   builder = |x| xloop_builder{ `While{x[1]}, x[2], x[3] } }
diff --git a/metalua/grammar/generator.lua b/metalua/grammar/generator.lua
index d0e150e..4633c6e 100644
--- a/metalua/grammar/generator.lua
+++ b/metalua/grammar/generator.lua
@@ -37,7 +37,7 @@
 -- * [gg.onkeyword()]
 -- * [gg.optkeyword()]
 --
--- Other functions: 
+-- Other functions:
 -- * [gg.parse_error()]
 -- * [gg.make_parser()]
 -- * [gg.is_parser()]
@@ -150,15 +150,17 @@
 -------------------------------------------------------------------------------
 function M.parse_error(lx, fmt, ...)
    local li = lx:lineinfo_left()
-   local line, column, offset, positions
+   local file, line, column, offset, positions
    if li then
-      line, column, offset = li.line, li.column, li.offset
+      file, line, column, offset = li.source, li.line, li.column, li.offset
       positions = { first = li, last = li }
    else
       line, column, offset = -1, -1, -1
    end
 
    local msg  = string.format("line %i, char %i: "..fmt, line, column, ...)
+   if file and file~='?' then msg = "file "..file..", "..msg end
+
    local src = lx.src
    if offset>0 and src then
       local i, j = offset, offset
@@ -295,14 +297,12 @@
             error ("In a multisequence parser, all but one sequences "..
                    "must start with a keyword")
          else self.default = s end -- first default
-      elseif self.sequences[keyword] then -- duplicate keyword
-         print (string.format(
-	    " *** Warning: keyword %q overloaded in multisequence ***",
-            keyword))
+      else
+         if self.sequences[keyword] then -- duplicate keyword
+            -- TODO: warn that initial keyword `keyword` is overloaded in multiseq
+         end
          self.sequences[keyword] = s
-      else -- newly caught keyword
-         self.sequences[keyword] = s
-      end
+      end
    end -- </multisequence.add>
 
    -------------------------------------------------------------------
@@ -315,11 +315,10 @@
    -------------------------------------------------------------------
    function p :del (kw)
       if not self.sequences[kw] then
-         eprintf("*** Warning: trying to delete sequence starting "..
-                 "with %q from a multisequence having no such "..
-                 "entry ***", kw) end
+         -- TODO: warn when trying to delete a non-existent entry
+      end
       local removed = self.sequences[kw]
-      self.sequences[kw] = nil 
+      self.sequences[kw] = nil
       return removed
    end
 
@@ -365,9 +364,9 @@
 -- * the builder takes specific parameters:
 --   - for [prefix], it takes the result of the prefix sequence parser,
 --     and the prefixed expression
---   - for [infix], it takes the left-hand-side expression, the results 
+--   - for [infix], it takes the left-hand-side expression, the results
 --     of the infix sequence parser, and the right-hand-side expression.
---   - for [suffix], it takes the suffixed expression, and the result 
+--   - for [suffix], it takes the suffixed expression, and the result
 --     of the suffix sequence parser.
 --
 -- * the default field is a list, with parameters:
@@ -380,7 +379,7 @@
 -- In [p], useful fields are:
 -- * [transformers]: as usual
 -- * [name]: as usual
--- * [primary]: the atomic expression parser, or a multisequence config 
+-- * [primary]: the atomic expression parser, or a multisequence config
 --   table (mandatory)
 -- * [prefix]:  prefix  operators config table, see above.
 -- * [infix]:   infix   operators config table, see above.
@@ -389,7 +388,7 @@
 -- After creation, these fields are added:
 -- * [kind] == "expr"
 -- * [parse] as usual
--- * each table is turned into a multisequence, and therefore has an 
+-- * each table is turned into a multisequence, and therefore has an
 --   [add] method
 --
 -------------------------------------------------------------------------------
@@ -437,7 +436,7 @@
             local e = p2.builder (op, self :parse (lx, p2.prec))
             local lli = lx :lineinfo_left()
             return transform (transform (e, p2, ili, lli), self, fli, lli)
-         else -- No prefix found, get a primary expression         
+         else -- No prefix found, get a primary expression
             local e = self.primary(lx)
             local lli = lx :lineinfo_left()
             return transform (e, self, fli, lli)
@@ -455,7 +454,7 @@
 
          -----------------------------------------
          -- Handle flattening operators: gather all operands
-         -- of the series in [list]; when a different operator 
+         -- of the series in [list]; when a different operator
          -- is found, stop, build from [list], [transform] and
          -- return.
          -----------------------------------------
@@ -472,13 +471,13 @@
             local e2 = pflat.builder (list)
             local lli = lx:lineinfo_left()
             return transform (transform (e2, pflat, fli, lli), self, fli, lli)
- 
+
          -----------------------------------------
          -- Handle regular infix operators: [e] the LHS is known,
          -- just gather the operator and [e2] the RHS.
          -- Result goes in [e3].
          -----------------------------------------
-         elseif p2.prec and p2.prec>prec or 
+         elseif p2.prec and p2.prec>prec or
                 p2.prec==prec and p2.assoc=="right" then
             local fli = e.lineinfo.first -- lx:lineinfo_right()
             local op = p2_func(lx)
@@ -489,7 +488,7 @@
             return transform (transform (e3, p2, fli, lli), self, fli, lli)
 
          -----------------------------------------
-         -- Check for non-associative operators, and complain if applicable. 
+         -- Check for non-associative operators, and complain if applicable.
          -----------------------------------------
          elseif p2.assoc=="none" and p2.prec==prec then
             M.parse_error (lx, "non-associative operator!")
@@ -524,7 +523,7 @@
       end --</expr.parse.handle_suffix>
 
       ------------------------------------------------------
-      -- Parser body: read suffix and (infix+operand) 
+      -- Parser body: read suffix and (infix+operand)
       -- extensions as long as we're able to fetch more at
       -- this precedence level.
       ------------------------------------------------------
@@ -649,7 +648,7 @@
 -- Keyword-conditioned parser generator
 --
 -------------------------------------------------------------------------------
--- 
+--
 -- Only apply a parser if a given keyword is found. The result of
 -- [gg.onkeyword] parser is the result of the subparser (modulo
 -- [transformers] applications).
@@ -668,7 +667,7 @@
 -- * [peek]: if non-nil, the conditioning keyword is left in the lexeme
 --   stream instead of being consumed.
 --
--- * [primary]: the subparser. 
+-- * [primary]: the subparser.
 --
 -- * [keywords]: list of strings representing triggering keywords.
 --
@@ -676,7 +675,7 @@
 --   Strings are put in [keywords], and the parser is put in [primary].
 --
 -- After the call, the following fields will be set:
---   
+--
 -- * [parse] the parsing method
 -- * [kind] == "onkeyword"
 -- * [primary]
@@ -709,8 +708,7 @@
       if type(x)=="string" then table.insert (p.keywords, x)
       else assert (not p.primary and M.is_parser (x)); p.primary = x end
    end
-   if not next (p.keywords) then
-      eprintf("Warning, no keyword to trigger gg.onkeyword") end
+   assert (next (p.keywords), "Missing trigger keyword in gg.onkeyword")
    assert (p.primary, 'no primary parser in gg.onkeyword')
    return p
 end --</onkeyword>
@@ -723,7 +721,7 @@
 -------------------------------------------------------------------------------
 --
 -- This doesn't return a real parser, just a function. That function parses
--- one of the keywords passed as parameters, and returns it. It returns 
+-- one of the keywords passed as parameters, and returns it. It returns
 -- [false] if no matching keyword is found.
 --
 -- Notice that tokens returned by lexer already carry lineinfo, therefore
@@ -731,7 +729,7 @@
 -------------------------------------------------------------------------------
 function M.optkeyword (...)
    local args = {...}
-   if type (args[1]) == "table" then 
+   if type (args[1]) == "table" then
       assert (#args == 1)
       args = args[1]
    end
@@ -759,7 +757,7 @@
 function M.with_lexer(new_lexer, parser)
 
    -------------------------------------------------------------------
-   -- Most gg functions take their parameters in a table, so it's 
+   -- Most gg functions take their parameters in a table, so it's
    -- better to silently accept when with_lexer{ } is called with
    -- its arguments in a list:
    -------------------------------------------------------------------
@@ -804,4 +802,31 @@
     return p
 end
 
+local FUTURE_MT = { }
+function FUTURE_MT:__tostring() return "<Proxy parser module>" end
+function FUTURE_MT:__newindex(key, value) error "don't write in futures" end
+function FUTURE_MT :__index (parser_name)
+    return function(...)
+        local p, m = rawget(self, '__path'), self.__module
+        if p then for _, name in ipairs(p) do
+            m=rawget(m, name)
+            if not m then error ("Submodule '"..name.."' undefined") end
+        end end
+        local f = rawget(m, parser_name)
+        if not f then error ("Parser '"..parser_name.."' undefined") end
+        return f(...)
+    end
+end
+
+function M.future(module, ...)
+    checks('table')
+    local path = ... and {...}
+    if path then for _, x in ipairs(path) do
+        assert(type(x)=='string', "Bad future arg")
+    end end
+    local self = { __module = module,
+                   __path   = path }
+    return setmetatable(self, FUTURE_MT)
+end
+
 return M
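
The `future` proxy added above lets a grammar refer to a parser that will only be defined later, resolving the name at call time rather than at creation time. A minimal sketch of how it behaves, assuming the generator module is required as `metalua.grammar.generator` and that the `checks` library is loaded:

```lua
local gg  = require 'metalua.grammar.generator' -- assumed module path
local mlp = { expr = { } }  -- hypothetical parser module, still empty

-- Build a proxy to mlp.expr *before* mlp.expr.parse exists.
local future_expr = gg.future(mlp, 'expr')

-- Install the real parser afterwards...
function mlp.expr.parse(lx) return { tag = 'Dummy' } end

-- ...the proxy looks the name up only when called, so this now works:
assert(future_expr.parse(nil).tag == 'Dummy')
```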
diff --git a/metalua/grammar/lexer.lua b/metalua/grammar/lexer.lua
index 6c95546..0a58058 100644
--- a/metalua/grammar/lexer.lua
+++ b/metalua/grammar/lexer.lua
@@ -46,8 +46,8 @@
 ----------------------------------------------------------------------
 -- Create a new metatable, for a new class of objects.
 ----------------------------------------------------------------------
-local function new_metatable(name) 
-    local mt = { __type = 'lexer.'..name }; 
+local function new_metatable(name)
+    local mt = { __type = 'lexer.'..name };
     mt.__index = mt
     MT[name] = mt
 end
@@ -68,7 +68,7 @@
 end
 
 function MT.position :__tostring()
-    return string.format("<%s%s|L%d|C%d|K%d>", 
+    return string.format("<%s%s|L%d|C%d|K%d>",
         self.comments and "C|" or "",
         self.source, self.line, self.column, self.offset)
 end
@@ -87,7 +87,7 @@
     for offset in src :gmatch '\n()' do table.insert(lines, offset) end
     local max = #src+1
     table.insert(lines, max+1) -- +1 includes Eof
-    return setmetatable({ src_name=src_name, line2offset=lines, max=max }, 
+    return setmetatable({ src_name=src_name, line2offset=lines, max=max },
         MT.position_factory)
 end
 
@@ -137,7 +137,7 @@
     local line   = fli.line;   if line~=lli.line     then line  =line  ..'-'..lli.line   end
     local column = fli.column; if column~=lli.column then column=column..'-'..lli.column end
     local offset = fli.offset; if offset~=lli.offset then offset=offset..'-'..lli.offset end
-    return string.format("<%s%s|L%s|C%s|K%s%s>", 
+    return string.format("<%s%s|L%s|C%s|K%s%s>",
                          fli.comments and "C|" or "",
                          fli.source, line, column, offset,
                          lli.comments and "|C" or "")
@@ -150,12 +150,12 @@
 new_metatable 'token'
 
 function M.new_token(tag, content, lineinfo)
-    --printf("TOKEN `%s{ %q, lineinfo = %s} boundaries %d, %d", 
-    --       tag, content, tostring(lineinfo), lineinfo.first.id, lineinfo.last.id) 
+    --printf("TOKEN `%s{ %q, lineinfo = %s} boundaries %d, %d",
+    --       tag, content, tostring(lineinfo), lineinfo.first.id, lineinfo.last.id)
     return setmetatable({tag=tag, lineinfo=lineinfo, content}, MT.token)
 end
 
-function MT.token :__tostring()    
+function MT.token :__tostring()
     --return string.format("`%s{ %q, %s }", self.tag, self[1], tostring(self.lineinfo))
     return string.format("`%s %q", self.tag, self[1])
 end
@@ -231,9 +231,9 @@
       local k, j, i = digits :reverse() :byte(1, 3)
       local z = string.byte "0"
       local code = (k or z) + 10*(j or z) + 100*(i or z) - 111*z
-      if code > 255 then 
+      if code > 255 then
          error ("Illegal escape sequence '\\"..digits..
-                "' in string: ASCII codes must be in [0..255]") 
+                "' in string: ASCII codes must be in [0..255]")
       end
       local c = string.char (code)
       if c == '\\' then c = '\\\\' end -- parsed by unesc_letter (test: "\092b" --> "\\b")
@@ -251,7 +251,7 @@
      if c == '\\' then c = '\\\\' end -- parsed by unesc_letter (test: "\x5cb" --> "\\b")
      return backslashes..c
    end
-   
+
    -- Handle Lua 5.2 \z sequences
    local function unesc_z(backslashes, more)
      if #backslashes%2==0 then
@@ -261,10 +261,10 @@
      end
    end
 
-   -- Take a letter [x], and returns the character represented by the 
+   -- Takes a letter [x] and returns the character represented by the
    -- sequence ['\\'..x], e.g. [unesc_letter "n" == "\n"].
    local function unesc_letter(x)
-      local t = { 
+      local t = {
          a = "\a", b = "\b", f = "\f",
          n = "\n", r = "\r", t = "\t", v = "\v",
          ["\\"] = "\\", ["'"] = "'", ['"'] = '"', ["\n"] = "\n" }
@@ -280,13 +280,13 @@
 
 lexer.extractors = {
    "extract_long_comment", "extract_short_comment",
-   "extract_short_string", "extract_word", "extract_number", 
+   "extract_short_string", "extract_word", "extract_number",
    "extract_long_string", "extract_symbol" }
 
 
 
 ----------------------------------------------------------------------
--- Really extract next token from the raw string 
+-- Really extract next token from the raw string
 -- (and update the index).
 -- loc: offset of the position just after spaces and comments
 -- previous_i: offset in src before extraction began
@@ -320,7 +320,7 @@
          return tok
        end
        local i_first = self.i -- loc = position after whitespaces
-       
+
        -- try every extractor until a token is found
        for _, extractor in ipairs(self.extractors) do
            local tag, content, xtra = self [extractor] (self)
@@ -449,12 +449,12 @@
 function lexer :extract_symbol()
    local k = self.src:sub (self.i,self.i)
    local symk = self.sym [k]  -- symbols starting with `k`
-   if not symk then 
+   if not symk then
       self.i = self.i + 1
       return 'Keyword', k
    end
    for _, sym in pairs (symk) do
-      if sym == self.src:sub (self.i, self.i + #sym - 1) then 
+      if sym == self.src:sub (self.i, self.i + #sym - 1) then
          self.i = self.i + #sym
          return 'Keyword', sym
       end
@@ -472,7 +472,7 @@
       for _, x in ipairs (w) do self :add (x) end
    else
       if w:match (self.patterns.word .. "$") then self.alpha [w] = true
-      elseif w:match "^%p%p+$" then 
+      elseif w:match "^%p%p+$" then
          local k = w:sub(1,1)
          local list = self.sym [k]
          if not list then list = { }; self.sym [k] = list end
@@ -507,7 +507,7 @@
    self :peek (n)
    local a
    for i=1,n do
-      a = table.remove (self.peeked, 1) 
+      a = table.remove (self.peeked, 1)
       -- TODO: is this used anywhere? I think not.  a.lineinfo.last may be nil.
       --self.lastline = a.lineinfo.last.line
    end
@@ -519,10 +519,7 @@
 -- Returns an object which saves the stream's current state.
 ----------------------------------------------------------------------
 -- FIXME there are more fields than that to save
--- TODO  remove metalua.table dependency, used here to make a shallow copy
--- of self.peeked, hash-part included. The hash-part seems unused,
--- so { unpack(x) } should work OK.
-function lexer :save () return { self.i; table.cat(self.peeked) } end
+function lexer :save () return { self.i; { unpack(self.peeked) } } end
 
 ----------------------------------------------------------------------
 -- Restore the stream's state, as saved by method [save].
@@ -536,7 +533,7 @@
 ----------------------------------------------------------------------
 function lexer :sync()
    local p1 = self.peeked[1]
-   if p1 then 
+   if p1 then
       local li_first = p1.lineinfo.first
       if li_first.comments then li_first=li_first.comments.lineinfo.first end
       self.i = li_first.offset
@@ -587,7 +584,7 @@
    elseif type(src_or_stream)=='string' then -- it's a source string
       local src = src_or_stream
       local pos1 = M.new_position(1, 1, 1, name)
-      local stream = { 
+      local stream = {
          src_name      = name;   -- Name of the file
          src           = src;    -- The source, as a single string
          peeked        = { };    -- Already peeked, but not discarded yet, tokens
@@ -637,10 +634,9 @@
    local words = {...}
    local a = self :next()
    local function err ()
-      error ("Got " .. tostring (a) .. 
+      error ("Got " .. tostring (a) ..
              ", expected one of these keywords : '" ..
              table.concat (words,"', '") .. "'") end
-          
    if not a or a.tag ~= "Keyword" then err () end
    if #words == 0 then return a[1] end
    for _, w in ipairs (words) do
@@ -650,14 +646,13 @@
 end
 
 ----------------------------------------------------------------------
--- 
+--
 ----------------------------------------------------------------------
 function lexer :clone()
-   require 'metalua.table'
-   local clone = {
-      -- TODO: remove metalua.table dependency
-      alpha = table.deep_copy(self.alpha),
-      sym   = table.deep_copy(self.sym) }
+   local alpha_clone, sym_clone = { }, { }
+   for word in pairs(self.alpha) do alpha_clone[word]=true end
+   for letter, list in pairs(self.sym) do sym_clone[letter] = { unpack(list) } end
+   local clone = { alpha=alpha_clone, sym=sym_clone }
    setmetatable(clone, self)
    clone.__index = clone
    return clone
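
The `clone` rewrite above drops the `metalua.table` dependency by replacing `table.deep_copy` with hand-rolled copies: the keyword set is copied key by key, and each symbol list is shallow-copied with `unpack` (a Lua 5.1 global). A sketch of the resulting copy semantics, with illustrative data:

```lua
-- Illustrative stand-ins for the lexer's keyword/symbol tables.
local alpha = { ['if'] = true, ['then'] = true }
local sym   = { ['='] = { '==' }, ['<'] = { '<=', '<<' } }

local alpha_clone, sym_clone = { }, { }
for word in pairs(alpha) do alpha_clone[word] = true end
for letter, list in pairs(sym) do sym_clone[letter] = { unpack(list) } end

-- Mutating a clone's symbol list leaves the original untouched:
table.insert(sym_clone['='], '=>')
assert(#sym['='] == 1 and #sym_clone['='] == 2)
```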
diff --git a/metalua/loader.lua b/metalua/loader.lua
new file mode 100644
index 0000000..5a79a4c
--- /dev/null
+++ b/metalua/loader.lua
@@ -0,0 +1,128 @@
+--------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+--------------------------------------------------------------------------------
+
+local M = require "package" -- extend Lua's basic "package" module
+
+M.metalua_extension_prefix = 'metalua.extension.'
+
+-- Initialize package.mpath from package.path
+M.mpath = M.mpath or os.getenv 'LUA_MPATH' or
+    (M.path..";") :gsub("%.(lua[:;])", ".m%1") :sub(1, -2)
+
+M.mcache = M.mcache or os.getenv 'LUA_MCACHE'
+
+----------------------------------------------------------------------
+-- resc(k) returns "%"..k if it's a special regular expression char,
+-- or just k if it's normal.
+----------------------------------------------------------------------
+local regexp_magic = { }
+for k in ("^$()%.[]*+-?") :gmatch "." do regexp_magic[k]="%"..k end
+
+local function resc(k) return regexp_magic[k] or k end
+
+----------------------------------------------------------------------
+-- Take a Lua module name, return the open file and its name,
+-- or <false> and an error message.
+----------------------------------------------------------------------
+function M.findfile(name, path_string)
+   local config_regexp = ("([^\n])\n"):rep(5):sub(1, -2)
+   local dir_sep, path_sep, path_mark, execdir, igmark =
+      M.config :match (config_regexp)
+   name = name:gsub ('%.', dir_sep)
+   local errors = { }
+   local path_pattern = string.format('[^%s]+', resc(path_sep))
+   for path in path_string:gmatch (path_pattern) do
+      --printf('path = %s, rpath_mark=%s, name=%s', path, resc(path_mark), name)
+      local filename = path:gsub (resc (path_mark), name)
+      --printf('filename = %s', filename)
+      local file = io.open (filename, 'r')
+      if file then return file, filename end
+      table.insert(errors, string.format("\tno lua file %q", filename))
+   end
+   return false, '\n'..table.concat(errors, "\n")..'\n'
+end
+
+----------------------------------------------------------------------
+-- Before compiling a metalua source module, try to find and load
+-- a more recent bytecode dump. Requires lfs.
+----------------------------------------------------------------------
+local function metalua_cache_loader(name, src_filename, src)
+    local mlc          = require 'metalua.compiler'.new()
+    local lfs          = require 'lfs'
+    local dir_sep      = M.config:sub(1,1)
+    local dst_filename = M.mcache :gsub ('%?', (name:gsub('%.', dir_sep)))
+    local src_a        = lfs.attributes(src_filename)
+    local src_date     = src_a and src_a.modification or 0
+    local dst_a        = lfs.attributes(dst_filename)
+    local dst_date     = dst_a and dst_a.modification or 0
+    local delta        = dst_date - src_date
+    local bytecode, file, msg
+    if delta <= 0 then
+       print "NEED TO RECOMPILE"
+       bytecode = mlc :src_to_bytecode (src, name)
+       for x in dst_filename :gmatch('()'..dir_sep) do
+          lfs.mkdir(dst_filename:sub(1,x))
+       end
+       file, msg = io.open(dst_filename, 'wb')
+       if not file then error(msg) end
+       file :write (bytecode)
+       file :close()
+    else
+       file, msg = io.open(dst_filename, 'rb')
+       if not file then error(msg) end
+       bytecode = file :read '*a'
+       file :close()
+    end
+    return mlc :bytecode_to_function (bytecode)
+end
+
+----------------------------------------------------------------------
+-- Load a metalua source file.
+----------------------------------------------------------------------
+function M.metalua_loader (name)
+   local file, filename_or_msg = M.findfile (name, M.mpath)
+   if not file then return filename_or_msg end
+   local luastring = file:read '*a'
+   file:close()
+   if M.mcache and pcall(require, 'lfs') then
+      return metalua_cache_loader(name, filename_or_msg, luastring)
+   else return require 'metalua.compiler'.new() :src_to_function (luastring, name) end
+end
+
+
+----------------------------------------------------------------------
+-- Placed after lua/luac loader, so precompiled files have
+-- higher precedence.
+----------------------------------------------------------------------
+table.insert(M.loaders, M.metalua_loader)
+
+----------------------------------------------------------------------
+-- Load an extension.
+----------------------------------------------------------------------
+function extension (name, mlp)
+    local complete_name = M.metalua_extension_prefix..name
+    local extend_func = require (complete_name)
+    if not mlp.extensions[complete_name] then
+        local ast = extend_func(mlp)
+        mlp.extensions[complete_name] = extend_func
+        return ast
+     end
+end
+
+return M
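
This new loader supersedes the deleted `metalua/package.lua` that follows; in use it looks roughly like this (paths and module names are illustrative):

```lua
-- Requiring the loader extends the stock `package` table in place.
local package = require 'metalua.loader'

-- mpath is derived from package.path by rewriting ".lua" to ".mlua",
-- e.g. "./?.lua;/usr/share/lua/5.1/?.lua"
--   -> "./?.mlua;/usr/share/lua/5.1/?.mlua"
print(package.mpath)

-- Once registered, plain `require` also finds Metalua sources:
-- require 'my.module'   -- would load my/module.mlua if it exists

-- Optional bytecode cache: point LUA_MCACHE (or package.mcache) at a
-- '?'-pattern such as "/tmp/mcache/?.luac" and install luafilesystem;
-- compiled chunks are then reused while they are newer than the source.
```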
diff --git a/metalua/package.lua b/metalua/package.lua
deleted file mode 100644
index 3bba351..0000000
--- a/metalua/package.lua
+++ /dev/null
@@ -1,95 +0,0 @@
---------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
---------------------------------------------------------------------------------
-
-local package = package
-
-require 'metalua.table'
-require 'metalua.string'
-
-package.metalua_extension_prefix = 'metalua.extension.'
-
-package.mpath = package.mpath or os.getenv 'LUA_MPATH' or
-   './?.mlua;/usr/local/share/lua/5.1/?.mlua;'..
-   '/usr/local/share/lua/5.1/?/init.mlua;'..
-   '/usr/local/lib/lua/5.1/?.mlua;'..
-   '/usr/local/lib/lua/5.1/?/init.mlua'
-
-
-----------------------------------------------------------------------
--- resc(k) returns "%"..k if it's a special regular expression char,
--- or just k if it's normal.
-----------------------------------------------------------------------
-local regexp_magic = table.transpose{
-   "^", "$", "(", ")", "%", ".", "[", "]", "*", "+", "-", "?" }
-local function resc(k)
-   return regexp_magic[k] and '%'..k or k
-end
-
-----------------------------------------------------------------------
--- Take a Lua module name, return the open file and its name,
--- or <false> and an error message.
-----------------------------------------------------------------------
-function package.findfile(name, path_string)
-   local config_regexp = ("([^\n])\n"):rep(5):sub(1, -2)
-   local dir_sep, path_sep, path_mark, execdir, igmark =
-      package.config:strmatch (config_regexp)
-   name = name:gsub ('%.', dir_sep)
-   local errors = { }
-   local path_pattern = string.format('[^%s]+', resc(path_sep))
-   for path in path_string:gmatch (path_pattern) do
-      --printf('path = %s, rpath_mark=%s, name=%s', path, resc(path_mark), name)
-      local filename = path:gsub (resc (path_mark), name)
-      --printf('filename = %s', filename)
-      local file = io.open (filename, 'r')
-      if file then return file, filename end
-      table.insert(errors, string.format("\tno lua file %q", filename))
-   end
-   return false, '\n'..table.concat(errors, "\n")..'\n'
-end
-
-----------------------------------------------------------------------
--- Load a metalua source file.
-----------------------------------------------------------------------
-function package.metalua_loader (name)
-   local file, filename_or_msg = package.findfile (name, package.mpath)
-   if not file then return filename_or_msg end
-   local luastring = file:read '*a'
-   file:close()
-   local mlc = require 'metalua.compiler'.new()
-   return mlc :src_to_function (luastring, name)
-end
-
-----------------------------------------------------------------------
--- Placed after lua/luac loader, so precompiled files have
--- higher precedence.
-----------------------------------------------------------------------
-table.insert(package.loaders, package.metalua_loader)
-
-----------------------------------------------------------------------
--- Load an extension.
-----------------------------------------------------------------------
-function extension (name, mlp)
-    local complete_name = package.metalua_extension_prefix..name
-    -- TODO: pass mlp around
-    local extend_func = require (complete_name)
-    local ast =extend_func(mlp)
-    return ast
-end
-
-return package
diff --git a/metalua/pprint.lua b/metalua/pprint.lua
new file mode 100644
index 0000000..73a842b
--- /dev/null
+++ b/metalua/pprint.lua
@@ -0,0 +1,295 @@
+-------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+----------------------------------------------------------------------
+
+----------------------------------------------------------------------
+----------------------------------------------------------------------
+--
+-- Lua objects pretty-printer
+--
+----------------------------------------------------------------------
+----------------------------------------------------------------------
+
+local M = { }
+
+M.DEFAULT_CFG = {
+    hide_hash      = false; -- Print the non-array part of tables?
+    metalua_tag    = true;  -- Use Metalua's backtick syntax sugar?
+    fix_indent     = nil;   -- If a number, number of indentation spaces;
+                            -- If false, indent to the previous brace.
+    line_max       = nil;   -- If a number, tries to avoid making lines with
+                            -- more than this number of chars.
+    initial_indent = 0;     -- If a number, starts at this level of indentation
+    keywords       = { };   -- Set of keywords which must not use Lua's field
+                            -- shortcuts {["foo"]=...} -> {foo=...}
+}
+
+local function valid_id(cfg, x)
+    if type(x) ~= "string" then return false end
+    if not x:match "^[a-zA-Z_][a-zA-Z0-9_]*$" then return false end
+    if cfg.keywords and cfg.keywords[x] then return false end
+    return true
+end
+
+local __tostring_cache = setmetatable({ }, {__mode='k'})
+
+-- Retrieve the string produced by `__tostring` metamethod if present,
+-- return `false` otherwise. Cached in `__tostring_cache`.
+local function __tostring(x)
+    local the_string = __tostring_cache[x]
+    if the_string~=nil then return the_string end
+    local mt = getmetatable(x)
+    if mt then
+        local __tostring = mt.__tostring
+        if __tostring then
+            the_string = __tostring(x)
+            __tostring_cache[x] = the_string
+            return the_string
+        end
+    end
+    if x~=nil then __tostring_cache[x] = false end -- nil is an illegal key
+    return false
+end
+
+local xlen -- mutually recursive with `xlen_type`
+
+local xlen_cache = setmetatable({ }, {__mode='k'})
+
+-- Helpers for the `xlen` function
+local xlen_type = {
+    ["nil"] = function ( ) return 3 end;
+    number  = function (x) return #tostring(x) end;
+    boolean = function (x) return x and 4 or 5 end;
+    string  = function (x) return #string.format("%q",x) end;
+}
+
+function xlen_type.table (adt, cfg, nested)
+    local custom_string = __tostring(adt)
+    if custom_string then return #custom_string end
+
+    -- Circular referenced objects are printed with the plain
+    -- `tostring` function in nested positions.
+    if nested [adt] then return #tostring(adt) end
+    nested [adt] = true
+
+    local has_tag  = cfg.metalua_tag and valid_id(cfg, adt.tag)
+    local alen     = #adt
+    local has_arr  = alen>0
+    local has_hash = false
+    local x = 0
+
+    if not cfg.hide_hash then
+        -- first pass: count hash-part
+        for k, v in pairs(adt) do
+            if k=="tag" and has_tag then
+                -- this is the tag -> do nothing!
+            elseif type(k)=="number" and k<=alen and math.fmod(k,1)==0 and k>0 then
+                -- array-part pair -> do nothing!
+            else
+                has_hash = true
+                if valid_id(cfg, k) then x=x+#k
+                else x = x + xlen (k, cfg, nested) + 2 end -- count surrounding brackets
+                x = x + xlen (v, cfg, nested) + 5          -- count " = " and ", "
+            end
+        end
+    end
+
+    for i = 1, alen do x = x + xlen (adt[i], cfg, nested) + 2 end -- count ", "
+
+    nested[adt] = false -- No more nested calls
+
+    if not (has_tag or has_arr or has_hash) then return 3 end
+    if has_tag then x=x+#adt.tag+1 end
+    if not (has_arr or has_hash) then return x end
+    if not has_hash and alen==1 and type(adt[1])~="table" then
+        return x-2 -- subtract extraneous ", "
+    end
+    return x+2 -- count "{ " and " }", subtract extraneous ", "
+end
+
+
+-- Compute the number of chars it would require to display the table
+-- on a single line. Helps to decide whether some carriage returns are
+-- required. Since the size of each sub-table is required many times,
+-- it's cached in [xlen_cache].
+xlen = function (x, cfg, nested)
+    -- no need to compute length for 1-line prints
+    if not cfg.line_max then return 0 end
+    nested = nested or { }
+    if x==nil then return #"nil" end
+    local len = xlen_cache[x]
+    if len then return len end
+    local f = xlen_type[type(x)]
+    if not f then return #tostring(x) end
+    len = f (x, cfg, nested)
+    xlen_cache[x] = len
+    return len
+end
+
+local function consider_newline(p, len)
+    if not p.cfg.line_max then return end
+    if p.current_offset + len <= p.cfg.line_max then return end
+    if p.indent < p.current_offset then
+        p:acc "\n"; p:acc ((" "):rep(p.indent))
+        p.current_offset = p.indent
+    end
+end
+
+local acc_value
+
+local acc_type = {
+    ["nil"] = function(p) p:acc("nil") end;
+    number  = function(p, adt) p:acc (tostring (adt)) end;
+    string  = function(p, adt) p:acc ((string.format ("%q", adt):gsub("\\\n", "\\n"))) end;
+    boolean = function(p, adt) p:acc (adt and "true" or "false") end }
+
+-- Indentation:
+-- * if `cfg.fix_indent` is set to a number:
+--   * add this number of space for each level of depth
+--   * return to the line as soon as it flushes things further left
+-- * if not, tabulate to one space after the opening brace.
+--   * as a result, it never saves right-space to return before first element
+
+function acc_type.table(p, adt)
+    if p.nested[adt] then p:acc(tostring(adt)); return end
+    p.nested[adt]  = true
+
+    local has_tag  = p.cfg.metalua_tag and valid_id(p.cfg, adt.tag)
+    local alen     = #adt
+    local has_arr  = alen>0
+    local has_hash = false
+
+    local previous_indent = p.indent
+
+    if has_tag then p:acc("`"); p:acc(adt.tag) end
+
+    local function indent(p)
+        if not p.cfg.fix_indent then p.indent = p.current_offset
+        else p.indent = p.indent + p.cfg.fix_indent end
+    end
+
+    -- First pass: handle hash-part
+    if not p.cfg.hide_hash then
+        for k, v in pairs(adt) do
+
+            if has_tag and k=='tag' then  -- pass the 'tag' field
+            elseif type(k)=="number" and k<=alen and k>0 and math.fmod(k,1)==0 then
+                -- pass array-part keys (consecutive ints less than `#adt`)
+            else -- hash-part keys
+                if has_hash then p:acc ", " else -- 1st hash-part pair ever found
+                    p:acc "{ "; indent(p)
+                end
+
+                -- Determine whether a newline is required
+                local is_id, expected_len=valid_id(p.cfg, k)
+                if is_id then expected_len=#k+xlen(v, p.cfg, p.nested)+#" = , "
+                else expected_len = xlen(k, p.cfg, p.nested)+xlen(v, p.cfg, p.nested)+#"[] = , " end
+                consider_newline(p, expected_len)
+
+                -- Print the key
+                if is_id then p:acc(k); p:acc " = " else
+                    p:acc "["; acc_value (p, k); p:acc "] = "
+                end
+
+                acc_value (p, v) -- Print the value
+                has_hash = true
+            end
+        end
+    end
+
+    -- Now we know whether there's a hash-part, an array-part, and a tag.
+    -- Tag and hash-part are already printed if they're present.
+    if not has_tag and not has_hash and not has_arr then p:acc "{ }";
+    elseif has_tag and not has_hash and not has_arr then -- nothing, tag already in acc
+    else
+        assert (has_hash or has_arr) -- special case { } already handled
+        local no_brace = false
+        if has_hash and has_arr then p:acc ", "
+        elseif has_tag and not has_hash and alen==1 and type(adt[1])~="table" then
+            -- No brace required; don't print "{", remember not to print "}"
+            p:acc (" "); acc_value (p, adt[1]) -- indent= indent+(cfg.fix_indent or 0))
+            no_brace = true
+        elseif not has_hash then
+            -- Braces required, but not opened by hash-part handler yet
+            p:acc "{ "; indent(p)
+        end
+
+        -- 2nd pass: array-part
+        if not no_brace and has_arr then
+            local expected_len = xlen(adt[1], p.cfg, p.nested)
+            consider_newline(p, expected_len)
+            acc_value(p, adt[1]) -- indent+(cfg.fix_indent or 0)
+            for i=2, alen do
+                p:acc ", ";
+                consider_newline(p, xlen(adt[i], p.cfg, p.nested))
+                acc_value (p, adt[i]) --indent+(cfg.fix_indent or 0)
+            end
+        end
+        if not no_brace then p:acc " }" end
+    end
+    p.nested[adt] = false -- No more nested calls
+    p.indent = previous_indent
+end
+
+
+function acc_value(p, v)
+    local custom_string = __tostring(v)
+    if custom_string then p:acc(custom_string) else
+        local f = acc_type[type(v)]
+        if f then f(p, v) else p:acc(tostring(v)) end
+    end
+end
+
+
+-- FIXME: new_indent seems to always be nil?!
+-- FIXME: accumulator function should be configurable,
+-- so that print() doesn't need to bufferize the whole string
+-- before starting to print.
+function M.tostring(t, cfg)
+
+    cfg = cfg or M.DEFAULT_CFG or { }
+
+    local p = {
+        cfg = cfg;
+        indent = 0;
+        current_offset = cfg.initial_indent or 0;
+        buffer = { };
+        nested = { };
+        acc = function(self, str)
+                  table.insert(self.buffer, str)
+                  self.current_offset = self.current_offset + #str
+              end;
+    }
+    acc_value(p, t)
+    return table.concat(p.buffer)
+end
+
+function M.print(...) return print(M.tostring(...)) end
+function M.sprintf(fmt, ...)
+    local args={...}
+    for i, v in pairs(args) do
+        local t=type(v)
+        if t=='table' then args[i]=M.tostring(v)
+        elseif t=='nil' then args[i]='nil' end
+    end
+    return string.format(fmt, unpack(args))
+end
+
+function M.printf(...) print(M.sprintf(...)) end
+
+return M
\ No newline at end of file
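
The printer state assembled by `M.tostring` above — a buffer of string fragments plus a running column offset — can be exercised in isolation. A minimal standalone sketch (plain Lua, independent of the module; the literals fed to `:acc` are just illustrative):

```lua
-- Minimal sketch of the accumulator object built by M.tostring: each
-- :acc(str) call appends a fragment to the buffer and advances the current
-- offset, which is what lets the printer decide when a newline is needed.
local p = {
    indent = 0,
    current_offset = 0,   -- cfg.initial_indent would seed this
    buffer = { },
    acc = function(self, str)
        table.insert(self.buffer, str)
        self.current_offset = self.current_offset + #str
    end,
}

p:acc "{ "
p:acc "x = 1"
p:acc " }"

print(table.concat(p.buffer))  --> { x = 1 }
print(p.current_offset)        --> 9
```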
diff --git a/metalua/repl.mlua b/metalua/repl.mlua
index 9dfa7cc..4a39adf 100644
--- a/metalua/repl.mlua
+++ b/metalua/repl.mlua
@@ -23,6 +23,7 @@
 PROMPT     = "M> "
 PROMPT2    = ">> "
 
+local pp=require 'metalua.pprint'
 local M = { }
 
 mlc = require 'metalua.compiler'.new()
@@ -35,10 +36,16 @@
       local rl_handle = editline.init 'metalua'
       readline = |p| rl_handle:read(p)
    else
-      function readline (p)
-         io.write (p)
-         io.flush ()
-         return io.read '*l'
+      local status, rl = pcall(require, 'readline')
+      if status then
+         rl.set_options{histfile='~/.metalua_history', keeplines=100, completion=false }
+         readline = rl.readline
+      else -- neither editline nor readline available
+         function readline (p)
+            io.write (p)
+            io.flush ()
+            return io.read '*l'
+         end
       end
    end
 end
@@ -49,7 +56,7 @@
 
 
 function M.run()
-    printf ("Metalua, interactive REPLoop.\n"..
+    pp.printf ("Metalua, interactive REPLoop.\n"..
             "(c) 2006-2013 <metalua@gmail.com>")
    local lines = { }
    while true do
@@ -72,7 +79,9 @@
               success = table.remove (results, 1)
               if success then
                   -- Success!
-                  table.iforeach(|x| table.print(x, LINE_WIDTH), results)
+                  for _, x in ipairs(results) do
+                      pp.print(x, {line_max=LINE_WIDTH, metalua_tag=true})
+                  end
                   lines = { }
               else
                   print "Evaluation error:"
diff --git a/metalua/string.lua b/metalua/string.lua
deleted file mode 100644
index 7a9b1ad..0000000
--- a/metalua/string.lua
+++ /dev/null
@@ -1,62 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
-----------------------------------------------------------------------
-----------------------------------------------------------------------
---
--- String module extension
---
-----------------------------------------------------------------------
-----------------------------------------------------------------------
-
--- Courtesy of lua-users.org
-function string.split(str, pat)
-   local t = {} 
-   local fpat = "(.-)" .. pat
-   local last_end = 1
-   local s, e, cap = string.find(str, fpat, 1)
-   while s do
-      if s ~= 1 or cap ~= "" then
-          table.insert(t,cap)
-       end
-      last_end = e+1
-      s, e, cap = string.find(str, fpat, last_end)
-   end
-   if last_end <= string.len(str) then
-      cap = string.sub(str, last_end)
-      table.insert(t, cap)
-   end
-   return t
-end
-
--- "match" is regularly used as a keyword for pattern matching, 
--- so here is an always available substitute.
-string.strmatch = string["match"]
-
--- change a compiled string into a function
-function string.undump(str)
-   if str:strmatch '^\027LuaQ' or str:strmatch '^#![^\n]+\n\027LuaQ' then
-      local f = (lua_loadstring or loadstring)(str)
-      return f
-   else
-      error "Not a chunk dump"
-   end
-end
-
-return string
\ No newline at end of file
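
Code that still depends on the deleted `metalua/string.lua` can keep the same splitting semantics with a standalone copy of the function — a sketch reproducing the behaviour of the removed `string.split` shown above:

```lua
-- Standalone reproduction of the removed string.split: cut `str` on every
-- occurrence of the Lua pattern `pat`, returning the pieces in a list.
local function split(str, pat)
    local t = { }
    local fpat = "(.-)" .. pat
    local last_end = 1
    local s, e, cap = string.find(str, fpat, 1)
    while s do
        if s ~= 1 or cap ~= "" then table.insert(t, cap) end
        last_end = e + 1
        s, e, cap = string.find(str, fpat, last_end)
    end
    if last_end <= #str then
        table.insert(t, string.sub(str, last_end))
    end
    return t
end

local parts = split("a.b.c", "%.")
-- parts is { "a", "b", "c" }
```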
diff --git a/metalua/table.lua b/metalua/table.lua
deleted file mode 100644
index 565f832..0000000
--- a/metalua/table.lua
+++ /dev/null
@@ -1,421 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
----------------------------------------------------------------------
-----------------------------------------------------------------------
---
--- Table module extension
---
-----------------------------------------------------------------------
-----------------------------------------------------------------------
-
--- todo: table.scan (scan1?) fold1? flip?
-
-function table.transpose(t)
-   local tt = { }
-   for a, b in pairs(t) do tt[b] = a end
-   return tt
-end
-
-function table.iforeach(f, ...)
-   -- assert (type (f) == "function") [wouldn't allow metamethod __call]
-   local nargs = select("#", ...)
-   if nargs==1 then -- Quick iforeach (most common case), just one table arg
-      local t = ...
-      assert (type (t) == "table")
-      for i = 1, #t do 
-         local result = f (t[i])
-         -- If the function returns non-false, stop iteration
-         if result then return result end
-      end
-   else -- advanced case: boundaries and/or multiple tables
-
-      -- fargs:       arguments fot a single call to f
-      -- first, last: indexes of the first & last elements mapped in each table
-      -- arg1:        index of the first table in args
-
-      -- 1 - find boundaries if any
-      local  args, fargs, first, last, arg1 = {...}, { }
-      if     type(args[1]) ~= "number" then first, arg1 = 1, 1 -- no boundary
-      elseif type(args[2]) ~= "number" then first, last, arg1 = 1, args[1], 2
-      else   first,  last, arg1 = args[1], args[2], 3 end
-      assert (nargs >= arg1) -- at least one table
-      -- 2 - determine upper boundary if not given
-      if not last then for i = arg1, nargs do 
-            assert (type (args[i]) == "table")
-            last = max (#args[i], last) 
-      end end
-      -- 3 - remove non-table arguments from args, adjust nargs
-      if arg1>1 then args = { select(arg1, unpack(args)) }; nargs = #args end
-
-      -- 4 - perform the iteration
-      for i = first, last do
-         for j = 1, nargs do fargs[j] = args[j][i] end -- build args list
-         local result = f (unpack (fargs)) -- here is the call
-         -- If the function returns non-false, stop iteration
-         if result then return result end
-      end
-   end
-end
-
-function table.imap (f, ...)
-   local result, idx = { }, 1
-   local function g(...) result[idx] = f(...);  idx=idx+1 end
-   table.iforeach(g, ...)
-   return result
-end
-
-function table.ifold (f, acc, ...)
-   local function g(...) acc = f (acc,...) end
-   table.iforeach (g, ...)
-   return acc
-end
-
--- function table.ifold1 (f, ...)
---    return table.ifold (f, acc, 2, false, ...)
--- end
-
-function table.izip(...)
-   local function g(...) return {...} end
-   return table.imap(g, ...)
-end
-
-function table.ifilter(f, t)
-   local yes, no = { }, { }
-   for i=1,#t do table.insert (f(t[i]) and yes or no, t[i]) end
-   return yes, no
-end
-
-function table.icat(...)
-   local result = { }
-   for _, t in ipairs {...} do
-      for _, x in pairs (t) do
-         table.insert (result, x)
-      end
-   end
-   return result
-end
-
-function table.iflatten (x) return table.icat (unpack (x)) end
-
-function table.irev (t)
-   local result, nt = { }, #t
-   for i=0, nt-1 do result[nt-i] = t[i+1] end
-   return result
-end
-
-function table.isub (t, ...)
-   local ti, u = table.insert, { }
-   local args, nargs = {...}, select("#", ...)
-   for i=1, nargs/2 do
-      local a, b = args[2*i-1], args[2*i]
-      for i=a, b, a<=b and 1 or -1 do ti(u, t[i]) end
-   end
-   return u
-end
-
-function table.iall (f, ...)
-   local result = true
-   local function g(...) return not f(...) end
-   return not table.iforeach(g, ...)
-   --return result
-end
-
-function table.iany (f, ...)
-   local function g(...) return not f(...) end
-   return not table.iall(g, ...)
-end
-
-function table.shallow_copy(x)
-   local y={ }
-   for k, v in pairs(x) do y[k]=v end
-   return y
-end
-
--- Warning, this is implementation dependent: it relies on
--- the fact the [next()] enumerates the array-part before the hash-part.
-function table.cat(...)
-   local y={ }
-   for _, x in ipairs{...} do
-      -- cat array-part
-      for _, v in ipairs(x) do table.insert(y,v) end
-      -- cat hash-part
-      local lx, k = #x
-      if lx>0 then k=next(x,lx) else k=next(x) end
-      while k do y[k]=x[k]; k=next(x,k) end
-   end
-   return y
-end
-
-function table.deep_copy(x) 
-   local tracker = { }
-   local function aux (x)
-      if type(x) == "table" then
-         local y=tracker[x]
-         if y then return y end
-         y = { }; tracker[x] = y
-         setmetatable (y, getmetatable (x))
-         for k,v in pairs(x) do y[aux(k)] = aux(v) end
-         return y
-      else return x end
-   end
-   return aux(x)
-end
-
-function table.override(dst, src)
-   for k, v in pairs(src) do dst[k] = v end
-   for i = #src+1, #dst   do dst[i] = nil end
-   return dst
-end
-
-function table.range(a,b,c)
-   if not b then assert(not(c)); b=a; a=1
-   elseif not c then c = (b>=a) and 1 or -1 end
-   local result = { }
-   for i=a, b, c do table.insert(result, i) end
-   return result
-end
-
--- FIXME: new_indent seems to be always nil?!
--- FIXME: accumulator function should be configurable,
--- so that print() doesn't need to bufferize the whole string
--- before starting to print.
-function table.tostring(t, ...)
-   local PRINT_HASH, HANDLE_TAG, FIX_INDENT, LINE_MAX, INITIAL_INDENT = true, true
-   for _, x in ipairs {...} do
-      if type(x) == "number" then
-         if not LINE_MAX then LINE_MAX = x
-         else INITIAL_INDENT = x end
-      elseif x=="nohash" then PRINT_HASH = false
-      elseif x=="notag"  then HANDLE_TAG = false
-      else
-         local n = string['match'](x, "^indent%s*(%d*)$")
-         if n then FIX_INDENT = tonumber(n) or 3 end
-      end
-   end
-   LINE_MAX       = LINE_MAX or math.huge
-   INITIAL_INDENT = INITIAL_INDENT or 1
-   
-   local current_offset =  0  -- indentation level
-   local xlen_cache     = { } -- cached results for xlen()
-   local acc_list       = { } -- Generated bits of string
-   local function acc(...)    -- Accumulate a bit of string
-      local x = table.concat{...}
-      current_offset = current_offset + #x
-      table.insert(acc_list, x) 
-   end
-   local function valid_id(x)
-      -- FIXME: we should also reject keywords; but the list of
-      -- current keywords is not fixed in metalua...
-      return type(x) == "string" 
-         and string['match'](x, "^[a-zA-Z_][a-zA-Z0-9_]*$")
-   end
-   
-   -- Compute the number of chars it would require to display the table
-   -- on a single line. Helps to decide whether some carriage returns are
-   -- required. Since the size of each sub-table is required many times,
-   -- it's cached in [xlen_cache].
-   local xlen_type = { }
-   local function xlen(x, nested)
-      nested = nested or { }
-      if x==nil then return #"nil" end
-      --if nested[x] then return #tostring(x) end -- already done in table
-      local len = xlen_cache[x]
-      if len then return len end
-      local f = xlen_type[type(x)]
-      if not f then return #tostring(x) end
-      len = f (x, nested) 
-      xlen_cache[x] = len
-      return len
-   end
-
-   -- optim: no need to compute lengths if I'm not going to use them
-   -- anyway.
-   if LINE_MAX == math.huge then xlen = function() return 0 end end
-
-
-   local tostring_cache = { }
-   local function __tostring(x)
-      local the_string = tostring_cache[x]
-      if the_string~=nil then return the_string end
-      local mt = getmetatable(x)
-      if mt then 
-          local __tostring = mt.__tostring
-          if __tostring then
-              the_string = __tostring(x)
-              tostring_cache[x] = the_string
-              return the_string
-          end
-      end
-      if x~=nil then tostring_cache[x] = false end -- nil is an illegal key
-      return false
-   end
-
-   xlen_type["nil"] = function () return 3 end
-   function xlen_type.number  (x) return #tostring(x) end
-   function xlen_type.boolean (x) return x and 4 or 5 end
-   function xlen_type.string  (x) return #string.format("%q",x) end
-   function xlen_type.table   (adt, nested)
-
-      local custom_string = __tostring(adt)
-      if custom_string then return #custom_string end
-
-      -- Circular references detection
-      if nested [adt] then return #tostring(adt) end
-      nested [adt] = true
-
-      local has_tag  = HANDLE_TAG and valid_id(adt.tag)
-      local alen     = #adt
-      local has_arr  = alen>0
-      local has_hash = false
-      local x = 0
-      
-      if PRINT_HASH then
-         -- first pass: count hash-part
-         for k, v in pairs(adt) do
-            if k=="tag" and has_tag then 
-               -- this is the tag -> do nothing!
-            elseif type(k)=="number" and k<=alen and math.fmod(k,1)==0 and k>0 then 
-               -- array-part pair -> do nothing!
-            else
-               has_hash = true
-               if valid_id(k) then x=x+#k
-               else x = x + xlen (k, nested) + 2 end -- count surrounding brackets
-               x = x + xlen (v, nested) + 5          -- count " = " and ", "
-            end
-         end
-      end
-
-      for i = 1, alen do x = x + xlen (adt[i], nested) + 2 end -- count ", "
-      
-      nested[adt] = false -- No more nested calls
-
-      if not (has_tag or has_arr or has_hash) then return 3 end
-      if has_tag then x=x+#adt.tag+1 end
-      if not (has_arr or has_hash) then return x end
-      if not has_hash and alen==1 and type(adt[1])~="table" then
-         return x-2 -- substract extraneous ", "
-      end
-      return x+2 -- count "{ " and " }", substract extraneous ", "
-   end
-   
-   -- Recursively print a (sub) table at given indentation level.
-   -- [newline] indicates whether newlines should be inserted.
-   local function rec (adt, nested, indent)
-      if not FIX_INDENT then indent = current_offset end
-      local function acc_newline()
-         acc ("\n"); acc (string.rep (" ", indent)) 
-         current_offset = indent
-      end
-      local x = { }
-      x["nil"] = function() acc "nil" end
-      function x.number()   acc (tostring (adt)) end
-      function x.string()   acc ((string.format ("%q", adt):gsub("\\\n", "\\n"))) end
-      function x.boolean()  acc (adt and "true" or "false") end
-      function x.table()
-         if nested[adt] then acc(tostring(adt)); return end
-         nested[adt]  = true
-
-
-         local has_tag  = HANDLE_TAG and valid_id(adt.tag)
-         local alen     = #adt
-         local has_arr  = alen>0
-         local has_hash = false
-
-         if has_tag then acc("`"); acc(adt.tag) end
-
-         -- First pass: handle hash-part
-         if PRINT_HASH then
-            for k, v in pairs(adt) do
-               -- pass if the key belongs to the array-part or is the "tag" field
-               if not (k=="tag" and HANDLE_TAG) and 
-                  not (type(k)=="number" and k<=alen and math.fmod(k,1)==0 and k>0) then
-
-                  -- Is it the first time we parse a hash pair?
-                  if not has_hash then 
-                     acc "{ "
-                     if not FIX_INDENT then indent = current_offset end
-                  else acc ", " end
-
-                  -- Determine whether a newline is required
-                  local is_id, expected_len = valid_id(k)
-                  if is_id then expected_len = #k + xlen (v, nested) + #" = , "
-                  else expected_len = xlen (k, nested) + 
-                                      xlen (v, nested) + #"[] = , " end
-                  if has_hash and expected_len + current_offset > LINE_MAX
-                              then acc_newline() end
-                  
-                  -- Print the key
-                  if is_id then acc(k); acc " = " 
-                  else  acc "["; rec (k, nested, indent+(FIX_INDENT or 0)); acc "] = " end
-
-                  -- Print the value
-                  rec (v, nested, indent+(FIX_INDENT or 0))
-                  has_hash = true
-               end
-            end
-         end
-
-         -- Now we know whether there's a hash-part, an array-part, and a tag.
-         -- Tag and hash-part are already printed if they're present.
-         if not has_tag and not has_hash and not has_arr then acc "{ }"; 
-         elseif has_tag and not has_hash and not has_arr then -- nothing, tag already in acc
-         else 
-            assert (has_hash or has_arr)
-            local no_brace = false
-            if has_hash and has_arr then acc ", " 
-            elseif has_tag and not has_hash and alen==1 and type(adt[1])~="table" then
-               -- No brace required; don't print "{", remember not to print "}"
-               acc (" "); rec (adt[1], nested, indent+(FIX_INDENT or 0))
-               no_brace = true
-            elseif not has_hash then
-               -- Braces required, but not opened by hash-part handler yet
-               acc "{ "
-               if not FIX_INDENT then indent = current_offset end
-            end
-
-            -- 2nd pass: array-part
-            if not no_brace and has_arr then 
-               rec (adt[1], nested, indent+(FIX_INDENT or 0))
-               for i=2, alen do 
-                  acc ", ";                   
-                  if   current_offset + xlen (adt[i], { }) > LINE_MAX
-                  then acc_newline() end
-                  rec (adt[i], nested, indent+(FIX_INDENT or 0)) 
-               end
-            end
-            if not no_brace then acc " }" end
-         end
-         nested[adt] = false -- No more nested calls
-      end
-      local custom_string = __tostring(adt)
-      if custom_string then acc(custom_string) else
-         local y = x[type(adt)]
-         if y then y() else acc(tostring(adt)) end
-     end
-   end
-   --printf("INITIAL_INDENT = %i", INITIAL_INDENT)
-   current_offset = INITIAL_INDENT or 0
-   rec(t, { }, 0)
-   return table.concat (acc_list)
-end
-
-function table.print(...) return print(table.tostring(...)) end
-
-return table
\ No newline at end of file
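
Likewise, the generic helpers from the deleted `metalua/table.lua` that downstream code may still rely on can simply be inlined where needed — a sketch reproducing two of them verbatim from the removed file:

```lua
-- Standalone copies of two helpers from the removed metalua/table.lua.

-- transpose: swap keys and values ({ a = 'x' } becomes { x = 'a' }).
local function transpose(t)
    local tt = { }
    for a, b in pairs(t) do tt[b] = a end
    return tt
end

-- shallow_copy: duplicate the first level of a table, sharing sub-tables.
local function shallow_copy(x)
    local y = { }
    for k, v in pairs(x) do y[k] = v end
    return y
end

local t = { one = 1, two = 2 }
local inv  = transpose(t)      -- inv[1] == "one", inv[2] == "two"
local copy = shallow_copy(t)   -- copy.one == 1, and copy is a distinct table
```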
diff --git a/metalua/treequery.mlua b/metalua/treequery.mlua
index c1f9f84..f5b09d2 100755
--- a/metalua/treequery.mlua
+++ b/metalua/treequery.mlua
@@ -17,445 +17,451 @@
 --
 -------------------------------------------------------------------------------
 
-local walk = require 'metalua.treequery.walk'
-
-local M = { }
--- support for old-style modules
-treequery = M
-
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
---
--- multimap helper mmap: associate a key to a set of values
---
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
-
-local function mmap_add (mmap, node, x)
-    if node==nil then return false end
-    local set = mmap[node]
-    if set then set[x] = true
-    else mmap[node] = {[x]=true} end
-end
-
--- currently unused, I throw the whole set away
-local function mmap_remove (mmap, node, x)
-    local set = mmap[node]
-    if not set then return false
-    elseif not set[x] then return false
-    elseif next(set) then set[x]=nil
-    else mmap[node] = nil end
-    return true
-end
-
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
---
--- TreeQuery object.
---
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
-
-local ACTIVE_SCOPE = setmetatable({ }, {__mode="k"})
-
--- treequery metatable
-local Q = { }; Q.__index = Q
-
---- treequery constructor
---  the resultingg object will allow to filter ans operate on the AST
---  @param root the AST to visit
---  @return a treequery visitor instance
-function M.treequery(root)
-    return setmetatable({
-        root = root,
-        unsatisfied = 0,
-        predicates  = { },
-        until_up    = { },
-        from_up     = { },
-        up_f        = false,
-        down_f      = false,
-        filters     = { },
-    }, Q)
-end
-
--- helper to share the implementations of positional filters
-local function add_pos_filter(self, position, inverted, inclusive, f, ...)
-    if type(f)=='string' then f = M.has_tag(f, ...) end
-    if not inverted then self.unsatisfied += 1 end
-    local x = {
-        pred      = f,
-        position  = position,
-        satisfied = false,
-        inverted  = inverted  or false,
-        inclusive = inclusive or false }
-    table.insert(self.predicates, x)
-    return self
-end
-
--- TODO: offer an API for inclusive pos_filters
-
---- select nodes which are after one which satisfies predicate f
-Q.after     = |self, f, ...| add_pos_filter(self, 'after', false, false, f, ...)
---- select nodes which are not after one which satisfies predicate f
-Q.not_after = |self, f, ...| add_pos_filter(self, 'after', true,  false, f, ...)
---- select nodes which are under one which satisfies predicate f
-Q.under     = |self, f, ...| add_pos_filter(self, 'under', false, false, f, ...)
---- select nodes which are not under one which satisfies predicate f
-Q.not_under = |self, f, ...| add_pos_filter(self, 'under', true,  false, f, ...)
-
---- select nodes which satisfy predicate f
-function Q :filter(f, ...)
-    if type(f)=='string' then f = M.has_tag(f, ...) end
-    table.insert(self.filters, f); 
-    return self
-end
-
---- select nodes which satisfy predicate f
-function Q :filter_not(f, ...)
-    if type(f)=='string' then f = M.has_tag(f, ...) end
-    table.insert(self.filters, |...| not f(...)) 
-    return self
-end
-
--- private helper: apply filters and execute up/down callbacks when applicable
-function Q :execute()
-    local cfg = { }
-    -- TODO: optimize away not_under & not_after by pruning the tree
-    function cfg.down(...)
-        --printf ("[down]\t%s\t%s", self.unsatisfied, table.tostring((...)))
-        ACTIVE_SCOPE[...] = cfg.scope
-        local satisfied = self.unsatisfied==0
-        for _, x in ipairs(self.predicates) do
-            if not x.satisfied and x.pred(...) then
-                x.satisfied = true
-                local node, parent = ...
-                local inc = x.inverted and 1 or -1
-                if x.position=='under' then
-                    -- satisfied from after we get down this node...
-                    self.unsatisfied += inc
-                    -- ...until before we get up this node
-                    mmap_add(self.until_up, node, x)
-                elseif x.position=='after' then
-                    -- satisfied from after we get up this node...
-                    mmap_add(self.from_up, node, x)
-                    -- ...until before we get up this node's parent
-                    mmap_add(self.until_up, parent, x)
-                elseif x.position=='under_or_after' then
-                    -- satisfied from after we get down this node...
-                    self.satisfied += inc
-                    -- ...until before we get up this node's parent...
-                    mmap_add(self.until_up, parent, x)
-                else
-                    error "position not understood"
-                end -- position
-                if x.inclusive then satisfied = self.unsatisfied==0 end
-            end -- predicate passed
-        end -- for predicates
-
-        if satisfied then
-            for _, f in ipairs(self.filters) do
-                if not f(...) then satisfied=false; break end
-            end
-            if satisfied and self.down_f then self.down_f(...) end
-        end
-    end
-
-    function cfg.up(...)
-        --printf ("[up]\t%s", table.tostring((...)))
-
-        -- Remove predicates which are due before we go up this node
-        local preds = self.until_up[...]
-        if preds then
-            for x, _ in pairs(preds) do
-                local inc = x.inverted and -1 or 1
-                self.unsatisfied += inc
-                x.satisfied = false
-            end
-            self.until_up[...] = nil
-        end
-
-        -- Execute the up callback
-        -- TODO: cache the filter passing result from the down callback
-        -- TODO: skip if there's no callback
-        local satisfied = self.unsatisfied==0
-        if satisfied then
-            for _, f in ipairs(self.filters) do
-                if not f(self, ...) then satisfied=false; break end
-            end
-            if satisfied and self.up_f then self.up_f(...) end
-        end
-
-        -- Set predicate which are due after we go up this node
-        local preds = self.from_up[...]
-        if preds then
-            for p, _ in pairs(preds) do
-                local inc = p.inverted and 1 or -1
-                self.unsatisfied += inc
-            end
-            self.from_up[...] = nil
-        end
-        ACTIVE_SCOPE[...] = nil
-    end
-
-    function cfg.binder(id_node, ...)
-        --printf(" >>> Binder called on %s, %s", table.tostring(id_node), 
-        --      table.tostring{...}:sub(2,-2))
-        cfg.down(id_node, ...)
-        cfg.up(id_node, ...)
-        --printf("down/up on binder done")
-    end
-
-    --function cfg.occurrence (binder, occ)
-    --   if binder then OCC2BIND[occ] = binder[1] end
-       --printf(" >>> %s is an occurrence of %s", occ[1], table.tostring(binder and binder[2]))
-    --end
-
-    --function cfg.binder(...) cfg.down(...); cfg.up(...) end
-    return walk.guess(cfg, self.root)
-end
-
---- Execute a function on each selected node
---  @down: function executed when we go down a node, i.e. before its children
---         have been examined.
---  @up: function executed when we go up a node, i.e. after its children
---       have been examined.
-function Q :foreach(down, up)
-    if not up and not down then
-        error "iterator not implemented"
-    end
-    self.up_f = up
-    self.down_f = down
-    return self :execute()
-end
-
---- Return the list of nodes selected by a given treequery.
-function Q :list()
-    local acc = { }
-    self :foreach(|x| table.insert(acc, x))
-    return acc
-end
-
---- Return the first matching element
---  TODO:  dirty hack, to implement properly with a 'break' return.
---  Also, it won't behave correctly if a predicate causes an error,
---  or if coroutines are involved.
-function Q :first()
-   local result = { }
-   local function f(...) result = {...}; error() end
-   pcall(|| self :foreach(f))
-   return unpack(result)
-end
-
---- Pretty printer for queries
-function Q :__tostring() return "treequery("..table.tostring(self.root, 'nohash')..")" end
-
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
---
--- Predicates.
---
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
-
---- Return a predicate which is true if the tested node's tag is among the
---  one listed as arguments
--- @param ... a sequence of tag names
-function M.has_tag(...)
-    local args = {...}
-    if #args==1 then
-        local tag = ...
-        return (|node| node.tag==tag)
-        --return function(self, node) printf("node %s has_tag %s?", table.tostring(node), tag); return node.tag==tag end
-    else
-        local tags = { }
-        for _, tag in ipairs(args) do tags[tag]=true end
-        return function(self, node)
-            local node_tag = node.tag
-            return node_tag and tags[node_tag]
-        end
-    end
-end
-
---- Predicate to test whether a node represents an expression.
-M.is_expr = M.has_tag('Nil', 'Dots', 'True', 'False', 'Number','String',
-                  'Function', 'Table', 'Op', 'Paren', 'Call', 'Invoke',
-                  'Id', 'Index')
-
--- helper for is_stat
-local STAT_TAGS = { Do=1, Set=1, While=1, Repeat=1, If=1, Fornum=1,
-                    Forin=1, Local=1, Localrec=1, Return=1, Break=1 }
-
---- Predicate to test whether a node represents a statement.
---  It is context-aware, i.e. it recognizes `Call and `Invoke nodes
---  used in a statement context as such.
-function M.is_stat(self, node, parent)
-    local tag = node.tag
-    if not tag then return false
-    elseif STAT_TAGS[tag] then return true
-    elseif tag=='Call' or tag=='Invoke' then return parent.tag==nil
-    else return false end
-end
-
---- Predicate to test whether a node represents a statements block.
-function M.is_block(self, node) return node.tag==nil end
-
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
---
--- Variables and scopes.
---
--- -----------------------------------------------------------------------------
--- -----------------------------------------------------------------------------
-
-local BINDER_GRAND_PARENT_TAG = { 
-   Local=true, Localrec=true, Forin=true, Function=true }
-
---- Test whether a node is a binder. This is local predicate, although it
---  might need to inspect the parent node.
-function M.is_binder(node, parent)
-   --printf('is_binder(%s, %s)', table.tostring(node), table.tostring(parent))
-   if node.tag ~= 'Id' or not parent then return false end
-   if parent.tag=='Fornum' then  return b[1]==a end
-   if not BINDER_GRAND_PARENT_TAG[parent.tag] then return false end
-   for _, binder in ipairs(parent[1]) do 
-       if binder==node then return true end 
-   end
-   return false
-end
-
---- Retrieve the binder associated to an occurrence within root.
---  @param occurrence an Id node representing an occurrence in `root`.
---  @param root the tree in which `node` and its binder occur.
---  @return the binder node, and its ancestors up to root if found.
---  @return nil if node is global (or not an occurrence) in `root`.
-function M.binder(occurrence, root)
-    local cfg, id_name, result = { }, occurrence[1], { }
-    function cfg.occurrence(id)
-        if id == occurrence then result = cfg.scope :get(id_name) end
-        -- TODO: break the walker
-    end
-    walk.guess(cfg, root)
-    return unpack(result)
-end
-
---- Predicate to filter occurrences of a given binder.
---  Warning: it relies on internal scope book-keeping,
---  and for this reason, it only works as query method argument.
---  It won't work outside of a query.
---  @param binder the binder whose occurrences must be kept by predicate
---  @return a predicate
-
--- function M.is_occurrence_of(binder)
---     return function(node, ...)
---         if node.tag ~= 'Id' then return nil end
---         if M.is_binder(node, ...) then return nil end
---         local scope = ACTIVE_SCOPE[node]
---         if not scope then return nil end
---         local result = scope :get (node[1]) or { }
---         if result[1] ~= binder then return nil end
---         return unpack(result)
---     end
--- end
-
-function M.is_occurrence_of(binder)
-    return function(node, ...)
-        local b = M.get_binder(node)
-        return b and b==binder
-    end
-end
-
-function M.get_binder(occurrence, ...)
-    if occurrence.tag ~= 'Id' then return nil end
-    if M.is_binder(occurrence, ...) then return nil end
-    local binder_list = ACTIVE_SCOPE[occurrence[1]]
-    return unpack (binder_list or { })
-end
-
-
-
-
---- Transform a predicate on a node into a predicate on this node's
---  parent. For instance if p tests whether a node has property P,
---  then parent(p) tests whether this node's parent has property P.
---  The ancestor level is precised with n, with 1 being the node itself,
---  2 its parent, 3 its grand-parent etc.
---  @param[optional] n the parent to examine, default=2
---  @param pred the predicate to transform

---  @return a predicate

-function M.parent(n, pred, ...)

-    if type(a)~='number' then n, pred = 2, n end

-    if type(pred)=='string' then pred = M.has_tag(pred, ...) end

-    return function(self, ...)

-        return select(n, ...) and pred(self, select(n, ...))

-    end

-end

-

---- Transform a predicate on a node into a predicate on this node's

---  n-th child.

---  @param n the child's index number

---  @param pred the predicate to transform

---  @return a predicate

-function M.child(n, pred)

-    return function(node, ...)

-        local child = node[n]

-        return child and pred(child, node, ...)

-    end

-end

-

---- Predicate to test the position of a node in its parent.

---  The predicate succeeds if the node is the n-th child of its parent, 

---  and a <= n <= b.

---  nth(a) is equivalent to nth(a, a).

---  Negative indices are admitted, and count from the last child,

---  as done for instance by string.sub().

---

---  TODO: This is wrong, this tests the table relationship rather than the

---  AST node relationship.

---  Must build a getindex helper, based on pattern matching, then build

---  the predicate around it.

---

---  @param a lower bound

---  @param a upper bound

---  @return a predicate

-function M.is_nth(a, b)

-    b = b or a

-    return function(self, node, parent)

-        if not parent then return false end

-        local nchildren = #parent

-        local a = a<=0 and nchildren+a+1 or a

-        if a>nchildren then return false end

-        local b = b<=0 and nchildren+b+1 or b>nchildren and nchildren or b

-        for i=a,b do if parent[i]==node then return true end end

-        return false

-    end

-end

-

-

--- -----------------------------------------------------------------------------

--- -----------------------------------------------------------------------------

---

--- Comments parsing.

---

--- -----------------------------------------------------------------------------

--- -----------------------------------------------------------------------------

-

-local comment_extractor = |which_side| function (node)

-    local x = node.lineinfo

-    x = x and x[which_side]

-    x = x and x.comments

-    if not x then return nil end

-    local lines = { }

-    for _, record in ipairs(x) do

-        table.insert(lines, record[1])

-    end

-    return table.concat(lines, '\n')

-end

-

-M.comment_prefix = comment_extractor 'first'

-M.comment_suffix = comment_extractor 'last'

-

-

---- Shortcut for the query constructor

-function M :__call(...) return self.treequery(...) end

-setmetatable(M, M)

-

-return M

+local walk = require 'metalua.treequery.walk'
+
+local M = { }
+-- support for old-style modules
+treequery = M
+
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+--
+-- multimap helper mmap: associate a key to a set of values
+--
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+
+local function mmap_add (mmap, node, x)
+    if node==nil then return false end
+    local set = mmap[node]
+    if set then set[x] = true
+    else mmap[node] = {[x]=true} end
+end
+
+-- currently unused, I throw the whole set away
+local function mmap_remove (mmap, node, x)
+    local set = mmap[node]
+    if not set then return false
+    elseif not set[x] then return false
+    elseif next(set) then set[x]=nil
+    else mmap[node] = nil end
+    return true
+end
+
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+--
+-- TreeQuery object.
+--
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+
+local ACTIVE_SCOPE = setmetatable({ }, {__mode="k"})
+
+-- treequery metatable
+local Q = { }; Q.__index = Q
+
+--- treequery constructor
+--  the resulting object allows filtering and operating on the AST
+--  @param root the AST to visit
+--  @return a treequery visitor instance
+function M.treequery(root)
+    return setmetatable({
+        root = root,
+        unsatisfied = 0,
+        predicates  = { },
+        until_up    = { },
+        from_up     = { },
+        up_f        = false,
+        down_f      = false,
+        filters     = { },
+    }, Q)
+end
+
+-- helper to share the implementations of positional filters
+local function add_pos_filter(self, position, inverted, inclusive, f, ...)
+    if type(f)=='string' then f = M.has_tag(f, ...) end
+    if not inverted then self.unsatisfied += 1 end
+    local x = {
+        pred      = f,
+        position  = position,
+        satisfied = false,
+        inverted  = inverted  or false,
+        inclusive = inclusive or false }
+    table.insert(self.predicates, x)
+    return self
+end
+
+function Q :if_unknown(f)
+    self.unknown_handler = f or (||nil)
+    return self
+end
+
+-- TODO: offer an API for inclusive pos_filters
+
+--- select nodes which are after one which satisfies predicate f
+Q.after     = |self, f, ...| add_pos_filter(self, 'after', false, false, f, ...)
+--- select nodes which are not after one which satisfies predicate f
+Q.not_after = |self, f, ...| add_pos_filter(self, 'after', true,  false, f, ...)
+--- select nodes which are under one which satisfies predicate f
+Q.under     = |self, f, ...| add_pos_filter(self, 'under', false, false, f, ...)
+--- select nodes which are not under one which satisfies predicate f
+Q.not_under = |self, f, ...| add_pos_filter(self, 'under', true,  false, f, ...)
+
+--- select nodes which satisfy predicate f
+function Q :filter(f, ...)
+    if type(f)=='string' then f = M.has_tag(f, ...) end
+    table.insert(self.filters, f);
+    return self
+end
+
+--- select nodes which do not satisfy predicate f
+function Q :filter_not(f, ...)
+    if type(f)=='string' then f = M.has_tag(f, ...) end
+    table.insert(self.filters, |...| not f(...))
+    return self
+end
+
+-- private helper: apply filters and execute up/down callbacks when applicable
+function Q :execute()
+    local cfg = { }
+    -- TODO: optimize away not_under & not_after by pruning the tree
+    function cfg.down(...)
+        --printf ("[down]\t%s\t%s", self.unsatisfied, table.tostring((...)))
+        ACTIVE_SCOPE[...] = cfg.scope
+        local satisfied = self.unsatisfied==0
+        for _, x in ipairs(self.predicates) do
+            if not x.satisfied and x.pred(...) then
+                x.satisfied = true
+                local node, parent = ...
+                local inc = x.inverted and 1 or -1
+                if x.position=='under' then
+                    -- satisfied from after we get down this node...
+                    self.unsatisfied += inc
+                    -- ...until before we get up this node
+                    mmap_add(self.until_up, node, x)
+                elseif x.position=='after' then
+                    -- satisfied from after we get up this node...
+                    mmap_add(self.from_up, node, x)
+                    -- ...until before we get up this node's parent
+                    mmap_add(self.until_up, parent, x)
+                elseif x.position=='under_or_after' then
+                    -- satisfied from after we get down this node...
+                    self.unsatisfied += inc
+                    -- ...until before we get up this node's parent...
+                    mmap_add(self.until_up, parent, x)
+                else
+                    error "position not understood"
+                end -- position
+                if x.inclusive then satisfied = self.unsatisfied==0 end
+            end -- predicate passed
+        end -- for predicates
+
+        if satisfied then
+            for _, f in ipairs(self.filters) do
+                if not f(...) then satisfied=false; break end
+            end
+            if satisfied and self.down_f then self.down_f(...) end
+        end
+    end
+
+    function cfg.up(...)
+        --printf ("[up]\t%s", table.tostring((...)))
+
+        -- Remove predicates which are due before we go up this node
+        local preds = self.until_up[...]
+        if preds then
+            for x, _ in pairs(preds) do
+                local inc = x.inverted and -1 or 1
+                self.unsatisfied += inc
+                x.satisfied = false
+            end
+            self.until_up[...] = nil
+        end
+
+        -- Execute the up callback
+        -- TODO: cache the filter passing result from the down callback
+        -- TODO: skip if there's no callback
+        local satisfied = self.unsatisfied==0
+        if satisfied then
+            for _, f in ipairs(self.filters) do
+                if not f(self, ...) then satisfied=false; break end
+            end
+            if satisfied and self.up_f then self.up_f(...) end
+        end
+
+        -- Set predicate which are due after we go up this node
+        local preds = self.from_up[...]
+        if preds then
+            for p, _ in pairs(preds) do
+                local inc = p.inverted and 1 or -1
+                self.unsatisfied += inc
+            end
+            self.from_up[...] = nil
+        end
+        ACTIVE_SCOPE[...] = nil
+    end
+
+    function cfg.binder(id_node, ...)
+        --printf(" >>> Binder called on %s, %s", table.tostring(id_node),
+        --      table.tostring{...}:sub(2,-2))
+        cfg.down(id_node, ...)
+        cfg.up(id_node, ...)
+        --printf("down/up on binder done")
+    end
+
+    cfg.unknown = self.unknown_handler
+
+    --function cfg.occurrence (binder, occ)
+    --   if binder then OCC2BIND[occ] = binder[1] end
+       --printf(" >>> %s is an occurrence of %s", occ[1], table.tostring(binder and binder[2]))
+    --end
+
+    --function cfg.binder(...) cfg.down(...); cfg.up(...) end
+    return walk.guess(cfg, self.root)
+end
+
+--- Execute a function on each selected node
+--  @down: function executed when we go down a node, i.e. before its children
+--         have been examined.
+--  @up: function executed when we go up a node, i.e. after its children
+--       have been examined.
+function Q :foreach(down, up)
+    if not up and not down then
+        error "iterator missing"
+    end
+    self.up_f = up
+    self.down_f = down
+    return self :execute()
+end
+
+--- Return the list of nodes selected by a given treequery.
+function Q :list()
+    local acc = { }
+    self :foreach(|x| table.insert(acc, x))
+    return acc
+end
+
+--- Return the first matching element
+--  TODO:  dirty hack, to implement properly with a 'break' return.
+--  Also, it won't behave correctly if a predicate causes an error,
+--  or if coroutines are involved.
+function Q :first()
+   local result = { }
+   local function f(...) result = {...}; error() end
+   pcall(|| self :foreach(f))
+   return unpack(result)
+end
+
+--- Pretty printer for queries
+function Q :__tostring() return "<treequery>" end
+
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+--
+-- Predicates.
+--
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+
+--- Return a predicate which is true if the tested node's tag is among the
+--  ones listed as arguments
+-- @param ... a sequence of tag names
+function M.has_tag(...)
+    local args = {...}
+    if #args==1 then
+        local tag = ...
+        return (|node| node.tag==tag)
+        --return function(self, node) printf("node %s has_tag %s?", table.tostring(node), tag); return node.tag==tag end
+    else
+        local tags = { }
+        for _, tag in ipairs(args) do tags[tag]=true end
+        return function(node)
+            local node_tag = node.tag
+            return node_tag and tags[node_tag]
+        end
+    end
+end
+
+--- Predicate to test whether a node represents an expression.
+M.is_expr = M.has_tag('Nil', 'Dots', 'True', 'False', 'Number','String',
+                  'Function', 'Table', 'Op', 'Paren', 'Call', 'Invoke',
+                  'Id', 'Index')
+
+-- helper for is_stat
+local STAT_TAGS = { Do=1, Set=1, While=1, Repeat=1, If=1, Fornum=1,
+                    Forin=1, Local=1, Localrec=1, Return=1, Break=1 }
+
+--- Predicate to test whether a node represents a statement.
+--  It is context-aware, i.e. it recognizes `Call and `Invoke nodes
+--  used in a statement context as such.
+function M.is_stat(node, parent)
+    local tag = node.tag
+    if not tag then return false
+    elseif STAT_TAGS[tag] then return true
+    elseif tag=='Call' or tag=='Invoke' then return parent.tag==nil
+    else return false end
+end
+
+--- Predicate to test whether a node represents a statements block.
+function M.is_block(node) return node.tag==nil end
+
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+--
+-- Variables and scopes.
+--
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+
+local BINDER_PARENT_TAG = {
+   Local=true, Localrec=true, Forin=true, Function=true }
+
+--- Test whether a node is a binder. This is a local predicate, although it
+--  might need to inspect the parent node.
+function M.is_binder(node, parent)
+   --printf('is_binder(%s, %s)', table.tostring(node), table.tostring(parent))
+   if node.tag ~= 'Id' or not parent then return false end
+   if parent.tag=='Fornum' then  return parent[1]==node end
+   if not BINDER_PARENT_TAG[parent.tag] then return false end
+   for _, binder in ipairs(parent[1]) do
+       if binder==node then return true end
+   end
+   return false
+end
+
+--- Retrieve the binder associated to an occurrence within root.
+--  @param occurrence an Id node representing an occurrence in `root`.
+--  @param root the tree in which `occurrence` and its binder occur.
+--  @return the binder node, and its ancestors up to root if found.
+--  @return nil if `occurrence` is global (or not an occurrence) in `root`.
+function M.binder(occurrence, root)
+    local cfg, id_name, result = { }, occurrence[1], { }
+    function cfg.occurrence(id)
+        if id == occurrence then result = cfg.scope :get(id_name) end
+        -- TODO: break the walker
+    end
+    walk.guess(cfg, root)
+    return unpack(result)
+end
+
+--- Predicate to filter occurrences of a given binder.
+--  Warning: it relies on internal scope book-keeping,
+--  and for this reason, it only works as query method argument.
+--  It won't work outside of a query.
+--  @param binder the binder whose occurrences must be kept by predicate
+--  @return a predicate
+
+-- function M.is_occurrence_of(binder)
+--     return function(node, ...)
+--         if node.tag ~= 'Id' then return nil end
+--         if M.is_binder(node, ...) then return nil end
+--         local scope = ACTIVE_SCOPE[node]
+--         if not scope then return nil end
+--         local result = scope :get (node[1]) or { }
+--         if result[1] ~= binder then return nil end
+--         return unpack(result)
+--     end
+-- end
+
+function M.is_occurrence_of(binder)
+    return function(node, ...)
+        local b = M.get_binder(node)
+        return b and b==binder
+    end
+end
+
+function M.get_binder(occurrence, ...)
+    if occurrence.tag ~= 'Id' then return nil end
+    if M.is_binder(occurrence, ...) then return nil end
+    local scope = ACTIVE_SCOPE[occurrence]
+    -- outside of a query, no scope is active and there is no binder to report
+    local binder_hierarchy = scope and scope :get(occurrence[1])
+    return unpack (binder_hierarchy or { })
+end
+
+--- Transform a predicate on a node into a predicate on this node's
+--  parent. For instance if p tests whether a node has property P,
+--  then parent(p) tests whether this node's parent has property P.
+--  The ancestor level is specified by n, with 1 being the node itself,
+--  2 its parent, 3 its grand-parent etc.
+--  @param[optional] n the parent to examine, default=2
+--  @param pred the predicate to transform
+--  @return a predicate
+function M.parent(n, pred, ...)
+    if type(n)~='number' then n, pred = 2, n end
+    if type(pred)=='string' then pred = M.has_tag(pred, ...) end
+    return function(self, ...)
+        return select(n, ...) and pred(self, select(n, ...))
+    end
+end
+
+--- Transform a predicate on a node into a predicate on this node's
+--  n-th child.
+--  @param n the child's index number
+--  @param pred the predicate to transform
+--  @return a predicate
+function M.child(n, pred)
+    return function(node, ...)
+        local child = node[n]
+        return child and pred(child, node, ...)
+    end
+end
+
+--- Predicate to test the position of a node in its parent.
+--  The predicate succeeds if the node is the n-th child of its parent,
+--  and a <= n <= b.
+--  is_nth(a) is equivalent to is_nth(a, a).
+--  Negative indices are admitted, and count from the last child,
+--  as done for instance by string.sub().
+--
+--  TODO: This is wrong, this tests the table relationship rather than the
+--  AST node relationship.
+--  Must build a getindex helper, based on pattern matching, then build
+--  the predicate around it.
+--
+--  @param a lower bound
+--  @param b upper bound
+--  @return a predicate
+function M.is_nth(a, b)
+    b = b or a
+    return function(self, node, parent)
+        if not parent then return false end
+        local nchildren = #parent
+        local a = a<=0 and nchildren+a+1 or a
+        if a>nchildren then return false end
+        local b = b<=0 and nchildren+b+1 or b>nchildren and nchildren or b
+        for i=a,b do if parent[i]==node then return true end end
+        return false
+    end
+end
+
+
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+--
+-- Comments parsing.
+--
+-- -----------------------------------------------------------------------------
+-- -----------------------------------------------------------------------------
+
+local comment_extractor = |which_side| function (node)
+    local x = node.lineinfo
+    x = x and x[which_side]
+    x = x and x.comments
+    if not x then return nil end
+    local lines = { }
+    for _, record in ipairs(x) do
+        table.insert(lines, record[1])
+    end
+    return table.concat(lines, '\n')
+end
+
+M.comment_prefix = comment_extractor 'first'
+M.comment_suffix = comment_extractor 'last'
+
+
+--- Shortcut for the query constructor
+function M :__call(...) return self.treequery(...) end
+setmetatable(M, M)
+
+return M
diff --git a/metalua/treequery/walk.mlua b/metalua/treequery/walk.mlua
index 2e10956..67dacfd 100755
--- a/metalua/treequery/walk.mlua
+++ b/metalua/treequery/walk.mlua
@@ -1,249 +1,257 @@
-------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
-------------------------------------------------------------------------------
-
--- Low level AST traversal library.
--- This library is a helper for the higher-level treequery library.
--- It walks through every node of an AST, depth-first, and executes
--- some callbacks contained in its cfg config table:
---
--- * cfg.down(...) is called when it walks down a node, and receive as
---   parameters the node just entered, followed by its parent, grand-parent
---   etc. until the root node.
---
--- * cfg.up(...) is called when it walks back up a node, and receive as
---   parameters the node just entered, followed by its parent, grand-parent
---   etc. until the root node.
---
--- * cfg.occurrence(binder, id_node, ...) is called when it visits an `Id{ }
---   node which isn't a local variable creator. binder is a reference to its
---   binder with its context. The binder is the `Id{ } node which created
---   this local variable. By "binder and its context", we mean a list starting
---   with the `Id{ }, and followed by every ancestor of the binder node, up until
---   the common root node.
---   binder is nil if the variable is global.
---   id_node is followed by its ancestor, up until the root node.
---
--- cfg.scope is maintained during the traversal, associating a
--- variable name to the binder which creates it in the context of the
--- node currently visited.
---
--- walk.traverse.xxx functions are in charge of the recursive descent into
--- children nodes. They're private helpers.
---
--- corresponding walk.xxx functions also take care of calling cfg callbacks.
-
--{ extension ("match",...) }
-
-local M = { traverse = { }; tags = { }; debug = false }
-
---------------------------------------------------------------------------------
--- Standard tags: can be used to guess the type of an AST, or to check
--- that the type of an AST is respected.
---------------------------------------------------------------------------------
-M.tags.stat = table.transpose{
-   'Do', 'Set', 'While', 'Repeat', 'Local', 'Localrec', 'Return',
-   'Fornum', 'Forin', 'If', 'Break', 'Goto', 'Label',
-   'Call', 'Invoke' }
-M.tags.expr = table.transpose{
-   'Paren', 'Call', 'Invoke', 'Index', 'Op', 'Function', 'Stat',
-   'Table', 'Nil', 'Dots', 'True', 'False', 'Number', 'String', 'Id' }
-
---------------------------------------------------------------------------------
--- These [M.traverse.xxx()] functions are in charge of actually going through
--- ASTs. At each node, they make sure to call the appropriate walker.
---------------------------------------------------------------------------------
-function M.traverse.stat (cfg, x, ...)
-   if M.debug then printf("traverse stat %s", table.tostring(x)) end
-   local ancestors = {...}
-   local B  = |y| M.block       (cfg, y, x, unpack(ancestors)) -- Block
-   local S  = |y| M.stat        (cfg, y, x, unpack(ancestors)) -- Statement
-   local E  = |y| M.expr        (cfg, y, x, unpack(ancestors)) -- Expression
-   local EL = |y| M.expr_list   (cfg, y, x, unpack(ancestors)) -- Expression List
-   local IL = |y| M.binder_list (cfg, y, x, unpack(ancestors)) -- Id binders List
-   local OS = || cfg.scope :save()                             -- Open scope
-   local CS = || cfg.scope :restore()                          -- Close scope
-
-   match x with
-   | {...} if x.tag == nil -> for _, y in ipairs(x) do M.stat(cfg, y, ...) end
-                          -- no tag --> node not inserted in the history ancestors
-   | `Do{...}                    -> OS(x); for _, y in ipairs(x) do S(y) end; CS(x)
-   | `Set{ lhs, rhs }            -> EL(lhs); EL(rhs)
-   | `While{ cond, body }        -> E(cond); OS(); B(body); CS()
-   | `Repeat{ body, cond }       -> OS(body); B(body); E(cond); CS(body)
-   | `Local{ lhs }               -> IL(lhs)
-   | `Local{ lhs, rhs }          -> EL(rhs); IL(lhs)
-   | `Localrec{ lhs, rhs }       -> IL(lhs); EL(rhs)
-   | `Fornum{ i, a, b, body }    -> E(a); E(b); OS(); IL{i}; B(body); CS()
-   | `Fornum{ i, a, b, c, body } -> E(a); E(b); E(c); OS(); IL{i}; B(body); CS()
-   | `Forin{ i, rhs, body }      -> EL(rhs); OS(); IL(i); B(body); CS()
-   | `If{...}                    ->
-       for i=1, #x-1, 2 do
-           E(x[i]); OS(); B(x[i+1]); CS()
-       end
-       if #x%2 == 1 then
-           OS(); B(x[#x]); CS()
-       end
-   | `Call{...}|`Invoke{...}|`Return{...} -> EL(x)
-   | `Break | `Goto{ _ } | `Label{ _ }    -> -- nothing
-   | { tag=tag, ...} if M.tags.stat[tag]->
-      M.malformed (cfg, x, unpack (ancestors))
-   | _ ->
-      M.unknown (cfg, x, unpack (ancestors))
-   end
-end
-
-function M.traverse.expr (cfg, x, ...)
-   if M.debug then printf("traverse expr %s", table.tostring(x)) end
-   local ancestors = {...}
-   local B  = |y| M.block       (cfg, y, x, unpack(ancestors)) -- Block
-   local S  = |y| M.stat        (cfg, y, x, unpack(ancestors)) -- Statement
-   local E  = |y| M.expr        (cfg, y, x, unpack(ancestors)) -- Expression
-   local EL = |y| M.expr_list   (cfg, y, x, unpack(ancestors)) -- Expression List
-   local IL = |y| M.binder_list (cfg, y, x, unpack(ancestors)) -- Id binders list
-   local OS = || cfg.scope :save()                             -- Open scope
-   local CS = || cfg.scope :restore()                          -- Close scope
-
-   match x with
-   | `Paren{ e }               -> E(e)
-   | `Call{...} | `Invoke{...} -> EL(x)
-   | `Index{ a, b }            -> E(a); E(b)
-   | `Op{ opid, ... }          -> E(x[2]); if #x==3 then E(x[3]) end
-   | `Function{ params, body } -> OS(body); IL(params); B(body); CS(body)
-   | `Stat{ b, e }             -> OS(body); B(b); E(e); CS(body)
-   | `Id{ name }               -> M.occurrence(cfg, x, unpack(ancestors))
-   | `Table{ ... }             ->
-      for i = 1, #x do match x[i] with
-         | `Pair{ k, v } -> E(k); E(v)
-         | v             -> E(v)
-      end end
-   | `Nil|`Dots|`True|`False|`Number{_}|`String{_} -> -- terminal node
-   | { tag=tag, ...} if M.tags.expr[tag]-> M.malformed (cfg, x, unpack (ancestors))
-   | _ -> M.unknown (cfg, x, unpack (ancestors))
-   end
-end
-
-function M.traverse.block (cfg, x, ...)
+-------------------------------------------------------------------------------
+-- Copyright (c) 2006-2013 Fabien Fleutot and others.
+--
+-- All rights reserved.
+--
+-- This program and the accompanying materials are made available
+-- under the terms of the Eclipse Public License v1.0 which
+-- accompanies this distribution, and is available at
+-- http://www.eclipse.org/legal/epl-v10.html
+--
+-- This program and the accompanying materials are also made available
+-- under the terms of the MIT public license which accompanies this
+-- distribution, and is available at http://www.lua.org/license.html
+--
+-- Contributors:
+--     Fabien Fleutot - API and implementation
+--
+-------------------------------------------------------------------------------
+
+-- Low level AST traversal library.
+-- This library is a helper for the higher-level treequery library.
+-- It walks through every node of an AST, depth-first, and executes
+-- some callbacks contained in its cfg config table:
+--
+-- * cfg.down(...) is called when it walks down a node, and receives as
+--   parameters the node just entered, followed by its parent, grand-parent
+--   etc. until the root node.
+--
+-- * cfg.up(...) is called when it walks back up a node, and receives as
+--   parameters the node just entered, followed by its parent, grand-parent
+--   etc. until the root node.
+--
+-- * cfg.occurrence(binder, id_node, ...) is called when it visits an `Id{ }
+--   node which isn't a local variable creator. binder is a reference to its
+--   binder with its context. The binder is the `Id{ } node which created 
+--   this local variable. By "binder and its context", we mean a list starting
+--   with the `Id{ }, and followed by every ancestor of the binder node, up until
+--   the common root node.
+--   binder is nil if the variable is global.
+--   id_node is followed by its ancestors, up until the root node.
+--
+-- cfg.scope is maintained during the traversal, associating a
+-- variable name to the binder which creates it in the context of the
+-- node currently visited.
+--
+-- walk.traverse.xxx functions are in charge of the recursive descent into
+-- children nodes. They're private helpers.
+--
+-- corresponding walk.xxx functions also take care of calling cfg callbacks.
+
+-{ extension ("match", ...) }
+
+local pp = require 'metalua.pprint'
+
+local M = { traverse = { }; tags = { }; debug = false }
+
+local function table_transpose(t)
+    local tt = { }; for a, b in pairs(t) do tt[b]=a end; return tt
+end
+
+--------------------------------------------------------------------------------
+-- Standard tags: can be used to guess the type of an AST, or to check
+-- that the type of an AST is respected.
+--------------------------------------------------------------------------------
+M.tags.stat = table_transpose{
+   'Do', 'Set', 'While', 'Repeat', 'Local', 'Localrec', 'Return',
+   'Fornum', 'Forin', 'If', 'Break', 'Goto', 'Label',
+   'Call', 'Invoke' }
+M.tags.expr = table_transpose{
+   'Paren', 'Call', 'Invoke', 'Index', 'Op', 'Function', 'Stat',
+   'Table', 'Nil', 'Dots', 'True', 'False', 'Number', 'String', 'Id' }
+
+--------------------------------------------------------------------------------
+-- These [M.traverse.xxx()] functions are in charge of actually going through
+-- ASTs. At each node, they make sure to call the appropriate walker.
+--------------------------------------------------------------------------------
+function M.traverse.stat (cfg, x, ...)
+   if M.debug then pp.printf("traverse stat %s", x) end
+   local ancestors = {...}
+   local B  = |y| M.block       (cfg, y, x, unpack(ancestors)) -- Block
+   local S  = |y| M.stat        (cfg, y, x, unpack(ancestors)) -- Statement
+   local E  = |y| M.expr        (cfg, y, x, unpack(ancestors)) -- Expression
+   local EL = |y| M.expr_list   (cfg, y, x, unpack(ancestors)) -- Expression List
+   local IL = |y| M.binder_list (cfg, y, x, unpack(ancestors)) -- Id binders List
+   local OS = || cfg.scope :save()                             -- Open scope
+   local CS = || cfg.scope :restore()                          -- Close scope
+
+   match x with
+   | {...} if x.tag == nil -> for _, y in ipairs(x) do M.stat(cfg, y, ...) end
+                          -- no tag --> node not inserted in the ancestors list
+   | `Do{...}                    -> OS(x); for _, y in ipairs(x) do S(y) end; CS(x)
+   | `Set{ lhs, rhs }            -> EL(lhs); EL(rhs)
+   | `While{ cond, body }        -> E(cond); OS(); B(body); CS()
+   | `Repeat{ body, cond }       -> OS(body); B(body); E(cond); CS(body)
+   | `Local{ lhs }               -> IL(lhs)
+   | `Local{ lhs, rhs }          -> EL(rhs); IL(lhs)
+   | `Localrec{ lhs, rhs }       -> IL(lhs); EL(rhs)
+   | `Fornum{ i, a, b, body }    -> E(a); E(b); OS(); IL{i}; B(body); CS()
+   | `Fornum{ i, a, b, c, body } -> E(a); E(b); E(c); OS(); IL{i}; B(body); CS()
+   | `Forin{ i, rhs, body }      -> EL(rhs); OS(); IL(i); B(body); CS()
+   | `If{...}                    ->
+       for i=1, #x-1, 2 do
+           E(x[i]); OS(); B(x[i+1]); CS()
+       end
+       if #x%2 == 1 then
+           OS(); B(x[#x]); CS()
+       end
+   | `Call{...}|`Invoke{...}|`Return{...} -> EL(x)
+   | `Break | `Goto{ _ } | `Label{ _ }    -> -- nothing
+   | { tag=tag, ...} if M.tags.stat[tag]->
+      M.malformed (cfg, x, unpack (ancestors))
+   | _ ->
+      M.unknown (cfg, x, unpack (ancestors))
+   end
+end
+
+function M.traverse.expr (cfg, x, ...)
+   if M.debug then pp.printf("traverse expr %s", x) end
+   local ancestors = {...}
+   local B  = |y| M.block       (cfg, y, x, unpack(ancestors)) -- Block
+   local S  = |y| M.stat        (cfg, y, x, unpack(ancestors)) -- Statement
+   local E  = |y| M.expr        (cfg, y, x, unpack(ancestors)) -- Expression
+   local EL = |y| M.expr_list   (cfg, y, x, unpack(ancestors)) -- Expression List
+   local IL = |y| M.binder_list (cfg, y, x, unpack(ancestors)) -- Id binders list
+   local OS = || cfg.scope :save()                             -- Open scope
+   local CS = || cfg.scope :restore()                          -- Close scope
+
+   match x with
+   | `Paren{ e }               -> E(e)
+   | `Call{...} | `Invoke{...} -> EL(x)
+   | `Index{ a, b }            -> E(a); E(b)
+   | `Op{ opid, ... }          -> E(x[2]); if #x==3 then E(x[3]) end
+   | `Function{ params, body } -> OS(body); IL(params); B(body); CS(body)
+   | `Stat{ b, e }             -> OS(b); B(b); E(e); CS(b)
+   | `Id{ name }               -> M.occurrence(cfg, x, unpack(ancestors))
+   | `Table{ ... }             ->
+      for i = 1, #x do match x[i] with
+         | `Pair{ k, v } -> E(k); E(v)
+         | v             -> E(v)
+      end end
+   | `Nil|`Dots|`True|`False|`Number{_}|`String{_} -> -- terminal node
+   | { tag=tag, ...} if M.tags.expr[tag]-> M.malformed (cfg, x, unpack (ancestors))
+   | _ -> M.unknown (cfg, x, unpack (ancestors))
+   end
+end
+
+function M.traverse.block (cfg, x, ...)
    assert(type(x)=='table', "traverse.block() expects a table")
-   if x.tag then M.malformed(cfg, x, ...)
+   if x.tag then M.malformed(cfg, x, ...)
    else for _, y in ipairs(x) do M.stat(cfg, y, x, ...) end
-   end
-end
-
-function M.traverse.expr_list (cfg, x, ...)
-   assert(type(x)=='table', "traverse.expr_list() expects a table")
-   -- x doesn't appear in the ancestors
-   for _, y in ipairs(x) do M.expr(cfg, y, ...) end
-end
-
-function M.malformed(cfg, x, ...)
-    local f = cfg.malformed or cfg.error
-    if f then f(x, ...) else
-        error ("Malformed node of tag "..(x.tag or '(nil)'))
-    end
-end
-
-function M.unknown(cfg, x, ...)
-    local f = cfg.unknown or cfg.error
-    if f then f(x, ...) else
-        error ("Unknown node tag "..(x.tag or '(nil)'))
-    end
-end
-
-function M.occurrence(cfg, x, ...)
-    if cfg.occurrence then cfg.occurrence(cfg.scope :get(x[1]),  x, ...) end
-end
-
--- TODO: Is it useful to call each error handling function?
-function M.binder_list (cfg, id_list, ...)
-    local f = cfg.binder
-    local ferror = cfg.error or cfg.malformed or cfg.unknown
-    for i, id_node in ipairs(id_list) do
-      if id_node.tag == 'Id' then
-         cfg.scope :set (id_node[1], { id_node, ... })
-         if f then f(id_node, ...) end
-      elseif i==#id_list and id_node.tag=='Dots' then
-         -- Do nothing, those are valid `Dots
-      elseif ferror then
-         -- Traverse error handling function
-         ferror(id_node, ...)
-      else
-         error("Invalid binders list")
-      end
-   end
-end
-
-----------------------------------------------------------------------
--- Generic walker generator.
--- * if `cfg' has an entry matching the tree name, use this entry
--- * if not, try to use the entry whose name matched the ast kind
--- * if an entry is a table, look for 'up' and 'down' entries
--- * if it is a function, consider it as a `down' traverser.
-----------------------------------------------------------------------
-local walker_builder = function(traverse)
-   assert(traverse)
-   return function (cfg, ...)
-      if not cfg.scope then cfg.scope = M.newscope() end
-      local down, up = cfg.down, cfg.up
-      local broken = down and down(...)
-      if broken ~= 'break' then M.traverse[traverse] (cfg, ...) end
-      if up then up(...) end
-   end
-end
-
-----------------------------------------------------------------------
--- Declare [M.stat], [M.expr], [M.block] and [M.expr_list]
-----------------------------------------------------------------------
-for _, w in ipairs{ "stat", "expr", "block" } do --, "malformed", "unknown" } do
-   M[w] = walker_builder (w, M.traverse[w])
-end
-
--- Don't call up/down callbacks on expr lists
-M.expr_list = M.traverse.expr_list
-
-
-----------------------------------------------------------------------
--- Try to guess the type of the AST then choose the right walkker.
-----------------------------------------------------------------------
-function M.guess (cfg, x, ...)
-   assert(type(x)=='table', "arg #2 in a walker must be an AST")
-   if M.tags.expr[x.tag] then return M.expr(cfg, x, ...)  end
-   if M.tags.stat[x.tag] then return M.stat(cfg, x, ...)  end
-   if not x.tag          then return M.block(cfg, x, ...) end
-   error ("Can't guess the AST type from tag "..(x.tag or '<none>'))
-end
-
-local S = { }; S.__index = S
-
-function M.newscope()
-    local instance = { current = { } }
-    instance.stack = { instance.current }
-    setmetatable (instance, S)
-    return instance
-end
-
-function S :save(...)
-    table.insert (self.stack, table.shallow_copy (self.current))
-    if ... then return self :add(...) end
-end
-
-function S :restore() self.current = table.remove (self.stack) end
-function S :get (var_name) return self.current[var_name] end
-function S :set (key, val) self.current[key] = val end
-
+   end
+end
+
+function M.traverse.expr_list (cfg, x, ...)
+   assert(type(x)=='table', "traverse.expr_list() expects a table")
+   -- x doesn't appear in the ancestors
+   for _, y in ipairs(x) do M.expr(cfg, y, ...) end
+end
+
+function M.malformed(cfg, x, ...)
+    local f = cfg.malformed or cfg.error
+    if f then f(x, ...) else
+        error ("Malformed node of tag "..(x.tag or '(nil)'))
+    end
+end
+
+function M.unknown(cfg, x, ...)
+    local f = cfg.unknown or cfg.error
+    if f then f(x, ...) else
+        error ("Unknown node tag "..(x.tag or '(nil)'))
+    end
+end
+
+function M.occurrence(cfg, x, ...)
+    if cfg.occurrence then cfg.occurrence(cfg.scope :get(x[1]),  x, ...) end
+end
+
+-- TODO: Is it useful to call each error handling function?
+function M.binder_list (cfg, id_list, ...)
+    local f = cfg.binder
+    local ferror = cfg.error or cfg.malformed or cfg.unknown
+    for i, id_node in ipairs(id_list) do
+      if id_node.tag == 'Id' then
+         cfg.scope :set (id_node[1], { id_node, ... })
+         if f then f(id_node, ...) end
+      elseif i==#id_list and id_node.tag=='Dots' then
+         -- Do nothing, those are valid `Dots
+      elseif ferror then
+         -- Traverse error handling function
+         ferror(id_node, ...)
+      else
+         error("Invalid binders list")
+      end
+   end
+end
+
+----------------------------------------------------------------------
+-- Generic walker generator.
+-- * if `cfg' has an entry matching the tree name, use this entry
+-- * if not, try to use the entry whose name matched the ast kind
+-- * if an entry is a table, look for 'up' and 'down' entries
+-- * if it is a function, consider it as a `down' traverser.
+----------------------------------------------------------------------
+local walker_builder = function(traverse)
+   assert(traverse)
+   return function (cfg, ...)
+      if not cfg.scope then cfg.scope = M.newscope() end
+      local down, up = cfg.down, cfg.up
+      local broken = down and down(...)
+      if broken ~= 'break' then M.traverse[traverse] (cfg, ...) end
+      if up then up(...) end
+   end
+end
+
+----------------------------------------------------------------------
+-- Declare [M.stat], [M.expr], [M.block] and [M.expr_list]
+----------------------------------------------------------------------
+for _, w in ipairs{ "stat", "expr", "block" } do --, "malformed", "unknown" } do
+   M[w] = walker_builder (w, M.traverse[w])
+end
+
+-- Don't call up/down callbacks on expr lists
+M.expr_list = M.traverse.expr_list
+
+
+----------------------------------------------------------------------
+-- Try to guess the type of the AST, then choose the right walker.
+----------------------------------------------------------------------
+function M.guess (cfg, x, ...)
+   assert(type(x)=='table', "arg #2 in a walker must be an AST")
+   if M.tags.expr[x.tag] then return M.expr(cfg, x, ...)  end
+   if M.tags.stat[x.tag] then return M.stat(cfg, x, ...)  end
+   if not x.tag          then return M.block(cfg, x, ...) end
+   error ("Can't guess the AST type from tag "..(x.tag or '<none>'))
+end
+
+local S = { }; S.__index = S
+
+function M.newscope()
+    local instance = { current = { } }
+    instance.stack = { instance.current }
+    setmetatable (instance, S)
+    return instance
+end
+
+function S :save(...)
+    local current_copy = { }
+    for a, b in pairs(self.current) do current_copy[a]=b end
+    table.insert (self.stack, current_copy)
+    if ... then return self :add(...) end
+end
+
+function S :restore() self.current = table.remove (self.stack) end
+function S :get (var_name) return self.current[var_name] end
+function S :set (key, val) self.current[key] = val end
+
 return M
diff --git a/metalua/walk.mlua b/metalua/walk.mlua
deleted file mode 100644
index 139474f..0000000
--- a/metalua/walk.mlua
+++ /dev/null
@@ -1,338 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
--------------------------------------------------------------------------------
--- Code walkers
---
--- This library offers a generic way to write AST transforming
--- functions. Macros can take bits of AST as parameters and generate a
--- more complex AST with them; but modifying an AST a posteriori is
--- much more difficult; typical tasks requiring code walking are
--- transformation such as lazy evaluation or Continuation Passing
--- Style, but more mundane operations are required in more macros than
--- one would thing, such as "transform all returns which aren't inside
--- a nested function into an error throwing".
---
--- AST walking is an intrinsically advanced operation, and the
--- interface of this library, although it tries to remain as simple as
--- possible, is not trivial. You'll probably need to write a couple of
--- walkers with it before feeling comfortable.
---
---
--- We deal here with 3 important kinds of AST: statements, expressions
--- and blocks. Code walkers for these three kinds for AST are called
--- [walk.stat (cfg, ast)], [walk.expr (cfg, ast)] and [walk.block
--- (cfg, ast)] respectively. the [cfg] parameter describes what shall
--- happen as the AST is traversed by the walker, and [ast] is the tree
--- itself. 
---
--- An aparte to fellow functional programmers: although Lua has
--- got all the features that constitute a functional language, its
--- heart, and in particular it table data, is imperative. It's often
--- asking for trouble to work against the host language's nature, so
--- code walkers are imperative, cope with it. Or use table.deep_copy()
--- if you don't want issues with shared state.
---
--- Since walkers are imperative (i.e. they transform the tree in
--- place, rather than returning a fresh variant of it), you'll often
--- want to override a node, i.e. keep its "pointer identity", but
--- replace its content with a new one; this is done by
--- table.override(), and is conveniently abbreviated as
--- "target <- new_content".
---
--- So, [cfg] can contain a series of sub-tables fields 'expr', 'stat',
--- 'block'. each of them can contain a function up() and/or a function
--- down(). 
---
--- * down() is called when the walker starts visiting a node of the
---   matching kind, i.e. before any of its sub-nodes have been
---   visited.  down() is allowed to return either the string "break",
---   which means "don't go further down this tree, don't try to walk
---   its children", or nil, i.e. "please process with the children
---   nodes". 
---
---   There are two reasons why you might want down() to return
---   "break": either because you really weren't interested into the
---   children nodes,or because you wanted to walk through them in a
---   special way, and down() already performed this special walking.
---
--- * up() is called just before the node is left, i.e. after all of
---   its children nodes have been completely parsed, down and up. This
---   is a good place to put treatments which rely on sub-nodes being
---   already treated. Notice that if down() returned 'break', up() is
---   run immediately after.
---
--- In previous versions of this library, there were plenty of fancy
--- configurable ways to decide whether an up() or down() functions
--- would be triggered or not. Experience suggested that the best way
--- is to keep it simpler, as done by the current design: the functions
--- in sub-table expr are run on each expression node, and ditto for
--- stat and block; the user is expected to use the pattern matching
--- extension to decide whether to act or not on a given node.
---
--- Advanced features
--- =================
---
--- The version above is a strict subset of the truth: there are a
--- couple of other, more advanced features in the library.
---
--- Paths in visitor functions
--- --------------------------
--- First, up() and down() don't take only one node as a parameter, but
--- a series thereof: all the nested expr/stat/block nodes on the way
--- up to the ast's root. For instance, when a walker works on
--- +{ foo(bar*2+1) } an is on the node +{2}, up() and down() are called
--- with arguments (+{bar*2}, +{bar*2+1}, +{foo(bar*2+1)}).
---
--- `Call and `Invoke as statements
--- -------------------------------
--- `Call and `Invoke are normally expressions, but they can also
--- appear as statements. In this case, the cfg.expr.xxx() visitors
--- aren't called on them. Sometimes you want to consider tham as
--- expressions, sometimes not, and it's much easier to add a special
--- case in cfg.stat.xxx() visitors than to determine whether we're in
--- a statament's context in cfg.expr.xxx(),
---
--- Extra walkers
--- -------------
--- There are some second class walkers: walk.expr_list() and walk.guess(). 
---
--- * The first one walks through a list of expressions. Although used
---   internally by the other walkers, it remains a second class
---   citizen: the list it works on won't appear in the path of nested
---   ASTs that's passed to up() and down(). This design choice has
---   been made because there's no clear definition of what is or isn't
---   an expr list in an AST, and anyway such lists are probably not
---   part of metacoders' mental image of an AST, so it's been thought
---   best to let people pretend they don't exist.
---
--- * walk.guess() tries to guess the type of the AST it receives,
---   according to its tag, and runs the appropriate walker. Node which
---   can be both stats and exprs (`Call and `Invoke) are considered as
---   expr.
---
--- These three walkers, although used internally by the other walkers,
--- remain second class citizens: the lists they work on won't appear
--- in the path of nested ASTs that's passed to up() and down().
---
--- Tag dictionaries
--- ----------------
--- There are two public dictionaries, walk.tags.stat and
--- walk.tags.expr, which keep the set of all tags that can start a
--- statement or an expression AST. They're used by walk.guess, and
--- users sometimes need them as well, so they've been kept available.
---
--- Binder visitor
--- --------------
--- Finally, there's one last field in [cfg]: binder(). This function
--- is called on identifiers in a binder position, i.e. `Id{ } nodes
--- which create a scoped local variable, in `Function, `Fornum, `Local
--- etc. The main use case for that function is to keep track of
--- variables, captures, etc. and perform alpha conversions. In many
--- cases that work is best done through the library 'walk.id', which
--- understands the notions of scope, free variable, bound variable
--- etc. 
---
--- Binder visitors are called just before the variable's scope starts,
--- e.g. they're called after the right-hand-side has been visited in a
--- `Local node, but before in a `Localrec node.
---
--- TODO: document scopes, relaxed cfg descriptions
--- -----------------------------------------------
---
--- Examples of cfg structures:
---
--- { Id = f1, Local = f2 }
--- f
--- { up = f1, down = f2 }
--- { scope = { up = f1, down = f2 }, up = f1, down = f2 }
--- { stat = f1, expr = { up = f1 } }
---
---
--------------------------------------------------------------------------------
-
--{ extension ('match', ...) }
-
-walk = { traverse = { }; tags = { }; debug = false }
-
--------------------------------------------------------------------------------
--- Standard tags: can be used to guess the type of an AST, or to check
--- that the type of an AST is respected.
--------------------------------------------------------------------------------
-walk.tags.stat = table.transpose{ 
-   'Do', 'Set', 'While', 'Repeat', 'Local', 'Localrec', 'Return',
-   'Fornum', 'Forin', 'If', 'Break', 'Goto', 'Label',
-   'Call', 'Invoke' }
-walk.tags.expr = table.transpose{
-   'Paren', 'Call', 'Invoke', 'Index', 'Op', 'Function', 'Stat',
-   'Table', 'Nil', 'Dots', 'True', 'False', 'Number', 'String', 'Id' }
-
-local function scope (cfg, dir)
-   local h = cfg.scope and cfg.scope[dir]
-   if h then h() end
-end
-
---------------------------------------------------------------------------------
--- These [walk.traverse.xxx()] functions are in charge of actually going through
--- ASTs. At each node, they make sure to call the appropriate walker.
---------------------------------------------------------------------------------
-function walk.traverse.stat (cfg, x, ...)
-   if walk.debug then printf("traverse stat %s", table.tostring(x)) end
-   local log = {...}
-   local B  = |y| walk.block       (cfg, y, x, unpack(log))
-   local S  = |y| walk.stat        (cfg, y, x, unpack(log))
-   local E  = |y| walk.expr        (cfg, y, x, unpack(log))
-   local EL = |y| walk.expr_list   (cfg, y, x, unpack(log))
-   local I  = |y| walk.binder_list (cfg, y, x, unpack(log))
-   local function BS(y)
-      scope (cfg, 'down'); B(y); scope (cfg, 'up')
-   end
-
-   match x with
-   | {...} if x.tag == nil -> for y in ivalues(x) do walk.stat(cfg, y, ...) end
-                              -- no tag --> node not inserted in the history log
-   | `Do{...}                    -> BS(x)
-   | `Set{ lhs, rhs }            -> EL(lhs); EL(rhs)
-   | `While{ cond, body }        -> E(cond); BS(body)
-   | `Repeat{ body, cond }       -> scope(cfg, 'down'); B(body); E(cond); scope(cfg, 'up')
-   | `Local{ lhs }               -> I(lhs)
-   | `Local{ lhs, rhs }          -> EL(rhs); I(lhs)
-   | `Localrec{ lhs, rhs }       -> I(lhs); EL(rhs)
-   | `Fornum{ i, a, b, body }    -> E(a); E(b); I{i}; BS(body)
-   | `Fornum{ i, a, b, c, body } -> E(a); E(b); E(c); I{i}; BS(body)
-   | `Forin{ i, rhs, body }      -> EL(rhs); I(i); BS(body)
-   | `If{...}                    -> for i=1, #x-1, 2 do E(x[i]); BS(x[i+1]) end
-                                    if #x%2 == 1 then BS(x[#x]) end
-   | `Call{...}|`Invoke{...}|`Return{...} -> EL(x)
-   | `Break | `Goto{ _ } | `Label{ _ }    -> -- nothing
-   | { tag=tag, ...} if walk.tags.stat[tag]-> 
-      walk.malformed (cfg, x, unpack (log))
-   | _ ->  
-      walk.unknown (cfg, x, unpack (log))
-   end
-end
-
-function walk.traverse.expr (cfg, x, ...)
-   if walk.debug then printf("traverse expr %s", table.tostring(x)) end
-   local log = {...}
-   local B  = |y| walk.block       (cfg, y, x, unpack(log))
-   local S  = |y| walk.stat        (cfg, y, x, unpack(log))
-   local E  = |y| walk.expr        (cfg, y, x, unpack(log))
-   local EL = |y| walk.expr_list   (cfg, y, x, unpack(log)) 
-   local I  = |y| walk.binder_list (cfg, y, x, unpack(log))
-   match x with
-   | `Paren{ e }               -> E(e)
-   | `Call{...} | `Invoke{...} -> EL(x)
-   | `Index{ a, b }            -> E(a); E(b)
-   | `Op{ opid, ... }          -> E(x[2]); if #x==3 then E(x[3]) end
-   | `Function{ params, body } -> I(params); scope(cfg, 'down'); B(body); scope (cfg, 'in')
-   | `Stat{ b, e }             -> scope(cfg, 'down'); B(b); E(e); scope (cfg, 'in')
-   | `Table{ ... }             ->
-      for i = 1, #x do match x[i] with
-         | `Pair{ k, v } -> E(k); E(v)
-         | v            -> E(v)
-      end end
-   |`Nil|`Dots|`True|`False|`Number{_}|`String{_}|`Id{_} -> -- nothing 
-   | { tag=tag, ...} if walk.tags.expr[tag]-> 
-      walk.malformed (cfg, x, unpack (log))
-   | _ ->  
-      walk.unknown (cfg, x, unpack (log))
-   end
-end
-
-function walk.traverse.block (cfg, x, ...)
-   assert(type(x)=='table', "traverse.block() expects a table")
-   for _, y in ipairs(x) do walk.stat(cfg, y, x, ...) end
-end
-
-function walk.traverse.expr_list (cfg, x, ...)
-   assert(type(x)=='table', "traverse.expr_list() expects a table")
-   -- x doesn't appear in the log
-   for _, y in ipairs(x) do walk.expr(cfg, y, ...) end
-end
-
-
-function walk.malformed(cfg, x, ...)
-    local f = cfg.malformed or cfg.error
-    if f then f(x, ...) else
-        error ("Malformed node of tag "..(x.tag or '(nil)'))
-    end
-end
-
-function walk.unknown(cfg, x, ...)
-    local f = cfg.unknown or cfg.error
-    if f then f(x, ...) else
-        error ("Unknown node tag "..(x.tag or '(nil)'))
-    end
-end
-
-----------------------------------------------------------------------
--- Generic walker generator.
--- * if `cfg' has an entry matching the tree name, use this entry
--- * if not, try to use the entry whose name matched the ast kind
--- * if an entry is a table, look for 'up' and 'down' entries
--- * if it is a function, consider it as a `down' traverser.
-----------------------------------------------------------------------
-local walker_builder = |cfg_field, traverse| function (cfg, x, ...)
-   local sub_cfg = type (x)=='table' and x.tag and cfg[x.tag] 
-      or cfg[cfg_field] or cfg
-   local broken, down, up = false
-   if type(sub_cfg)=='table' then
-      down, up = sub_cfg.down, sub_cfg.up
-   elseif type(sub_cfg)=='function' or sub_cfg=='break' then
-      down, up = sub_cfg, nil
-   else error "Invalid walk config" end
-
-   if down then
-      if down=='break' then broken='break'
-      else broken = down (x, ...) end
-      assert(not broken or broken=='break', 
-             "Map functions must return 'break' or nil")
-   end
-   if not broken and traverse then traverse (cfg, x, ...) end
-   if up then up (x, ...) end
-end
-
-----------------------------------------------------------------------
--- Declare [walk.stat], [walk.expr], [walk.block] and [walk.expr_list]
-----------------------------------------------------------------------
-for _, w in ipairs{ "stat", "expr", "block", "expr_list" } do
-   walk[w] = walker_builder (w, walk.traverse[w])
-end
-
-----------------------------------------------------------------------
--- Walk a list of `Id{...} (mainly a helper function actually).
-----------------------------------------------------------------------
-function walk.binder_list (cfg, x, ...)
-   local f = cfg.binder 
-   if f then for _, v in ipairs(x) do f(v, ...) end end
-end
-
-----------------------------------------------------------------------
--- Tries to guess the type of the AST then choose the right walkker.
-----------------------------------------------------------------------
-function walk.guess (cfg, x, ...)
-   assert(type(x)=='table', "arg #2 in a walker must be an AST")
-   if walk.tags.expr[x.tag] then return walk.expr(cfg, x, ...)  end
-   if walk.tags.stat[x.tag] then return walk.stat(cfg, x, ...)  end
-   if not x.tag             then return walk.block(cfg, x, ...) end
-   error ("Can't guess the AST type from tag "..(x.tag or '<none>')) 
-end
-
-return walk
diff --git a/metalua/walk/bindings.lua b/metalua/walk/bindings.lua
deleted file mode 100644
index 33ae4e1..0000000
--- a/metalua/walk/bindings.lua
+++ /dev/null
@@ -1,55 +0,0 @@
---------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
---------------------------------------------------------------------------------
-
-local W = require 'metalua.treequery.walk'
-
-local function bindings(ast)
-	local cfg     = { }
-	local locals  = { }
-	local globals = { }
-
-	function cfg.occurrence(binding_ctx, id_occ, ...)
-		if binding_ctx then
-			local binder = binding_ctx[2]
-			local id_name = id_occ[1]
-			if not locals[binder][id_name] then
-				locals[binder][id_name] = {}
-			end
-			table.insert(locals[binder][id_name], id_occ)
-		else
-			local occ_name = id_occ[1]
-			local t = globals[occ_name]
-			if t then table.insert(t, id_occ) else globals[occ_name]={ id_occ } end
-		end	
-	end
-
-	function cfg.binder(id, stat, ...)
-		local id_name = id[1]
-		if not locals[stat] then
-			locals[stat] = {}
-		end
-		if not locals[stat][id_name] then
-			locals[stat][id_name] = {}
-		end
-	end
-	W.guess(cfg, ast)
-	return locals, globals
-end
-
-return bindings
diff --git a/metalua/walk/id.mlua b/metalua/walk/id.mlua
deleted file mode 100644
index 52eabe5..0000000
--- a/metalua/walk/id.mlua
+++ /dev/null
@@ -1,205 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
--------------------------------------------------------------------------------
---
--- This library walks AST to gather information about the identifiers
--- in it. It classifies them between free variables and bound
--- variables, and keeps track of which AST node created a given bound
--- variable occurence.
---
--- walk_id (kind, ast)
---
--- Input:
--- * an AST kind: 'expr', 'stat', 'block', 'expr_list', 'binder_list', 'guess'
--- * an AST of the corresponding kind.
---
--- > string, AST
---
--- Output: a table with two fields, 'bound' and 'free';
--- * free associates the name of each free variable with the list of
---   all its occurences in the AST. That list is never empty.
--- * bound associates each stat or expr binding a new variable with
---   the occurences of that/those new variable(s).
---
--- > { free  = table (string, AST and `Id{ });
--- >   bound = table (AST, table(AST and `Id{ })) }
---
--- How it works
--- ============
--- Walk the tree to:
--- * locate open variables, and keep pointers on them so that they can
---   be alpha converted.
--- * locate variable bindings, so that we can find bound variables
--- * locate bound variables, keep them in association with their
---   binder, again in order to alpha-convert them.
---
--- Special treatments:
--- * `Function `Local `Localrec `Fornum `Forin have binders;
---   `Local takes effect from the next statement, 
---   `Localrec from the current statement,
---   `Function and other statments inside their bodies.
--- * `Repeat has a special scoping rule for its condition.
--- * blocks create temporary scopes
--- * `Splice must stop the walking, so that user code won't be
---   converted
---
--------------------------------------------------------------------------------
-
--{ extension ('match', ...) }
--- -{ extension ('log', ...) }
-
-require 'metalua.walk'
-require 'metalua.walk.scope'
-
--- variable lists auto-create empty list as values by default.
-local varlist_mt = { __index = function (self, key) 
-                                  local x={ }; self[key] = x; return x 
-                               end }
-
-local function _walk_id (kind, supercfg, ast, ...)
-
-   assert(walk[kind], "Inbalid AST kind selector")
-   assert(type(supercfg=='table'), "Config table expected")
-   assert(type(ast)=='table', "AST expected")
-
-   local cfg = { expr = { }; block = { }; stat = { } }
-   local scope = scope:new()
-
-   local visit_bound_var, visit_free_var
-   if not supercfg.id then
-      printf("Warning, you're using the id walker without id visitor. "..
-             "If you know what you want do to, then you're probably doing "..
-             "something else...")
-      visit_bound_var = || nil
-      visit_free_var  = || nil
-   else
-      visit_free_var  = supercfg.id.free  or || nil
-      visit_bound_var = supercfg.id.bound or || nil
-   end
-
-   -----------------------------------------------------------------------------
-   -- Check identifiers; add functions parameters to scope
-   -----------------------------------------------------------------------------
-   function cfg.expr.down(x, ...)
-      -- Execute the generic expression walker; if it breaks,
-      -- don't do the id walking.
-      if supercfg.expr and  supercfg.expr.down then  
-         local r = supercfg.expr.down(x, ...)
-         if r then return r end
-      end
-      local parents = {...}
-      match x with
-      | `Id{ name } ->
-         local binder, r = scope.current[name] -- binder :: ast which bound var
-         if binder then 
-            --$log( 'walk.id found a bound var:', x, binder)
-            r = visit_bound_var(x, binder, unpack(parents))
-         else 
-            --$log( 'walk.id found a free var:', x, scope.current)
-            r = visit_free_var(x, unpack(parents))
-         end
-         if r then return r end
-      | `Function{ params, _ } -> scope:push (params, x)
-      | `Stat{ block, expr }   -> 
-         -------------------------------------------------------------
-         -- 'expr' is in the scope of 'block': create the scope and
-         -- walk the block 'manually', then prevent automatic walk
-         -- by returning 'break'.
-         -------------------------------------------------------------
-         scope:push()
-         for _, stat in ipairs (block) do walk.stat(cfg, stat, x, ...) end 
-         walk.expr(cfg, expr, x, unpack(parents))
-         scope:pop()
-         return 'break'
-      | _ -> -- pass
-      end
-
-   end
-
-   -----------------------------------------------------------------------------
-   -- Close the function scope opened by 'down()'
-   -----------------------------------------------------------------------------
-   function cfg.expr.up(x, ...)   
-      match x with `Function{...} -> scope:pop() | _ -> end
-      if supercfg.expr and supercfg.expr.up then supercfg.expr.up(x, ...) end
-   end
-
-   -----------------------------------------------------------------------------
-   -- Create a new scope and register loop variable[s] in it
-   -----------------------------------------------------------------------------
-   function cfg.stat.down(x, ...)
-      -- Execute the generic statement walker; if it breaks,
-      -- don't do the id walking.
-      if supercfg.stat and supercfg.stat.down then  
-         local r = supercfg.stat.down(x, ...)
-         if r then return r end
-      end
-      match x with
-      | `Forin{ vars, ... }    -> scope:push (vars,  x)
-      | `Fornum{ var, ... }    -> scope:push ({var}, x)
-      | `Localrec{ vars, ... } -> scope:add  (vars,  x)
-      | `Repeat{ block, expr } ->
-         -------------------------------------------------------------
-         -- 'expr' is in the scope of 'block': create the scope and
-         -- walk the block 'manually', then prevent automatic walk
-         -- by returning 'break'.
-         -------------------------------------------------------------
-         scope:push() 
-         for _, stat in ipairs (block) do walk.stat(cfg, stat, x, ...) end 
-         walk.expr(cfg, expr, x, ...)
-         scope:pop()
-         return 'break'
-      | _ -> -- pass
-      end
-   end
-
-   -----------------------------------------------------------------------------
-   -- Close the scopes opened by 'up()'
-   -----------------------------------------------------------------------------
-   function cfg.stat.up(x, ...)
-      match x with
-      | `Forin{ ... } | `Fornum{ ... } -> scope:pop() 
-      | `Local{ vars, ... }            -> scope:add(vars, x)
-      | _                              -> -- pass
-      -- `Repeat has no up(), because it 'break's.
-      end
-      if supercfg.stat and supercfg.stat.up then supercfg.stat.up(x, ...) end
-   end
-
-   -----------------------------------------------------------------------------
-   -- Create a separate scope for each block
-   -----------------------------------------------------------------------------
-   function cfg.block.down(x, ...) 
-      if supercfg.block and supercfg.block.down then
-         local r = supercfg.block.down(x, ...) 
-         if r then return r end
-      end
-      scope:push() 
-   end
-   function cfg.block.up(x, ...) 
-      scope:pop() 
-      if supercfg.block and supercfg.block.up then supercfg.block.up(x, ...) end
-   end
-   cfg.binder = supercfg.binder
-   walk[kind](cfg, ast, ...)
-end
-
-local mt = { __index = |_,k| |...| _walk_id(k, ...) }
-walk_id = setmetatable({ }, mt)
diff --git a/metalua/walk/scope.lua b/metalua/walk/scope.lua
deleted file mode 100644
index 29d9402..0000000
--- a/metalua/walk/scope.lua
+++ /dev/null
@@ -1,73 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (c) 2006-2013 Fabien Fleutot and others.
---
--- All rights reserved.
---
--- This program and the accompanying materials are made available
--- under the terms of the Eclipse Public License v1.0 which
--- accompanies this distribution, and is available at
--- http://www.eclipse.org/legal/epl-v10.html
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--- Contributors:
---     Fabien Fleutot - API and implementation
---
--------------------------------------------------------------------------------
-
--------------------------------------------------------------------------------
---
--- Scopes: this library helps keeping track of identifier scopes,
--- typically in code walkers.
---
--- * scope:new() returns a new scope instance s
---
--- * s:push() bookmarks the current set of variables, so that it can be
---   retrieved next time a s:pop() is performed.
---
--- * s:pop() retrieves the last state saved by s:push(). Calls to
---   :push() and :pop() can be nested as deep as one wants.
---
--- * s:add(var_list, val) adds new variable names (strings) into the
---   scope, as keys. val is the (optional) value associated with them:
---   it allows attaching arbitrary information to variables, e.g. the
---   statement or expression that created them.
---
--- * s:push(var_list, val) is a shortcut for 
---   s:push(); s:add(var_list, val).
---
--- * s.current is the current scope, a table with variable names as
---   keys and their associated value val (or 'true') as value.
---
--------------------------------------------------------------------------------
-
-scope = { }
-scope.__index = scope
-
-function scope:new()
-   local ret = { current = { } }
-   ret.stack = { ret.current }
-   setmetatable (ret, self)
-   return ret
-end
-
-function scope:push(...)
-   table.insert (self.stack, table.shallow_copy (self.current))
-   if ... then return self:add(...) end
-end
-
-function scope:pop()
-   self.current = table.remove (self.stack)
-end
-
-function scope:add (vars, val)
-   val = val or true
-   for i, id in ipairs (vars) do
-      assert(id.tag=='Id' or id.tag=='Dots' and i==#vars)
-      if id.tag=='Id' then self.current[id[1]] = val end
-   end
-end
-
-return scope
\ No newline at end of file
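The deleted scope helper is small enough to sketch standalone in plain Lua. This simplified version inlines the shallow copy (the original relies on `table.shallow_copy` from Metalua's stdlib extensions) and registers plain string names rather than `Id` AST nodes:

```lua
local scope = {}
scope.__index = scope

local function shallow_copy(t)
   local u = {}
   for k, v in pairs(t) do u[k] = v end
   return u
end

function scope:new()
   local ret = { current = {} }
   ret.stack = { ret.current }
   return setmetatable(ret, self)
end

-- Bookmark the current variable set; restored by the matching :pop().
function scope:push(vars, val)
   table.insert(self.stack, shallow_copy(self.current))
   if vars then self:add(vars, val) end
end

function scope:pop()
   self.current = table.remove(self.stack)
end

-- Register variable names, optionally associated with e.g. their binder.
function scope:add(vars, val)
   for _, name in ipairs(vars) do
      self.current[name] = val or true
   end
end

local s = scope:new()
s:push({ "x" }, "binder-1")   -- open a scope and bind x in it
assert(s.current.x == "binder-1")
s:pop()                       -- closing the scope forgets x
assert(s.current.x == nil)
```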
diff --git a/strict.lua b/strict.lua
deleted file mode 100644
index 9032ca7..0000000
--- a/strict.lua
+++ /dev/null
@@ -1,54 +0,0 @@
--------------------------------------------------------------------------------
--- Copyright (C) 1994-2008 Lua.org, PUC-Rio.  All rights reserved.
---
--- This program and the accompanying materials are also made available
--- under the terms of the MIT public license which accompanies this
--- distribution, and is available at http://www.lua.org/license.html
---
--------------------------------------------------------------------------------
-
---
--- strict.lua
--- checks uses of undeclared global variables
--- All global variables must be 'declared' through a regular assignment
--- (even assigning nil will do) in a main chunk before being used
--- anywhere or assigned to inside a function.
---
-
-local getinfo, error, rawset, rawget = debug.getinfo, error, rawset, rawget
-
-local mt = getmetatable(_G)
-if mt == nil then
-  mt = {}
-  setmetatable(_G, mt)
-end
-
-__strict = true
-mt.__declared = {}
-
-local function what ()
-  local d = getinfo(3, "S")
-  return d and d.what or "C"
-end
-
-mt.__newindex = function (t, n, v)
-  if __strict and not mt.__declared[n] then
-    local w = what()
-    if w ~= "main" and w ~= "C" then
-      error("assign to undeclared variable '"..n.."'", 2)
-    end
-    mt.__declared[n] = true
-  end
-  rawset(t, n, v)
-end
-  
-mt.__index = function (t, n)
-  if __strict and not mt.__declared[n] and what() ~= "C" then
-    error("variable '"..n.."' is not declared", 2)
-  end
-  return rawget(t, n)
-end
-
-function global(...)
-   for _, v in ipairs{...} do mt.__declared[v] = true end
-end