Papers and code
This page archives some of my work over the years.
Research work
- On plan recognition (off-and-on since 2006). If we have a plan library describing how tasks are decomposed into actions, then given observations of actions we can form hypotheses about the high-level intention(s) of the observed actor(s); a toy sketch of the idea follows this entry. I've worked on plan recognition for the Integrated Learning, Self-Reconfiguring Systems, and Deep Green programs.
- My current work focuses on adapting state-of-the-art parsing techniques like GLR to plan recognition, by extending parsers for shuffle expressions to express the interleaving of different subgoals' observations. Algorithm YR is presented in this PAIR 2017 paper.
- This ICAPS 2008 paper (with Geib and Goldman) is probably the most technical, and describes the underlying algorithm in good detail. We also had a workshop paper at PAIR'09.
- There's a paper about the Self-Reconfiguring Systems deployment from HICS'09, with many co-authors but mostly by Tom Haigh and Steve Harp.
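Here is the toy sketch promised above: a tiny, hypothetical plan library and a prefix test standing in for real recognition. Everything here - the library, the names, the approach - is invented for illustration; in particular this is not Algorithm YR, which has to handle interleaved subgoals and much more.

    ;; A two-goal plan library: each entry pairs a goal with one
    ;; sequence of primitive actions that achieves it.
    (defparameter *plan-library*
      '((make-tea   (boil-water steep-tea))
        (make-pasta (boil-water add-pasta drain))))

    (defun expansions (goal)
      "Return the action sequences that decompose GOAL."
      (loop for (g steps) in *plan-library*
            when (eq g goal) collect steps))

    (defun explains-p (goal observations)
      "Does some expansion of GOAL begin with the observed actions?"
      (loop for steps in (expansions goal)
              thereis (equal observations
                             (subseq steps 0 (min (length observations)
                                                  (length steps))))))

    (defun hypotheses (observations)
      "All goals consistent with the observations so far."
      (loop for (goal nil) in *plan-library*
            when (explains-p goal observations) collect goal))

After observing only boil-water, both goals remain live hypotheses; observing steep-tea next narrows the set to make-tea. Real plan recognizers do the same winnowing, but over ranked hypotheses and interleaved plans rather than a flat list.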
- On qualitative probability (2013-2015). Z+ is a well-studied
system of qualitative probability, concerned with (a formalization
of) level of surprise instead of exact numeric values. We used
Z+ in the intrusion detection system (IDS) MIFD, as in MIFD's
predecessor Scyllarus, to assess the likelihood of various cyber
attack events based on reports from IDSs. We experimentally
analyzed the accuracy of Z+ as its qualitative approximation
becomes less and less representative of true distributions. The
interest of these results is not limited to intrusion detection:
the method used in our systems is a general abductive scheme,
based on qualitative Bayes networks, so the results are applicable
to other information fusion and diagnostic applications. We were
surprised to find that ours is the only experimental investigation
of the accuracy of Z+ as an approximation of conventional probability. (A small sketch of the ranking arithmetic follows this entry.)
- This paper (with Goldman) from QR'2015 describes our results.
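The flavor of the ranking arithmetic mentioned above is easy to show. In the kappa-ranking view behind Z+, each event carries a degree of surprise (0 = unsurprising; roughly, kappa is the exponent k when a probability is O(epsilon^k)): conjunction of independent events adds surprise, and a disjunction is only as surprising as its least surprising disjunct. A minimal sketch with made-up events and rankings - an illustration of the arithmetic only, not of MIFD:

    ;; Hypothetical surprise rankings for two independent events.
    (defparameter *kappa* '((port-scan . 1) (root-compromise . 3)))

    (defun kappa (event) (cdr (assoc event *kappa*)))

    ;; kappa(a and b) = kappa(a) + kappa(b)   (a, b independent)
    ;; kappa(a or b)  = min(kappa(a), kappa(b))
    (defun kappa-and (&rest events) (reduce #'+   (mapcar #'kappa events)))
    (defun kappa-or  (&rest events) (reduce #'min (mapcar #'kappa events)))

    ;; (kappa-and 'port-scan 'root-compromise) => 4
    ;; (kappa-or  'port-scan 'root-compromise) => 1

The accuracy question in the paper is exactly how much is lost when true numeric distributions are collapsed onto these small integer ranks.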
- On hardware verification (2011-2012). As part of our work on DARPA's META program for cyber-physical systems, we experimented with culprit identification in probabilistic verification: determining which design components contribute to a failure.
- This ASME'12 paper describes our results (with Musliner and Woods).
- On Semantic Web languages (2006-2010). As part of DARPA's Integrated Learning project, I implemented an interpreter for LTML, a planning language focused on automated learning for web service usage. Three papers describe this work:
- A workshop paper on the language LTML (with Burstein and several others) from the 2009 ISWC workshop on Semantics for the Rest of Us.
- A conference paper (with Goldman) describing the interpreter, its approach to planning for web services, and its support of replanning, from ICAPS 2010.
- A conference paper (with Goldman) describing the Lisp implementation of the interpreter, from the 2009 International Lisp Conference.
- On design patterns (1997). Design patterns allow common programming idioms to be reused from implementation to implementation. Chris Salzmann, at the time a master's student at Karlsruhe, and I explored the use of design patterns to build self-extensible systems, in particular an extensible editor. Chris designed and implemented a prototype in the Pizza programming language; for years this page said he'd release the source code once he got a chance to clean it up, but at this point it's safe to assume that's not going to happen.
- Our experience is summarized in this conference paper (with Salzmann).
- The lambda calculus and minimal intuitionistic linear logic
(1996). The motivating idea of this work was that one can better compare different ways of reducing terms in the lambda calculus by mapping the different reduction strategies, each in its own way, into a single third system. The standard example from the distant past is Plotkin's two continuation-passing transforms for call-by-name and call-by-value reduction: after the appropriate transformation, each reduction order can simulate the other, and in fact the order is no longer relevant. (The two transforms are sketched after this entry.)
- There is a journal paper (with Odersky, Wadler and Turner) describing the translation of call-by-name and call-by-value into a linear lambda calculus, and call-by-need into an affine calculus; and a conference paper describing how all three calling conventions can be mapped into a single system with separate modes for weakening and contraction.
- My doctoral thesis is also about this sort of thing.
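For a taste of the "single third system" idea, here are Plotkin's two transforms in their standard textbook form, written from memory (the papers' targets are linear and affine calculi rather than the plain lambda calculus used here). Call-by-name, writing M^n for the translation of M:

    \begin{align*}
      x^n             &= x \\
      (\lambda x.M)^n &= \lambda k.\, k\,(\lambda x.\, M^n) \\
      (M\,N)^n        &= \lambda k.\, M^n\,(\lambda m.\, m\,N^n\,k)
    \end{align*}

and call-by-value:

    \begin{align*}
      x^v             &= \lambda k.\, k\,x \\
      (\lambda x.M)^v &= \lambda k.\, k\,(\lambda x.\, M^v) \\
      (M\,N)^v        &= \lambda k.\, M^v\,(\lambda m.\, N^v\,(\lambda n.\, m\,n\,k))
    \end{align*}

Under call-by-name a variable stands for a suspended computation and is passed unevaluated; under call-by-value the argument is reduced to a value n before the function m ever sees it. Once terms are in the common target, the original evaluation order no longer matters.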
- The call-by-need lambda calculus (1994). The intuitive idea of the call-by-need lambda calculus is to capture both the call-by-name property that only needed terms are reduced and the call-by-value property that only values are copied. Moreover, call-by-need operates only on terms, without requiring a separate heap of bindings, which can make reasoning with it simpler. (The core axioms are sketched after this entry.)
- There is a journal paper (with Odersky and Wadler).
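For a taste of the calculus (rules reconstructed from memory and simplified - the journal paper is authoritative): terms carry their own bindings via let, and only values are ever substituted.

    \begin{align*}
      (\lambda x.M)\,N &\to \mathbf{let}\ x = N\ \mathbf{in}\ M && (I) \\
      \mathbf{let}\ x = V\ \mathbf{in}\ M &\to M[x := V] && (V),\ V\ \text{a value} \\
      (\mathbf{let}\ x = L\ \mathbf{in}\ M)\,N &\to \mathbf{let}\ x = L\ \mathbf{in}\ M\,N && (C) \\
      \mathbf{let}\ y = (\mathbf{let}\ x = L\ \mathbf{in}\ M)\ \mathbf{in}\ N &\to \mathbf{let}\ x = L\ \mathbf{in}\ \mathbf{let}\ y = M\ \mathbf{in}\ N && (A)
    \end{align*}

Rule (I) suspends the argument instead of copying it (the call-by-name half), while (V) substitutes only values (the call-by-value half); (C) and (A) just let suspended bindings float out of the way. The lets play the role of the heap, which is why no separate heap of bindings is needed.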
- On classical linear logic and its lambda calculus (1999). The
main idea is a natural deduction system and lambda calculus for
full, classical linear logic. (There are also decorations of the
Schellinx translations from the usual lambda calculi into this
system, corresponding to call-by-name and by-value, although it
turned out that Ichiro Ogata had done more-or-less the same sort
of thing, looking at LKQ and LKT, a few months before.)
- At some point after putting this work aside - perhaps in moving back from Australia, or transferring data between computers, or after a disk failure - I lost track of its files. But it is described in Technical Report ACRC/00/014 from the Advanced Computing Research Centre of the University of South Australia (ask the School's secretary to mail you a hard copy).
- On compiling functional-logic programming languages (1994). Implementations of logic languages, generally speaking, have some sort of computational step which allows reconsideration of previous decisions to bind variables to particular terms - usually some sort of backtracking. Functional logic languages - again, generally speaking - augment the reduction semantics of functional languages with the backtracking mechanism of logic languages. This work describes an implementation technique called quasisharing which allows information computed in one branch of a search to be reused, when relevant, after backtracking into a new branch; the payoff is illustrated after this entry. The technique resembles Lamping's strategy for optimal lambda reduction.
- There is a conference paper focusing on the design and the graph calculus, and a workshop paper that discusses the abstract machine (both with Silbermann). My master's thesis is also about this topic.
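The quasisharing machinery itself is a graph calculus, but the payoff is easy to illustrate: if a deferred computation is shared, forcing it in one branch of the search leaves the result in place after backtracking. A much-simplified sketch, with memoizing thunks standing in for shared graph nodes (this shows only the idea, not the actual technique):

    ;; A memoizing thunk: compute on first FUNCALL, then cache.
    (defun make-thunk (fn)
      (let ((forced nil) (value nil))
        (lambda ()
          (unless forced
            (setf value (funcall fn)
                  forced t))
          value)))

    (let ((shared (make-thunk (lambda ()
                                (format t "~&computing shared subterm~%")
                                42))))
      ;; Branch 1 forces the shared node, then fails; we backtrack.
      (when (> (funcall shared) 100) (error "dead branch"))
      ;; Branch 2 reuses the cached value: no recomputation, and
      ;; "computing shared subterm" prints only once.
      (funcall shared))

The hard part, and the subject of the papers, is doing this safely in a graph-reduction setting where different branches may also bind variables differently.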
There is also a BibTeX file with entries for (most of) the above papers, their abstracts, and various superseded earlier papers on those topics. My Google Scholar page is probably more comprehensive than this list in terms of minor papers, workshops, etc.
Code
More and more of my code is on GitHub. This page discusses some of the highlights there, and a handful of things available elsewhere.
- I wrote a simple Perl script to generate HTML pages for browsing a collection of BibTeX files and related PDFs, etc. It lives on GitHub.
- I'm the primary author of the NST unit test framework for Common Lisp. (A small example follows this entry.)
- There's a paper about NST from the 2010 International Lisp Conference.
- The slides from the associated talk are more focused than the paper, concentrating on the choice of abstractions which a Lisp unit test framework should provide.
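For a taste of NST's core forms (written from memory - the manual and the paper above are authoritative): tests live in groups, and each test pairs a criterion with forms whose values it checks.

    (nst:def-test-group arithmetic-tests ()
      (nst:def-test two-plus-two (:eql 4)
        (+ 2 2))
      (nst:def-test consing (:equal '(a b))
        (cons 'a '(b))))

Criteria like :eql and :equal are themselves an extensible abstraction - the point the slides dwell on.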
- My preference-for-the-moment when documenting Common Lisp code is that docstrings should be in Org-mode markup. Org-mode is the highly readable, highly developed outline mode of Emacs. I've written a tool for extracting these docstrings into individual Org-mode files. With this tool it's easy to write manuals and other documentation in Org-mode, and have them more easily pick up changes to APIs: it's easier to remember to update docstrings than to remember to update separate documentation.
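For instance (a made-up function, just to show Org markup living in a docstring):

    (defun scale-point (point factor)
      "Scale the coordinates of =point= by =factor=.
    - =point= :: a cons cell ~(x . y)~ of numbers.
    - =factor= :: a number.
    Returns a fresh cons; =point= is not modified."
      (cons (* factor (car point))
            (* factor (cdr point))))

The extraction tool then writes each such docstring out as its own Org-mode file, ready to be included in a larger manual.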
On the subject of Lisp code documentation, I once also wrote a system, defdoc, that provides support for more structured Lisp documentation - essentially, a replacement for docstrings which generates output in various formats. Currently LaTeX and text output are supported (the latter of which automatically generates the traditional docstring). I don't love defdoc - it's just too much, and I don't find it easy or satisfying to use; your mileage may vary. Although defdoc is no longer shipped with NST (for which I wrote it), you can still find it in the old version tags of the NST repos. One of these days I might put it somewhere more convenient, but it's not as if anyone's asked, or should really. It's not very well documented, but there are slides from an ILC 2010 lightning talk about it.
- For my thesis I wrote a LaTeX package which accumulates the page numbers on which each bibliographic entry is cited, and provides an interface command which allows the page list for each label to be included in the document. N.B.: there is no interface yet for BibTeX; the code is a TeX/LaTeX package only.
- Some time ago I wrote a number of Elisp packages. Warning: this is
probably all very bit-rotted.
- Package extraregional.el separates the notions of point, mark, and region from the notion of selection. The idea is that, usually in a different window, you should be able to select text without affecting the selected window, the current buffer, or the values of point in the various windows and buffers. Version 0.2 is a rather smaller package than version 0.1.1: I found that the extras were not really all that useful, and were actually pretty fragile. The older version is still available, although it does not work under XEmacs version 19.3.
- Package rmail-labelsorter.el makes labels a little bit easier to handle in Rmail. The code is usable as it is, but is not documented - this situation probably won't change, as I no longer use Rmail. Caveat emptor.
Resources
Things I've found useful, or just interesting, or just that I want to be able to find again.
- Books on writing
- Steven Pinker, The sense of style: The thinking person's guide to writing in the 21st century. A lovely modern style guide with great examples (recommended by Phil Wadler).
- Florence Isaacs, Business notes: Writing personal notes that build professional relationships. I haven't read this one yet but it looks good (recommended by Matt Might).
- Linux laptops. The last several laptops I've gotten have been from System76. On balance I've been happier with them than with any other purveyor of machines I've dealt with in the past.
- Articles on academia, mostly other academics' blog posts, and mostly applicable to non-academics as well.
- "A good little girl" on Xykademiqz.
- "Don't be that asshole," about interviewing, at The Professor Is In.