DaniFong 17 years ago

Alternatively, you can learn macros, as PG might suggest. Fundamentally, compilers just translate one string of symbols into another. The ones we're familiar with just translate a high-level language into assembly. But there are others -- every object-relational mapping system has a de facto compiler.
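
The "translate one string of symbols into another" view can be sketched in a few lines (Python here, purely for illustration; the expression shape and instruction names are invented): a toy "compiler" that turns a nested arithmetic expression into instructions for a tiny stack machine, plus an evaluator to run them.

```python
# A toy "compiler": translate a nested arithmetic expression
# (code as data) into instructions for a tiny stack machine.
# The tuple shapes and instruction names are invented for this sketch.

def compile_expr(expr):
    if isinstance(expr, (int, float)):
        return [("PUSH", expr)]
    op, lhs, rhs = expr                     # e.g. ("+", 1, 2)
    code = compile_expr(lhs) + compile_expr(rhs)
    code.append(("ADD",) if op == "+" else ("MUL",))
    return code

def run(code):
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            b, a = stack.pop(), stack.pop()
            stack.append(a + b if instr[0] == "ADD" else a * b)
    return stack[0]

print(run(compile_expr(("*", ("+", 1, 2), 4))))   # 12
```

Nothing here is specific to "real" compilation -- it's the same symbol-pushing whether the target is assembly, bytecode, or SQL.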

  • aston 17 years ago

    If you take a look at that nanopass paper, it's very much in the "just use macros" school of thought. It's even in Scheme.

    For the tldr crowd, http://www.cs.indiana.edu/~dyb/pubs/nano-jfp.pdf.

    • hsmyers 17 years ago

      Not a new idea-- Griswold wrote Snobol as a set of 360 macros.

      • aston 17 years ago

        In the world of programming languages, nothing's new; all of the good stuff was thought of by 1970. That's basically the extent of what I learned in college.

        • DaniFong 17 years ago

          I don't think that's 100% true. There are some good new ideas emerging due to new machine models: Erlang for example, which is like the machine language of concurrency. And Haskell wasn't really anticipated by the 70's, nor is much of Fortress -- especially the methods they're using for parallelism.

          • cstejerean 17 years ago

            Erlang isn't exactly new. Development started over 20 years ago and it has been open source for 10 years.

            http://www.erlang.org/course/history.html

            • DaniFong 17 years ago

              Sure, but that's around 20 years after the 70's, and the usefulness of the ideas wasn't well known until some time after that.

        • david927 17 years ago

          All the paradigms were discovered by the 70's, and a language is only really the View/Controller for the paradigm. At least that's true as far as what's currently published.

        • KirinDave 17 years ago

          This is absolutely not true. A lot of great work has been done in the past decade on interpreters and speeding them up. Modern interpreters are just compilers where you use the code immediately (most are direct threaded byte-code interpreters), so all that work counts.
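
The "compiler whose output you use immediately" idea can be sketched roughly like this (Python for illustration; the node shapes are made up): each AST node is translated once into a closure, so per-node dispatch is paid at translation time rather than on every evaluation -- a crude analogue of threaded code.

```python
# Sketch of "an interpreter is a compiler whose output you run
# immediately": each AST node is translated once into a closure,
# so dispatch on node type happens once, at "compile" time, not
# on every evaluation. Node shapes are invented for this sketch.

def compile_node(node):
    if isinstance(node, (int, float)):
        return lambda env: node
    if isinstance(node, str):               # variable reference
        return lambda env: env[node]
    op, lhs, rhs = node
    f, g = compile_node(lhs), compile_node(rhs)
    if op == "+":
        return lambda env: f(env) + g(env)
    if op == "*":
        return lambda env: f(env) * g(env)
    raise ValueError(op)

square_plus_one = compile_node(("+", ("*", "x", "x"), 1))
print(square_plus_one({"x": 3}))   # 10
```

Re-running `square_plus_one` on new environments never revisits the AST, which is the whole point of doing the translation up front.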

          It is strange that modern software engineering en masse took such a huge step backwards in the C++ era of the late 80's and early 90's, with tools falling back to primitive levels and languages becoming much less forgiving. It's only now that we're finally returning to the state of the art from 20 years ago.

          Of course, some people bucked the trend and used these somewhat neglected technologies despite a lack of public popularity. Paul Graham is one of them, and he ended up doing pretty well for himself. :)

        • harshavr 17 years ago

          Some exceptions I can think of:

          CLOS and the metaobject protocol -- the idea of leaving the language open for the user to change by using a metaclass

          hygienic macros

          first-class continuations

          monads

          functional reactive programming

          The first few were developed in the 80's, while the last is quite recent.

    • DaniFong 17 years ago

      That's an excellent paper, thank you.

      One of the advantages of realizing that compilers are 'nothing special' is that you can start to use them all the time to simplify your work. It lets you think at a higher level.

      • DaniFong 17 years ago

        This should be a little lesson on the dangers of skimming over dense prose, and on the power of recommendations on YC. I completely skipped past the nanopass paper, even though it's an approach I condone, and skimming is a behavior I frown upon.

  • mojuba 17 years ago

    With that approach you won't be able to write a compiler for anything other than a (suboptimal) Lisp derivative, which itself is just a small part of the computing world. Take a look at the GNU Compiler Collection and see how far the Lisp macro paradigm is from a working optimizing compiler.

    • narag 17 years ago

      And even if writing a more complete compiler weren't difficult enough, there's much more to it. There are complex details like exception handling (stack unwinding, signals...), graphical debuggers, interfaces with GUI libraries, threads, etc.

      The fact that only commercial (and expensive) Lisp implementations have all these features is a hint that they're not trivial.

      • mojuba 17 years ago

        And in fact what we all need is a good, commercial-grade open-source Lisp compiler + tools.

        • KirinDave 17 years ago

          I'm sorry, but I strongly disagree. Lisp's problems are not technical, they're social. We have a commercial-grade open-source lisp compiler. It's SBCL. And we have really excellent models for tools. If someone would extract SLIME from emacs and into an editor with less history behind it and more popular appeal, you'd have most of the tools you need.

          Lisp is suffering because its community is fragmented and it has no leaders. Can you name a popular language that doesn't have an iconic corporation or person behind it, rallying and focusing the community? That condition is actually quite rare.

          • mojuba 17 years ago

            SBCL: is it as good as Allegro? I haven't seen an IDE as beautiful and comfortable to work with as Allegro (although I'm barely a Lisp hacker, just a fan).

            • KirinDave 17 years ago

              I don't know what metrics you'd use to decide it's "better". Certainly the native code generation on SBCL is pretty darn good.

              I think Allegro has better tools and libraries, but SBCL is certainly suitable for professional work. At this point we're talking about matters of degrees.

          • rplevy 17 years ago

            I think the reason why that is not likely to catch on is that most Lisp hackers enjoy using an editor that can itself be hacked in Lisp. Also, emacs is one of the most stable pieces of software in history. It will probably live forever and continue to evolve.

            Also, I think that Lisp isn't really suffering. As far as I can tell there has been increasing interest in Lisp. Even without that trend continuing, Lisp is such a masterpiece (and it continues to develop with new innovations), that it is almost definitely here to stay, regardless of whether it becomes a trendy language to use again. The advantages in performance (say SBCL's for example) and expressiveness over other dynamic languages like Ruby and Python afford Lisp developers an actual advantage, as opposed to a merely perceived one.

          • narag 17 years ago

            >It's SBCL. And we have really excellent models for tools. If someone would extract SLIME from emacs and into an editor with less history behind it and more popular appeal, you'd have most of the tools you need.

            SBCL's port to Windows is a work in progress. And what you're saying about tools is more or less that they don't exist... yet. And comparing what you need to know and set up to start working with open-source Lisp versus other environments is... not fair.

            • KirinDave 17 years ago

              > SBCL port to Windows is a work in progress.

              So what? Lots of software gets written without ever touching Windows. I've based my entire career on it. An environment can certainly be mature and commercial quality without being multi-platform.

              > And comparing what you need to know and set up to start working with open source Lisp and other environments is... not fair.

              I do wish that SBCL had an "everything you need to know in 10 minutes or less" guide, but it comes with ASDF and ASDF-INSTALL pre-configured. I mean, the amount of stuff you need to know to start working with Java is pretty darn daunting as well, given that you need massive assistance frameworks and there is no good central package repo. The reason more people go into Java is that they have colleges introducing them to it.

              Yes, more could be done. But I think commercial, professional work is possible right now given the state of SBCL.

              • pchristensen 17 years ago

                ITA Software is proof of that possibility. Dan Weinreb has written about that often, on comp.lang.lisp and his own blog.

              • narag 17 years ago

                >So what? Lots of software gets written without ever touching Windows.

                Except that most people don't want to write "lots of software"; they want to write software that fills certain requirements, often customers' requirements: run on Windows, do threads, have a rich GUI, minimize to the tray, detect the screensaver, interface with a word processor, the customer's database, the customer's crappy ERP, etc.

                >the amount of stuff you need to know to start working with Java is pretty darn daunting as well,

                I've done both starts (Java and Lisp) and I can't disagree more.

                >But I think commercial, professional work is possible right now given the state of SBCL.

                It depends on your requirements. For some environments that's not true at all.

                • KirinDave 17 years ago

                  > It depends on your requirements. For some environments that's not true at all.

                  Which is true of nearly every commercial software environment out there. Why does Lisp get such a brutal grading compared to something like MS's CLR or Cocoa? Those are definitely commercially viable platforms that also don't meet these requirements.

                  • narag 17 years ago

                    Huge APIs. Especially GUI. Big community. Many tools.

                    Did I say GUI? GUI. GUI. GUI. Web apps are nice, free you from slavery, all that. But for a lot of tasks there is no other practical option than desktop apps, and that's what a lot of people use, even if they don't write blogs or appear in hip news, so they're invisible.

                    • KirinDave 17 years ago

                      So there are no GUI frameworks on common-lisp.net/projects? And suddenly SWIG doesn't bind to CFFI?

                      I appreciate your concern, but you seem somewhat ignorant of the number and quality of software libraries available to most Common Lisp implementations. There are excellent GTK bindings and Objective-C bindings. I'm not sure about what's available on the Windows side (although I guess I should learn, given the events of this week).

                      And as for the community, you have me there. The Lisp community is indeed fractured and weird. But, uh, at the end of the day I think this is a wash. People do great things in unusual languages all the time. It's not like EngineYard or I have a massive Erlang community bolstering our efforts on Fuzed and Vertebra, but we're making progress and doing what I consider to be good work.

      • gaius 17 years ago

        I'm not sure that's true. Right now I am working on GNU CLISP with bindings to Oracle and a GUI (via Ltk). Sure it's not trivial, but it's not impossible either.

        • narag 17 years ago

          That could fill the requirements for "enterprise" software, the kind of work that people happily convert to web apps. Making desktop software can be much much more demanding.

    • DaniFong 17 years ago

      I don't claim that this is the only way to learn how to write compilers, it's just one surprisingly effective way.

      And I actually think skill in generalised, simple, macro-type compilers is more useful, generally, than knowing the ins and outs of optimizing compilers, but that's just me.

      For the sake of discussion, what are some of the important things for optimizing compilers that you can't do with this approach? I'm having trouble finding any -- both in a literal sense of possibility, and from a practical standpoint.

      • mojuba 17 years ago

        I may be underestimating macros but I can't imagine any optimization technique that can be done with them when generating low-level output, be it native or virtual machine code. Can you demonstrate (just theoretically, of course) copy propagation, removing loop invariants, automatic inlining of functions, to name a few?

        • gruseom 17 years ago

          It's not hard to see how macros could do this. Macros just transform code into other code, which is what optimizations do too. For example, if you have an s-expr representing a loop, imagine a function which accepts that s-expr and returns a new one with invariants moved outside the loop body.
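
That hoisting idea can be made concrete. Here is a rough Python model (s-exprs as nested lists; the `loop`/`let` shapes and `tmp` names are invented for this sketch) of a pass that moves subexpressions not mentioning the loop variable out into a binding:

```python
# Sketch of an "optimization as macro" pass: given a loop form
# written as nested lists (code as data), hoist subexpressions
# that don't mention the loop variable out into a let binding.
# The ["loop", var, count, body] and ["let", ...] shapes are
# invented for this sketch.

def mentions(expr, var):
    if expr == var:
        return True
    return isinstance(expr, list) and any(mentions(e, var) for e in expr)

def hoist_invariants(loop):
    _, var, count, body = loop          # ["loop", var, count, body]
    hoisted = []
    def walk(expr):
        if isinstance(expr, list) and expr and expr[0] != "loop":
            expr = [walk(e) for e in expr]
            if not mentions(expr, var):
                name = f"tmp{len(hoisted)}"
                hoisted.append([name, expr])
                return name             # replace invariant with a name
        return expr
    new_body = walk(body)
    return ["let", hoisted, ["loop", var, count, new_body]]

before = ["loop", "i", 10, ["+", "i", ["*", "a", "b"]]]
print(hoist_invariants(before))
# ['let', [['tmp0', ['*', 'a', 'b']]], ['loop', 'i', 10, ['+', 'i', 'tmp0']]]
```

The transformation is just list-in, list-out, which is exactly the shape of a Lisp macro.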

          • mojuba 17 years ago

            Ok, but my original point was, "you won't be able to write a compiler for anything other than a (suboptimal) Lisp derivative" (I was probably wrong about suboptimal though, because it only applies to the compiler)

            • gruseom 17 years ago

              Well, the easy way to use Lisp macros in a compiler is to start with an s-expr and successively transform it until you've got something that the target platform accepts. "Starting with an s-expr" implies that your source code is in Lisp format. So in that sense, what you're compiling is a Lisp derivative. Is that what you mean?

              No doubt there are clever ways to leverage Lisp macros in compilers beyond this approach. But most people who like Lisp macros wouldn't bother. They'd just do it the easy way.

              Edit: my understanding of what DaniFong is getting at is that in Lisp programming, there is no longer a barrier between application development and compiler development. This makes possible a lot of powerful things that you can't do when the application is written in a fixed language by different programmers than the ones who write the compiler. It's a different point, but one I find very interesting. It's not obvious what belongs to application development and what belongs to language development once this technical (and organizational) barrier is removed.

          • rahulgarg 17 years ago

            What about global optimizations? Macros effectively operate only on local expressions. Things like global alias analysis still need to be done outside them.

            • gruseom 17 years ago

              Presumably you have the entire program represented as a top-level tree, and a global optimization would be a transformation of that. I don't see why macros wouldn't be just as applicable here as in the local case where you're transforming a subtree.
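
A rough sketch of that, again in Python with invented node shapes: an analysis walk over the whole top-level tree builds metadata (which defined names are ever called), then a transform pass emits a new tree using it -- here, dead function elimination.

```python
# Sketch of a "global" pass as a whole-program tree transform:
# first walk the entire top-level tree to build metadata, then
# produce a transformed tree from it. The ("def", ...) and
# ("call", ...) node shapes are invented for this sketch.

def called_names(tree):
    used = set()
    def walk(node):
        if isinstance(node, tuple):
            if node and node[0] == "call":
                used.add(node[1])
            for child in node:
                walk(child)
    walk(tree)
    return used

def drop_unused_defs(program):
    used = called_names(program)        # global analysis over the whole tree
    return tuple(                        # global transform using its result
        form for form in program
        if not (isinstance(form, tuple) and form[0] == "def" and form[1] not in used)
    )

program = (
    ("def", "f", ("call", "g")),
    ("def", "g", 42),
    ("def", "h", 7),        # never called anywhere
    ("call", "f"),
)
print(drop_unused_defs(program))   # the ("def", "h", 7) form is gone
```

The analysis and the rewrite are separate walks, but both are still just tree transformations over the top-level form.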

              • rahulgarg 17 years ago

                I guess you are right. The macros people usually write are local, but macros are just AST transformers after all. I was conflating what happens during macro expansion (arbitrary computation, including, say, doing analysis and building metadata) with the result of the macro expansion (a transformed AST).

        • DaniFong 17 years ago

          You're right that if you keep to the standard implementation of Lisp compile-time macros and use only that, you have a hard time reducing code that's already been macroexpanded, or applying global effects.

          But when you mix macros and the language itself, what you find is that you already have the pieces of what you need for a serious compiler: a symbol table, built in; a way to manipulate the parse tree; an easy way to do local expansions; and, most importantly, a fully featured language.

          You can do this with existing Lisps by a few methods: making first-class runtime macros, for example, or by saving the source code and working over it in passes.

          Does this explain it?

hsmyers 17 years ago

So did he spell Knuth wrong as a typo or through ignorance? Or is this some subtle insult? He doesn't seem to think Perl qualifies as useful either... Oh well, one less opinion to worry about.

  • gaius 17 years ago

    It was a joke, a mistake that a newbie intimidated by huge textbooks might make.