Are the authors of this paper present/the ones who posted?
This is really interesting. One thing I'm curious about -- is this technique applicable to applications which do JIT "compilation" in the business-logic/abstract sense of the term?
Real-world use case: Hasura is fundamentally a Haskell GraphQL-to-SQL JIT compiler/transpiler. It parses, lowers to an internal IR, and then translates to a specific SQL dialect.
I'm wondering if any of this would carry over to a use case like that?
Nonetheless, it will be interesting to see whether this technique gets adopted by LLVM or some of the major WASM runtimes. I won't pretend to understand the majority of the writeup, but on paper it seems like a huge speedup?
I am the author.
> which do JIT "compilation" in the business-logic/abstract sense of the term?
If I understood what you said correctly, you mean translating something to SQL?
The problem this paper is trying to solve is how to generate binary code fast. If your target (i.e., the thing you want to generate in the end) is not binary code but some high-level representation like SQL text, then I think it doesn't have much to do with the technique in our paper.
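To make "generating binary code fast" concrete, here is a toy sketch in the copy-and-patch spirit: "codegen" is just memcpy-ing a precompiled machine-code template (a stencil) into executable memory and patching the missing constant into its hole. This is a hand-rolled x86-64/Linux illustration, not the stencil library from the paper:

    // Toy sketch only (x86-64 Linux, no error handling or W^X hygiene):
    // JIT a function `int f(int x) { return x + imm; }` by copying a
    // precompiled stencil and patching the 32-bit immediate into its hole.
    #include <cstdint>
    #include <cstring>
    #include <sys/mman.h>

    // lea eax, [rdi + imm32] ; ret   -- the imm32 hole starts at byte 2
    static const uint8_t kAddImmStencil[] = {0x8d, 0x87, 0, 0, 0, 0, 0xc3};
    constexpr int kImmHole = 2;

    using AddFn = int (*)(int);

    AddFn jit_add_constant(int32_t imm) {
        void* mem = mmap(nullptr, sizeof(kAddImmStencil),
                         PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        std::memcpy(mem, kAddImmStencil, sizeof(kAddImmStencil));      // copy
        std::memcpy(static_cast<uint8_t*>(mem) + kImmHole, &imm, 4);   // patch
        return reinterpret_cast<AddFn>(mem);
    }

    // AddFn f = jit_add_constant(42);   // f(1) == 43

The real system prepares a large library of such stencils ahead of time and stitches them together, but the per-stencil work at JIT time stays about this cheap.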
Got it -- yeah, that's the key bit I was wondering about.
Still really interesting though, thanks for sharing =)
Surprised there aren't more comments/votes.
I would be interested in your thoughts on using PochiVM as a means of adding a dynamic recompiling core to emulators. The emulator would identify hot code blocks, generate C-like constructs (or ideally the AST), and use the 'Copy-and-Patch Compilation' algorithm to generate and execute the relevant machine code. The ability to call back to host methods, handle exception semantics, and remain totally oblivious to the platform architecture the machine code is produced for seems ideal for an emulator designed to run on multiple platforms.
Not sure what you mean by 'emulator', but I will assume you mean emulating the ISA of a different hardware architecture.
> The emulator would identify hot code blocks, generate C like constructs (or ideally the AST)
The idea behind copy-and-patch should be able to handle your use case of quickly translating code blocks in another ISA to native instructions.
However, I think Pochi's metaprogramming capabilities might not be too relevant here. After all, you are translating from a block of CPU instructions (in another ISA). It's probably not necessary or helpful to translate them back to C-like control flow only to compile them again.
> The ability to call back to host methods, handle exception semantics and all the while being totally oblivious to the platform
I'm not sure what you mean here. Yes, Pochi supports intuitive interoperation with the host program (calling methods, handling exceptions, etc.). This is important for the metaprogramming use case (e.g., generating a program that executes a SQL query), but I don't see what it has to do with emulating a program in another architecture.
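To make the ISA-translation point above concrete, a copy-and-patch style 'dynarec' inner loop could look roughly like the sketch below: precompiled native stencils keyed by guest opcode, copied into a code buffer and patched per guest instruction. The guest ISA and the hand-written x86-64 stencils here are made up purely for illustration; this is not PochiVM's API.

    // Illustrative only: a made-up 2-op guest ISA, translated to x86-64 by
    // concatenating precompiled stencils and patching immediates
    // (Linux, no bounds/error checks, single 4 KiB code buffer).
    #include <cstdint>
    #include <cstring>
    #include <vector>
    #include <sys/mman.h>

    struct GuestInst {                        // hypothetical guest ISA
        enum Op { AddImm, Ret } op;
        int32_t imm;                          // operand for AddImm
    };

    // Stencils for a JITed function `int64_t f(int64_t acc)` (acc lives in rdi).
    static const uint8_t kAddImm[] = {0x48, 0x81, 0xc7, 0, 0, 0, 0};  // add rdi, imm32
    static const uint8_t kRet[]    = {0x48, 0x89, 0xf8, 0xc3};        // mov rax, rdi; ret

    using JitFn = int64_t (*)(int64_t);

    JitFn translate(const std::vector<GuestInst>& block) {
        uint8_t* buf = static_cast<uint8_t*>(
            mmap(nullptr, 4096, PROT_READ | PROT_WRITE | PROT_EXEC,
                 MAP_PRIVATE | MAP_ANONYMOUS, -1, 0));
        uint8_t* p = buf;
        for (const GuestInst& g : block) {
            if (g.op == GuestInst::AddImm) {
                std::memcpy(p, kAddImm, sizeof(kAddImm));   // copy stencil
                std::memcpy(p + 3, &g.imm, 4);              // patch immediate hole
                p += sizeof(kAddImm);
            } else {
                std::memcpy(p, kRet, sizeof(kRet));
                p += sizeof(kRet);
            }
        }
        return reinterpret_cast<JitFn>(buf);
    }

    // JitFn f = translate({{GuestInst::AddImm, 5}, {GuestInst::AddImm, 7}, {GuestInst::Ret, 0}});
    // f(100) == 112

A real dynarec would also need to patch branch targets and chain blocks together, but in this scheme that is just more holes to patch.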
Isn't it the first sense of the verb 'to compile', and a return to the original way of doing things?
(transitive) To put together; to assemble; to make by gathering things from various sources. Samuel Johnson compiled one of the most influential dictionaries of the English language.
Well, in some sense, yes. Template compilers are a very old idea, and this is a template-compiler-style compiler.
The interesting part is the set of improvements to this old idea that let us both compile fast and generate good code.