Comment by stared

4 years ago

Too bad. In my opinion, Distill was the best thing that had happened to publishing in the last 5 years.

I do understand that there is a lot of burden on the editors. Also, from what I see, a lot of serious research now ends up on arXiv, and it doesn't matter whether it gets published in a reviewed journal. (At least in deep learning over the last 2 years, some breakthroughs exist only as a PDF on OpenAI's site or as a web page from Nvidia.)

Still - for static papers, submitting them to arXiv is acceptable. For interactive ones, Distill is (was?) the only suitable venue.

On a more personal note, I am in the process of writing a paper (interactive, on tensor diagrams) for Distill. So now it will end up as a blog post. That is OK-ish - most of my quality blog posts have had orders of magnitude higher impact than my peer-reviewed papers. Still, a persistent DOI and editorial help from Chris Olah would have been game-changers.

I can't help with regard to editorial input from Chris Olah, but in terms of a DOI you could deposit your blog post in a Zenodo repository and get the DOI from that. Alternatively, if you're building your blog directly from GitHub repos, you can link Zenodo to them and have it mint an updated DOI automatically with each release.
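
In case it helps, here is a minimal sketch of doing that manually through Zenodo's deposit REST API (Python with `requests`). The access token, file name, and metadata values are placeholders, and the exact metadata fields you'll want depend on how you export the post:

```python
# Sketch: mint a DOI for a static export of a blog post via the Zenodo deposit API.
# Assumes a Zenodo account and a personal access token with deposit permissions.
# The token, file name, and metadata below are placeholders.
import requests

ZENODO_API = "https://zenodo.org/api/deposit/depositions"
params = {"access_token": "your-zenodo-access-token"}  # placeholder token

# 1. Create an empty deposition.
r = requests.post(ZENODO_API, params=params, json={})
r.raise_for_status()
deposition = r.json()
dep_id = deposition["id"]
bucket_url = deposition["links"]["bucket"]

# 2. Upload the rendered post (any static export of the article works).
with open("tensor-diagrams.html", "rb") as fh:  # placeholder file name
    requests.put(f"{bucket_url}/tensor-diagrams.html",
                 data=fh, params=params).raise_for_status()

# 3. Attach minimal metadata describing the deposit.
metadata = {
    "metadata": {
        "title": "Tensor diagrams (interactive article)",
        "upload_type": "publication",
        "publication_type": "article",
        "description": "Interactive article on tensor diagrams.",
        "creators": [{"name": "Doe, Jane"}],  # placeholder author
    }
}
requests.put(f"{ZENODO_API}/{dep_id}", params=params, json=metadata).raise_for_status()

# 4. Publish; the response includes the persistent DOI.
r = requests.post(f"{ZENODO_API}/{dep_id}/actions/publish", params=params)
r.raise_for_status()
print("DOI:", r.json()["doi"])
```

The GitHub route avoids all of this: once the repo is linked in Zenodo's GitHub settings, each tagged release gets archived and assigned its own versioned DOI, with a concept DOI that always points at the latest version.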