cubefox, 3 months ago:
This sounds a bit like H-Net [1] or Byte Latent Transformer [2].

[1]: https://arxiv.org/abs/2507.07955
[2]: https://arxiv.org/abs/2412.09871

diyer22, 3 months ago:
It does seem that way — we're both trying to overcome the limitations imposed by LLM tokenization to achieve a truly end-to-end model. And their work is far more polished; I've only put together a quick GPT+DDN proof-of-concept. Thank you for sharing.

lukan, 3 months ago:
I vouched for this comment. Your account seems to be shadow-banned, but your recent comments look fine to me, so you may want to email dang to revoke that status.

cubefox, 3 months ago:
Thanks. I sent an email.