Ask HN: Anyone want to collaborate on a local-first AI-based research assistant

14 hours ago

Hi HN Community, I'm Venkatram, a sophomore who's on a mission to build a local alternative to proprietary third-party AI-based research assistants.

The idea is to turn documents into researchable assets that retain as much information as the original, but in a more reusable form.

Well, quite frankly, this is still very much a WORK IN PROGRESS, so I'm still figuring out how it can best be used. And I've got to be honest here: I definitely need some help to build this, so if you'd like to join in, you're welcome!

TL;DR: NotebookLM, but local, with your OWN AI model.

Github: https://github.com/venkatram-s/gigabook-lm
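To make the "documents into researchable assets" idea concrete, here's a minimal sketch of the usual local pipeline: split a document into chunks, index them, and retrieve the chunk most relevant to a question. This is pure-stdlib Python with bag-of-words similarity standing in for a real local embedding model — all function names here are hypothetical illustrations, not APIs from the gigabook-lm repo.

```python
import math
import re
from collections import Counter

def chunk(text):
    """Split a document into paragraph-sized chunks."""
    return [p.strip() for p in text.split("\n\n") if p.strip()]

def vectorize(text):
    """Bag-of-words term counts; a real system would use a local
    embedding model here instead."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    """Cosine similarity between two sparse term-count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, chunks):
    """Return the chunk most similar to the question."""
    q = vectorize(question)
    return max(chunks, key=lambda c: cosine(q, vectorize(c)))

doc = "Transformers use attention.\n\nRAM usage matters on low-end machines."
chunks = chunk(doc)
print(retrieve("why does RAM usage matter?", chunks))
```

The retrieved chunk would then be fed to a locally hosted model as context; the indexing step is what makes the document "reusable" across many questions without re-reading it each time.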

Hi, I'd be interested in collaborating. I'm currently an SDE, and I've been using AI mainly as a research tool, so this is right up my alley.

AnythingLLM?

  • By the way, I checked out your project "TermonMac"; it's interesting. Why not build one for Linux too?

  • This is similar to "AnythingLLM", yes, but I'll tell you why this could still work.

    AnythingLLM requires about 2 GB of RAM just to sit idle, and with the RAM crisis the world is in today, I want to bring better optimization.

    Also, AnythingLLM uses a "UI-first, logic-second" approach; in contrast, I'm taking a "logic-first, UI-second" approach.

    Not to mention, a system like AnythingLLM doesn't run well on lower-end computers, and I for one don't like that, because I'm building this on a low-end machine myself.