Comment by samstave
1 year ago
I like your thing.
I have been forcing my bot to give me Mermaid diagrams, swim diagrams, markup tables of schemas, code, logic etc...
I like where you guys are going, but what I think would be really fun - would be a Node Based diagram logic, where the boxes that you show in the diagram are Code-geometry-Nodes - and could be connected with code blocks as such.
Watch @HarryBlends' videos on Geometry Nodes in Blender for inspiration:
Thanks for the feedback and resources!
In designing CodeViz we were inspired by the Maya hypershade, which closely resembles the diagram-based blender tool that you shared.
https://help.autodesk.com/view/MAYAUL/2024/ENU/?guid=GUID-22...
These examples show how taking a diagram-based approach to software development can abstract away complexity with minimal loss of control over the end result. I love your image of "atomic code legos," and these legos can still be edited at the level of code when needed.
And yes, if CodeViz can generate architecture diagrams from code, the inverse can and will be possible: generating code from architecture diagrams.
Exactly!
I've been wanting a GPT directly inside Blender to talk Geometry Nodes - because I want to tie geometry nodes to external data via Python running inside Blender, drawing object geometry that suitably diagrams out the nodes of the game I am slowly piecing together: 'The Oligarchs', an updated Illuminati-style game that uses AI to create nodes directly from the oligarchs' real-world files, such as their SEC filings, the Panama Papers, and everything the tools on HN are suited to creating. I went to school for Softimage & Alias|WAVEFRONT (which became MAYA) animation in 1995 :-)
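For the "external data driving geometry inside Blender" part, here is a minimal sketch using Blender's Python API (bpy): one object plus a Geometry Nodes modifier per record. The loader, the JSON path, and the field names are hypothetical placeholders, and the node tree is left empty since socket setup varies by Blender version:

```python
# Minimal sketch: create per-record objects with Geometry Nodes modifiers
# from external data. Run inside Blender's scripting tab.
import json
import bpy

def load_oligarch_records(path):
    # Hypothetical loader: a JSON list of dicts, e.g. parsed SEC filings.
    with open(path) as f:
        return json.load(f)

def build_node_object(name, location):
    # One mesh object per record; Geometry Nodes can instance/style it later.
    mesh = bpy.data.meshes.new(name)
    obj = bpy.data.objects.new(name, mesh)
    obj.location = location
    bpy.context.collection.objects.link(obj)

    # Attach a Geometry Nodes modifier backed by a fresh (empty) node tree.
    # Input/output sockets are omitted here; adding them depends on the
    # Blender version (tree.interface in 4.x, tree.inputs/outputs earlier).
    mod = obj.modifiers.new("GeometryNodes", "NODES")
    mod.node_group = bpy.data.node_groups.new(f"{name}_tree", "GeometryNodeTree")
    return obj

records = load_oligarch_records("/tmp/oligarchs.json")  # hypothetical path
for i, rec in enumerate(records):
    build_node_object(rec["name"], (i * 3.0, 0.0, 0.0))
```

From there, a GPT could emit the per-record JSON (or even the node-wiring code itself) while the script above stays a dumb, deterministic bridge into the scene.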
So I like your DNA.
I want to unpack the relationships of the Oligarchs, programmatically, with hexagonal nodes, similar to this[0] - but driven by a node-based hierarchy of Python blocks and GraphQL. And I am slowly learning how to get GPT bots to spit out the appropriate elements to get me there.
[0] - https://www.youtube.com/watch?v=vSr6yUBs8tY
(I've posted a bunch of disjointed information about this on HN - more specifically about how to compartmentalize GPT responses and code, how to drive them to write code using a style guide, and how to gather data using structured rules for how the outputs need to be presented.)
EDIT:
I wanted to share with you: building an app with Claude, where I tell it to "give me a ps1 that sets up a FastAPI directory structure, creates the venv, touches the correct files, gives me a README, and follows the best practices for FastAPI from [this github repo from netflix]" (a rough sketch of what that scaffold does follows below the screenshots):
https://i.imgur.com/7YOjJf8.png
https://i.imgur.com/KecrvfZ.png
https://i.imgur.com/tKYsmb9.png
https://i.imgur.com/nCGOfSU.png
https://i.imgur.com/ayDrXZA.png
Etc -- I always make it diagram. Now I can throw a bunch of blocks in a directory and tell it to grab the components from the directory and build [THIS INTENT].app for me.
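For reference, here is roughly what that scaffold boils down to, sketched in Python rather than the PowerShell .ps1 the prompt asks for; the directory and file names are illustrative, not the actual layout from the Netflix repo:

```python
# Rough Python equivalent of the "set up a FastAPI project" scaffold:
# create the directory layout, touch the starter files, build the venv.
import subprocess
import sys
from pathlib import Path

ROOT = Path("my_fastapi_app")   # illustrative project name
PACKAGE = ROOT / "app"

def scaffold():
    # Package layout: app package, routers subpackage, tests directory.
    for d in (PACKAGE, PACKAGE / "routers", ROOT / "tests"):
        d.mkdir(parents=True, exist_ok=True)

    # Touch the usual starter files.
    for f in (PACKAGE / "__init__.py", PACKAGE / "main.py",
              ROOT / "README.md", ROOT / "requirements.txt"):
        f.touch()

    # Create the virtual environment inside the project root.
    subprocess.run([sys.executable, "-m", "venv", str(ROOT / ".venv")],
                   check=True)

if __name__ == "__main__":
    scaffold()
```

The nice part of having the bot generate this is exactly what's in the screenshots: the same prompt that writes the script can also emit the diagram of the structure it just created.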
This sounds like an ambitious project with a number of interesting technical challenges. It may be some time before the tooling exists to support your use case, but then again, you can always build custom tooling of your own! It's very interesting to see how, with minimal intervention, many of these operations seem very close to being automated. Thanks for sharing!