Comment by utopiah

3 days ago

Indeed, the same questions came up a few days ago when somebody shared a "generated" NES emulator. We have to get these questions answered when sharing, otherwise we can't compare.

At some point the LLM ingested a few open source NES emulators and many articles on their architecture, so I question the LLM creativity involved in these types of examples. Probably also for DSPs.

  • Right, the amount of hallucinated response data I see at work using any of these leading models is pretty staggering. So anytime I see one of these “AI created a 100% faithful ___” type posts that does not have detailed testing information, I laugh. Without that, this is v0 and only about 5% of the effort.

  • > i question the llm creativity involved with these types examples.

    Indeed, but to be fair I'm not sure anybody claimed much "creativity", only that it worked... but that itself is still problematic. What does it mean to claim it even managed to implement an alternative if we don't have an easy way to verify?

I’m not claiming a 100% faithful physical recreation in the strict scientific sense.

If you look at my other comment in this thread, my project is about designing proprioceptive touch sensors (robot skin) using a soft-body simulator largely built with the help of an AI. At this stage, absolute physical accuracy isn’t really the point. By design, the system already includes a neural model in the loop (via EIT), so the notion of "accuracy" is ultimately evaluated through that learned representation rather than against raw physical equations alone.

What I need instead is a model that is faithful to my constraints: very cheap, easily accessible materials, with properties that are usually considered undesirable for sensing: instability, high hysteresis, low gauge factor. My bet is that these constraints can be compensated for by a more circular system design, where the geometry of the sensor is optimized to work with them.

Bridging the gap to reality is intentionally simple: 3D-print whatever geometry the simulator converges to, run the same strain/stress tests on the physical samples, and use that data to fine-tune the sensor model.
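To make that fine-tuning step concrete, here is a minimal sketch of one possible form it could take: fitting a per-channel linear correction (gain and offset) that maps simulator predictions onto measured strain/stress data. The function name and the toy data are purely illustrative; a real calibration would of course be richer.

```python
import numpy as np

def fit_sim_to_real_correction(sim_pred, measured):
    """Fit a linear correction mapping simulator predictions onto real
    measurements, so that measured ≈ gain * sim_pred + offset."""
    A = np.stack([sim_pred, np.ones_like(sim_pred)], axis=1)
    gain, offset = np.linalg.lstsq(A, measured, rcond=None)[0]
    return gain, offset

# Toy data standing in for one printed sample: here the simulator
# under-predicts the response by ~20% with a small constant bias.
sim = np.linspace(0.0, 1.0, 50)
real = 1.2 * sim + 0.05
gain, offset = fit_sim_to_real_correction(sim, real)
```

The same measured data could equally be used to fine-tune the full sensor model directly; the linear fit above is just the simplest instance of the loop.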

Since everything is ultimately interpreted through a neural network, some physical imprecision upstream may actually be acceptable, or even beneficial, if it makes the eventual transfer and fine-tuning on real-world data easier.

  • Well I'm glad you find new ways to progress on whatever you find interesting.

    Honestly, though, this does not help me estimate whether what you claim it to be is what it actually is. I'm not necessarily the audience for either project, but my point remains:

    - when somebody claims to recreate something, regardless of why and how, it helps to understand how close they actually got.

    It's not negative criticism, by the way. I'm not implying you did not recreate the DSP faithfully enough (or that the other person did not faithfully recreate the NES). I'm only saying that for onlookers, people like me who could potentially be interested but who do NOT have a good understanding of the process or of the original object being recreated, it is impossible to evaluate.

    • Oh, just to be clear first: I’m not the OP. Sorry for the confusion.

      I do understand your point, and I think it’s a fair one: when someone claims to "recreate" something, it really helps readers to know how close the result is to the original, especially for people who don’t already understand the domain.

      I was mostly reacting to the idea that faithfulness always has to be the primary axis of evaluation. In practice, only a subset of users actually care about 100% fidelity. For example with DSP plugins or NES emulators, many people ultimately judge them by how they sound or feel, especially when the original artifact is aesthetic in nature.

      My own case is a bit different, but related. Even though I’m working on a sensor, having a perfectly accurate physical model of the material is secondary to my actual goal. What I’m trying to produce is an end result composed of a printable geometry, a neural model to interpret it, and calibration procedures. The physics simulator is merely a tool, not a claim.

      In fact, if I want the design to transfer well from simulation to reality, it probably makes more sense to intentionally train the model across multiple variations of the physics rather than betting everything on a single "accurate" simulator. That way, when confronted with the real world, adaptation becomes easier rather than harder.
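Training across multiple physics variations is essentially domain randomization. A minimal sketch of the idea, with purely illustrative parameter ranges and stub hooks standing in for the actual simulator and model update:

```python
import random

def randomized_physics():
    """Sample one plausible material parameterization per episode.
    Ranges are illustrative, not measured values."""
    return {
        "gauge_factor": random.uniform(1.5, 4.0),  # low-GF conductive material
        "hysteresis": random.uniform(0.05, 0.30),  # fraction of full range
        "noise_std": random.uniform(0.01, 0.05),
    }

def train(episodes, simulate, update_model):
    """Train the readout model across many physics variants instead of
    trusting a single 'accurate' simulator configuration."""
    for _ in range(episodes):
        params = randomized_physics()
        batch = simulate(params)   # run the soft-body sim with these params
        update_model(batch)        # one training step on the readout model
```

A model trained this way sees the real material as just one more variant of the physics, which is exactly what makes the eventual sim-to-real transfer easier.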

      So I fully agree that clarity about "how close" matters when that’s the claim. I’m just suggesting that in some projects, closeness to the original isn’t always the most informative metric.

      One reason I find my case illuminating is that it makes the "what metric are we optimizing?" question very explicit.

      Sure, I can report proxy metrics (e.g. prediction error between simulated vs measured deformation fields, contact localization error, force/pressure estimation error, sensitivity/resolution, robustness across hysteresis/creep and repeated cycles). Those are useful for debugging.
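For instance, two of those proxy metrics could be sketched like this (assuming numpy arrays of predicted vs. measured values; names and units are illustrative):

```python
import numpy as np

def contact_localization_error(pred_xy, true_xy):
    """Mean Euclidean distance between predicted and true contact points."""
    return float(np.mean(np.linalg.norm(pred_xy - true_xy, axis=1)))

def force_rmse(pred_f, true_f):
    """Root-mean-square error of the estimated contact force."""
    return float(np.sqrt(np.mean((pred_f - true_f) ** 2)))
```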

      But the real metric is functional: can this cheap, printable sensor + model enable dexterous manipulation without vision – tasks where humans rely heavily on touch/proprioception, like closing a zipper or handling thin, finicky objects – without needing $500/sq-inch "microscope-like" tactile sensors (GelSight being the canonical example)?

      If it gets anywhere close to that capability with commodity materials, then the project is a success, even if no single simulator configuration is "the" ground truth.

      What could OP’s next move be? Designing and building their own circuit. Likewise, someone who built a NES emulator might eventually try designing their own console. It doesn’t feel that far-fetched.
