Comment by webdevver

7 days ago

theyre going to push "AI on the edge" and "IoT" nonsense again

absolutely unbelievably cooked. anyone pushing that nonsense, short with leverage.

low latency connectivity + goliath data centres will always beat on-device inference/training.

> "AI on the edge" and "IoT" nonsense again.

I love it when my device stays dumb (or at least connects locally) and doesn't become abandonware 6 months after release because the cloud provider felt it was a chore to keep it running.

> low latency connectivity

That's not exactly easy. I doubt on-device training will become much of a thing. But on-device inference is desirable in all sorts of distributed use cases. We're still a long way off from reliable internet everywhere. Especially when you want to start pushing large quantities of sensor data down the pipe.

I can't even get reliable internet on my phone in the centre of London.

Not necessarily. There are lots of use cases for on-device AI inference. I run YOLO on an Nvidia Jetson-powered Lenovo ThinkEdge, which processes incoming video at full frame rate on four channels with recognition and classification for a bespoke premises security system. No clouds involved, other than the Nix package manager etc. Mind you, your argument may carry more weight when you're talking about ultra-low-power devices like an Arduino; running AI inference locally on those seems like more of a stretch.
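(For the curious: per frame, a setup like that reduces to a detect-then-filter loop per channel. A minimal sketch of the filtering step; the class names, threshold, and the `(class, confidence)` tuple shape are my own illustrative assumptions, not details from the system described above.)

```python
# Sketch of the alert-filtering step in a multi-channel, on-device
# detection pipeline. A YOLO-style detector's per-frame results can be
# reduced to (class_name, confidence) tuples before this step.

WATCH_CLASSES = {"person", "car"}  # classes that should raise an alert
MIN_CONFIDENCE = 0.5               # discard low-confidence detections

def filter_alerts(detections, watch_classes=WATCH_CLASSES,
                  min_conf=MIN_CONFIDENCE):
    """Keep only detections worth alerting on."""
    return [(cls, conf) for cls, conf in detections
            if cls in watch_classes and conf >= min_conf]

# Example: one frame's detections from a single channel.
frame_detections = [("person", 0.91), ("dog", 0.80), ("car", 0.42)]
print(filter_alerts(frame_detections))  # [('person', 0.91)]
```

The point being: nothing in that loop needs a round trip to a data centre.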

  • true, true, very true, but i observe you use an nvidia chip. which is perfectly logical. why would you use something that is worse in every single way, right? which is exactly what qcom offerings are...

> low latency connectivity + goliath data centres will always beat on-device inference/training.

Except that it's not always an option...

SOOOO buy Qualcomm. The second they start talking about AI-IoT, the stock is gonna skyrocket.

We live in a broken world.

Low latency, low power, portable

pick two.

well actually you can't really, low latency is pretty hard to do full stop

tf are you on. just look at meta display glasses. it's all on-board compute

  • its cool... but thats not gonna last long at all. soon theyre gonna put their own custom soc into it, just like google did.

    especially for such a specific, space/power/thermal constrained platform. itd be weird if meta didnt put their own custom soc into it.

    running a big tech company these days, theres enough custom work going around that basically all the big players have internal silicon teams. hell, even fintech shops with ~100 employees are doing tape-outs these days!

privacy is a thing people care about.

  • > privacy is a thing people care about.

    Sadly, it seems that privacy is something that HN readers care about, but precious few others.

    Look at the success of Facebook. The covers have been off that stinker for years, yet people still regularly use it; often to the exclusion of more traditional media. I have quite a few friends whose occasions I don't get invited to, because they only advertise them on FB. They invite a bunch of randos they've never met, but not those of us they see all the time.

    To be fair, if I sit down and describe exactly what the ramifications of an "always on, always open" Facebook presence mean, people will usually suddenly value privacy, but it seems that no one actually ever does that at a level most folks can understand.

    Hysterical rantings by geeks (even when well-founded) don't get through to most folks. It needs to be done in the vernacular, and via media they actually consume.

    • There's a commercial version of privacy. E.g. Company A doesn't want to send their data to Company B (a competitor) for processing.