Comment by bastawhiz
6 hours ago
Every year or so there's a new article about some new spectacular storage medium. Crystals, graphene, lasers, quartz, holograms, whatever. It never materializes.
Demonstrating that this stuff is possible isn't the hard part, it seems. Productionizing it is. You need exceedingly fast read and write speeds: who cares if it can store an exabyte if it takes all month to read it, or if you produce data faster than you can write it? It has to be durable under adverse conditions. It has to be practical to manufacture both the medium and the drives. You probably don't want to need separate devices for reading and writing. By the time most of these problems are worked out, most of these technologies aren't a whole lot better than existing tech.
Stick this on the "Wouldn't it be nice if graphene..." pile.
It took 15 if not 20 years to commercialize even such an obvious, low-tech thing as the radio telegraph, which can literally be built from common household supplies. And that happened about 60 years after Maxwell predicted electromagnetic waves theoretically.
Red LEDs were invented/discovered in the 1920s and became commercially successful as indicators in the 1960s. Optical fibers were invented around the 1920s and became a commercial success in the 1980s.
Certain things just take time. Do not dismiss a good physical effect; they are far rarer than so-called good ideas.
It doesn't take long to commercialize feasible new tech in this day and age. If someone invented an electromagnetic hovercar tomorrow, it would be available for sale next week, and regulations would follow after.
It feels a little disjointed to compare old tech. Computing tech iteration cycles and adoption rates seem more interesting than things at the dawn of communications technology.
Communication technologies have been evolving for billions of years
> who cares if it can store an exabyte if it takes all month to read it
To be fair, if I'm reading an exabyte in a month, my hardware's pushing >3 Tbps, which I'd be very happy with.
Plus, just put 32 of them in a striped RAID if you really need to read an exabyte a day
*RAED
Or maybe RAEND
But if you need 1 EB, waiting a whole month for it isn't great. You'd be better off with 720 devices of ~1.4 PB each, read in parallel, taking an hour.
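A quick back-of-envelope check of the numbers in this sub-thread (assuming decimal units, 1 EB = 10^18 bytes, and a 30-day month):

```python
# Sanity-check the exabyte-in-a-month arithmetic from the thread above.
EB = 1e18                    # bytes in one exabyte (decimal)
MONTH = 30 * 24 * 3600       # seconds in a 30-day month: 2,592,000

# One device reading 1 EB in a month implies this sustained rate:
bps = EB * 8 / MONTH
print(f"{bps / 1e12:.2f} Tbps")                 # ~3.09 Tbps

# Split the same exabyte across 720 devices (one per hour in the month)
# reading in parallel at the same per-device rate:
devices = 720
per_device_pb = EB / devices / 1e15
print(f"{per_device_pb:.2f} PB per device")     # ~1.39 PB each
print(f"{MONTH / 3600 / devices:.1f} hour(s)")  # 1.0 hour total
```

So the ">3 Tbps" figure holds, and the parallel version needs devices closer to 1.4 PB than 1 PB each to cover a full exabyte in an hour.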
Yes it causes problems in this increasingly narrow situation.
Massive storage that takes a month to fully read is acceptable in a wide variety of use cases. If it's cheaper than hard drives, it'll get a huge number of users.
In long term archival use cases this is less of an issue. Especially if it’s many exabytes we’re talking about, needing to be stored for decades.
But I 100% agree with your main point about possibility vs productionisation.
I have no idea if this is practical but I remember when flash memory was this suspicious semi-science fiction thing too. There are probably some people on this site that remember the same for DRAM. There have been loads of things in between that didn't make it. Some of them were semi-crackpot, some actually went into production like bubble memory and Optane. Few of them have met the sweet spot of the market in a way that let them move from a niche to a dominant form of memory, but still I wouldn't discount that it's possible to invent a new form of memory that will take over the world!
Basically, you just ignore the hyped-up press releases; hype like this accompanies most semi-cool/exciting papers. The scientists probably know this isn't going to be some new storage medium that becomes widespread, but it's just part of the game to sell the story like this, and the administration wants it that way.
In fairness, I assume any headline that emphasizes some excessively large storage density is, at best, describing something useful for archiving and not a replacement for an SSD. If they were targeting latency, they would lead with those numbers, not the density.
> You probably don't want to have to need a separate device to read and a device to write.
I don’t think this would bother the average enterprise in the least. We used to have entire rooms dedicated to tape libraries that housed dozens of tape drives and thousands of tapes each.
The read and write speeds are absolutely critical, but having to use multiple devices isn't anything new at all.
It doubles design, development, and manufacturing cost, potentially doubling your supply chain. It's not a problem for the consumer.
Used to? We absolutely still do. LTO is a widely used format, and as far as I'm aware, it is "picking up more steam" each year.
In terms of capacity, LTO sales are increasing. In terms of tape count and drive count, there's been a steady decline.
> Every year or so there's a new article about some new spectacular storage medium. Crystals, graphene, lasers, quartz, holograms, whatever. It never materializes.
Of course, wouldn't you expect that, for a fairly mature technology, you'd get tons of false starts from competing tech before eventually getting one breakthrough that completely changes everything? You could have written a comment perfectly analogous to your paragraph above about how AI and neural networks never really amounted to much for 50-60 years until, all of a sudden, they did (and even if you think AI is currently overhyped, it's undeniable that in the past 5 years AI has had a greater effect on society than all of its previous history put together).
I prefer to read this academic paper as "Oh, this is a really interesting approach, I wonder what its limitations are" rather than interpreting it as a "this new storage tech will change the world!!!" announcement. I feel like the first approach leads to more curiosity, while the second just leads to cynicism and jadedness.
The fact that most of the world's data is still stored on little spinny disks, considering how many times in the last 40 years we've seen this story, is criminal.
Aren't lasers driving the current 32TB+ HDD tech?
Yeah, but that wasn't a straight upgrade either. HAMR has all sorts of tradeoffs.