I wonder if you could… just not cut the wafer at all??
I suspect this would cause alignment issues, since you could literally rotate it into the wrong position when doing soldering. That said, perhaps they could get away with cutting less and using more of the wafer.
They already have a notch or flat for alignment, which is much more critical during the lithography process than during soldering.
If you want to have nice straight edges to clamp into place, then you only need to shave off four slivers. You can lose a couple percent instead of more than a third.
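Back-of-envelope, the numbers work out like this (a sketch assuming a 300 mm wafer; the 5 mm flat depth is just an illustrative guess):

```python
import math

r = 150.0  # mm, radius of a 300 mm wafer
wafer_area = math.pi * r**2

# Largest inscribed square has side r*sqrt(2) and keeps 2/pi of the wafer,
# so squaring off the whole wafer throws away more than a third.
square_area = (r * math.sqrt(2)) ** 2
print(f"inscribed square loses {1 - square_area / wafer_area:.1%}")  # ~36.3%

# Shaving four shallow flats of depth d instead: each flat removes a
# circular segment of area r^2 * (theta - sin(theta)) / 2,
# where theta = 2 * acos(1 - d/r).
d = 5.0  # mm shaved per side (illustrative)
theta = 2 * math.acos(1 - d / r)
segment_area = r**2 * (theta - math.sin(theta)) / 2
print(f"four {d:.0f} mm flats lose {4 * segment_area / wafer_area:.1%}")
```

With flats a few millimetres deep the loss stays in the low single digits of a percent, versus ~36% for the full square.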
You just need a sharpie to mark the top.
That's the idea in the article. Just one big chip. But the reason dicing is normally done is that there is a pretty high defect rate, so by cutting, even if every wafer has 1-2 defects you still get roughly (X-1.5) good devices per wafer. In the article they go into how they avoid this problem (I think it's better fault tolerance, at a cost)
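The yield argument sketches out like this, assuming defects land uniformly at random (a Poisson model) and a hypothetical X of 60 dies per wafer:

```python
import math

defects_per_wafer = 1.5  # assumed average, per the "1-2 defects" figure
dies_per_wafer = 60      # hypothetical X

# Diced: a given die is defect-free with probability exp(-defects_per_die),
# so for small per-die rates you lose about one die per defect.
defects_per_die = defects_per_wafer / dies_per_wafer
good_dies = dies_per_wafer * math.exp(-defects_per_die)
print(f"diced: ~{good_dies:.1f} good dies")  # close to X - 1.5

# One wafer-scale chip with zero fault tolerance: it only works if the
# whole wafer is perfect.
print(f"whole-wafer chip perfect: {math.exp(-defects_per_wafer):.0%}")  # ~22%
```

Which is why the wafer-scale approach only makes sense if the design can route around defective regions rather than requiring a perfect wafer.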
The article shows them using a single maximally sized square portion of a circular wafer.
I think the proposal you're responding to is "just use the whole circular wafer without cutting out a square".
Might be jumping in without reading, but the chips you cut out of the wafer have to be delivered to physically different locations.
Normally yes. But they're using a whole wafer for a single chip! So it's actually a good idea.
I guess the issue is how do you design your routing fabric to work in the edge regions.
Actually I wonder how they are exposing this wafer. Normal chips are exposed one rectangular field at a time through a mask called a reticle. The reticle carries the chip pattern (often repeated), and it is stepped and exposed repeatedly across the wafer. So either they have to make a reticle the full size of the wafer, which sounds expensive, or they somehow have to precisely align adjacent reticle exposures so that the stitched edges form valid circuits.
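For scale: a typical scanner field tops out around 26 x 33 mm, so covering a 300 mm wafer takes on the order of a hundred exposures (a rough sketch; this simple grid count ignores that corner fields fall outside the circle):

```python
import math

field_w, field_h = 26.0, 33.0  # mm, a typical maximum scanner field
wafer_d = 300.0                # mm, standard wafer diameter

# Number of exposure fields in a simple rectangular grid over the wafer.
cols = math.ceil(wafer_d / field_w)
rows = math.ceil(wafer_d / field_h)
print(f"grid of {cols} x {rows} = {cols * rows} fields to tile the wafer")
```

So a "full-wafer reticle" would need roughly a 10x larger image field than scanners support, which is why stitching aligned exposures at the field edges is the plausible route.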