OpenAI's new image generation model is autoregressive, while DALL-E was diffusion. The yellowish tone is an artefact of their autoregressive pipeline, if I recall correctly.
Could be. My point is that if the pipeline itself didn't impart an unmistakable character to the generated images, OpenAI would feel compelled to make it do so on purpose.
Most DALL-E 3 images have an orange-blue cast, which is absolutely not an unintended artifact. You'd literally have to be blind to miss it, or at least color-blind. That wasn't true at first -- check the original paper, and try the same prompts! It was something they started doing not long after release, and it's hardly a stretch to imagine why.
They will be doing the same thing for the same reasons today, assuming it doesn't just happen as a side effect.
(Shrug) It fits the facts. Do a Google Image Search for images from DALL-E 3 and provide an alternative explanation for what you see.
It absolutely did not do that on day 1.