
Comment by vignesh_warar

14 days ago

Thanks for the info. We don't use any private data, only publicly available images. So it won't be a problem, in my opinion.

I will contact my lawyer and double check this.

It's not only the EU: you will also have to check each of the 50 US states, since they have a patchwork of their own laws. Illinois was one of the first; I don't know much about its law, but I recall it is extensive enough that some facial recognition companies specifically exclude the state. Texas has its own version as well, though again I don't know the details.

You might want to look up Clearview AI, who also took publicly available images, performed biometric recognition on them and ended up with a €30.5 million fine: https://blog.barracuda.com/2024/10/23/clearview-ai-fine-gdpr...

  • I just did research on this.

    Clearview vs Introthem:

    - Clearview does photo-to-photo matching. We don't do that, and I don't think I will ever build that.

    - You have to provide a name; then we build the face collection at search time for analysis, and delete it afterwards.

    - We don't retain any face collection once the search is done.

    I still don't know if I am breaking any laws, but that is how Introthem works (rough sketch below).
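
    For illustration only, here is a minimal Python sketch of that flow, assuming a name-to-public-images pipeline; fetch_public_images and encode_face are hypothetical placeholders, not Introthem's actual code.

      def fetch_public_images(name: str) -> list[str]:
          # Placeholder: stands in for whatever public-image search is actually used.
          return []

      def encode_face(image_url: str) -> object | None:
          # Placeholder: stands in for a real face-embedding step.
          return None

      def search_by_name(name: str) -> list[dict]:
          face_collection: list[dict] = []     # built fresh for this search only
          try:
              for url in fetch_public_images(name):
                  enc = encode_face(url)
                  if enc is not None:
                      face_collection.append({"source": url, "encoding": enc})
              # analysis runs here, against the temporary collection only
              return list(face_collection)
          finally:
              face_collection.clear()          # nothing retained once the search is done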

The GDPR works on the personally-identifiable vs anonymous distinction. Private vs public doesn't really factor into it, or at least only becomes relevant in the nuances.

"Personally identifiable data" is just a mouthful, which is why people like to misleadingly shorten it to "private data".