11 Comments
Jay Berman:

Thiughtfull. Thank you.

Jay Berman:

I think that was Dutch for thoughtful:)

Chris J. Karr:

We haven't even been able to deploy a ubiquitous public-key cryptosystem for authenticating e-mail senders and messages, so I'm skeptical that one for images is doable outside some very specific narrow niches. And even if you are successful doing so, you still have the "Analog Hole" where one can get around the protections by taking a picture of the image as it's displayed on the screen.
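As a rough illustration of what per-image public-key authentication involves, here is a minimal Python sketch using the cryptography package's Ed25519 primitives. The key handling and the file name are placeholders (a real deployment would keep the private key in the camera's secure hardware), and nothing in it addresses the analog hole described above.

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()   # would really live in the camera's secure element
public_key = private_key.public_key()        # published so anyone can verify

with open("photo.jpg", "rb") as f:           # placeholder file name
    image_bytes = f.read()

signature = private_key.sign(image_bytes)    # distributed alongside the image

try:
    public_key.verify(signature, image_bytes)   # raises if even one byte has changed
    print("signature checks out")
except InvalidSignature:
    print("image altered or signature forged")
```

The cryptography is the easy part; the hard part is key distribution and getting every camera, editor, and publisher to participate.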

Related to this issue is all of the AI being built into the camera stack itself. The photos people are taking are no longer the values that the camera sensor pixels registered, but AI-compiled edits of a variety of individual discrete images, combined with algorithms imputing and replacing image pixels in the name of "better looking" photos. The Verge has a great article on this in the context of Google's new Pixel 8 Pro:

https://www.theverge.com/2023/10/7/23906753/google-pixel-8-pro-photo-editing-tools-ai

Chris J. Karr:

And in the interest of offering a solution, I'm a VERY big fan of not allowing AI-generated content to be copyright protected. This isn't going to stop political ratf*ckers from ginning up fake content, but it will decrease the incentives to use these tools elsewhere in the creative economy, which will force the bad actors to create their own tools instead of piggybacking off a larger technological and economic development.

The good news is that we're already headed down this path:

https://www.hollywoodreporter.com/business/business-news/ai-works-not-copyrightable-studios-1235570316/

Steve Berman:

I don't think copyright protection for AI-generated content is a good idea at all. However, we are going to get it through the back door: AI-generated content published under a human name.

Chris J. Karr:

You're 100% correct here - this is going to be next to impossible to enforce and police effectively unless the Copyright Office starts demanding evidence proving human authorship. And even if they DID start demanding that, it's extremely questionable whether that additional burden would be compatible with the Constitution's copyright clause.

Curtis Stinespring:

Not quite on topic, but even the phone scams can't be tracked down and punished. I think the day a person turns 65 years old he becomes a target and remains so. Another targeting trigger is having a mortgage. AI will further enable predators. The digital society is already unsafe for trusting souls and will get worse when one can't believe his own eyes and ears.

Jay Berman:

Thank you.

Steve Berman:

All valid points. However: in DNS and email security, DKIM and DMARC have had a positive impact on preventing scams, though scams continue to be a giant problem (especially AI whaling efforts). DKIM and DMARC are not ubiquitous because many businesses don't know how to configure them, plus too many workers rely on their personal G-Suite accounts versus corporate-run emails. If I were king, I'd end BYOD, permanently.
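To make that concrete: deploying DKIM and DMARC ultimately comes down to publishing TXT records in DNS, which is exactly the configuration step that gets skipped. A minimal Python sketch, assuming the dnspython package and using example.com as a placeholder domain, checks whether a domain publishes a DMARC policy at all:

```python
import dns.resolver   # dnspython

def dmarc_record(domain: str):
    """Return the domain's published DMARC record, or None if there isn't one."""
    try:
        answers = dns.resolver.resolve(f"_dmarc.{domain}", "TXT")
    except (dns.resolver.NXDOMAIN, dns.resolver.NoAnswer):
        return None
    for rdata in answers:
        text = b"".join(rdata.strings).decode()
        if text.startswith("v=DMARC1"):
            return text   # e.g. "v=DMARC1; p=reject; rua=mailto:dmarc@example.com"
    return None

print(dmarc_record("example.com"))   # placeholder domain
```

A missing record, or one with p=none, gives receiving servers no mandate to reject spoofed mail from that domain.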

The Google Pixel 8 Pro issue is not a problem, since pixel replacement and artifacts happen at the time the phone stores the image, and the authenticity of the image as stored can be built into the metadata.
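A minimal sketch of that idea, assuming PNG text chunks via Pillow as the metadata carrier and a freshly generated Ed25519 key standing in for a real device key (the field names are made up for illustration):

```python
from PIL import Image, PngImagePlugin
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

device_key = Ed25519PrivateKey.generate()          # stands in for a key in secure hardware
img = Image.open("photo.png")                      # hypothetical just-captured image
signature = device_key.sign(img.tobytes())         # signs the pixel values as stored

info = PngImagePlugin.PngInfo()                    # PNG text chunks hold the attestation
info.add_text("authenticity-signature", signature.hex())   # illustrative field names
info.add_text("authenticity-key-id", "device-key-001")
img.save("photo-signed.png", pnginfo=info)
```

Because the signature covers the pixels as stored, whatever in-camera AI editing happened before the save is simply part of what gets attested.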

My point is that something not being totally effective is no reason not to do it at all, if it's somewhat effective and can be improved over time.

Chris J. Karr:

I'm a fan of NOT letting the perfect be the enemy of the good, but this is precisely the problem:

"DKIM and DMARC are not ubiquitous because many businesses don't know how to configure them, plus too many workers rely on their personal G-Suite accounts versus corporate-run emails."

Now take e-mail administrators and replace them with photographers, and you see the problem. And even assuming you can get to the point where it does become doable, you still have to get the whole scheme through Congress and turn it into a law that DOESN'T end up with a few gatekeepers (like the DVD CSS key administrators ended up being) pushing too many people who take pictures outside the bounds of "officially verified" images.

I think that as a bottom-up attestation infrastructure, your plan could work and help quite a bit. Hell, I'll even build the browser extension that puts a nice check badge indicating that a photo in an IMG tag hasn't been tampered with. (I wrote a tool two decades ago that played in a similar space when it came to machine-readable content licenses[1].)

I think the pieces are all there to do something, but I shake in my boots imagining getting Congress involved and imposing something, as opposed to coming up with a solid bottom-up solution, like the DNS standards mail providers came up with to validate outbound SMTP servers.

[1] https://creativecommons.org/2005/10/01/oyez/

Steve Berman:

Agreed. Congress should not come up with the solution, nor should it empower regulators to do so (or it will end up subsumed by the NSA like everything else). However, providing cover of law for the cryptographic (private) systems is what Congress should do, which is why I brought up the DMCA. The crime should be circumventing the cryptosystem; the law should not mandate its use.
