Industry standard

This is the world's first industry standard designed to guarantee the authenticity of a photo

Early last year, a coalition of technology and media companies including Adobe, Arm, BBC, Intel, Microsoft and Truepic was formed to combat the spread of misinformation and build trust in online content verification.

This Joint Development Foundation project, known as the Coalition for Content Provenance and Authenticity (C2PA), has successfully developed an end-to-end open standard for tracing the origin and evolution of digital content.


The C2PA has just published the very first technical specifications designed to certify the source and history of digital media. These specifications are the first of their kind and will allow content creators and publishers to develop tamper-evident media, enabling them to selectively disclose information about how their digital content was originally created and subsequently modified.

C2PA’s goal since its inception has been to combat the prevalence of misinformation and content fraud online by developing technical standards that certify the source and history, or provenance, of media content.

(Image credit: C2PA)

Some may wonder whether this coalition is really necessary. With photo editors such as Photoshop and Lightroom so popular among photographers, why would Adobe, in a sense, backtrack and now take a stand against the spread of manipulated images?

In October 2021, Adobe introduced a new feature in Photoshop called Content Credentials, allowing creators to attach attribution data to images before sharing them online.

This news follows the establishment of the Content Authenticity Initiative (CAI), founded by Adobe, Twitter and the New York Times in 2019 as a way to combat image misinformation by increasing trust in, and the transparency of, content shared online. The CAI has since expanded and now also includes support from Nikon.

This kind of misinformation goes way beyond tweaking a few tones using the histogram or adjusting exposure to perfect your photos. Heavily manipulated images and filters can be dangerous when used in the context of news, and especially on social media in relation to influencer culture and the promotion of things that may not be entirely truthful.

(Image credit: C2PA)

Other harmful uses of manipulated or misattributed imagery include catfishing (using fake or stolen photos to woo potential partners), fraud, deepfakes, fake profiles, misleading evidence submitted to news outlets… you get the idea.

The first paragraph of the Technical Specification's overview section states: “We are witnessing extraordinary challenges to trust in media. As social platforms amplify the reach and influence of certain content via increasingly complex and opaque algorithms, misattributed and misleadingly contextualized content is spreading rapidly. Whether it’s inadvertent misinformation or deliberate deception via disinformation, inauthentic content is on the rise.”

The specification describes the technical aspects of the C2PA architecture: a model for storing and accessing cryptographically verifiable information whose trustworthiness can be assessed against a defined trust model, as illustrated by a diagram. The trust model distinguishes three entities, shown in different colors; a consumer is expected to use the signer’s identity, along with other trust signals, to decide whether the claims made about an asset are true.
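To make the idea of a cryptographically verifiable claim more concrete, here is a minimal, purely illustrative Python sketch. It is not the C2PA format: the real standard binds claims to assets using X.509 public-key signatures and a structured manifest, whereas this sketch uses a stdlib HMAC as a stand-in signature, and the key name and action labels are hypothetical. It only shows the general flow: a signer hashes the asset, records an edit history, and signs the claim; a consumer later verifies both the signature and the hash binding.

```python
import hashlib
import hmac
import json

# Hypothetical stand-in for a signer's private key (real C2PA uses
# X.509 public-key certificates, not shared-secret HMACs).
SIGNER_KEY = b"hypothetical-signer-key"

def make_claim(asset_bytes: bytes, actions: list) -> dict:
    """Build a provenance claim: a hash binding plus an edit history."""
    return {
        "asset_sha256": hashlib.sha256(asset_bytes).hexdigest(),
        "actions": actions,  # e.g. ["created", "color_adjusted"]
    }

def sign_claim(claim: dict, key: bytes) -> str:
    """'Sign' the serialized claim (HMAC stands in for a signature)."""
    payload = json.dumps(claim, sort_keys=True).encode()
    return hmac.new(key, payload, hashlib.sha256).hexdigest()

def verify(asset_bytes: bytes, claim: dict, signature: str, key: bytes) -> bool:
    """A consumer checks the signature AND that the asset still
    matches the hash recorded in the claim."""
    payload = json.dumps(claim, sort_keys=True).encode()
    sig_ok = hmac.compare_digest(
        hmac.new(key, payload, hashlib.sha256).hexdigest(), signature
    )
    hash_ok = claim["asset_sha256"] == hashlib.sha256(asset_bytes).hexdigest()
    return sig_ok and hash_ok

photo = b"...raw image bytes..."
claim = make_claim(photo, ["created"])
sig = sign_claim(claim, SIGNER_KEY)

print(verify(photo, claim, sig, SIGNER_KEY))                # True
print(verify(photo + b"tampered", claim, sig, SIGNER_KEY))  # False
```

Tampering with either the asset or the claim breaks verification, which is the property the C2PA's tamper-evident design is built around.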

The schematics and specs get more complicated as the content unfolds, but you can read the full specification on the official C2PA website.

Read more:

Adobe Photoshop CC Review
Adobe Lightroom CC Review
Adobe Lightroom Classic Review