Last Updated on 06/11/2024 by Nilofer Khan
The artificial intelligence industry has been booming. From surreal imagery at the tap of a button to ChatGPT penning college essays, the industry's ascent is undeniable. While behemoth brands such as Disney, Netflix, Coca-Cola, and Pepsi have been using AI to cut costs, others stand accused of pushing users to the brink with labyrinthine contracts. Yes, we are speaking about the notorious Adobe Terms and Conditions: a thorn in the side of millions of creatives, stoking rightful fury and frustration.
Lead image by MIKI Yoshihito. Used with Flickr Creative Commons permissions.
While Adobe has vehemently denied using customer content for AI training, the denial hasn't assuaged customers' concerns. The mere mention of "machine learning" has set users on edge, with many contemplating canceling their subscriptions. In this fog of uncertainty, Adobe's checkered past, including the time it likened AI to a new camera, serves as a stark reminder of the disconnect between company and clientele. In the eyes of many, Adobe's understanding of its customers falls short, if it doesn't miss the mark entirely.
During this tumultuous period, unsettling revelations have emerged regarding the safety of projects, even those in progress, within Adobe’s Cloud service. What’s particularly alarming is the inclusion of medical records, casting a shadow of concern over the sanctity of personal data and privacy.
The gravity of the situation becomes clearer when considering the extent to which Adobe Photoshop is used to handle medical imaging (CT scans, X-rays, MRI scans, ultrasounds, and more) stored under the Digital Imaging and Communications in Medicine (DICOM) standard. These files contain crucial patient information like name, date of birth, age, and sex, among other details. Moreover, this data is embedded directly alongside the images themselves, facilitating streamlined review by healthcare professionals.
![](https://www.thephoblographer.com/wp-content/uploads/2024/06/01-3-770x447.jpg)
In some cases, doctors may share these files for research purposes, typically after removing or detaching identifiable patient data to uphold doctor-patient confidentiality. Should a breach occur, however, practitioners could face claims for compensatory damages. This underscores the paramount importance of safeguarding sensitive medical data within digital platforms like Adobe's Cloud service.
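To make concrete why a DICOM file is more dangerous to leak than an ordinary photo, here is a simplified Python sketch of how patient identity is embedded in the file itself. This is an illustration, not a full DICOM implementation: it encodes a few short-form explicit-VR little-endian elements (a real file also carries a 128-byte preamble, a "DICM" marker, pixel data, and many more attribute types), then blanks every patient-identity element in group 0x0010, which is roughly what de-identification before sharing amounts to. The tag numbers come from the DICOM standard; the function names and the toy element stream are invented for this example.

```python
import struct

def encode_element(group, element, vr, value):
    """Encode one short-form explicit-VR element: tag + 2-char VR + length + value."""
    data = value.encode("ascii")
    if len(data) % 2:                      # DICOM pads values to an even length
        data += b" "
    return struct.pack("<HH2sH", group, element, vr.encode(), len(data)) + data

def decode_elements(buf):
    """Walk a buffer of short-form elements, yielding (tag, vr, value) tuples."""
    pos, out = 0, []
    while pos < len(buf):
        group, element, vr, length = struct.unpack_from("<HH2sH", buf, pos)
        pos += 8
        value = buf[pos:pos + length].decode("ascii").rstrip()
        pos += length
        out.append(((group, element), vr.decode(), value))
    return out

def anonymize(buf):
    """Blank every patient-identity element (DICOM group 0x0010) in the stream."""
    cleaned = b""
    for (group, element), vr, value in decode_elements(buf):
        if group == 0x0010:
            value = ""                     # strip identity before the file is shared
        cleaned += encode_element(group, element, vr, value)
    return cleaned

# A toy header: PatientName (0010,0010), PatientBirthDate (0010,0030), Modality (0008,0060)
stream = (encode_element(0x0010, 0x0010, "PN", "DOE^JANE")
          + encode_element(0x0010, 0x0030, "DA", "19800101")
          + encode_element(0x0008, 0x0060, "CS", "CT"))

for tag, vr, value in decode_elements(anonymize(stream)):
    print(tag, vr, repr(value))
```

The point of the sketch: the patient's name and birth date live in the same byte stream as the scan, so anyone who can read the file can read the identity, and anonymization has to happen before the file ever reaches a cloud service.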
The scenario we’ve painted is a stark reminder of the risks inherent in Adobe’s new policies, particularly concerning the handling of sensitive medical data. While specialized viewing and editing software exists for DICOM files, Adobe’s broad accessibility opens the door to potential misuse. In a world where data breaches are increasingly common and profitable, practitioners face significant jeopardy.
Let’s envision this: a dentist, dedicated to their patients, accesses confidential files on Adobe’s platform without reviewing the new terms. Unknowingly, these records could be used for AI training or shared with undisclosed third parties. The uncertainty surrounding data recipients leaves practitioners vulnerable and patient privacy in question.
![](https://www.thephoblographer.com/wp-content/uploads/2024/06/02-2-770x596.jpg)
Given Adobe’s track record, one can’t help but question its commitment to safeguarding customer rights. The spectre of personal data being monetized looms large, casting a shadow of doubt on the integrity of the company’s practices. In a landscape where trust is currency, Adobe’s actions risk bankrupting the faith of practitioners and patients alike.
Moreover, if these images are leaked without the patient's or doctor's knowledge and fall into the wrong hands, the patient could rightfully sue the doctor, even though the mishap was not the doctor's fault. This precarious situation underscores the urgency of distancing oneself from contentious software that appears indifferent to privacy concerns. There should be a golden rule: if a tech giant fails to safeguard customer information, legal recourse should be available. However, pursuing legal action often entails a protracted and emotionally taxing battle, with uncertain outcomes that may not favour the customer financially or emotionally.
As we witness yet another chapter in the saga of data misuse, one can only hope for an end to this charade, where the vulnerable aren’t left to bear the brunt of tech companies’ insatiable appetite for growth.