In this paper, my co-authors and I explored the often-overlooked intersection of privacy and accessibility, focusing on the privacy policies of assistive technologies (ATs). These tools, such as speech-to-text software or navigation aids for people with visual disabilities, often require personal data to function, ranging from an individual’s contact information to their health history. However, these ATs do not always communicate their privacy practices transparently, posing privacy risks for the people who rely on them.
Our research revealed several troubling patterns across the privacy policies of 18 ATs, ranging from apps to wearable devices:
Lack of Disability-Specific Protections
While many privacy policies include special provisions for children, none offered protections tailored for individuals with disabilities, despite the sensitive nature of the data collected, such as health or geolocation information.
Legal Protections Favor the Company
Privacy policies prioritize legal compliance, often protecting the companies rather than their users. For example, some companies reserve the right to retain user data indefinitely for legal defense purposes, sidelining privacy concerns.
Inconsistent and Opaque Data Practices
Policies were inconsistent in how they described data storage and security, with many being vague about how long or where data would be stored. This lack of transparency makes it hard for users to make informed decisions about their privacy.
Unclear Distinction Between Essential and Non-Essential Data
Several policies failed to clarify which types of data were essential for the AT’s functionality and which were collected for secondary purposes. For example, it was unclear why some tools needed sensitive personal data, such as sexual orientation or citizenship status.
Vague Rules Around Third-Party Data Sharing
Many ATs shift the responsibility to users when it comes to third-party data sharing, leaving them to decipher complex privacy implications on their own.
We argue that AT privacy policies must do more than meet basic legal requirements; they must address the specific vulnerabilities of people with disabilities. Users shouldn’t have to choose between functionality and privacy. Trust in ATs can only grow if privacy policies are transparent, user-centric, and reflective of the realities of those who rely on these technologies.
We encourage AT designers and companies to rethink how they write privacy policies, considering a participatory approach that includes input from users with disabilities. Clear distinctions between essential and non-essential data, as well as transparent handling of third-party sharing, are also crucial steps toward more trustworthy ATs.
Ultimately, we posit that privacy policies should give AT users clear, accessible information about how their data will be used, helping them make informed choices where such choices exist. We hope this work inspires further focus on privacy as a key aspect of accessible design.