Is This Thing On? Decoding Audacity’s Privacy Update

Changes in the data-collection policy for a hugely popular audio editing app are highlighting old and new tensions in digital trustworthiness, and how open-source software can offer solutions.

A Familiar Tune: The Vague Privacy Policy

Audacity has been a touchstone for open-source editing software for years. Since its first open-source release in 2000, the app has garnered over 100 million downloads, giving rise to vibrant online communities of users. To understand the implications of Audacity’s July 2 update and its widespread backlash, it’s crucial to keep in mind the app’s open-source community.

Diving into Audacity’s Privacy Updates

Let’s look at two explicit changes in Audacity’s policy and their impact on software users and broader regulations.

First, the July 2 update specifies the kinds of personal information that Audacity collects. For analytics and app improvement, the policy names six pieces of information:

  • “OS version
  • User country based on IP address
  • OS name and version
  • CPU
  • Non-fatal error codes and messages (i.e. project failed to open)
  • Crash reports in Breakpad MiniDump format”

Additionally, the update mentions that Audacity may collect any “data necessary for law enforcement, litigation and authorities’ requests.” The vague wording, combined with the app’s widespread recognition as an audio-editing tool, ignited concerns that Audacity would be sending users’ microphone recordings to law enforcement. There’s no evidence that this is happening. But, speaking as a non-lawyer here, I see no safeguard in the update to reasonably prevent such exchanges, or transfers of other personal information under the wide umbrella of necessary data.

Second, the policy prohibits users under the age of 13 from using Audacity: “If you are under 13 years old, please do not use the App.” On the one hand, this move seems reasonable, given that Audacity’s data-collection practices put it in the scope of the US children’s privacy law, COPPA; prohibiting children from using the app would theoretically achieve COPPA compliance, since no covered children would be using it. On the other hand, Audacity users have brought up potential friction between this age restriction and the terms of the app’s General Public License, which does not permit restrictions on who can use the software.

Looking To The Public Reaction

News of the policy update quickly spread through Audacity’s user communities on GitHub and elsewhere. The term “spyware” became the latest label for the app, particularly for its unspecific policy of collecting whatever data is deemed necessary from users. In response, the Head of Strategy for Muse Group, Audacity’s parent company (more on this in a minute), issued a clarification of the privacy policy on GitHub. In that statement, the Head of Strategy writes, “We do understand that unclear phrasing of the Privacy Policy and lack of context regarding introduction has led to major concerns about how we use and store the very limited data we collect. We will be publishing a revised version shortly.”

Based on Audacity’s comments, it sounds like they already know what to do, right? The obvious vagueness in their data-collection policy should have been identified and clarified in internal policy reviews, not in a public re-issuing of a privacy policy that had already undermined users’ trust in the app. Furthermore, Audacity has been dogged by controversy since its acquisition by Muse Group earlier this year. Among the flashpoints: a new Contributor Licensing Agreement binding contributors to a controversial agreement with Muse Group, and a failed attempt to introduce in-app event tracking. These recent data practices have made the road to regaining trust steep for Audacity, which now finds itself in a situation reminiscent of WhatsApp’s privacy troubles in early 2021: it is not technical accuracy or the policy authors’ intentions that shape brand trust. It is perception.

Lessons On Trust and Privacy In The Open-Source Community

The Audacity uproar reminds me of Dr. Helen Nissenbaum’s theory of privacy as contextual integrity: usable privacy controls cannot depend on users reading and consenting to convoluted descriptions of data flows, but instead draw on context-specific information norms. By Dr. Nissenbaum’s analysis, norms of appropriateness and distribution vary according to the context. For example, we expect our health data to circulate among practitioners when we visit the clinic, while we expect a wider variety of personal information to remain confidential in intimate conversations with our friends.

The upshot for Audacity is this: people see the app as a tool for making and editing sound recordings, often in personal ventures like music-making. They might not see a need for data collection to accomplish this task. However, with Audacity’s context established—one in which audio files are created and edited—it is reasonable to be concerned that a vague policy provision on third-party data-sharing might impact all user information, including audio information. Whether you adopt Dr. Nissenbaum’s contextual approach to privacy or not, businesses should understand their users’ information norms and expectations throughout the user journey. Doing so won’t be enough to achieve compliance with modern regulations, but it can be a differentiating factor in how your company demonstrates its commitment to respecting users’ data.

The open-source software community provides some of the most compelling evidence to debunk any notion that privacy and transparency are at odds with one another. When done effectively, the two go hand-in-hand.

While the coming weeks will reveal the effectiveness of the clarification, the Audacity community has capitalized on the app’s open-source nature to continue the app on their own terms. They have forked Audacity, branching off of the source code to maintain the app’s functionality without the new data-collection updates, which are slated to take effect with Audacity’s next release. It is a striking example of how open-source software empowers users through code transparency, even while the policy itself remains notably opaque. That transparency is not unique to Audacity; it is a feature of open-source software at large. As an indication of users’ drive to spin off their own audio-editing app, one of the popular forks took the name Tenacity.
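For readers unfamiliar with forking mechanics, the workflow is plain git. The sketch below uses a throwaway local repository for illustration (the paths and commit messages are invented; a real fork like Tenacity would clone Audacity’s GitHub repository and strip out the telemetry code before publishing under a new name):

```shell
set -e
# Illustrative only: stand in an "upstream" project that ships telemetry.
work=$(mktemp -d)
git init -q "$work/upstream"
cd "$work/upstream"
git config user.email dev@example.com
git config user.name dev
echo "telemetry: on" > config.txt
git add config.txt
git commit -qm "upstream release with telemetry"

# The "fork": a full copy of the history that now evolves independently,
# free to drop the data-collection changes the community objects to.
git clone -q "$work/upstream" "$work/fork"
cd "$work/fork"
git checkout -qb no-telemetry
echo "telemetry: off" > config.txt
git commit -qam "fork: disable telemetry"
git log --oneline -1
```

Because every user already has the full source and history, the fork loses nothing: the community keeps the features and sheds only the policy it rejects.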

Fides enables developers to check for privacy compliance directly in the CI pipeline, proactively addressing risk and compliance according to resource annotations and Fides policies.
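As a rough illustration of what “resource annotations” can look like, here is a hedged sketch of a system manifest in the style of the Fides taxonomy. The field names and category keys below approximate the public fideslang vocabulary but should be treated as illustrative, not verbatim syntax; an app like Audacity declaring its analytics collection might annotate it along these lines:

```yaml
# Illustrative sketch only -- keys approximate the Fides taxonomy style.
system:
  - fides_key: audio_editor_app          # hypothetical system name
    privacy_declarations:
      - name: Crash and error reporting
        data_categories:
          - user.device                  # OS name/version, CPU
        data_use: improve.system         # analytics and app improvement
        data_subjects:
          - customer
```

With declarations like these checked in alongside the code, a CI policy can flag a pull request that starts collecting a category the policy never declared, catching the kind of scope creep that triggered the Audacity backlash before it ships.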

Ready to get started?

Our team of data privacy devotees would love to show you how Ethyca helps engineers deploy CCPA, GDPR, and LGPD privacy compliance deep into business systems. Let’s chat!