Privacy in Marketplace

Virginia Legislature, Ignoring Consumer Groups, Steamrolls Bad Privacy Bill toward Enactment

“Pay for Privacy” and other Serious Concerns Ignored

The Virginia legislature is rushing to pass a bill that provides “an illusion of privacy” while actually providing consumers with little meaningful control of their personal data, ignoring pleas from Consumer Federation of America (CFA) and other groups to make significant changes. “The Consumer Data Protection Act should be called the business data protection act because it cements in place the current system of corporate surveillance,” said Susan Grant, CFA’s Director of Consumer Protection and Privacy. “Even worse, it allows companies to discriminate against consumers who exercise the limited rights they would have, throws roadblocks in the way of the state attorney general to enforce the law, and prevents consumers from taking enforcement action on their own.”

In a hearing yesterday of the Virginia House of Delegates Committee on Communications, Technology & Innovation, consumer groups that had signed up to testify were not called upon, nor was there any acknowledgement of the letters and comments that CFA and others submitted expressing serious concerns about the legislation. In her online comments, Ms. Grant pointed out that the opt-out framework in the bill places the burden on consumers to navigate today’s incredibly complex data ecosystem. Consumers must take steps to opt out of unwanted uses of their information (to the limited extent they are allowed to do so), rather than being asked for consent to use their data for purposes beyond making a transaction or fulfilling their other requests. “Making ‘opt-out’ the default disempowers consumers and poses equity concerns; consumers with less time and resources to figure out how their data is being used and how to opt out will inevitably be subject to more privacy violations,” Ms. Grant said.

Among other problems with the bill that Ms. Grant highlighted, it:

  • Gives consumers no rights concerning the personal data that may be gleaned from social media and other “channels of mass media” if they didn’t adequately restrict access to that information.
  • Gives consumers no control over businesses selling their personal information to affiliated companies.
  • Requires opt-in for processing consumers’ “sensitive data” but not for uses of their personal information that may be sensitive.
  • Allows consumers to opt out of seeing targeted advertising based on tracking their activities over time across multiple websites and apps and profiling them, but that opt-out does not stop the tracking and profiling from occurring.
  • Does not apply to advertising based on tracking consumers’ activities over time on a company’s own website or app and profiling them – the business model of Google and Facebook, which profit from profiling and targeting consumers on behalf of other businesses.
  • Only gives consumers the right to opt out of profiling when it is used “in furtherance to decisions that produce legal effects concerning a consumer or similarly significant effects concerning a consumer.” There is no overall right to stop being tracked and profiled.
  • Does not apply to consumers’ personal information when it is in the hands of financial services companies or other businesses that are covered by other laws, even if the privacy protections of those laws are much weaker.
  • Allows parents and legal guardians to exercise consumers’ rights but does not enable consumers to designate others to act on their behalf, as an aging parent who doesn’t understand technology might want to do.
  • Allows businesses to charge consumers more or provide them with lower-quality products or services if they exercise the limited rights they have to opt out of targeted advertising, the sale of their personal data, or profiling. In other words, if consumers want privacy, they have to pay more – a blatantly discriminatory policy.
  • Lets companies that hold and process consumers’ personal data avoid any responsibility when third parties to which they disclose the data violate the law unless they knew those parties intended to violate the law. (So Facebook would have no liability for what Cambridge Analytica did with users’ personal information.)
  • Prevents consumers from taking legal action to enforce their rights.
  • Creates a “right to cure” that hampers the ability of the attorney general to take action to stop bad practices and obtain remedies for consumers.

The bill is nearly identical to the Washington Privacy Act, which CFA also opposes. “This legislation is being promoted by Big Tech companies and industry groups around the country as a model for other states to follow,” Ms. Grant noted. “It’s crucial for legislators to take the time to listen to consumer and privacy advocates in order to craft real privacy protections for their constituents.”

Contact: Susan Grant, 202-939-1003