
Privacy Matters Should Be at the Core of Business Strategy

Privacy has frequently been in the news over the last few years, usually not for good reasons. Two pieces of news, among many, stand out as responses to companies’ excessive erosion of user privacy.

First, Sridhar Ramaswamy, the Google executive responsible for growing its $115 billion advertising business, left after 15 years to form a different kind of search company, Neeva, which aims to grow by charging users subscriptions rather than selling advertising. Ramaswamy said in 2018 that the ad-supported model had “limitations” and that “the relentless pressure to maintain Google’s growth had come at a heavy cost to the company’s users.” Online tracking to keep tabs on the ads being watched fundamentally eroded user privacy, and Google and other large entities like Facebook know it. Still, the allure of revenue and growth kept the ads, and the user tracking behind them, in place. This conflict of interest in media, serving the interests of advertisers (the “payers”) versus the interests of readers and viewers (the “eyeballs”), is not new; in the print and broadcast business, the ads are placed first and whatever space remains is called the “news hole.”

“It could be argued from the consumer point of view that the better the search engine is, the fewer advertisements will be needed for the consumer to find what they want,” Google founders Larry Page and Sergey Brin wrote, quite ironically and presciently, in their 1998 paper. Since then, Google has drifted from search results with minimal ads to today’s “ad creep,” where ever more ads and ever tighter targeting of users translate into seemingly endless ad dollars. As Ramaswamy says, “It’s a slow drift away from what is the best answer for the user and how do we surface it. As a consumer product, the more pressure there is to show ads, the less useful in the long term the product becomes.”

The excesses of data-driven assaults on our privacy are finally creeping into the public consciousness, and even Google has tried to limit user tracking. However, because Google accounts for roughly 90% of all searches, serious limits on tracking would harm its advertising revenue, so it decided to slow the complete ban on tracking it had touted for many months.

Second, on July 28, 2021, the House Energy and Commerce Committee held a hearing titled “Transforming the FTC: Legislation to Modernize Consumer Protection.”

During the hearing, Members such as Full Committee Chair Pallone (D-NJ) and Ranking Member McMorris Rodgers (R-WA) emphasized the need for a federal privacy framework, with Republicans cautioning against a patchwork of state laws. Further, Members outlined concerns related to children’s privacy, with Rep. Castor (D-FL) calling on the FTC to prioritize enforcement of the Children’s Online Privacy Protection Act (COPPA) Rule. Commissioner Slaughter also emphasized the need to begin the rulemaking process on data abuses, and she reiterated the need to move past “notice and consent” models, outlining the importance of data minimization, coupled with further use, purpose, sharing, and security requirements.

As companies increase their wanton abuse of user data and breach our privacy, a tipping point will be reached. Long before that point, it behooves companies to behave responsibly, not just in corporate spin but in actual action. And meaningful action requires expertise, cash, and time. A partial list of what companies can do to respect the privacy of their patrons might look something like this:

1.    Data minimization: only collect what you actually need. For instance, it would be an overreach to ask a patron whether they are left-handed or right-handed when deciding whether to approve a loan. Why would it matter?

2.    Data protection: use expertise to keep customer data protected with encryption at rest (when the data is stored) and in flight (when data is moving between systems), and use the right strength of encryption for the asset being protected. Too many systems can be cracked easily because of weak ciphers, short or simple passwords, or the lack of two-factor authentication (2FA).

3.    Treat it like your mother’s vitals: would you pass your mother’s vital signs (or her personally identifiable information, PII) to a third party that is not meant to see them, or share them without her consent? At large scale in particular, data scientists and companies treat user data as “cohorts,” “patterns,” and “trends,” not as real human beings who could be harmed by a wanton transfer to an unintended party. With consent, you can do what the user permits; without consent, you cannot do anything. (A short code sketch after this list illustrates all three practices.)
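
To make the list above concrete, here is a minimal sketch in Python of how the three practices can fit together. The class and field names (LoanApplication, ConsentRegistry, ApplicationStore) are hypothetical and illustrative only; the encryption at rest uses the Fernet symmetric primitive from the widely used cryptography package, and a production system would add real key management, audit logging, and TLS for data in flight.

```python
# Hypothetical sketch only: names are illustrative, not from any real product.
# Requires the third-party "cryptography" package (pip install cryptography).

from dataclasses import dataclass

from cryptography.fernet import Fernet


@dataclass
class LoanApplication:
    # Data minimization: collect only fields needed for the lending decision.
    # No handedness, browsing history, or other unrelated attributes.
    applicant_id: str
    annual_income: int
    requested_amount: int


class ConsentRegistry:
    """Records which third parties each applicant has agreed to share data with."""

    def __init__(self) -> None:
        self._consents: dict[str, set[str]] = {}

    def grant(self, applicant_id: str, third_party: str) -> None:
        self._consents.setdefault(applicant_id, set()).add(third_party)

    def allows(self, applicant_id: str, third_party: str) -> bool:
        return third_party in self._consents.get(applicant_id, set())


class ApplicationStore:
    """Keeps application data encrypted at rest with a symmetric (Fernet) key."""

    def __init__(self, key: bytes) -> None:
        self._fernet = Fernet(key)
        self._records: dict[str, bytes] = {}

    def save(self, app: LoanApplication) -> None:
        plaintext = f"{app.annual_income},{app.requested_amount}".encode()
        self._records[app.applicant_id] = self._fernet.encrypt(plaintext)

    def share_with(self, applicant_id: str, third_party: str,
                   consents: ConsentRegistry) -> str:
        # Consent gate: without consent, you cannot do anything with the data.
        if not consents.allows(applicant_id, third_party):
            raise PermissionError(
                f"{applicant_id} has not consented to sharing with {third_party}")
        return self._fernet.decrypt(self._records[applicant_id]).decode()


if __name__ == "__main__":
    store = ApplicationStore(Fernet.generate_key())
    consents = ConsentRegistry()
    store.save(LoanApplication("user-42", annual_income=80_000, requested_amount=25_000))
    consents.grant("user-42", "credit-bureau")
    print(store.share_with("user-42", "credit-bureau", consents))  # "80000,25000"
    # store.share_with("user-42", "data-broker", consents) would raise PermissionError.
```

The point of the sketch is structural rather than cryptographic: the plaintext never sits on disk, and the sharing path cannot be reached without an explicit consent check.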

A future is brewing in which companies that take privacy seriously will reap the benefits of user patronage, and those that flout privacy will be increasingly ostracized. Therefore, just as KPIs such as revenue, margins, and growth are valued, companies that respect users’ and customers’ privacy will earn their users’ trust and, ultimately, have less explaining to do.

About Wicket

Wicket Software is a privacy-first facial authentication platform provider with patented computer vision AI technology that enables sensational event experiences for fans, guests, and employees with frictionless touchpoints that delight users and strengthen security.

Wicket has been in use since 2020 and deployed for facial ticketing, credentialing, access control, and payments in numerous sports stadiums, at major conferences, and in corporate office environments.
