Wearables and Personal Data: Risks, Considerations and Protections

“From the outset, particularly in the wearables field, a company must know what consumer data they use, where it comes from, where and how long it is stored and who gets to use it—or not.”

With each new year comes an uptick in purchases of workout equipment, blenders, gym memberships and wearable devices of all shapes and sizes. Plans are made and uploaded to a wearable device—including smart rings, shoes and bands—and its accompanying app to track progress. These devices and apps share information with each other and across platforms, tracking a person’s diet, sleep or even sexual activity.

Along with well-known wearables for health and fitness, there are wearable devices for web3 experiences (sensory clothing, such as haptic vests), inventory review (smart glasses) and stress management (smart bands). While consumers love these devices, they are also demanding transparency about the data collected by those devices, as well as consequences for those who mistreat or are dishonest about the use of that data.

First: Know Thy Data

Gait, heart rate, travel patterns, fingerprints—wearables collect a ton of data. This valuable data is governed by laws, regulations and best practices, which are actively enforced by regulators and consumers alike. In the United States, privacy and security laws and regulations have proliferated since 2018: California, Colorado, Connecticut, Utah and Virginia have active omnibus privacy laws governing collection of consumer data, with more on the way in at least 10 other states. There are also laws specific to collecting biometric data in Illinois, Texas and Washington; health data in Washington, Connecticut and Utah; children’s data in Arkansas, California, Connecticut, Florida, Louisiana, Texas and Utah; and educational data in Minnesota. The United States also has federal laws governing particular data types and use cases, including protected health information, online collection of children’s data, financial information and educational data.

What data is used, stored, bought and collected is a threshold inquiry that must be made early in the development and commercialization process, arguably prior to sending the first organizational email. We often refer to this as “Privacy by Design.” From the outset, particularly in the wearables field, a company must know what consumer data they use, where it comes from, where and how long it is stored and who gets to use it—or not. Answers to these questions will build the company’s data maps and provide a start to an accurate accounting of the risks associated with its data.
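The data-mapping exercise described above can be made concrete in code. The sketch below is purely illustrative and not drawn from any statute or standard; every field name, category label and the sensitivity rule are hypothetical assumptions, shown only to suggest how a wearables company might record the answers to "what data, where from, where stored, how long, and who uses it."

```python
from dataclasses import dataclass

# Illustrative sketch only: a minimal "data map" entry capturing the
# questions posed above. All field names and values are hypothetical.
@dataclass
class DataMapEntry:
    data_element: str           # what is collected (e.g., "heart rate")
    category: str               # e.g., "biometric/health", "children's"
    source: str                 # where it comes from (sensor, user input, third party)
    storage_location: str       # where it is stored
    retention_period_days: int  # how long it is kept
    recipients: list            # who gets to use it
    purpose: str                # why it is collected

inventory = [
    DataMapEntry(
        data_element="heart rate",
        category="biometric/health",
        source="wrist-band optical sensor",
        storage_location="us-east cloud datastore",
        retention_period_days=365,
        recipients=["analytics team"],
        purpose="fitness trend reporting",
    ),
]

# A simple triage pass: flag entries in sensitive categories for legal review.
# The category list is an assumption, not a legal definition.
SENSITIVE = {"biometric", "health", "children"}
flagged = [e for e in inventory
           if any(s in e.category for s in SENSITIVE)]
print([e.data_element for e in flagged])  # -> ['heart rate']
```

An inventory like this, kept current, is one way to give counsel an accurate accounting of the risks associated with the company's data.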

Privacy compliance doesn’t only apply to consumer-focused companies. While only California currently provides privacy protections to employees, employees are pursuing protections in their own way, including suing employers for failing to protect data accessed in a data breach.

Second: Secure Thy Data

A report from the University of Maryland notes that a cyberattack occurs every 39 seconds. IBM’s “2023 Cost of a Data Breach Report” states that health care breach costs have increased 53.3% since 2020, and that organizations with a high level of regulatory noncompliance experienced breach costs averaging $5.05 million, well above the overall average.

What this means for companies in the wearables game, especially those handling health, children’s, wellness and biometric data (in particular, gaming and sports data), is that they are prime targets for bad actors. They are also targets for regulators and plaintiffs’ counsel whenever a breach appears to have stemmed from a company’s failure to adequately address security. The disruption to development, operations and sales, not only from the breach itself but also from the regulatory investigations that follow, can be massive and destructive to morale and momentum. Business priorities can end up taking a backseat to discovery requests and hours with counsel.

Conducting a meaningful audit of security vulnerabilities is of the highest importance. An assessment by a cybersecurity expert can identify gaps that the company can then mitigate and close one by one. Yes, it takes time, effort and resources; however, breaches degrade trust, and consumers will leave platforms they cannot trust.

Third: Notify Thy Users

Data breaches, notice letters and the plague of identity theft have pushed consumers to demand transparency, along with laws and regulations governing their personal data. To date, 15 states have passed new privacy laws, and these laws are reshaping how and when companies notify users about the use of personal data. The tired “privacy policy” is becoming more interactive and user-friendly. There has been a definite uptick in the implementation of “preference centers” and other tools giving consumers the chance to opt out of secondary uses of data that may not align with their expectations about how their information will be used.

This means consumers must have notice and choice. The privacy notice needs to quickly and simply explain to users what data is being collected from or about them, where it comes from, when it is collected, who has it, how long it is kept and—possibly most importantly—how their data is being used. “Sharing” data, when there is a benefit derived from the tracking of consumers for advertising purposes, has been losing favor with consumers and regulators, particularly when consumers are not given notice or choice. Further, “dark patterns,” design methods asserted to subvert or impair user decision-making or choice, are prohibited by many new state privacy laws—in addition to being called out as unfair or deceptive under consumer protection laws and regulations.
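The notice-and-choice mechanics described above can be sketched in code. This is a hypothetical illustration, not any specific law's requirement or any vendor's API; the class, use labels and default behavior are all assumptions, intended only to show how a preference center might record a consumer's opt-out of secondary uses.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a "preference center" record: the user's opt-out
# choices for secondary uses of data. Names are illustrative only; a real
# system must also honor the defaults the applicable law requires (e.g.,
# opt-in consent before any use of sensitive data).
@dataclass
class Preferences:
    user_id: str
    opted_out: set = field(default_factory=set)  # secondary uses declined

    def opt_out(self, use: str) -> None:
        # Record the consumer's choice to decline a secondary use.
        self.opted_out.add(use)

    def allows(self, use: str) -> bool:
        # A processing pipeline would check this before each secondary use.
        return use not in self.opted_out

prefs = Preferences(user_id="u-123")
prefs.opt_out("targeted_advertising")        # consumer exercises choice
print(prefs.allows("targeted_advertising"))  # -> False
print(prefs.allows("service_delivery"))      # -> True
```

Checking a record like this before every secondary use, rather than only at collection time, is one way to keep actual processing aligned with the notice the consumer received.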

While this is by no means a comprehensive assessment of the risks associated with use of wearable technologies, it provides the basic considerations when developing or selling a wearable, particularly where the wearable collects biometric, health or children’s data. True, there may be no ceiling for what amazing things wearables can provide, but companies need to wear their tech privacy on their sleeve to get past the growing minefield of privacy compliance.


Warning & Disclaimer: The pages, articles and comments on IPWatchdog.com do not constitute legal advice, nor do they create any attorney-client relationship. The articles published express the personal opinion and views of the author as of the time of publication and should not be attributed to the author’s employer, clients or the sponsors of IPWatchdog.com.
