Shopify has been named in a putative class action, with a number of named plaintiffs alleging that the e-commerce hosting platform, along with content moderation provider TaskUs, failed to exercise “reasonable care in securing and safeguarding consumer information in connection with a massive 2020 data breach impacting Ledger SAS hardware wallets,” which are used to store individuals’ cryptocurrency holdings and other digital tokens. The sweeping breach was caused by “rogue members” of Shopify’s support team, “including employees of TaskUs, Inc.,” the plaintiffs assert, and ultimately, resulted in “the unauthorized public release of approximately 272,000 pieces of detailed personally identifiable information.”

In the complaint that they filed with a federal court in Delaware on April 1, Plaintiffs Gregory Forsberg, Christopher Gunter, Scott Sipprell, and Samuel Kissinger allege that Shopify, which hosts Ledger’s e-commerce site, and TaskUs “repeatedly and profoundly failed to protect [Ledger] customers’ identities,” resulting in their personally identifiable information – from full names and telephone numbers to email and postal addresses – being accessed in the July 2020 breach and subsequently shared across the dark web. 

(Ledger – which recently partnered with LVMH-owned brands Fendi and Hublot for luxury crypto hardware collabs – is not named as a defendant in the lawsuit, although it and Shopify were both named as defendants in a previously-filed class action related to the breach. That case, which was lodged in the Northern District of California, was dismissed on personal jurisdiction grounds.)

Aside from making such information available to “every hacker who wanted access to [it],” the plaintiffs assert that such alleged failures by Shopify and TaskUs to safeguard consumers’ information are particularly problematic because they mean that the plaintiffs are “no longer in possession of a secure cryptocurrency portfolio.” Specifically, Forsberg and the other named plaintiffs, who had their personal information stolen after hackers accessed Shopify’s list of Ledger customers, claim that while “cryptocurrency transactions are publicly visible through a transaction’s underlying blockchain,” they “cannot be traced back to their particular owner without more information.” 

“When hackers know the identity of a cryptocurrency owner and know what platform that consumer is storing their crypto-assets on, the hacker can work backwards to create a targeted attack aimed at luring hardware wallet owners into mounting their hardware device to a computer and entering their passphrase, allowing unfettered access and transfer authority over their crypto-assets,” the plaintiffs assert. And that is precisely what happened here, they claim, stating that in the wake of the Shopify breach, hackers engaged in “targeted attacks on thousands of customers’ crypto-assets and causing [the plaintiffs and proposed class] members to receive far less security than they thought they had purchased with their Ledger Wallets.” 

The plaintiffs allege that Shopify and TaskUs’s “misconduct,” including but not limited to “their failure to (a) prevent the data breach and (b) take action in response thereto for approximately six months – if not longer – has made [them and other class members] targets.” Such alleged misconduct was only made worse, according to the plaintiffs, by Shopify and TaskUs’s “deficient response,” including their “failure to notify every affected customer or admit to the full scope of the data breach,” which resulted in “many Ledger customers falling victim to hackers’ phishing emails and resulting fraud.” 

With the foregoing in mind, the named plaintiffs set out claims of negligence, unjust enrichment, and violations of various states’ consumer fraud and deceptive and unfair trade practices laws, and contend that they have suffered damages as a result of the defendants’ negligence, including “the fraudulent removal of cryptocurrency from [their] portfolios due to sophisticated scam attacks on [their] Ledger wallets.” Beyond that, they argue that they “remain at a significant risk of additional attacks now that [their] personally identifiable information has been leaked online.” In total, they claim that damages in this case exceed $5 million and that the number of class members exceeds 100 people. 

Reflecting on the potential damages at play in such a case, Covington & Burling lawyers Samuel Greeley, Ashley Simonsen, Mike Nonaka, and Kathryn Cahoy state that “due to the nature of cryptocurrency valuations, the individual damages claims in these cases have the potential to far exceed the more nominal individual amounts in a typical data breach case where the primary payout is identity theft protection services.” However, these cases are hardly expected to be straightforward matters given that “cryptocurrency transactions often are non-reversible, [and] so, unlike thefts from traditional online banking services, it may be difficult or impossible to claw back stolen crypto funds” – which is presumably why more easily-identifiable and accountable parties like Shopify and co. are named as defendants in these largely negligence-centric lawsuits.

Other cases involving similar theories relating to data breaches that allegedly resulted in the theft of cryptocurrency have been filed recently, including in the Northern and Central Districts of California, the Covington & Burling lawyers state, which they say “suggests that this area will continue to face increasing litigation activity.” In addition to suits centering on the theft of cryptocurrencies, a growing number of suits are being filed against marketplaces like OpenSea over the theft of non-fungible tokens (“NFTs”) from users’ crypto wallets in phishing attacks allegedly caused by platform owners’ negligence. 

“As cryptocurrency storage and related transactions,” as well as purchases of digital tokens, such as NFTs, “increasingly feature in companies’ online presence,” Greeley, Simonsen, Nonaka, and Cahoy claim that “there is likely to be a growing risk posed by threat actors motivated to target crypto-related assets and data, and more litigation activity in this space.” 

The case is Forsberg v. Shopify, Inc., 1:22-cv-00436 (D. Del.).

Consumers using online retail marketplaces, such as eBay and Amazon, “have little effective choice in the amount of data they share,” according to the latest report of the Australian Competition & Consumer Commission (“ACCC”) Digital Platform Services Inquiry. While consumers may benefit from personalization and recommendations from these marketplaces based on their data, the data privacy-focused report states that many people are in the dark about how much personal information these companies collect and share for other purposes. 

The report reiterates the ACCC’s earlier calls for amendments to the Australian Consumer Law to address unfair data terms and practices. It also points out that the government is considering proposals for major changes to privacy law. However, none of these proposals is likely to come into effect in the near future. In the meantime, it is worth considering whether practices, such as obtaining information about users from third-party data brokers, are fully compliant with existing data privacy law. 

Online Marketplace Examination

The ACCC examined competition and consumer issues associated with “general online retail marketplaces” as part of its five-year Digital Platform Services Inquiry. These marketplaces facilitate transactions between third-party sellers and consumers on a common platform. They do not include retailers that do not operate marketplaces or platforms that publish classified ads but don’t allow transactions.

The ACCC report focuses on the four largest online marketplaces in Australia: Amazon Australia, Catch, eBay Australia and Kogan. In 2020–21, these four marketplaces recorded sales totaling $8.4 billion. According to the report, eBay has the largest sales of these companies. Amazon Australia is the second-largest and the fastest-growing, with an 87% increase in sales over the past two years. In furtherance of its report, the ACCC examined the state of competition in the relevant markets; issues facing sellers who depend on selling their products through these marketplaces; and consumer issues, including concerns about personal information collection, use and sharing.

Consumers Don’t Want Their Data Used for Other Purposes

The ACCC expressed concern that in online marketplaces, “the extent of data collection, use and disclosure … often does not align with consumer preferences.” The Commission pointed to surveys about Australian consumer attitudes to privacy, which indicate that 94 percent of consumers did not feel comfortable with how digital platforms, including online marketplaces, collect their personal information. Ninety-two percent agreed that companies should only collect information they need for providing their product or service, and 60 percent considered it very or somewhat unacceptable for their online behavior to be monitored for targeted ads and offers.

However, the four online marketplaces analyzed do not proactively present privacy terms to consumers “throughout the purchasing journey”; may allow advertisers or other third parties to place tracking cookies on users’ devices; and do not clearly identify how consumers can opt out of cookies while still using the marketplace. Some of the marketplaces also obtain extra data about individuals from third-party data brokers or advertisers.

The harms from increased tracking and profiling of consumers include decreased data privacy; manipulation based on detailed profiling of traits and weaknesses; and discrimination or exclusion from opportunities. 

You Can’t Just “Walk Out of a Store”

Some might argue that consumers must not actually care that much about data privacy if they keep using these companies, but the choice is not so simple. The ACCC notes the relevant privacy terms are often spread across multiple web pages and offered on a “take it or leave it” basis. The terms also use “bundled consents.” This means that agreeing to the company using your data to fill your order, for example, may be bundled together with agreeing for the company to use your data for its separate advertising business. 

Further, there is so little competition on privacy between these marketplaces that consumers cannot just find a better offer. The ACCC agrees, stating, “While consumers in Australia can choose between a number of online marketplaces, the common approaches and practices of the major online marketplaces to data collection and use mean that consumers have little effective choice in the amount of data they share.” Consumers also seem unable to require these companies to delete their data. The situation is quite different from conventional retail interactions where a consumer can select “unsubscribe” or walk out of a store. 

Do Our Data Privacy Laws Permit These Practices?

The ACCC has reiterated its earlier calls to amend the Australian Consumer Law to prohibit unfair practices and make unfair contract terms illegal. (At present unfair contract terms are just void, or unenforceable.) The report also points out that the government is considering proposals for major changes to privacy law, but these changes are uncertain and may take more than a year to come into effect.

In the meantime, it is worth looking more closely at the practices of these marketplaces under current privacy law. For example, under the federal Privacy Act the four marketplaces “must collect personal information about an individual only from the individual unless … it is unreasonable or impracticable to do so.” However, some online marketplaces say they collect information about individual consumers’ interests and demographics from “data providers” and other third parties. We do not know the full detail of what is collected, but demographic information might include our age range, income, or family details. 

How is it “unreasonable or impracticable” to obtain information about our demographics and interests directly from us? Consumers could ask online marketplaces this question, and complain to the Office of the Australian Information Commissioner if there is no reasonable answer.

Katharine Kemp is a Senior Lecturer in the Faculty of Law & Justice at UNSW Sydney. (This article was initially published by The Conversation.)

With so much conflicting information on customer technology innovation being fed to premium and luxury brand executives, it is no wonder many are confused. Understanding and prioritizing new consumer tech tools is an essential skill that all customer-centric organizations must master. However, executives in critical customer-facing functions, such as e-commerce, marketing, sales, and customer service, are often so focused on programmatic ad technology that they fail to examine and implement new technologies that better serve the best interests of the brand and its customers. 

Against that background, here is a look at five key technologies, in order of priority, and how brands should approach each for maximum financial performance in 2022 and coming years … 

Social Selling Tech

The future of e-commerce is personal and human. Imagine the power of a global social selling community powering the brand’s e-commerce. Social selling is the act of empowering every sales associate with their own “storefront,” or microsite, with complete brand oversight. Associates can creatively curate and personalize it for each customer. They can post favorite products, curate boards of products, list services, and integrate with social media such as TikTok and Instagram. Social selling tech integrates and optimizes emotionally intelligent digital retail with social media. LVMH and L’Oréal have invested heavily in social selling leader Replika Software and are rolling it out to their brands globally. 

Whatever social selling tool brands select, they need to make this the top priority in customer tech beyond a basic e-commerce website. Social selling can improve average cart size by 50 to 200 percent and conversion rates by 8 to 10 times. Game-changing results are the main reason most premium and luxury leaders are putting social selling at the top of the list.

Short Video and Livestream Shopping 

The second most powerful consumer technology that brands should start executing right now is short video and livestream shopping technology. While operating on TikTok and Instagram is valuable, the most valuable video move is to focus resources on bringing static, brochureware sites to life with short videos for every product, and especially the best sellers. Carefully test livestreaming directly from stores and make sure production value and brand ambassadors live up to premium and luxury brand standards. Results from short video and livestream tech platform leader Firework show that average customer engagement increases from 8 seconds to 17 minutes, while conversion can go from 0.2 percent to 34 percent. This second priority is a no-brainer. The synergies with social selling are fantastic.

Licensing Customer Behavioral Data

Digital cookie tracking is going to be eliminated as consumer privacy protections become stricter and stricter. While this will certainly be a blow to unethical and illegal data brokers, it is a once-in-a-lifetime opportunity for ethical brands to engage with consumers directly for data access and optimize their marketing and selling communications and product innovations. The first generation of personal data exchanges is now in full bloom. Using a personal data exchange, any brand can design and deploy a fair-value rewards email or text campaign to different segments of customers to gain consent to license their digital platform data (Instagram, Google, etc.). 

The best personal data exchanges are fiduciaries and can guarantee end-to-end encryption, full anonymity, privacy, copyright, and licensing law protections to consumers. The brand never has to take possession of the data. Consumers receive fair value and privacy, while brands generate rich, relevant, actionable, anonymous, segment-level behavioral insights. Brands can utilize these behaviorally driven insights to develop new products, compelling content, and offers. Brands can finally begin to design and deliver extraordinary customer experiences. This behavioral data can be supplemented with surveys and other tools to establish an ongoing dialogue and achieve a 360-degree understanding of customer behaviors and their motivations. Licensing digital platform data directly from the customer is a major leap in the consumer insights journey to personalization. That leap makes it one of the top three priorities in consumer tech for premium and luxury brands.

Blockchain

Blockchain is simply a shared, immutable ledger that facilitates the process of recording transactions and tracking assets in a business network. An asset can be any raw material used to make a handbag, piece of jewelry, watch, or other premium or luxury good. Probably the most powerful way for the premium and luxury goods industry to use blockchain right now is to provide provenance (authentic proof of origin) to prevent counterfeits or otherwise infringing products from landing in the hands of consumers seeking authentic goods. Blockchain can provide a secure and trusted tracking system from one end of the supply chain (the creation or mining of raw materials) to the end state, where a customer enjoys, and can even resell, the authentic product. 
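
To make the “immutable ledger” idea concrete, the simplified Python sketch below shows how hash-linked provenance records make tampering detectable: each entry carries the hash of the previous one, so altering any step of a product’s history breaks the chain. This is an illustrative toy example only, not the architecture of Aura or any other consortium, and the asset IDs, event names, and helper functions are hypothetical.

```python
import hashlib
import json
import time

def make_record(asset_id, event, previous_hash):
    """Create one provenance entry, chained to the prior entry via its hash."""
    record = {
        "asset_id": asset_id,        # e.g., a serial number for a handbag or watch (hypothetical)
        "event": event,              # e.g., "leather sourced", "assembled", "sold"
        "timestamp": time.time(),
        "previous_hash": previous_hash,
    }
    # Hash of the record's contents; any later tampering changes this value.
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

def verify_chain(chain):
    """Recompute each hash and check the links; returns False if any entry was altered."""
    for i, record in enumerate(chain):
        body = {k: v for k, v in record.items() if k != "hash"}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["hash"] != expected:
            return False
        if i > 0 and record["previous_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Example: trace a single (hypothetical) handbag from raw material to retail sale.
chain = [make_record("BAG-0001", "leather sourced", previous_hash="0" * 64)]
chain.append(make_record("BAG-0001", "assembled in atelier", chain[-1]["hash"]))
chain.append(make_record("BAG-0001", "sold at boutique", chain[-1]["hash"]))
print(verify_chain(chain))  # True, until any entry is modified
```

In a real deployment, the records would be replicated and validated across the network’s participants rather than held by a single party, which is what gives the ledger its shared, tamper-evident character.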

When used to authenticate origin, blockchain can help eliminate counterfeits by enabling brands and law enforcement to determine the authenticity of a product, curbing criminal activity and enhancing the sales and profits of legitimate brands. The Aura Blockchain Consortium, led by respected industry veteran Daniela Ott, is bringing top-tier luxury brands together to establish product passports for luxury goods. Since it protects the original brand, and very importantly, the investment of the customer, using blockchain to trace products and fight counterfeiting is the fourth priority in customer tech.

Non-Fungible Tokens

Non-fungible tokens (“NFTs”) can represent any asset. In the premium and luxury industry, NFTs are mostly used to signify that, hidden in a digital artwork, there is a unique, authenticating unit of data stored on a digital ledger (blockchain again) that establishes proof of ownership. That is powerful in the digital world. While NFTs will probably bring new forms of innovation to art, fashion, entertainment, beauty, retail, and many other categories, since they are tied to blockchain, which is the foundational force behind cryptocurrencies, they have had the same growing pains. Nonetheless, what premium and luxury brands need to do now is test and learn their way into the NFT marketplace, very meticulously. 

Brands that want to avoid being part of a scam, having their customers scammed, or being scammed themselves can steer clear of toxic exchanges and dodgy NFT players. Technology from CryptID Technologies and other encryption innovators will soon be available to allow brands to authenticate proof of origin and ownership for any digital asset, at scale, using tamper-proof, advanced verification techniques called Zero Knowledge Proofs. Using cryptographic proofs that do not require blockchains to prove authenticity, and that massively scale transactions where blockchain cannot, will liberate and empower NFT originators, buyers, and sellers. 

Milton Pedraza is the CEO of the Luxury Institute, the Chairman of DataLucent, and is globally recognized as one of the world’s leading experts and private investors in personal data innovation.

At the beginning of the year, Moncler confirmed that it had suffered from a ransomware attack on its systems that led to a headline-making data breach. The leaked data exposed information about the Italian outerwear-maker’s employees and former employees, “some suppliers, consultants and business partners,” and customers. The incident followed a data breach at American fashion brand Guess in the summer of 2021, in which criminals were able to obtain social security numbers, ID numbers (driving licenses and passports), and financial account numbers. Around the same time, Chanel suffered a similar fate with its South Korean operation, which resulted in the leak of names, personal information, and shopping histories.

Instances of cyberattacks and hacking generally should not come as a surprise to brands. A recent Office for National Statistics report showed that while most forms of crime in the United Kingdom are trending downward, crimes involving computers and hacking are experiencing a noticeable uptick. The same is true in the U.S., with ransomware attacks alone rising by almost 100 percent in 2021, according to SonicWall’s 2022 Cyber Threat Report.

When hacks occur, government agencies, such as the Information Commissioner’s Office (“ICO”) in the United Kingdom and the Federal Bureau of Investigation and relevant state authorities in the U.S., expect companies to deal with them proactively and ensure that any serious breach is resolved effectively. In light of the increasing threat of cyberattacks and hacking, including for fashion brands, guidance on how companies – in fashion and beyond – should approach the issue is set out below … 

What do hackers want and how do they get it?

Fashion brands are a gold mine for data that can be exploited. Hackers target clients’ personal information, financial information, and operations and systems, all of which are readily accessible, especially since most players in this space maintain e-commerce shops. Hackers can access such information by way of a data breach, namely: targeted attacks on secure logins, through which they obtain information; ransomware, where access to files or systems is blocked until a ransom fee is paid; and/or denial-of-service attacks, in which a system or server is flooded with targeted requests, preventing legitimate requests from being fulfilled.

What actions should you take if a breach occurs?

In the UK, the ICO expects a company to take action if it finds itself the victim of a cyberattack or breach, which it defines as “a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, personal data.” Primarily, companies are expected to carry out a data breach risk assessment, including by determining whether there is a risk that data subjects will be seriously affected by the breach. They are also expected to inform individuals who have been affected by a high-risk data breach without delay, and to inform the regulator as soon as practically possible, and in any event, within 72 hours.

In the U.S., the response will vary from state to state. Depending on the severity of the breach, the state attorney general, and eventually customers, may need to be notified, with notification requirements similar to those found under UK law.

When providing details to affected individuals, a brand needs to inform them, in clear language, of the nature of the breach and what personal data was affected. They should also be provided with details of the relevant contact point or the details of the brand’s data protection officer. It is recommended that individuals are provided with information on how the brand will assist them going forward and any actions they can take to protect themselves. Guidance from the ICO outlines that this may include forcing a password reset; advising individuals to use strong, unique passwords; and telling them to look out for phishing emails or fraudulent activity on their accounts.

If, after a risk assessment, the brand has decided that a notification to the ICO is not necessary, it is still highly advisable that the company records information about the breach and actions taken in response. If the ICO decides that an investigation is necessary, the company may be asked to justify the decisions it made.

Reporting the data breach

If a report to the ICO is necessary, then it is important that the following information is captured and shared with the ICO: the approximate number of affected individuals; how many personal data records were affected; the name and contact details of the data protection officer or other contact point; the effects of the breach; and the actions taken in response.

Again, in the U.S., while this may vary from state to state, it is likely that the report will contain information that is similar to what is expected by the ICO.

Take home points

If a brand finds itself on the receiving end of a cyberattack or other data breach, it is important to be as prepared as possible. Planning in advance is ideal, and is likely to include contingency measures. However, as it may be difficult to plan for all eventualities, the following best practices will also limit what can be hacked: Do not store sensitive data in clear text – pseudonymize or encrypt it. Do not hold onto incomplete or old data; while it may not be relevant to your business, it can expose the data subjects to malicious actions from hackers. Ensure that access to data is handled on a strict basis, and ensure that the company maintains an appropriate security policy and carries out regular cyber security training for staff. Finally, carry out regular information risk assessments, and maintain a response and recovery plan.
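
As one illustration of the “do not store sensitive data in clear text” point, the minimal Python sketch below pseudonymizes a customer identifier with a keyed hash before it is written to a database. It uses only the standard library; the key value and function name are hypothetical, and a real deployment would pair this with proper encryption and key management rather than rely on it alone.

```python
import hmac
import hashlib

# Secret key held separately from the database (e.g., in a key vault).
# Without it, stored pseudonyms cannot easily be linked back to real identities.
SECRET_KEY = b"replace-with-a-securely-stored-key"  # placeholder value

def pseudonymize(identifier: str) -> str:
    """Return a keyed hash of an identifier (e.g., an email address) for storage."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Store the pseudonym instead of the raw email: analytics can still join records on it,
# but a leaked table alone does not reveal who the customer is.
print(pseudonymize("customer@example.com"))
```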

Vladimir Arutyunyan is an associate in Fox Williams’ commercial and technology team in London.

Louis Vuitton North America is coming under fire in a new lawsuit for allegedly collecting consumers’ biometric data in connection with a virtual try-on tool on its website. According to the complaint that she filed on April 8, plaintiff Paula Theriot claims that Louis Vuitton North America (“Louis Vuitton”) “sells the image of luxury to consumers, inviting them to virtually ‘try on’ its designer eyewear through its website’s ‘Virtual Try-On’ feature.” The problem with that, Theriot argues, is that by enabling Louis Vuitton to access images of their faces, consumers are providing the brand with “detailed and sensitive biometric identifiers and information, including complete facial scans,” which the brand allegedly collects and stores “without first obtaining their consent, or informing them that this data is being collected.” 

In the newly-filed lawsuit, Theriot alleges that “unbeknownst to its website users, including [herself],” Louis Vuitton “collects detailed and sensitive biometric identifiers and information, including complete facial scans, of its users through the Virtual Try-On tool.” Despite consumer concerns regarding facial-scanning technology, Theriot asserts that Louis Vuitton “refused, and continues to refuse, to inform users that it is using technology on its website to collect their biometric facial scans, and neither informs users that their biometric identifiers are being collected, nor asks for their consent.” 

Beyond consumers’ privacy concerns and their expectations that “companies will follow the law in promoting and advertising [their] products,” Theriot claims that Louis Vuitton’s alleged practice of collecting consumers’ biometric data runs afoul of the “clear mandate” created by Illinois’s Biometric Information Privacy Act (“BIPA”). 

Enacted in 2008, BIPA requires “private entities, including companies like Louis Vuitton, that collect certain biometric identifiers or biometric information, or cause such information and identifiers to be collected, to take a number of specific steps to safeguard the biometric data they collect, store, or capture.” (Under BIPA, a “biometric identifier” generally means “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” and “biometric information” means “any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual.”) Such steps include – but are not limited to – “obtain[ing] informed consent from consumers prior to collecting such data from them,” Theriot argues, and “publicly disclos[ing] to consumers their uses, retention of, and a schedule for destruction of the biometric information or identifiers that they do collect.” 

Theriot contends that she “never provided a written release to [Louis Vuitton] authorizing it to collect, store, or use her facial scans or facial geometry, and was never informed, in writing or otherwise, about the purpose for collecting her biometric data,” and upon reviewing the luxury brand’s website terms and conditions, “did not see any indication that her biometric information would be collected or distributed in those website terms and conditions.” 

Hardly a one-off occurrence, Theriot claims that Louis Vuitton has violated BIPA and continues to violate it “each and every time a website visitor based in Illinois uses the Virtual Try-On tool because Louis Vuitton continues to collect and store or facilitate the storage of biometric information or biometric identifiers without disclosure to or consent of any of the consumers who try on glasses on their website, necessarily using [its] Virtual Try-On tool to do so.” 

With the foregoing in mind, Theriot alleges that Louis Vuitton has violated BIPA, namely, by way of its “failure to inform in writing and obtain written release from users prior to capturing, collecting, or storing biometric identifiers” and “failure to develop and make publicly available a written policy for retention and destruction of biometric identifiers.” As such, she is seeking authorization from the court that the case can proceed as a class action, thereby, enabling other individuals to join, as well as monetary damages and injunctive relief to force Louis Vuitton to comply with BIPA. 

Louis Vuitton is not the only company on the receiving end of a currently-pending BIPA lawsuit. A similar case was filed against Target in a federal court in Illinois in October 2021, with the plaintiff accusing the retailer of making use of a virtual try-on feature that unlawfully captures, collects, and stores consumers’ biometric data through facial geometry scans. In that case, plaintiff Aimee Potter claims that Target enables consumers to test makeup products virtually by either scanning or uploading a photo of their face to its app, but fails to obtain consent to collect consumers’ biometric data and also fails to inform consumers that it is collecting such data.  

Both cases – and others like them – reflect the risks that are at play as “more and more retailers introduce virtual try-on tools that use biometric technology to recreate the fitting room experience for their online consumers,” particularly in the wake of the pandemic, per Steptoe & Johnson attorneys Stephanie Sheridan, Meegan Brooks and Surya Kundu. 

Against this background – and given that virtual try-on features are proving to be more than a pandemic trend, representing instead a notable step in a larger omnichannel retail revolution, and that more states may pass BIPA-like laws of their own – it is important for private entities that collect, or may start collecting, biometric data from consumers to consider best practices for maintaining public-facing privacy policies, appropriate consents and notices, and data security safeguards, among other things. 

A rep for Louis Vuitton was not immediately available for comment.

The case is Theriot v. Louis Vuitton North America, Inc., 1:22-cv-02944 (SDNY).