Privacy

34 entries in Legal Intelligence Tracker

Top Legal Issues Facing Fashion & Retail in 2026

No single core event defines the headline; it surveys the ongoing legal pressures shaping fashion and retail operations in early 2026, which continue 2025 trends and are expected to persist. Key developments include escalating tariffs and trade enforcement, AI and digital-commerce risks, e-commerce scrutiny, sustainability mandates (e.g., PFAS restrictions, climate disclosures, extended producer responsibility), labor and immigration issues, Proposition 65 enforcement, financial distress with rising bankruptcies, and shifts in private equity activity.[2][5][6][11] Cases testing legal boundaries include IP disputes (e.g., Naghedi's push to trademark its woven neoprene amid a wave of "dupes," and Quince v. Deckers, which frames UGG trade dress claims as alleged monopolization), origin-labeling scrutiny, and regulatory actions such as Texas suing Shein under the DTPA over toxic chemicals and data risks.[1][3][13]

The AI Knows Too Much: When Employees Feed Trade Secrets into Generative AI Tools

Employees feeding trade secrets into public generative AI tools like ChatGPT, Claude, or Google Gemini risk waiving legal protections, as these inputs may constitute voluntary disclosure to third parties without confidentiality guarantees.[1][2] The core event stems from a February 2026 U.S. District Court ruling in United States v. Heppner (Southern District of New York), where the court held that attorney-client privilege did not apply to documents prepared using Anthropic's Claude due to its privacy policy allowing data sharing with third parties, a logic now extending to trade secrets under the Defend Trade Secrets Act (DTSA).[1][2]

Tencent integrates WeChat with OpenClaw AI agent amid China tech battle - Reuters

Tencent launched ClawBot on March 22, 2026, integrating the open-source OpenClaw AI agent into WeChat as a chat contact, enabling over 1 billion users to automate tasks like file transfers and email sending directly in the app.[1][2][3][4] This embeds advanced, autonomous AI capabilities—beyond traditional chatbots—into WeChat's messaging, payments, and mini-program ecosystem, supporting multimodal interactions with text, images, videos, and files.[1][2][3]

Lessons From CalPrivacy PlayOn Order

Core event: The California Privacy Protection Agency (CalPrivacy) settled with PlayOn Sports (operated by 2080 Media, Inc.), imposing a $1.1 million civil penalty for CCPA violations involving the sale and sharing of personal information via tracking technologies without adequate opt-out options, from January 1, 2023, to December 31, 2024.[1][3][4][5][11]

Algorithmic Pricing and AI-Powered Evidence Avoidance: Competition Law Risks and Compliance Strategies

Algorithmic pricing and AI tools face heightened U.S. regulatory scrutiny in 2026, driven by state laws, federal inquiries, and court cases addressing antitrust risks, collusion, and consumer fairness. No single core event dominates; instead, developments include new state legislation (e.g., Connecticut's HB8002 effective Jan. 1, 2026, prohibiting algorithmic pricing using nonpublic competitor data in rentals), California's AB 2564 proposal (Feb. 20, 2026) banning surveillance pricing, and over 40 bills in 24+ states targeting personalized pricing with data like location or demographics.[1][4][5][6] Key players: FTC (2024 6(b) study, 2025 findings on transparency risks); DOJ (2025 settlements with RealPage and Greystar requiring public data only); state AGs (e.g., California's inquiries to grocers/hotels); companies (RealPage challenging NY/Berkeley laws; hotels in Gibson v. Cendyn, 2025 Ninth Circuit win); states (NY, CA, CT laws; bills in PA, TX, NJ, etc.).[1][2][3][4][5]

Beyond the Server Location: Why the New Fight Over FISA 702 and the Cloud Act Matters to Corporate Privacy Strategy

Core event: The headline highlights an intensifying corporate debate over FISA Section 702 and the CLOUD Act, emphasizing that U.S. jurisdiction over cloud providers—based on corporate control rather than server location—exposes global data to compelled access and surveillance, clashing with EU GDPR rules like Article 48.[1][3][6]

The Backlash Against AI Devices That Are Always Watching

Core event: Mounting privacy backlash against always-on AI devices and data practices, highlighted by a U.S. class-action lawsuit against Meta over its AI smart glasses, where Kenya-based workers reviewed user footage including nudity, sex, and toilet use, contradicting privacy claims.[2] This joins EU probes into Meta's AI training data scraping from public Facebook/Instagram posts via short opt-out windows, covert Pixel tracking, and noyb cease-and-desist actions under GDPR.[1]

A Reminder About Florida’s Ban on Offshore Health Data Storage: What Providers and Vendors Should Know

Core event: In May 2023, Florida enacted Senate Bill 264, amending the Florida Electronic Health Records Exchange Act (codified at Fla. Stat. § 408.051(3)) to ban healthcare providers using certified electronic health record technology (CEHRT) from storing patient information outside the continental United States, its territories, or Canada—including in third-party cloud services or subcontracted facilities.[1][2][4][9]

The People Who Are Using AI at Home to Free Up Their Time

Core event/development: The news story highlights everyday people adopting AI agents and smart home systems in 2026 to automate routine tasks like comparing insurance plans, ordering groceries, managing energy use, and handling security, thereby freeing time for leisure activities such as biking or playing guitar. This reflects broader trends in predictive AI automation, where systems learn household routines to preheat ovens, track inventory for auto-reordering, optimize appliance schedules based on energy prices and solar output, and provide real-time alerts for maintenance or threats.[1][2][3]

White House Outlines AI Policy Agenda in New National Framework

On March 20, 2026, the White House under President Donald J. Trump released the National Policy Framework for Artificial Intelligence, a set of non-binding legislative recommendations to Congress for a unified federal AI approach emphasizing innovation, safety, and oversight.[1][3][4]

Grammarly’s AI tool mimicked experts without their consent. Now it’s being sued

Grammarly, owned by Superhuman Platform Inc., launched its "Expert Review" AI tool in August 2025, charging users $12/month for real-time writing feedback that mimicked the styles and advice of prominent figures, including journalist Julia Angwin, author Stephen King, and astrophysicist Neil deGrasse Tyson, without obtaining their consent.[1] On March 12, 2026, Angwin filed a class-action lawsuit in the U.S. District Court for the Southern District of New York, alleging misappropriation of names and identities for commercial gain in violation of New York and California privacy and publicity rights laws; the suit seeks class certification, an injunction, and damages for affected journalists, authors, editors, and others.[1]

Feedback on Frictionless Opt-Outs: CalPrivacy Seeks Comments from Businesses on Experience with Opt-Out Preference Signals

Core event: The California Privacy Protection Agency (CalPrivacy) issued an invitation for preliminary comments on March 6, 2026, seeking stakeholder input, especially from businesses, on experiences with Opt-Out Preference Signals (OOPS) and reducing friction in exercising privacy rights under the CCPA; comments are due by April 6, 2026.[1][3][7]

Goodwin Launches OC Office With 3 Ex-Jones Day Partners

Core event: Goodwin Procter LLP launched its first Orange County office in Newport Beach, California, on March 17, 2026, by recruiting three partners—Richard Grabowski, John Vogt, and Ryan Ball—from Jones Day to lead it.[1][3][7][9] These attorneys specialize in cybersecurity, privacy, technology litigation, trade secrets, and consumer financial services, bringing elite trial experience including top defense verdicts and summary judgments in high-stakes cases.[1][6][8][10]

Ex-Williams & Connolly Clerk Accused Of Posting Client Info

A former clerk of Williams & Connolly LLP has been posting confidential firm information, including client details and work email exchanges, on public platforms and threatening to continue leaking materials he described as "a fun read."[1][3][5] The firm filed a lawsuit against him in District of Columbia Superior Court on March 27, 2026, seeking to halt the disclosures.[1][5]

Four Standards Law Firms Should Use to Evaluate AI Marketing Tools

The article "Four Standards Law Firms Should Use to Evaluate AI Marketing Tools," published March 23, 2026, by Jamie Adams of Scorpion, outlines four key criteria for law firms to assess AI marketing solutions amid hype and rapid adoption. It argues that effective AI must deliver measurable business outcomes like increased signed cases, reduced costs per client, and revenue growth, rather than superficial metrics such as website traffic or form submissions[1].

CalPrivacy Issues $1.1 Million Fine for CCPA Violations Involving Student Privacy

The California Privacy Protection Agency (CalPrivacy) fined PlayOn Sports (2080 Media, Inc.) $1.1 million on March 3, 2026, for CCPA violations stemming from its GoFan digital ticketing platform used by about 1,400 California schools. PlayOn collected personal information via tracking technologies (e.g., cookies, Meta Pixel) for targeted advertising, constituting "sale" and "sharing" under CCPA, but failed to provide effective opt-out mechanisms.[1][4][5][9] Violations included phone/email opt-outs that didn't block website trackers, reliance on third-party tools like NAI/DAA, non-recognition of Global Privacy Control signals, coercive "agree-only" banners forcing consent (especially from students), misleading privacy notices claiming no sales, and inadequate disclosures.[1][3][5][7]
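The Global Privacy Control (GPC) signal at issue in the order is a browser-level opt-out transmitted as an HTTP request header (`Sec-GPC: 1`). As a rough illustration of what "recognizing" the signal means server-side (the function name and header map are hypothetical, not PlayOn's or CalPrivacy's API), a sketch might look like:

```typescript
// Sketch: honoring the Global Privacy Control (GPC) opt-out signal.
// Per the GPC proposal, a browser opting the user out sends the header
// "Sec-GPC: 1". All names below are illustrative, not a real framework API.

type RequestHeaders = Record<string, string>;

function shouldSuppressTracking(headers: RequestHeaders): boolean {
  // HTTP header names are case-insensitive; normalize before lookup.
  const normalized = Object.fromEntries(
    Object.entries(headers).map(([k, v]) => [k.toLowerCase(), v.trim()])
  );
  // The proposal defines exactly one valid opt-out value: "1".
  return normalized["sec-gpc"] === "1";
}

// A request carrying the signal must be treated as an opt-out of
// "sale"/"sharing" under the CCPA, i.e., tracking pixels stay off.
console.log(shouldSuppressTracking({ "Sec-GPC": "1" })); // true
console.log(shouldSuppressTracking({}));                 // false
```

The enforcement point in the order was that opt-outs taken by phone or email, or delegated to third-party tools, do not substitute for honoring this signal at the request level.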

'Great crackdown': Russia tightens the screws on the internet - Reuters

Core event: Russia has intensified internet controls through a sweeping "crackdown," mandating that all websites, apps, and online platforms register with the state-run registry under Roskomnadzor, the federal communications regulator. Non-compliant foreign services face immediate blocking, while domestic providers must integrate with the sovereign RuNet infrastructure for real-time content monitoring and data localization. Announced on March 19, 2026, the measures include AI-driven censorship tools to filter "extremist" or "fake" content, building on prior laws but with unprecedented enforcement speed—over 500 sites were blocked in the first 48 hours.

Privacy Tip #483 – Whistleblower Alleges DOGE Employee Stole Social Security Data on a Thumb Drive

A whistleblower complaint alleges that a former Department of Government Efficiency (DOGE) software engineer stole two highly sensitive U.S. Social Security Administration (SSA) databases—"Numident" and the "Master Death File"—containing records on over 500 million living and dead Americans, including Social Security numbers, birth data, citizenship, race, ethnicity, and parents' names, by copying them onto a thumb drive.[1][2][3] The unnamed engineer reportedly boasted to colleagues at his new employer, a government contractor, about possessing the data and planning to use it there, prompting an investigation by SSA's independent inspector general.[1][2][3] SSA denies the theft, dismissing it as "fake news" from The Washington Post aimed at scaring seniors, and the ex-employee and the contractor likewise dispute the claims.[1][2]

[Podcast] Building Cyber Readiness for Government Contractors in 2026

The core event is a Wiley Rein LLP podcast episode released on March 25, 2026, discussing cyber readiness strategies for government contractors amid escalating 2026 cybersecurity mandates. Hosted by attorneys Scott Felder and Brian Walsh, it features Megan Brown and Erin Joe from Wiley’s Privacy, Cyber & Data Governance Practice, who share incident response lessons from ransomware, nation-state attacks, and data exfiltration. They outline governance improvements, third-party risk management, tabletop exercises, reporting navigation, and AI scrutiny preparation.[4]

AI Vendor Contracts: The Terms And Conditions Trap

Above the Law article warns in-house lawyers of hidden risks in standard AI vendor contracts, exemplified by one agreement granting vendors co-ownership of customer-generated content and perpetual licenses to data via broadly defined "Aggregated Statistics," with no anonymization standards or data recovery options.[1]

State Enforcers Step Up Scrutiny of Foreign Data Transfers: What Organizations Should Know

State enforcers, particularly in Florida and Texas, have intensified scrutiny of organizations' foreign data transfers involving sensitive personal information like precise location, biometrics, and genomic data, amid a broader shift to enforcement of existing U.S. state privacy laws in 2026.[6][7] The core development builds on a 2024 federal Department of Justice (DOJ) rule restricting outbound transfers of "covered data" to "countries of concern" (e.g., China, Iran, North Korea), now amplified by state-level actions targeting vendor relationships, data brokers, and service providers potentially exposing U.S. data to foreign adversaries.[6]

Utah SB 275’s “Digital Identity Bill of Rights”: What It Could Mean for Businesses

Utah SB 275 passed unanimously in both legislative chambers, establishing the nation's first "Digital Identity Bill of Rights" and a voluntary state-endorsed digital identity program. The bill creates rights for residents to control their digital identities, including selective disclosure of attributes (e.g., name without birthdate or address), protection from compelled digital ID use, and safeguards like explicit consent, purpose limitation, and a "duty of loyalty" prohibiting exploitation by providers.[1][2][3][5] It mandates standards for digital wallets, verifiers, and relying parties, with requirements for tamper-resistant tech, secure processing, and minimal data use; the program endorses specific attributes like name, birthdate, image, and Utah address.[1][4][5]
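Selective disclosure, as described above, means a wallet releases only the attributes a verifier requests and the holder approves. A minimal sketch of that idea (all types and names are hypothetical; SB 275 does not prescribe an API, and real implementations would use cryptographic credentials rather than plain objects):

```typescript
// Sketch of selective attribute disclosure: the holder's wallet releases only
// the intersection of what the verifier requests and what the user consents to.
// Everything here is illustrative, not drawn from SB 275 or any wallet SDK.

interface IdentityRecord {
  name: string;
  birthdate: string;
  address: string;
  image: string;
}

function disclose(
  record: IdentityRecord,
  requested: (keyof IdentityRecord)[],
  consented: (keyof IdentityRecord)[]
): Partial<IdentityRecord> {
  const released: Partial<IdentityRecord> = {};
  for (const attr of requested) {
    // Purpose limitation: release only attributes the holder approved.
    if (consented.includes(attr)) {
      released[attr] = record[attr];
    }
  }
  return released;
}

const record: IdentityRecord = {
  name: "A. Resident",
  birthdate: "1990-01-01",
  address: "Salt Lake City, UT",
  image: "(photo)",
};

// Verifier asks for name and birthdate; holder consents to name only,
// so birthdate and address never leave the wallet.
console.log(disclose(record, ["name", "birthdate"], ["name"]));
```

The "duty of loyalty" and data-minimization requirements in the bill are, in effect, legal constraints forcing providers toward this release-only-what-is-approved pattern.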

GSA Proposes New Contract Clause Focused on the Government Use of AI

Core event: On March 6, 2026, the U.S. General Services Administration (GSA) released a draft contract clause, GSAR 552.239-7001, "Basic Safeguarding of Artificial Intelligence Systems" (Feb 2026 GSAR Deviation), for inclusion in GSA solicitations and contracts involving AI capabilities, particularly ahead of the Multiple Award Schedule (MAS) refresh planned for late March or April 2026.[1][3][4][5]

Navigating Global Background Checks: Key Insights for Employers

No specific core event occurred; the headline summarizes ongoing 2026 regulatory updates and trends in global background checks for employers. These center on "Clean Slate" and "Fair Chance" reforms tightening criminal history access, alongside rising demands for compliant international screening amid global hiring.[1][5][6]

New Surveillance Tools in Retail Stores Pose Legal Risks

Retailers are increasingly deploying AI-powered surveillance tools like facial recognition, AI-enhanced cameras, and body cameras on security guards to combat rising theft rates, but these raise significant legal risks under privacy laws.[1][7] The core event is a March 19, 2026, legal alert highlighting how audio recording without consent violates the Federal Wiretap Act and state all-party consent laws (e.g., California, Illinois), while biometric data collection implicates state restrictions and FTC rules against unfair practices.[1][7]
