January 2026
Delaware court rejects overbroad noncompete in profits interest award
The Delaware Court of Chancery recently recommended dismissal of an employer’s motion to enforce a two-year noncompete contained in a profits interest award against a former employee. The restriction prohibited the employee, for two years post-employment, from competing with “all or any portion of any business conducted or planned to be conducted” by the company or any of its affiliates “in any geographic area in which [the company] or any of its [a]ffiliates does business,” or with any entity that “directly or indirectly” provides Software as a Service products in the health and wellness industry.
With respect to the noncompete’s scope, the court held that the two-year duration, coupled with a global reach spanning at least sixty-eight countries, was insufficiently tailored to the employee’s role, which involved managing sales only for the small and medium business market within the organization. The court criticized the noncompete’s coverage of all “affiliates” without any description of the nature or scope of those businesses. Balancing the equities, the court found that the noncompete elevated undefined affiliates’ interests above the employee’s right to engage in his trade. The court added that the employer failed to demonstrate that the employee even knew of these affiliates’ business operations, and that such knowledge would not, in any event, save the overbroad noncompete.
Turning to consideration, the court evaluated the noncompete as arising in the employer-employee context rather than the sale-of-business context, where noncompetes receive less exacting scrutiny. It then held that the employee’s 150,000-unit profits interest award was insufficient to support the sweeping covenant. The court noted the vesting schedule: half the units would vest over five years of continued employment, while the remaining half would vest only upon a future sale meeting certain financial benchmarks. As a result, the employee “could neither expect nor guarantee” that the interest would vest in whole or in part. In fact, only 10% of the units had vested when the employee departed, an interest the employer repurchased for $52,241. Given the remote and contingent nature of the consideration, the court held it was not “substantial” enough to justify the noncompete. The court also declined to require the employee to return the $52,241 purchase price.
Finally, the court declined to “blue pencil” the agreement to make it enforceable, reasoning that the unequal bargaining power inherent in the employment relationship did not weigh in favor of judicial modification. The court noted that the record did not demonstrate any real negotiation, and that the company’s informing the employee that he “had the opportunity to obtain counsel” was not enough to show a “bargained-for exchange among equally positioned parties.”
S&K Take: Delaware courts continue to scrutinize noncompetes in the employment setting. Delaware is generally known for its strong commitment to contract law, and contract law typically provides that courts will not inquire into the adequacy of consideration (even a “peppercorn” may suffice). This decision treats noncompetes as an exception, evaluating whether the consideration offered was sufficient to justify the restriction. While the court acknowledged that a profits interest award is not categorically inadequate, it emphasized that remote and contingent vesting undermines the value of such consideration when weighed against the significant trade restraints a noncompete imposes on the employee.
Lawsuit alleges AI recruiter unlawfully collected applicant data without proper consent or ability to review or contest errors
Job applicants filed a proposed class action lawsuit in California state court alleging that an AI recruiting platform secretly collected and evaluated sensitive personal information to generate predictive “likelihood of success” scores that were provided to employers for use in hiring decisions. The plaintiffs allege the platform did so without obtaining proper consents, without providing required disclosures, and without providing applicants an opportunity to review or correct inaccurate information—actions they argue violate federal and state consumer‑protection statutes governing the collection and use of such “consumer reports” in employment.
According to the complaint, applicants have no way of knowing that the AI platform “lurks” behind employer application portals to scrape massive amounts of personal data from numerous sources. Those sources include information supplied directly by applicants and employers; public sources, such as LinkedIn and social media profiles; and extensive metadata, such as internet and device activity, browsing history, cookies, and other tracking and third-party data. The platform then processes the data through its proprietary large language model to generate a score intended to predict a candidate’s “likelihood of success” based on factors such as projected career trajectory, inferred characteristics, and similarity to high-performing employees. Applicants cannot review, access, or correct these AI-generated reports before they are sent to employers, even though the resulting scores may influence crucial hiring decisions.
The plaintiffs argue that “there is no AI exemption” to the longstanding federal and state consumer protection laws governing the collection and use of, and dispute rights for, consumer reports, noting that “while the technology may be new,” it raises precisely the type of computer-driven decision-making concerns Congress sought to address when it enacted such laws almost 50 years ago.
S&K Take: We previously reported on an ACLU complaint alleging that the use of AI technology had a discriminatory disparate impact against qualified disabled or non-white applicants and employees. This new lawsuit reflects a different theory of liability: that the collection of personal data for use in AI‑generated hiring reports is itself unlawful.
Employers should ensure they fully understand how their recruiting technologies operate and avoid outsourcing employment decisions to AI systems that have not been sufficiently vetted or whose operation is not sufficiently transparent. This includes assessing whether AI vendors’ practices could trigger federal or state consumer-reporting obligations and confirming that required disclosures, authorizations, accuracy safeguards, and dispute-resolution mechanisms are in place. As AI-based hiring tools become more sophisticated and more widely adopted, employers should expect their use to face closer scrutiny.