Ethical Innovations: Embracing Ethics in Technology


Social Media on Trial: Did Platforms Poison Kids?

A landmark trial opened in Los Angeles County Superior Court examining claims that major social media companies deliberately designed features to addict children and harm young users’ mental health. The remaining defendants are Meta, owner of Instagram, and Google, owner of YouTube; TikTok and Snap settled with the plaintiff and are no longer defendants.

The plaintiff, identified in filings as K.G.M. or Kaley G.M. and described as 19 or 20 years old in different summaries, is a bellwether plaintiff whose case is intended to test legal arguments that may apply to hundreds or thousands of similar lawsuits. Plaintiffs’ lawyers, led by Mark Lanier, told jurors they will present internal company documents, studies and emails that they say show deliberate efforts to increase time spent on the platforms, to target young users, and to profit from that engagement. Evidence shown or described in court included a 2015 Meta internal email urging a 12% increase in time spent on the company’s platforms; an internal Meta study, described as “Project Myst,” that surveyed about 1,000 teens and parents and reportedly found that children experiencing trauma or stress were especially vulnerable to addictive use and that parental controls had limited effect; and internal messages and documents that plaintiffs describe as comparing the products to a casino or calling Instagram “like a drug.”

Plaintiffs contend those design choices — including social‑validation features such as like buttons, autoplay and infinite scroll — were engineered to maximize youth engagement and advertising revenue, and that the resulting use contributed to depression, suicidal thoughts and other mental health harms among minors. Plaintiffs displayed internal research and said testimony from family members of children who died will be part of the evidence they present. The trial is expected to include testimony from experts, family members, former employees who have become whistleblowers, and company executives; depositions or live testimony from Mark Zuckerberg, Adam Mosseri and Neal Mohan were described as possible.

Meta and Google deny the allegations. Defense attorneys argued the plaintiff’s mental health struggles stem from other factors, including family turmoil, alleged neglect and abuse, and long-term therapy beginning in early childhood. Meta’s counsel, identified in one summary as Paul Schmidt, told jurors that several of the plaintiff’s mental‑health providers did not diagnose social media addiction or treat it as the primary cause, and asked jurors to consider whether Instagram was a “substantial factor” in the plaintiff’s psychological distress. Company spokespeople said they disagreed with the claims and pointed to safety measures implemented for young users, while asserting limits on liability for third‑party content.

Judge Carolyn B. Kuhl instructed jurors not to change how they use social media during the trial and to evaluate each defendant’s liability separately. The Los Angeles proceedings are expected to run roughly six to eight weeks, depending on the account, and the outcome could set a benchmark for damages and influence numerous similar lawsuits nationwide. Related legal actions are proceeding in other jurisdictions, including a separate trial in New Mexico and a federal bellwether for school districts in Oakland, California, while more than 40 state attorneys general and multiple lawsuits in other countries pursue claims or regulatory changes addressing youth access to social media.

Separately, groups of state attorneys general have petitioned courts seeking far‑reaching changes to how Meta handles accounts and data for users under 13. Those petitions, described in varying detail, seek measures such as deletion of identified under‑13 accounts, deletion of data and algorithms built from under‑13 user information, time limits for young users during school and night hours, and removal or disabling of features alleged to promote overuse such as infinite scroll, autoplay and certain appearance‑altering filters. State attorneys general described recent Meta changes for teen accounts as minimal; Meta has introduced some teen‑focused features, including content filters for users under 16, per statements in court and filings.

About one hundred people attended the opening proceedings in Los Angeles, including parents who say their children died because of platform design choices. The trial is one of a broader wave of litigation and regulatory efforts seeking accountability and potential changes to platform design and youth protections.

Original Sources: 1, 2, 3, 4, 5, 6, 7, 8

Real Value Analysis

Summary judgment: The article is primarily a news report of a trial alleging that major social media platforms engineered addictive features that harmed a teen plaintiff. It provides useful context about who’s involved, legal claims, and potential consequences, but it contains almost no practical, actionable guidance a typical reader can use immediately.

Actionable information: There are no clear steps, choices, tools, or instructions a reader can apply right away. The piece does not offer concrete advice for parents, teens, or others about how to change device settings, document harms, seek help, or pursue legal remedies. References to internal documents, company practices, and possible policy changes are descriptive, not procedural. If you hoped the article would tell you how to protect a child online, how to collect evidence for a complaint, or how to limit exposure to problematic features, it does not.

Educational depth: The article explains the dispute at a surface level: plaintiffs say platforms intentionally increased time-on-site and targeted young users for revenue, while defenses argue other factors caused the plaintiff’s distress. It reports specific items such as an internal 2015 request for a 12% rise in time spent, and lists prospective witnesses. But it does not analyze how platform features like infinite scroll, autoplay, or recommendation algorithms work, nor does it explain mechanisms by which those elements might affect mental health. It lacks deeper context about research linking social media use to adolescent well‑being, statistical methods, or why internal marketing metrics translate (or may not translate) into user harm. Numbers appear only as isolated facts and are not explained in terms of significance, methodology, or likelihood.

Personal relevance: The article is relevant if you are a parent of minors, an educator, a policymaker, a lawyer following similar litigation, or someone concerned about platform design affecting youth. For casual readers it is informational but not personally actionable. It does not tell parents what to do now to reduce risks, nor does it explain what legal outcomes might mean for everyday users. The relevance is moderate-to-high for stakeholders but low for people seeking immediate practical steps.

Public service function: The piece performs a public-interest function by reporting on a major trial with potential policy and regulatory implications. It informs readers that large companies face scrutiny and that plaintiffs aim to change platform rules. However, it fails to offer safety guidance, warnings, or emergency information. It recounts courtroom claims and expected testimony without giving readers tools to act on concerns about children’s safety online.

Practical advice: The article contains no usable tips. It mentions potential remedies sought by state attorneys general—time limits for young users, removal of infinite scroll and autoplay, deletion of under‑13 data—but does not explain how individuals could implement similar protections on devices or accounts today. Any reader looking for step‑by‑step instructions on limiting screen time, adjusting privacy settings, or documenting harms for a complaint will find nothing concrete.

Long-term impact: The trial could have long-term consequences for platform design, regulation, and litigation, which the article notes. But it does not help an individual plan ahead beyond reporting that change is possible. There is no guidance on how to prepare for potential regulatory outcomes, or how communities and families might adapt if platform features change.

Emotional and psychological impact: The article could provoke concern or alarm, especially by noting family tragedies and testimony from grieving parents, but it does not offer calming context, coping strategies, or support resources. It risks creating fear without giving readers ways to respond constructively.

Clickbait or sensationalism: The article is straightforward in tone and mainly reports facts of the trial. It references emotional testimony and high-profile executives as witnesses, which naturally draws attention, but it does not appear to rely on exaggerated claims beyond the legal allegations being covered. It does emphasize dramatic elements (deaths of children, whistleblower testimony) without providing follow-up resources.

Missed opportunities: The piece misses several chances to inform readers about concrete steps they could take now: how to change account age or privacy settings, how to set device-level time limits, how to identify addictive design patterns, how to seek counseling or document behavioral changes in a child, and how to follow or participate in policy processes. It also fails to connect courtroom claims to practical signs that a child’s social-media use is harmful, or to summarize independent research on the topic so readers can assess claims.

Practical, realistic guidance the article omitted

If you are worried about a child’s social-media use, start by observing specific changes: note sleep loss, academic decline, withdrawal from in-person activities, sudden mood shifts after using apps, or secretive behavior with devices. Keep a simple written log for a few weeks noting when the child uses social media, for how long, what times of day (especially near bedtime), and any immediate mood or behavior changes afterward. Use your phone’s built-in settings to enforce limits: enable screen‑time or digital‑wellbeing controls to set daily app timers and bed‑time locking. For younger children, create accounts under parental controls where possible, and remove saved payment methods to prevent easy re‑enabling of subscriptions or purchases.

Adjust the device and app environment to reduce addictive patterns: turn off autoplay and in‑app notifications, disable push notifications for nonessential apps, and limit feeds to known accounts rather than algorithmic recommendations when apps offer that choice. Move phones and tablets out of bedrooms at night or establish a household charging station to discourage late‑night use. Model good habits by limiting your own screen time and doing shared, device‑free activities.

If you suspect harm or developing mental-health issues, have a calm, nonjudgmental conversation with the child and seek professional help if you see sustained changes in mood, behavior, eating or sleep. Keep records of concerning incidents, communications, and any unusual online interactions in case you later need them for clinicians, school officials, or legal steps. If immediate danger exists (self‑harm, suicide risk), contact emergency services or a crisis line right away.

To evaluate platform safety claims and policy debates, compare multiple independent sources rather than relying on a single article. Look for peer‑reviewed research, official guidance from pediatric or mental‑health organizations, and statements from consumer‑protection agencies. When reading reports of internal company documents or lawsuits, distinguish between allegations, evidence presented in court, and legal rulings; a lawsuit’s claims do not equal proven facts until decided by a court.

If you want to take civic action, contact your state or local representatives to express concerns, participate in public comment periods for regulations when available, or support advocacy groups focused on children’s online safety. Documented, local efforts and voting on policy measures can influence how platforms are regulated.

These steps are practical, general, and do not depend on new facts from the trial. They help you assess and reduce immediate risks, support a child’s health, and participate constructively in broader debates about platform design and youth safety.

Bias analysis

"engineered addictive features that harmed young users’ mental health." This phrase makes a strong claim as fact. It helps the plaintiff by saying platforms caused harm. The words push a harmful motive without showing proof in the sentence. It frames the companies as deliberate wrongdoers.

"alleging that platform design intentionally addicted children and caused psychological harm." The word "intentionally" assigns purpose to the companies. It favors the plaintiff by turning design choices into deliberate acts. The sentence presents intent as part of the allegation, which can make readers accept intent as fact.

"internal documents and emails will show deliberate efforts to increase time spent on the platforms" The phrase "will show deliberate efforts" asserts certainty about what the documents prove. It helps the plaintiff’s case by presuming the evidence outcome. It turns a future courtroom showing into a present claim.

"in which Meta leadership urged a 12% increase in time spent on the company’s platforms." Saying leadership "urged a 12% increase" uses a precise number to sound factual and damning. It makes the motive look explicit and measurable. This selection of a striking figure can steer readers toward believing calculated intent.

"Defense attorneys for Meta and YouTube countered that the plaintiff’s mental health struggles stemmed from other factors in her life" This phrasing groups several personal issues as the defense's full explanation. It frames the defense as blaming the plaintiff’s family and past. The order shifts focus from platform actions to private life.

"including family turmoil, neglect, abuse, and long-term therapy beginning in early childhood." Listing severe personal problems in one phrase emphasizes the defense’s claim these explain harms. The choice of intense words like "neglect" and "abuse" can lead readers away from considering platform influence. It supports the defense by showing alternate causes.

"Meta’s lawyer asked jurors to consider whether Instagram was a substantial factor in K.G.M.’s psychological distress." The words "consider" and "whether" soften the claim and frame causation as uncertain. They present a cautious legal posture that favors doubt about the platforms’ role. This phrasing reduces the sense of direct blame.

"The trial is expected to run about six weeks and will include testimony from experts, family members of children who died, company executives including Mark Zuckerberg, Adam Mosseri and Neal Mohan, and former Meta employees who became whistleblowers." Listing "family members of children who died" is emotionally powerful and steers sympathy toward plaintiffs. Naming famous executives and "whistleblowers" highlights drama and credibility for the plaintiff side. The selection of testimony types emphasizes harms and internal problems.

"Jury consideration of internal documents and company practices may set a benchmark for damages and influence numerous similar lawsuits nationwide." The phrase "may set a benchmark" suggests large systemic consequences. It helps readers see the case as precedent-setting and impactful. This frames the trial as more significant beyond the individual plaintiff.

"Separate, related legal actions by state attorneys general seek far-reaching changes to Meta’s handling of accounts and data for users under 13" Calling the changes "far‑reaching" colors the state actions as broad and important. It frames regulators as taking strong measures and supports the sense of widespread concern. The wording favors the view that change is needed.

"request restrictions such as time limits for young users, removal of certain product features like infinite scroll and autoplay, and deletion of data and algorithms built from under‑13 user information." Using specific product features like "infinite scroll" and "autoplay" names targets and makes the requested restrictions tangible. It focuses blame on design elements and helps readers picture concrete fixes. The choice of features aligns with the plaintiff/regulator critique.

"Those state petitions called recent Meta changes for teen accounts minimal and insufficient." The words "minimal and insufficient" repeat the petitions' judgment without counterbalance. It sides with the state petitions’ criticism and shows their view as valid. No Meta response is given here to balance that strong wording.

"About one hundred people attended Monday’s proceedings in Los Angeles, including parents who say their children died because of platform design choices." Including "parents who say their children died because of platform design choices" uses the parents’ claims to add emotional weight. The phrase ties deaths directly to design in the parents’ view, which pushes a causal link. The text does not give any alternative explanation for those deaths.

"Snapchat-parent Snap and TikTok settled with K.G.M. and are no longer defendants in the case." The word "settled" can imply admission or avoidance, but the sentence does not explain terms. It may lead readers to infer guilt or liability without stating facts. The lack of detail hides why they left the case.

Emotion Resonance Analysis

The passage conveys several clear and layered emotions through its description of the trial, the parties’ statements, and the courtroom setting. One prominent emotion is anger or moral outrage, evident in phrases like “engineered addictive features,” “intentionally addicted children,” and parents who “say their children died because of platform design choices.” This anger is strong; it frames the companies’ conduct as deliberate and harmful, serving to condemn the platforms and push readers toward seeing the tech firms as culpable. Another strong emotion is grief and sorrow, especially where the text mentions “family members of children who died,” and parents attending the proceedings; this sorrow creates sympathy for plaintiffs and their families and highlights the human cost behind legal claims. Fear and concern appear in references to “psychological harm,” “mental health struggles,” and requests for limits such as time caps and removal of infinite scroll; these words produce worry about the safety of young users and imply urgency for protective change. A milder but significant emotion is skepticism or doubt, found in the defense’s counterclaims that the plaintiff’s issues stemmed from “other factors in her life,” including “family turmoil, neglect, abuse,” and long-term therapy; this injects uncertainty, prompting readers to question a simple cause-and-effect verdict. There is also a sense of determination or resolve on both sides: the plaintiff’s attorney promises “internal documents and emails” will show deliberate action, while defense lawyers ask jurors to weigh causation—both choices signal firmness and commitment to winning the court’s judgment. Finally, a feeling of anticipation or suspense is present in lines about the trial running “about six weeks,” likely testimony by major executives, and the possibility of setting a “benchmark for damages”; this forward-looking tone builds interest and signals potential broad consequences.

These emotions guide the reader’s reactions by shaping where sympathy, blame, and concern should fall. Anger and sorrow push readers to empathize with plaintiffs and view the platforms critically, steering moral judgment against the companies. Fear and concern over youth safety increase the perceived need for action and regulatory change, encouraging readers to support reforms like time limits or feature removals. Skepticism injected by the defense invites caution and balanced judgment, reminding readers that causation and context matter. Determination and anticipation frame the trial as consequential, making readers more likely to follow updates and regard the outcome as impactful beyond this single case.

The writer uses several persuasive emotional techniques. The text selects charged verbs and phrases—“engineered,” “intentionally addicted,” “caused psychological harm,” “parents who say their children died”—which sound more accusatory and urgent than neutral wording would. Personal detail, such as naming the plaintiff (K.G.M. or Kaley G.M.), naming attorneys and executives, and mentioning family members of deceased children, personalizes the story and increases emotional weight. Repetition of themes around addiction, harm to young users, and company responsibility reinforces the message and makes the claims feel more certain and pervasive. Contrasting language is used to heighten conflict: plaintiffs’ claims of deliberate design are set against defense assertions of preexisting family issues, producing a moral and factual clash that draws the reader into choosing sides. The inclusion of specific evidence references—an internal 2015 email urging a “12% increase in time spent”—adds a concrete detail that makes accusations feel credible and sharpens emotional response. Mentioning parallel state attorney general actions and potential nationwide influence magnifies the stakes, making individual harm seem system-wide. Together, these choices move readers’ attention toward sympathy for plaintiffs, concern about youth safety, and interest in regulatory outcomes, while also prompting a measure of skepticism by presenting the defense’s counterpoints.
