Source: WBAY

Radio Host Seeks Restraining Order in Digital Impersonation Case

GREEN BAY, WI - April 10th, 2026 - The case of WGBY Radio host Mark Johnson, who filed restraining orders against Sarah Miller and David Lee this week, is a stark reminder of the escalating problem of digital impersonation and its potential for real-world harm. While the specifics of this case involve a local radio personality, the underlying issues resonate across all sectors - from individuals and small businesses to major corporations and public figures.

Johnson alleges that Miller and Lee created and maintained a fraudulent Facebook account in his name, disseminating "misleading and damaging information." This isn't simply a case of someone creating a fan page; it's an active campaign of impersonation intended, according to Johnson's legal team, to inflict reputational damage and emotional distress. The incident underscores a significant gap in current legal frameworks and social media platform policies concerning the protection of individuals from malicious digital mimicry.

The Rise of Deepfakes and Sophisticated Impersonation

While relatively simple in its execution - creating a fake Facebook account - Johnson's case foreshadows increasingly complex forms of digital impersonation. In 2026, the proliferation of readily available AI tools has made it easier than ever to create convincing "deepfakes" - manipulated videos and audio recordings that can realistically depict someone saying or doing things they never did. The potential for misuse is enormous, ranging from political disinformation and financial fraud to personal attacks and character assassination. Just last month, Senator Evelyn Reed was targeted by a sophisticated deepfake audio clip circulating online, falsely implying her endorsement of a controversial policy. The clip caused significant disruption before being debunked, and highlighted the difficulty in rapidly identifying and mitigating such threats.

Legal Recourse and Platform Responsibility

Currently, legal recourse for victims of digital impersonation is often fragmented and challenging. While many jurisdictions have laws addressing identity theft, these laws typically focus on financial gain rather than reputational harm. Civil defamation laws can be applied, but proving malice and damages can be difficult and costly. The restraining orders sought by Johnson represent an attempt to establish a more immediate form of protection, preventing further dissemination of harmful content and direct contact from the alleged perpetrators.

However, a crucial aspect of this issue lies with the social media platforms themselves. While platforms like Facebook (now MetaCorp) and X (formerly Twitter) have policies against impersonation, enforcement remains inconsistent. Algorithmic detection methods are constantly playing catch-up with increasingly sophisticated impersonation techniques. Many argue that platforms should be held to a higher standard of responsibility for proactively identifying and removing fake accounts and malicious content. There's ongoing debate about whether Section 230 of the Communications Decency Act, which currently shields platforms from liability for user-generated content, needs to be reformed to address the unique challenges posed by digital impersonation.

Impact on Trust and Public Discourse

The widespread availability of impersonation tools erodes public trust in online information. When it becomes increasingly difficult to distinguish between genuine and fabricated content, it fuels skepticism and cynicism. This has profound implications for democratic processes, as voters become more susceptible to disinformation campaigns. Businesses also suffer, as fake accounts can damage brand reputation, spread false information about products and services, and even defraud customers. The financial costs of dealing with digital impersonation are estimated to be in the billions annually.

What Can Individuals Do?

While waiting for legal and platform-level solutions to evolve, individuals can take steps to protect themselves. This includes:

  • Monitoring Your Online Presence: Regularly search for your name and likeness online to identify potential impersonation attempts.
  • Strengthening Social Media Security: Enable two-factor authentication and use strong, unique passwords.
  • Reporting Impersonation: Immediately report fake accounts to the relevant social media platform.
  • Educating Others: Raise awareness about the dangers of digital impersonation and how to identify fake accounts.
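As a rough illustration of the first step - monitoring your online presence - the Python sketch below enumerates lookalike handles an impersonator might register, so you know what to search for. The substitution patterns and the platform URL formats are illustrative assumptions, not a comprehensive monitoring tool; real platforms vary, and some require authenticated lookups.

```python
def handle_variants(name):
    """Generate common lookalike handles for a display name like "Mark Johnson".

    The substitution list here is an illustrative assumption, not exhaustive.
    """
    base = name.lower().replace(" ", "")
    variants = {base}
    variants.add(name.lower().replace(" ", "."))   # mark.johnson
    variants.add(name.lower().replace(" ", "_"))   # mark_johnson
    # Appended digits are a common trick ("markjohnson1", "markjohnson2026")
    variants.update(base + suffix for suffix in ("1", "01", "2026"))
    # Simple character swaps that read the same at a glance
    variants.add(base.replace("o", "0"))
    variants.add(base.replace("l", "1"))
    return sorted(variants)


def profile_urls(name, platforms=("facebook.com", "x.com", "instagram.com")):
    """Build candidate profile URLs to check by hand or with an HTTP client.

    The URL patterns are hypothetical placeholders for illustration only.
    """
    return [f"https://{site}/{handle}"
            for site in platforms
            for handle in handle_variants(name)]


# Example: list candidate handles worth searching for
for handle in handle_variants("Mark Johnson"):
    print(handle)
```

Searching each variant periodically (or feeding the candidate URLs to a script that checks which ones exist) gives an early warning before a fake account gains traction.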

Johnson's case serves as a wake-up call. The battle against digital impersonation is not just a legal issue; it's a societal challenge that requires a multi-faceted approach involving legislation, platform accountability, and increased public awareness. The upcoming court hearing will be closely watched, as its outcome could set a precedent for how similar cases are handled in the future.


Read the Full WBAY Article at:
https://www.wbay.com/2026/04/09/radio-host-files-restraining-orders-against-two-over-alleged-fake-social-media-account/