On January 13, 2026, a Lagos State High Court delivered a significant judgement in favour of human rights lawyer Femi Falana, SAN, against Meta Platforms Inc, the parent company of Facebook and Instagram, in a case centred on privacy invasion and misinformation.
The ruling, publicly shared on LinkedIn by the plaintiff’s lawyer and privacy expert Olumide Babalola, awarded the plaintiff $25,000 in compensation.
However, some experts have raised concerns over the impact of this judgement and the precedent it sets.
The case in brief
In early 2025, a video circulated on Facebook that reportedly falsely portrayed Falana as suffering from a serious medical condition. Though the content’s originator remains unnamed in the ruling summary, the High Court held Meta responsible, dismissing the company’s claims that it was merely a neutral hosting service for third-party content.
According to the court, Meta owed a duty of care to users whose personal data and reputation were affected by content on its platforms, and as such, it breached Section 24 of the Nigeria Data Protection Act, 2023. Per Babalola, the court stated that Meta could not hide behind the intermediary and hosting defence once it monetises content and controls distribution systems.
The court further held that false medical information about an individual, even a public figure, constitutes an invasion of privacy under Nigerian law. It also found Meta to be a joint data controller, not just a passive conduit of content, since the platform determines the means and purposes of processing user content and employs algorithmic distribution for commercial benefit.
This legal framing, which treats a big tech company not merely as a platform but as an active participant in content distribution, is unprecedented in the Nigerian context and sets an important judicial precedent.
Expert concerns over this precedent
Following the court’s ruling, some experts have raised concerns about its effects on average users and tech companies alike.
“My criticism is not about the court’s authority or its concern for harm suffered by the claimant. It is about the legal architecture used to reach the outcome, which, based on the judgement as reported, rests on doctrinal shortcuts that are neither coherent nor sustainable,” says Gbenga Odugbemi, a legal and privacy expert.
He went on to explain that the case concerns false statements, reputational injury, and alleged damages arising from a third-party publication on a social media platform. In legal taxonomy, such claims are traditionally and properly addressed through defamation or negligence, not privacy law.
“Privacy law is not a general-purpose substitute for tort law simply because it offers a more convenient pathway to liability. Judicial legitimacy arises from applying the correct legal framework to the facts before the court and applying it correctly,” Odugbemi says.
For Advocate Dirontsho Mohale, CEO & Founder, Baakedi Professional Practice, the concern lies in the inclusion of Meta as a joint data controller in this matter. Nigeria’s Data Protection Act, 37 of 2023, defines a data controller as an individual, private entity, public commission, agency, or any other body which alone or jointly with others determines the purpose and means of processing personal data.
“The view of the court is that Meta, along with the unnamed party that actually posted the video, decided to process this personal data by posting it on social media platforms, and that is interpreting the definition incorrectly,” Adv Mohale says.
What this means for Meta and the average user
Globally, Meta has faced scrutiny over its privacy practices and data handling. In Europe, regulators have imposed record fines, including a €1.2 billion penalty for unlawful data transfers and privacy infractions.
The tech company has also been entangled in ongoing disputes with data protection authorities, including challenges to privacy sanctions in Nigeria.
Historically, global tech companies like Meta have largely relied on intermediary immunity, but this ruling challenges that defence in situations where platforms control content distribution and profit from it. The ruling mandates that Meta deploy safeguards proportionate to the risk of its operations in Africa.
Platforms like Meta may now face increased expectations to develop better content review mechanisms, proactive misinformation safeguards, and faster takedown processes, not just reactive responses to complaints.
For Meta, this could translate into higher compliance costs, more sophisticated moderation systems, and potentially greater legal exposure in other jurisdictions following similar logic.
“If Meta does not appeal this case, tech firms will find themselves in court on a daily basis, worldwide. Remember that all legal systems encourage the development of local law by the consideration of foreign law and the application of international law. This case will be cited, as foreign law in, for example, South Africa, in a local lawsuit which will then create local jurisprudence,” Adv Mohale says.
This ruling reinforces that constitutional protections such as Nigeria’s right to privacy and the Nigeria Data Protection Act, 2023 are substantial and actionable even when infringements occur through multinational platforms.
However, experts have raised concerns regarding this approach.
“On the surface, the decision may appear protective. However, the secondary effects are more troubling. If platforms face liability merely for hosting or algorithmically distributing third-party content, they will rationally respond by over-moderation. Lawful speech, such as criticism, satire, investigative reporting, and political commentary, will be removed pre-emptively to avoid litigation risk,” Odugbemi says.
According to these experts, the end result will not be greater protection for users, but less speech, reduced access to information, and an expansion of private censorship driven by litigation anxiety rather than legal principle.
“Users’ creativity will be limited, and user-experience may be negatively impacted by the additional controls that the platform owners have now been ordered to implement,” Adv Mohale says.
Under this reasoning, a platform that monetises a page or uses algorithms to distribute content is considered a publisher or controller with a direct duty of care. This could lead to a flood of lawsuits against platforms for any user-generated content that causes foreseeable harm.
With the threat of vicarious liability for every post, platforms may become extremely aggressive in their content moderation to avoid multi-million dollar lawsuits. While Meta can afford a $25,000 award or larger regulatory penalties, smaller African tech startups in the same space may not.
“If every social app or forum is classified as a joint data controller responsible for its users’ posts, the legal and compliance costs could bankrupt local startups,” a privacy expert says.
While the complainant’s lawyer, Olumide Babalola, told Techpoint Africa that no record of the poster was found and that Meta did not provide the user’s information, experts remain unconvinced by that reasoning.
“Liability does not shift because a wrongdoer is hard to find. In law, difficulty of enforcement does not justify re-assigning liability to a more convenient defendant,” Odugbemi says.
According to him, “The court does not acquire a free hand to hold Meta substantively responsible simply because the author is unknown, or Meta controls identifying data. That would be a dangerous precedent. The law does not say, ‘If you cannot find the speaker, sue the microphone.’”
Recall that Meta has previously threatened to withdraw services like WhatsApp and Facebook from Nigeria if regulation became too burdensome. If the judicial environment is seen as hostile to the global platforms’ business model, big tech companies may limit their investments or exit the market entirely, which in Meta’s case would leave millions of Nigerian users cut off.
However, the big question is: can users now seek redress over false content using this precedent? According to privacy experts, the ruling expands options beyond mere content removal, though it applies only to certain kinds of cases.