Nigeria’s human rights body steps into AI regulation

Nigeria’s human rights commission has flagged the risks AI poses to Nigerians, including algorithmic bias, privacy invasion, and deepening inequality.

In a move to safeguard citizens’ rights in the digital age, the National Human Rights Commission (NHRC) of Nigeria has announced plans to engage with technology companies to prevent potential harms arising from the use of artificial intelligence (AI). This initiative underscores the Commission’s commitment to ensuring that technological advancements do not infringe upon human rights.

At a recent webinar hosted by the International Network for Corporate Social Responsibility (IN-CSR) in collaboration with the NHRC and the National Information Technology Development Agency (NITDA), NHRC’s Executive Secretary, Tony Ojukwu SAN, highlighted the dual nature of AI.

According to Nairametrics, Ojukwu cautioned against risks such as algorithmic bias, privacy invasion, and the deepening of existing inequalities.

“AI, if not governed with robust ethical frameworks and with human dignity at the core, can breed inequalities, result in algorithmic bias, invade privacy, and ultimately infringe on human rights,” Ojukwu remarked.

He emphasised that the NHRC views this juncture not as a threat but as an opportunity to expand its mandate into the digital realm, ensuring that technological progress aligns with principles of dignity, equality, and justice.

Strategic engagement with tech companies

Central to the NHRC’s strategy is proactive engagement with tech companies to ensure that AI algorithms are transparent and accountable.

The Commission plans to mandate human rights due diligence in digital innovation, which includes rigorous assessments to identify potential harm before technology is deployed. This approach aims to prevent discrimination and ensure that AI systems operate within ethical boundaries.

Ojukwu further stressed that, despite AI’s increasing sophistication, human oversight remains central to its deployment.

He said that the NHRC is uniquely positioned to act as a bridge between government entities, private sector innovators, academic researchers, and civil society organisations in the nation’s quest for AI governance.

The NHRC’s initiative is part of a broader collaborative effort to establish ethical guidelines for AI governance in Nigeria. The Commission is working alongside organisations like IN-CSR and NITDA to develop comprehensive norms and guidelines that will govern the ethical use of AI.

These collaborations aim to align AI governance with international human rights standards, such as the Universal Declaration of Human Rights and Nigeria’s National Action Plan on Business and Human Rights.

Meanwhile, Bosun Tijani, Nigeria’s Minister of Communications, Innovation, and Digital Economy, enlisted 120 experts in April 2024 to develop a framework for AI adoption.

Similarly, Kashifu Inuwa Abdullahi, Director General of NITDA, emphasised the agency’s commitment to developing diverse, high-quality local datasets to train AI models. This initiative seeks to create inclusive and equitable AI systems free from biases inherited from foreign datasets.

The path forward

The NHRC’s proactive stance on AI regulation reflects a growing recognition of the need for ethical oversight in technological advancements.

By engaging with tech companies and collaborating with key stakeholders, the Commission wants to ensure that AI serves as a tool for inclusive growth, sustainable development, and the enhancement of human dignity.

This initiative sets a precedent for other nations grappling with the ethical implications of AI, highlighting the importance of aligning technological progress with fundamental human rights.

As Nigeria navigates the complexities of AI integration and adoption, the NHRC’s efforts underscore the critical role of regulatory bodies in shaping a future where technology and human rights coexist harmoniously.
