On November 19, 2025, Techpoint Africa published a story about Okechukwu Nwaozor, detailing his impressive feat of building OkeyMeta. The product was initially described as a Large Language Model (LLM) built from scratch.
In the previous article (now updated), Nwaozor stated that he was on a mission to compete with ChatGPT, despite raising only ₦2.7 million in funding. He made it clear he understood how unbelievable his ambitions sounded, but said he would not let doubt or mockery stop him.
The story went viral and raised concern among knowledgeable individuals in the ecosystem, who pointed out that it is impossible to train an LLM from scratch while operating on a shoestring budget.
Further consultations with experts revealed that what Nwaozor built is a hybrid AI system based on open-source models, such as Google’s Gemma-2B.
Like many AI projects, OkeyMeta is built on existing infrastructure. According to Babatunde Fatai, Emerging Technologies & Innovation Strategist, “there’s a big difference between training a foundational LLM from scratch and building a hybrid AI system using open-source models.
“What Nwaozor accomplished is genuinely difficult: fine-tuning models, creating custom reasoning layers, building APIs that thousands of developers use, all at 17 with limited resources. That’s impressive engineering. But it’s not the same as training an LLM from scratch, which would require millions of dollars in compute alone.”
Fatai still credited what Nwaozor and his team achieved, saying that while it is important for readers to understand the technical reality of what the team has built, the clarification does not diminish their achievement.
Building a functional AI system at 17, assembling a team, preparing datasets, experimenting with training, prompt engineering, hosting APIs, and making the product publicly accessible are still commendable achievements.
However, it is equally important to distinguish between training an LLM from scratch, which requires enormous compute budgets, specialised engineering teams, and trillions of tokens of data, and building an AI product powered by open-source LLMs, which is what OkeyMeta represents.
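For readers who want a concrete sense of that difference, the sketch below is purely illustrative and is not OkeyMeta's actual code. It shows, under stated assumptions, what building on an open-source base model such as Google's Gemma-2B can look like: the expensive pretraining is already done by the model's creators, and the product layer is prompts, fine-tuned adapters, and serving code wrapped around it. The model identifier, prompt template, and helper function are assumptions for illustration; running it requires access to the gated google/gemma-2b checkpoint on Hugging Face, the transformers and accelerate libraries, and suitable hardware.

```python
# Illustrative sketch only: not OkeyMeta's code. It shows what "building on an
# open-source model" can mean in practice, assuming access to the gated
# google/gemma-2b checkpoint and the transformers + accelerate libraries.
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE_MODEL = "google/gemma-2b"  # open-source base; pretraining already paid for

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL, device_map="auto")

def answer(question: str) -> str:
    # A simple custom layer on top of the base model: a prompt template.
    # Real products typically go further, adding fine-tuned adapters,
    # retrieval, and hosted APIs around the same open-source core.
    prompt = f"You are a helpful assistant.\nQuestion: {question}\nAnswer:"
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=128)
    # Return only the newly generated tokens, not the prompt.
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True
    )

print(answer("What is the capital of Nigeria?"))
```

Fine-tuning adapters, custom reasoning layers, and public APIs of the kind Fatai describes are all built around a base model in this way, which is exactly what separates such engineering from pretraining a foundation model on trillions of tokens.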
The earlier wording created a misleading impression about the scale and originality of OkeyMeta. This correction is crucial to ensure that readers, researchers, and the broader AI community have an accurate understanding of OkeyMeta’s technical foundation.
Nwaozor’s story represents Nigeria’s growing curiosity, ambition, and early participation in the global AI landscape.