The Real Story Behind JAMB Failing Igbo Candidates - JAMB 2025 UTME Technical Review Report


On Wednesday, the 14th of May 2025, a high-level technical review session was convened at the Joint Admissions and Matriculation Board (JAMB) headquarters in Abuja. The meeting, presided over by the Registrar, Professor Ishaq Olanrewaju Oloyede, was initiated in direct response to the mass outcry that followed the release of the 2025 Unified Tertiary Matriculation Examination (UTME) results the previous Friday. The objective of the gathering was to unravel the root causes of the unexpectedly poor candidate performance and to establish clear mitigative measures to restore confidence in the integrity of the UTME assessment process.

The meeting began promptly at 10:00 a.m. and was attended by a distinguished panel of stakeholders. Present were heads of key directorates within JAMB and lead systems analysts. Also in attendance were delegates from the CBT Centre Regulatory Committee, representatives from the Educare Technical Team, and lead engineers from the consortium of software vendors responsible for the examination engine infrastructure.

Discussions commenced with a comprehensive analysis of the existing system architecture. The panel reviewed the software stack powering the CBT engine, paying particular attention to how examination content was delivered to candidates. An important distinction was made between server-streamed and locally cached delivery methods.

The discussion further investigated the existence and efficacy of randomization mechanisms for both questions and answer options. This scrutiny was aimed at determining whether shuffling protocols were uniformly enforced, how answer permutations were managed, and if consistency of correct answer mapping was maintained across candidates following randomization.
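JAMB's actual examination engine is proprietary, so its shuffling protocol is not public. As an illustration of the property the panel was probing, the sketch below (with hypothetical field names) shows how a per-candidate permutation of questions and options can be generated deterministically while recording where each correct option lands, so that the correct-answer mapping remains consistent after randomization:

```python
import random

def shuffle_for_candidate(questions, seed):
    """Build one candidate's shuffled paper, preserving the answer key.

    A deterministic per-candidate seed means the exact paper can be
    reconstructed later for auditing.
    """
    rng = random.Random(seed)
    shuffled = []
    for q in rng.sample(questions, len(questions)):  # shuffle question order
        # permutation of option indices for this candidate
        order = rng.sample(range(len(q["options"])), len(q["options"]))
        shuffled.append({
            "id": q["id"],
            "text": q["text"],
            "options": [q["options"][i] for i in order],
            # new index of the originally-correct option must be stored
            "answer_key": order.index(q["correct"]),
        })
    return shuffled

# Hypothetical two-question bank for illustration only
bank = [
    {"id": 1, "text": "2 + 2 = ?", "options": ["3", "4", "5", "6"], "correct": 1},
    {"id": 2, "text": "Capital of Nigeria?", "options": ["Lagos", "Abuja", "Kano", "Enugu"], "correct": 1},
]
paper = shuffle_for_candidate(bank, seed=12345)
```

The invariant under test is that, whatever permutation a candidate receives, `options[answer_key]` still points at the same underlying correct option.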

Attention then shifted to the scoring and marking logic. This line of inquiry included assessing the capability of the system to reconstruct question-and-answer mappings and evaluate response accuracy with complete transparency. It was critical to determine if each candidate’s raw responses were stored, retrievable, and auditable.
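In the same spirit, the auditability requirement described above can be sketched in a few lines: if each candidate's raw option picks are stored alongside the shuffled answer key, the score can be re-derived at any time. The structure below is a hypothetical simplification, not JAMB's actual marking logic:

```python
def score_candidate(paper, raw_responses):
    """Score a candidate and return (total, audit_trail).

    `paper` is the candidate's shuffled paper: each entry carries the
    question id and the post-shuffle index of the correct option.
    `raw_responses` maps question id -> the option index the candidate picked.
    """
    audit, total = [], 0
    for q in paper:
        picked = raw_responses.get(q["id"])  # None if unanswered
        is_correct = (picked == q["answer_key"])
        total += is_correct
        # store everything needed to re-derive this score later
        audit.append({"question_id": q["id"], "picked": picked,
                      "answer_key": q["answer_key"], "correct": is_correct})
    return total, audit

# Hypothetical two-question paper: one correct pick, one wrong
paper = [{"id": 1, "answer_key": 2}, {"id": 2, "answer_key": 0}]
responses = {1: 2, 2: 3}
total, audit = score_candidate(paper, responses)
```

Because the audit trail carries both the pick and the key per question, any third party holding the same records can reproduce the total independently.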

The team also rigorously examined JAMB’s quality assurance and testing frameworks. This involved reviewing the load testing procedures previously employed, verifying the consistency of deployment builds across centres, and probing whether any last-minute hotfixes or patches were applied during or after the examination period. System logs, particularly incident and error records, were meticulously analyzed to identify any anomalies that could suggest failures in content delivery, timing accuracy, or candidate response capture.
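The build-consistency check mentioned above is exactly the kind of control that would have caught the cluster mismatch described later in this report. A minimal sketch, assuming fingerprints are collected per centre (the centre codes and hashes below are illustrative):

```python
import hashlib
from collections import Counter

def build_fingerprint(file_paths):
    """SHA-256 over the deployed engine files, so each centre's build
    reduces to one comparable string."""
    digest = hashlib.sha256()
    for path in sorted(file_paths):  # stable order -> stable hash
        with open(path, "rb") as f:
            digest.update(f.read())
    return digest.hexdigest()

def find_drift(centre_fingerprints):
    """Flag centres whose fingerprint differs from the majority build."""
    reference, _ = Counter(centre_fingerprints.values()).most_common(1)[0]
    return sorted(c for c, fp in centre_fingerprints.items() if fp != reference)

# Hypothetical fingerprints: three patched KAD centres, one unpatched LAG centre
drift = find_drift({"KAD-01": "9f2a", "KAD-02": "9f2a",
                    "KAD-03": "9f2a", "LAG-07": "77c1"})
```

Run before any session opens, such a comparison turns a silent partial deployment into an immediate, actionable alert.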

Another focal point of the review was the potential for human factor influence. The panel sought clarity on whether any manual post-processing occurred during score collation or validation. The conversation concluded with an assessment of JAMB’s response readiness to Freedom of Information requests and its willingness to publish anonymized candidate-level result data as a measure to enhance public trust.

One of the most critical discoveries made during this session revolved around three major systemic changes introduced in the 2025 UTME. The first was a shift from the traditional count-based analysis to a more robust source-based analysis of results. In previous years, JAMB evaluated the integrity of examination sessions primarily by counting the number of responses submitted per session. If the majority of candidates in a session of 250 submitted a near-complete set of answers, the session was deemed valid. Any significant deviation led to disqualification of that centre’s results. However, in 2025, a more advanced model was adopted—one that focused on the actual source and logic of the answers provided, rather than just their quantity.
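The old count-based model can be captured in a few lines, which also makes its blind spot obvious: it measures only the quantity of answers submitted, not whether those answers were validated against the right key. The paper length and thresholds below are illustrative, not JAMB's actual parameters:

```python
def count_based_valid(answer_counts, paper_length=180, threshold=0.9):
    """Old model: a session passes if most candidates submitted a
    near-complete set of answers -- a pure quantity check that says
    nothing about where those answers came from or how they were marked."""
    near_complete = sum(1 for n in answer_counts if n >= threshold * paper_length)
    return near_complete >= 0.5 * len(answer_counts)

# A 250-seat session where nearly everyone finished passes the old check...
busy_session = [175] * 240 + [20] * 10
# ...while a session of mostly abandoned papers fails it
sparse_session = [100] * 250
```

A source-based model, by contrast, would trace each submitted option back to the candidate's own shuffled paper before accepting it, which is why it is sensitive to the marking-logic mismatch described below.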

The second change involved full-scale shuffling of both questions and answer options. This ensured that even two candidates sitting in the same session would not receive identical permutations, thereby enhancing test security. The third change was a series of systemic improvements aimed at optimizing performance and reducing lag during exam sessions. Together, these changes produced the highest UTME scores recorded in 15 years, and would have stood as a remarkable achievement for JAMB.

While these improvements were technologically sound in theory, a major operational flaw was uncovered during the implementation phase. The system patch necessary to support both shuffling and source-based validation had been fully deployed on the server cluster supporting the KAD (Kaduna) zone, but it was not applied to the LAG (Lagos) cluster, which services centres in Lagos and the South-East. This omission persisted across all sessions until the 17th session, after which the error was discovered and corrected.

As a result, approximately 92 centres in the South-East and 65 centres in Lagos—totalling 157 centres—operated using outdated server logic that could not appropriately handle the new answer submission/marking structure. This affected an estimated 379,997 candidates, whose results were severely impacted due to system mismatches during answer validation.

To verify the scale and accuracy of this issue, JAMB collaborated with the Educare Technical Team, which had gathered response data directly from over 18,000 candidates. After deduplication and filtering, about 15,000 authentic records were analyzed. Of these, more than 14,000 originated from the regions serviced by the unpatched LAG servers, confirming the technical review's findings. Comparative analyses between JAMB’s internal audits and third-party system evaluations revealed significant overlap, reinforcing the conclusion that the affected centres were indeed operating under impaired conditions.
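The deduplication step described above (roughly 18,000 gathered records reduced to about 15,000 authentic ones) typically keeps one record per candidate. The Educare dataset's actual schema is not public; the sketch below uses hypothetical field names and keeps the latest submission per registration number:

```python
def deduplicate(records):
    """Keep the most recent submission per registration number
    (hypothetical field names for illustration)."""
    latest = {}
    for rec in records:
        reg = rec["reg_number"]
        if reg not in latest or rec["submitted_at"] > latest[reg]["submitted_at"]:
            latest[reg] = rec
    return list(latest.values())

# Candidate A1 submitted twice; only the later record survives
records = [
    {"reg_number": "A1", "submitted_at": 1, "score": 180},
    {"reg_number": "A1", "submitted_at": 2, "score": 180},
    {"reg_number": "B2", "submitted_at": 1, "score": 210},
]
clean = deduplicate(records)
```

Filtering duplicates before analysis matters here: double-counted submissions from the same centre would otherwise inflate the apparent concentration of affected candidates.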

In response to these findings, Professor Oloyede convened a press briefing at 3:00 p.m. on the same day. He formally acknowledged the oversight and issued a sincere apology to candidates and their families. He announced that all affected candidates would be given the opportunity to retake the examination at no additional cost. Furthermore, recognizing the potential scheduling conflict with ongoing SSCE examinations, JAMB had reached an agreement with the West African Examinations Council (WAEC) to ensure seamless coordination of timetables.

Affected candidates were advised to reprint their examination slips by 17th May 2025 to confirm their revised test schedules. JAMB also released an official communiqué titled "Man Proposes, God Disposes," which included a profound, emotional section, "Appeal, Appreciation, and Apology," reiterating the Board's commitment to fairness, transparency, and continuous improvement.

This review, conducted with thoroughness and transparency, signifies JAMB's resolve to uphold the sanctity of its examination processes. Going forward, stronger deployment validation protocols and real-time monitoring mechanisms will be implemented to prevent such oversights.

In summary, JAMB opened its systems to independent review to restore public confidence and ensure the reliability of the UTME for all stakeholders. We hereby report that this incident was neither a system failure nor administrative manipulation, but an outright human error.

~

As submitted by
Engr James Nnanyelugo
Educare Tech Team
