In a move that signals a growing rift between India’s elite technical institutions and global ranking bodies, BITS Pilani has withdrawn from the Times Higher Education (THE) World University Rankings for 2026.
Speaking on the decision, Prof V Ramgopal Rao, group vice-chancellor of BITS Pilani, characterised the ranking methodology as a “black box” that had failed to align with the verifiable academic and research capacity of Indian institutions.
The decision marks a significant milestone in a journey that began in 2020, when Prof Rao led a high-profile boycott of the THE Rankings alongside several IITs. While those initial concerns centred on the opacity of reputation surveys, Prof Rao said the problem had since become structural, affecting the very core of how research excellence is measured.
Research metrics
At the heart of the withdrawal was THE’s ‘Research Quality’ pillar, which accounts for 30% of an institution’s overall score. This pillar, along with its sub-indicators – citation impact, research strength, excellence, and influence – is derived from citation behaviour.
“When such outcomes are difficult to reconcile with accepted academic realities, it raises legitimate questions about how research scores are being constructed,” Prof Rao said.
He pointed to a glaring anomaly: the Indian Institute of Science (IISc), widely regarded as India’s premier research institute, appears lower on global research quality indicators than much younger Indian institutions. This misalignment, he argued, suggests that the metrics may reward technical optimisation of rankings rather than genuine innovation capacity.
Another grievance concerns self-citations. According to THE’s own documentation, self-citations are included in the data used for the World University Rankings.
While this doesn’t imply academic impropriety, Prof Rao said it makes citation-heavy composites overly sensitive to publication volume and citation networks, rather than to measures of actual research capacity such as funding strength or translational impact.
Transparency vs perception
For BITS Pilani, the final factor in withdrawing was the realisation that participation had ceased to add ‘analytical value’. The institution found that THE outcomes diverged sharply from independently verifiable indicators used for internal governance.
In contrast, Prof Rao highlighted CSRankings as a model of transparency.
Unlike THE, CSRankings doesn’t use citation-derived indicators; instead, it ranks institutions based on publications in highly regarded, selective venues. “The data is transparent,” Prof Rao said, noting that BITS Pilani has “performed exceptionally well in this framework compared to established premier research institutions.”
The issue of ‘perception’ remains a point of contention. In a hyper-connected world, Prof Rao said that visibility can be manufactured quickly through branding and amplified networks without reflecting academic depth. When perception is weighted heavily, it allows reputation to outpace substance, rewarding institutions that prioritise immediate exposure over long-term outcomes.
National frameworks
While stepping away from THE, BITS Pilani isn’t abandoning benchmarking. Instead, it’s shifting its focus to more rigorous, transparent, and outcome-linked measures.
Prof Rao praised the National Institutional Ranking Framework (NIRF) as a credible alternative. “Institutions know exactly what is being measured and why,” he said, citing NIRF’s transparent methodology and public disclosure of data sources. He credited NIRF with strengthening data discipline and institutional introspection across the Indian higher education sector.
A potential domino effect?
As BITS Pilani joins the ranks of non-participating IITs, the question is: will more universities follow? Prof Rao believes that as the sector matures, more institutions will ask whether ranking outcomes truly align with how they manage academic performance. Where that alignment is missing, universities may choose selective and principled engagement over automatic participation.
When asked what BITS Pilani lost by withdrawing, Prof Rao was blunt: “Primarily, visibility within that specific ranking system.” But he emphasised that academic standards, research capability, student quality, and employer confidence remain untouched.
“Institutions are ultimately judged by their academic culture and outcomes,” he said. “Not by inclusion in any single table.”
