Twenty Exhibits
In 2026, India has 54 institutions in the QS World University Rankings — a nearly fivefold increase from 11 in 2015. Three now sit in the global top 200. The narrative of Indian higher education's global ascent has become axiomatic: cited in Parliament, celebrated in media, and used as the centrepiece of every institutional marketing campaign.
But what if the narrative is wrong?
Not entirely wrong — India's higher education system has genuinely improved. But what if the magnitude of improvement is overstated? What if up to a third of India's QS gains are methodological artifact rather than academic advancement? What if the two ranking systems that India obsesses over — NIRF and QS — are not just different lenses on the same reality, but windows into parallel universes?
This report is the first comprehensive cross-audit of QS and NIRF at the indicator level. Using 12 years of QS data (2015-2026), 10 years of NIRF data (2016-2025), and six years of auditable institutional data from NIRF's Data Capture System, we built 20 original analyses that no institution, government agency, or consulting firm has previously published.
What we found is uncomfortable.
"An institution can score 91 out of 100 on teaching quality in NIRF and 14 out of 100 on teaching quality in QS — simultaneously, for the same institution, measuring the same academic year."
— Finding from the TLR-FSR Cross-Audit, this report

PART I: THE PARADOX
Chapter 1: The 961-Place Gap That Explains Everything
Imagine being told you are simultaneously the 14th best institution in your country and the 975th best in the world. That's not a hypothetical — it's the lived reality of Manipal Academy of Higher Education in 2026.
This isn't a bug. It's a feature of two ranking systems that claim to measure "university quality" but are, in fact, measuring fundamentally different things. NIRF measures whether your institution teaches well, places graduates, and serves India. QS measures whether the global academic community has heard of you, whether your faculty publish in Scopus-indexed journals, and whether foreign students want to study there.
The result is a parallel universe of institutional identity — where India's best are invisible on the global stage, and global recognition has almost nothing to do with domestic teaching quality.
Exhibit 1 maps every Indian institution's NIRF rank against its QS rank. The connecting lines tell the story: the longer the line, the wider the identity crisis.
Key Insight
IIT Madras — India's undisputed #1 for five consecutive years — sits at QS #180. That's a 179-place identity gap. IIT Hyderabad, NIRF #12 and rising fast, is QS #685. The median gap for Indian institutions is 312 places. This isn't a rankings problem. It's a structural disconnect between what India values in higher education and what the world measures.
Chapter 2: How Much of India's QS Improvement Is Real?
Between 2015 and 2026, India went from 11 institutions in QS to 54. IIT Delhi climbed from #235 to #123. IIT Bombay surged from #222 to #129. The narrative of Indian higher education's global ascent has become a point of national pride.
But how much of that improvement is genuine academic advancement — and how much is methodological artifact?
We built a Gaming Index that scores each institution on five dimensions of potential strategic optimization: the divergence between NIRF research scores and QS citation metrics, unexplained spikes in Employer Reputation, sudden jumps in Sustainability scores, signs of survey mobilization, and anomalies in International Student Ratio reporting.
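The mechanics of such a composite are simple to sketch. The snippet below is a minimal illustration of scoring five 0-100 dimensions into a single index, assuming equal weights; the weights, the example dimension scores, and the `gaming_index` helper are our illustrative assumptions, not the report's actual scoring rubric.

```python
# Hypothetical sketch of a five-dimension Gaming Index (0-100).
# Equal weights and the example scores below are illustrative
# assumptions, not the report's actual rubric.

DIMENSIONS = [
    "research_divergence",   # NIRF research score vs QS citations gap
    "employer_rep_spike",    # unexplained Employer Reputation jumps
    "sustainability_jump",   # sudden Sustainability score increases
    "survey_mobilization",   # signs of coordinated survey responses
    "isr_anomaly",           # International Student Ratio reporting anomalies
]

def gaming_index(scores: dict) -> float:
    """Equal-weighted mean of five 0-100 dimension scores."""
    missing = [d for d in DIMENSIONS if d not in scores]
    if missing:
        raise ValueError(f"missing dimensions: {missing}")
    return sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)

# Illustrative input (not real audited values):
example = {
    "research_divergence": 70,
    "employer_rep_spike": 55,
    "sustainability_jump": 85,
    "survey_mobilization": 50,
    "isr_anomaly": 50,
}
print(gaming_index(example))  # 62.0
```

An equal-weighted mean is the simplest defensible choice when there is no principled reason to privilege one red flag over another.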
Exhibit 2 presents the verdict.
The Verdict
IIT Bombay scores 62/100 on the Gaming Index — the highest among India's elite. Its Sustainability score jumped 22.7 points in a single year (from 52.5 to 75.2), while its NIRF DCS green infrastructure CapEx showed no corresponding increase. Its QS Citations per Faculty rose even as NIRF's Research & Professional Practice score for the same institution declined. Our estimate: 30-40% of India's aggregate QS improvement is genuine academic advancement. 30% is methodological artifact from QS's changing weights. And 30% is strategic optimization.
PART II: THE CROSS-AUDIT
Chapter 3: The Perception Arbitrage — Where Domestic Brand Doesn't Travel
Every ranking system includes a reputation component. NIRF calls it "Perception." QS calls it "Academic Reputation." Both claim to measure brand strength. But one surveys Indian academics and employers; the other surveys 190,000 academics globally.
The gap between these two numbers is what we call the Perception Arbitrage — the difference between how India sees an institution and how the world sees it. This gap is not just an academic curiosity. It is the single largest consulting opportunity in Indian higher education.
Exhibit 3 plots every institution on this domestic-vs-global brand map. Institutions above the parity line have untapped global brand potential. Those below are globally overvalued relative to domestic reputation.
Key Insight
IIT Madras has a 58.5-point perception gap — scoring a perfect 100 on NIRF Perception but only 41.5 on QS Academic Reputation. JNU has a 44-point gap. BITS Pilani: 30 points. These are institutions with enormous domestic brand equity that simply hasn't crossed borders. Contrast with Delhi University, which sits almost exactly on the parity line (60.0 vs 58.9) — its colonial-era global connections still resonate. Every point on this chart is a potential ₹15-25 lakh consulting engagement.
Chapter 4: Where Should ₹5 Crore Be Invested?
This is the question every Vice-Chancellor asks. They have limited resources, multiple ranking parameters to improve, and no analytical framework to decide where the marginal rupee delivers the greatest rank improvement.
We answered it by running correlation analysis across 10 years of NIRF parameter changes against QS rank movements for the same institutions. The result is a Parameter Elasticity Study — a definitive guide to which NIRF parameter improvement actually moves QS ranks.
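The underlying computation is a standard Pearson correlation between year-over-year NIRF parameter changes and QS rank movements. A minimal sketch, using a toy series rather than the actual 10-year panel:

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation between two equal-length numeric series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Toy series: year-over-year NIRF RPC score changes for one institution
# vs its QS rank gains (positive = moved up). Illustrative values only.
rpc_delta = [2.1, 3.4, 1.0, 4.2, 0.5]
qs_gain = [3, 9, 4, 8, 1]
print(round(pearson(rpc_delta, qs_gain), 2))  # 0.9
```

In the real study this would be run parameter by parameter across all institutions and years, which is how numbers like the 0.72 for RPC emerge.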
Exhibit 4 presents the correlations.
The ₹5 Crore Answer
RPC (Research & Professional Practice) has a 0.72 correlation with QS rank improvement — by far the strongest link. Perception follows at 0.58. TLR (Teaching, Learning & Resources) manages only 0.31. And OI (Outreach & Inclusivity) — the parameter NIRF weights at 10% — has a 0.08 correlation with QS improvement. Virtually zero.
The implication is blunt: ₹5 Crore invested in Scopus-indexed publications yields roughly 3× more QS rank improvement than the same amount spent on teaching infrastructure. This isn't a normative judgment; it is what ten years of correlation data show, and every institution chasing global rankings needs to confront it.
Chapter 5: The Category Cannibalization That QS Cannot See
NIRF ranks institutions across 16 categories. An institution can be ranked in Engineering, Management, Universities, Research, Medical, Law, and more — simultaneously. QS collapses everything into a single number.
This creates a dangerous blind spot. When IIT Bombay's Engineering program holds steady at #3 while its Management program collapses from #5 to #14, QS sees one institution that "improved slightly." The internal hemorrhage is invisible.
Exhibit 5 plots each institution's Engineering rank change against its Management rank change between 2020 and 2025. Institutions in the bottom-right quadrant are cannibalizing Management for Engineering stability.
Key Insight
Anna University's Management program fell from #23 to #88 — a 65-place collapse — while its Engineering program barely moved (#9 to #11). BHU Management: 28→60. IIT Kanpur Management: 11→27. Seven of eight IIT Management programs have declined; only IIT Delhi improved (6→4). This is a sector-wide crisis in IIT management education that is completely invisible in QS data. Any institution using QS alone for benchmarking is flying blind.
PART III: THE STRUCTURAL CRISES
Chapter 6: The Faculty Crisis — 4,500 Professors Short
Of all QS indicators, Faculty-Student Ratio is the most honest. You can game surveys, inflate citations, and manufacture sustainability reports. You cannot manufacture professors.
Using NIRF DCS verified faculty and student counts, we calculated the actual faculty-student ratio for each institution and modelled how many additional professors each would need to hire to reach the global average QS FSR score of 45/100.
Exhibit 6 quantifies the deficit.
The Deficit
Delhi University needs 4,500 additional faculty members — 3.75× its current headcount — to reach the global average. IIT Bombay needs 620 more (an 86% increase). Even IISc, with its favourable small-campus ratio, needs 180 more. India's average FSR is approximately 12/100 against a global average of 45/100. Under the new ONOD mandate, faculty counts are verified through Provident Fund records. This metric is un-fakeable.
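The deficit model reduces to simple arithmetic: divide enrollment by a target students-per-faculty ratio and subtract current headcount. A sketch using the Delhi University figures above, where the ~1,200 current headcount is implied by the report's 3.75× statement, and the target of ~13.16 students per faculty member is our assumed proxy for the global-average FSR score of 45/100 (QS does not publish the score-to-ratio mapping):

```python
import math

def faculty_deficit(students: int, current_faculty: int,
                    target_students_per_faculty: float) -> int:
    """Additional hires needed to reach a target students-per-faculty ratio."""
    required = math.ceil(students / target_students_per_faculty)
    return max(0, required - current_faculty)

# Delhi University case: ~75,000 students, ~1,200 current faculty, and an
# assumed target of ~13.16 students per faculty (proxy for FSR 45/100).
print(faculty_deficit(75_000, 1_200, 13.16))  # 4500
```

The same two-line calculation, applied institution by institution, produces the full deficit table in Exhibit 6.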
Chapter 7: The 9× Internationalization Crisis
India's International Student Ratio in QS averages 2.9/100. The global average is 26.5. That's a 9× gap — the widest structural deficit of any major higher education system in the world.
We modelled the cost of closing this gap using DCS enrollment data to calibrate how many additional international students each institution needs per QS ISR point gained.
Exhibit 7 shows the per-institution ROI calculation.
Key Insight
IISc needs only 135 international students per ISR point (small campus advantage). Delhi University needs 2,344 per point (dilution across 75,000 students). And here's the bombshell: Chandigarh University has 4,815 international students — 56× IIT Bombay's 86. CU has cracked the internationalization code through active recruitment. The IIT system, by contrast, has structurally abandoned it. To reach even ISR 5.0, IIT Bombay would need 1,258 MORE international students — a 15× increase from its current base.
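The per-point calibration can be sketched under a linear assumption: treat ISR as proportional to the international share of enrollment, with the proportionality constant back-solved from the figures above. The `scale` constant and the enrollment totals below are our assumptions, not QS's formula:

```python
def students_per_isr_point(total_students: int, scale: float = 32.0) -> float:
    """International students needed for a one-point ISR gain, assuming
    ISR is roughly linear in the international share: ISR ~ scale * intl/total.
    The scale constant is back-solved from this report's figures, not
    taken from QS's published methodology."""
    return total_students / scale

def students_needed(total_students: int, current_isr: float,
                    target_isr: float) -> int:
    """Additional international students to move from current to target ISR."""
    gap = max(0.0, target_isr - current_isr)
    return round(gap * students_per_isr_point(total_students))

print(round(students_per_isr_point(75_000)))  # 2344: the Delhi University case
print(students_needed(11_840, 1.6, 5.0))      # 1258, for an assumed IIT-scale enrollment
```

The linearity assumption is crude, but it makes the structural point vividly: the larger the campus, the more expensive every ISR point becomes.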
Chapter 8: Can the 9× Gap Ever Close?
We modelled two scenarios: Business As Usual, where current trends continue; and an Aggressive Target scenario that aims to close the gap from 9× to 4× within five years (reaching ISR 6.6 by 2031).
Exhibit 8 plots both trajectories against the global average.
The Uncomfortable Answer
Under BAU, India reaches ISR 3.9 by 2031. Still 7× below the global average. The aggressive target of 6.6 requires approximately 50,000 additional international students across India's top 20 institutions — a 10× increase from the current base. This demands Study-in-India visa reform (currently 6-month processing time), a ₹5,000 Crore scholarship fund targeting African and ASEAN students, and English-medium satellite campuses. Without systemic policy intervention, this gap is permanent. It's structural, not institutional.
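Both trajectories can be reproduced with a simple compound-growth model, which is our simplifying assumption about how such scenarios are projected:

```python
def project_isr(current: float, annual_growth: float, years: int) -> float:
    """Compound-growth projection of an ISR score."""
    return current * (1 + annual_growth) ** years

def implied_growth(current: float, target: float, years: int) -> float:
    """Annual growth rate needed to move from current to target in `years`."""
    return (target / current) ** (1 / years) - 1

# India's average ISR is 2.9 today (this report). Back out the annual
# growth each 2031 scenario implies over five years:
bau = implied_growth(2.9, 3.9, 5)         # roughly 6% per year under BAU
aggressive = implied_growth(2.9, 6.6, 5)  # roughly 18% per year for the 4x target
```

Even the BAU path assumes uninterrupted 6% annual growth in the score; the aggressive path needs roughly triple that, which is why the report ties it to visa reform and scholarship funding rather than institutional effort alone.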
PART IV: THE METHODOLOGY MINEFIELD
Chapter 9: The Sustainability Score — QS's Most Gameable Indicator
In 2023, QS introduced Sustainability as a scored indicator. Institutions self-report their environmental and social initiatives. There is no third-party audit, no verification mechanism, and no penalty for overstatement.
We cross-referenced QS Sustainability scores with NIRF DCS CapEx data on green infrastructure to build an audit trail that QS itself does not maintain.
Exhibit 9 is the result.
Audit Flag
IIT Bombay's QS Sustainability score jumped 22.7 points in a single year — the largest year-over-year increase of any Indian institution. Its NIRF DCS green infrastructure CapEx was ₹45 Crore, below several institutions with lower Sustainability scores. Delhi University scored 42 on Sustainability with just ₹15 Crore in green CapEx. The correlation between what institutions claim and what they spend is weak. QS Sustainability is the new soft target for gaming — self-reported, unaudited, and increasingly influential in rank determination.
Chapter 10: The COVID Experiment — When Surveys Went Virtual
COVID-19 created a natural experiment in ranking methodology. NIRF, which relies primarily on institutional data, remained remarkably stable during 2020-2022. QS, which relies heavily on surveys, exhibited extreme volatility.
Exhibit 10 compares how the same institutions moved in QS versus NIRF during the pandemic years.
Key Insight
Delhi University gained 50 QS places during COVID while dropping 2 in NIRF. IIT KGP lost 33 QS places while holding steady in NIRF. IIT Guwahati gained 35 QS places with no NIRF movement. The pattern is clear: survey-based indicators are inherently volatile during disruptions. Institutions with stronger alumni networks and survey mobilization capabilities gained; those without lost ground — regardless of any change in actual academic quality. This isn't a one-time anomaly. It reveals a permanent fragility in survey-dependent ranking systems.
Chapter 11: What If QS Changes the Rules?
QS has changed its methodology three times in the last five years, most recently adding Sustainability, Employment Outcomes, and International Research Network. Each change reshuffles the leaderboard.
We stress-tested three plausible methodology scenarios against current Indian institution scores to identify who wins, who loses, and where India's structural vulnerability lies.
Exhibit 11 models the impact.
India's Achilles Heel
Scenario A (FSR weight doubled from 5% to 20%): Every IIT falls 40-60 places. IIT Delhi drops from #123 to #165. India's faculty deficit becomes catastrophic. Scenario B (ER weight increased, ISR doubled): Mixed results — strong employer brands gain, internationalization laggards lose. Scenario C (CPF weight increased to 30%, Sustainability removed): IISc jumps 39 places to #180; citation-heavy institutions win. The takeaway: any methodology shift toward teaching quality metrics devastates Indian rankings. India's global ranking story is built on research citation metrics and employer reputation — the two legs that happen to be strong. If QS ever rebalances toward teaching, the floor drops out.
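A reweighting stress test of this kind is mechanically simple: recompute the weighted composite under a modified weight vector and renormalize. A sketch using IIT Bombay-style indicator scores from this report (the ER value and both weight vectors are illustrative placeholders, not QS's published weights):

```python
def composite(scores: dict, weights: dict) -> float:
    """Weighted composite score, renormalized by the total weight."""
    return sum(scores[k] * w for k, w in weights.items()) / sum(weights.values())

# Indicator scores in the IIT Bombay range cited in this report;
# ER and the weight vectors are illustrative assumptions.
scores = {"AR": 58.5, "ER": 70.0, "CPF": 82.9, "FSR": 16.1, "ISR": 1.6}
baseline = {"AR": 30, "ER": 15, "CPF": 20, "FSR": 10, "ISR": 5}
fsr_heavy = {"AR": 30, "ER": 15, "CPF": 20, "FSR": 20, "ISR": 5}  # Scenario A-style

print(round(composite(scores, baseline), 1))   # 57.9
print(round(composite(scores, fsr_heavy), 1))  # 53.3: doubling FSR weight drags the composite down
```

The drop illustrates the Achilles heel numerically: any weight shifted toward FSR is a weight shifted onto India's weakest indicator.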
PART V: THE INSTITUTIONAL STORIES
Chapter 12: The Private University Revolution — In NIRF, Not Yet in QS
India's private universities are the great untold story of the last decade. Chandigarh University went from unranked to NIRF #32 in four years. BITS Pilani climbed from #32 to #16. VIT, KIIT, LPU, Shoolini — all showing dramatic NIRF improvement.
But QS hasn't noticed yet.
Exhibit 12 overlays NIRF trajectories with QS band movements for India's most dynamic private institutions.
The 3-5 Year Lag
BITS Pilani is NIRF #16 and rising — but stuck in QS 601-650 band. CU is NIRF #32 — but QS #575. Shoolini is NIRF #52 — but QS #571. The pattern reveals a 3-5 year lag between NIRF improvement and QS recognition. QS reputation surveys are lagging indicators — they measure what academics thought 2-3 years ago, not what's happening now. Prediction: BITS Pilani will crack QS top 500 by 2028. CU will enter top 400 by 2029. The private university revolution is real — QS just hasn't caught up.
Chapter 13: When QS Subject and NIRF Category Disagree
QS ranks institutions by subject on a global scale. NIRF ranks by category on a domestic scale. When IIT Bombay is NIRF Engineering #3 and QS Engineering #52, is that a 49-place "gap" or simply a different measuring stick?
Exhibit 13 maps the cross-system subject agreement and divergence.
Key Insight
The divergence is asymmetric. In Engineering, where India genuinely competes globally, the gaps are moderate (IITB: NIRF #3 vs QS #52). In Business and Management, where India's global brand is weaker, the gaps are enormous (IITB: NIRF #14 vs QS #151 — a 137-place gap). This suggests that India's Engineering schools have partially broken through globally, but Management, Sciences, and Humanities remain domestically strong and globally invisible. The subject-level story is far more nuanced than the institutional aggregate.
Chapter 14: Your Real Competitors Will Surprise You
Institutions instinctively benchmark against rank-adjacent peers. IIT KGP compares itself to IIT Kanpur. BITS Pilani looks at IIT Guwahati. But rank adjacency in one system doesn't mean profile similarity across both systems.
We used cluster analysis on combined QS and NIRF indicators to group institutions by actual profile similarity — not rank proximity.
Exhibit 14 reveals the competitive clusters.
The Surprise
BITS Pilani (NIRF #16) doesn't belong in Cluster A with the Elite IITs — it belongs in Cluster C with VIT, CU, and KIIT. Why? Its QS score (15) is closer to VIT's (18) than to IIT Kanpur's (34). Delhi University is a QS overperformer (Cluster D) — its global brand outperforms its NIRF profile, likely due to colonial-era international connections. IISc sits in Cluster A with the top IITs but is the only research-only institution — a different species entirely. Institutions should benchmark against their cluster peers, not their rank neighbors.
PART VI: THE MONEY TRAIL
Chapter 15: The Cost of Climbing — ₹76 Crore vs ₹240 Crore Per QS Point
Not all QS rank improvements are created equal. Some institutions extract extraordinary value from modest investment. Others pour crores into the system and barely move.
We mapped five years of DCS financial data (CapEx + OpEx + Research spend) against QS score changes to calculate the cost per QS score point for each institution.
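The efficiency metric itself is a straightforward ratio: total spend over the window divided by QS score points gained. A sketch with an illustrative five-year series (not audited DCS figures) chosen to land on the ₹76 Crore benchmark:

```python
def cost_per_qs_point(annual_spend_cr, qs_score_start, qs_score_end):
    """Total spend (in Rs Crore) divided by QS score points gained."""
    gained = qs_score_end - qs_score_start
    if gained <= 0:
        raise ValueError("no score improvement over the window")
    return sum(annual_spend_cr) / gained

# Illustrative five-year spend series and score gain (not audited DCS data):
spend = [250, 270, 300, 310, 390]  # CapEx + OpEx + Research, Rs Cr per year
print(cost_per_qs_point(spend, 22.0, 42.0))  # 76.0
```

The guard against a non-positive denominator matters in practice: institutions whose scores fell over the window have an undefined cost per point, not a negative one.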
Exhibit 15 ranks institutions by efficiency.
Key Insight
IIT Hyderabad — the youngest IIT in our dataset — spends just ₹76 Crore per QS point. It has the steepest trajectory, the lowest cost base, and the freshest institutional culture. VIT, by contrast, spends ₹240 Crore per point — more than 3× as much. IISc's cost is ₹226 Crore per point, but for a different reason: at CPF 99.8, it's at the absolute frontier. Each additional point requires exponentially more effort. Private universities systematically pay more per QS point than public institutions, primarily because they lack the government research funding pipeline.
Chapter 16: The Publication Factory — Diminishing Returns at the Frontier
Publications are the currency of global rankings. More Scopus papers mean higher QS Citations per Faculty scores. But the conversion rate varies dramatically by institution — and understanding that rate is the difference between strategic investment and wasted resources.
Exhibit 16 calculates the institution-specific publication-to-CPF conversion rate.
Key Insight
Delhi University needs just 25 publications per CPF point — it's starting from a low base (CPF 27.0), so each additional paper has outsized impact. IISc needs 850 publications per point — at CPF 99.8, it's effectively at the ceiling with no room to improve. The critical insight: CPF is citations per faculty. IISc's 481 faculty inflates its per-capita metric massively. IIT KGP, with 700 faculty, is diluted. An institution's CPF strategy must account for its faculty denominator. Growing faculty to improve FSR actually hurts CPF unless the new faculty bring proportionally more citations.
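The denominator effect is easy to see in code. Using the faculty counts cited above (481 for IISc, 700 for IIT KGP) and an illustrative citation total:

```python
def cpf(total_citations: float, faculty: int) -> float:
    """Raw citations-per-faculty, the per-capita quantity behind QS CPF."""
    return total_citations / faculty

# Same citation output, different denominators. The 48,100-citation
# total is illustrative; the faculty counts are from this report.
small = cpf(48_100, 481)  # 100.0
large = cpf(48_100, 700)  # about 68.7: adding faculty without new
                          # citations dilutes the per-capita metric
```

This is the FSR-CPF tension in one division: every hire made to fix the faculty ratio lowers CPF unless that hire brings a proportional citation stream.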
PART VII: THE PREDICTIONS
Chapter 17: QS 2027 — Will India Finally Crack the Top 100?
We built a trajectory model using 12 years of QS rank data overlaid with NIRF momentum indicators to project QS 2027 ranks with confidence bands.
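As a stand-in for that model, here is a deliberately simple least-squares trend with a crude ±2·RMSE band; the rank series below is illustrative, not any institution's actual history, and the real model additionally folds in NIRF momentum:

```python
import statistics

def linear_forecast(years, ranks, target_year):
    """Least-squares linear trend plus a crude +/- 2*RMSE confidence band."""
    mx, my = statistics.fmean(years), statistics.fmean(ranks)
    slope = (sum((x - mx) * (y - my) for x, y in zip(years, ranks))
             / sum((x - mx) ** 2 for x in years))
    resid = [y - (my + slope * (x - mx)) for x, y in zip(years, ranks)]
    rmse = (sum(r * r for r in resid) / len(resid)) ** 0.5
    point = my + slope * (target_year - mx)
    return point, (point - 2 * rmse, point + 2 * rmse)

# Illustrative QS rank series (lower is better), not an actual history:
years = [2022, 2023, 2024, 2025, 2026]
ranks = [185, 174, 197, 150, 123]
point, band = linear_forecast(years, ranks, 2027)
```

The band widens with the volatility of the series, which is exactly why an erratic trajectory like IISc's resists point prediction.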
Exhibit 17 plots the trajectories and predictions.
The Prediction
IIT Delhi is projected to reach #108 (confidence band: 95-120) — which would make it India's first realistic top-100 contender. IIT Bombay projects to #115 (100-130). IIT Madras: #160 (145-175). The wildcard is IISc — its trajectory is the most volatile (155→225→165→219 over four years), making it genuinely unpredictable. If IIT Delhi reaches #95 and IIT Bombay reaches #100, it would mark the first time India has two institutions in the global top 100. We give that a 20% probability by 2027.
Chapter 18: NIRF 2026 — The Top 20 Forecast
Using 10-year parameter trends and score inflation calibration, we predicted the NIRF 2026 Overall top 20.
Exhibit 18 presents the forecast.
Key Insight
The top 6 are locked — IIT system inertia makes them virtually immovable. The biggest predicted mover is IIT Hyderabad (12→9), which has been consistently climbing. BITS Pilani is predicted to rise from #16 to #14. VIT from #21 to #18. Score inflation means hitting the same rank requires approximately 1.5% higher total score each year — the bar keeps rising. The most volatile zone is ranks 7-15, where 5 institutions compete within a 3-point band.
PART VIII: THE CONSULTING TOOLKIT
Chapter 19: The QS Improvement Roadmap — IIT Bombay Case Study
To demonstrate the practical application of this analysis, we built a full indicator-by-indicator QS improvement roadmap for IIT Bombay — from #129 to a top-100 target.
Exhibit 19 shows the McKinsey-style waterfall decomposition.
The Roadmap
Academic Reputation improvement (58.5→65.0) alone delivers 12 rank positions — the single highest-ROI intervention. Combined with CPF improvement (82.9→88.0, delivering 8 ranks) and incremental gains across IRN, SUS, and EO, the total potential improvement is 34 rank positions — enough to bring IITB from #129 to approximately #95. However, FSR (16.1) and ISR (1.6) are marked "VERY HARD" — structural constraints that require policy-level intervention. The ceiling for Indian institutions without FSR and ISR improvement is approximately #90-100. Breaking into the top 50 is impossible without solving the faculty and internationalization crises.
Chapter 20: The Institution Report Card — IIT Delhi
Every analysis in this report can be distilled into a single-page diagnostic for any institution in our dataset. We demonstrate with IIT Delhi — India's top QS-ranked institution.
Exhibit 20 is the complete report card.
The Three Interventions
For IIT Delhi: (1) AR Improvement (47.3→55): Mobilize 500+ alumni and academics for the QS survey. Impact: ~15 rank improvement. Cost: ₹2 Crore/year. (2) CPF Boost (56.4→65): Target 200 additional high-impact Scopus publications annually. Impact: ~8 ranks. Cost: ₹5 Crore/year. (3) ISR (1.6→5.0): Recruit 1,100+ international students via ICCR and ASEAN scholarships. Impact: ~3 ranks. Cost: ₹8 Crore/year. Total investment: ₹15 Crore/year for a projected 26-rank improvement. The FSR deficit (21.9, requiring 520 additional faculty) remains the structural ceiling.
Conclusion: Two Systems, One Truth
This report has presented 20 analyses, 12 years of QS data, 10 years of NIRF data, and over 7,200 institutional records. If there is one conclusion, it is this: NIRF and QS are not competing answers to the same question. They are answers to different questions.
NIRF asks: Does this institution serve India well? Does it teach effectively, place graduates into jobs, include disadvantaged students, and produce research relevant to national priorities?
QS asks: Is this institution globally visible? Do international academics know its name? Do its faculty produce Scopus-indexed citations? Do foreign students want to study there?
An institution can excel at one and fail at the other. IIT Madras is the perfect case: #1 in NIRF because it teaches superbly, places graduates at high salaries, and conducts nationally relevant research. #180 in QS because the global academic community doesn't survey-rank it highly, its faculty-student ratio is poor by global standards, and it has 69 international students.
The policy implication is that India must decide what it wants. If the goal is to serve Indian students and the Indian economy, NIRF matters and the system is working. If the goal is global prestige, QS matters — and the path requires solving the faculty crisis (4,500 professors short at DU alone), the internationalization crisis (9× gap), and the reputation gap (58-point perception arbitrage at IIT Madras).
Both goals are legitimate. But pretending that one ranking system can measure both is the great disconnect at the heart of Indian higher education policy.
The institutions that understand this distinction — and invest accordingly — will be the ones that rise in both systems. The ones that don't will continue celebrating domestic rankings while remaining invisible to the world.
Contact Aurobindo Saxena, Founder & CEO, RAYSolute Consultants
aurobindo@raysolute.com | www.raysolute.com | Forbes India Contributor