
“DPD 2: The declining quality of USP students, staff and departmental assessments” (1995)



When I was appointed DPD, there was an expectation that I would focus on the planning of USP’s financial resources. I interpreted my job description more broadly, to include the quality of USP degrees, which also implied examining the quality of our students and staff.
There had also been a long-simmering debate that indigenous Fijian students, who had extremely high failure rates, were not getting a fair deal at USP, with the unspoken allegation being that Indo-Fijian academics were biased in some way.
I decided to conduct a number of surveys on the quality of USP’s students coming from different countries, academic grade assessment by different departments, and the quality of staff that USP was hiring. There were a number of major surprises, or should I say shocks, when the results came out.
After Professor Rajesh Chandra became Vice Chancellor in 2008, I strongly recommended that he repeat the exercise I had done in 1995. But he took no action; I suspect that for a university management conducting a massive advertising campaign on the alleged “quality” of USP qualifications, the facts and the truth might have proved extremely uncomfortable, and the propaganda would not have been helped by the facts at all.
I would strongly recommend that all employers of university graduates in the region, all USP Council Members, Permanent Secretaries of Education, Employment, Planning etc., all Members of Parliaments, and donors to the USP, read the articles I post here, as I suspect that the situation has worsened over the last twenty years.
But, as the events of the last ten years have indicated, most could not care less, not even USP Council Members, who find it far more important to attend every council meeting in order to enjoy the eating, drinking and shopping. I suspect that many of them do not even have the intellect to understand the findings of these papers.
But, Fiji’s Minister for Education (Dr Mahendra Reddy) has recently shown some courage in correcting the massive mistakes of previous Ministers of Education by bringing back national examinations and ending the crude unbalanced scaling of marks. I suggest that he demand that independent studies be done on USP (and Fiji National University and Fiji University) students, staff and programme assessments, of the kind I did in 1995.  He (and the Fiji public) may be in for another shock of the kind he received when he scrutinized the scaling of marks in Fiji exams.

Indigenous Fijian student performance

What did this paper find?  I quote from the paper itself.

This paper examines academic performance indicators of Fiji students in a small number of selected USP courses, as well as some secondary schools, in order to obtain a better understanding of the nature of the problems.
      The initial findings indicate that there are significant differences in academic grades by ethnicity, and also by gender. The evidence indicates that much of the ethnic difference is explained by differences in the quality of students (as indicated by secondary school examinations at the seventh form level).
      However, there is also a pattern of ethnic differences in performances, for given levels of seventh form achievement in secondary schools, indicating a degree of Fijian academic under-achievement at USP, even in courses where ethnic biases are extremely unlikely to be factors in assessment.
      What seems clear is that the ethnic (and gender) differences are to be found across a wide range of courses and subject areas, and unlikely to be the result of biases by lecturers.
     It must be noted that the differences in performance at USP in fact correspond to similar differences in academic performance at the secondary school levels (for the selection of some of the best Fijian schools in the country).
     The problem is therefore much deeper than the problem at USP.
Given that a number of key national objectives for indigenous Fijians (such as greater participation in the economy at management and professional levels) critically depend on an adequate output of qualified Fijians, it is vital that national Fijian education authorities, together with interested bodies such as teachers’ associations, openly and honestly come to terms with the nature of the problem, make hard decisions about their priorities for Fijian education and development, and articulate a coherent strategy for countering this deep-seated malaise which is eating at the heart of Fijian education and economic development in the country.

Read on:

Fijian academic performance at USP. DPD (USP). 1995

National Differences in Academic Performance

What did I find?

The initial findings indicate that there are significant differences in academic performances by nationality. What seems clear is that the differences are to be found across a range of subject areas, and would not seem to indicate any national bias by lecturers.
     The evidence of the relative under-performance by countries is of concern to the University, and will no doubt also be to the countries themselves.
     More importantly, the data presented here indicates that some of the national differences are likely to be explained by differences in the quality of students (as indicated by performance in secondary school examinations at seventh form and/or Foundation level) coming into USP.
      In so far as the data indicates that member countries are not sending their best students to USP, this has to be an area of major concern for the University.
     The preliminary findings in this paper indicate that there is an urgent need for the University to systematically and continuously monitor the academic performance of students at USP, as well as the quality of the intakes.
     This will help the University to identify academic difficulties faced by different national groups, provide valuable indicators of whether the quality of the intake is being seriously impaired, and enable Regional Member Governments to become aware of the effects of their selection policies for study at USP and act accordingly.

I had recommended the following:


 1.   The University establish a special unit (possible sites: CELT, PDO), appropriately staffed and resourced, which will

 (a) systematically monitor all relevant aspects of academic performance in all the credit courses of the University (campus, extension, summer schools, etc.)

 (b) focus especially on subject areas, nationality, gender, and other parameters of interest (marital status, residential status, work experience, etc.).

 (c) publish the results of such monitoring as part of the normal reporting by the University,

 (d) assist the University to identify problem areas which may be addressed by the relevant University sections,

 (e) continuously monitor the quality of students coming to USP and those studying in other non-regional institutions (and other related aspects).

 2.   The University establish a project, appropriately funded, which will examine

 (a) the relevance and efficiency (in terms of being reasonably good predictors of success at USP) of current entry requirements based on national secondary school performance indicators.

 (b) whether there is broad evidence of significantly higher correlation between secondary school subject marks and USP performance in corresponding subject areas, and if there is,

 (c) whether a more flexible entry requirement (e.g. acceptable passes in two relevant or related areas) would be more useful in selecting the appropriate students.

 3.   The University take further steps to improve academic performance by students at USP. These could include:

 (a) Improving the quality of teaching staff through improvements in the terms and conditions of academic staff in order to attract better quality staff

 (b) Improving the quality of teaching and learning, through the greater support of CELT activities

 (c) Improving the quality of studying environments at USP (such as Library space)

 (d) Greater School support for mechanisms (such as Student Advisory Groups) to enable the early identification of students’ academic difficulties, and remedial action.

Read the detailed report here:

 National differences in academic performance at USP. DPD (USP). 1995

Those concerned might wish to examine which of these recommendations have not been implemented by USP management.

Departmental Differences in Academic Assessment

This study identified many reasons for large, unacceptable differences in grade assessments, such as (quoting directly from the paper):

In an increasingly market-oriented world where tertiary educational institutions are required to be more self-financing, it is not unknown for some universities to deliberately and artificially maximise pass rates in order to maintain enrolments and revenues. Such institutions deservedly undermine their reputations internationally, as their compromises become known.
However, it is not clear that there are adequate mechanisms at USP to ensure that, University-wide, there are standards of academic assessment which are consistently followed by lecturers, departments and Schools. For instance, in 1994, the University accepted results for a course in which 39% of the students were given at least an A grade, and another 29% B or B+ grades.
     The results indicate considerable variability, across departments and over time, in assessment standards, as indicated by the above criteria. Given that these grades and grade distributions are the most important indicators of the quality of students’ academic performance, such significant variability raises serious questions about the consistency and quality of USP’s academic assessment across departments, and within departments, across courses.
      The number of A grades obtained by a graduate is usually the most important criterion for the awarding of academic prizes at USP.
B grades (and GPAs) are important in that the University uses that standard as a criterion of entry to post-graduate programmes.
    Lastly, the percentages passing in various courses are not only used as indicators of the “acceptable” output or product of Departments’ teaching,
    But also inevitably become an important factor in students’ selection of courses for study,
    And the most basic test of students’ performance overall, in the University.

The study provided detailed data to back up these statements.

I made a number of recommendations then:


1.  Senate identify a Section (CELT or PDO), appropriately resourced, which will systematically monitor academic assessment in all the credit courses of the University. The results of such monitoring would be published as part of the normal reporting processes in the University. Such a unit could use the databases currently available through the University’s Banner system.

 2.  Senate agree that for every course with enrolments above 30, Course Lecturers, Heads of Departments, and Heads of Schools ensure that the grade distribution should be such that the proportions of students receiving:

 A or A+ should be less than 10%

B or B+ should be less than 20%

Fail grades should be less than 40%

 Where the distributions fall outside the guidelines, written justification for the deviation is to be provided to the Augmented Academic Committee, and accepted by it, before the results are formally ratified.
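The thresholds in the recommendation above lend themselves to a simple automated check. A minimal sketch in Python, assuming hypothetical grade codes (A, A+, B, B+, with D/E/F taken as fail grades) and made-up data, not USP's actual grading scheme:

```python
# Sketch of the recommended guideline check (thresholds from the 1995
# recommendation); grade codes and data are illustrative assumptions.
from collections import Counter

def check_grade_distribution(grades):
    """Flag deviations from the recommended distribution, for courses
    with enrolments above 30: A/A+ < 10%, B/B+ < 20%, fails < 40%."""
    n = len(grades)
    if n <= 30:
        return []  # guideline applies only to courses above 30 enrolments
    counts = Counter(grades)

    def pct(*codes):
        return 100 * sum(counts[c] for c in codes) / n

    flags = []
    if pct("A", "A+") >= 10:
        flags.append("A/A+ at or above 10%")
    if pct("B", "B+") >= 20:
        flags.append("B/B+ at or above 20%")
    if pct("D", "E", "F") >= 40:  # assumed fail codes
        flags.append("fail grades at or above 40%")
    return flags

# Illustrative course of 40 students, 16 of them (40%) awarded A grades
grades = ["A"] * 16 + ["B"] * 6 + ["C"] * 14 + ["D"] * 4
print(check_grade_distribution(grades))  # → ['A/A+ at or above 10%']
```

A course flagged this way would then need the written justification to the Augmented Academic Committee that the recommendation calls for.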

 Stakeholders in USP might like to inquire whether these recommendations were ever followed up, and what is the current situation.

Read here the detailed report:

Departmental differences in academic grades at USP. DPD (USP) 1995

Declining quality of USP academic staff

 My report concluded the following:

 5.1       For the University to achieve and maintain the desired quality of teaching (and research), it would be helpful if those whom the University appoints as teaching staff themselves have the highest academic standards, desirably at least those attained by the better students amongst those they are teaching.

 5.2       The data presented here indicates that the University is not succeeding in this regard, as judged by the undergraduate academic profiles of the regional staff who comprised the sample being studied, compared to the better graduates being turned out by USP itself.  Given that large proportions of the best regional students choose to study outside the region, the number of good graduates coming out of USP is likely to be already biased downwards.

 5.3       The personal experience of the author over the last twenty years of teaching at USP (in the Mathematics and Economics Departments) is that USP has not been able to attract even its own best graduates into teaching.

 5.4       USP graduates have chosen to go within the region into private or public sector jobs where either immediate rates of remuneration are higher, or the prospects of promotion are better than at USP, with the promise therefore of higher rates of remuneration in the near future.

 5.5       Where USP has had some limited success in initially attracting staff, it has been unable to retain them.   As the only regional institution where the staff pay taxes on gross incomes, USP continuously loses its experienced staff to other regional organisations whose gross salaries are not only higher, but tax-free.

 5.6       If the best of the USP graduates are not being attracted into employment at USP, there is even less likelihood of the best regional graduates (with first degrees) from rim country institutions being attracted.

 5.7       Even if some are attracted (usually because of their interest in the academic life), there is little evidence that there is any significant improvement in retention rates, especially at the lecturer and senior lecturer levels.

 5.8       The University therefore needs to strongly consider ways in which to attract the best of the regional graduates coming out of either USP or other institutions, and to retain the quality staff it does succeed in hiring.

 5.9       A necessary condition must be a significant improvement in the overall remuneration package of the relatively junior academic staff.

Read the detailed Internal USP Memorandum here:

Staff Quality and some doubts about the Brash Report conclusions

