Comparability and policy relevance of R&D indicators

Science, technology and innovation indicators for measuring and comparing the knowledge systems of developing countries need to be specially redesigned to take into account the very different conditions and characteristics of these countries. The challenge is how to make statistics and indicators both cross-nationally comparable and able to adequately reflect a country's specific economic and societal features.

Simon Ellis, Ernesto Fernández Polcuch and Rohan Pathirage describe the current status and context of research and development (R&D) statistics in developing countries in the Research Report of the UNESCO Forum on Higher Education, Research and Knowledge.

"Measuring R&D in Developing Countries: International comparability and policy relevance", presents particular characteristics of research and development practice in these countries and resulting consequences for R&D measurement, and discusses the need to develop new methodologies to complement those currently in use.

The two most widely used indicators for policy documents are 'research intensity', reflecting expenditure on R&D as a percentage of gross domestic product, and 'research density', showing the number of researchers in relation to a country's total population. To produce statistics for these 'R&D input indicators', the methodology given by the OECD's Frascati Manual is commonly used for both developed and developing countries.
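
By way of illustration, here is a minimal sketch in Python of how these two input indicators are computed from national aggregates. The figures are placeholders, not data from the report, and real compilations follow the detailed Frascati Manual definitions of GERD and researcher counts.

    def research_intensity(gerd, gdp):
        """R&D expenditure (GERD) as a percentage of gross domestic product."""
        return 100.0 * gerd / gdp

    def research_density(researchers, population):
        """Researchers per million inhabitants."""
        return 1_000_000 * researchers / population

    # Placeholder figures for a hypothetical country (GERD and GDP in the same currency).
    gerd = 250_000_000          # gross domestic expenditure on R&D
    gdp = 50_000_000_000        # gross domestic product
    researchers = 4_200         # researcher headcount or full-time equivalent
    population = 20_000_000

    print(f"Research intensity: {research_intensity(gerd, gdp):.2f}% of GDP")
    print(f"Research density: {research_density(researchers, population):.0f} per million inhabitants")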

But the characteristics of developing countries' research systems are quite unlike those that gave rise to the Frascati statistical standard. Its methodology, which gives priority to international comparability, does not produce results that are relevant for policies suited to these countries' particular characteristics.

Science, technology and innovation (STI) statistical methodology should therefore be adapted so that the resulting indicators better inform policies that meet the needs of developing countries, argue Ellis, Polcuch and Pathirage.

R&D statistics and indicators provide developed and developing countries with information vital for formulating policy and decision-making, assessing performance, monitoring and evaluating progress, predicting future trends and identifying priorities. They also help recognise strengths, weaknesses and potential opportunities for development, and make international comparisons.

For developing countries they are needed specifically to identify highly skilled personnel, the type of work they do, the institution and sector for which they work, and whether their research is in line with national policies and priorities.

The use of R&D statistics is still limited in developing countries. To increase the availability of information and improve its quality, the UNESCO Institute for Statistics, in cooperation with UNESCO offices and the organisation's Natural Sciences sector, has been organising training workshops.

Because of differences between developing and developed countries - and between individual developing countries - STI indicators should be adapted to particular policy requirements and answer specific policy questions, write Ellis, Polcuch and Pathirage.

To produce accurate and reliable R&D statistics for developing countries it is important to take into account the particular characteristics of R&D activities. R&D workers operate within a specific national, cultural, political, financial and economic system, often influenced by legacies of colonial, post-colonial and other kinds of governance.

In developing countries there is often more research than development, and the government and higher education sectors have a stronger presence than the private sector.

The way research and experimental development are carried out in some developing countries does not always fit the methodologies available for measuring them. One common characteristic is informality: since informal R&D is difficult to capture, it is usually considered beyond the scope of surveys.

Countries often estimate gross domestic expenditure on R&D (GERD) from national budget information, a practice that leads to inaccurate and often non-comparable data, since the Frascati Manual bases all its recommendations on the use of R&D surveys, the authors point out.

Budget information is cheaper and quicker to obtain, and in the past it captured almost all R&D expenditure in some developing countries, where research systems were overwhelmingly public and research spending was accounted for in the budget.

But this is no longer the case. National research systems have limited absorption capacity to deal with budgetary increases, and some governments may not follow up on their budgetary commitments, leaving funds in central accounts instead of transferring them to R&D institutions. The result can be over- or under-estimation of R&D expenditure.

Some countries estimate GERD from a combination of sources - budgetary records, annual reports from performing units, national budgets and national planning documents - which can lead to double counting when the same expenditure appears in more than one source.

Problems of consistency may arise when incomplete financial records are used to reconcile budget and expenditure data. Calculating annual figures from budgets and expenditure for longer-term projects or aggregating financial data from programmes involving many different institutions can prove difficult, Ellis, Polcuch and Pathirage report.
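
As a rough illustration of this reconciliation problem, the following sketch - hypothetical records and field names, not taken from the report - prorates multi-year project budgets into annual figures and drops duplicates where the same project is reported by both the funding ministry and the performing institution:

    from collections import defaultdict

    # Hypothetical records: the same project can appear both in budget documents
    # and in a performer's annual report, and can span several years.
    records = [
        {"project": "AGR-01", "source": "budget",    "amount": 900_000, "start": 2006, "end": 2008},
        {"project": "AGR-01", "source": "performer", "amount": 900_000, "start": 2006, "end": 2008},
        {"project": "HLT-07", "source": "performer", "amount": 300_000, "start": 2007, "end": 2007},
    ]

    # Keep one record per project, preferring performer reports over budget lines
    # so that appropriations left unspent in central accounts are not counted.
    preferred = {}
    for rec in records:
        kept = preferred.get(rec["project"])
        if kept is None or (kept["source"] == "budget" and rec["source"] == "performer"):
            preferred[rec["project"]] = rec

    # Prorate each amount evenly across the project's years - a crude assumption,
    # since real spending profiles are rarely uniform.
    gerd_by_year = defaultdict(float)
    for rec in preferred.values():
        years = range(rec["start"], rec["end"] + 1)
        for year in years:
            gerd_by_year[year] += rec["amount"] / len(years)

    for year in sorted(gerd_by_year):
        print(year, round(gerd_by_year[year]))

Even in this toy example, the prorating rule and the choice of which source 'wins' are analyst judgements; different choices yield different GERD series, which is part of the comparability problem the authors describe.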

Mostly financed by governments in the past, R&D activities are undergoing significant changes in many developing countries, with new sources of funding emerging. Enterprises are slowly becoming involved, while foundations and NGOs - especially foreign organisations - already play a central role.

Much of this new funding goes to individuals and groups rather than institutions, and therefore remains largely unaccounted for and is seldom declared, including for statistical purposes.

In higher education the trend towards establishing private universities is not always matched by an increase in R&D expenditure. Business enterprise R&D is generally weak compared with OECD member states, and this needs to be allowed for in surveys, the authors argue.

The late 2000s saw a professional crisis and a change in the nature of scientific work in many countries. Changes in the roles and activities of university professors have a strong impact on the capacity to produce R&D data in higher education, which can be the most important sector in many developing countries. One example is the growing number of 'taxi-professors' - part-time professors on new kinds of contracts who teach or conduct research at more than one university.

Meanwhile the international mobility of scientists and engineers is increasing, especially from developing to developed countries - the 'brain drain'. In Lebanon, the science and technology diaspora is estimated to be equal to or larger than the home-based S&T workforce.

Some countries organise their scientific and engineering diaspora through 'remote mobilisation' to generate benefits for the country of origin; this presents new challenges for measuring R&D.

Globally, R&D is concentrated in the 'Triad countries'. In 2004 Europe, North America and Japan accounted for 79.5% of world scientific publications. In the developing world, R&D expenditure and output are concentrated in a relatively small group of countries in each region.

R&D activities and their institutional frameworks have distinctive characteristics in different countries, which can be grouped according to three sets of parameters, write Ellis, Polcuch and Pathirage:

* Socio-economic development status.
* Capacities of R&D systems.
* Capacities of R&D statistics systems.

One small group of countries applies the current Frascati Manual without major difficulty, though they need to learn more from more experienced countries about constructing R&D indicators. They include some OECD members, such as Mexico and soon Chile, or countries like Argentina that provide data for OECD surveys.

A second group of nations might be capable of adopting most Frascati concepts but need to adapt their statistical processes to corresponding methodological proposals. These countries need guidance on establishing and consolidating sound R&D statistics systems.

R&D systems in remaining countries are confined to a few government and university institutions, with scarcely any business participation. These countries have very limited resources for science, technology and innovation policy and management, let alone STI statistics, write the authors.

Developing countries see internationalisation as a means of improving the quality of S&T activities, strengthening capacities and benefiting from technology transfers through alliances and networks.

International organisations may play a significant R&D role, involving local staff and addressing local issues, and they can also weigh heavily in total GERD. In some smaller countries, the R&D personnel and expenditure of international organisations or foreign 'antennas' - foreign research centres with foreign researchers and foreign funding - could substantially distort R&D indicators.
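
A hedged numerical illustration of the possible scale of that distortion, using entirely hypothetical figures:

    # Hypothetical small economy: GDP of 5 billion, domestic GERD of 10 million,
    # plus one foreign-funded research centre ('antenna') spending 8 million locally.
    gdp = 5_000_000_000
    domestic_gerd = 10_000_000
    antenna_gerd = 8_000_000

    without_antenna = 100 * domestic_gerd / gdp
    with_antenna = 100 * (domestic_gerd + antenna_gerd) / gdp
    print(f"Research intensity excluding the antenna: {without_antenna:.2f}% of GDP")
    print(f"Research intensity including it:          {with_antenna:.2f}% of GDP")

In this made-up case a single foreign centre nearly doubles measured research intensity, even though the expenditure is neither funded nor controlled domestically.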

With the globalisation of higher education services, many universities from developed countries are opening branches in developing countries, presenting very particular situations not accounted for in Frascati, write Ellis, Polcuch and Pathirage.

Some foreign campuses adopt their home institutions' recruitment procedures for researchers and entrance qualifications for students, while others follow criteria closer to those of the host countries. Conditions for R&D activities and researcher status may therefore vary considerably.

Weaknesses in developing countries' statistical systems and practices present serious challenges to the quality of results from R&D surveys, and this must be taken into account when designing data collection procedures and analysing their results.

* Dr Simon Ellis is Head of Science Culture and Communications Statistics at the UNESCO Institute for Statistics (UIS) in Montreal, Canada. He has represented UNESCO on many international committees including the Millennium Project, the UN Expert Group on MDG Indicators, and the Partnership for Measuring ICTs for Development.

* Ernesto Fernández Polcuch is a Programme Specialist for Natural Sciences at the UNESCO Windhoek Cluster Office. As a leader of the Science and Technology Statistics team at the UNESCO Institute for Statistics (UIS) in Montreal from 2002 to 2008, he produced worldwide S&T statistics and developed methodologies for data collection in developing countries, among other things.

* Rohan Pathirage is an Assistant Programme Specialist in Science and Technology at the UNESCO Institute for Statistics (UIS) in Montreal, Canada. He is responsible for coordinating the Global S&T Survey at the UIS.
