13.05.2007
Monitoring Compliance with St. Petersburg Summit Commitments
No. 2 2007 April/June

There are three points of departure for reflecting on compliance with the G8 commitments.

First, when the forum arose in the mid-1970s to respond in a coordinated way to problems and challenges that the existing international institutions could not cope with, its architects set high expectations for the meetings’ outcomes: they were to address crucial economic, financial and political issues, and they were to yield results.

Second, St. Petersburg produced 14 summit documents plus the Chair’s summary, totaling 317 specific commitments. While this confirms the tendency toward an increasing number of commitments characteristic of the seventh summit series, it is also the highest number of any summit held since 1975. Of these, 216 commitments reflect decisions on the Presidency priority issues: 52 relate to the fight against infectious diseases; 114 to global energy security; and 50 to education for innovative societies in the 21st century. However impressive this may seem, as Russian Foreign Minister Sergei Lavrov said, “the viability of the decisions hinges on the members’ commitment to their consistent implementation within the systemic strategy of joint actions. Serious and multifaceted work on the St. Petersburg commitments implementation lies ahead, including the period of the German presidency of the G8.” Thus, a balanced assessment of the summit’s performance and of the leaders’ commitment to the decisions made is still to come, inter alia on the basis of compliance study results.

Third, over the 32 years of its history, the G7/G8 has expanded both its agenda and its institutional system, and is now appreciated as an instrument of deliberation, direction-giving and decision-making on global governance issues. It has also become a subject of criticism and reform proposals. The reform proposals are well known and range from expanding the institution to a G10 or G12, restructuring the G20 into an L20, reshaping the G8 into a G4, or abolishing the G8 altogether. The critique mainly focuses on the forum’s representativeness, legitimacy and effectiveness.

While it is difficult to argue against proposals to expand the G8 to include China and India, or against the rationale for coexistence of the G8 and the L20, it is worthwhile considering what data and instruments of evaluation are available to support, inform or refute the perception of the G8’s shortcomings. It is also useful to analyze what these tools offer for monitoring, comparing and sharing the G8’s performance results and, moreover, for communicating them to the wider public.

EVALUATING THE G7/G8

The scholarly analysis of summit results that has developed over the years includes three different methods of evaluation.

Assessing summit performance. Robert D. Putnam and Nicholas Bayne assess the summits’ achievements against six criteria: leadership, effectiveness, solidarity, durability, acceptability, and consistency. The assessment uses a grading system from A to E. According to Bayne, the first summit series (1975-1978) was the most productive so far. The first G8 sequence, which coincides with the sixth summit series (1998-2001), showed consistent B and B+ performance. The seventh series (2002-) is very diverse in its achievements, ranging from C+ for Evian (2003) and Sea Island (2004) to A- for Gleneagles (2005).

Assessing the behavior of the country holding the G8 presidency. A ‘scorecard’ approach was developed by the Foreign Policy Centre in London (Hugh Barnes and James Owen), which issued the first annual ‘scorecard’ on Russia in 2006. The system aims at monitoring the behavior of the country holding the G8 Presidency on key features relevant for membership in the G8. These comprise 12 indicators: openness and freedom of speech; political governance; rule of law; civil society; economic weight in the world; inflation; economic stability and solvency; unemployment; volume of trade; protectionism; energy market conditions; and stance on key international issues.

A country’s compliance with G8 norms is assessed on a five-point scale: (1) broad compliance; (2) moderate compliance; (3) sporadic compliance; (4) lack of compliance; and (5) total failure to comply. The data for analysis is drawn from the IMF, the World Bank, national official statistics, the WHO, and various other international organizations and think tanks.

Russia’s score
according to this first exercise has been far from impressive. On
open society the score is (5); on political governance, (4); on the
rule of law, (4); on civil society, (4); on economic growth and
stability, (3); on inflation, (3); on stable exchange rate and
market conditions, (3); on unemployment level, (4); on trade
volume, (3); on trade restrictions (protectionism, etc.), (4); on
energy market conditions and policies, (4); and on a discernible stance on key international issues, (4).

Assessing compliance with the summit commitments. However important summit performance evaluation or the member states’ adherence to democracy and economic growth (the key characteristics monitored above) may be for understanding G8 effectiveness, the picture would not be complete without a consistent and quantifiable assessment of the G8 member states’ compliance with the summit commitments.

This assessment has been carried out annually since the 1996 Lyon summit by the G8 Research Group of the University of Toronto under the leadership of Professor John Kirton and Dr. Ella Kokotsis.

On February 20,
2007, the G8 St. Petersburg Interim Compliance Report was released
by the G8 Research Group of the University of Toronto and the State
University–Higher School of Economics G8 Research Group (HSE). The
findings for the St. Petersburg summit demonstrate a positive
average degree of G8 member states’ compliance performance (33%)
and, hence, testify to their commitment to a wide range of decisions made at the summit. These findings thereby confirm earlier assessments of the 2006 G8 meeting as a successful one.

However, before highlighting the interim results of the new study cycle on the St. Petersburg summit commitments, launched in the autumn of 2006, it would be useful to recall the most essential dimensions of the study and some of its key methodological approaches.

First, it should be noted that the main
objective of the study is not a cross-country comparison of the
member states’ performance on the summit commitments, even though
this is probably its most visible and striking result.

More importantly, by reflecting on the empirical findings concerning the factors of “high and low compliance,” the study aims to explore how credible and effective the institution is, namely:

1. To what extent
and under what conditions does the G8 live up to the commitments
and decisions reached at the summit table?
2. How does the pattern of summit compliance vary by issue area and
over time?
3. What factors can enhance or diminish the member states’ compliance performance?

In the course of the study, some of the factors enhancing compliance have been identified: the leaders’ personal involvement; the strength of their domestic positions; the presence of domestic institutional structures and an increased number of working and official bodies at various levels; the use of existing regimes (such as the IMF and the World Bank) where the G8 members are major shareholders and are able to exert their political and financial influence, set the agendas, and secure agreements on implementation; the link between the commitments made and the domestic priorities of the member states; and the degree of consensus on the commitments and the mechanisms of their implementation.

The methodology toolkit includes:

  • the definition
    of the concept of compliance;
  • the definition
    of the concept of compliance performance;
  • the methodology
    of selecting commitments for monitoring;
  • the methodology
    of assessing the degree of compliance with the
    commitments.

According to the methodology, commitments are “distinct, specific, collectively agreed and publicly expressed statements of intent, promise or undertaking by leaders that they will take future action to meet or adjust to an identified target.”
In order to qualify, commitments must satisfy several
criteria:

  • Commitments must
    be distinct, meaning that each goal should represent a separate
    commitment;
  • Commitments must
    be specific, identifiable, measurable and contain clear
    parameters;
  • Commitments must
    be future-oriented rather than present endorsements of previous
    actions, that is, they need to represent a pattern for future
    action.

Verbal instructions to international institutions, issued at the time of the summit, are included, as it is assumed that summit members will take action to move toward attaining the intended result.

Compliance is a conscious new or altered effort by national governments in the post-summit period aimed at implementing the provisions contained in summit communiqués. In the work of the G8 Research Group, compliance occurs when national governments change their own behavior to fulfill a summit goal or commitment. Leaders legitimize their commitments by either:

  • including them
    within their national agendas;
  • referring to
    them in public speeches or press releases;
  • assigning
    personnel to negotiate the mandates;
  • forming task
    forces or working groups;
  • launching new
    diplomatic initiatives; or
  • allocating
    budgetary resources toward the commitment’s
    fulfillment.

The measure of compliance is assessed on a three-level scale:
1. Full or nearly full compliance with a commitment is assigned a score of +1.
2. Complete or nearly complete failure to implement a commitment is indicated by a score of –1.
3. An “inability to commit” or “work in progress” is given a score of 0. An “inability to commit” refers to factors outside the executive branch that impede implementation. “Work in progress” refers to an initiative that has been launched by a government but has not yet been completed by the time of the next summit, and whose results therefore cannot be effectively judged.
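
To illustrate how these three-level scores translate into the headline percentages quoted in the report (such as the 33-percent average for the G8 or the 25 percent for Russia), the following is a minimal sketch in Python. It assumes, for illustration only, that the per-commitment scores are simply averaged and the result expressed as a percentage; the score distribution used is hypothetical, not taken from the report.

    # Minimal sketch (not the official G8RG tool): averaging per-commitment
    # scores of +1, 0 and -1 and expressing the result as a percentage.
    def average_compliance(scores):
        """Return the mean of a list of per-commitment scores (+1, 0, -1)."""
        return sum(scores) / len(scores)

    # Hypothetical vector for one country across 20 monitored commitments:
    # 9 full compliance, 7 work in progress, 4 failures to comply.
    scores = [+1] * 9 + [0] * 7 + [-1] * 4
    avg = average_compliance(scores)                 # (9 - 4) / 20 = 0.25
    print(f"average score {avg:+.2f} -> {avg:.0%}")  # average score +0.25 -> 25%

On this reading, a single one-step disagreement between research teams on one of 20 commitments shifts a country’s average by five percentage points, which gives a sense of how the score discrepancies discussed below affect the final figures.
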
As only a fraction of the commitments made (not more than 10 percent) is selected for compliance monitoring, the selection criteria are crucial for the validity of the study results.

Primary selection
criteria include:

  • Importance for the summit, the G8 and the world. It was agreed that at least two commitments from each of the summit’s priority themes should be included.
  • Comprehensiveness; the set needs to embrace the economic,
    global and political-security domains and incorporate at least one
    from each part of the traditional agenda, i.e., finance,
    macroeconomics, microeconomics, trade, development,
    environment/climate change, energy, crime and drugs, terrorism,
    arms control and proliferation, regional security, and
    international institutions reform.
  • Balance by document; geographic distribution affecting the G8 members, non-G8 members and the world as a whole; contentiousness in the preparatory process; continuity from previous summits; and proportionality among the analysis dimensions most relevant for current research, such as timetable, international organization, money mobilized, G8 bodies, target, remit mandates, priority placement, specified agency, etc.

Secondary selection criteria are of a practical, methodological character. Selected commitments should allow both individual and collective compliance monitoring; be feasible to comply with fully within the year, as the compliance framework is annual; allow monitoring on the basis of sufficient and reliable information; and allow for easy construction of interpretive guidelines.

Tertiary selection criteria include significance to the summit as identified by experts in the host country.

COMPLIANCE SCORECARD SO FAR

For the most part, of the 20 priority commitments selected for the 2006 G8 compliance monitoring and assessment, the Russian and Canadian research teams obtained consistent results. However, for several commitments the teams were not able to agree on scores. With regard to Russia, the inconsistencies concern the final scores on three commitments: Renewable Energy, Africa – Security, and Global Partnership – Non-Proliferation. Nevertheless, the average compliance score for Russia is 25 percent according to the officially released version of the St. Petersburg Interim Compliance Report, which reflects the assessment drawn up by the HSE team.

For Germany, the discrepancies between the two research teams persist on the final scores for five commitments: Health – Polio Eradication, Education – Qualification Systems, Education – Gender Disparities, Africa – Debt Relief, and Global Partnership – Non-Proliferation. Hence, the average compliance score for Germany is 45 percent according to the officially released version of the St. Petersburg Interim Compliance Report, which reflects the assessment of the University of Toronto G8 Research Group, whereas the HSE team’s score for Germany was 20 percent (Table 1).

Discrepancies between the scores ascribed by the two G8 research teams occur mostly due to:

  • Varying degrees of comprehensiveness of the data used;
  • Differences in understanding the commitments’ content and in interpreting the data collected;
  • Inconsistencies in the interpretation of the commitments in cross-country comparisons.

Two examples will
give the readers a taste of the debate.

The St.
Petersburg Statement on Non-Proliferation reinforces the commitment
made in Kananaskis: “We remain committed to our pledges in
Kananaskis to raise up to $20 billion through 2012 for the Global
Partnership, initially in Russia, to support projects to address
priority areas identified in Kananaskis and to continue to turn
these pledges into concrete actions.”
The financial commitments of the G8 member states to the Global
Partnership (not including local or associated costs) are as
follows: 

Canada: $743 million
France: $909 million
Germany: $1.5 billion
Italy: $1.21 billion
Japan: $200 million
Russia: $2 billion
United Kingdom: $750 million
United States: $10 billion
European Union: $1.21 billion
Non-G8 states: $1.5 billion

Thus, of the total $20 billion to be raised over the decade, Russia is to allocate $2 billion. Assuming the study’s formula of an equal distribution of funds over the years, Russia is ahead of its obligations, having already allocated $1.3 billion of the committed $2 billion (Table 2).
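
The arithmetic behind that judgment can be sketched as follows, assuming (as an approximation rather than a figure from the report) a ten-year pledge window running from Kananaskis 2002 through 2012 and roughly four years elapsed by mid-2006:

    # Rough pro-rata check of Russia's Global Partnership pledge under the
    # "equal distribution over the years" assumption; the ten-year window
    # and the four years elapsed are approximations for illustration.
    pledged_total = 2.0    # Russia's pledge, in billions of US dollars
    period_years = 10      # Kananaskis 2002 through 2012
    years_elapsed = 4      # roughly, by June 2006
    allocated = 1.3        # reported allocation, in billions of US dollars

    expected_so_far = pledged_total * years_elapsed / period_years  # 0.8
    print(f"expected so far: ${expected_so_far:.1f}bn, allocated: ${allocated:.1f}bn")
    print("ahead of schedule" if allocated >= expected_so_far else "behind schedule")

Since the $1.3 billion actually allocated exceeds the roughly $0.8 billion implied by an even schedule, the HSE reading that Russia is ahead of its obligations follows directly from this assumption.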

Thus, Russia has registered a high level of compliance, meriting a score of +1 in the opinion of the HSE analysts. However, the methodology requires that monitoring cover the period from one summit to the next; since the available data extends only to June 2006 and there is no evidence that Russia has contributed anything since the St. Petersburg summit, the University of Toronto analysts hold the view that Russia’s compliance score should be 0 for the interim report.

Another example of a discrepancy stemming from differences in understanding the content of a commitment and interpreting the data relates to the assessment of Germany’s compliance performance on the G8 St. Petersburg commitment on Education – Qualifications (a commitment to “share information on qualification systems in our countries to increase understanding of national academic practices and traditions”).

The University of Toronto analysts registered full compliance by the German government with this commitment, as indeed Germany has been involved in numerous activities aimed at enhancing the transparency and compatibility of qualifications. However, a caveat is due: these activities are part of a different agenda and of Germany’s long-term obligations as a member of the EU and the Bologna process, namely, the European Commission’s recommendation for the establishment of the European Qualifications Framework (EQF) for lifelong learning, the SOCRATES and LEONARDO exchange programs, as well as Bologna process seminars and research. Thus, these actions represent the country’s compliance with commitments made within the EU; they were not launched in response to the St. Petersburg commitment and in fact cannot be considered compliance performance within the G8 setting. If they are accepted as such, then, given that Italy, France and the UK are active proponents of the same initiatives, their respective scores (0, –1, and 0) call into question the consistency of assessment across countries (Table 3).

Table 3. 2006 G8 Compliance scores for Russia and Germany*

Another contentious issue, which needs additional consideration, concerns monitoring activities implemented by the EU members of the G8 within EU programs. If these are regarded as compliance by the EU-25, how valid is it to count the same actions as compliance by each of the four EU member states of the G8? A still more difficult question is: how, in this case, can one differentiate and evaluate the individual contribution of these states toward compliance? These questions show the degree of complexity and the challenges faced by the researchers undertaking the monitoring.

However, despite all the above discrepancies, monitoring compliance with commitments remains a useful tool for assessing and enhancing the effectiveness of the G8 as a global governance institution. It is also extremely useful in evaluating the commitment of individual member states to dealing with diverse global issues that demand collective management. Two factors are essential here: ensuring the validity, reliability and transparency of the monitoring methodology, on the one hand, and the preparedness of the member states’ institutions to use these findings in their work, on the other.

To enhance the reliability and validity of the monitoring, the G8 Research Group of the University of Toronto and the State University–Higher School of Economics G8 Research Group adhere to a combination of several principles.

First, to ensure consistency and integrity
of the data analysis, it is necessary to elaborate and agree upon
interpretation guidelines which would take account of the
commitments’ content.

Second, it is crucial to ensure
consistency of assessment across issues and across countries. This
can be achieved through interaction in collecting and assessing
data on the same commitment for different member states. This procedure puts extra pressure on the team leaders, but it seems justified by the need for cross-country consistency, which has so far proven to be a challenge.

Third, the quality of the expert background materials on the content of the issues monitored is essential for building a common understanding among the analysts of each commitment’s specific nature. This requirement puts extra pressure on the budget of the study. However, again, it is justified by the need for data coherence and interpretation consistency across issues.

Fourth, it is essential that the data be comprehensive and exhaustive, as these features have a considerable influence on the results of the analysis.

Finally, to provide for the fullest possible integration of the various data on compliance with the commitments, it is vital to get full and timely feedback from the G8 member states (this is far from a smooth and easy process, given the various pressures experienced by the structures involved in the G8 process). The most efficient way to ensure thorough consideration of the data would be through consultations with national expert bodies.

The analytical team of the HSE International Organizations Research Institute and the G8 Research Group of the University of Toronto will continue their close cooperation on G8 compliance monitoring and assessment, and will persevere in enhancing its validity. Given the early date of the next summit, the final report will be released by the end of May 2007. (The Heiligendamm summit is scheduled for June 6-8, 2007, with global economic imbalances, energy and raw materials, world trade, poverty, development assistance, Africa and the Middle East indicated as priority issues.) Hopefully, the partnership between the two universities’ research groups will contribute to improving the quality of the analysis and assessment, and will help obtain trustworthy and reliable information on the G8 member states’ commitment to the St. Petersburg summit decisions.