Peter J. Hammond

Abstracts of Articles and Research Papers

List of Topics

  1. Growth and Exhaustible Resources
  2. Rational Individual Choice and Consequentialism
  3. Game Theory and Consequentialism
  4. Consequentialist Social Choice and Utilitarian Ethical Theory
  5. Social Choice: General
  6. Social Choice with Interpersonal Comparisons
  7. Social Choice with Individual and Group Rights
  8. Distributional Objectives in Welfare Economics
  9. General Equilibrium Theory
  10. Gains from Trade and Migration
  11. Widespread Externalities and the f-Core
  12. Equilibrium in Incomplete Markets
  13. Private Information and Incentive Constraints
  14. Cost–Benefit Analysis, Policy Reform, and Welfare Measurement
  15. Welfare, Information and Uncertainty
  16. Miscellaneous Work in Welfare Economics and Ethics
  17. Continuum of Random Variables
  18. Empirics, Statistics, Experiments, and Other Topics


Growth and Exhaustible Resources

Agreeable Plans with Many Capital Goods Review of Economic Studies 42 (1975), 1–14.

Abstract:
In planning capital accumulation, only an interim plan, or overture, needs to be chosen initially. “Agreeable” plans exploit this fact; they lead to accumulation paths which are nearly optimal, provided that the time-horizon is long and is known far enough in advance. An earlier treatment of one-good models is extended to models with many capital goods. It is shown that, in many cases, an “insensitive path” — the limit of finite-horizon optimal paths — is agreeable. Also, with a convex technology and a strictly concave welfare function, if an agreeable plan exists, it must be insensitive and also unique.
JSTOR link for paper


Maximin Paths of Heterogeneous Capital Accumulation and the Instability of Paradoxical Steady States (with Edwin Burmeister) Econometrica 45 (1977), 853–870.

Abstract:
If there exist heterogeneous capital goods, a steady state may be “paradoxical” in the sense that increasing the rate of interest above the Golden Rule level may lead to an increase in consumption or utility, rather than to the decrease which always occurs in one-sector models. It is shown that, in many cases, a path of capital accumulation which maximizes the minimum consumption or utility level is unlikely to converge to a paradoxical steady state of this kind.
JSTOR link for paper


Uniformly Optimal Infinite Horizon Plans (with John F. Kennan) International Economic Review 20 (1979), 283–296.

Abstract:
A plan is defined to be uniformly optimal if, for all positive epsilon and all long enough time horizons, it is uniformly not epsilon-inferior to any alternative plan. It is shown that uniformly optimal plans are “agreeable” in a strong sense. In some cases, too, finite horizon optimal plans may converge to a uniformly optimal plan as the horizon tends to infinity. It is also shown that, in most cases where it is known that an optimal plan exists, any optimal plan is uniformly optimal.
JSTOR link for paper
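The uniform optimality property described in the abstract can be sketched in symbols (the notation below is mine, not the paper's):

```latex
% Illustrative formalization, not taken from the paper:
% W_T(x) = welfare accumulated by plan x up to horizon T; F = feasible set.
% A plan \hat{x} is uniformly optimal when
\forall \varepsilon > 0 \;\; \exists \bar{T} \;\;
\forall T \ge \bar{T} \;\; \forall x \in F : \qquad
W_T(\hat{x}) \;\ge\; W_T(x) - \varepsilon .
```

The point is that the same horizon threshold works for every alternative plan at once, which is what distinguishes uniform optimality from horizon-by-horizon optimality.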


On Hartwick’s Rule for Constant Utility and Regular Maximin Paths of Capital Accumulation and Resource Depletion (with Avinash K. Dixit and Michael Hoel) Review of Economic Studies 47 (1980), 551–556.

Abstract:
Hartwick’s rule of investing resource rents in an economy with producible capital and exhaustible resources becomes, in a general model of heterogeneous stocks, a rule whereby the total value of net investment (with resource depletion counted as negative) is equal to zero. It is shown that holding the discounted present value of net investment constant is necessary and sufficient for a competitive path to give constant utility. With free disposal, it is shown that, provided the Hartwick rule yields a path that does not exhibit “stock reversal,” it must be a maximin path.
JSTOR link for paper
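In symbols, the generalized Hartwick rule described above can be sketched as follows (illustrative notation only; the paper's own model is more general):

```latex
% Generalized Hartwick rule: with heterogeneous stocks k_i(t) -- capital
% goods and exhaustible resources, depletion entering as negative net
% investment -- and competitive prices p_i(t), the total value of net
% investment vanishes at every instant:
\sum_{i} p_i(t)\, \dot{k}_i(t) \;=\; 0 \qquad \text{for all } t .
```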


On Endogenizing Long-Run Growth (with Andrés Rodríguez-Clare) Scandinavian Journal of Economics 95 (1993), 391–425; also in T.M. Andersen and K.O. Moene (eds.) Endogenous Growth (Oxford: Blackwell, 1993), pp. 1–35.

Abstract:
This assessment of recent theoretical work on endogenous growth identifies three different engines of long-run growth: (i) the asymptotic average product of capital is positive; (ii) labor productivity increases as an external effect of capital accumulation; (iii) there are feedback effects on the cost of accumulating knowledge or innovating. A general model encompassing all three is considered, and then used to review different proposed determinants of long-run growth rates. The contribution of endogenous growth theory has been to create a framework in which to explain why economic institutions and policies can have long-run effects on growth rates.
PDF version of preprint




Rational Individual Choice and Consequentialism

Changing Tastes and Coherent Dynamic Choice Review of Economic Studies 43 (1976), 159–173; reprinted in K.J. Lancaster (ed.) Consumer Theory (Cheltenham, UK: Edward Elgar, 1999) ch. 23, pp. 367–381.

Abstract:
Changes of taste can be exogenous or endogenous. Both kinds can be handled by the general notion of a dynamic choice function defined on a decision tree. A general dynamic choice function is inconsistent in the sense that actual choices are likely to depart from planned choices. Then the agent may be naive or sophisticated. But then he is likely to be “incoherent” — in particular, his choices will not correspond to a preference relation. Under special assumptions, naive and sophisticated choice are coherent if and only if they coincide, which happens when tastes are “essentially consistent”; otherwise, both are incoherent.
JSTOR link for paper


Total Discounted Demands and Long-Period Preferences Economic Record 52 (1976), 26–35.

Abstract:
Suppose a consumer’s tastes are not changing secularly, but merely fluctuating over short periods. Suppose the consumer predicts his fluctuating tastes. Suppose prices remain constant. Then the consumer’s present discounted demand vector maximizes a fixed long period utility function. But when some taste fluctuations are unforeseen, then in general, a long period average demand vector maximizes a long period utility function only if the short run Engel curves are linear and have constant slopes that are independent of the taste fluctuations.


Endogenous Tastes and Stable Long-Run Choice Journal of Economic Theory 13 (1976), 329–340.

Abstract:
Suppose that short-run preferences depend upon consumption one period earlier. Then there is an acyclic long-run strict preference relation iff, for every finite set, every conservative choice sequence converges. If long-run preferences are acyclic, then a unique long-run choice from a compact set is globally stable. If the long-run choice set includes multiple choices, there is a weaker stability property. Under special assumptions these results are extended to cases when the short-run consumption set is endogenous, and when more previous periods affect the present.
ScienceDirect link for paper


Dynamic Restrictions on Metastatic Choice Economica 44 (1977), 337–350.

Abstract:
Economists customarily reduce dynamic decision problems to the choice of a single policy, which lasts for all time. This “metastatic” approach ignores the possibilities for changing the original choice. Can metastatic choices always be expanded into consistent dynamic choices when these possibilities are allowed for, and regardless of what the structure of the dynamic decision problem is? For an individual, such an expansion is possible if and only if his choice function corresponds to an ordering. For social choices depending on individuals’ choices, the expansion is always possible if and only if Arrow’s condition of independence of irrelevant alternatives is satisfied.
JSTOR link for paper


Consequentialist Foundations for Expected Utility Theory and Decision 25 (1988), 25–78.

Abstract:
Behaviour norms are considered for decision trees which allow both objective probabilities and uncertain states of the world with unknown probabilities. Terminal nodes have consequences in a given domain. Behaviour is required to be consistent in subtrees. Consequentialist behaviour, by definition, reveals a consequence choice function independent of the structure of the decision tree. It implies that behaviour reveals a revealed preference ordering satisfying both the independence axiom and a novel form of sure-thing principle. Continuous consequentialist behaviour must be expected utility maximizing. Other familiar assumptions then imply additive utilities, subjective probabilities, and Bayes’ rule.
PDF version of preprint
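For reference, the independence axiom that consequentialist behaviour is shown to imply takes the familiar form (standard textbook notation, not taken from the paper): for lotteries p, q, r and any mixture weight α in (0,1],

```latex
% Independence: mixing both lotteries with a common third lottery r
% leaves the preference between them unchanged.
p \succsim q
\;\Longleftrightarrow\;
\alpha p + (1-\alpha)\, r \;\succsim\; \alpha q + (1-\alpha)\, r .
```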


Consequentialism, Structural Rationality and Game Theory in K.J. Arrow, E. Colombatto, M. Perlman, and C. Schmidt (eds.) The Rational Foundations of Economic Behaviour (IEA Conference Volume No. 114) (London: Macmillan, 1996) ch. 2, pp. 25–42.

Abstract:
Previous work on consequentialism (especially in Theory and Decision 1988, pp. 25–78) has provided some justification for regarding an agent’s behaviour as “structurally rational” if and only if there are subjective probabilities, and expected utility is maximized. The key axiom is that rational behaviour should be explicable as the choice of good consequences. This and other axioms will be re-assessed critically, together with their logical implications. Their applicability to behaviour in n-person games will also be discussed. The paper concludes with some discussion of modelling bounded rationality.
PDF version of preprint


Subjectively Expected State-Independent Utility on State-Dependent Consequence Domains in M.J. Machina and B. Munier (eds.) Beliefs, Interactions, and Preferences in Decision Making (Dordrecht: Kluwer Academic, 1999), pp. 7–21.

Abstract:
The standard decision theories of Savage and of Anscombe and Aumann both postulate that the domain of consequences is state independent. But this hypothesis makes no sense when, for instance, there is a risk of death or serious injury. The paper considers one possible way of deriving subjective probabilities and utilities in this case also. Moreover, the utilities will be state independent in the sense of giving equal value to any consequence that happens to occur in more than one state-dependent consequence domain. The key is to consider decision trees having “hypothetical” probabilities attached to states of nature, and even to allow hypothetical choices of these probabilities.
PDF file of preprint


Schumpeterian Innovation in Modelling Decisions, Games, and Economic Behaviour History of Economic Ideas XV (2007), 179–195.

Abstract:
Von Neumann's standard paradigm of a game in extensive form and Kolmogorov's standard model of a stochastic process both rely on constructing a fixed state space large enough to include all possible future eventualities. This allows a typical single-person decision problem to be represented as a decision tree. Yet not all eventualities can be foreseen. Also, any practical decision model must limit the state space rather severely. In this way the standard paradigm excludes not only Schumpeter's ideas regarding entrepreneurship, innovation and development, but also Shackle's “unexpected events”. This paper proposes an alternative approach using “decision jungles” with an evolving state space.
PDF file of reprint


Rationality and Dynamic Consistency under Risk and Uncertainty (with Horst Zank) in Mark J. Machina and W. Kip Viscusi (eds.) Handbook of the Economics of Risk and Uncertainty, Vol 1 (Oxford: North Holland, 2014) ch. 2, pp. 41–97.

Abstract:
For choice with deterministic consequences, the standard rationality hypothesis is ordinality — i.e., maximization of a weak preference ordering. For choice under risk (resp. uncertainty), preferences are assumed to be represented by the objectively (resp. subjectively) expected value of a von Neumann–Morgenstern utility function. For choice under risk, this implies a key independence axiom; under uncertainty, it implies some version of Savage's sure thing principle. This chapter investigates the extent to which ordinality, independence, and the sure thing principle can be derived from more fundamental axioms concerning behaviour in decision trees. Following Cubitt (1996), these principles include dynamic consistency, separability, and reduction of sequential choice, which can be derived in turn from one consequentialist hypothesis applied to continuation subtrees as well as entire decision trees. Examples of behavior violating these principles are also reviewed, as are possible explanations of why such violations are often observed in experiments.
PDF file of preprint



How Restrictive Are Information Partitions? (January 2005 revision).

Abstract:
Recently, several game theorists have questioned whether information partitions are appropriate. Bacharach (2005) has considered in particular more general information patterns which may not even correspond to a knowledge operator. Such patterns arise when agents lack perfect discrimination, as in Luce’s (1956) example of intransitive indifference. Yet after extending the state space to include what the agent knows, a modified information partition can still be constructed in a straightforward manner. The required modification introduces an extra set of impossible states into the partition. This allows a natural representation of the agent’s knowledge that some extended states are impossible.
PDF file of preprint




Game Theory and Consequentialism

The Core and Equilibrium through the Looking-Glass Australian Economic Papers 16 (1977), 211–218; Voluntary Contracts and Jam in the Far Future Australian Economic Papers 17 (1978), 363–364.

Abstract:
In a game in extensive form, a natural subgame arises at each node of the game tree, with the players’ strategies restricted to ensure that node is attained. A solution to the game may not be consistent with the solutions to these subgames. The core of the simplest possible two-period exchange economy is not consistent with the core of the second-period economy. Considering just equilibria of the normal form of the game, as the core does, neglects this aspect of a dynamic game, which arises whenever players cannot (unlike the White Queen in Through the Looking Glass) “remember” the future as well as the past.


Aspects of Rationalizable Behavior in K. Binmore, A.P. Kirman, and P. Tani (eds.) Frontiers of Game Theory (Cambridge, Mass.: MIT Press, 1993), ch. 14, pp. 277–305.

Abstract:
Equilibria in games involve common “rational” expectations, which are supposed to be endogenous. Apart from being more plausible, and requiring less common knowledge, rationalizable strategies may be better able than equilibria to capture the essential intuition behind both correlated strategies and forward induction. A version of Pearce’s “cautious” rationalizability allowing correlation between other players’ strategies is, moreover, equivalent to an iterated procedure for removing all strictly, and some weakly dominated strategies. Finally, as the effect of forward induction in subgames helps to show, the usual description of a normal form game may be seriously inadequate, since other considerations may render implausible some otherwise rationalizable strategies.
PDF file of preprint


Elementary Non-Archimedean Representations of Probability for Decision Theory and Games in P. Humphreys (ed.) Patrick Suppes: Scientific Philosopher, Vol. I: Probability and Probabilistic Causality (Kluwer Academic Publishers, 1994), ch. 2, pp. 25–59.

Abstract:
A fundamental problem in extensive form game theory is that, in order to tell whether a player has a better strategy than in a presumed equilibrium, one must know the other players’ equilibrium reactions to a counterfactual deviation with prior probability zero. Past work by Selten and Myerson has considered “trembling-hand” strategies. Here one particular space of “extended” probabilities is proposed and characterized in the following four equivalent ways: (i) as complete conditional probability systems considered by Rényi, Myerson, and others; (ii) as lexicographic hierarchies of probabilities considered by Blume, Brandenburger and Dekel; (iii) as extended logarithmic likelihood ratios considered by McLennan; and (iv) as certain “canonical rational probability functions” which represent trembles directly. However, one wants to describe adequately the joint probability distributions determined by compound lotteries, and also to distinguish all pairs of probability distributions over the consequences of decisions that, according to the “consequentialist” axioms of decision theory, should not be indifferent. To achieve this, it is shown that an extension to general rational probability functions is needed.
PDF file of preprint


Consequentialism and Bayesian Rationality in Normal Form Games in W. Leinfellner and E. Köhler (eds.) Game Theory, Experience, Rationality. Foundations of Social Sciences, Economics and Ethics. In honor of John C. Harsanyi. (Vienna Circle Institute Yearbook 5) (Kluwer Academic Publishers, 1998), pp. 187–196.

Abstract:
The consequentialist hypothesis requires the set of possible consequences of behaviour in any single-person decision tree to depend only on the feasible set of consequences. This implies that behaviour reveals a consequence choice function. Previous work has applied this hypothesis to dynamically consistent behaviour in an (almost) unrestricted domain of finite decision trees. Provided that behaviour is continuous as objective probabilities vary, that there is state independence, and that Anscombe and Aumann’s reversal of order axiom is satisfied, behaviour must be “Bayesian rational” in the sense of maximizing subjective expected utility. Moreover, null events are excluded, so strictly positive subjective probabilities must be attached to all states of the world.

For agents playing a multi-person game, it is not immediately clear that single-person decision theory can be applied because, as Mariotti (1996) has pointed out, changing the decision problem faced by any one player typically changes the entire game and so changes other players’ likely choices in the game. Nevertheless, by applying the consequentialist hypothesis to particular variations of any given game, the force of these objections can be considerably blunted. For any one player i, the variations involve a positive probability that the game changes to another in which player i faces an arbitrary finite decision tree. However, only player i knows whether the game has changed. Also, if the game does change, then only player i has any choice to make, and only player i is affected by whatever decision is taken. In effect, player i is then betting on the other players’ strategy choices in the original game. Moreover, there is no reason for these choices to change because, outside the original game, the other players have no reason to care what happens.

In this way, Bayesian rational behaviour in normal form games can be given a consequentialist justification. There is a need, however, to attach strictly positive probabilities to all other players’ strategies which are not ruled out as completely impossible and so irrelevant to the game. This suggests that strictly positive probabilities should be attached to all other players’ rationalizable strategies, at least — i.e., to all those that are not removed by iterative deletion of strictly dominated strategies.
PDF file of preprint


Consequentialism, Non-Archimedean Probabilities, and Lexicographic Expected Utility in C. Bicchieri, R. Jeffrey and B. Skyrms (eds.) The Logic of Strategy (Oxford University Press, 1999), ch. 2, pp. 39–66.

Abstract:
Earlier work (Hammond, 1988a, b) on dynamically consistent “consequentialist” behaviour in decision trees was unable to treat zero probability events satisfactorily. Here the rational probability functions considered in Hammond (1994), as well as other non-Archimedean probabilities, are incorporated into decision trees. As before, the consequentialist axioms imply the existence of a preference ordering satisfying independence. In the case of rational probability functions, those axioms, together with continuity and a new refinement assumption, imply the maximization of a somewhat novel lexicographic expected utility preference relation. This is equivalent to maximization of expected utility in the ordering of the relevant non-Archimedean field.
PDF file of preprint


Non-Archimedean Subjective Probabilities in Decision Theory and Games Stanford University Department of Economics Working Paper No. 97-038; abbreviated version published in Mathematical Social Sciences 38 (1999), 139–156.

Abstract:
To allow conditioning on counterfactual events, zero probabilities can be replaced by infinitesimal probabilities that range over a non-Archimedean ordered field. This paper considers a suitable minimal field that is a complete metric space. Axioms similar to those in Anscombe and Aumann (1963) and in Blume, Brandenburger and Dekel (1991) are used to characterize preferences which: (i) reveal unique non-Archimedean subjective probabilities within the field; and (ii) can be represented by the non-Archimedean subjective expected value of any real-valued von Neumann–Morgenstern utility function in a unique cardinal equivalence class, using the natural ordering of the field.
PDF file of working paper
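To give a flavour of such infinitesimal probabilities (my illustration, not an example from the paper): in an ordered field containing an infinitesimal ε, a counterfactual event E can receive a probability such as

```latex
% An infinitesimal probability: strictly positive, yet smaller than every
% positive real number, so conditioning on E remains well defined.
\pi(E) \;=\; \varepsilon - \varepsilon^{2},
\qquad \text{with} \quad 0 < \pi(E) < r \ \ \text{for every real } r > 0 .
```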


Expected Utility in Non-Cooperative Game Theory in S. Barberà, P.J. Hammond, and C. Seidl (eds.) Handbook of Utility Theory, Vol. 2: Extensions (Boston: Kluwer Academic Publishers, 2004) ch. 18, pp. 982–1063.

Abstract:
This sequel to previous chapters on objective and subjective expected utility reviews conditions for players in a non-cooperative game to be Bayesian rational — i.e., to choose a strategy maximizing the expectation of each von Neumann–Morgenstern utility function in a unique cardinal equivalence class. In classical Nash equilibrium theory, players’ mixed strategies involve objective probabilities. In the more recent rationalizability approach pioneered by Bernheim and Pearce, players’ possibly inconsistent beliefs about other players’ choices are described by unique subjective probabilities. So are their beliefs about other players’ beliefs, etc. Trembles, together with various notions of perfection and properness, are seen as motivated by the need to exclude zero probabilities from players’ decision trees. The work summarized here, however, leaves several foundational issues unsatisfactorily resolved.
PDF file of preprint


Utility Invariance in Non-Cooperative Games in U. Schmidt and S. Traub (eds.) Advances in Public Economics: Utility, Choice, and Welfare: A Festschrift for Christian Seidl (Springer Verlag, 2005), pp. 31–50.

Abstract:
Game theory traditionally specifies players’ numerical payoff functions. Following the concept of utility invariance in modern social choice theory, this paper explores what is implied by specifying equivalence classes of utility function profiles instead. Within a single game, utility transformations that depend on other players’ strategies preserve players’ preferences over their own strategies, and so most standard non-cooperative solution concepts. Quantal responses and evolutionary dynamics are also considered briefly. Classical concepts of ordinal and cardinal non-comparable utility emerge when the solution is required to be invariant for an entire class of “consequentialist game forms” simultaneously.
PDF file of preprint


Beyond Normal Form Invariance: First Mover Advantage in Two-Stage Games with or without Predictable Cheap Talk, in Prasanta Pattanaik, Koichi Tadenuma, Yongsheng Xu, and Naoki Yoshihara (eds.) Rational Choice and Social Welfare: Theory and Applications (Essays in Honor of Kotaro Suzumura) (Berlin: Springer, 2008), pp. 215–233.

Abstract:
Von Neumann (1928) not only introduced a fairly general version of the extensive form game concept. He also hypothesized that only the normal form was relevant to rational play. Yet even in Battle of the Sexes, this hypothesis seems contradicted by players' actual behaviour in experiments. Here a refined Nash equilibrium is proposed for games where one player moves first, and the only other player moves second without knowing the first move. The refinement relies on a tacit understanding of the only credible and straightforward perfect Bayesian equilibrium in a corresponding game allowing a predictable direct form of cheap talk.
PDF file of preprint


Isolation, Assurance and Rules: Can Rational Folly Supplant Foolish Rationality? in Kaushik Basu and Ravi Kanbur (eds.) Arguments for a Better World: Essays in Honor of Amartya Sen: Volume I, Ethics, Welfare, and Measurement (Oxford: Oxford University Press, 2009), ch. 28, pp. 523–534.

Abstract:
Consider an “isolation paradox” game with many identical players. By definition, conforming to a rule which maximizes average utility is individually a strictly dominated strategy. Suppose, however, that some players think “quasi-magically” in accordance with evidential (but not causal) decision theory. That is, they act as if others’ disposition to conform, or not, is affected by their own behavior, even though they do not actually believe there is a causal link. Standard game theory excludes this. Yet such “rational folly” can sustain “rule utilitarian” cooperative behavior. Comparisons are made with Newcomb’s problem, and with related attempts to resolve prisoner’s dilemma.
PDF file of preprint




Consequentialist Social Choice and Utilitarian Ethical Theory

Consequentialist Demographic Norms and Parenting Rights Social Choice and Welfare 5 (1988), 127–145; also in W. Gaertner and P.K. Pattanaik (eds.) Distributive Justice and Inequality (Berlin: Springer-Verlag, 1988), pp. 39–57.

Abstract:
This paper extends the author’s recent work on dynamically consistent consequentialist social norms for an unrestricted domain of decision trees with risk to trees in which the population is a variable consequence — i.e., endogenous. Given a form of ethical liberalism and ethical irrelevance of distant ancestors, classical utilitarianism is implied (provided also that a weak continuity condition is met). The “repugnant conclusion” that having many poor people may be desirable can be avoided by denying that individuals’ interests extend to the circumstances of their birth. But it is better avoided by recognizing that potential parents have legitimate interests concerning the sizes of their families.


Consequentialist Decision Theory and Utilitarian Ethics in F. Farina, F. Hahn, and S. Vannucci (eds.) Ethics, Rationality, and Economic Behaviour (Oxford: Clarendon Press, 1996), pp. 92–118.

Abstract:
Suppose that a social behaviour norm specifies ethical decisions at all decision nodes of every finite decision tree whose terminal nodes have consequences in a given domain. Suppose too that behaviour is both consistent in subtrees and continuous as probabilities vary. Suppose that the social consequence domain consists of profiles of individual consequences defined broadly enough so that only individuals’ random consequences should matter, and not the structure of any decision tree. Finally, suppose that each individual has a “welfare behaviour norm” coinciding with the social norm for decision trees where only that individual’s random consequences are affected by any decision. Then, after suitable normalizations, the social norm must maximize the expected value of a sum of individual welfare functions over the feasible set of random consequences. Moreover, individuals who never exist can be accorded a zero welfare level provided that any decision is acceptable on their behalf. These arguments lead to a social objective whose structural form is that of classical utilitarianism, even though individual welfare should probably be interpreted very differently from classical utility.
PDF file of preprint




Social Choice: General

Social Welfare Functionals on Restricted Domains and in Economic Environments (with Georges Bordes and Michel Le Breton) Journal of Public Economic Theory 7 (2005), 1–25.

Abstract:
Arrow’s “impossibility” and similar classical theorems are usually proved for an unrestricted domain of preference profiles. Recent work extends Arrow’s theorem to various restricted but “saturating” domains of privately oriented, continuous, (strictly) convex, and (strictly) monotone “economic preferences” for private and/or public goods. For strongly saturating domains of more general utility profiles, this paper provides similar extensions of Wilson’s theorem and of the strong and weak “welfarism” results due to d’Aspremont and Gevers and to Roberts. Hence, for social welfare functionals with or without interpersonal comparisons of utility, most previous classification results in social choice theory apply equally to strongly saturating economic domains.
PDF file of preprint


Roberts’ Weak Welfarism Theorem: A Minor Correction Stanford University Department of Economics Working Paper No. 99-021.

Abstract:
Roberts’ “weak neutrality” or “weak welfarism” theorem concerns Sen social welfare functionals which are defined on an unrestricted domain of utility function profiles and satisfy independence of irrelevant alternatives, the Pareto condition, and a form of weak continuity. Roberts claimed that the induced welfare ordering on social states has a one-way representation by a continuous, monotonic real-valued function defined on the Euclidean space of interpersonal utility vectors. A counter-example shows that weak continuity is insufficient; a minor strengthening to pairwise continuity is proposed instead and its sufficiency demonstrated.
PDF file




Social Choice with Interpersonal Comparisons

Equity, Arrow’s Conditions, and Rawls’ Difference Principle Econometrica 44 (1976), 793–804; reprinted in K.J. Arrow and G. Debreu (eds.), Landmark Papers in General Equilibrium Theory, Social Choice and Welfare (Edward Elgar, 2002), ch. 35, pp. 679–690; and (without appendix) in F.H. Hahn and M. Hollis (eds.), Philosophy and Economic Theory (Oxford University Press, 1979), ch. X, pp. 155–163.

Abstract:
An Arrow social welfare function was designed not to incorporate any interpersonal comparisons. But some notions of equity rest on interpersonal comparisons. It is shown that a generalized social welfare function, incorporating interpersonal comparisons, can satisfy modifications of the Arrow conditions, and also a strong version of an equity axiom due to Sen. One such generalized social welfare function is the lexicographic form of Rawls’ difference principle or maximin rule. This kind of generalized social welfare function is the only kind satisfying the modified Arrow conditions, the equity axiom, and a condition which underlies Suppes’ grading principle.
JSTOR link for paper
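The lexicographic form of Rawls’ difference principle (“leximin”) mentioned in the abstract can be illustrated with a short sketch (my code, not the paper’s; the function name and the encoding of welfare levels as plain numbers are my own assumptions): rank each welfare vector from worst off to best off, then compare rank by rank, so the worst-off position is decisive first.

```python
# Illustrative leximin comparison between two welfare vectors.

def leximin_prefers(u, v):
    """True if welfare vector u is strictly leximin-preferred to v."""
    for a, b in zip(sorted(u), sorted(v)):
        if a != b:          # first rank at which the ordered vectors differ
            return a > b    # the worse-off position decides
    return False            # identical rank vectors: social indifference

# The worst-off positions are tied at 1, so the next-worst rank decides:
print(leximin_prefers([1, 4, 4], [1, 3, 5]))  # True
```

Unlike pure maximin, which looks only at the worst-off position, leximin breaks ties by moving up the ranking one position at a time.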


Why Ethical Measures of Inequality Need Interpersonal Comparisons Theory and Decision 7 (1976), 263–274.

Abstract:
An ethical measure of income inequality corresponds to a social ordering of income distributions. Without interpersonal comparisons, the only possible social orderings are dictatorial, so there can be no ethical inequality measure. Interpersonal comparisons allow a very much richer set of possible social orderings, and the construction of ethical measures of inequality.


Equity in Two-Person Situations: Some Consequences Econometrica 47 (1979), 1127–1135.

Abstract:
Suppose that social choice is based on interpersonal comparisons of welfare levels. Suppose too that, whenever all but two persons are indifferent between two options, a choice is made between these options which is equitable, in some sense. Then provided that individual welfare functions are unrestricted, and social choice is independent of irrelevant alternatives, it follows that social choice is always equitable, in the same sense. This applies when equity means satisfying Suppes’ indifference rule, or Suppes’ original justice criterion, or the lexicographic extension of Rawls’ difference principle.
JSTOR link for paper


Independence of Irrelevant Interpersonal Comparisons Social Choice and Welfare 8 (1991), 1–19.

Abstract:
Arrow’s independence of irrelevant alternatives (IIA) condition makes social choice depend only on personal rather than interpersonal comparisons of relevant social states, and so leads to dictatorship. Instead, a new “independence of irrelevant interpersonal comparisons” (IIIC) condition allows anonymous Paretian social welfare functionals such as maximin and Sen’s “leximin,” even with an unrestricted preference domain. But when probability mixtures of social states are considered, even IIIC may not allow escape from Arrow’s impossibility theorem for individuals’ (ex-ante) expected utilities. Modifying IIIC to permit dependence on interpersonal comparisons of relevant probability mixtures allows Vickrey–Harsanyi utilitarianism.
PDF file of preprint


Interpersonal Comparisons of Utility: Why and how they are and should be made in J. Elster and J.E. Roemer (eds.), Interpersonal Comparisons of Well-Being (Cambridge: Cambridge University Press, 1991), ch. 7, pp. 200–254; reprinted in A.P. Hamlin (ed.) Ethics and Economics, Vol. I (Edward Elgar, 1996), ch. 22, pp. 410–464.

Abstract:
A satisfactory complete normative criterion for individualistic ethical decision-making under uncertainty such as Harsanyi’s (Journal of Political Economy 1955) requires a single fundamental utility function for all individuals which is fully interpersonally comparable. The paper discusses reasons why interpersonal comparisons of utility (ICUs) have been eschewed in the past and argues that most existing approaches, both empirical and ethical, to ICUs are flawed. Either they confound facts with values, or they are based on unrealistic hypothetical decisions in an “original position”. Instead ICUs need to be recognized for what they really are — preferences for different kinds of people.
PDF file of preprint


Harsanyi’s Utilitarian Theorem: A Simpler Proof and Some Ethical Connotations in R. Selten (ed.) Rational Interaction: Essays in Honor of John Harsanyi (Berlin: Springer-Verlag, 1992), pp. 305–319.

Abstract:
Harsanyi’s utilitarian theorem states that the social welfare function is the weighted sum of individuals’ utility functions if: (i) society maximizes expected social welfare; (ii) individuals maximize expected utility; (iii) society is indifferent between two probability distributions over social states whenever all individuals are. After giving a simpler proof, an alternative axiomatic foundation for Vickrey–Harsanyi utilitarianism is provided. By using an extended version of Harsanyi’s concept of a player’s “type” in the theory of games with incomplete information, the problem of forming social objectives when there is incomplete information can also be resolved, at least in principle.
PDF file of preprint
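The theorem's conclusion can be written compactly (a standard statement of Harsanyi's aggregation theorem, with notation of my own rather than the paper's): under conditions (i)–(iii), social welfare is an affine combination of individual utilities.

```latex
% If W and each u_i satisfy the expected-utility hypothesis,
% and society is indifferent whenever all individuals are (iii),
% then there exist weights \lambda_i \ge 0 and a constant c with
W(x) \;=\; \sum_{i=1}^{n} \lambda_i\, u_i(x) \;+\; c
\qquad \text{for all social states } x .
```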


Interpersonally Comparable Utility (with Marc Fleurbaey) in S. Barberà, P.J. Hammond, and C. Seidl (eds.) Handbook of Utility Theory, Vol. 2: Extensions (Boston: Kluwer Academic Publishers, 2004) ch. 21, pp. 1181–1285.

Abstract:
This chapter supplements the earlier reviews in Hammond (1991a) and Suzumura (1996) by concentrating on four issues. The first is that in welfare economics interpersonal comparisons are only needed to go beyond Pareto efficiency or Pareto improvements. The second concerns the need for interpersonal comparisons in social choice theory, to escape Arrow’s impossibility theorem. The third issue is how to revise Arrow’s independence of irrelevant alternatives condition so that interpersonal comparisons can be accommodated. Finally, and most important, the chapter presents a form of utilitarianism in which interpersonal comparisons can be interpreted as ethical preferences for different personal characteristics.
PDF file of preprint




Social Choice with Individual and Group Rights

Social Choice of Individual and Group Rights in W.A. Barnett, H. Moulin, M. Salles, and N. Schofield (eds.), Social Choice, Welfare, and Ethics (Cambridge: Cambridge University Press, 1995), ch. 3, pp. 55–77.

Abstract:
Individual rights can generally be respected if and, except in rare special cases, only if they apply to independent components of a Cartesian product space of social states, and also each individual is indifferent to how others exercise their rights. This is true whether or not the Pareto criterion is satisfied. Group rights can also be respected if they apply to the independent components for the different individual members of the group. This holds not only for social choice rules, but also for outcomes that arise when individuals and groups use equilibrium strategies in some game form. So only exceptionally is it possible to respect all rights. The paper concludes by considering different ways of including rights in the social states which are the object of individual preference and of social choice.
PDF file of preprint


Game Forms versus Social Choice Rules as Models of Rights in K.J. Arrow, A.K. Sen, and K. Suzumura (eds.) Social Choice Re-examined, Vol. II (IEA Conference Volume No. 117) (London: Macmillan, 1997) ch. 11, pp. 82–95.

Abstract:
The paper begins by defining multi-valued game forms, which generalize social choice rules, together with the effectiveness relations that represent the rights induced by such game forms. When one-way instead of two-way rights are allowed, yet further difficulties arise in finding a social choice rule that respects all rights. Finally, to deal with the problem that game forms typically contain arbitrary features of no consequence to a society or to its individual members, it is suggested that both social choice and individual values should be defined over sets consisting of “consequentially equivalent” classes of strategic game forms.
PDF file of preprint


Rights, Free Exchange, and Widespread Externalities in J.-F. Laslier, M. Fleurbaey, N. Gravel and A. Trannoy (eds.) Freedom in Economics: New Perspectives in Normative Analysis (London and New York: Routledge, 1998), ch. 11, pp. 139–157.

Abstract:
Sen’s libertarian paradox is ascribed to the inevitable conflict between the Pareto criterion and individuals’ rights to create negative externalities. Finite coalitions can effect exchanges of rights through Coaseian bargains in order to resolve inefficiencies due to local externalities. With a continuum of agents, however, finite coalitions are powerless to affect widespread externalities, except those that are regulated by policies such as inefficiently allocated quotas. Then finite coalitions may gain by exchanging such quotas, but Pareto improvements may require originally unused quotas to be confiscated. Thus, the voluntary exchange of rights may exacerbate widespread externalities.
PDF file of preprint


Equal Rights to Trade and Mediate Social Choice and Welfare (Special issue on “The axiomatic theory of resource allocation — In honour of Louis Gevers” edited by C. d’Aspremont and F. Maniquet) 21 (2003), 181–193.

Abstract:
For economies with a fixed finite set of traders, few results characterize Walrasian equilibria by their social choice properties. Pareto efficient allocations typically require lump-sum transfers. Other characterizations based on the core or strategyproofness apply only when, as in continuum economies, agents cannot influence prices strategically. Or the results concern social choice with a variable number of agents. This paper considers allocations granting agents equal rights to choose net trade vectors within a convex cone and, in order to exclude autarky, an additional right to mediate mutually beneficial transactions. Under standard assumptions, these properties characterize Walrasian equilibria without transfers.
PDF file of preprint; Springer link




Distributional Objectives in Welfare Economics

Dual Interpersonal Comparisons of Utility and the Welfare Economics of Income Distribution Journal of Public Economics 7 (1977), 51–71.

Abstract:
Interpersonal comparisons can be of utility levels and/or of utility differences. Comparisons of levels can be used to define equity in distributing income. Comparisons of differences can be used to construct an additive Bergson social welfare function over income distributions. When both utility levels and utility differences are compared, one can require the constructed additive Bergson social welfare function to indicate a preference for more equitable income distributions. This restricts the form of both the individual utility functions and the optimal distribution of income. The form of these restrictions depends on whether the levels and differences of the same utility functions are being compared.
ScienceDirect link.

... : A Corrigendum Journal of Public Economics 14 (1980), 105–6.
ScienceDirect link.


Progress in the Theory of Social Choice and Distributive Justice in S. Zandvakili (ed.) Research in Economic Inequality, Vol. 7: Inequality and Taxation, pp. 87–106; revised version of English original whose Italian translation was published in L. Sacconi (ed.) La decisione: Razionalità collettiva e strategie nell’amministrazione e nelle organizzazioni (Milano: Franco Angeli, 1986), ch. 3, pp. 89–106.

Abstract:
By definition, “consequentialist” behaviour in finite decision trees is explicable by its consequences. Both cost–benefit tests and “consequentialist” choices of economic policy necessarily require distributional judgements. These should emerge from a social welfare objective incorporating interpersonal comparisons. To accommodate them, Arrow’s IIA condition should be weakened to independence of ethically irrelevant alternatives. When consequences are risky, dynamically consistent consequentialist behaviour on an unrestricted domain of finite decision trees entails maximizing expected social welfare. Combined with an “ethical liberalism” condition, this leads to “fundamental” utilitarianism, which requires a further weakening of IIA to independence of ethically irrelevant mixed alternatives.
PDF file of preprint




General Equilibrium Theory

Irreducibility, Resource Relatedness, and Survival in Equilibrium with Individual Non-Convexities in R. Becker, M. Boldrin, R. Jones, and W. Thomson (eds.) General Equilibrium, Growth, and Trade II: The Legacy of Lionel W. McKenzie (San Diego: Academic Press, 1993), ch. 4, pp. 73–115.

Abstract:
Standard results in general equilibrium theory, such as existence and the second efficiency and core equivalence theorems, are most easily proved for compensated equilibria. A new condition establishes that, even with individual non-convexities, in compensated equilibrium any agent with a cheaper feasible net trade is also in uncompensated equilibrium. Some generalizations of McKenzie’s irreducibility assumption are then presented. They imply that (almost) no agent is at a cheapest point, so the easier and more general results for compensated equilibria become true for uncompensated equilibria. Survival of all consumers in uncompensated equilibrium also depends on satisfying an additional assumption that is similar to irreducibility.
PDF file of preprint


Walrasian Equilibrium without Survival: Equilibrium, Efficiency, and Remedial Policy (with Jeffrey L. Coles) in K. Basu, P.K. Pattanaik, and K. Suzumura (eds.) Choice, Welfare and Development: A Festschrift in Honour of Amartya K. Sen (Oxford: Oxford University Press, 1995), ch. 3, pp. 32–64.

Abstract:
Standard general equilibrium theory excludes starvation by assuming that everybody can survive without trade. Because trade cannot harm consumers, they can therefore also survive with trade. Here this assumption is abandoned, and equilibria in which not everybody survives are investigated. A simple example is discussed, along with possible policies which might reduce starvation. Thereafter, for economies with a continuum of agents, the usual results are established — existence of equilibrium, the two fundamental efficiency theorems of welfare economics, and core equivalence. Their validity depends on some special but not very stringent assumptions needed to deal with natural non-convexities in each consumer’s feasible set.
PDF file of preprint


Efficiency with Non-Convexities: Extending the ‘Scandinavian Consensus’ Approaches (with Antonio Villar) Scandinavian Journal of Economics 100 (1998), 11–32; also in T.M. Andersen and K.O. Moene (eds.) Public Policy and Economic Theory (Oxford: Blackwell, 1998), pp. 11–32.

Abstract:
There are two distinct “Scandinavian consensus” approaches to public good supply, both based on agents’ willingness to pay. A Wicksell–Foley public competitive equilibrium arises from a negative consensus in which no change of public environment, together with associated taxes and subsidies which finance it, will be unanimously approved. Alternatively, in a Lindahl or valuation equilibrium, charges for the public environment induce a positive consensus. To allow general non-convexities to be regarded as aspects of the public environment, we extend recent generalizations of these equilibrium notions and prove counterparts to both the usual fundamental efficiency theorems of welfare economics.
PDF file of preprint


Valuation Equilibrium Revisited (with Antonio Villar) in A. Alkan, C.D. Aliprantis, and N.C. Yannelis (eds.) Current Trends in Economics: Theory and Applications: Proceedings of the Third International Meeting of the Society for the Advancement of Economic Theory (Berlin: Springer, 1999), pp. 201–214.

Abstract:
This paper extends the notion of valuation equilibrium which applies to market economies involving the choice of a public environment. Unlike some other recent work, it is assumed here that consumers and firms evaluate alternative environments taking market prices as given (hence this notion is closer to that of competitive equilibria). It is shown that valuation equilibria with balanced tax schemes yield efficient allocations and that efficient allocations can be decentralized as valuation equilibria, with tax schemes that may be unbalanced.


Competitive Market Mechanisms as Social Choice Procedures in K.J. Arrow, A.K. Sen and K. Suzumura (eds.) Handbook of Social Choice and Welfare, Vol. II (Amsterdam: North-Holland, 2011), ch. 15, pp. 47–151.

Abstract:
A competitive market mechanism is a prominent example of a non-binary social choice rule, typically defined for a special class of economic environments in which each social state is an economic allocation of private goods, and individuals' preferences concern only their own personal consumption. This chapter begins by discussing which Pareto efficient allocations can be characterized as competitive equilibria with lump-sum transfers. It also discusses existence and characterization of such equilibria without lump-sum transfers. The second half of the chapter focuses on continuum economies, for which such characterization results are much more natural given that agents have negligible influence over equilibrium prices.
PDF file of preprint




Gains from Trade and Migration

Limits to the Potential Gains from Economic Integration and Other Supply Side Policies (with Jaume Sempere) Economic Journal 105 (1995), 1180–1204.

Abstract:
Classical welfare economics demonstrates potential Pareto improvements from “supply side” policy changes that increase the efficiency of aggregate production. Special cases reviewed here concern market integration through customs unions and the gains from international trade. These classical results require incentive incompatible lump-sum transfers. Generally, other policies must compensate deserving losers. Following Dixit and Norman, we consider a freeze of consumer post-tax prices, wages and dividends, with tax rates and producer prices left to clear markets. Actual Pareto improvements are then generated by uniform poll subsidies. With appropriately distributed external tariff revenue, neither international transfers nor free disposal are required.
JSTOR link for paper


Gains from Trade versus Gains from Migration: What Makes Them So Different? (with Jaume Sempere) Journal of Public Economic Theory 8 (2006), 145–170.

Abstract:
Would unrestricted “economic” migration enhance the potential gains from free trade? With free migration, consumers’ feasible sets become non-convex. Under standard assumptions, however, Walrasian equilibrium exists for a continuum of individuals with dispersed ability to afford each of a finite set of possible migration plans. Then familiar conditions ensuring potential Pareto gains from trade also ensure that free migration generates similar supplementary gains, relative to an arbitrary status quo. As with the gains from customs unions, however, wealth may have to be redistributed across international borders.
PDF file of preprint


Migration with Local Public Goods and the Gains from Changing Places (with Jaume Sempere) Economic Theory 41 (2009), 359–377.

Abstract:
For an economy without public goods, in Hammond and Sempere (2006) we show that, under fairly standard assumptions, freeing migration would enhance the potential Pareto gains from free trade. This paper presents a generalization allowing local public goods subject to congestion. Our new result relies on policies which, unlike in the standard literature on fiscal externalities, fix both local public goods and congestion levels at their status quo values. Such policies allow constrained efficient and potentially Pareto improving population exchanges regulated only through appropriate residence charges, which can be regarded as Pigouvian congestion taxes.
PDF file of preprint; Springer link




Widespread Externalities and the f-Core

Continuum Economies with Finite Coalitions: Core, Equilibrium and Widespread Externalities (with Mamoru Kaneko and Myrna Holtz Wooders) Journal of Economic Theory 49 (1989), 113–134.

Abstract:
We develop a new model of a continuum economy with coalitions consisting of only finite numbers of agents. The core, called the f-core, is the set of allocations that are stable against improvement by finite coalitions and feasible by trade within finite coalitions. Even with widespread externalities — preferences depend on own consumptions and also on the entire allocation up to the null set — we obtain the result that the f-core coincides with the Walrasian allocations. Without widespread externalities, the f-core, the Aumann core, and the Walrasian allocations all coincide; however, with widespread externalities there is no obvious natural definition of the Aumann core.
ScienceDirect link


Four Characterizations of Constrained Pareto Efficiency in Continuum Economies with Widespread Externalities Japanese Economic Review 46 (1995), 103–124.

Abstract:
In continuum economies, widespread externalities are those over which each individual has negligible control. Nash–Walrasian equilibria with lump-sum transfers are defined, and their existence proved. They are then characterized by the property of “f-constrained Pareto efficiency” for finite coalitions. More general “private good” Nash–Walrasian equilibria are characterized as private good constrained Pareto efficient. Introducing complete Pigou taxes or subsidies leads to equilibria that are characterized by constrained efficiency and f-constrained efficiency for given levels of the widespread externalities. But full efficiency requires resolving the public good problem of determining those aggregate externalities or, equivalently, of setting appropriate Pigou prices.
PDF file of preprint


History as a Widespread Externality in Some Arrow–Debreu Market Games in G. Chichilnisky (ed.), Markets, Information and Uncertainty: Essays in Economic Theory in Honor of Kenneth J. Arrow (Cambridge: Cambridge University Press, 1999) ch. 16, pp. 328–361.

Abstract:
Two Arrow–Debreu market games are formulated whose straightforward Nash equilibria are Walrasian. Both have an auctioneer setting prices to maximize net sales value. In the second an additional redistributive agency maximizes welfare through optimal lump-sum transfers. In intertemporal economies, however, subgame imperfections can arise because agents understand how current decisions such as those determining investment influence either future prices (with finitely many agents), or future redistribution (even in continuum economies). The latter observation undermines the second efficiency theorem of welfare economics. Indeed, when the state of the economy affects future policy, it functions like a “widespread externality.”
PDF file of preprint


On f-Core Equivalence in a Continuum Economy with General Widespread Externalities Journal of Mathematical Economics 32 (1999), 177–184.

Abstract:
This paper partially extends the f-core equivalence theorem of Hammond, Kaneko and Wooders (1989) for continuum economies with widespread externalities — i.e., those over which each individual has negligible control. Externalities need not result directly from trading activities. Neither free disposal of divisible goods nor monotone preferences are assumed. Instead, a slightly strengthened form of local non-satiation suffices. However, in general it is proved only that any f-core allocation is a compensated Nash–Walrasian equilibrium. Finally, the proof uses an elementary argument which does not rely on Lyapunov’s theorem or convexity of the integral of a correspondence w.r.t. a non-atomic measure.
PDF file of preprint


History: Sunk Cost, or Widespread Externality? Rivista Internazionale di Scienze Sociali (2007), n. 2, 161–185.

Abstract:
In an intertemporal Arrow–Debreu economy with a continuum of agents, suppose that the auctioneer sets prices while the government institutes optimal lump-sum transfers period by period. An earlier paper showed how subgame imperfections arise because agents understand how their current decisions, such as those determining investment, will influence future lump-sum transfers. This observation undermines the second efficiency theorem of welfare economics and makes “history” a widespread externality. A two-period model is used to investigate the constrained efficiency properties of different kinds of equilibrium. Possibilities for remedial policy are also discussed.
PDF file of preprint


The Power of Small Coalitions in Large Economies Stanford University, Institute of Mathematical Studies in the Social Sciences, Economics Technical Report No. 501; in M.O.L. Bacharach, M.A.H. Dempster and J.L. Enos (eds.) Mathematical Models in Economics (1990), chapter 3; volume available online (2010) at this link.

Abstract:
As long as coalitions eventually become large, the Debreu–Scarf limit theorem for the core holds even if coalitions are restricted in size so that their proportion of agents shrinks to zero as the economy becomes infinitely large. Corresponding results hold for non-replica economies. In a limiting continuum economy, the core equivalence theorem holds even if there must be a “measure-consistent” partition of a coalition into self-sufficient subcoalitions each with a finite number of agents. These results help relate standard results to those presented in collaboration with Kaneko and Wooders concerning finite coalitions in continuum economies.
PDF file of preprint; PDF file of online article





Equilibrium in Incomplete Markets

Overlapping Expectations and Hart’s Conditions for Equilibrium in a Securities Model Journal of Economic Theory 31 (1983), 170–175; reprinted in J.-M. Grandmont (ed.), Temporary Equilibrium: Selected Readings (New York: Academic Press, 1988), pp. 156–161.

Abstract:
Hart (J. Econ. Theory 9 (1974), 293–311) gave conditions for equilibrium to exist in a securities model where each agent undertakes asset transactions to maximize expected utility of wealth. These conditions rule out agents wanting to undertake unbounded balanced transactions to reach a Pareto superior allocation given their expectations. With mild extra assumptions to make agents unwilling to risk incurring unbounded losses on their portfolios, Hart’s conditions become equivalent to an assumption of “overlapping expectations,” which is comparable to a much weaker form of Green’s “common expectations” (Econometrica 41 (1973), 1103–1124).
ScienceDirect link




Private Information and Incentive Constraints

The Implementation of Social Choice Rules: Some General Results on Incentive Compatibility (with Partha S. Dasgupta and Eric S. Maskin) Review of Economic Studies (Symposium on Incentive Compatibility), 46 (1979), 185–216.

Abstract:
A social choice rule f is a correspondence which associates with each possible configuration φ of individuals’ characteristics and each feasible set of alternatives A a choice set f(φ, A) ⊂ A, interpreted as the welfare optima of A. f is said to be implemented by a game form g if the equilibrium outcome set of g (with respect to the selected solution concept) is nonempty and contained in f(φ, A) for all φ and A. The paper studies in detail the general question of the implementability of social choice rules for four well-known solution concepts: dominant strategy equilibrium, Bayesian equilibrium, maximin equilibrium, and Nash equilibrium.
JSTOR link for paper
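The implementation condition in the abstract can be condensed as follows (a restatement of the abstract's definition, with E_g denoting the equilibrium outcome set for the chosen solution concept, a symbol introduced here for brevity):

```latex
% g implements f if, for every profile \varphi and feasible set A,
% the equilibrium outcome set of the game form g is nonempty and
% selects only welfare optima:
\emptyset \;\neq\; E_g(\varphi, A) \;\subseteq\; f(\varphi, A) \;\subseteq\; A .
```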


Straightforward Individual Incentive Compatibility in Large Economies Review of Economic Studies (Symposium on Incentive Compatibility), 46 (1979), 263–282.

Abstract:
It is intuitively obvious and generally known that the competitive resource allocation mechanism is, in a private economy with a nonatomic measure space of agents, individually incentive compatible. This paper characterizes the entire class of individually incentive compatible mechanisms in such economies as those which can be decentralized by allowing each agent to choose his own allocation from a constraint set that is independent of his own characteristics. Conditions are given for the competitive mechanism, without lump-sum transfers, to be the only Pareto satisfactory incentive compatible mechanism. A particular kind of “fair” Lindahl equilibrium has parallel properties in a corresponding economy with public goods.
JSTOR link for paper


Fully Progressive Taxation (with Partha S. Dasgupta) Journal of Public Economics 13 (1980), 141–154.

Abstract:
In the Mirrlees model of optimal income taxation, each worker’s true ability can be inferred from his income and the hours he works. Yet, a fully optimal ability tax is incentive incompatible. We show that, if both consumption and leisure are normal goods in each worker’s common utility function, the optimal incentive compatible allocation is the maximin optimum, with each worker enjoying the same utility level. This allocation can be implemented by a tax on ability as revealed by the most skilled kind of labor the worker offers, or by a tax on his average productivity.
ScienceDirect link


Markets as Constraints: Multilateral Incentive Compatibility in Continuum Economies Review of Economic Studies 54 (1987), 399–412; extended abstract in P. Kleinschmidt and F.J. Radermacher (eds.), Proceedings of the SOR Conference, Passau, 1987 pp. 57–8.

Abstract:
A symmetric allocation in a continuum is “multilaterally incentive compatible” if no finite coalition of privately informed agents can manipulate it by combining deception with hidden trades of exchangeable goods. Sufficient conditions for multilateral incentive compatibility are that all agents face the same linear prices for exchangeable goods, and that indistinguishable agents face identical budget sets. The same conditions are necessary under assumptions which extend those under which the second efficiency theorem of welfare economics holds in a continuum economy. Markets for exchangeable goods emerge as binding constraints on the set of Pareto efficient allocations with private information.
JSTOR link for paper


Incentives and Allocation Mechanisms in R. van der Ploeg (ed.), Advanced Lectures in Quantitative Economics (New York: Academic Press, 1990), ch. 6, pp. 213–248.

Abstract:
When individuals are privately informed of their own abilities and tastes, many first-best Pareto efficient allocations become infeasible. True feasibility requires an allocation to emerge from a game form with incomplete information. In English auctions when bidders know their willingness to pay, they have dominant strategies and the equilibrium outcome depends only on their willingness to pay. But in Dutch auctions bidders’ beliefs about each other are also important. A generalized version of the “revelation principle” is demonstrated. Random dominant strategy mechanisms for simple finite economies are characterized by means of simple linear inequalities.


On the Impossibility of Perfect Capital Markets in P. Dasgupta, D. Gale, O. Hart, and E. Maskin (eds.), Economic Analysis of Markets and Games: Essays in Honor of Frank Hahn (Cambridge, Mass.: M.I.T. Press, 1992), pp. 527–560.

Abstract:
Perfect capital markets require linear budget constraints, without credit rationing creating any tight borrowing constraints before the end of agents’ economic lifetimes. Yet lifetime linear budget constraints are totally unenforceable. This paper considers what allocations can be enforced through monitoring in a simple two period economy when agents have private information regarding their endowments. Then default may not become apparent soon enough for any economic penalty to be an effective deterrent. Instead, borrowing constraints must be imposed to control fraud (moral hazard). Adverse selection often implies that some borrowing constraints must bind, creating inevitable capital market imperfections.
PDF file of preprint


A Revelation Principle for (Boundedly) Bayesian Rationalizable Strategies in R.P. Gilles and P.H.M. Ruys (eds.), Imperfections and Behavior in Economic Organizations (Boston: Kluwer Academic Publishers, 1994) ch. 3, pp. 39–70.

Abstract:
The revelation principle is reconsidered in the light of recent work questioning its general applicability, as well as other work on the Bayesian foundations of game theory. Implementation in rationalizable strategies is considered. A generalized version of the revelation principle is proposed recognizing that, unless agents all have dominant strategies, the outcome of any allocation mechanism depends not only upon agents’ “intrinsic” types, but also upon their beliefs about other agents and their strategic behaviour. This generalization applies even if agents are “boundedly rational” in the sense of being Bayesian rational only with respect to bounded models of the game form.
PDF file of preprint


Asymptotically Walrasian Strategy-Proof Exchange (with José Córdoba) Mathematical Social Sciences 36 (1998), 185–212.

Abstract:
In smooth exchange economies with a continuum of agents, any Walrasian mechanism is Pareto efficient, individually rational, anonymous, and strategy-proof. Barberà and Jackson’s recent results imply that no such efficient mechanism is the limit of resource-balanced, individually rational, anonymous and non-bossy strategy-proof allocation mechanisms for an expanding sequence of finite economies. For a broad class of smooth random exchange economies, relaxing anonymity and non-bossiness admits mechanisms which, as the economy becomes infinitely large, are asymptotically Walrasian for all except one “balancing” agent, while being manipulable with generically vanishing probability. Also considered are some extensions to non-Walrasian mechanisms.
PDF file of preprint


Multilaterally Strategy-Proof Mechanisms in Random Aumann–Hildenbrand Macroeconomies in M. Wooders (ed.) Topics in Game Theory and Mathematical Economics: Essays in Honor of Robert J. Aumann (Providence, RI: American Mathematical Society), pp. 171–187.

Abstract:
By definition, multilaterally strategy-proof mechanisms are immune to manipulation not only by individuals misrepresenting their preferences, but also by finite coalitions exchanging tradeable goods on the side. Continuum economies are defined in which both agents’ identifiers and their privately known characteristics are jointly i.i.d. random variables. For such economies, conditions are given for multilateral strategy-proofness to imply decentralization by a budget constraint with linear prices for tradeable goods and lump-sum transfers independent of individual characteristics. Also, adapting Aumann’s [1964a] key proof avoids using Lyapunov’s theorem or its corollary, Richter’s theorem on integrating a correspondence w.r.t. a non-atomic measure.
PDF file of preprint


Perfected Option Markets in Economies with Adverse Selection European University Institute, Working Paper 89/426; presented at the Econometric Society European Meeting, Munich, 1989.

Abstract:
In economies with adverse selection, Arrow–Debreu contingent commodity contracts must satisfy incentive constraints. Following Prescott and Townsend (in Econometrica 1984), an Arrow–Debreu economy is considered with a continuum of agents whose feasible sets are artificially restricted by imposing these incentive constraints. Equilibria in such an economy must be incentive-constrained Pareto efficient. It is shown that deterministic equilibria of this kind are achievable through “perfected” option markets with non-linear pricing in a way which makes the incentive constraints self-enforcing. Rothschild, Stiglitz and others have shown, however, that these equilibria must be vulnerable to free entry by profit seeking firms.
PDF file


Back to List of Topics. Go to List of Publications. Back to Peter Hammond home page.


Cost–Benefit Analysis, Policy Reform, and Welfare Measurement

Approximate Measures of Social Welfare and the Size of Tax Reform in D. Bös, M. Rose, and C. Seidl (eds.), Beiträge zur neueren Steuertheorie (Berlin: Springer-Verlag, 1984), pp. 95–115.

Abstract:
This paper deals with second-order approximations to changes in welfare as measured by social welfare functions. In a framework of piecemeal policy, the impacts of tax reforms on social welfare are considered. Three different kinds of social welfare function are employed: an arbitrary Bergsonian social welfare function, one based on money metric utility for individuals, and a money metric of social welfare. Furthermore, Pareto-improving reforms are discussed. Where possible, the optimal direction and the optimal size of a tax reform are determined.


Project Evaluation by Potential Tax Reform Journal of Public Economics 30 (1986), 1–36.

Abstract:
Shadow prices are derived for small open economies with several production sectors experiencing constant returns to scale. Small projects affect the balance of trade, domestic prices (of non-traded goods and factors), and sector scales. Only domestic prices affect welfare, and only if there is not “domestic price equalization”. Generally, a project’s net benefits depend upon the potential tax (and tariff) reform made possible (or necessary) through the balance of trade effect. Border prices are right for traded goods, but domestic good shadow pricing requires knowing the direction of at least one reversible available tax reform, and presuming optimality with respect to available reforms.
ScienceDirect link


Money Metric Measures of Individual and Social Welfare Allowing for Environmental Externalities in W. Eichhorn (ed.) Models and Measurement of Welfare and Inequality (Berlin: Springer-Verlag, 1994), pp. 694–724.

Abstract:
Even with environmental externalities, money metric measures of individual welfare can often be constructed by methods similar to those of Vartia (1983), provided that individuals’ willingness-to-pay functions are known. Satisfactory money metric measures of social welfare are harder, however. Following Feldstein (1974) and Rosen (1976), a “uniform” money metric measure is proposed, based on the uniform poll subsidy (or tax) to all individuals which produces the same gain (or loss) in social welfare. Finally, problems with the definition of such measures when faced with “environmental catastrophe” are discussed.
PDF file of minor revision


Reassessing the Diamond–Mirrlees Efficiency Theorem in P.J. Hammond and G.D. Myles (eds.) Incentives, Organization, and Public Economics: Papers in Honour of Sir James Mirrlees (Oxford University Press, 2000), ch. 12, pp. 193–216.

Abstract:
Diamond and Mirrlees (1971) provide sufficient conditions for a second-best Pareto efficient allocation with linear commodity taxation to require efficient production when a finite set of consumers have continuous single-valued demand functions. This paper considers a continuum economy allowing indivisible goods, other individual non-convexities, and some forms of non-linear pricing for consumers. Provided consumers have appropriately monotone preferences and dispersed characteristics, robust sufficient conditions ensure that a strictly Pareto superior incentive compatible allocation with efficient production results when a suitable expansion of each consumer’s budget constraint accompanies any reform which enhances production efficiency. Appropriate cost–benefit tests can identify small efficiency enhancing projects.
PDF file of preprint




Welfare, Information and Uncertainty

Ex-ante and Ex-post Welfare Optimality under Uncertainty Economica 48 (1981), 235–250.

Abstract:
Suppose that either (i) individuals’ subjective probabilities differ from society’s probability assessments, or (ii) social attitudes to individual risk differ from individuals’ own attitudes to risk. Then the social welfare function will not respect individuals’ own preferences for trade in contingent commodities, and Arrow–Debreu markets will not be able to bring about a full ex-post welfare optimum. This paper examines the extent to which markets can bring about such an ex-post welfare optimum. Contingent lump-sum transfers each period are required. Then it suffices to have spot markets each period, provided that consumers’ von Neumann–Morgenstern utility functions are “separable backwards” in time.
JSTOR link for paper




Miscellaneous Work in Welfare Economics and Ethics

Theoretical Progress in Public Economics: A Provocative Assessment Oxford Economic Papers 42 (1990), 6–33; also in P.J.N. Sinclair and M.D.E. Slater (eds.) Taxation, Private Information and Capital (Oxford: Clarendon Press, 1991).

Abstract:
The following twelve issues are each briefly discussed: the blindness of the invisible hand to injustice; a misleading efficiency theorem; truthful revelation of feasibility constraints; delusions of first best; deadweight losses as sunk costs; markets as failures; the nth best as enemy of the good; intermonetary comparisons of gains and losses; few worthwhile changes are small; surplus economics; surplus econometrics; and unbalanced policies. Unbalanced policy changes should be evaluated by estimating the probabilities of different joint frequency distributions of welfare relevant attributes and welfare net gains for all individuals in the population, together with the budget deficits or surpluses.


The Moral Status of Profits and Other Rewards: A Perspective from Modern Welfare Economics in R. Cowan and M.J. Rizzo (eds.) Profits and Morality (Chicago: University of Chicago Press, 1995), ch. 4, pp. 88–123.

Abstract:
Standard neoclassical welfare economics justifies competitive profit maximization by firms. But when lump-sum transfers are used to achieve distributive justice, a firm’s owners and managers are entitled only to “normal” profits paid for services rendered. Yet with private information about effort or technology, not even efficient production is always desirable, let alone profit maximization. Furthermore, some profits should then be distributed specifically to the firm’s managers as incentive payments. In intertemporal economies these conclusions are reinforced, and profits become even harder to define. Finally, it is argued that valuing freedom for its own sake may make profits more acceptable than otherwise.
PDF file of preprint


Spurious Deadweight Gains (with Giovanni Facchini and Hiroyuki Nakata) Economics Letters 72 (2001), 33–37.

Abstract:
Marshallian consumer surplus (MCS) is generally an inaccurate measure of welfare change because it neglects income effects. Suppose these effects overturn the usual demand response to a price change. Then the deadweight loss from a distortionary tax or subsidy has the wrong sign; that is, there is a spurious deadweight gain.
PDF file of preprint
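The sign-flip logic can be illustrated with a first-order (Harberger) approximation, in which the deadweight loss from a small specific tax is roughly proportional to the slope of the demand response. This is a reader's sketch under hypothetical demand slopes, not the paper's own derivation.

```python
def harberger_dwl(t: float, demand_slope: float) -> float:
    """First-order (Harberger) approximation to the deadweight loss
    from a small specific tax t:  DWL ~ -0.5 * t**2 * dQ/dP.

    With the usual downward-sloping demand (dQ/dP < 0) this is positive,
    a genuine loss; if income effects overturn the demand response
    (dQ/dP > 0, a Giffen-like case) the sign flips, producing a
    spurious deadweight *gain*.
    """
    return -0.5 * t**2 * demand_slope

# Hypothetical slopes: an ordinary good versus a Giffen-like good.
print(harberger_dwl(1.0, -2.0))  # positive: genuine deadweight loss
print(harberger_dwl(1.0, 2.0))   # negative: spurious deadweight gain
```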




Continuum of Random Variables

Monte Carlo Simulation of Macroeconomic Risk with a Continuum of Agents: The Symmetric Case (with Yeneng Sun) Economic Theory 21 (2003), 743–766; also in C.D. Aliprantis et al. (eds.) Assets, Beliefs, and Equilibria in Economic Dynamics: Essays in Honor of Mordecai Kurz (Berlin: Springer-Verlag, 2003), pp. 709–732.

Abstract:
Suppose a large economy with individual risk is modeled by a continuum of pairwise exchangeable random variables (i.i.d., in particular). Then the relevant stochastic process is jointly measurable only in degenerate cases. Yet in Monte Carlo simulation, the average of a large finite draw of the random variables converges almost surely. Several necessary and sufficient conditions for such “Monte Carlo convergence” are given. Also, conditioned on the associated Monte Carlo σ-algebra, which represents macroeconomic risk, individual agents’ random shocks are independent. Furthermore, a converse to one version of the classical law of large numbers is proved.
PDF file of preprint; Springer link
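The “Monte Carlo convergence” discussed in the abstract can be seen in a toy simulation: draw a large finite sample of i.i.d. shocks and average them. The Gaussian distribution, seed, and sample size below are illustrative assumptions, not the paper's construction.

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

def sample_average(n: int, mu: float = 0.0, sigma: float = 1.0) -> float:
    """Average of n i.i.d. Gaussian shocks; by the strong law of large
    numbers this converges almost surely to mu as n grows."""
    return sum(random.gauss(mu, sigma) for _ in range(n)) / n

# The average of a large finite draw is close to the population mean.
print(sample_average(100_000, mu=0.5))
```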


Joint Measurability and the One-way Fubini Property for a Continuum of Independent Random Variables (with Yeneng Sun) Proceedings of the American Mathematical Society 134 (2006), 737–747.

Abstract:
As is well known, a continuous parameter process with mutually independent random variables is not jointly measurable in the usual sense. This paper proposes a natural “one-way Fubini” property that guarantees a unique meaningful solution to this joint measurability problem even when the random variables are independent only in a very weak sense. In particular, if F is the smallest extension of the usual product sigma-algebra such that the process is measurable, then there is a unique probability measure ν on F such that the integral of any ν-integrable function equals a double integral evaluated in one particular order. Moreover, in general this measure cannot be further extended to satisfy a two-way Fubini property. However, the extended framework with the one-way Fubini property not only shares many desirable features previously demonstrated under the stronger two-way Fubini property, but also leads to a new characterization of the most basic probabilistic concept, stochastic independence, in terms of regular conditional distributions.
PDF file; AMS link
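Schematically, the one-way Fubini property says that integration with respect to the extended measure ν can be computed as an iterated integral in one particular order only. The notation below (λ for the measure on the parameter space T, P for the sample measure on Ω) is a reader's sketch, not the paper's exact statement.

```latex
% One-way Fubini (sketch): for every \nu-integrable f on T \times \Omega,
\int_{T \times \Omega} f \,\mathrm{d}\nu
  = \int_{T} \left( \int_{\Omega} f(t,\omega)\,P(\mathrm{d}\omega) \right) \lambda(\mathrm{d}t),
% where the iterated integral is valid only in this order; reversing the
% order of integration need not be meaningful.
```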


The Essential Equivalence of Pairwise and Mutual Conditional Independence (with Yeneng Sun) Probability Theory and Related Fields 135 (2006), 415–427.

Abstract:
For a large collection of random variables, pairwise conditional independence and mutual conditional independence are shown to be essentially equivalent. Unlike in the finite setting, a large collection of random variables remains essentially conditionally independent under further conditioning. The essential equivalence of pairwise and multiple versions of exchangeability also follows as a corollary. Our proof is based on an iterative extension of Bledsoe and Morse’s completion of a product measure on a pair of measure spaces.
PDF file


Monte Carlo Simulation of Macroeconomic Risk with a Continuum of Agents: The General Case (with Yeneng Sun) Economic Theory DOI 10.1007/s00199-007-0279-7 (published online, 7 September 2007).

Abstract:
In large random economies with heterogeneous agents, a standard stochastic framework presumes a random macro state, combined with idiosyncratic micro shocks. This can be formally represented by a random process consisting of a continuum of random variables that are conditionally independent given the macro state. However, this process satisfies a standard joint measurability condition only if there is essentially no idiosyncratic risk at all. Based on iteratively complete product measure spaces, we characterize the validity of the standard stochastic framework via Monte Carlo simulation as well as event-wise measurable conditional probabilities. These general characterizations also allow us to strengthen some earlier results related to exchangeability and independence.
PDF file of Warwick Economics Research Paper; Preprint version; Springer link


Characterization of Risk: A Sharp Law of Large Numbers (with Yeneng Sun). Warwick Economic Research Paper, no. 806 (2007).

Abstract:
An extensive literature in economics uses a continuum of random variables to model individual random shocks imposed on a large population. Let H denote the Hilbert space of square-integrable random variables. A key concern is to characterize the family of all H-valued functions that satisfy the law of large numbers when a large sample of agents is drawn at random. We use the iterative extension of an infinite product measure introduced in Hammond and Sun (2006) to formulate a “sharp” law of large numbers. We prove that an H-valued function satisfies this law if and only if it is both Pettis-integrable and norm integrably bounded.
PDF file




Empirics, Statistics, Experiments, and Other Topics

Affine Models of the Joint Dynamics of Exchange Rates and Interest Rates (with Bing Anderson and Cyrus A. Ramezani) Journal of Financial and Quantitative Analysis 45 (2010), 1341–1365.

Abstract:
This paper extends the affine class of term structure models to describe the joint dynamics of exchange rates and interest rates. In particular, the issue of how to reconcile the low volatility of interest rates with the high volatility of exchange rates is addressed. The incomplete market approach of introducing exchange rate volatility that is orthogonal to both interest rates and the pricing kernels is shown to be infeasible in the affine setting. Models in which excess exchange rate volatility is orthogonal to interest rates but not orthogonal to the pricing kernels are proposed, and validated via Kalman filter estimation of maximal 5-factor models for 6 country pairs.
Cambridge Journals link


Individual Welfare and Subjective Well-Being: Comments on ‘Subjective Well-Being, Income, Economic Development and Growth’ by Daniel W. Sacks, Betsey Stevenson, and Justin Wolfers (with Federica Liberini and Eugenio Proto); in Claudia Sepúlveda, Ann Harrison, and Justin Yifu Lin (eds.) ABCDE 2011: Development Challenges in a Postcrisis World (Washington, DC: World Bank, 2013), pp. 339–353.

Abstract:
Sacks, Stevenson and Wolfers (2010) question earlier results, such as Easterlin’s, showing that long-run economic growth often fails to improve individuals’ average reports of their own subjective well-being (SWB). We use World Values Survey data to establish that the proportion of individuals reporting happiness level h, and whose income falls below any fixed threshold, always diminishes as h increases. The implied positive association between income and reported happiness suggests that it is possible in principle to construct multi-dimensional summary statistics based on reported SWB that could be used to evaluate economic policy.
PDF file of preprint


Do Happier Britons Have More Income? First-Order Stochastic Dominance Relations (with Federica Liberini and Eugenio Proto); CAGE Online Working Paper 165, Research Centre on Competitive Advantage in the Global Economy, University of Warwick, 2013.

Abstract:
For British Household Panel Survey data, given any self-reported life satisfaction level except the highest, the conditional income distribution first-order stochastically dominates the corresponding conditional distribution given any lower level. The conditional distribution for completely satisfied subjects, however, is stochastically dominated by several other conditional distributions. This “top anomaly” excludes any standard ordered discrete choice econometric model where satisfaction depends on income. The observed stochastic dominance relations, including the top anomaly, are consistent with an estimated multinomial logit model that includes background demographic and educational variables, along with the difference between individual log income and its conditional expectation.
PDF file
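First-order stochastic dominance between two conditional income distributions can be checked directly from empirical CDFs: distribution A dominates B when A's CDF lies weakly below B's everywhere. The income samples below are made up for illustration and bear no relation to the BHPS data used in the paper.

```python
def ecdf(sample, x):
    """Empirical CDF of `sample` evaluated at x."""
    return sum(1 for s in sample if s <= x) / len(sample)

def fosd(a, b):
    """True if the distribution of `a` first-order stochastically
    dominates that of `b`: F_a(x) <= F_b(x) at every observed point."""
    grid = sorted(set(a) | set(b))
    return all(ecdf(a, x) <= ecdf(b, x) for x in grid)

# Hypothetical incomes conditional on reported satisfaction level.
low_satisfaction  = [10, 12, 15, 18, 20]
high_satisfaction = [14, 16, 19, 22, 25]
print(fosd(high_satisfaction, low_satisfaction))  # True
print(fosd(low_satisfaction, high_satisfaction))  # False
```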


A Three-Stage Experimental Test of Revealed Preference (with Stefan Traub); CAGE Online Working Paper 71, Research Centre on Competitive Advantage in the Global Economy, University of Warwick, 2012.

Abstract:
A powerful test of Varian's (1982) generalised axiom of revealed preference (GARP) with two goods requires the consumer's budget line to pass through two demand vectors revealed as chosen given other budget sets. In an experiment using this idea, each of 41 student subjects faced a series of 16 successive grouped portfolio selection problems. Each group of selection problems had up to three stages, where later budget sets depended on that subject's choices at earlier stages in the same group. Only 49% of subjects' choices were observed to satisfy GARP exactly, even by our relatively generous nonparametric test.
PDF file of preprint
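Varian's GARP can be tested nonparametrically from observed prices and chosen bundles. The check below is a standard textbook construction (direct revealed preference plus a Warshall-style transitive closure), with made-up data; it is not the authors' own test procedure.

```python
def violates_garp(prices, bundles):
    """Check whether observed (price, bundle) pairs violate GARP.

    x_i is directly revealed weakly preferred to x_j if p_i.x_i >= p_i.x_j.
    GARP fails if x_i is revealed preferred to x_j (via the transitive
    closure) while x_j is strictly directly revealed preferred to x_i,
    i.e. p_j.x_j > p_j.x_i.
    """
    n = len(bundles)
    dot = lambda p, x: sum(pi * xi for pi, xi in zip(p, x))
    # Direct weak revealed preference relation.
    R = [[dot(prices[i], bundles[i]) >= dot(prices[i], bundles[j])
          for j in range(n)] for i in range(n)]
    # Warshall's algorithm: transitive closure of R.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                R[i][j] = R[i][j] or (R[i][k] and R[k][j])
    return any(R[i][j] and
               dot(prices[j], bundles[j]) > dot(prices[j], bundles[i])
               for i in range(n) for j in range(n))

# Hypothetical two-good data forming a revealed-preference cycle.
prices  = [(1.0, 1.0), (1.0, 3.0)]
bundles = [(2.0, 0.0), (0.0, 2.0)]
print(violates_garp(prices, bundles))  # True
```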


Laboratory Games and Quantum Behaviour: The Normal Form with a Separable State Space Warwick Economics Research Paper, no. 969 (2011); presented at Quantum Physics Meets TARK, University of Groningen.

Abstract:
To describe quantum behaviour, Kolmogorov’s definition of probability is extended to accommodate subjective beliefs in a particular “laboratory game” that a Bayesian rational decision maker plays with Nature, Chance and an Experimenter. The Experimenter chooses an orthonormal subset of a complex Hilbert space of quantum states; Nature chooses a state in this set along with an observation in a measurable space of experimental outcomes that influences Chance’s choice of random consequence. Imposing quantum equivalence allows the trace of the product of a density and a likelihood operator to represent the usual Bayesian expectation of a likelihood function w.r.t. a subjective prior.
PDF file of preprint




Based on a file translated from TeX by TtH, version 2.21. Latest revision 16 June 2014.