Binghamton University, State University of New York - Economics
 

Working papers, 2000

2001
Neha Khanna and Duane Chapman
Energy Efficiency and Petroleum Depletion in Climate Change Policy

2002
Duane Chapman and Neha Khanna
An Economic Analysis of Aspects of Petroleum and Military Security in the Persian Gulf

2003
Neha Khanna and Duane Chapman
Crying No Wolf: Why Economists Don't Worry about Climate Change, and Should

2004
Steve Scalet
What Prisoner's Dilemmas Are Good For

2005
T. Nicolaus Tideman and Florenz Plassmann
Fair and Efficient Compensation for Taking Property under Uncertainty

2006
Neha Khanna and Martina Vidovic
Voluntary Pollution Prevention and the Role of Community Characteristics: An Evaluation of the EPA's 33/50 Program

2007
Steve Scalet
Explaining Positive Contributions in Public Goods Experiments

2008
Ronald Britto and Abraham Wender
Committee Decision-Making: The Optimal Number of Categories

2009
Charles W. Bischoff, Halefom Belay, and In-Bong Kang
Bayesian VAR Forecasts, Big Model/Judgmental Forecasts, and Combinations: 1981-1996

2010
In-Bong Kang, Steven C. Hine, and Charles W. Bischoff
Money Matters Some: New Evidence on the New Keynesian Model of the Business Cycle

2011
Charles W. Bischoff and Halefom Belay
The Problem of Identification of the Money Demand Function


Number: 2001

Authors: Neha Khanna and Duane Chapman

Title: Energy Efficiency and Petroleum Depletion in Climate Change Policy

Abstract:

This paper examines the validity of the standard technology assumptions commonly used in climate-economy models for the far future, and explores the consequences of changing them to reflect actual rather than postulated trends. It presents a model framework and results that combine resource depletion with optimal economic growth and climate change in a macro-geoeconomic model. The authors build on the Nordhaus DICE model to include demands for coal, oil, and natural gas that depend on own price, the prices of substitute fuels, per capita income, and population. The resource depletion model captures the effect on oil depletion of demand curves that shift in response to population and income growth. The authors also question the empirical validity of the rapidly declining energy and carbon intensity assumption employed by Nordhaus and others. Model results are significantly more pessimistic than those obtained in other work. The analysis of energy tax regimes yields unrealistically high tax rates necessary to bring the carbon emissions trajectory down to the optimal level. While the alternatives explored here might be interpreted as pessimistic, the authors consider them highly plausible. The significant policy conclusion is the need for earlier and more aggressive implementation of climate policies than is typically found in other work. Copies of the paper are available upon request.
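The demand specification described above, with fuel demand depending on own price, the price of substitute fuels, per capita income, and population, can be sketched as a log-linear form. The elasticity values below are illustrative assumptions, not the paper's estimates:

```python
def fuel_demand(own_price, sub_price, income_pc, population,
                e_own=-0.5, e_sub=0.3, e_inc=1.0, scale=1.0):
    """Illustrative log-linear demand:
    Q = scale * P^e_own * Ps^e_sub * (Y/N)^e_inc * N."""
    return scale * own_price**e_own * sub_price**e_sub \
        * income_pc**e_inc * population

# The demand curve shifts outward as income and population grow,
# which is the mechanism driving faster depletion in the model.
q0 = fuel_demand(20.0, 25.0, 10_000, 6.0)
q1 = fuel_demand(20.0, 25.0, 15_000, 7.0)  # higher income and population
```

With positive income and population elasticities, q1 exceeds q0, so demand pressure on the resource rises over time even at constant prices.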

File: Working Paper 2001


Number: 2002

Authors: Duane Chapman and Neha Khanna

Title: An Economic Analysis of Aspects of Petroleum and Military Security in the Persian Gulf

Abstract:

Geologic estimates of remaining global petroleum resources place about 50% in the Persian Gulf. Production costs are estimated at $5 per barrel there, and $15 per barrel in the North Sea and Alaska. Using mathematical methods derived from depletion theory, the present value of economic rent from oil is on the order of $20 trillion. Game theory is used to explain the $15-$20 per barrel price band that existed from 1986 to 1999. New economic forces have displaced this previously stable pattern; a new price range of $22 to $28 per barrel may be emerging. International trade in petroleum and conventional weapons is analyzed with econometric methods, and the occurrence of nuclear weapons capability in the Persian Gulf region is explored.
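The order of magnitude of the rent figure can be checked with a back-of-envelope present-value calculation. The reserve quantity, rent margin, extraction horizon, and discount rate below are illustrative assumptions, not the paper's inputs:

```python
def pv_rent(reserves_bbl, rent_per_bbl, years, discount_rate):
    """Present value of rent from extracting reserves evenly over `years`,
    discounted annually. A rough sketch, not the paper's depletion model."""
    annual_rent = reserves_bbl / years * rent_per_bbl
    return sum(annual_rent / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Illustrative inputs: 1 trillion barrels, a $15/bbl rent margin
# (roughly the price-band floor minus the $5 Gulf production cost),
# extracted over 50 years, discounted at 3%.
pv = pv_rent(1e12, 15.0, 50, 0.03)  # a figure in the trillions of dollars
```

Varying the horizon and discount rate moves the result around, but it stays in the trillions, consistent with the magnitude the abstract reports.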

File: Working Paper 2002


Number: 2003

Authors: Duane Chapman and Neha Khanna

Title: Crying No Wolf: Why Economists Don't Worry about Climate Change, and Should

Abstract:

This paper identifies and critically examines a set of assumptions and characteristics that together define the standard for climate-economy models currently in use. These are: a) low or negative carbon dioxide marginal abatement costs in developing countries, b) declining energy intensity and autonomous energy efficiency improvements, c) a concave carbon dioxide emissions trajectory, and d) high pure rates of time preference. The general apathy toward controlling the growth of CO2 emissions, both at the global level and particularly in high income countries, derives in part from conclusions based on these assumptions.

File: Working Paper 2003


Number: 2004

Authors: Steve Scalet

Title: What Prisoner's Dilemmas Are Good For

Abstract:

Traditionally, prisoner's dilemmas are represented as problems that require a solution. When individual rationality leads agents to interact in ways that create Pareto sub-optimal outcomes, the remedy is to create institutions that help align individual incentives with a Pareto efficient outcome. But we don't always want to eliminate these social dilemmas: finding prisoner's dilemmas does not mean that we need to find a solution. The essence of the argument is this: the presence of prisoner's dilemmas is important for creating norms of cooperation in society, and these norms are necessary for maintaining efficient economies over the long run.
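The tension the abstract describes, individually rational play producing a Pareto sub-optimal outcome, is the defining feature of the prisoner's dilemma. The payoff numbers below are standard textbook values, not taken from the paper:

```python
# Standard prisoner's dilemma ordering: T > R > P > S.
T, R, P, S = 5, 3, 1, 0
payoff = {('C', 'C'): (R, R), ('C', 'D'): (S, T),
          ('D', 'C'): (T, S), ('D', 'D'): (P, P)}

def best_response(opponent_action):
    """Row player's payoff-maximizing action against a fixed opponent action."""
    return max(['C', 'D'], key=lambda a: payoff[(a, opponent_action)][0])

# Defection is a dominant strategy: it is the best response to either action,
# yet mutual defection (P, P) is Pareto-dominated by mutual cooperation (R, R).
```

The Nash outcome (D, D) yields each player P = 1, while both would receive R = 3 under mutual cooperation, which is exactly the gap that institutions, or the norms the paper emphasizes, can close.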

File: Not available online.


Number: 2005

Authors: T. Nicolaus Tideman and Florenz Plassmann

Title: Fair and Efficient Compensation for Taking Property under Uncertainty

Abstract:

Various authors have shown that efficiency requires lump-sum compensation if there is uncertainty about whether the government will take property under eminent domain, and that compensation equal to the full value of property provides incentives for property owners to overinvest. However, the solutions offered by these authors do not consider the incentive that the government has to announce an inaccurately high probability with which it will take property and thereby avoid paying the full losses of owners. We show that the announcement of a possibility of a taking is itself a taking when this implies that further investments might not be compensated, and that it is both fair and efficient to require the government to compensate owners for the loss in value due to such an announcement. Unlike previous proposals, our mechanism provides incentives for efficient investment while paying neither more nor less than full compensation for expected losses under efficient behavior.
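The overinvestment incentive under full-value compensation, described above, can be illustrated with a toy numerical example. The value function, taking probability, and grid search below are illustrative assumptions, not the authors' model:

```python
import math

# Property value from investment x: v(x) = 2*sqrt(x); taking probability p.
# With lump-sum compensation the owner bears the risk and maximizes
# (1-p)*v(x) - x. Under full-value compensation the owner receives v(x)
# whether or not the property is taken, so p drops out of the problem.
p = 0.5
grid = [i / 1000 for i in range(1, 5001)]

x_efficient = max(grid, key=lambda x: (1 - p) * 2 * math.sqrt(x) - x)
x_full_comp = max(grid, key=lambda x: 2 * math.sqrt(x) - x)
```

Here the efficient investment is x = (1-p)^2 = 0.25, while full-value compensation pushes the owner to x = 1, a fourfold overinvestment, matching the incentive problem the abstract identifies.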

File: Working Paper 2005


Number: 2006

Authors: Neha Khanna and Martina Vidovic

Title: Voluntary Pollution Prevention and the Role of Community Characteristics: An Evaluation of the EPA's 33/50 Program

Abstract:

File: Not available online.


Number: 2007

Authors: Steve Scalet

Title: Explaining Positive Contributions in Public Goods Experiments

Abstract:

Casual observation of our economic life suggests that we sometimes contribute to public goods enterprises even when we recognize the pull to free ride on the contributions of others. Standard economic theory (specifically, Nash equilibrium theory) predicts that agents will not contribute to public goods projects because of this free-riding problem. Experimentalists have actively pursued whether these anecdotal observations of economic life withstand carefully controlled replication in a laboratory setting. They do. Public goods experiments based on a voluntary contributions mechanism (VCM) have consistently yielded positive contributions in the face of the dominant-strategy Nash equilibrium prediction of zero contributions. As a result, experimentalists in public goods have focused their efforts on the question, "Why do agents contribute in the voluntary contributions mechanism?" This paper examines the experimentalist response to this question.
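The zero-contribution prediction in the VCM comes from the marginal per capita return (MPCR) on the group account being below one. The group size, endowment, and MPCR below are illustrative, not from any specific experiment:

```python
# Voluntary contributions mechanism: n players, endowment e, contribution c_i;
# every token in the group account returns m (the MPCR) to each player.
# With m < 1 < n*m, contributing is socially efficient, but each token
# contributed costs its owner 1 - m > 0, so zero is the dominant strategy.
n, e, m = 4, 20.0, 0.4

def payoff(c_i, others_total):
    """Player i's earnings given own contribution and the others' total."""
    return e - c_i + m * (c_i + others_total)
```

Regardless of what others give, contributing strictly lowers one's own payoff, yet full contribution by everyone beats universal free riding, which is why observed positive contributions are the puzzle the paper takes up.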

File: Working Paper 2007


Number: 2008

Authors: Ronald Britto and Abraham Wender

Title: Committee Decision-Making: The Optimal Number of Categories

Abstract:

When a committee is formed to help a decision-maker accept or reject a proposal, or a series of proposals, a common practice is to ask the committee members to summarize their assessments by selecting one of several categories, such as "highly recommend," "recommend," etc. We investigate the determination of the optimal number of categories placed before the committee members, deriving a condition under which an increase in that number increases the expected benefits of the decision. Loosely interpreted, this condition requires that the addition of a category increase the amount of information available. We also show how the condition can be illustrated with a simple diagram.

File: Not available online.


Number: 2009

Authors: Charles W. Bischoff, Halefom Belay, and In-Bong Kang

Title: Bayesian VAR Forecasts, Big Model/Judgmental Forecasts, and Combinations: 1981-1996

Abstract:

Forecasts from vector autoregression models, in which the data are combined with Bayesian prior distributions on the coefficients, gained great popularity in the 1980s. The prior distribution known as the "Minnesota Prior" was used to make forecasts starting in 1980, and for a time these forecasts seemed not only to equal but even to surpass those of the consulting groups selling forecasts based on large, judgmentally adjusted econometric models. Using actual forecasts made by the group then called DRI between 1981 and mid-1996, we find that the forecasts based on the "Minnesota Prior" did not continue their early success, even when they are averaged with the DRI forecasts.
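The "Minnesota Prior" mentioned above centers each equation's own first lag at one and all other lag coefficients at zero, with prior standard deviations that tighten as the lag grows. The sketch below uses illustrative hyperparameters and omits the error-variance scaling of cross-variable terms used in full implementations:

```python
def minnesota_prior(n_vars, n_lags, tightness=0.2, cross_weight=0.5):
    """Prior means and standard deviations for VAR lag coefficients,
    keyed by (equation, variable, lag). A simplified sketch."""
    mean, sd = {}, {}
    for i in range(n_vars):              # equation i
        for j in range(n_vars):          # lagged variable j
            for lag in range(1, n_lags + 1):
                # Random-walk prior mean: 1 on the own first lag, 0 elsewhere.
                mean[(i, j, lag)] = 1.0 if (i == j and lag == 1) else 0.0
                # Standard deviation shrinks with lag length, and is
                # further shrunk on cross-variable lags.
                scale = tightness / lag
                sd[(i, j, lag)] = scale if i == j else scale * cross_weight
    return mean, sd

mean, sd = minnesota_prior(n_vars=3, n_lags=2)
```

The effect is to shrink each equation toward a univariate random walk unless the data argue otherwise, which is what made the approach competitive with far larger judgmental models.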

File: Not available online.


Number: 2010

Authors: In-Bong Kang, Steven C. Hine, and Charles W. Bischoff

Title: Money Matters Some: New Evidence on the New Keynesian Model of the Business Cycle

Abstract:

Economic events of the early 1970s led to a widespread re-examination of the Keynesian-neoclassical synthesis and, as a result, a proliferation of new macroeconomic theories, among them the New Classical view. One important response involved models with rational expectations and nominal rigidities. We first test, and do not reject, an important New Keynesian model of this type. We then investigate the model's ability to forecast out-of-sample movements of real economic activity, comparing the relative information content of its forecasts with that of a second-order autoregressive model of the unemployment rate. In doing so, we reexamine the claim by Thoma and Gray (Economic Inquiry 1998) that multivariate structural models of the relationship between real economic activity and monetary/financial variables fail to win the "horse race" against a simple autoregressive model. We find that both the New Keynesian model's forecasts and the AR model's forecasts contain useful information for forecasting one- to four-quarter-ahead movements of the unemployment rate. Thoma and Gray go further and argue that money does not matter at all; we argue only that unanticipated money, when measured properly, affects at least one real variable for some time in the short run. We conclude that money matters, but only somewhat.

File: Not available online.


Number: 2011

Authors: Charles W. Bischoff and Halefom Belay

Title: The Problem of Identification of the Money Demand Function

Abstract:

Cooley and LeRoy (1981) argued that if random shifts in money demand are at least partially accommodated by the Federal Reserve, then there is no obvious way to identify the money demand function for the purposes of estimation. We argue that this is not correct, and that identification is in fact straightforward, provided that the monetary authority reacts to at least one variable not in the money demand function. If the monetary authority looks at "everything" in making its policy, this condition is likely to be fulfilled.
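The identification condition described in the abstract, a monetary authority reacting to at least one variable outside the money demand function, is the classic exclusion restriction behind instrumental variables. The toy simulation below, with an illustrative one-regressor demand function and made-up coefficients, shows the excluded policy variable recovering the true interest-rate coefficient even though the Fed partially accommodates demand shocks:

```python
import random

random.seed(0)
b_true = -0.8          # true interest-rate coefficient in money demand
n = 20_000
num = den = 0.0
for _ in range(n):
    u = random.gauss(0, 1)        # money-demand shock
    z = random.gauss(0, 1)        # policy variable excluded from demand
    r = 1.0 * z + 0.5 * u         # Fed reacts to z and accommodates u,
                                  # so r is correlated with u (OLS is biased)
    m_d = b_true * r + u          # money demand (income suppressed for clarity)
    num += z * m_d                # IV estimator for a just-identified model:
    den += z * r                  # cov(z, m) / cov(z, r)
b_iv = num / den
```

Because z is uncorrelated with the demand shock but moves the interest rate, the ratio converges to b_true; without such an excluded variable there would be no valid instrument, which is the sense in which Cooley and LeRoy's problem disappears once the reaction function contains one.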

File: Not available online.



Last Updated: 6/16/09