Archive: Working Papers 198-525

You can search the BC Economics Working Papers by author, title, keyword, JEL category, and abstract contents.


525. Shakeeb Khan (University of Rochester) and Arthur Lewbel, "" (rev. 09/2006: 376 Kb, PDF)

Abstract: This paper provides a root-n consistent, asymptotically normal weighted least squares estimator of the coefficients in a truncated regression model. The distribution of the errors is unknown and permits general forms of unknown heteroskedasticity. Also provided is an instrumental variables based two stage least squares estimator for this model, which can be used when some regressors are endogenous, mismeasured, or otherwise correlated with the errors. A simulation study indicates the new estimators perform well in finite samples. Our limiting distribution theory includes a new asymptotic trimming result addressing the boundary bias in first stage density estimation without knowledge of the support boundary.

524. Donald Cox, "" (02/2002: 96 Kb, PDF)

Abstract: This chapter uses data from the 1992/93 and 1997/98 Vietnam Living Standards Surveys (VLSS) to describe patterns of money transfers between households. Rapid economic growth during the 1990s did little to diminish the importance of private transfers in Vietnam. Private transfers are large and widespread in both surveys, and they are much larger than public transfers. Private transfers appear to function like means-tested public transfers, flowing from better off to worse off households and providing old-age support in retirement. Panel evidence suggests some hysteresis in private transfer patterns, but many households also changed from recipients to givers and vice versa between surveys. Changes in private transfers appear responsive to changes in household pre-transfer income, demographic changes and life-course events. Transfer inflows rise upon retirement and widowhood, for example, and are positively associated with increases in health expenditures. It also appears that private transfer inflows increased for households affected by Typhoon Linda, which devastated Vietnam's southernmost provinces in late 1997.

523. Arthur Lewbel, "" (rev. 06/2002: 171 Kb, PDF)

Abstract: Death statutes in the United States list elements of loss for which a defendant must make compensatory payment. The element that economists as expert witnesses are called upon to calculate is net income, roughly defined as the decedent's income minus personal expenses. The existence of joint or shared consumption goods complicates the definition and calculation of net income. Net income can be interpreted as the money required for survivors to attain the same standard of living as before. Equivalence scales traditionally used for this type of calculation are flawed. A new method for calculating net income is proposed, based on a collective household model.

522. Marzio Galeotti (University of Bergamo), Louis J. Maccini (Johns Hopkins University), and Fabio Schiantarelli, "" (01/2002)

Abstract: The purpose of this paper is to develop a model that integrates inventory and labor decisions. We extend a model of inventory behavior to include a detailed specification of the role of labor input in the production process and of the costs associated with it. In particular, we distinguish between employment, hours and effort per worker, and allow for adjustment costs associated with employment changes. We assume that the requirement function for effective hours has a general trans-logarithmic form, and derive an estimable system of Euler equations for inventories and employment with implied cross-equation restrictions. The econometric results shed light on several important topics, including the shape of the marginal cost of output, the role of labor hoarding as an explanation of procyclical productivity, and the persistence of inventory stocks. Moreover, they raise questions about the adequacy of commonly used specifications such as Cobb-Douglas approximations to the production process and the definition of labor input as the product of employment and effective hours worked per worker.

521. Christopher F Baum, Mustafa Caglayan (University of Sheffield) and Neslihan Ozkan (University of Bristol), "" (rev. 08/2008: 244 Kb, PDF; published, Economics Letters, 2009, 102, 87-89)

Abstract: This paper investigates whether variations in macroeconomic uncertainty distort banks' allocation of loanable funds by affecting the predictability of banks' returns from lending. Low levels of macroeconomic uncertainty will allow bankers to base their lending decisions on more accurate evaluations of different lending opportunities, leading to a more unequal distribution of lending across banks. Conversely, increased macroeconomic uncertainty will hinder bankers' ability to identify and channel funds towards the best opportunities, inducing more similar lending behavior across banks. Our empirical analysis provides support for the hypothesis that macroeconomic uncertainty adversely affects the efficient allocation of loanable funds.

520. Christopher F Baum, Mustafa Caglayan (University of Liverpool) and Neslihan Ozkan (University of Liverpool), "" (rev. 06/2003: 111 Kb, PDF; published, Economics Bulletin, 5:13, 2003)

Abstract: In this paper, employing VAR and factor analytic models with quarterly U.K. sectoral business investment data, we show that both common and sector-specific shocks play important roles in explaining business investment fluctuations.

519. Jushan Bai and Serena Ng (Johns Hopkins University), "" (12/2001: 340 Kb, PDF)

Abstract: This paper develops a new methodology that makes use of the factor structure of large dimensional panels to understand the nature of non-stationarity in the data. We refer to it as PANIC: a 'Panel Analysis of Non-stationarity in Idiosyncratic and Common components'. PANIC consists of univariate and panel tests with a number of novel features. It can detect whether the nonstationarity is pervasive, or variable-specific, or both. It tests the components of the data instead of the observed series. Inference is therefore more accurate when the components have different orders of integration. PANIC also permits the construction of valid panel tests even when cross-section correlation invalidates pooling of statistics constructed using the observed data. The key to PANIC is consistent estimation of the components even when the regressions are individually spurious. We provide a rigorous theory for estimation and inference. In Monte Carlo simulations, the tests have very good size and power. PANIC is applied to a panel of inflation series.

518. Jushan Bai and Serena Ng (Johns Hopkins University), "" (10/2001: 216 Kb, PDF)

Abstract: This paper uses a decomposition of the data into common and idiosyncratic components to develop procedures that test if these components satisfy the null hypothesis of stationarity. The decomposition also allows us to construct pooled tests that satisfy the cross-section independence assumption. In simulations, tests on the components separately generally have better properties than testing the observed series. However, the results are less than satisfactory, especially in comparison with similar procedures developed for unit root tests. The problem can be traced to the properties of the stationarity test, and is not due to the weakness of the common-idiosyncratic decomposition. We apply both panel stationarity and unit root tests to real exchange rates. We found evidence in support of a large stationary common factor. Rejections of PPP are likely due to non-stationarity of country-specific variations.

517. Tommaso Monacelli, "" (12/2001: 832 Kb, PDF)

Abstract: I show how to implement in a simple manner the comparison of alternative monetary policy rules in a two-country model of the new generation. These rules are: Full Price Stability, Taylor, Fixed and Managed Exchange Rates. I find, first, that the exchange rate dynamic is non-stationary unless some form of management is undertaken by the respective monetary authorities of the two countries. However, eliminating the excess volatility of the exchange rate does not significantly alter the overall macroeconomic volatility. Second, a floating exchange rate regime based on a Taylor-type rule seems to better approximate the full price stability benchmark, but at the cost of boosting interest rate volatility. In this respect limiting exchange rate flexibility is desirable. Finally, in all cases the model delivers positive cross-country correlation of interest rates but negative cross-country correlation of output.

516. Hideo Konishi and Michael Sandfort (U.S. Department of Justice), "" (rev. 11/2002: 228 Kb, PDF; published, Journal of Urban Economics, 53, 413-435)

Abstract: Planned shopping malls usually have one or more department stores (anchor stores) and multiple specialized retail stores in each commodity category. This paper presents a model of shopping malls in which these two types of stores sell noncomplementary commodities. If anchor stores sell standard (riskless yet low-value) commodities and retail stores sell specialized (high variance yet high expected value) commodities, then each type of store may benefit from collocating with the other, even though the stores sell substitutable products. The underlying intuition is that the presence of each type of retailer enhances consumer traffic at the shopping mall, which benefits the retailer or retailers of the other type. Under some parametric restrictions, the value of this increased traffic more than offsets the loss in markups due to competition from additional sellers at the mall. In this case, it is in a land developer's interest to rent retail space in the mall to both types of retailers. A Tiebout-like argument explains the striking similarity in the composition of stores in planned shopping malls.

515. Hideo Konishi and M. Utku Ünver (Koç University), "" (rev. 07/2002: 326 Kb, PDF; published, Social Choice and Welfare, 27, 3-24 (2006))

Abstract: In this paper, we analyze capacity manipulation games in hospital-intern markets inspired by the real-life entry-level labor markets for young physicians seeking residencies at hospitals. In these markets, where the matching is determined by a centralized clearinghouse called the National Residency Matching Program (NRMP) in the USA, hospitals usually report the number of vacant positions to the NRMP as well as their preferences. We consider a model where preferences of hospitals and interns are common knowledge, and hospitals play a game of reporting their capacities. We characterize the equilibria of the game-form for the two most widely used stable rules: hospital-optimal and intern-optimal stable rules. We show that (i) there may not be a pure strategy equilibrium in general; and (ii) when a pure strategy equilibrium exists other than true-capacities, truthful capacity revelation is weakly Pareto-dominated for hospitals. We also analyze other properties of the set of Nash equilibria. Finally, we present sufficient conditions on preferences to guarantee the existence of pure strategy equilibria.

514. Donald Cox, "" (10/2001: 190 Kb, PDF)

Abstract: Despite recent advances in data collection and the growing number of empirical studies that examine private intergenerational transfers, there still exist significant gaps in our knowledge. Who transfers what to whom, and why do they do it? I argue that some of these gaps could be filled by departing from the standard parent-child framework and concentrating instead on fathers, mothers, sons and daughters in a way that accounts for fundamental - and sometimes obvious - male-female differences in concerns and objectives in family life. Elementary sex differences in reproductive biology constitute the basic building blocks of studies of family behavior in many disciplines, but despite recent progress they get far less attention than they deserve in economic studies of the family. I explore, separately, the implications of three basic biological facts for intergenerational transfer behavior. The first is paternity uncertainty: how does it affect the incentives of fathers, mothers and of various grandparents to invest in children? The second is differing reproductive prospects of sons versus daughters: when are sons a better investment than daughters and vice versa? The third is conflict: How much acrimony might we expect to occur in families, and why? In examining these issues I also explore household survey data from the United States. This preliminary evidence is consistent with non-biological as well as biological explanations of behavior. Nonetheless, the biological focus confers two advantages, by generating falsifiable predictions and by illuminating new avenues for empirical work. There is enormous potential for further micro-data-based empirical work in this area.

513. Kim C. Border (California Institute of Technology) and Uzi Segal, "" (10/2001: 210 Kb, PDF)

512. Uzi Segal and Joel Sobel (University of California, San Diego), "" (10/2001: 281 Kb, PDF)

Abstract: This paper provides characterization theorems for preferences. The main assumption is partial separability, where changing a common component of two vectors does not reverse strict preferences, but may turn strict preferences into indifference. We discuss applications of our results to social choice.

511. Zvi Safra (Tel Aviv University) and Uzi Segal, "" (10/2001: 208 Kb, PDF)

Abstract: This note shows that Machina's (1982) assumption that preferences over lotteries are smooth has some economic implications. We show that Fréchet differentiability implies that preferences represent second order risk aversion (as well as conditional second order risk aversion). This implies, among other things, that decision makers buy full insurance only in the absence of marginal loading. We also show that with constant absolute and relative risk aversion, expected value maximization, second order risk aversion, and Fréchet differentiability are equivalent.

510. Uriel Procaccia (Hebrew University School of Law) and Uzi Segal, "" (rev. 03/2002: 257 Kb, PDF)

Abstract: The American and some other constitutions entrench property rights by requiring super majoritarian voting for amending or revoking their own provisions. Following Buchanan and Tullock [5], this paper analyzes individuals' interests behind a veil of ignorance, and shows that under some standard assumptions, a (simple) majoritarian rule should be adopted. This result changes if one assumes that preferences are consistent with the behavioral phenomenon known as the "endowment effect." It then follows that (at least some) property rights are best defended by super majoritarian protection. The paper then shows that its theoretical results are consistent with a number of doctrines underlying American Constitutional Law.

509. G. C. Lim (University of Melbourne) and Paul D. McNelis, "" (10/2001: 180 Kb, PDF)

Abstract: This paper examines the role of interest rate policy in a small open economy subject to terms of trade shocks and time-varying currency risk responding to domestic exchange rate volatility. The private sector makes optimal decisions in an intertemporal non-linear setting with rational, forward-looking expectations. In contrast, the monetary authority practices least-squares learning about the evolution of inflation, output growth, and exchange rate depreciation in alternative policy scenarios. Interest rates are set by linear quadratic optimization, with the objectives for inflation, output growth, or depreciation depending on current conditions. The simulation results show that the preferred stance is one which targets inflation and growth, not inflation only nor inflation, growth and depreciation. Including exchange rate changes as targets significantly increases output variability, but marginally reduces inflation variability.

508. James E. Anderson and Eric van Wincoop (Federal Reserve Bank of New York), "" (09/2001: PDF; published in Brookings Trade Policy Forum 2001, Dani Rodrik and Susan Collins, eds., 2002)

507. István Kónya, "" (rev. 10/2001: 212 Kb, PDF)

Abstract: The paper develops a general equilibrium model of migration, assimilation and trade, using a random matching framework of culture and trade. The market equilibrium and the social planner's solution are contrasted and policy implications are given. The model predicts that the presence of immigrants who do not assimilate into the mainstream culture is economically inefficient, but whether such migration occurs depends on the underlying parameters. Because of the endogeneity of the migration decision, care must be taken to select the optimal policy instruments. In particular, neither subsidizing assimilation nor auctioning immigration permits achieves the first best. Instead, a mix of selective immigration, border control and aid to the source country can be used to promote efficiency.

506. Richard Arnott, "" (05/2001: 83 Kb, PDF)

505. Michele Cavallo (New York University) and Fabio Ghironi, "" (rev. 02/2002: 539 Kb, PDF; published,Journal of Monetary Economics, 49:5, 2002)

Abstract: We revisit Obstfeld and Rogoff's (1995) results on exchange rate dynamics in a two-country, monetary model with incomplete asset markets, stationary net foreign assets, and endogenous nominal interest rate setting a la Taylor (1993). Under flexible prices, the nominal exchange rate exhibits a unit root. However, today's exchange rate also depends on the stock of real net foreign assets accumulated in the previous period. The predictive power of net assets for the exchange rate is stronger the closer assets are to non-stationarity and the higher the degree of substitutability between domestic and foreign goods in consumption. When prices are sticky, the exchange rate still exhibits a unit root. The current level of the exchange rate depends on the past GDP differential, along with net foreign assets. Endogenous monetary policy and asset dynamics have consequences for exchange rate overshooting under both flexible and sticky prices.

504. Alberto Alesina (Harvard University), Silvia Ardagna (Wellesley College), Roberto Perotti (European University Institute) and Fabio Schiantarelli, "" (07/2000: 1945 Kb, PDF; published, American Economic Review, 2002, 92:571-589)

Abstract: This paper evaluates the effects of fiscal policy on investment using a panel of OECD countries. In particular, we investigate how different types of fiscal policy affect profits and, as a result, investment. We find a sizable negative effect of public spending -- and in particular of its public wage component -- on business investment. This result is consistent with models in which government employment creates wage pressure for the private sector. Various types of taxes also have negative effects on profits, but, interestingly, the effects of government spending on investment are larger than the effect of taxes. Our results have important implications for the so-called 'Non-Keynesian' (i.e. expansionary) effects of fiscal adjustments.

503. Arturo Galindo (Inter-American Development Bank), Fabio Schiantarelli and Andrew Weiss (Boston University), "" (rev. 10/2003: 329 Kb, PDF)

Abstract: Has financial liberalization improved the efficiency with which investment funds are allocated to competing uses? In this paper, we address this question, using firm level panel data from twelve developing countries. The basic idea is to investigate whether financial liberalization has increased the share of investment going to firms with a higher marginal return to capital. To this end we develop a summary index of the efficiency of allocation of investment. We then examine the relationship between this index and various measures of financial liberalization. The results suggest that in the majority of cases financial reform has led to an increase in the efficiency with which investment funds are allocated.

502. Richard Arnott, "" (07/2001: 134 Kb, PDF)

501. Jushan Bai and Serena Ng, "" (06/2001: 217 Kb, PDF)

Abstract: We present the sampling distributions for the coefficient of skewness, kurtosis, and a joint test of normality for time series observations. In contrast to independent and identically distributed data, the limiting distributions of the statistics are shown to depend on the long run rather than the short-run variance of relevant sample moments. Monte Carlo simulations show that the test statistics for symmetry and normality have good finite sample size and power. However, size distortions render testing for kurtosis almost meaningless except for distributions with thin tails such as the normal distribution. Nevertheless, this general weakness of testing for kurtosis is of little consequence for testing normality. Combining skewness and kurtosis as in Bera and Jarque (1981) is still a useful test of normality provided the limiting variance accounts for the serial correlation in the data.
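As a rough illustration of the idea described in this abstract, the Python sketch below forms a skewness statistic for a time series in which the variance of the sample third moment is estimated with a Bartlett (Newey-West) long-run variance rather than the iid formula. It is a simplified stand-in, not the Bai-Ng statistic itself, and the data, lag rule, and function names are illustrative assumptions.

```python
# Illustrative sketch only: a skewness test whose sampling variance uses a
# Bartlett (Newey-West) long-run variance, in the spirit of the approach the
# abstract describes. NOT the paper's exact statistic: it ignores the effect
# of estimating the mean and variance and is meant purely to convey the idea.
import numpy as np

def long_run_variance(u, lags):
    """Bartlett-weighted long-run variance of a series u."""
    u = u - u.mean()
    T = len(u)
    lrv = np.dot(u, u) / T
    for k in range(1, lags + 1):
        gamma_k = np.dot(u[k:], u[:-k]) / T
        lrv += 2.0 * (1.0 - k / (lags + 1)) * gamma_k
    return lrv

def skewness_test(x, lags=None):
    """Return (sample skewness, z-statistic) for the null of symmetry."""
    x = np.asarray(x, dtype=float)
    T = len(x)
    if lags is None:
        lags = int(np.floor(4 * (T / 100.0) ** (2.0 / 9.0)))  # common rule of thumb
    e = x - x.mean()
    skew = np.mean(e ** 3) / e.var() ** 1.5
    lrv = long_run_variance(e ** 3, lags)          # long-run, not short-run, variance
    z = np.sqrt(T) * np.mean(e ** 3) / np.sqrt(lrv)
    return skew, z

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = np.zeros(500)                              # hypothetical AR(1) data
    for t in range(1, 500):
        y[t] = 0.5 * y[t - 1] + rng.standard_normal()
    print(skewness_test(y))
```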



500. Serena Ng and Pierre Perron (Boston University and CRDE, Université de Montréal), "" (06/2001: 239 Kb, Adobe Acrobat format)

Abstract: We consider issues related to the order of an autoregression selected using information criteria. We study the sensitivity of the estimated order to i) whether the effective number of observations is held fixed when estimating models of different order, ii) whether the estimate of the variance is adjusted for degrees of freedom, and iii) how the penalty for overfitting is defined in relation to the total sample size. Simulations show that the lag length selected by both the Akaike and the Schwarz information criteria are sensitive to these parameters in finite samples. The methods that give the most precise estimates are those that hold the effective sample size fixed across models to be compared. Theoretical considerations reveal that this is indeed necessary for valid model comparisons. Guides to robust model selection are provided.
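The role of holding the effective sample fixed is easy to see in code. The sketch below (Python, hypothetical data; not the authors' programs) selects an AR lag length by AIC or BIC while always discarding the first pmax observations, so every candidate model is compared on the same observations, with the penalty defined relative to that effective sample size.

```python
# Minimal sketch: lag-length selection by AIC/BIC with the effective sample
# held fixed across candidate AR(k) models, one of the configurations the
# abstract discusses. Hypothetical data; illustrative only.
import numpy as np

def select_ar_order(y, pmax, criterion="bic"):
    y = np.asarray(y, dtype=float)
    T_eff = len(y) - pmax                     # same effective sample for every k
    best_k, best_crit = None, np.inf
    for k in range(pmax + 1):
        Y = y[pmax:]                          # always start at pmax, not at k
        X = np.ones((T_eff, 1))
        for j in range(1, k + 1):
            X = np.column_stack([X, y[pmax - j:len(y) - j]])
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        resid = Y - X @ beta
        sigma2 = resid @ resid / T_eff        # no degrees-of-freedom adjustment
        penalty = (np.log(T_eff) if criterion == "bic" else 2.0) * (k + 1) / T_eff
        crit = np.log(sigma2) + penalty
        if crit < best_crit:
            best_k, best_crit = k, crit
    return best_k

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    y = np.zeros(300)                         # hypothetical AR(2) data
    for t in range(2, 300):
        y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()
    print(select_ar_order(y, pmax=8, criterion="bic"))
```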

499. Peter N. Ireland, "" (06/2001: 1812 Kb, Adobe Acrobat format)

Abstract: What explains the correlations between nominal and real variables in the postwar US data? Are these correlations indicative of significant nominal price rigidity? Or do they simply reflect the particular way that monetary policymakers react to developments in the real economy? To answer these questions, this paper uses maximum likelihood to estimate a model of endogenous money. This model allows, but does not require, nominal prices to be sticky. The results show that nominal price rigidity, over and above endogenous money, plays an important role in accounting for key features of the data.

498. Helen Connolly and Peter Gottschalk, "" (rev. 09/2004: 1.6 Mb, PDF; forthcoming, Canadian Journal of Economics)

Abstract: This paper asks whether wage subsidies encourage participants to move into jobs with greater wage growth. We provide an analytical framework that identifies the key causal links between earnings subsidies and both within- and between-job wage growth. This framework highlights the importance of the form of the subsidy on the decision about the type of job to accept. We find that the subsidy will lead participants to place a higher value on jobs with wage growth if the relationship between pre- and post-subsidy earnings is convex, but the subsidy is predicted to have no effect on within-job wage growth if the transformation is linear. The subsidy is also predicted to affect between-job wage growth by increasing on-the-job search and altering the reservation wage. We use this framework to analyze the effects of the Canadian Self-Sufficiency Project experiment. We find that this subsidy did not affect within-job wage growth but did increase wage gains between jobs.

497. Hideo Konishi, "" (03/2001: 114 Kb, Adobe Acrobat format; published in Economics Bulletin, 2001)

Abstract: This note shows that if commodities are tradable across jurisdictions, then it may be efficient to have heterogeneously sized jurisdictions, even if (i) consumers are identical, (ii) there is one private good and one public good, (iii) utility and production functions are not affected by population (within the relevant range of sizes of jurisdictions).

496. Ingela Alger and Ching-to Albert Ma (Boston University), "" (2/2001: 246 Kb, Adobe Acrobat format; forthcoming, Journal of Economic Behavior and Organization)

Abstract: A risk-averse consumer purchases an insurance policy; if she suffers a loss, she may receive services from a provider to recover some of the loss. Only the consumer and the provider know if the loss has actually occurred. The provider's behavior is uncertain. With some positive probability, the provider is honest, reporting the loss information truthfully to the insurer; with the complementary probability, the provider reports the information strategically, by writing a side-contract with the consumer to maximize the joint surplus of the provider-consumer coalition. We show that there is a loss of generality in considering only collusion-proof contracts, and characterize equilibria implemented by collusion-proof and noncollusion-proof contracts. When the probability of a provider acting collusively is small, the equilibrium contract is not collusion-proof but approximately first-best. When the probability of a provider acting collusively is large, the equilibrium contract is independent of this probability and identical to the equilibrium collusion-proof contract when the provider is collusive with probability 1.

495. Ingela Alger and François Salanié, "" (Rev. 11/2004: 196 Kb, Adobe Acrobat format)

Abstract: Consumers often have to rely on an expert's diagnosis to assess their needs. If the expert is also the seller of services, he may use his informational advantage to induce over-consumption. Empirical evidence suggests that over-consumption is a pervasive phenomenon in experts markets. We prove the existence of equilibrium over-consumption in an otherwise purely competitive model. This market failure results from the freedom of consumers to turn down an expert's recommendation: experts defraud consumers in order to keep them uninformed, as this deters them from seeking a better price elsewhere. Our model also yields predictions on the diagnosis price that are in line with stylized facts, and provides a theory for why risk-neutral consumers would demand extended warranties on durables.

494. Hideo Konishi, "" (rev. 11/2002: 492 Kb, Adobe Acrobat format; forthcoming, Transportation Science)

Abstract: This paper discusses uniqueness and efficiency of user equilibrium in transportation networks with heterogeneous commuters. Daganzo (1983, Transportation Science) proved the uniqueness of (stochastic) user equilibrium when commuters have heterogeneous tastes over possible paths but identical disutility functions from time costs. We first show, by example, that his result may not apply in general networks if disutility functions are allowed to differ. However, for "simple" transportation networks, we can show that user equilibrium is always unique and weakly Pareto efficient (cf. the Braess example) for a general class of utility functions. We investigate if this result applies to more general networks. We also show that user equilibrium is unique in a dynamic bottleneck model with a simple network. We discuss an interesting relationship between the following two problems: the existence of user equilibrium in a finite model and the uniqueness of user equilibrium in a continuum model. In the appendix, we also provide a proof of a slightly generalized version of Daganzo's theorem.

493. Maria Laura Parisi and Alessandro Sembenelli (Torino University), "" (2/2001: 243 Kb, Adobe Acrobat format)

Abstract: In this paper empirical evidence is presented on the elasticity of private R&D spending with respect to its price. A censored panel-data regression model with random effects is applied to a balanced panel of 726 Italian firms over the 1992-97 period. Implied estimates indicate that Italian firms' response to policy measures (including tax credits), aimed at reducing the user cost of R&D capital, is likely to be substantial (1.50-1.77). Furthermore, we also find that the elasticity of R&D spending is higher in recession (2.01) than in expansion (0.87).

492. Christopher F Baum and John Barkoulas (Georgia Southern University), "" (rev. 06/2004: 2.6 Mb, Adobe Acrobat format; forthcoming, Journal of Money, Credit and Banking)

Abstract: A number of previous studies have questioned the dominant role of Germany within the EMS. These conclusions are often based on empirical findings that interest rates of member countries of the EMS are not affected by German interest rates, even in the long run. In this study we establish evidence to the contrary by demonstrating that intra-EMS interest rate differentials (vis-a-vis Germany) exhibit mean-reverting behavior (a necessary but not sufficient condition for German dominance) characterized by long-memory dynamics. In a system incorporating six EMS countries and one non-EMS country (the U.S.), estimates from a fractional error correction model suggest the presence of short-run intra-EMS monetary-policy interdependencies, but they validate the German Dominance Hypothesis in the long run.

491. Peter N. Ireland, "" (2/2001: 271 Kb, Adobe Acrobat format)

Abstract: This paper extends a conventional cash-in-advance model to incorporate a real balance effect of the kind described by de Scitovszky, Haberler, Pigou, and Patinkin. When operative, this real balance effect eliminates the liquidity trap, allowing the central bank to control the price level even when the nominal interest rate hits its lower bound of zero. Curiously, the same mechanism that gives rise to the real balance effect also implies that monetary policies have distributional consequences that make some agents much worse off under a zero nominal interest rate than they are when the nominal interest rate is positive.

490. Donald Bruce (University of Tennessee), Douglas Holtz-Eakin (Syracuse University and NBER) and Joseph F. Quinn, "" (11/2000: 84 Kb, Adobe Acrobat format)

489. Ingela Alger and Régis Renault (GEMMA, Université de Caen), "" (rev. 11/2004: 191 Kb, Adobe Acrobat format)

Abstract: We explore the potential for discriminating between honest and dishonest agents, when a principal faces an agent with private information about the circumstances of the exchange (good or bad). When honest agents reveal circumstances truthfully independently of the contract offered, the principal leaves a rent only to dishonest agents (even if honest agents are willing to lie about their ethics); the principal is able to screen between good and bad circumstances. In contrast, if honest behavior is conditional on the contract being fair, the principal cannot screen along the ethics dimension. If the probability that the agent is dishonest is large, the optimal mechanism is as if the agent were dishonest with certainty (standard second best). Otherwise, it is as if the agent were honest with certainty (first best). In the latter case, the principal is unable to screen between circumstances if the agent is dishonest.

488. Christopher F Baum, Mustafa Caglayan (University of Liverpool) and Neslihan Ozkan (University of Liverpool), "" (rev. 07/2002: 1349 Kb, Adobe Acrobat format; published, Journal of Applied Econometrics, 2004, 19:1-23)

Abstract: In this paper, we empirically investigate the impact of exchange rate volatility on real international trade flows utilizing a 13-country dataset of monthly bilateral real exports for 1980-1998. We compute one-month-ahead exchange rate volatility from the intra-monthly variations in the exchange rate to better quantify this latent variable. We find that the effect of exchange rate volatility on trade flows is nonlinear, depending on its interaction with the importing country's volatility of economic activity, and that it varies considerably over the set of country pairs considered.
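As a loose illustration of a volatility proxy built from intra-monthly variation, the sketch below (Python/pandas, hypothetical data and names) computes the within-month standard deviation of daily log exchange-rate returns; the paper's exact construction of its one-month-ahead volatility measure may differ.

```python
# Illustrative sketch: a monthly exchange-rate volatility proxy computed from
# intra-monthly (daily) variation. Hypothetical data; not the authors' code.
import numpy as np
import pandas as pd

def monthly_volatility(daily_rates: pd.Series) -> pd.Series:
    """Within-month standard deviation of daily log returns."""
    log_returns = np.log(daily_rates).diff().dropna()
    return log_returns.resample("M").std()

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    dates = pd.bdate_range("1980-01-01", "1980-12-31")
    # Hypothetical random-walk exchange rate series.
    rate = pd.Series(np.exp(np.cumsum(0.005 * rng.standard_normal(len(dates)))),
                     index=dates)
    print(monthly_volatility(rate).head())
```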

487. Maurizio Zanardi, "" (11/2000: 352 Kb, Adobe Acrobat format)

Abstract: Empirical evidence for the United States shows that many antidumping petitions are withdrawn before the International Trade Commission and the International Trade Administration complete their investigations. Prusa (1992) argues that petitions are used by domestic industries to threaten and induce foreign industries into a collusive agreement exonerated from antitrust concerns because of US trade laws. In his model, all antidumping petitions should be withdrawn, which is not the case. This paper provides a model in which only some petitions are withdrawn and Prusa's result is just a special case. The decision to withdraw a petition depends on two key parameters: the coordination cost and the bargaining power of domestic and foreign industries. A new dataset is constructed to test the theoretical model on the US experience for the period 1980-1992. The econometric analysis supports the theoretical conclusions of the model. This finding is consistent with the hypothesis that the antidumping law is used as a collusive device.

486. Peter Gottschalk, "" (revised 04/2001: 280 Kb, Adobe Acrobat format)

Abstract: This paper presents evidence on the extent of wage mobility both while working for the same firm and when moving to a new firm. We find that mean wage growth between jobs is large in comparison to wage growth while working for the same employer, especially for less educated workers who experience low mean wage growth between jobs but even lower wage growth while working for the same employer. There is, however, substantial heterogeneity in wage growth both within and between firms. We, therefore, focus on both the means of the wage change distributions and on the probability that a worker does not experience real wage growth either while working for the same employer or moving to a new employer. We find that while real wages do grow with experience on the average job, a substantial proportion of workers experience real declines in wages while working for the same employer or moving to a new employer.

485. James E. Anderson and Eric van Wincoop (Federal Reserve Bank of New York), "" (revised 12/2000: PDF; published, American Economic Review, 2003, 93, 170-92)

Abstract: The gravity model has been widely used to infer substantial trade flow effects of institutions such as customs unions and exchange rate mechanisms. McCallum [1995] found that the US-Canada border led to trade between provinces that was a factor 22 (2,200%) times trade between states and provinces, a spectacular puzzle in light of the low formal barriers on this border. We show that the gravity model usually estimated does not correspond to the theory behind it. We solve the "border puzzle" by applying the theory seriously. We find that national borders reduce trade between the US and Canada by about 40%, while reducing trade among other industrialized countries by about 30%. The spectacular McCallum headline number is the result of a combination of omitted variables bias and the small size of the Canadian economy.

484. Hideo Konishi, Thomas Quint (University of Nevada at Reno) and Jun Wako (Gakushuuin University), "" (07/2000: PDF; published, Journal of Mathematical Economics, 35, 1-15 (2001))

Abstract: We study a generalization of Shapley-Scarf's (1974) economy in which multiple types of indivisible goods are traded. We show that many of the distinctive results from the Shapley-Scarf economy do not carry over to this model, even if agents' preferences are strict and can be represented by additively separable utility functions. The core may be empty. The strict core, if nonempty, may be multi-valued, and might not coincide with the set of competitive allocations. Furthermore, there is no Pareto efficient, individually rational, and strategy-proof social choice rule. We also show that the core may be empty in the class of economies with a single type of indivisible good and agents consuming multiple units, even if no complementarity exists among the goods.

483. Tommaso Monacelli, "" (11/2000: 353 Kb, Adobe Acrobat format)

Abstract: I study the macroeconomic costs (both in terms of stabilization and welfare) of the relinquishment of monetary policy independence associated with the membership of a currency area. The analysis is framed within a general equilibrium model of the world economy, composed of a large closed Union and a small (either independent or integrated) open economy. In terms of business cycle stabilization, I find that an economy relinquishing its monetary independence may face a potential trade-off between higher instability in real activity and lower instability in inflation. The tightness of this trade-off is found to be inversely related to the degree of cross-country symmetry of the shocks. In terms of welfare, maintaining the monetary stabilization tool proves to be always welfare improving. Finally, a higher degree of openness does not necessarily make a country a better candidate for participating in a currency area.

482. Donald Cox, Serena Ng, and Andreas Waldkirch, "" (10/2000: 229 Kb, Adobe Acrobat format)

Abstract: Consumption is partly a social activity, yet most studies of consumer behavior treat households in isolation. We investigate familial relationships in consumption patterns using a sample of parents and their children from the Panel Study of Income Dynamics. We find a positive and statistically significant parent-specific effect on children's consumption even after controlling for the effect of parental income, and we find similar effects for sibling pairs. Child consumption responds negatively to large post-retirement shortfalls in consumption of the parents. This behavior holds up even after allowing for the possibility of smaller parent-to-child transfers made necessary by the parental consumption shortfalls. These results suggest that although income is an important source of the intergenerational correlation, parental choices and experiences also affect consumption behavior of the children.

481. Barry Eichengreen (Berkeley) and Fabio Ghironi, "" (Revised 05/2001: 103 Kb, Adobe Acrobat format; previously circulated as "The Future of EMU" and "EMU in 2010: Heterogeneity, Institutions, and Fiscal Policy")

Abstract: We speculate about how Europe's monetary union will evolve in the next five to ten years. We concentrate on what is likely to be the most important change in that period, namely, the increased number and heterogeneity of the participating states. New members will be sharply different from the incumbents in terms of their per capita incomes and economic structures. We concentrate on the implications of this development for the structure, organization and operation of the monetary union. We focus on the implications for the conduct of monetary policy of voting and representation rules on the ECB Board on the grounds that these will have to change with the accession of additional members. We focus on prudential supervision and lending in the last resort on the grounds that the inclusion of countries with recently-created and still-developing financial systems will be among the most prominent consequences of EMU enlargement. We focus on the coordination of fiscal policies on the grounds that the fiscal positions and problems of the accession economies will differ from those of the incumbents. And we focus on labor market flexibility on the grounds that labor-market effects will be among the leading consequences of the admission of new members.

480. James E. Anderson and Oriana Bandiera (London School of Economics), "" (10/2000: PDF; published as "Private Enforcement and Social Efficiency",Journal of Development Economics, 2005, 77, 341-66)

Abstract: Historical evidence suggests that Mafias originally formed to provide enforcement of legitimate property rights when state enforcement was weak. We provide a general equilibrium model of Mafias as enforcement coalitions which protect property from predators. Both the level of predation and the type of enforcement -- self-enforcement, specialized competitive enforcement and Mafia enforcement -- are endogenous. We identify the conditions under which a coalition emerges and persists and show that Mafias are most likely to be found at intermediate stages of economic development. We also show that Mafias might provide better enforcement to the rich than would a welfare-maximizing state, suggesting a difficulty in the emergence and persistence of state provision of enforcement.

479. Arthur Lewbel, "" (10/2000: 188 Kb, Adobe Acrobat format)

Abstract: This paper proposes a form of asymptotic trimming to obtain root n convergence of functions of kernel estimated objects. The trimming is designed to deal with the boundary effects that arise in applications where densities are bounded away from zero.

478. Hideo Konishi and Debraj Ray (New York University), "" (rev. 04/2002: 308 Kb, Adobe Acrobat format; published, Journal of Economic Theory, 110, 1-41)

Abstract: We study coalition formation as an ongoing, dynamic process, with payoffs generated as coalitions form, disintegrate, or regroup. A process of coalition formation (PCF) is an equilibrium if a coalitional move to some other state can be "justified" by the expectation of higher future value, compared to inaction. This future value, in turn, is endogenous: it depends on coalitional movements at each node. We study existence of equilibrium PCFs. We connect deterministic equilibrium PCFs with unique absorbing state to the core, and equilibrium PCFs with multiple absorbing states to the largest consistent set. In addition, we study cyclical as well as stochastic equilibrium PCFs.

477. Douglas Marcouiller, S.J., "" (08/2000: 116 Kb, Adobe Acrobat format)

Abstract: Insecurity impedes trade. Using a variant of the gravity model, the workhorse of empirical international economics, Anderson and Marcouiller (1999) showed that transparent government policies and enforceable commercial contracts significantly reduce trade costs and increase trade volume. This paper asks two further questions. Does insecurity impede some types of trade more than others? Do different dimensions of insecurity affect different types of trade differently?

476. Fabio Ghironi and Alessandro Rebucci (International Monetary Fund), "" (Revised 08/2001: 908 Kb, Adobe Acrobat format)

Abstract: We compare the performance of a currency board arrangement, inflation targeting, and dollarization in a small open, developing economy with a liberalized capital account. We focus explicitly on the transmission of shocks to currency and country risk premia in international financial markets and on the role of fluctuations in premia in the propagation of other shocks. We calibrate our model on Argentina. The framework fits the data relatively well in that it matches the second moments of several key macro variables. Welfare analysis suggests that dollarization is preferable to the alternative regimes we consider because it removes the volatility that originates from the currency premium. However, a currency board can match dollarization if the central bank holds a sufficiently large stock of foreign reserves on average.



475. James E. Anderson and Leslie Young (Chinese University of Hong Kong), "" (03/2000: 80 Kb, Adobe Acrobat format)

Abstract: Without the rule of law, traders who incur trading costs can be held up by counter-parties who are stronger in anarchic bargaining. The favourable terms which the latter extract can overcrowd that side of the market, dissipating the benefits. We establish plausible necessary and sufficient conditions for a move from anarchy toward the rule of law to benefit all traders. The rule of law might be delayed, not only by the difficulties of setting up legal institutions, but by monopolistic traders that have meantime emerged to address the inefficiencies of anarchic trade. These monopolistic traders must also guarantee atomistic traders against holdup.

474. Richard Arnott and An Yan, "" (09/2000: 124 Kb, Adobe Acrobat format)

Abstract: Suppose that there are two congestible modes of travel from A to B - road and rail for concreteness - which are imperfect substitutes in demand. Road congestion from A to B is underpriced; this is an unalterable distortion. Compared to the first best, should the transportation planner choose a wider or narrower road, raise or lower the rail fare, and expand or contract rail capacity? This paper provides a synthetic review of the literature on the problem, presents some new results, and discusses directions for future research on this and related second-best problems.

473. Helen Connolly and Peter Gottschalk, "" (11/2000: 177 Kb, Adobe Acrobat format)

Abstract: This paper explores whether within job and between job wage growth is lower for less-educated workers. While a simple model of heterogeneous learning ability predicts that individuals with low learning ability will have flatter wage profiles, this prediction has been largely ignored in the recent welfare reform debates. The key econometric problem in estimating returns to tenure and experience is that wages depend on the unobservable job match component, which is endogenous. We depart from the standard method for dealing with this problem in one important way. We show that this alternative implies that wages grow with the number of previous successful job matches. In our empirical work we show that this source of between job wage growth is large. Furthermore, we show that this source of wage growth, as well as the standard returns to tenure and experience, are substantially smaller for the least educated.

472. Basma Bekdache (Wayne State University) and Christopher F Baum, "" (09/2000: 224 Kb, Adobe Acrobat format)

Abstract: This paper shows that the recent literature that tests for a long-run Fisher relationship using cointegration analysis is seriously flawed. Cointegration analysis assumes that the variables in question are I(1) or I(d) with the same d. Using monthly post-war U.S. data from 1959-1997, we show that this is not the case for nominal interest rates and inflation. While we cannot reject the hypothesis that nominal interest rates have a unit root, we find that inflation is a long-memory process. A direct test for the equality of the fractional differencing parameter for both series decisively rejects the hypothesis that the series share the same order of integration.
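For a feel for how the fractional differencing parameter d behind such a comparison might be estimated, here is a minimal Python sketch of the standard GPH (Geweke and Porter-Hudak) log-periodogram regression on hypothetical data; the estimator and the equality test actually used in the paper may differ.

```python
# Minimal sketch of the GPH log-periodogram estimator of the fractional
# differencing parameter d. Illustrative only; the paper's own estimator and
# its test for equality of d across series may differ.
import numpy as np

def gph_estimate(x, power=0.5):
    """Return (d_hat, asymptotic standard error) from the GPH regression."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    T = len(x)
    m = int(np.floor(T ** power))               # number of low frequencies used
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / T                   # Fourier frequencies
    dft = np.fft.fft(x)[1:m + 1]
    I = np.abs(dft) ** 2 / (2.0 * np.pi * T)    # periodogram ordinates
    # log I(lam_j) = c - d * log(4 sin^2(lam_j / 2)) + error
    reg = -np.log(4.0 * np.sin(lam / 2.0) ** 2)
    X = np.column_stack([np.ones(m), reg])
    beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    d_hat = beta[1]
    dev = reg - reg.mean()
    se = np.sqrt(np.pi ** 2 / 6.0 / np.sum(dev ** 2))
    return d_hat, se

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    # Hypothetical short-memory (d = 0) series for illustration.
    print(gph_estimate(rng.standard_normal(1000)))
```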

471. Fabio Ghironi, "" (rev. 02/2003: 580 Kb, Adobe Acrobat format)

Abstract: I develop a tractable, two-country, real model of macroeconomic interdependence with a role for net foreign asset dynamics. Absence of Ricardian equivalence in an overlapping generations structure ensures existence of a well-defined, endogenously determined, steady-state, international distribution of asset holdings, to which the world economy returns following temporary shocks. The model offers a plausible explanation for the failure of statistical tests to reject the hypothesis of a unit root in series of net foreign assets. Model dynamics after productivity shocks are significantly different from those of a setup in which net foreign assets do not move after shocks, such as Corsetti and Pesenti's (2001a) model. The difference relative to a complete markets economy in which net foreign asset movements play no role in shock transmission is smaller. It is amplified if the substitutability across goods rises, if shocks are permanent, and if steady-state net foreign assets are not zero.

470. Fabio Ghironi, "" (01/2000: 271 Kb, Adobe Acrobat format)

Abstract: This paper proposes a microfounded general equilibrium model of the U.S. and European economies suitable for analyzing the transmission of monetary and fiscal policy shocks between the U.S. and Europe. The focus is on understanding the determinants of transatlantic economic interdependence. A positive analysis of the consequences of policy changes in the U.S. and Europe is made and results about the transmission of such shocks are obtained. In the model, consumer preferences in the U.S. and Europe are biased in favor of goods produced in the continent where agents reside. Hence, PPP does not hold across the Atlantic, except in steady state. However, this is not sufficient to cause overshooting of the dollar exchange rate following policy shocks. A U.S. current-account surplus can be achieved by means of a monetary expansion, a persistent increase in government spending, and/or higher long-run distortionary taxes relative to Europe.

469. Fabio Ghironi, "" (02/2000: 215 Kb, Adobe Acrobat format)

Abstract: I estimate the structural parameters of a small open economy model using data from Canada and the United States. The model improves upon the recent literature in open economy macroeconomics from an empirical perspective. I estimate parameters by using non-linear least squares at the single-equation level. Estimates of most parameters are characterized by small standard errors and are in line with the findings of other studies. I also develop a plausible way of constructing measures for non-observable variables. To verify if multiple-equation regressions yield significantly different estimates, I run full information maximum likelihood, system-wide regressions. The results of the two procedures are similar. Finally, I illustrate a practical application of the model, showing how a shock to the U.S. economy is transmitted to Canada under an inflation targeting monetary regime.

468. Gianluca Benigno (Bank of England), Pierpaolo Benigno (New York University) and Fabio Ghironi, "" (rev. 10/2003: 180 Kb, Adobe Acrobat format)

Abstract: This paper shows that properly designed interest rate rules can be consistent with maintaining exchange rate stability. It sheds light on the relation between interest rate rules, exchange-rate regimes, and determinacy of the rational expectations equilibrium in a modern macroeconomic framework.

467. Barry Eichengreen (Berkeley) and Fabio Ghironi, "" (11/1999: 267 Kb, Adobe Acrobat format)

Abstract: This paper studies the impact of changes in the extent to which fiscal policy is distortionary on the short-run macroeconomic tradeoffs facing fiscal policymakers in an era of budget equilibrium. It does so in an open economy framework that we use to interpret U.S.-European policy interactions. Our analysis features both fiscal and monetary policy to study how changes in the extent to which fiscal policy is distortionary affect the interaction between central banks and fiscal authorities, both intra- and internationally. In addition, strategic interactions among policymakers, and the tradeoffs they face, are affected by the exchange-rate regime. When government spending is funded through distortionary taxes alone (a scenario that we call anti-Keynesian), changing spending moves both inflation and employment in the desired direction following a worldwide supply shock. Smaller and more open economies face a more favorable tradeoff than large relatively closed ones. Under a managed exchange rate regime, European governments face a better tradeoff than under flexible rates, but the improvement is more significant for the country that controls the exchange rate. When both European countries in our model join in a monetary union, the country that had control of the exchange rate under the managed exchange rate regime faces a worse tradeoff, while the tradeoff improves for the country that controlled money supply. In the fully Keynesian case, in which taxes are non-distortionary, all countries face the same positively sloped tradeoff regardless of the exchange-rate regime. Increases in spending cause both output and inflation to rise. When fiscal policy is neither fully anti-Keynesian nor fully Keynesian, the governments' tradeoffs lie in between the extreme cases, and the exact position depends on the extent to which fiscal policy is Keynesian. Under all European exchange-rate regimes, small increases in the fraction of firms that are subject to distortionary taxation at home are beneficial when the equilibrium is characterized by unemployment, while a less Keynesian fiscal policy abroad is harmful. Governments in the U.S. and Europe will want the ECB and the Fed to coordinate their reactions to an unfavorable supply shock, while monetary policymakers will have little incentive to do so. Intra-European fiscal cooperation can be counterproductive, whereas cooperation between governments and central banks inside each continent can be beneficial. Our study suggests that, if governments are concerned mainly about the relation between fiscal policy and the business cycle, maintaining some fiscal distortions may be optimal.

466. Fabio Ghironi, "" (Revised 10/2000: 265 Kb, Adobe Acrobat format)

Abstract: I compare the performance of alternative monetary rules for Canada using an open economy model under incomplete markets. Different rules generate different paths for the markup and the terms of trade. A comparison of welfare levels suggests that flexible inflation targeting, the Bank of Canada's current policy, dominates strict targeting rules (among them a fixed exchange rate with the U.S.) and the Taylor rule. In contrast to other studies, strict targeting rules generate a more stable real economy by stabilizing markup dynamics. Flexible inflation targeting dominates because it yields a positive covariance between consumption and the labor effort, which provides agents with a source of risk diversification.

465. Fabio Ghironi, "" (rev. 08/2003: 563 Kb, Adobe Acrobat format)

Abstract: This paper develops a small open economy, sticky-price model with a role for current account dynamics in the transmission of shocks. I solve the stationarity problem of incomplete markets, open economy models by adopting an overlapping-generations structure. I model nominal rigidity by assuming that firms face costs of output price inflation volatility. Markup dynamics affect labor demand and investment decisions. To illustrate the functioning of the model, I identify the home economy with Canada and analyze how a recession in the U.S. is transmitted to Canada under alternative inflation targeting rules. Stabilizing inflation (in consumer or producer prices) at a steady-state target in all periods results in a milder, but more persistent recession than a rule under which the interest rate reacts to inflation in a Taylor fashion. Markup dynamics and changes in asset holdings are central to this result.

464. Natalya Delcoure (Louisiana Tech University), John Barkoulas (University of Tennessee), Christopher F. Baum and Atreya Chakraborty (Charles River Associates), "" (06/2000: 76 Kb, Adobe Acrobat format; published, Global Finance Journal, 2003, 14, 83-93)

Abstract: Under conditions of risk neutrality and rational expectations in the foreign exchange market, there should be a one-to-one relationship between the forward rate and the corresponding future spot rate. However, cointegration-based tests of the unbiasedness hypothesis of the forward rate have produced mixed findings. In order to exploit significant cross-sectional dependencies, we test the unbiasedness hypothesis using a new multivariate (panel) unit-root test, the Johansen likelihood ratio (JLR) test, which offers important methodological advantages over alternative standard panel unit-root tests. When applied to a data set of eight major currencies in the post-Bretton Woods era, the JLR test provides strong and robust evidence in support of a unitary cointegrating vector between forward and corresponding future spot rates. However, the orthogonality condition is satisfied only for three major currencies.

463. Arthur Lewbel, "" (rev. 04/2003: 164 Kb, Adobe Acrobat format)

Abstract: Past parametric tests of demand system rank employed polynomial Engel curve systems. However, by Gorman's (1981) theorem, the maximum possible rank of a utility derived polynomial demand system is three. The present paper proposes a class of demand systems that are utility derived, are close to polynomial, and have rank four. These systems nest rational polynomial demands, and so can be used to test ranks up to four. These systems are suitable for applications where high rank is likely, such as demand systems involving a large number of goods. A test of rank using this new class of systems is applied to UK consumer demand data.

462. Arthur Lewbel, "" (rev. 10/2003: 447 Kb, Adobe Acrobat format; previously titled "Selection Model and Conditional Treatment Effects, Including Endogenous Regressors")

Abstract: In a sample selection or treatment effects model, common unobservables may affect both the outcome and the probability of selection in unknown ways. This paper shows that the distribution function of potential outcomes, conditional on covariates, can be identified given an observed instrument V that affects the treatment or selection probability in certain ways and is conditionally independent of either the potential outcomes themselves or the potential outcome equation error terms. Selection model estimators based on this identification are provided, which take the form of either simple weighted averages, GMM, or two stage least squares. These estimators permit endogenous and mismeasured regressors. Empirical applications are provided to estimation of a firm investment model and a returns-to-schooling wage model.

461. John Barkoulas (Louisiana Tech University), Christopher F. Baum and Atreya Chakraborty (Charles River Associates), "" (rev. 09/2001: 194 Kb, Adobe Acrobat format; published 2003, Journal of Macroeconomics, 25, 109-122)

Abstract: A plausible explanation for cointegration among spot currency rates determined in efficient markets is the existence of a stationary, time-varying currency risk premium. Such an interpretation is contingent upon stationarity of the forward premium. However, empirical evidence on the stochastic properties of the forward premium series has been inconclusive. We apply a panel unit-root test, the Johansen likelihood ratio (JLR) test, to forward exchange premiums by utilizing cross-sectional information from their term structure. In contrast to earlier studies, the JLR test provides decisive and temporally stable evidence in support of stationary forward premiums, and therefore foreign exchange market efficiency, for six major currencies.

460. Peter N. Ireland, "" (06/2000: 243 Kb, Adobe Acrobat format)

Abstract: In cash-in-advance models, necessary and sufficient conditions for the existence of an equilibrium with zero nominal interest rates and Pareto optimal allocations place restrictions only on the very long-run, or asymptotic, behavior of the money supply. When these asymptotic conditions are satisfied, they leave the central bank with a great deal of flexibility to manage the money supply over any finite horizon. But what happens when these asymptotic conditions fail to hold? This paper shows that the central bank can still implement the Friedman rule if its actions are appropriately constrained in the short run.

459. Peter Gottschalk and Enrico Spolaore (Brown University), "" (rev. 03/2001: 275 Kb, Adobe Acrobat format; forthcoming, Review of Economic Studies)

Abstract: This paper presents a framework for the evaluation and measurement of reversal and origin independence as separate aspects of economic mobility. We show how that evaluation depends on aversion to multi-period inequality, aversion to inter-temporal fluctuations, and aversion to second-period risk. We construct extended Atkinson indices that allow us to quantify the relative impact of reversal and time independence. We apply our approach to the comparison of income mobility in Germany and in the United States. We find that the ranking of Germany and the US on the extent of reversal depends on the degree of aversion to inequality. Reversal has a higher impact in the US than in Germany for lower degrees of aversion to multi-period inequality, while reversal has higher impact in Germany for higher degrees of inequality aversion. By contrast, Americans gain more than Germans from origin independence for a large range of degrees of inequality aversion.

458. Peter N. Ireland, "" (04/2000: 313 Kb, Adobe Acrobat format)

Abstract: A small, structural model of the monetary business cycle implies that real money balances enter into a correctly-specified, forward-looking IS curve if and only if they enter into a correctly-specified, forward-looking Phillips curve. The model also implies that empirical measures of real balances must be adjusted for shifts in money demand to accurately isolate and quantify the dynamic effects of money on output and inflation. Maximum likelihood estimates of the model's parameters take both of these considerations into account, but still suggest that money plays a minimal role in the monetary business cycle.

457. Arthur Lewbel, "" (01/2000: 156 Kb, Adobe Acrobat format; published in Econometric Theory, 2000, 16, 603-609)

Abstract: Misclassification in binary choice (binomial response) models occurs when the dependent variable is measured with error, that is, when an actual "one" response is sometimes recorded as a zero, and vice versa. This paper shows that binary choice models with misclassification are semiparametrically identified, even when the probabilities of misclassification depend in unknown ways on model covariates, and the distribution of the errors is unknown.
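For reference, the standard misclassification mapping that the abstract alludes to can be written as follows; the notation is ours and is meant only to fix ideas, not to restate the paper's assumptions.

```latex
% With misclassification probabilities b_0(x) = P(\tilde y = 1 \mid y^* = 0, x) and
% b_1(x) = P(\tilde y = 0 \mid y^* = 1, x), the observed response probability is
\[
P(\tilde y = 1 \mid x) \;=\; b_0(x) + \bigl[\,1 - b_0(x) - b_1(x)\,\bigr]\, P(y^* = 1 \mid x),
\]
% so the true choice probability P(y^* = 1 | x) remains recoverable as long as
% b_0(x) + b_1(x) < 1.
```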

456. Arthur Lewbel, "" (rev. 07/1999: 135 Kb, Adobe Acrobat format)

Abstract: A model is defined to be coherent if for each value of the regressors and errors there exists a unique corresponding value for the endogenous variables. This note provides necessary and sufficient conditions for the coherent specification of a simultaneous two equation system where one equation is a binary choice model, and the other is an arbitrary model (e.g. a nonlinear regression, a censored regression, or another binary choice). Some examples and extensions are provided, including a suggested functional form for simultaneous systems of binary choice equations. These models may be used to test if choices are substitutes or complements.

455. Bo E. Honoré (Princeton University) and Arthur Lewbel, "" (Rev. 09/2001: 232 Kb, Adobe Acrobat format)

Abstract: Previous estimators of binary choice panel data models with fixed effects require strong parametric error assumptions, strictly exogenous regressors, or both. This is because nonlinearity of the model precludes the use of the "moment conditions on differences" based estimators that are generally employed for linear models without strictly exogenous regressors. Based on the cross section binary choice estimator in Lewbel (2000a), we show how discrete choice panel data models with fixed effects can be estimated with only predetermined regressors. The estimator is semiparametric in that the error distribution is not specified, it allows for some general forms of heteroskedasticity, and it converges at rate root n.

454. Arthur Lewbel, "" (08/1999: 348 Kb, Adobe Acrobat format; published in Journal of Econometrics, 97:1, 145-177, 2000)

Abstract: This paper provides estimators of discrete choice models, including binary, ordered, and multinomial response (choice) models. The estimators closely resemble ordinary and two stage least squares. The distribution of the model's latent variable error is unknown and may be related to the regressors, e.g., the model could have errors that are heteroscedastic or correlated with regressors. The estimator does not require numerical searches, even for multinomial choice. For ordered and binary choice models the estimator is root N consistent and asymptotically normal. A consistent estimator of the conditional error distribution is also provided.
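The following simulation sketches the special-regressor idea behind the binary choice version of the estimator; it is not the paper's full procedure (which also covers ordered and multinomial choice and a two stage least squares variant), and the uniform regressor with known density, the error law, and the coefficient value are illustrative assumptions.

```python
# Sketch of the special-regressor idea for binary choice, under assumed data.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
x = rng.normal(size=n)                      # ordinary regressor
v = rng.uniform(-6.0, 6.0, size=n)          # "special" regressor: wide support, independent of (x, e)
f_v = 1.0 / 12.0                            # known density of v
e = rng.logistic(size=n)                    # latent error; its distribution is unknown to the econometrician
beta = 1.5
y = (beta * x + v + e > 0).astype(float)    # observed binary response

y_tilde = (y - (v > 0)) / f_v               # transformed dependent variable
beta_hat = np.sum(x * y_tilde) / np.sum(x * x)   # least squares without intercept
print(round(beta_hat, 2))                   # roughly 1.5 in large samples
```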

453. Michael Sandfort (U.S. Department of Justice) and Hideo Konishi, "" (01/2000: 262 Kb, Adobe Acrobat format; formerly titled "Price Advertisement by Retail Stores: A Commitment Device"; published, International Journal of Industrial Organization, 20, 965-994, 2002)

Abstract: Price advertisement by retail stores is pervasive. If there exist non-negligible costs of consumer search, a retailer can increase the number of consumers visiting its location by advertising a low price, thus increasing consumers' expected utilities from search. If the increase in the number of consumers who visit the store is substantial, then the store's profit goes up even though low prices decrease profit margins. We show that this intuition extends to the case of a multi-product monopolist, who may choose to advertise very low prices for a limited number of items it carries, even when advertised and non-advertised commodities are substitutes. Finally, we analyze a retail duopoly in which both stores sell from the same location, showing that under some circumstances, there is an incentive for one of the retailers to free-ride on the other's advertisement.

452. John P. Conley (University of Illinois at Urbana-Champaign) and Hideo Konishi, "" (01/2000: 158 Kb, Adobe Acrobat format; published in Journal of Public Economics, 86, 241-260)

Abstract: Tiebout's basic claim was that when public goods are local there is an equilibrium and every equilibrium is efficient. The literature seems to fall short of verifying this conjecture: if the notion of equilibrium is too weak, then equilibrium is nonempty yet some equilibria could be inefficient. On the other hand, if the notion of equilibrium is too strong, then every equilibrium is efficient yet equilibrium may be empty. This paper introduces a new equilibrium notion, a migration-proof Tiebout equilibrium, which is a jurisdiction structure such that (i) no consumer wants to migrate unilaterally across jurisdictions (free mobility of consumers), and (ii) no subgroup of consumers wants to form a new jurisdiction that would not create instability in the population distribution (free entry of migration-proof jurisdictions). We show that there is always a unique migration-proof equilibrium and that it is asymptotically efficient when consumers are homogeneous.

451. Marcus Berliant (Washington University) and Hideo Konishi, "" (01/2000: 314 Kb, Adobe Acrobat format; published in Regional Science and Urban Economics, 30, 289-324 (2000))

Abstract: Much of the literature on the endogenous generation of a city employs increasing returns to scale in order to obtain agglomeration. In contrast, the model considered here focuses on the role of marketplaces or trading centers in the agglomeration of population as cities. Gains to trade in combination with transportation and marketplace setup costs suffice to endogenously generate a city or cities with one or multiple marketplaces. It is assumed that consumers are fully mobile while production functions are location-specific. The exchange of commodities takes place in competitive markets at the marketplaces, while the number and locations of the marketplaces are determined endogenously using a core concept. Unlike the standard literature of urban economics, our model can deal with differences in geography by letting the setup costs of marketplaces and the transportation system depend on location. After showing that an equilibrium exists and that equilibrium allocations are the same as core allocations, we investigate the equilibrium number and locations of marketplaces, the population distribution, and land prices. In contrast with earlier literature, the results are general in the sense that specific functional forms are not needed to obtain existence of equilibrium, equilibria are first best, and equilibria are locally unique (in our examples).

450. Hideo Konishi and Michael Sandfort (Southern Methodist University), "" (01/2000: 245 Kb, Adobe Acrobat format; published in Journal of Economic Dynamics and Control, 26, 1029-1052)

Abstract: We prove the existence of stationary equilibrium in the primary and secondhand markets for an indivisible consumer durable in a general model with stochastic degradation and endogenous scrappage decisions. Unlike Rust (1985), we introduce transaction costs in the model as a motivation for consumer holdings of durables across multiple quality levels. In addition, we allow for multiple types of durables (e.g., Porsche and BMW). Since we use a fixed point theorem in making the existence argument, we do not need to invoke the single-crossing property on consumer tastes.

449. Suryapratim Banerjee (McGill University), Hideo Konishi and Tayfun Sonmez (Koc University and University of Michigan), "" (12/1999: 268 Kb, Adobe Acrobat format; published, Social Choice and Welfare, 18, 135-153 (2001))

Abstract: We analyze the core of a class of coalition formation games in which every player's payoff depends only on the members of her coalition. We first consider anonymous games and additively separable games. Neither of these strong properties guarantees the existence of a core allocation, even if additional strong properties are imposed. We then introduce two top-coalition properties, each of which guarantees the existence. We show that these properties are independent of the Scarf-balancedness condition. Finally we give several economic applications.

448. Hideo Konishi, "" (12/1999: 244 Kb, Adobe Acrobat format; published in Journal of Urban Economics, 48, 1-28 (2000))

Abstract: Many cities are located on rivers or coasts. This paper argues that such cities developed as transportation hubs or markets for interregional trade, since these locations provide better access (lower marginal transportation costs) to other regions. Local products are collected at such hubs, and interregional trade then takes place among these transportation hubs. As the volume of trade between hubs increases, more workers are needed in order to meet labor demand for shipping and handling commodities, resulting in population agglomeration at such hubs. This paper constructs a simple three-location, identical-consumer model, in which a transportation hub and population agglomeration emerge endogenously. In contrast with much of the literature on city formation, we introduce no economies of scale into the model. Markets are assumed to be perfectly competitive and complete. Since prices are determined in equilibrium, transportation costs and routes are simultaneously determined in the system. Population agglomeration occurs solely because of location-specific production technologies (which generate gains from trade) and the differences in transportation technologies among locations (which determine the transportation routes). It is shown that a hub city emerges when transportation technologies are heterogeneous enough.

447. Hideo Konishi, "" (12/1999: 256 Kb, Adobe Acrobat format)

Abstract: Geographical concentration of stores that sell similar commodities is pervasive. To analyze this phenomenon, this paper provides a simple two dimensional spatial competition model with consumer taste uncertainty. Given taste uncertainty, concentration of stores attracts more consumers since more variety means that a consumer has a higher chance of finding her favorite commodity (a market size effect). On the other hand, concentration of stores leads to fiercer price competition (a price cutting effect). The trade-off between these two effects is the focus of this paper. We provide a few sufficient conditions for the nonemptiness of equilibrium store location choices in pure strategies. We illustrate, by an example, that the market size effect is much stronger for small scale concentrations, but as the number of stores at the same location becomes larger, the price cutting effect eventually dominates. We also discuss consumers' incentives to visit a concentration of stores instead of using mail orders.

446. Guillermo Alger (Analysis Group Economics) and Ingela Alger, "" (12/1999: 463 Kb, Adobe Acrobat format)

Abstract: This paper summarizes theoretical findings on the determinants of liquid assets held by banks. The findings are summarized in a series of predictions, some of which are tested using a panel data set on Mexican banks. Surprisingly, we find that banks with relatively more demand deposits have relatively less liquid assets, in contrast with the theoretical prediction. We further exploit a period characterized by a prolonged aggregate liquidity shock on the Mexican banking system to shed light on the question: are there banks that rely more than others on liquid assets to meet their liquidity needs? We find that only small banks seem to rely on liquid assets to meet severe liquidity shocks.

445. Serena Ng and Timothy Vogelsang (Cornell University), "" (07/1999: 414 Kb, Adobe Acrobat format)

Abstract: This paper studies the error in forecasting a dynamic time series with a deterministic component. We show that when the data are strongly serially correlated, forecasts based on a model which detrends the data before estimating the dynamic parameters are much less precise than those based on an autoregression that includes the deterministic components. The local asymptotic distribution of the forecast errors under the two-step procedure exhibits bimodality, and the forecasts are conditionally median biased in a direction that depends on the order of the deterministic trend function. We explore the conditions under which feasible GLS detrending can lead to forecast error reduction. The finite sample properties of OLS and feasible GLS forecasts are compared with forecasts based on unit root pretesting. The procedures are applied to fifteen macroeconomic time series to obtain real time forecasts. Forecasts based on feasible GLS detrending tend to be more efficient than forecasts based on OLS detrending. Regardless of the detrending method, unit root pretests often improve forecasts.
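A toy Monte Carlo along the lines of the comparison described above (not the paper's design, and using plain OLS detrending rather than feasible GLS) is sketched below; the trend, persistence, and sample size are arbitrary assumptions.

```python
# Toy comparison of one-step-ahead forecasts for a trending, highly persistent series:
# (i) an autoregression that includes the deterministic terms directly, versus
# (ii) a two-step procedure that OLS-detrends first and then fits an AR(1).
import numpy as np

rng = np.random.default_rng(0)
T, reps, rho = 100, 2000, 0.95
errs = {"one-step": [], "two-step": []}

for _ in range(reps):
    u = np.zeros(T + 1)
    for t in range(1, T + 1):
        u[t] = rho * u[t - 1] + rng.normal()
    trend = np.arange(T + 1)
    y = 0.5 + 0.1 * trend + u
    y_in, y_next = y[:T], y[T]

    # (i) one-step: regress y_t on (1, t, y_{t-1})
    X = np.column_stack([np.ones(T - 1), trend[1:T], y_in[:-1]])
    b = np.linalg.lstsq(X, y_in[1:], rcond=None)[0]
    f1 = b @ np.array([1.0, T, y_in[-1]])

    # (ii) two-step: detrend by OLS, fit AR(1) to residuals, re-attach the trend
    Z = np.column_stack([np.ones(T), trend[:T]])
    c = np.linalg.lstsq(Z, y_in, rcond=None)[0]
    resid = y_in - Z @ c
    phi = np.sum(resid[1:] * resid[:-1]) / np.sum(resid[:-1] ** 2)
    f2 = c @ np.array([1.0, T]) + phi * resid[-1]

    errs["one-step"].append(y_next - f1)
    errs["two-step"].append(y_next - f2)

for name, e in errs.items():
    print(name, "forecast RMSE:", round(float(np.sqrt(np.mean(np.square(e)))), 3))
```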

444. Robert A. Moffitt (Johns Hopkins University) and Peter Gottschalk, "" (Rev. 11/98: 7.6 Mb, Adobe Acrobat format)

Abstract: We use the Michigan Panel Study of Income Dynamics to decompose the well-known rise in cross-sectional variance of individual male earnings in the U.S. into permanent and transitory components. We find that about half of the increase has arisen from an increase in the variance of the permanent component of earnings and half from an increase in the variance of the transitory component over the period 1969-1991. In contrast to previous work by the authors, we employ a formal model for earnings dynamics. We also show that our results are robust to alternative models for earnings dynamics used in past studies in the literature. Finally, we draw out the implications of our analysis for the study of earnings mobility and show that the findings obtained in recent studies of trends in mobility are contained in our earnings dynamics model.
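To fix ideas only, the sketch below performs a much cruder between-person/within-person split of the cross-sectional variance than the formal earnings-dynamics model estimated in the paper; all magnitudes are simulated assumptions.

```python
# Crude permanent/transitory variance split on a simulated panel of log earnings
# (illustration only; the paper estimates a formal model of earnings dynamics).
import numpy as np

rng = np.random.default_rng(1)
n_people, n_years = 2000, 10
mu = rng.normal(scale=0.5, size=(n_people, 1))            # permanent component
eps = rng.normal(scale=0.3, size=(n_people, n_years))     # transitory component
log_earn = 10.0 + mu + eps

person_means = log_earn.mean(axis=1)
var_permanent = person_means.var(ddof=1)                  # between-person variance (~0.25 plus a small-sample term)
var_transitory = ((log_earn - person_means[:, None]) ** 2).sum() / (n_people * (n_years - 1))
print("permanent :", round(var_permanent, 3))
print("transitory:", round(var_transitory, 3))            # ~0.09
```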

443. Richard Arnott, "" (12/2001: 261 Kb, Adobe Acrobat format)

Abstract: Consider a corridor which connects a continuum of residential locations to the CBD (central business district) and which is subject to flow congestion. All (identical) individuals travel along the corridor from home to work in the morning rush hour and have the same work start time. Each individual decides when to depart from home so as to minimize the sum of travel time costs, time early costs, and toll costs (when applicable). This paper investigates the pattern of traffic flow over the morning rush hour and the social optimum, and considers the implications for land use and road cost-benefit analysis.

442. Arthur Lewbel, Daniel McFadden (University of California-Berkeley) and Oliver Linton (London School of Economics), "" (rev. 07/2010: PDF)

Abstract: We propose estimators of features of the distribution of an unobserved random variable W. What is observed is a sample of (Y, V, X), where the binary Y equals one when W exceeds a threshold V determined by experimental design, and X are covariates. Potential applications include bioassay and destructive duration analysis. Our empirical application is referendum contingent valuation in resource economics, where one is interested in features of the distribution of the values W (willingness to pay) placed by consumers on a public good such as an endangered species. Sampled consumers with characteristics X are asked whether they favor (with Y = 1 if yes and zero otherwise) a referendum that would provide the good at a cost V specified by experimental design. This paper provides estimators for quantiles and for moments of W conditional on X, under both nonparametric and semiparametric specifications.
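The identification idea can be illustrated in a few lines: when the bid V is assigned independently of W, the probability of a yes answer at bid v estimates 1 - F_W(v), so quantiles of W can be read off the estimated yes-rate curve. The sketch below does this on simulated data with a fixed bid design; it is not the paper's nonparametric or semiparametric estimator and it ignores covariates.

```python
# Identification sketch for referendum contingent valuation (simulated data, no covariates).
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
W = rng.lognormal(mean=3.0, sigma=0.5, size=n)          # unobserved willingness to pay
bids = np.array([5, 10, 20, 40, 80, 160])               # experimental design
V = rng.choice(bids, size=n)
Y = (W > V).astype(float)                               # observed referendum response

yes_rate = np.array([Y[V == b].mean() for b in bids])   # estimates 1 - F_W(bid)
print(dict(zip(bids.tolist(), np.round(yes_rate, 3))))

# The median of W is roughly the bid at which the yes rate crosses 0.5
# (linear interpolation on the log-bid scale).
median_hat = np.exp(np.interp(0.5, yes_rate[::-1], np.log(bids)[::-1]))
print("estimated median WTP:", round(median_hat, 1), "| true median:", round(float(np.exp(3.0)), 1))
```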

441. Arthur Lewbel and Serena Ng, "" (rev. 06/2002: 294 Kb, Adobe Acrobat format)

Abstract: Relative prices are nonstationary and standard root-T inference is invalid for demand systems. But demand systems are nonlinear functions of relative prices, and standard methods for dealing with nonstationarity in linear models cannot be used. Demand system residuals are also frequently found to be highly persistent, further complicating estimation and inference. We propose a variant of the Translog demand system, the NTLOG, and an associated estimator that can be applied in the presence of nonstationary prices with possibly nonstationary errors. The errors in the NTLOG can be interpreted as random utility parameters. The estimates have classical root-T limiting distributions. We also propose an explanation for the observed nonstationarity of aggregate demand errors, based on aggregation of consumers with heterogeneous preferences in a slowly changing population. Estimates using US data are provided.

440. Jushan Bai and Serena Ng, "" (03/2000: 261 Kb, Adobe Acrobat format)

Abstract: In this paper we develop some econometric theory for factor models of large dimensions. The focus is the determination of the number of factors, which is an unresolved issue in the rapidly growing literature on multifactor models. We propose some panel C(p) criteria and show that the number of factors can be consistently estimated using the criteria. The theory is developed under the framework of large cross-sections (N) and large time dimensions (T). No restriction is imposed on the relation between N and T. Simulations show that the proposed criteria yield almost precise estimates of the number of factors for configurations of the panel data encountered in practice.
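A compact sketch of the procedure, using one commonly cited member of this family of criteria (often written IC_p2) as an assumption, is given below; factors are extracted by principal components from a simulated panel.

```python
# Estimate the number of factors by minimizing
#   IC(k) = ln V(k) + k * ((N + T)/(N T)) * ln(min(N, T)),
# where V(k) is the average squared residual after removing k principal components.
# The exact class of criteria is defined in the paper; this variant is an assumption.
import numpy as np

rng = np.random.default_rng(3)
T, N, r = 200, 100, 3
F = rng.normal(size=(T, r))                 # true factors
L = rng.normal(size=(N, r))                 # loadings
X = F @ L.T + rng.normal(size=(T, N))       # observed panel
X = X - X.mean(axis=0)

U, s, Vt = np.linalg.svd(X, full_matrices=False)

def V(k):                                   # mean squared residual with k factors removed
    return np.mean((X - (U[:, :k] * s[:k]) @ Vt[:k]) ** 2)

penalty = (N + T) / (N * T) * np.log(min(N, T))
ic = [np.log(V(k)) + k * penalty for k in range(1, 9)]
print("estimated number of factors:", int(np.argmin(ic)) + 1)   # recovers r = 3 here
```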

439. Arthur Lewbel and Oliver Linton (London School of Economics and Yale University), "" (01/2000: 341 Kb, Adobe Acrobat format)

Abstract: The nonparametric censored regression model, with a fixed, known censoring point (normalized to zero), is y = max[0,m(x)+e], where both the regression function m(x) and the distribution of the error e are unknown. This paper provides consistent estimators of m(x) and its derivatives. The convergence rate is the same as for an uncensored nonparametric regression and its derivatives. We also provide root n estimates of weighted average derivatives of m(x), which equal the coefficients in linear or partly linear specifications for m(x). An extension permits estimation in the presence of a general form of heteroskedasticity. We also extend the estimator to the nonparametric truncated regression model, in which only uncensored data points are observed.

438. Jordi Gali (Universitat Pompeu Fabra and New York University) and Tommaso Monacelli, "" (11/1999: 575 Kb, Adobe Acrobat format)

Abstract: We lay out a tractable small open economy version of the canonical sticky price model, and use it as a framework to study the properties of three alternative monetary regimes: (a) optimal monetary policy, (b) a Taylor rule, and (c) an exchange rate peg. Several interesting results emerge from our analysis. First, the optimal policy is shown to entail a positive correlation between domestic and world interest rates. That doesn’t prevent sizable fluctuations of nominal and real exchange rates from occurring, though the implied volatility of those variables is much smaller than the empirical one. Second, a Taylor rule generally leads to excess volatility of nominal variables, and excess smoothness of real variables, relative to the optimal policy. Finally, we show that a pure exchange rate peg causes the equilibrium to be indeterminate and may thus be a source of macroeconomic instability.

437. Tommaso Monacelli, "" (01/1999: 523 Kb, Adobe Acrobat format)

Abstract: Industrial countries moving from fixed to floating exchange rate regimes experience dramatic rises in the variability of the real exchange rate. This evidence, forcefully documented by Mussa (1986), is a puzzle to the extent that it is hard to reconcile with the assumption of flexible prices. This paper shows that a model that combines nominal rigidities with a systematic behavior of monetary policy approximating a managed-fixed exchange rate regime is consistent with Mussa's findings: the real exchange rate is between three and six times more variable under floating than under fixed rates, and this holds independently of the underlying shocks. The impact of the change in regime on the volatility of other real macroeconomic variables, however, depends crucially on the specification of the monetary policy rule and on the source of fluctuations. The model also takes a theoretical stand on other issues raised in the empirical literature, such as the so-called exchange rate anomaly and the sign of international monetary policy shock transmission.

436. Gary Burtless (Brookings Institution) and Joseph F. Quinn, "" (01/2000: 313 Kb, Adobe Acrobat format; published in Ensuring Health and Income Security for an Aging Workforce, Peter Budetti, Richard Burkhauser, Janice Gregory and Allan Hunt, eds. Kalamazoo: The W. E. Upjohn Institute for Employment Research, 2001, pp. 375-415)

Abstract: The trend toward earlier and earlier retirement was one of the most important labor market developments of the twentieth century. It was evident in all the major industrialized countries. In the United States, however, the trend toward earlier retirement came to at least a temporary halt in the mid-1980s. Male participation rates at older ages have stabilized or even increased slightly. Older women’s participation rates are clearly rising. This paper examines the environmental and policy changes contributing to the long-term decline in the U.S. retirement age as well as developments that contributed to the recent reversal. The dominant source of earlier retirement was the long-term increase in Americans’ wealth, which permitted workers to enjoy rising living standards even as they spent a growing percentage of their lives outside the paid work force. The expansion of Social Security pensions and of employer-sponsored pension plans and the introduction of mandatory retirement rules also encouraged earlier retirement over much of the last century. Many public policies and private institutions that encouraged early retirement have been modified in recent years. Mandatory retirement has been outlawed in most jobs. Social Security is no longer growing more generous, and worker coverage under company pension plans is no longer rising. Both Social Security and many private pensions have become more "age neutral" with respect to retirement. Public and private pension programs now provide weaker financial incentives for workers to retire at particular ages, such as age 62 or age 65, and offer stronger incentives for aging workers to remain in the labor force. The paper outlines additional policies that could encourage later retirement. An open question is whether such policies are needed. Rising labor productivity and increased work effort during the pre-retirement years mean that Americans can continue to enjoy higher living standards, even as improved longevity adds to the number of years that workers spend in retirement. If opinion polls are to be believed, most workers favor preserving the institutions that allow early retirement even if it means these institutions will require heavier contributions from active workers.

435. Robert G. Murphy, "" (11/1999: 157 Kb, Adobe Acrobat format; published in The Economic Outlook for 2000, S. Hymans, ed., 2000, 135-176)

Abstract: This paper assesses the apparent decline during the 1990s in the unemployment rate associated with stable inflation--the so-called "NAIRU." The paper argues that supply shocks alone are not sufficient to account for this decline and that changes in labor markets are in part responsible. I consider several popular labor-market explanations for the decline. Although a demographic shift toward a more experienced workforce, a growing use of temporary employees, and a skyrocketing prison population probably have contributed to the decline in the NAIRU, they do not adequately explain the timing of an acceleration in that decline during the mid-1990s. I propose an alternative explanation based on evidence showing an increase during the 1990s in the synchronization of regional economic conditions. In particular, I suggest that greater uniformity in economic conditions across regions during the current business expansion has limited spillovers of wage and price pressures from one region of the country to another, thereby lowering the national NAIRU.

434. Atreya Chakraborty (Brandeis University), Mark Kazarosian and Emery Trahan (Northeastern University), "" (07/1999: 77 Kb, Adobe Acrobat format)

Abstract: We test whether uncertainty in the CEO's compensation influences the firm's investment decisions, using panel compensation data and cross-sectional investment data. Given the prospect of bearing extra risk, a rational agent reacts to minimize the impact of such risk. We provide evidence that CEOs with high earnings uncertainty invest less. As expected, the negative impact of permanent earnings uncertainty on firm investment is larger than that of transitory earnings uncertainty. The results are robust to several alternative specifications and lend support to Stulz' over-investment hypothesis. Knowing how investment is tied to the CEO's earnings uncertainty helps in building the correct compensation package.

433. Atreya Chakraborty (Brandeis University) and Mark Kazarosian, "" (11/1999: 62 Kb, Adobe Acrobat format)

Abstract: Since the mid-1980s many authors have investigated the influence of information technology (IT) on productivity. Until recently there has been no clear evidence that productivity increases as a result of IT spending. This productivity paradox is partly due to the difficulty in correctly identifying outputs, particularly in service sectors such as the trucking industry. Products are often differentiated by quality attributes of the service provided, rather than merely the physical content of the good delivered by motor carriers. A carrier's primary marketing objectives, e.g. on-time performance vs. lowest-rate carrier, are precisely what differentiate a trucking firm's service. This paper uses cross-sectional data to show that the use of increasingly sophisticated IT by trucking firms varies depending upon marketing objectives. Our empirical results imply that in order to measure the impact of IT on productivity it is crucial to account for how the firm differentiates its product. We conclude that the productivity paradox can be alleviated if measures of output incorporate firms' marketing objectives.

432. Atreya Chakraborty (Brandeis University) and Mark Kazarosian, "" (11/1999: 100 Kb, Adobe Acrobat format)

Abstract: Economic theory predicts that earnings uncertainty increases precautionary saving and causes households to include relatively liquid assets in their portfolios. Risk avoidance and the demand for liquidity cause these portfolio choices. Studies investigating United States evidence of precautionary portfolio allocation are nonexistent. With panel data, our results confirm the precautionary motive, and indicate that the desire to moderate total exposure to risk (temperance) and the demand for liquidity each affect the household's portfolio. Both permanent and transitory earnings uncertainty boost total wealth, and this precautionary wealth tends to be invested in safe, liquid assets. These results are particularly pronounced for people facing borrowing constraints. Such behavior is consistent with consumer utility functions that exhibit decreasing absolute risk aversion and decreasing strength of the precautionary motive (prudence). Our findings are important because both unemployment compensation and income taxes provide insurance that reduces earnings uncertainty. As a result, precautionary saving is both curtailed and reallocated. These policies could have large effects on capital formation and interest rates, through changes in the composition of household asset demand.

431. Robert L. Clark (North Carolina State University) and Joseph F. Quinn, "" (09/1999: 259 Kb, Adobe Acrobat format)

430. Peter Gottschalk and Sheldon Danziger (U. of Michigan), "" (rev. 02/2001: 80 Kb, Adobe Acrobat format; forthcoming as Chapter 5 in The Dynamics of Child Poverty in Industrialized Countries, Bradbury, Jenkins, and Micklewright, eds., Cambridge University Press)

Abstract: This paper asks two questions about child poverty dynamics. The first is whether long-run transitions out of poverty have changed. The second is whether the events associated with exits from poverty have changed. We use the Panel Study of Income Dynamics to contrast the patterns of children 0 to 5 over the 1970's to patterns for similar children over the 1980's. We find that roughly half of the children who were in poor families at the start of each decade remained poor. For black children and children in female headed households, both the relative and absolute mobility are considerably lower. These mobility rates show no significant changes over time. Likewise, the events associated with exits out of poverty are remarkably stable.

429. Peter Gottschalk and Michael Hansen, "" (rev. 07/2001: 76 Kb, Adobe Acrobat format; forthcoming, Journal of Labor Economics)

Abstract: This paper explores the claim that college educated workers are increasingly likely to be in "non-college" occupations. We provide a conceptual framework which gives analytical content to the previously vague distinction between college and non-college jobs. This framework is used to show that skill-biased technological change will lead to a decline in the proportion of college workers in non-college jobs. This prediction is supported by the data.

428. James E. Anderson, "" (08/1999: 112 Kb, Adobe Acrobat format; published in Pacific Economic Review 5 (2000), 115-134)

Abstract: There isn't nearly as much trade as standard models suggest there should be. Formal trade barriers and transport costs are too low to account for the difference. The pattern of missing trade has interesting variation across country pairs. These clues suggest the need for theoretical and eventually structural empirical work on the missing transactions costs. This paper reviews recent empirical findings and some promising research directions in search, predation and contract theory.

427. Helen Connolly and Peter Gottschalk, "" (rev. 12/2001: 180 Kb, Adobe Acrobat format; previously titled "Dead-end and Stepping-stone Jobs: Theory and Evidence")

Abstract: This paper explores the wage and job dynamics of less-skilled workers by estimating a structural model in which agents choose among jobs that differ in initial wage and wage growth. The model also formalizes the intuitive notion that some of these jobs offer "stepping stones" to better jobs. The estimated model assumes that job offers consist of three attributes: an initial wage, an expected wage growth, and an indicator of the distribution from which future offers will come. We derive the conditions under which agents accept these offers and the effect of involuntary terminations on the acceptance decision. This model shows that the probability of leaving an employer depends both on the slope and intercept of the current and offered jobs and the probability of gaining access to the dominant wage offer distribution. We use the SIPP to estimate this model, which allows us to recover parameters of the wage offer distributions and the probability that a job is a stepping stone job. Our empirical work indicates that wage offer distributions vary systematically with the slope and intercept of wages in the current job and that there is a non-zero probability of being offered a stepping stone job.

426. Peter N. Ireland, "" (07/1999: 310 Kb, Adobe Acrobat format; published in Journal of Monetary Economics, 2001, 47:1, 3-18)

Abstract: This paper focuses on the specification and stability of a dynamic, stochastic, general equilibrium model of the business cycle with sticky prices. Maximum likelihood estimates reveal that the data prefer a version of the model in which adjustment costs apply to the price level but not to the inflation rate. Formal hypothesis tests provide evidence of instability in the estimated parameters, concentrated in the Euler equation linking consumption growth to the interest rate.

425. Peter N. Ireland, "" (07/1999: 245 Kb, PDF)

Abstract: This paper addresses the problem of multiple equilibria in a model of time-consistent monetary policy. It suggests that this problem originates in the assumption that agents have rational expectations and proposes several alternative restrictions on expectations that allow the monetary authority to build credibility for a disinflationary policy by demonstrating that it will stick to that policy even if it imposes short-run costs on the economy. Starting with these restrictions, the paper derives conditions that guarantee the uniqueness of the model's steady state; monetary policy in this unique steady state involves the constant deflation advocated by Milton Friedman.

424. Joseph F. Quinn, "" (05/1999: 206 Kb, PDF)

423. Robert L. Clark (North Carolina State University) and Joseph F. Quinn, "" (05/1999: 44 Kb, PDF)

422. Christopher F. Baum, Mustafa Caglayan (University of Durham) and John T. Barkoulas (Louisiana Tech University), "" (revised 06/2000: 120 Kb, PDF; published, Journal of Macroeconomics, 2001, 23, 565-576)

Abstract: This paper investigates the effects of permanent and transitory components of the exchange rate on firms' profitability under imperfect information. Utilizing a signal extraction framework, we show that the variances of these components of the exchange rate process will have indeterminate effects on the firm's growth rate of profits, but will have predictable effects on its volatility. An increase in the variance of the permanent (transitory) component in the exchange rate process leads to greater (lesser) variability in the growth rate of the firm's profits, thus establishing that the source of exchange rate volatility matters in analyzing its effects. Implications of our theoretical findings for the empirical modeling of the underlying relationships are discussed.
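As a point of reference, the textbook signal-extraction weight can be written as below; the notation is ours, and the paper's framework (imperfectly observed policy signals) is richer than this static benchmark.

```latex
% Static signal-extraction benchmark: the observed exchange rate change is the sum
% of a permanent and a transitory shock,
%   \Delta s_t = p_t + \tau_t, \quad p_t \sim (0,\sigma_p^2), \ \tau_t \sim (0,\sigma_\tau^2), \ \text{independent},
% and the minimum mean-squared-error estimate of the permanent part is
\[
\mathbb{E}\,[\,p_t \mid \Delta s_t\,] \;=\; \frac{\sigma_p^2}{\sigma_p^2 + \sigma_\tau^2}\,\Delta s_t ,
\]
% so the weight placed on an observed movement rises with the share of permanent variance.
```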

421. Peter N. Ireland, "" (04/1999: 205 Kb, PDF)

Abstract: This paper develops a method for combining the power of a dynamic, stochastic, general equilibrium model with the flexibility of a vector autoregressive time-series model to obtain a hybrid that can be taken directly to the data. It estimates this hybrid model via maximum likelihood and uses the results to address a number of issues concerning the ability of a prototypical real business cycle model to explain movements in aggregate output and employment in the postwar US economy, the stability of the real business cycle model's structural parameters, and the performance of the hybrid model's out-of-sample forecasts.

420. Joseph F. Quinn, "" (03/1999: 222 Kb, PDF)

419. Peter N. Ireland, "" (02/1999: 96 Kb, PDF)

Abstract: This paper characterizes Federal Reserve policy since 1980 as one that actively manages short-term nominal interest rates in order to control inflation and evaluates this policy using a dynamic, stochastic, sticky-price model of the United States economy. The results show that the Fed's policy insulates aggregate output from the effects of exogenous demand-side disturbances and, by calling for a modest but persistent reduction in short-term interest rates following a positive technology shock, helps the economy to respond to supply-side disturbances as it would in the absence of nominal rigidities.

418. James E. Anderson and Douglas Marcouiller, S.J., "" (Rev. 08/2000: PDF; published in Review of Economics and Statistics, 2002, 84, 342-352.)

Abstract: Corruption and imperfect contract enforcement dramatically reduce trade. This paper estimates the reduction, using a structural model of import demand in which transactions costs impose a price markup on traded goods. We find that inadequate institutions constrain trade far more than tariffs do. We also find that omitting indexes of institutional quality from the model leads to an underestimate of home bias. Using a broad sample of countries, we find that the traded goods expenditure share declines significantly as income per capita rises, other things equal. Cross-country variation in the effectiveness of institutions offers a simple explanation of the observed global pattern of trade, in which high-income, capital-abundant countries trade disproportionately with one another.

417. James E. Anderson, "" (rev. 05/2000: PDF; published, Review of International Economics, 2002, 10, 215-36)

Abstract: The desirability of trade reform paired with revenue neutral changes in other distortionary taxes is an empirical question. With a particular Computable General Equilibrium model of an economy, particular reforms can be evaluated, but the robustness of conclusions is suspect; they depend on a particular specification and parameterization of the model economy. This paper provides a diagnostic toolkit which permits sensitivity analysis across model specifications and parameterizations. Novel elements are an emphasis on the concept of compensated Marginal Cost of Funds (MCF), development of the MCF of quotas and analysis of the relationship between aggregate MCF and social welfare.

416. James E. Anderson and J. Peter Neary (University College Dublin), "" (11/1998: PDF; published, International Economic Review, 2003, 44, 627-49)

Abstract: International trade policies are often compared across countries and over time for a variety of purposes. Analysts use such measures as arithmetic or trade-weighted average tariffs, Non-Tariff Barrier (NTB) coverage ratios and measures of tariff dispersion. All such measures are without theoretical foundation. In this paper we develop and characterise a theoretically-based index number of trade policy which is appropriate to trade negotiations. We characterize an index of trade policy restrictiveness defined as the uniform tariff equivalent which maintains the same volume of trade as a given set of tariffs, quotas, and domestic taxes and subsidies. We relate this volume-equivalent index to the Trade Restrictiveness Index, a welfare-equivalent measure, and relate changes in both indexes to changes in the generalised mean and variance of the tariff schedule. Applications to international cross-section and time-series comparisons of trade policy show that the new index frequently gives a very different picture than do standard indexes.

415. Peter N. Ireland, "" (11/98: 128 Kb, PDF)

Abstract: This paper derives the restrictions imposed by Barro and Gordon's theory of time-consistent monetary policy on a bivariate time-series model for inflation and unemployment and tests those restrictions using quarterly US data from 1960 through 1997. The results show that the data are consistent with the theory's implications for the long-run behavior of the two variables, indicating that the theory can explain inflation's initial rise and subsequent fall over the past four decades. The results also suggest that the theory must be extended to account more fully for the short-run dynamics that appear in the data.

414. Robert G. Murphy, "" (11/98, PDF; published, Business Economics, 1999, 34, 33-38)

Abstract: This paper confirms that the unemployment rate associated with stable inflation, the so-called "NAIRU," probably has declined in recent years, after having risen sharply during the late 1970s and early 1980s. Although a demographic shift toward a less experienced workforce and an unexpected slowdown in trend productivity growth are able to explain the earlier rise in the NAIRU, a reversal of these effects does not adequately explain the timing of the apparent decline in the NAIRU during the 1990s. I propose that an additional element needs to be incorporated into the assessment. I argue that the degree of integration of regional labor markets across the United States has accelerated over the recent past, leading to a greater degree of synchronization in the pattern of regional labor market conditions and regional business-cycle conditions. I provide evidence of this greater synchronization, and suggest that it may have led to a drift downward in the NAIRU.

413. Oriana Bandiera, Gerard Caprio Jr. (World Bank), Patrick Honohan (World Bank and CEPR) and Fabio Schiantarelli, "" (10/98: 503 Kb, PDF)

Abstract: The effect of financial liberalization on private saving is theoretically ambiguous, not only because the link between interest rate levels and saving is itself ambiguous, but also because financial liberalization is a multi-dimensional and phased process, sometimes involving reversals. Some dimensions, such as increased household access to consumer credit or housing finance, might also work to reduce private savings rather than increasing them. Furthermore, the long-term effect of liberalization on savings may differ substantially from the impact effect. Using Principal Components, we construct a 25-year time series index of financial liberalization for each of eight developing countries: Chile, Ghana, Indonesia, Korea, Malaysia, Mexico, Turkey and Zimbabwe. This is employed in an econometric analysis of private saving in these countries. We find that the pattern of effects differs across countries. In summary, liberalization appears to have had a significant positive direct effect on saving in Ghana and Turkey, and a negative effect in Korea and Mexico. No clear effect is discernible in the other countries. There is no evidence of significant, positive and sizeable interest rate effects. For the present, our results must be taken as an indication that there is no firm evidence that financial liberalization will increase saving. Indeed, under some circumstances, liberalization has been associated with a fall in saving. All in all, it would be unwise to rely on an increase in private savings as the channel through which financial liberalization can be expected to increase growth.

412. Peter Gottschalk and Michael Hansen, "" (10/98: 182 Kb, PDF)

Abstract: This article examines whether recent college graduates have fared as well as their predecessors. We examine changes in both the wage and occupational distributions. Specifically, we explore the claim that college educated workers are increasingly likely to be in "non-college" occupations. The latter are defined using standard economic concepts rather than the subjective groupings of occupations used in previous studies. We show that changes in the wage distribution and changes in the proportion of college-educated workers in "non-college" jobs reflect continued improvements through the mid-1980s, but a deterioration in the late 1980s and early 1990s.

411. Richard Arnott, "" (10/98: 193 Kb, PDF)

Abstract: This paper investigates property tax systems (linear taxes on pre-development land value, post-development structure value, and post-development site value) from a partial equilibrium perspective. Particular attention is paid to characterizing property tax systems that are neutral with respect to the timing and density of development and to calculating the deadweight loss from non-neutral property tax systems.

410. Jushan Bai and Serena Ng, "" (8/98: 281 Kb, PDF)

Abstract: The assumption of conditional symmetry is often invoked to validate adaptive estimation and consistent estimation of ARCH/GARCH models by quasi maximum likelihood. Imposing conditional symmetry can increase the efficiency of bootstraps if the symmetry assumption is valid. This paper proposes a procedure for testing conditional symmetry. The proposed test does not require the data to be stationary or i.i.d., and the dimension of the conditional variables could be infinite. The size and power of the test are satisfactory even for small samples. In addition, the proposed test is shown to have non-trivial power against root-T local alternatives. Applying the test to various time series, we reject conditional symmetry in inflation, exchange rate and stock returns. These data have previously been tested and rejected for unconditional symmetry. Among the non-financial time series considered, we find that investment, the consumption of durables, and manufacturing employment also reject conditional symmetry. Interestingly, these are series whose dynamics are believed to be affected by fixed costs of adjustments.

409. Basma Bekdache (Wayne State University) and Christopher F. Baum, "" (Rev. 04/2000: 716 Kb, PDF)

Abstract: Excess returns earned in fixed-income markets have been modeled using the ARCH-M model of Engle et al. and its variants. We investigate whether the empirical evidence obtained from an ARCH-M type model is sensitive to the definition of the holding period (ranging from 5 days to 90 days) or to the choice of data used to compute excess returns (coupon or zero-coupon bonds). There is robust support for the inclusion of a term spread in a model of excess returns, while the significance of the in-mean term depends on characteristics of the underlying data.
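For readers unfamiliar with the specification, a generic ARCH-M model of excess returns with a term spread regressor can be written as below; the paper's exact lag lengths, in-mean transformation, and data construction may differ.

```latex
% Generic ARCH-M specification (the in-mean coefficient \delta is the term whose
% significance the abstract discusses); details here are illustrative, not the paper's exact model.
\[
\begin{aligned}
r_t &= \beta_0 + \beta_1\,\mathrm{spread}_{t-1} + \delta\, h_t + \varepsilon_t,
      \qquad \varepsilon_t \mid \Omega_{t-1} \sim N(0,\, h_t),\\
h_t &= \omega + \sum_{i=1}^{q} \alpha_i\, \varepsilon_{t-i}^{2}.
\end{aligned}
\]
```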

Note: this paper was previously titled "Conditional heteroskedasticity models of excess returns: How robust are the results?"

408. Peter Gottschalk and Robert Moffitt (Johns Hopkins University), "" (Rev. 02/99: 230 Kb, PDF; previously circulated as "Changes in Job and Earnings Instability in the Panel Study of Income Dynamics and the Survey of Income and Program Participation")

Abstract: This paper has two objectives. The first is to provide evidence on changes in short term job turnover using a previously underutilized data source, the Survey of Income and Program Participation (SIPP). The results from the SIPP are contrasted with data from the Panel Study of Income Dynamics (PSID), a more widely used data set. The second objective of the paper is to describe the changes in the events accompanying job turnover. The implicit normative assumption behind much of the public discussion of job turnover is that turnover is undesirable because it is either "involuntary" or leads to worsened outcomes, such as an increase in the probability of unemployment or a decrease in wages. We, therefore, also examine several of these outcomes to see if the perception that conditions have worsened reflects changes in these events.

407. Peter Gottschalk and Enrico Spolaore, "" (Revised 11/98: 254 Kb, PDF)

Abstract: This paper provides an explicit welfare basis for evaluating economic mobility. Our social welfare function can be seen as a natural dynamic extension of the static social welfare function presented in Atkinson and Bourguignon (1982). Unlike Atkinson and Bourguignon, we use social preferences a la Kreps-Porteus, for which the timing of resolution of uncertainty may matter. Within this generalized framework, we show that welfare evaluation of mobility depends on the interplay between aversion to inequality, risk aversion, and aversion to intertemporal fluctuations. This framework allows us to provide a welfare analysis not only of "reversal" (which has been the focus of much of the literature) but also of "origin independence" (which has not received an explicit welfare foundation in the literature). We use our framework to develop welfare measures of mobility, and apply these measures to intergenerational mobility in the United States using PSID data. We show that the value of origin independence is quantitatively important. We also show that different subpopulations experience different mobility patterns: reversal is more important than origin independence for blacks but the opposite is true for non-blacks.

406. Joseph F. Quinn, "" (4/98: 87 Kb, PDF)

405. John T. Barkoulas (University of Tennessee), Christopher F. Baum and Mustafa Caglayan (University of Liverpool), "" (revised 09/2001: 132 Kb, PDF; published, Journal of International Money and Finance, 21, 481-496, 2002)

Abstract: This paper investigates the effects of exchange rate uncertainty on the volume and variability of trade flows. Employing a signal extraction framework, we show that the direction and magnitude of importers' and exporters' optimal trading activities depend upon the source of the uncertainty (general economic shocks, fundamental factors driving the exchange rate process, or noise in the signal of policy innovations), providing a rationale for the contradictory empirical evidence in the literature. We also show that exchange rate uncertainty emanating from general economic shocks and the fundamental factors reduces the variability of trade flows, while that related to noise in the signal of policy innovations increases variability.

404. Christopher F. Baum, Mustafa Caglayan (Koc University) and John T. Barkoulas (Louisiana Tech University), "" (revised 11/1999: 531 Kb, PDF; published, Journal of International Money and Finance, 2001)

Abstract: This paper models the dynamics of adjustment to long-run purchasing power parity (PPP) over the post-Bretton Woods period in a nonlinear framework consistent with the presence of frictions in international trade. We estimate exponential smooth transition autoregressive (ESTAR) models of deviations from PPP using both CPI- and WPI-based measures for a broad set of U.S. trading partners. We find clear evidence of a mean-reverting dynamic process for sizable deviations from PPP, with an equilibrium tendency varying nonlinearly with the magnitude of disequilibrium.
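The generic ESTAR form referred to in the abstract can be written as follows (our notation; the paper's lag structure and normalizations may differ), which makes the nonlinear adjustment explicit.

```latex
% Generic ESTAR model for a deviation from PPP, q_t:
\[
q_t \;=\; \alpha + \beta\, q_{t-1}
      \;+\; \bigl(\alpha^{*} + \beta^{*} q_{t-1}\bigr)
            \Bigl[\,1 - \exp\!\bigl(-\gamma\,(q_{t-d} - c)^{2}\bigr)\Bigr]
      \;+\; \varepsilon_t , \qquad \gamma > 0 .
\]
% Near equilibrium the transition term is close to zero and q_t behaves like a
% near-unit-root process; for large deviations the outer regime dominates and the
% process mean-reverts, which is the pattern the abstract reports.
```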

403. Eric Ghysels (Pennsylvania State University) and Serena Ng, "" (3/98: 507 Kb, PDF; forthcoming, Review of Economics and Statistics)

Abstract: Many continuous time term structure of interest rate models assume a factor structure where the drift and volatility functions are affine functions of the state variable process. These models involve very specific parametric choices of factors and functional specifications of the drift and volatility. Moreover, under the affine term structure restrictions not all factors necessarily affect interest rates at all maturities simultaneously. This class of so called affine models covers a wide variety of existing empirical as well as theoretical models in the literature. In this paper we take a very agnostic approach to the specification of these diffusion functions and test implications of the affine term structure restrictions. We do not test a specific model among the class of affine models per se. Instead, the affine term structure restrictions we test are based on the derivatives of the responses of interest rates to the factors. We also test how many and which factors affect a particular rate. These tests are conducted within a framework which models interest rates as functions of "fundamental" factors, and the responses of interest rates to these factors are estimated with non-parametric methods. We consider two sets of factors, one based on key macroeconomic variables, and one based on interest rate spreads. In general, despite their common use we find that the empirical evidence does not support the restrictions imposed by affine models. Besides testing the affine structure restrictions we also uncover a set of fundamental factors which appear remarkably robust in explaining interest rate dynamics at the long and short maturities we consider.
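The affine class being tested is usually written in the Duffie-Kan form shown below; this is a common statement of the restriction, not the paper's particular factor choices.

```latex
% Affine (Duffie-Kan) term structure class: the short rate is affine in a state
% vector X_t, and the factor drift and instantaneous variances are affine in X_t.
\[
r_t = \delta_0 + \delta_1' X_t, \qquad
dX_t = \kappa(\theta - X_t)\,dt + \Sigma\,\sqrt{S_t}\;dW_t, \qquad
[S_t]_{ii} = \alpha_i + \beta_i' X_t .
\]
% The derivative-based restrictions the paper tests follow from these affine drift
% and volatility functions.
```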

402. Francis McLaughlin, "" (2/98: 28 Kb, PDF; published, Historical Journal of Massachusetts).

Abstract: In the winter of 1931, in the depth of the Great Depression, and under extremely adverse circumstances, Boston longshoremen engaged in, and endured, a long and bitter strike because of a fundamental disagreement with their employers about what constituted a reasonable pace of work. This account of that strike illustrates the enduring need, in a rational industrial relations system, for institutional means of fairly resolving unavoidable differences between employers and employees about the concrete meaning of distributive justice.

401. Francis McLaughlin, "" (2/98: 36 Kb, PDF; published, Historical Journal of Massachusetts).

Abstract: Despite the common opinion that the Knights of Labor had virtually disappeared by the end of the 1890s, it remained the dominant labor organization among railroad freight handlers and longshoremen in Boston until just prior to the outbreak of World War I. The seemingly solid position of the Knights in the Boston transportation industry crumbled almost overnight in 1912, when a strike by Boston longshoremen was defeated resoundingly. The longshoremen's experience in this strike led them to abandon the Knights of Labor and move en masse to the International Longshoremen's Association by the beginning of 1913.

400. Joseph Quinn, Kevin Cahill, Richard V. Burkhauser (Syracuse), Robert Weathers (Syracuse), "" (2/98: 176 Kb, Adobe Acrobat format)

399. Peter Gottschalk, John Fitzgerald (Bowdoin College), Robert Moffitt (Johns Hopkins University), "" (11/97: 132 Kb, Adobe Acrobat format; does not currently include all tables/figures)

398. Peter Gottschalk and Sheldon Danziger (University of Michigan), "" (12/97: 83 Kb, Adobe Acrobat format)

397. Peter Gottschalk, "" (7/97: 66 Kb, Adobe Acrobat format; does not currently include figures)

396. John T. Barkoulas (Louisiana Tech University), Christopher F. Baum and Atreya Chakraborty (Brandeis University), "" (rev. 12/1999: 413 Kb, Adobe Acrobat format; published, Economics Letters, 70, 237-243, 2001)

Abstract: Does merger and acquisition (M&A) activity occur in waves, that is, are there oscillations between low and high levels of M&A activity? The answer to this question is important in developing univariate as well as structural models for explaining and forecasting the stochastic behavior of M&A activity. There is evidence to suggest that aggregate U.S. time-series data on merger and acquisition (M&A) activity exhibit a "wave" behavior, which has been modeled by fitting either a two-state Markov switching-regime model or a sine-wave model to the data. This study provides an alternative characterization of the temporal patterns in M&A as a nonlinear process with strongly persistent or long-memory dynamics. The apparent level changes or partial cycles of differing magnitudes in aggregate M&A time series are consistent with an underlying data generating process exhibiting long memory. Time- and frequency-domain estimation methods are applied to a long M&A time series constructed by Town (1992), covering approximately a century of merger activity in the U.S. economy. We find significant evidence of long-term cyclical behavior, nonperiodic in nature, in the M&A time series, even after accounting for potential shifts in the mean level of the series. A shock to M&A activity exhibits significant persistence as it is damped at a very slow hyperbolic rate, but it eventually dissipates. We provide both theoretical and empirical rationales for the presence of fractional dynamics with long-memory features in M&A activity. Theoretically, long-term dependence may be due to persistent differences in firm valuation between stockholders and nonstockholders following an "economic disturbance," as suggested by Gort (1969). Empirically, long-memory dynamics in M&A activity may reflect the statistical properties of fundamental factors underlying its behavior, as several of the proposed determinants of M&A activity have been shown to exhibit strong persistence.

395. Emanuela Cardia (University of Montreal) and Serena Ng, "" (Rev. 07/98: 306 Kb, Adobe Acrobat format)

Abstract: This paper examines the implications of intergenerational transfers of time and money for labor supply and capital accumulation. Although intergenerational transfers of time in the form of grandparenting are as substantial as monetary transfers in the data, little is known about the role and importance of time transfers. In this paper, we calibrate an overlapping generations model extended to allow for both time and monetary transfers to the US economy. We use simulations to show that time transfers have important positive effects on capital accumulation and that these effects can be as significant as those of monetary transfers. However, while time transfers increase the labor supply of the young, monetary transfers produce an income effect that tends to decrease work effort. We also find that child care tax credits have little impact on parental time and money transfers, but that a universal child tax credit would increase the welfare of the rich while the poor would benefit from a means-tested program.

394. John Fitzgerald (Bowdoin), Peter Gottschalk and Robert Moffitt (Johns Hopkins), "" (11/97: 146 Kb, Adobe Acrobat format)

Abstract: By 1989 the Michigan Panel Study on Income Dynamics (PSID) had experienced approximately 50 percent sample loss from cumulative attrition from its initial 1968 membership. We study the effect of this attrition on the unconditional distributions of several socioeconomic variables and on the estimates of several sets of regression coefficients. We provide a statistical framework for conducting tests for attrition bias that draws a sharp distinction between selection on unobservables and on observables and that shows that weighted least squares can generate consistent parameter estimates when selection is based on observables, even when they are endogenous. Our empirical analysis shows that attrition is highly selective and is concentrated among lower socioeconomic status individuals. We also show that attrition is concentrated among those with more unstable earnings, marriage, and migration histories. Nevertheless, we find that these variables explain very little of the attrition in the sample, and that the selection that occurs is moderated by regression-to-the-mean effects from selection on transitory components that fade over time. Consequently, despite the large amount of attrition, we find no strong evidence that attrition has seriously distorted the representativeness of the PSID through 1989, and considerable evidence that its cross sectional representativeness has remained roughly intact.

393. Atreya Chakraborty (Brandeis University), Christopher F. Baum, "" (11/97: 51 Kb, Adobe Acrobat format; published, International Journal of Finance, 1998, 10:3, 1120-1138)

Abstract: The rationale for issuing poison pill securities remains unclear, despite the findings of a large body of prior research that these defenses adversely affect shareholder wealth. This paper investigates the hypothesis that the adoption of such defenses may reflect shareholders' desire to contract efficiently with their managers in an environment characterized by hostile takeovers and uncertainty about the managers' true performance. Unlike previous research, we focus on financial characteristics of firms as they relate to the motives for adopting such defenses. Our empirical research does not support the optimal contracting hypothesis. We interpret our results as supportive of the managerial entrenchment hypothesis.

392. Kelly Chaston (Davidson College), Gregory Swinand, Frank Gollop, Richard Arnott, "" (11/97: 50 Kb, Adobe Acrobat format)

391. Richard Arnott, "" (11/97: 40 Kb, Adobe Acrobat format)

390. Richard Arnott, "" (11/97: 60 Kb, Adobe Acrobat format)

389. Richard Arnott, "" (11/97: 38 Kb, Adobe Acrobat format)

Abstract: According to the standard model of urban traffic congestion and urban spatial structure, congestion tolling results in a more concentrated city. In recent years, a new model of rush hour urban auto congestion has been developed which incorporates trip-timing decisions: the bottleneck model. In the simplest bottleneck model, optimal congestion tolling without toll revenue redistribution has no effect on trip price since the efficiency gains exactly equal the toll revenue collected. Optimal congestion tolling then has no effect on urban spatial structure. This paper formalizes this result and extends it somewhat.

388. Richard Arnott, Alex Anas (SUNY-Buffalo), Kenneth Small, "" (11/97: 17 Kb, Adobe Acrobat format)

387. Richard Arnott, "" (11/97: 63 Kb, Adobe Acrobat format)

386. Robert G. Murphy, "" (rev. 03/1999: 60 Kb, Adobe Acrobat format)

Abstract: This paper shows that the debt burden of households, as measured by the debt service to income ratio, is helpful in forecasting the future growth of consumer spending. Not only is the debt-service ratio a statistically significant predictor of future spending growth, it also explains about as much of the variation in spending growth as many other commonly used indicators. And when combined with other economic indicators, the debt-service ratio still provides incremental predictive power. The debt-service ratio predicts future spending growth in part because it helps predict future income growth for borrowing-constrained households, but also because it directly affects spending growth. I argue that this direct effect reflects a tightening of lending standards by financial institutions following a rise in the debt burden of households. This direct effect is important for spending on durable goods and services, but virtually nonexistent for spending on nondurable goods. Because almost 70 percent of spending on nondurable goods represents purchases of food and clothing (which are less discretionary than purchases of durables and services), I conclude that my results are consistent with the view that borrowing-constrained households will limit their discretionary purchases when faced with a tightening of credit.

385. Joseph F. Quinn, "" (10/97: 45 Kb, Adobe Acrobat format. Published in Public Policy and Aging Report, Summer 1997, 10-14)

384. Christopher F. Baum and Clifford F. Thies (Shenandoah University), "" (9/97: 197 Kb, Adobe Acrobat format; published, Journal of Economics and Finance, 1998, 22:2-3, 5-12)

Abstract: This paper reexamines whether the term structure of interest rates, rather than merely a single interest rate, should be included in the demand for money of the interwar era. In contrast to earlier work, we use cointegration techniques to model the equilibrium/error correction process, and find that a sufficiently rich dynamic model using a single interest rate has considerable explanatory power. Nevertheless, we conclude that the inclusion of the term structure may help to explain the turbulent monetary dynamics of the Depression era.

383. James E. Anderson and Douglas Marcouiller, S.J., "" (revised 10/2001: 144 Kb, Adobe Acrobat format)

Abstract: Market exchange is subject to an endogenously determined level of predation which impedes specialization and gains from trade. We construct a model in which utility-maximizing agents opt between careers in production and careers in predation. Three types of equilibria may emerge: autarky (with no predation and no defense), insecure exchange equilibria (with predation and defense), and secure exchange equilibria (in which defense completely deters predation). Trading equilibria, two-thirds of them secure, are supported only in a narrow range of security parameter values. Since changes in the technologies of defense and predation have terms of trade effects, some producers may be hurt by enhanced security. We show cases of 'immiserizing security' in which producers in large poor countries are harmed by increased security.

382. Peter Gottschalk and Susan E. Mayer (Chicago), "" (9/97: 29 Kb, Adobe Acrobat format)

Abstract: Previous studies of trends in inequality have ignored changes in the distribution of home production. This paper asks whether including the value of home production affects the trend in inequality among families. During the 1980s household money income grew at a slow rate but inequality increased. At the same time home production declined somewhat overall, but it declined more for high income households than for low income households. Using Panel Study of Income Dynamics data we develop three methods to adjust household money income for the value of home production. We then compare trends in the level and distribution of these measures of adjusted income to the trends in the level and distribution of money income. Income adjusted for the value of home production is more equally distributed than unadjusted income, but inequality of adjusted income grew during the 1980s. These conclusions are the same regardless of the method for adjusting income.

381. Mehmet Caner (Koc University), Bruce Hansen, "" (8/97: 260 Kb, Adobe Acrobat format)

Abstract: This paper develops an asymptotic theory of inference for a two-regime threshold autoregressive (TAR) model with an autoregressive root which is local-to-unity. We find that the asymptotic null distribution of the Wald test for a threshold is non-standard and mildly dependent on the local-to-unity coefficient. We also study the asymptotic null distribution of the Wald test for an autoregressive unit root, and find that it is non-standard and dependent on the presence of a threshold effect. These tests and distribution theory allow for the joint consideration of non-linearity (thresholds) and non-stationarity (unit roots).

Our limit theory is based on a new set of tools which combines unit root asymptotics with empirical process methods. We work with a particular two-parameter empirical process which converges weakly to a two-parameter Brownian motion. Our limit distributions involve stochastic integrals with respect to this two-parameter process. This theory is entirely new and may find applications in other contexts.

We illustrate the methods with an application to the U.S. monthly unemployment rate. We find strong evidence of a threshold effect. The point estimates suggest that in about 80% of the observations, the regression function is close to a driftless I(1) process, and in the other 20% of the observations, the regression function is mean-reverting with an unconditional mean of 5%. While the conventional ADF test for a unit root is quite insignificant, our TAR unit root test is arguably significant, with an asymptotic p-value of 3.5%, suggesting that the unemployment rate follows a stationary TAR process.
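 
As an illustration of the mechanics only (this is a hypothetical sketch, not the authors' code, and the threshold variable is taken here to be the lagged level purely for concreteness), a sup-Wald statistic for a two-regime threshold model can be computed by a grid search over candidate thresholds; its null distribution is non-standard, so critical values must come from the paper's limit theory or a bootstrap:

import numpy as np

def sup_wald_tar(y, trim=0.15):
    """Grid-search sup-Wald test for a two-regime threshold in a regression
    of dy_t on (1, y_{t-1}), with the regimes defined by a threshold variable."""
    y = np.asarray(y, dtype=float)
    dy = np.diff(y)
    ylag = y[:-1]
    X = np.column_stack([np.ones_like(ylag), ylag])
    z = ylag                                     # illustrative threshold variable
    zs = np.sort(z)
    candidates = zs[int(trim * z.size): int((1.0 - trim) * z.size)]
    b0, *_ = np.linalg.lstsq(X, dy, rcond=None)
    ssr0 = np.sum((dy - X @ b0) ** 2)            # restricted (linear) SSR
    n = dy.size
    best = -np.inf
    for g in candidates:
        lo = (z <= g).astype(float)
        Xg = np.column_stack([X * lo[:, None], X * (1.0 - lo)[:, None]])
        bg, *_ = np.linalg.lstsq(Xg, dy, rcond=None)
        ssr1 = np.sum((dy - Xg @ bg) ** 2)
        best = max(best, n * (ssr0 - ssr1) / ssr1)   # Wald/F-type statistic
    return best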

380. Christopher F. Baum, John T. Barkoulas, and Mustafa Caglayan, "" (rev. 01/99: 170 Kb, Adobe Acrobat format; published, Journal of International Financial Markets, Institutions, and Money, 9, 359-376)

Abstract: This paper considers two potential rationales for the apparent absence of mean reversion in real exchange rates in the post-Bretton Woods era. We allow for (i) fractional integration and (ii) a double mean shift in the real exchange rate process. These methods, applied to CPI-based rates for 17 countries and WPI-based rates for 12 countries, demonstrate that the unit-root hypothesis is robust against both fractional alternatives and structural breaks. This evidence suggests rejection of the doctrine of absolute long-run purchasing power parity during the post-Bretton Woods era.

379. Serena Ng and Timothy J. Vogelsang (Cornell), "" (7/97: 246 Kb, Adobe Acrobat format)

Abstract: This paper considers the implications of omitted mean shifts for estimation and inference in VARs. It is shown that the least squares estimates are inconsistent, and the F test for Granger causality diverges. While model selection rules have the tendency to incorrectly select a lag length that is too high, this over-parameterization can reduce size distortions in tests involving the inconsistent estimates. The practical issue of how to remove the breaks is shown to depend on whether the mean shifts are of the additive or innovational type in a multivariate setting. Under the additive outlier specification, the intercept in each equation of the VAR will be subject to multiple shifts when the break dates of the mean shifts to the univariate series do not coincide. Conversely, under the innovational outlier specification, the unconditional means of the univariate time series are subject to multiple shifts when mean shifts to the innovation processes occur at different dates. Techniques designed to detect multiple shifts are recommended when break dates do not coincide.

378. Richard V. Burkhauser (Syracuse) and Joseph F. Quinn, "" (7/97: 37 Kb, Adobe Acrobat format)

377. John T. Barkoulas (Georgia Southern University), Christopher F. Baum, Mustafa Caglayan (University of Leicester) and Atreya Chakraborty (The Brattle Group), "" (rev. 04/2000: 504 Kb, Adobe Acrobat format; published in Global Financial Markets: Issues and Strategies, D. Ghosh and M. Ariff, eds., Praeger Publishers, 2004)

Abstract: We test for stochastic long-memory behavior in the returns series of currency rates for eighteen industrial countries using a semiparametric fractional estimation method. A sensitivity analysis is also carried out to analyze the temporal stability of the long-memory parameter. Contrary to the findings of some previous studies alluding to the presence of long memory in major currency rates, our evidence provides broad support for the martingale model (and therefore for foreign exchange market efficiency) for our broader sample of foreign currency rates. Any inference of long-range dependence is fragile, especially for the major currency rates. However, long-memory dynamics are found in a small number of secondary (nonmajor) currency rates.
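 
A minimal sketch of the log-periodogram ("spectral") regression estimator of the fractional-differencing parameter d, in the spirit of Geweke and Porter-Hudak; the bandwidth rule, function name, and white-noise example are illustrative assumptions rather than the authors' implementation:

import numpy as np

def gph_estimate(x, power=0.5):
    """Regress log I(lambda_j) on -log(4 sin^2(lambda_j/2)); the slope estimates d."""
    x = np.asarray(x, dtype=float)
    T = x.size
    m = int(np.floor(T ** power))            # number of low-frequency ordinates
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / T                # Fourier frequencies
    dft = np.fft.fft(x - x.mean())
    I = (np.abs(dft[1:m + 1]) ** 2) / (2.0 * np.pi * T)   # periodogram ordinates
    X = np.column_stack([np.ones(m), -np.log(4.0 * np.sin(lam / 2.0) ** 2)])
    beta, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    return beta[1]

# Example with white noise (true d = 0):
rng = np.random.default_rng(0)
print(gph_estimate(rng.standard_normal(1000)))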

376. Angus Deaton (Princeton) and Serena Ng, "" (revised 12/97: 98 Kb, Adobe Acrobat format; forthcoming, Journal of the American Statistical Association)

Abstract: In many public policy problems, we need to estimate the way in which policy changes affect people's behavior. In the analysis of tax and subsidy reform, which is the topic of this paper, we need to know how tax-induced price changes affect the amounts that people buy of the taxed goods. We present various economic and statistical approaches to obtaining the estimates that are required. We consider the structural methods that are standard in economics, where the behavior and welfare of individual agents are simultaneously captured by the specification of utility functions whose parameters are to be estimated. We argue that these methods are less useful than alternatives that directly consider the derivatives of the regression function of average behavior. We consider both parametric and nonparametric estimators of these derivatives in the context of price reform for foods in Pakistan, focussing on the advantages and disadvantages of "average derivative estimation" (ADE) as proposed by Hardle and Stoker (1989) and Stoker (1991). Average derivative estimation is attractive in principle because it directly estimates the statistics that are required for policy analysis. In the practical case considered here, neither technique is a clear winner; each has strengths and weaknesses.

375. Christopher F. Baum and Meral Karasulu (Bogazici University), "" (6/97: 456 Kb, Adobe Acrobat format)

Abstract: This paper builds upon the analysis of Orphanides and Wilcox (1996) to evaluate optimal anti-inflation policy under a broader set of circumstances than considered in their work. We consider a monetary authority with two instruments--the funds rate and the discount rate--with the distinction that only movements of the latter are 'credible' alterations of the Fed's policy stance, reflecting reputational effects. The public forms expectations of inflation given realized inflation and the expected progress toward lower inflation, as evidenced by credible policy moves. Optimal policy is formulated in a stochastic, dynamic setting of the Tinbergen-Theil framework. The presence of a "cost-of-change" penalty on the sequence of discount rate adjustments generates expected trajectories for targets and policy instruments which differ considerably from those lacking such a penalty.

374. Serena Ng and Francisco Ruge-Murcia (Universite de Montreal), "" (revised 12/97: 198 Kb, Adobe Acrobat format)

Abstract: This paper extends the Competitive Storage Model by incorporating prominent features of the production process and financial markets. A major limitation of this basic model is that it cannot successfully explain the degree of serial correlation observed in actual data. The proposed extensions build on the observation that in order to generate a high degree of price persistence, a model must incorporate features such that agents are willing to hold stocks more often than predicted by the basic model. We therefore allow unique characteristics of the production and trading mechanisms to provide the required incentives. Specifically, the proposed models introduce (i) gestation lags in production with heteroskedastic supply shocks, (ii) multiperiod forward contracts, and (iii) a convenience return to inventory holding. The rational expectations solutions for twelve commodities are numerically solved. Simulations are then employed to assess the effects of the above extensions on the time series properties of commodity prices. Results indicate that each of the features above partially accounts for the persistence and occasional spikes observed in actual data. Evidence is presented that the precautionary demand for stocks might play a substantial role in the dynamics of commodity prices.

373. Alexander Michaelides (Princeton) and Serena Ng, "" (rev. 04/1999: 433 Kb, Adobe Acrobat format)

Abstract: The non-negativity constraint on inventories imposed on the rational expectations theory of speculative storage implies that the conditional mean and variance of commodity prices are nonlinear in lagged prices and have a kink at a threshold point. In this paper, the structural parameters of this model are estimated using three simulation-based estimators. The finite sample properties of the Simulated Method of Moments estimator of Duffie and Singleton (1993), the Indirect Inference estimator of Gourieroux, Monfort and Renault (1993), and the matching score estimator of Gallant and Tauchen (1996) are assessed. Exploiting the invariant distribution implied by the theory allows us to assess the error induced by simulations. Our results show that while all three estimators produce reasonably good estimates with properties that stack up well with those of the PMLE, there are tradeoffs among the three estimators in terms of bias, efficiency, and computational demands. Some estimators are more sensitive to the sample size and the number of simulations than others. A careful choice of the moments/auxiliary models can lead to a substantial reduction in bias and an improvement in efficiency. Increasing the number of simulated data points can sometimes reduce the bias and improve the efficiency of the estimates when the sample size is small.

372. Basma Bekdache (Wayne State University) and Christopher F. Baum, "" (rev. 8/97: 570 Kb, Adobe Acrobat format)

Abstract: This paper compares six term structure estimation methods in terms of actual ex ante price and yield prediction accuracy. Specifically, we examine the models' ability to price Treasuries for one to five trading days ahead. The models' performance differs markedly between in- and out-of-sample predictions. Their relative success also depends on time, the forecast horizon, and whether price or yield errors are compared. We examine the degree of loss in accuracy the modeler incurs by not using the best method: in particular, we compare the more complex splining methods and the parsimonious Nelson-Siegel model.
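 
For reference, the parsimonious Nelson-Siegel model mentioned above fits the yield at maturity \tau with a three-factor exponential form,

y(\tau) = \beta_0 + \beta_1 \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda} + \beta_2 \left( \frac{1 - e^{-\tau/\lambda}}{\tau/\lambda} - e^{-\tau/\lambda} \right),

where \beta_0, \beta_1, and \beta_2 play the roles of level, slope, and curvature factors and \lambda governs the exponential decay, in contrast to spline methods that fit the discount function piecewise with many more parameters.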

371. Joseph F. Quinn and Timothy Smeeding (Syracuse), "" (6/97: 115 Kb, Adobe Acrobat format)

Abstract: In this paper, we compare the patterns of labor force withdrawal of older workers in seven OECD countries. We find wide variation in retirement patterns and recent trends. In Sweden and the United States, for example, it is relatively common for workers to combine retirement benefits and earnings (i.e., to continue working while "retired"), while in Australia, the Netherlands, the United Kingdom and (West) Germany, this is much less common. These countries have also had very different experiences in arresting the postwar trend toward earlier retirement. After significant declines prior to 1985, the labor force participation rates of older men in Australia, West Germany and the United States have been almost unchanged, while the pre-1985 trends have continued unabated in Sweden and Canada. The challenge to researchers is to correlate the diverse public policies in these countries with the very different labor market experiences observed.

370. Robin L. Lumsdaine (Princeton) and Serena Ng, "" (revised 8/98: 263 Kb, Adobe Acrobat format).

Abstract: Ever since the development of the Autoregressive Conditional Heteroskedasticity (ARCH) model (Engle [1982]), testing for the presence of ARCH has become a routine diagnostic. One popular method of testing for ARCH is T times the R^2 from a regression of squared residuals on p of its lags. This test has been shown to have a Lagrange multiplier interpretation and is asymptotically distributed as a Chi^2(p) random variable. Underlying this test is the assumption of a correctly specified conditional mean. In this paper, we consider the properties of the ARCH test when there is a possibly misspecified conditional mean. Examples of misspecification include omitted variables, structural change, and parameter instability. We show that misspecification will lead to overrejection of the null of conditional homoskedasticity. We demonstrate the use of recursive residuals to improve the fit of a first stage conditional mean regression. We illustrate these results via Monte Carlo simulation and consider two empirical examples.
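 
The T times R^2 statistic described above is straightforward to compute; a minimal sketch (hypothetical names, assuming the first-stage conditional mean residuals are already in hand) is:

import numpy as np
from scipy import stats

def arch_lm_test(resid, p):
    """T*R^2 ARCH LM test: regress squared residuals on p of their own lags."""
    e2 = np.asarray(resid, dtype=float) ** 2
    y = e2[p:]
    T = y.size
    X = np.column_stack([np.ones(T)] + [e2[p - k:-k] for k in range(1, p + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    r2 = 1.0 - u.var() / y.var()             # R^2 of the auxiliary regression
    stat = T * r2
    return stat, stats.chi2.sf(stat, df=p)   # statistic and asymptotic p-value

The paper's point is that this test over-rejects when the conditional mean generating resid is misspecified, so the p-value is only as trustworthy as the first-stage regression.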

369. Serena Ng and Pierre Perron (Boston University), "" (rev. 09/2000: 370 Kb, Adobe Acrobat format).

Abstract: It is widely known that when there are negative moving average errors, a high order augmented autoregression is necessary for unit root tests to have good size, but that information criteria such as the AIC and BIC tend to select a truncation lag that is very small. Furthermore, size distortions increase with the number of deterministic terms in the regression. We trace these problems to the fact that information criteria omit important biases induced by a low order augmented autoregression. We consider a class of Modified Information Criteria (MIC) which account for the fact that the bias in the sum of the autoregressive coefficients is highly dependent on the lag order k. Using a local asymptotic framework in which the root of an MA(1) process is local to -1, we show that the MIC allows for added dependence between k and the number of deterministic terms in the regression. Most importantly, the k selected by the recommended MAIC is such that both its level and rate of increase with the sample size are desirable for unit root tests in the local asymptotic framework, whereas the AIC, MBIC and especially the BIC are less attractive in at least one dimension. In Monte Carlo experiments, the MAIC is found to yield huge size improvements for the DF(GLS) and the feasible point optimal P(t) test developed in Elliott, Rothenberg and Stock (1996). We also extend the M tests developed in Perron and Ng (1996) to allow for GLS detrending of the data. The M(GLS) tests are shown to have power functions that lie very close to the power envelope. In addition, we recommend using GLS detrended data to estimate the required autoregressive spectral density at frequency zero. This provides more efficient estimates on the one hand, and ensures that the estimate of the spectral density is invariant to the parameters of the deterministic trend function, a property not respected by the estimation procedure currently employed by several studies. The MAIC along with GLS detrended data yield a set of Mbar(GLS) tests with desirable size and power properties.
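 
For orientation, the modified criteria take roughly the following form (notation paraphrased from the Ng-Perron line of work; consult the paper for exact definitions): for an augmented autoregression of order k fitted to the (possibly GLS-detrended) series \tilde{y}_t,

MAIC(k) = \ln(\hat{\sigma}_k^{2}) + \frac{2\left(\tau_T(k) + k\right)}{T - k_{\max}}, \qquad \tau_T(k) = \frac{\hat{\beta}_0^{2}}{\hat{\sigma}_k^{2}} \sum_{t=k_{\max}+1}^{T} \tilde{y}_{t-1}^{2},

where \hat{\beta}_0 is the estimated coefficient on \tilde{y}_{t-1} and \hat{\sigma}_k^{2} the residual variance; the data-dependent penalty \tau_T(k) is what distinguishes the MIC family from the AIC and BIC.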

368. Serena Ng, "" (6/97: 160 Kb, Adobe Acrobat format)

Abstract: Flexible functional forms of indirect utility and expenditure functions are frequently used in approximating the behavior of utility maximizing consumers to arrive at demand systems that can be easily estimated. A common finding in time series estimations of the Almost Ideal Demand System is strong persistence in the estimated residuals. This paper suggests two explanations for this result. First, the functions used to approximate total expenditure do not allow for the possibility of economic growth. Hence when the data on expenditure have trends, the inadequacy of the approximation results in residuals that are serially correlated. Second, when the economy grows and/or prices trend at different rates, Stone's price index provides a poor approximation to the theoretically appropriate price variable. The consequence is also reflected in the error term. Simulations are used to illustrate these arguments and cointegration is proposed as a guide to model specification.
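 
For context, the budget-share equations of the Almost Ideal Demand System and the Stone index used to linearize them take the standard textbook form (not necessarily the paper's notation)

w_i = \alpha_i + \sum_j \gamma_{ij} \ln p_j + \beta_i \ln\left(\frac{x}{P}\right), \qquad \ln P \approx \ln P^{*} = \sum_k w_k \ln p_k,

where w_i is the budget share of good i, the p_j are prices, and x is total expenditure; the abstract's argument is that when x and the p_j trend over time, the expenditure approximation and the Stone index P^{*} deteriorate, leaving serially correlated residuals.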

367. Joseph F. Quinn, "" (6/97: 88 Kb, Adobe Acrobat format; published in Prospects for Social Security Reform, O. Mitchell et al., eds., University of Pennsylvania Press)

Abstract: Because currently anticipated Social Security revenues are inadequate to pay for promised benefits, reform of the Social Security program is likely. Several different plans to restore the program to actuarial balance have emerged, and more will arise as the debate continues. Comparison and evaluation of these alternatives require a framework for analysis and criteria along which to judge their strengths and weaknesses. After discussing the goals of the Social Security system, we identify three main criteria (income adequacy, individual equity and economic growth), and propose several other issues for consideration.

366. Peter Gottschalk and Mary Joyce (U.S. Bureau of Labor Statistics), "" (5/97: 495 Kb, Adobe Acrobat format)

Abstract: This paper uses data from the Luxembourg Income Study to explore the role of differences in supply shifts in explaining cross-national differences in the rise in earnings inequality. Changes in returns to age and education are estimated for eight countries using a common specification of earnings functions across years and countries. We find that the small overall increase in earnings inequality in many countries reflects large but offsetting changes in returns to skill and changes in inequality within age-education cells. Furthermore, these differences in returns to skill can largely be explained by differences in supply shifts.

365. Bruce E. Hansen, "" (revised 2/98: 273 Kb, Adobe Acrobat format.)

Abstract: Threshold regression methods are developed for non-dynamic panels with individual-specific fixed effects. Least squares estimation of the threshold and regression slopes is proposed using fixed-effects transformations. A non-standard asymptotic theory of inference is developed which allows construction of confidence intervals and testing of hypotheses. The methods are applied to a 15-year sample of 565 U.S. firms to test whether financial constraints affect investment decisions.

364. David A. Belsley, "" (4/97: 155 Kb, Adobe Acrobat format)

Abstract: A case is made that today's graduate students in economics must master early on a computational environment suitable for their research needs. The virtues of Mathematica for this purpose are highlighted. Descriptions of its elements that are particularly suited for this need are given along with examples of the kind of economic research successfully accomplished in the environment.

363. Christopher F. Baum and Meral Karasulu (Bogazici University), "" (4/97: 105 Kb, Adobe Acrobat format; forthcoming in Focus on Fiscal Policy, Nova Science Publishers)

Abstract: In the United States, eradication of persistent federal deficits has won broad bipartisan support. At the same time, political pressures have mounted to strengthen the Federal Reserve's explicit concern with price stability. Proposals under consideration would require a much narrower focus on the part of Fed policymakers, and could be interpreted as targeting the price level rather than a negligible rate of inflation. The deficit-reduction and price-stability policies should be analysed in combination, as reductions in the real interest rate triggered by lower deficits will have an impact on optimal monetary policy with anti-inflation and stabilization objectives. This paper builds upon the analysis of Orphanides and Wilcox (1996) to evaluate optimal anti-inflation policy under a broader set of circumstances than considered in their work. We consider a monetary authority with two instruments--the funds rate (or rate of base money growth) and the discount rate--with the distinction that only movements of the latter are 'credible' alterations of the Fed's policy stance, reflecting reputational effects. The public forms expectations of inflation given realized inflation and the expected progress toward lower inflation, as evidenced by credible policy moves and the gradual eradication of the fiscal deficit. The interaction between deficit reduction policy and the optimal monetary trajectory is analysed, and the implications for the coordination of these strategies considered via stochastic simulations of the model. The impacts of a price level stabilization target on the Fed and a balanced-budget rule on the fiscal authorities are contrasted with their more flexible counterparts: an inflation target and restriction on deficit spending. Our results indicate that these more stringent political constraints on economic policy could have severe consequences on the ability of the monetary and fiscal authorities to mitigate adverse economic shocks.

362. Joseph F. Quinn, "" (3/97: 50 Kb, Adobe Acrobat format; published in Journal of the American Society of CLU & ChFC, 1997, 44-53)

Abstract: Although officially off the table during the recent Presidential campaign, Social Security finances are very much in the news these days. Any discussion of Federal budget balance raises the topic, because Social Security expenditures are the largest item in the Federal budget. This paper discusses the size and timing of the Social Security funding problem, and asks whether there is a Social Security crisis and whether Social Security can be rescued without radical reform. The options proposed by the President's Social Security Advisory Board are then reviewed, followed by a discussion of some pros and cons of partial privatization, the most controversial component of the reform plans.

361. John Barkoulas and Christopher F. Baum, "" (2/97: 199 Kb, Adobe Acrobat format; published in Financial Engineering and the Japanese Markets, 1997, 4:189-201)

Abstract: We test for long memory in 3- and 6-month daily returns series on Eurocurrency deposits denominated in Japanese yen (Euroyen). The fractional differencing parameter is estimated using the spectral regression method. The conflicting evidence obtained from the application of tests against a unit root as well as tests against stationarity provides the motivation for testing for fractional roots. Significant evidence of positive long-range dependence is found in the Euroyen returns series. The estimated fractional models result in dramatic out-of-sample forecasting improvements over longer horizons compared to benchmark linear models, thus providing strong evidence against the martingale model.

360. James E. Anderson, "" (1/97: 220 Kb, Adobe Acrobat format; published, Journal of Development Economics, 1998, 56:393-410)

Abstract: There is widespread concern about the effect of the Uruguay Round policy changes on world agricultural prices and consequently upon the welfare of developing countries. Assessing welfare changes with the standard terms of trade effect calculation can be misleading for distorted economies, since the distortion effect operates in addition or in opposition to the terms of trade effect. This study reveals distortion effects which are many times larger than terms of trade effects in a study of the Uruguay Round's impact on 9 agricultural economies. In 3 of 9 cases, the distortion effect reverses the impact of the terms of trade effect. In 2 other cases the distortion effect raises a trivial terms of trade effect up to around 1% of national income.

359. Joe Peek and Eric S. Rosengren (Federal Reserve Bank of Boston), "" (12/96: 66 Kb, Adobe Acrobat format)

Abstract: A key provision of the Federal Deposit Insurance Corporation Improvement Act of 1991 was prompt corrective action (PCA). PCA emphasized early intervention by bank supervisors and was intended to limit forbearance by making supervisory intervention more timely and less discretionary. However, PCA, as implemented, appears to have been oversold. Had PCA been in place during the recent banking crisis in New England, it would have had little, if any, effect. Since it imposes an essentially nonbinding constraint on bank supervisors, PCA is not likely to play a major role in preventing the next banking crisis.

358. Joe Peek and Eric S. Rosengren (Federal Reserve Bank of Boston), "" (11/96: 102 Kb, Adobe Acrobat format)

Abstract: Derivatives have become an essential instrument for hedging risks, yet moral hazard can lead to their misuse by problem banks. Given that the absence of comprehensive data on bank derivatives activities prevents an accurate assessment of bank risk-taking, banks have an opportunity to take unmonitored second bets. Thus, troubled banks have the motive to increase risk, and derivatives provide the means to do so. The role of bank supervisors should be to limit the opportunity through more comprehensive data reporting requirements and closer supervisory scrutiny of derivatives activity at problem banks. Because a relatively large number of banks active in the derivatives market have low capital ratios and are considered institutions with a significant risk of failure by bank supervisors, the possible misuse of derivatives by troubled banks should be of concern to regulators. However, we find no evidence that the volume of derivatives activity at troubled banks affects the probability of formal regulatory intervention or even a downgrade in supervisory rating.

357. Joe Peek and Eric S. Rosengren (Federal Reserve Bank of Boston), "" (9/96: 91 Kb, Adobe Acrobat format; published in American Economic Review, 87:4, 495-505).

Abstract: One of the more dramatic financial events of the late 1980s and early 1990s was the surge in Japanese stock prices that was immediately followed by a very sharp decline of more than 50 percent. While the unprecedented fluctuations in Japanese stock prices were domestic financial shocks, the unique institutional characteristics of the Japanese economy produce a framework that is particularly suited to the transmission of such shocks to other countries through the behavior of the Japanese banking system. The large size of Japanese bank lending operations in the United States enables us to use U.S. banking data to investigate the extent to which this domestic Japanese financial shock was transmitted to the United States, as well as to identify a supply shock to U.S. bank lending that is independent of U.S. loan demand. We find that binding risk-based capital requirements associated with the decline in the Japanese stock market resulted in a decline in commercial lending by Japanese banks in the United States that was both economically and statistically significant. This finding has added importance given the severe real estate loan problems currently faced by Japanese banks. How Japanese bank regulators decide to resolve these problems will have significant implications for credit availability in the United States as well as in other countries with a significant Japanese bank presence.

356. John T. Barkoulas, Christopher F. Baum and Nickolaos Travlos, "" (12/96: 161 Kb, Adobe Acrobat format; published in Applied Financial Economics, 2000, 10:2, 177-184)

Abstract: We test for stochastic long memory in the Greek stock market, an emerging capital market. The fractional differencing parameter is estimated using the spectral regression method. Contrary to findings for major capital markets, significant and robust evidence of positive long-term persistence is found in the Greek stock market. As compared to benchmark linear models, the estimated fractional models provide improved out-of-sample forecasting accuracy for the Greek stock returns series over longer forecasting horizons.

355. Richard Arnott, André de Palma (Université de Cergy-Pontoise), Robin Lindsey (University of Alberta), "Information and Time-of-Usage Decisions in the Bottleneck Model with Stochastic Capacity and Demand" (10/96)

354. Richard Arnott, Marvin Kraus, "When are Anonymous Congestion Charges Consistent with Marginal Cost Pricing?" (Rev. 3/97; published 1998, Journal of Public Economics)

Abstract: There are constraints on pricing congestible facilities. First, if heterogeneous users are observationally indistinguishable, then congestion charges must be anonymous. Second, the time variation of congestion charges may be constrained. Do these constraints undermine the feasibility of marginal cost pricing, and hence the applicability of the first-best theory of congestible facilities? We show that if heterogeneous users behave identically when using the congestible facility and if the time variation of congestion charges is unconstrained, then marginal cost pricing is feasible with anonymous congestion charges. If, however, the time variation of congestion charges is constrained, optimal pricing with anonymous congestion charges entails Ramsey pricing.

353. Richard Arnott, Chong-en Bai, Brian Sack (Massachusetts Institute of Technology), "" (5/96: Adobe Acrobat format, 96 Kb)

Abstract: Arnott and Stiglitz (1993) have argued that, in competitive insurance markets with moral hazard, equilibrium may entail firms offering latent policies--policies that are not bought in equilibrium but are kept in place to deter entry. This paper provides an extended example of such an equilibrium, which not only proves that latent policies can be present in equilibrium but also elucidates the mechanism which makes them potentially effective in deterring entry.

351. Atreya Chakraborty (Charles River Associates), Richard Arnott, "" (Rev. 10/2000: 1669 Kb, Adobe Acrobat format)

Abstract: This paper highlights the role of takeover defenses in the acquisition process. If managerial defensive effort is fixed, the unregulated level of takeover activity is lower than socially desirable since shareholders regard the financial incentives given to raiders to stimulate takeover activity as a cost, while society views them as a transfer. We show that this result no longer holds if defensive effort is variable -- the unregulated market for corporate control will generate excessive takeovers. One implication of our analysis is that in the presence of substantial anti-takeover related expenditures the gains from takeover will be overestimated. These gains include the benefits from dismantling defenses which were installed because of the takeover threat.

350. Richard Arnott, John Rowse (University of Calgary), "" (rev. 12/95: Adobe Acrobat format, 215 Kb)

349. John T. Barkoulas, Christopher F. Baum and Gurkan S. Oguz (Tufts), "" (10/96: 173 Kb, Adobe Acrobat format; published in Applied Economics Letters, 1998, 5:135-138)

Abstract: Using spectral regression and exact maximum likelihood methods, we test for long memory dynamics in the traded goods prices for the G7 countries, as measured in their import and export price indices. Significant and robust evidence of fractional dynamics with long memory features is found in both import and export price inflation rates.

348. James E. Anderson, "" (10/96: 176 Kb, Adobe Acrobat format; published in Trade Policy and the Pacific Rim, J. Piggott and A. Woodland, eds., Macmillan/International Economic Association, 1999)

Abstract: The theory of trade reform typically is based on a passive government budget constraint, in which changes in tariff revenue are costlessly offset by lump sum transfers. This paper offers a general framework for trade reform when the government budget constraint is active, such that tariff revenue cuts must be offset by public good decreases or other tax increases. The trade reform and public finance literatures are integrated to develop some useful and simple new expressions characterizing welfare-improving trade reform. The expressions are operational with Computable General Equilibrium models. The theoretical analysis and an application to Korean data in 1963 cast doubt on the desirability of tariff cuts in convex competitive economies with active government budget constraints.

347. Chong-en Bai and Yijang Wang (Minnesota), "" (9/96: 140 Kb, Adobe Acrobat format)

Abstract: We construct an agency model in which the planner (agent) makes project starting and termination decisions on behalf of the state (principal) to reflect the practice of socialist economies. The model shows that asymmetric information between the state and the planner regarding the quality of projects started leads to the persistence of unprofitable projects in most cases. Since in the model it is assumed that the state's objective is to maximize economic profit and the state has full power to dictate and enforce the optimal contract, the finding of the model has the implication that hardening budget constraints in socialist economies is difficult even under an "ideal" setting when these economies are free of social considerations and political frictions.

346. Chris Canavan and Mariano Tommasi (UCLA and Universidad de San Andres), "" (6/96: 224 Kb, Adobe Acrobat format)

Abstract: We investigate the interplay between government credibility and the visibility of policy-making, using the choice of a nominal anchor as an important example of how governments control visibility. We show that visibility has an important influence on how governments acquire credibility, and for this reason is a variable that governments use strategically. Policy-makers with stronger commitment to reform opt for more visible policies (e.g., an exchange-rate anchor) whereas policy-makers who cannot carry through with serious reform opt for noisier signals (e.g., a money anchor). Our logic is that greater visibility makes it easier for the public to learn the government's preferences, and only policy-makers committed to reform want this to happen. Among other things, our analysis provides a rationale for the prevalence of temporary exchange-rate targets in inflation-stabilization programs.

345. Chong-en Bai and Zhigang Tao (Hong Kong University of Science and Technology), "" (8/96: 734 Kb, Adobe Acrobat format)

Abstract: This paper analyzes a model with many homogeneous agents, whose effort can be allocated to two tasks. One task produces a public good that is an important input for the production of the final output. The other task only affects the agent's own output. We show that, when the public input and the private input are complementary, the principal should offer a fixed-wage contract to some agents and a revenue-sharing contract to the remaining agents. Furthermore, we show that, when the ex ante contracts are subject to ex post renegotiation, agents with the fixed-wage contract should not own any asset, whereas agents with the revenue-sharing contract should own the physical asset in which the private input is embedded. Meanwhile, the principal should retain residual rights of control over the public good. This paper offers an explanation of the co-existence of company-owned units and franchised units in a franchise company. It adopts and extends important features from both the multi-task theory of the firm and the incomplete-contract theory of the firm.

344. Chongen Bai, David D. Li (University of Michigan) and Yijiang Wang (University of Minnesota), "" (4/96: 492 Kb, Adobe Acrobat format)

Abstract: A large literature has documented impressive productivity growth in China's state enterprises during the reform. The evidence has been used to support the view that China's enterprise reform has been successful. We cast doubt on this view by arguing that productivity is not a reliable measure of state enterprise performance. A model is used to show that when firms are not profit maximizers, higher productivity may actually lead to greater allocative distortion, lower profits and lower economic efficiency. There is evidence this may be the case for many Chinese state enterprises during the reform.

343. H. Lorne Carmichael (Queen's University) and W. Bentley MacLeod, "" (8/96: 363 Kb, Adobe Acrobat format)

Abstract: We examine an evolutionary model of preferences in a society where resources are finite. Agents who develop better strategies for bargaining and trading will grow to dominate the population. We show that successful agents will have preferences that exhibit the "endowment effect". The social institution of private property emerges spontaneously. Agents' decisions will be subject to "framing" effects, and we are able to make some predictions as to the frames that will be salient in given situations. The model makes a clear distinction between individual welfare and revealed preferences. Nonetheless, it may still be possible to recover information about individual welfare from behavioral data.

342. W. Bentley MacLeod, "" (6/96: 396 Kb, Adobe Acrobat format)

Abstract: This paper introduces a model of contract incompleteness and bounded rationality based on the multi-tasking model of Holmstrom and Milgrom (1991). It is shown that the trade-off between the use of an employment relationship versus an explicit state-contingent contract depends on the number of tasks or the complexity of the services provided by the individual.

341. Donald Cox, Bruce E. Hansen, and Emmanuel Jimenez (World Bank), "" (rev. 12/1999: 204 Kb, Adobe Acrobat format).

Abstract: In recent years there has been rapidly growing interest in the implications of altruistic preferences for economic behavior. Undoubtedly most of this interest is fueled by altruism's often pivotal role in economic models and policy issues. Yet there is also an emerging consensus that empirical evidence for altruistic preferences--as specified in the seminal models of Becker and Barro--is lacking, at least for the United States. The failure to find strong evidence for altruism flies in the face of what seems to be an eminently commonsensical proposition about behavior.

A possible reason for the lack of evidence for altruism in a developed country like the United States is that its substantial public transfers may have already crowded out private ones to a large extent, rendering the remaining small samples uninformative about altruism. In this paper we focus on a country with extremely limited public income redistribution, the Philippines. We examine a model that nests the Becker-Barro model of altruism and predicts that the relationship between private transfers and pre-private-transfer income will be non-linear, taking the form of a spline. We estimate this model by non-linear least squares, treating the threshold (knot point) as an unknown parameter, using recently developed econometric techniques. This allows a rigorous econometric test of the altruism hypothesis. We find that private transfers are widespread, highly responsive to household economic status and conform to patterns implied by altruistic utility interdependence. In particular, among the poorest households, we estimate that decreases in pre-private-transfer income would prompt large increases in private transfers.

Our findings have significant policy implications, because they imply that attempts to improve the status of the poor could be thwarted by private responses. Some of the gains from public transfers would be shared with richer households whose burden of support for their less fortunate kin is eased. So the problems that altruistic preferences create for public income redistribution, first pointed out by Becker and Barro over 20 years ago, do indeed matter empirically.
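 
A minimal sketch of the estimation idea (hypothetical variable names; a piecewise-linear spline in pre-transfer income whose single knot is chosen by concentrated least squares, rather than the exact procedure and inference methods used in the paper):

import numpy as np

def fit_spline_knot(income, transfers, n_grid=200, trim=0.05):
    """Piecewise-linear spline in income; the knot is chosen to minimize the SSR."""
    income = np.asarray(income, dtype=float)
    transfers = np.asarray(transfers, dtype=float)
    lo, hi = np.quantile(income, [trim, 1.0 - trim])
    best_ssr, best_knot, best_coef = np.inf, None, None
    for c in np.linspace(lo, hi, n_grid):
        X = np.column_stack([np.ones_like(income), income,
                             np.maximum(income - c, 0.0)])
        b, *_ = np.linalg.lstsq(X, transfers, rcond=None)
        ssr = np.sum((transfers - X @ b) ** 2)
        if ssr < best_ssr:
            best_ssr, best_knot, best_coef = ssr, c, b
    return best_knot, best_coef

Under altruism the slope below the knot should be strongly negative while the slope above it is much flatter, which is consistent with the pattern the abstract reports for the poorest households.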

340. Thomas Lemieux (Université de Montréal) and W. Bentley MacLeod, "" (3/96: 265 Kb, Adobe Acrobat format).

Abstract: This paper presents results from a 1971 natural experiment carried out by the Canadian government on the unemployment insurance system. At that time they dramatically increased the generosity of the system. We find that the propensity to collect UI increases with a first time exposure to the system. Hence as more individuals experience unemployment, their lifetime use of the system increases. This supply side hysteresis effect may explain why unemployment has steadily increased over the 1972-1992 period, even though the generosity of unemployment insurance did not.

339. W. Bentley MacLeod and James Malcomson (University of Southampton), "" (3/96: 831 Kb, Adobe Acrobat format)

Abstract: In standard shirking models of efficiency wages, workers are motivated only by high wages. Yet 23% of young US workers report receiving some form of performance pay. This paper extends the efficiency wage framework using the theory of self-enforcing agreements to allow for performance pay in the form of bonuses. The result is a simple model of wage formation that helps explain a number of apparently unrelated phenomena in labor markets. First, in efficient markets performance pay is preferred to an efficiency wage when the cost of having a job vacant is low and qualified workers are in short supply. Second, more capital intensive industries offer higher pay than less capital intensive industries, as observed in studies of inter-industry wage differentials. Third, sustaining an efficient outcome requires a social convention similar to the notion of a fair wage, although the outcome itself is determined by fundamentals and not by exogenously imposed notions of what is fair. Finally, a two-sector version of the model makes some predictions about the relationships between turnover and wages and between wages, growth and unemployment.

338. H. Lorne Carmichael (Queen's University) and W. Bentley MacLeod, "" (1/96: 317 Kb, Adobe Acrobat format)

Abstract: Gift giving is a practice common to many societies. In an evolutionary model the social custom of giving gifts at the beginning of a relationship can lead to trust and cooperation. The evolutionary approach makes predictions about the character of the goods that can be used as gifts. For example, gift goods may have little use value even at low levels of consumption. Although the gifts themselves are useless, the institution is not.

337. Oivind Anti Nilsen (Norwegian Research Centre in Organization and Management) and Fabio Schiantarelli, "" (rev.11/2000: 276 Kb, Adobe Acrobat format)

Abstract: The objective of this paper is to identify and discuss the main stylized facts about the type and degree of non-smoothness of capital adjustment. Using Norwegian micro data, we investigate the frequency of periods of zero investment as well as the lumpiness of investment both at the plant and firm level, and at different levels of aggregation across capital goods. We also discuss how the importance of zero investment episodes and lumpiness varies between small and large plants or firms. Finally, we estimate a discrete hazard model, controlling for unobserved heterogeneity, to determine the probability of having an episode of high investment, conditional on the length of the interval from the last high investment episode, and discuss the implications of the empirical evidence for the shape of the adjustment cost function.

336. W. Bentley MacLeod, "" (6/96: 373 Kb, Adobe Acrobat format)

Abstract: This essay illustrates that if Savage's small world assumption is relaxed, one can construct a theory of bounded rationality that incorporates some of the insights from recent work in cognitive psychology. The theory can be used to explain why contracts are incomplete and the existence of endowment effects in exchange.

335. Christopher F. Baum and Meral Karasulu (Bogazici University), "Modelling Federal Reserve Discount Policy" (rev. 10/96: 93 Kb, Adobe Acrobat format; published, Computational Economics, 11:53-70, 1998)

Abstract: We employ threshold cointegration methodology to model the policy problem solved by the Federal Reserve System in their manipulation of the discount rate under a reserves target operating procedure utilized since October 1979. The infrequent and discrete adjustments that characterize movements in the discount rate instrument vis-a-vis the Federal Funds rate do not lend themselves to a linear cointegration framework. The inherently nonlinear relationship arising from the Fed's self-imposed constraints on discontinuously changing the discount rate is satisfactorily modelled as an instance of threshold cointegration between the discount rate and the Federal Funds rate.

334. John Barkoulas and Christopher F. Baum, "" (rev. 7/97: 89 Kb, Adobe Acrobat format; published in Pacific-Basin Finance Journal, 6:1-2, 115-124)

Abstract: Using the spectral regression and Gaussian semiparametric methods of estimating the long-memory parameter, we test for fractional dynamic behavior in a number of important Japanese financial time series: spot exchange rates, forward exchange rates, stock prices, currency forward premia, Euroyen deposit rates, and the Euroyen term premium. Stochastic long memory is established as a feature of the currency forward premia, Euroyen deposit rates, and Euroyen term premium series. The martingale model cannot be rejected for the spot, forward, and stock price series.

333. Christopher F. Baum, John Barkoulas and Mustafa Caglayan, "" (rev. 04/98: 149 Kb, Adobe Acrobat format; published in Southern Economic Journal, 65:4 (1999), 900-913)

Abstract: We test for fractional dynamics in CPI-based inflation rates for twenty-seven countries and WPI-based inflation rates for twenty-two countries. The fractional differencing parameter is estimated using semiparametric and approximate maximum likelihood methods. Significant evidence of fractional dynamics with long-memory features is found in both CPI- and WPI-based inflation rates for industrial as well as developing countries. Implications of the findings are considered and sources of long memory are hypothesized.

332. Christopher F. Baum and Clifford F. Thies (Shenandoah University), "" (revised 8/97: 182 Kb, Adobe Acrobat format; published in Review of Quantitative Finance and Accounting)

Abstract: The effects of measurement and specification error on estimates of the Q and cash flow model of investment are investigated. Two sources of error are considered: expensing of R&D expenditures and failing to identify that component of cash flow which relaxes financing constraints. We apply random-effects and instrumental variables estimators to a model that addresses these sources of error. We find that: (1) the capitalization of R&D strengthens the explanatory power of the model; (2) expected and unexpected components of cash flow have different effects; and (3) the effects of Q are much more evident in firms facing low costs of external finance.

331. David A. Belsley, "" (5/96: 256 Kb, Adobe Acrobat format; published in Computational Economics, 10, 197-229)

Abstract: Monte Carlo experiments establish that the usual "t-statistic" used for testing for first-order serial correlation with artificial regressions is far from being distributed as a Student's t in small samples. Rather, it is badly biased in both mean and variance and results in grossly misleading tests of hypotheses when treated as a Student's t. Simply computed corrections for the mean and variance are derived, however, which are shown to lead to a transformed statistic producing acceptable tests. The test procedure is detailed and exemplar code provided.
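
Illustration: a numpy sketch of the uncorrected t-statistic from the artificial regression for first-order serial correlation that the abstract refers to (OLS residuals regressed on the original regressors and the lagged residual). The mean and variance corrections derived in the paper are not reproduced here, and the simulated data are assumptions.

    import numpy as np

    def artificial_regression_t(y, X):
        # t-statistic on the lagged OLS residual in the artificial regression
        # e_t on (X_t, e_{t-1}).  Belsley's small-sample mean/variance
        # corrections are NOT applied; this is the raw statistic.
        y = np.asarray(y, dtype=float)
        X = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])
        e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
        Z = np.column_stack([X[1:], e[:-1]])             # drop first observation
        b, *_ = np.linalg.lstsq(Z, e[1:], rcond=None)
        u = e[1:] - Z @ b
        s2 = u @ u / (Z.shape[0] - Z.shape[1])
        cov = s2 * np.linalg.inv(Z.T @ Z)
        return b[-1] / np.sqrt(cov[-1, -1])

    # hypothetical example with no serial correlation in the errors
    rng = np.random.default_rng(1)
    X = rng.standard_normal((100, 2))
    y = X @ np.array([1.0, -0.5]) + rng.standard_normal(100)
    print(artificial_regression_t(y, X))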

330. Donald Cox, James Fetzer and Emmanuel Jimenez, "" (5/96: 101 Kb, Adobe Acrobat format)

Abstract: This paper uses the Viet Nam Living Standards Survey (VNLSS) to provide a snapshot of private transfer activity. We investigate private transfer patterns along a variety of dimensions, such as age, household resources, demographic make-up of the household and characteristics of the region of residence. We find that private transfers are substantial and widespread in Viet Nam, and their patterns suggest that they sometimes function like means-tested public transfers. They are targeted to vulnerable groups such as low-income households or those stricken with illness, for example. But they are also disproportionately given to the well-educated. A substantial fraction of elderly households receive private transfers, suggesting that they function in part as old-age support.

329. Donald Cox and Oded Stark (Harvard University), "" (6/94: 139 Kb, Adobe Acrobat format)

Abstract: How can parents secure old-age support in the form of care, attention or financial transfers from their children? We explore the enforcement of implicit intergenerational agreements from a fresh angle by studying the possibility that the child's conduct is conditioned by the parents' example. Parents can take advantage of this learning potential by making transfers to their own parents when children are present to observe such transfers. Parents who desire old-age support have an incentive to behave appropriately. The idea that the parents' behavior is aimed at inculcating desirable behavior in their children generates testable hypotheses about transfers that we investigate using household survey microdata. The demonstration-effect approach also has implications for such diverse phenomena as population aging and the labor market participation of women.

328. Donald Cox, Emmanuel Jimenez, Wlodek Okrasa, "" (5/96: 117 Kb, Adobe Acrobat format)

Abstract: Can Eastern European families most severely impoverished during the transition to capitalism rely on private family safety nets? This question is likely critical for the transition's success, but little is known about family networks in Eastern Europe. We analyze newly available Polish household surveys, conducted both before and after Poland's economic transition, which measure private inter-household transfers. Such transfers are large and widespread in Poland, and in many ways they appear to function like means-tested public transfers. They flow from high to low-income households and are targeted to young couples, large families and those experiencing illness. Private transfer patterns also suggest that they are responsive to liquidity constraints. Our results from 1987 data indicate that private transfers could fill a non-trivial portion of the income gap left by unemployment. But we also find evidence from 1992 data that family networks weakened somewhat after the transition.

327. Donald Cox, Zekeriya Eser, Emmanuel Jimenez, "" (4/96: 97 Kb, Adobe Acrobat format)

Abstract: This paper tests for the motives for private income transfers. We consider two motives: altruism and exchange. The question of private-transfer motives is important because such motivation can influence the effects of public income transfers on the distribution of income. Using a household survey for Peru, we find that transfer amounts received increase with recipient pre-transfer income, which contradicts a key prediction of the strong form of the altruism hypothesis but is consistent with exchange. We also find that capital market imperfections are likely to be an important cause of private transfers, and that social security benefits "crowd out" the incidence of private transfers.



325. Bruce E. Hansen, "" (revised 9/97: 314 Kb, Adobe Acrobat format)

Abstract: A distribution theory is developed for least squares estimates of the threshold in threshold autoregressive (TAR) models. We find that if we let the threshold effect (the difference in slopes between the two regimes) get small as the sample size increases, then the asymptotic distribution of the threshold estimator is free of nuisance parameters (up to scale). Similarly, the likelihood ratio statistic for testing hypotheses concerning the unknown threshold is asymptotically free of nuisance parameters. These asymptotic distributions are non-standard, but are available in closed form so critical values are readily available. To illustrate this theory, we report an application to the U.S. unemployment rate. We find statistically significant threshold effects.
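
Illustration: a grid-search least-squares estimate of the threshold in a two-regime TAR(1), together with the likelihood-ratio sequence LR_n(gamma) on which the inference described above is built. The single autoregressive lag, delay of one, and 15% trimming are assumptions, not the paper's exact specification.

    import numpy as np

    def tar_threshold(y, d=1, trim=0.15):
        # Two-regime TAR(1) with delay d: the threshold is estimated by
        # minimizing the least-squares criterion over candidate values of
        # the threshold variable y_{t-d}.  Returns the estimate and the
        # LR_n(gamma) sequence used for inference on the threshold.
        y = np.asarray(y, dtype=float)
        idx = np.arange(max(1, d), len(y))
        dep, lag1, z = y[idx], y[idx - 1], y[idx - d]
        lo, hi = np.quantile(z, [trim, 1.0 - trim])
        grid = np.unique(z[(z >= lo) & (z <= hi)])

        def ssr(gamma):
            r = (z <= gamma).astype(float)               # regime indicator
            X = np.column_stack([r, r * lag1, 1.0 - r, (1.0 - r) * lag1])
            resid = dep - X @ np.linalg.lstsq(X, dep, rcond=None)[0]
            return resid @ resid

        ssrs = np.array([ssr(g) for g in grid])
        s_min = ssrs.min()
        lr = len(dep) * (ssrs - s_min) / s_min           # LR_n(gamma)
        return grid[ssrs.argmin()], dict(zip(grid, lr))

    # hypothetical example on a simulated two-regime series
    rng = np.random.default_rng(2)
    y = np.zeros(400)
    for t in range(1, 400):
        rho = 0.3 if y[t - 1] <= 0.0 else 0.8
        y[t] = rho * y[t - 1] + rng.standard_normal()
    gamma_hat, lr_seq = tar_threshold(y)
    print(gamma_hat)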

324. Joseph Quinn, "" (revised 7/96: 96 Kb, Adobe Acrobat format. In Retirement Prospects in a Defined Contribution World, D. Salisbury, ed., Employee Benefit Research Institute, 1997, 25-39.)

323. Joseph Quinn, "" (2/96: 83 Kb, Adobe Acrobat format. Figures available from the author. Published by National Institute on Aging, Washington, 1996).

322. Jill Quadagno (Florida State University) and Joseph Quinn, "" (6/95: 49 Kb, Adobe Acrobat format. Figures available from the author. Published in Social Security in the 21st Century, E. Kingson and J. Schulz, eds., 1997, Oxford University Press, 127-146).

321. John Barkoulas, Christopher F. Baum, and Mustafa Caglayan, "" (revised 01/98: 182 Kb, Adobe Acrobat format. Published in Applied Economics, 1999, 31, 1393-1400.)

Abstract: We test for fractional dynamics in U.S. monetary series, their various formulations and components, and velocity series. Using the spectral regression method, we find evidence of a fractional exponent in the differencing process of the monetary series (both simple-sum and Divisia indices), in their components (with the exception of demand deposits, savings deposits, overnight repurchase agreements, and term repurchase agreements), and the monetary base and money multipliers. No evidence of fractional behavior is found in the velocity series. Granger's (1980) aggregation hypothesis is evaluated and implications of the presence of fractional monetary dynamics are drawn.

320. John Barkoulas, Christopher F. Baum, Joseph Onochie (Baruch College), "" (rev. 5/97: 90 Kb, Adobe Acrobat format). Published in Review of Financial Economics, 6:2, 187-198.

Abstract: We employ a nonlinear, nonparametric method to model the stochastic behavior of changes in the 90-day U.S. T-bill rate. The estimation technique is locally weighted regression (LWR), a nearest-neighbor method, and the forecasting criteria are the root mean square error (RMSE) and mean absolute deviation (MAD) measures. We compare the forecasting performance of the nonparametric fit to the performance of two benchmark linear models: an autoregressive model and a random-walk-with-drift model. The nonparametric fit results in significant improvements in forecasting accuracy as compared to benchmark linear models both in-sample and out-of-sample, thus establishing the presence of substantial nonlinear mean predictability of changes in the 90-day T-bill rate.
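
Illustration: a minimal locally weighted regression (tricube-weighted, nearest-neighbour) one-step forecast, with the RMSE and MAD criteria named above. The 30% span, the local-linear form, and the simulated rate changes are assumptions; the paper's exact LWR tuning and benchmark comparisons are not reproduced.

    import numpy as np

    def lwr_forecast(x_hist, y_hist, x_new, span=0.3):
        # One-step nearest-neighbour locally weighted linear regression:
        # take the span*T closest historical x values, weight them with a
        # tricube kernel, fit a weighted line, and evaluate it at x_new.
        x_hist = np.asarray(x_hist, dtype=float)
        y_hist = np.asarray(y_hist, dtype=float)
        k = max(3, int(span * len(x_hist)))
        dist = np.abs(x_hist - x_new)
        nn = np.argsort(dist)[:k]
        h = dist[nn].max()
        h = h if h > 0 else 1.0
        w = (1.0 - (dist[nn] / h) ** 3) ** 3             # tricube weights
        sw = np.sqrt(w)
        X = np.column_stack([np.ones(k), x_hist[nn]])
        beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * y_hist[nn], rcond=None)
        return beta[0] + beta[1] * x_new

    def rmse(e):
        return float(np.sqrt(np.mean(np.square(e))))

    def mad(e):
        return float(np.mean(np.abs(e)))

    # hypothetical use: forecast a rate change from its own lagged change
    rng = np.random.default_rng(3)
    dr = rng.standard_normal(300) * 0.1
    errors = np.array([dr[t] - lwr_forecast(dr[:t - 1], dr[1:t], dr[t - 1])
                       for t in range(100, 300)])
    print(rmse(errors), mad(errors))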

319. Bruce E. Hansen, "" (revised 4/98: 360 Kb, Adobe Acrobat format)

Abstract: Threshold models have a wide variety of applications in economics. Direct applications include models of separating and multiple equilibria. Other applications include empirical sample splitting when the sample split is based on a continuously-distributed variable such as firm size. In addition, threshold models may be used as a parsimonious strategy for non-parametric function estimation. For example, the threshold autoregressive model (TAR) is popular in the non-linear time series literature.

Threshold models also emerge as special cases of more complex statistical frameworks, such as mixture models, switching models, Markov switching models, and smooth transition threshold models. It may be important to understand the statistical properties of threshold models as a preliminary step in the development of statistical tools to handle these more complicated structures.

Despite the large number of potential applications, the statistical theory of threshold estimation is undeveloped. The previous literature has demonstrated that threshold estimates are super-consistent, but a distribution theory useful for testing and inference has yet to be provided.

This paper develops a statistical theory for threshold estimation in the regression context. We allow for either cross-section or time series observations. Least squares estimation of the regression parameters is considered. An asymptotic distribution theory for the regression estimates (the threshold and the regression slopes) is developed. It is found that the distribution of the threshold estimate is non-standard. A method to construct asymptotic confidence intervals is developed by inverting the likelihood ratio statistic. It is shown that this yields asymptotically conservative confidence regions. Monte Carlo simulations are presented to assess the accuracy of the asymptotic approximations. The empirical relevance of the theory is illustrated through an application to the multiple equilibria growth model of Durlauf and Johnson (1995).
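
Illustration: given a likelihood-ratio sequence LR_n(gamma) from a grid search such as the TAR sketch further up this page, an asymptotic confidence region for the threshold can be formed by keeping every candidate whose LR value falls below a critical value of the nuisance-parameter-free limit distribution. The distribution function (1 - exp(-x/2))^2, its critical value formula, and the homoskedastic scaling are taken from the published version of this line of work and should be treated as assumptions here.

    import numpy as np

    def threshold_conf_interval(lr_by_gamma, level=0.95):
        # Invert the likelihood-ratio sequence: keep every candidate
        # threshold whose LR_n(gamma) lies below the asymptotic critical
        # value c(level) = -2*log(1 - sqrt(level)) (about 7.35 at 95%),
        # then report the smallest and largest retained candidates.  The
        # region may be disjoint; its endpoints give a conservative interval.
        crit = -2.0 * np.log(1.0 - np.sqrt(level))
        kept = sorted(g for g, lr in lr_by_gamma.items() if lr <= crit)
        return kept[0], kept[-1]

    # used together with the TAR sketch above:
    #   gamma_hat, lr_seq = tar_threshold(y)
    #   print(threshold_conf_interval(lr_seq))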

318. Salih Gurcan Gülen, "" (5/96: 215 Kb, Adobe Acrobat format). An abridged version of this paper has been published in The Energy Journal, Vol. 17, 2:43-57.

Abstract: The energy shocks of the 1970's had significant effects on the global economy. Were they engineered by an effective cartel of OPEC members acting to share the market by controlling output and influencing market prices? If OPEC was an effective cartel sharing the market among its members, there would be a long-run relationship between each member's individual production and total OPEC output. One would also expect OPEC's production to significantly affect the market price of oil as the organization is often accused of curbing production in order to raise prices. These implications of cartel behavior are tested via cointegration and causality tests. The likely effects of regime changes are dealt with using techniques developed by Perron (1989). There is evidence of output coordination among some members of the organization, especially in the output rationing era (1982-93). This is also the only period in which the causality from OPEC production to the price of oil is statistically significant. Overall, the evidence suggests that OPEC did act as a cartel in the 1980's in order to maintain prices, while it simply took advantage of market conditions in the 1970's and did not have to restrain output.
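
Illustration: a generic Engle-Granger cointegration test and Granger-causality test in Python with statsmodels, as a simplified stand-in for the tests described above. The simulated series are placeholders for one member's production, total OPEC output, and the oil price; the Perron (1989) treatment of regime changes used in the paper is not reproduced.

    import numpy as np
    from statsmodels.tsa.stattools import coint, grangercausalitytests

    rng = np.random.default_rng(4)
    common = np.cumsum(rng.standard_normal(300))
    member_q = common + rng.standard_normal(300)          # one member's output
    opec_q = 2.0 * common + rng.standard_normal(300)      # total OPEC output
    price = np.cumsum(rng.standard_normal(300))           # oil price

    # long-run relation between member and total output (Engle-Granger test)
    t_stat, p_value, _ = coint(member_q, opec_q)
    print("cointegration p-value:", round(p_value, 3))

    # does OPEC output growth Granger-cause oil price changes?
    data = np.column_stack([np.diff(price), np.diff(opec_q)])
    granger_results = grangercausalitytests(data, maxlag=4)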

317. John Barkoulas, Christopher F. Baum, "" (rev. 10/96: 75 Kb, Adobe Acrobat format. Published in Journal of Financial Research, Fall 1997, 20:3, 355-372).

Abstract: We investigate the low frequency properties of three- and six-month rates for Eurocurrency deposits denominated in eight major currencies with specific emphasis on fractional dynamics. Using the fractional integration testing procedure suggested by Geweke and Porter-Hudak (1983), we find that several of the Eurocurrency deposit rates are fractionally integrated processes with long memory. These findings have important implications for econometric modeling, forecasting, and cointegration testing of Eurocurrency rates.

316. James E. Anderson, Will Martin (World Bank), "" (4/96: 99 Kb, Adobe Acrobat format)

Abstract: A simple general equilibrium model of an economy with distortionary taxes and public goods is used to extend, unify and clean up the welfare analysis of changes in taxation, redistribution and the provision of public goods. We clarify the distinction between compensation and money metric measures of the welfare impact of fiscal changes and show that the equivalent variation measure dominates other measures. We provide an integrated approach to marginal tax and public good changes when public goods have real resource costs and must be financed by distortionary taxation using the concepts of the marginal cost of funds, the fiscal price of public goods and the virtual price of public goods. Here too, the compensation version of these concepts dominates the money metric version.

315. John Barkoulas, Christopher F. Baum and Gurkan S. Oguz, "" (rev. 10/96: 281 Kb, Adobe Acrobat format; published in International Journal of Finance, 1997, 9:2, 586-606).

Abstract: DeGennaro, Kunkel, and Lee (1994) studied the long run dynamics of a system of long term interest rates of five industrialized countries by means of sophisticated cointegration methods. They found little evidence in support of the cointegration hypothesis, thus concluding that a separate set of fundamentals drives the dynamics of each of the individual long term interest rate series. In this study, we extend their analysis by exploring the possibility of very slow mean reverting dynamics (fractional cointegration) in the system of the five long term interest rates. We use the GPH test as our testing methodology for fractional integration and cointegration. Through rigorous investigation of the full system of the five long term interest rate series and its various subsystems, we provide evidence that the error correction term follows a fractionally integrated process with long memory, that is, it is mean reverting, though not covariance stationary. Despite significant persistence in the short run, a shock to the system of long term interest rates eventually dissipates so that an equilibrium relationship prevails in the long run.

314. John Barkoulas and Christopher F. Baum, "" (4/96: 178 Kb, Adobe Acrobat format; published in Economics Letters, 53:3, 253-259. Previously titled "Testing for Fractal Structure in Stock Returns".)

Abstract: This paper investigates the presence of fractal dynamics in stock returns. We improve upon existing literature in two ways: i) instead of rescaled-range analysis, we use the more efficient semi-nonparametric procedure suggested by Geweke and Porter-Hudak (GPH, 1983), and ii) to ensure robustness, we apply the GPH test to a variety of aggregate and sectoral stock indices and individual companies' stock returns series at both daily and monthly frequencies. Our results indicate that fractal structure is not exhibited by stock indices, but it may characterize the behavior of some individual stock returns series.

313. John Barkoulas, Christopher F. Baum, and Atreya Chakraborty, "" (2/96: 231 Kb, Adobe Acrobat format; published in International Journal of Banking and Finance, 1:1, 119-135, 2003)

Abstract: We employ a nonlinear, nonparametric method to model the stochastic behavior of changes in several short and long term U.S. interest rates. We apply a nonlinear autoregression to the series using the locally weighted regression (LWR) estimation method, a nearest-neighbor method, and evaluate the forecasting performance with a measure of root mean square error. We compare the forecasting performance of the nonparametric fit to the performance of two benchmark linear models: an autoregressive model and a random-walk-with-drift model. The nonparametric model exhibits greater out-of-sample forecast accuracy than that of the linear predictors for most U.S. interest rate series. The improvements in forecasting accuracy are statistically significant and robust. This evidence establishes the presence of significant nonlinear mean predictability in U.S. interest rates, as well as the usefulness of the LWR method as a modeling strategy for these benchmark series.

312. Paul Beaudry (University of British Columbia), Mustafa Caglayan and Fabio Schiantarelli, "" (2/96: 74 Kb, Adobe Acrobat format; later version published in American Economic Review, 91, 648-662, 2001)

Abstract: It is often argued that monetary instability reduces the informational content of market signals and thereby hinders the efficient allocation of investment. In this paper we use a signal extraction framework to give empirical content to this idea. In particular, we show why this framework predicts that, as monetary uncertainty decreases, the cross-sectional distribution of investment widens. We then explore this hypothesis using panel data information for UK companies over twenty years. Our data generally support the view that monetary instability may affect investment allocation through its effect on the predictability of prices.

311. Christopher F. Baum and John T. Barkoulas, "" (2/96: 352 Kb, Adobe Acrobat format; published in Applied Financial Economics, 1997, 7:635-643)

Abstract: We re-examine Sephton and Larsen's (1991) conclusion that cointegration-based tests for market efficiency suffer from temporal instability. We improve upon their research by i) including a drift term in the vector error correction model (VECM) in the Johansen procedure, ii) correcting the likelihood ratio test statistic for finite-sample bias, and iii) fitting the model over longer data sets. We show that the instability of the Johansen cointegration tests mostly disappears after accounting for the first two factors. The evidence is even more stable in favor of no cointegration when we apply our analysis to longer data sets.
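
Illustration: the Johansen trace test with a constant in the VECM, via statsmodels. The finite-sample correction of the likelihood ratio statistic described above is not applied here, and the simulated spot/forward placeholder series and the lag length are assumptions.

    import numpy as np
    from statsmodels.tsa.vector_ar.vecm import coint_johansen

    rng = np.random.default_rng(5)
    common = np.cumsum(rng.standard_normal(500))
    spot = common + rng.standard_normal(500)               # hypothetical spot rate
    forward = common + 0.5 + rng.standard_normal(500)      # hypothetical forward rate
    data = np.column_stack([spot, forward])

    # det_order=0 includes a constant term; k_ar_diff is the lag order of the
    # differenced VAR.  No small-sample bias correction is applied here.
    res = coint_johansen(data, det_order=0, k_ar_diff=1)
    print("trace statistics:", res.lr1)
    print("95% critical values:", res.cvt[:, 1])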

310. Bruce E. Hansen, "" (revised 4/98: 363 Kb, Adobe Acrobat format)

Abstract: In the past decade, we have seen the development of a new set of tests for structural change of unknown timing in regression models, most notably the SupF statistic of Andrews (1993), the ExpF and AveF statistics of Andrews-Ploberger (1994), and the L statistic of Nyblom (1989). The distribution theory used for these tests is primarily asymptotic, and has been derived under the maintained assumption that the regressors are stationary. This excludes structural change in the marginal distribution of the regressors. As a result, these tests technically cannot discriminate between structural change in the conditional and marginal distributions. This paper attempts to remedy this deficiency by deriving the large sample distributions of the test statistics allowing for structural change in the marginal distribution of the regressors. We find that the asymptotic distributions of the SupF, ExpF, AveF and L statistics are not invariant to structural change in the regressors. To solve the size problem, we introduce a 'fixed regressor bootstrap' which achieves the first-order asymptotic distribution, and appears to possess reasonable size properties in small samples. Our bootstrap theory allows for arbitrary structural change in the regressors, including structural shifts, polynomial trends, and exogenous stochastic trends. It allows for lagged dependent variables and heteroskedastic error processes.
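
Illustration: a sup-F test for a single structural break at an unknown date, with a p-value from a fixed-regressor, wild-style bootstrap in which the regressors are held fixed and the dependent variable is rebuilt from the null residuals times standard normal draws. This is a simplified sketch of the general idea in the abstract; the paper's exact bootstrap scheme, trimming, and treatment of heteroskedasticity and lagged dependent variables may differ.

    import numpy as np

    def supf_fixed_regressor_bootstrap(y, X, trim=0.15, n_boot=199, seed=0):
        y = np.asarray(y, dtype=float)
        X = np.column_stack([np.ones(len(y)), np.asarray(X, dtype=float)])
        n, k = X.shape
        lo, hi = int(trim * n), int((1.0 - trim) * n)

        def supf(yv):
            # F statistic for a break at every admissible date; keep the max
            ssr0 = np.sum((yv - X @ np.linalg.lstsq(X, yv, rcond=None)[0]) ** 2)
            best = 0.0
            for m in range(lo, hi):
                D = np.column_stack([X, X * (np.arange(n) >= m)[:, None]])
                ssr1 = np.sum((yv - D @ np.linalg.lstsq(D, yv, rcond=None)[0]) ** 2)
                best = max(best, ((ssr0 - ssr1) / k) / (ssr1 / (n - 2 * k)))
            return best

        stat = supf(y)
        e = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]   # residuals, regressors held fixed
        rng = np.random.default_rng(seed)
        boot = [supf(e * rng.standard_normal(n)) for _ in range(n_boot)]
        return stat, float(np.mean([b >= stat for b in boot]))

    # hypothetical example with a mid-sample shift in the intercept
    rng = np.random.default_rng(6)
    x = rng.standard_normal(200)
    y = 1.0 + 0.5 * x + rng.standard_normal(200) + (np.arange(200) >= 120) * 1.0
    print(supf_fixed_regressor_bootstrap(y, x))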

309. Richard Arnott, Alex Anas (SUNY Buffalo), "Taxes and Allowances in a Dynamic Equilibrium Model of Urban Housing with a Size-Quality Hierarchy." (12/95)

308. James Anderson, "" (9/95: 83Kb, Adobe Acrobat format; published in Journal of International Economics, 1998, 44:21-44)

Abstract: This paper rehabilitates effective protection. In general equilibrium, the usual definition (the percentage change in value added per unit induced by the tariff structure) corresponds to no economically interesting magnitude. The effective rate of protection for sector j is defined here as the uniform tariff which is equivalent to the actual differentiated tariff structure in its effect on the rents to residual claimants in sector j. This definition applies to general as well as partial equilibrium economic structures, has obvious relevance for political economy models and seems to correspond to the motivation for the early effective protection literature. Like the earlier effective rate formula, the concept is operational using the widely available set of Computable General Equilibrium (CGE) models. An example is provided for the US economy. The numerical results for the old and new concepts are not significantly correlated.

307. Richard Arnott, Ralph Braid (Wayne State University), Russell Davidson (GREQE, Queen's University), David Pines (Tel-Aviv University), "" (8/95: 264Kb, Adobe Acrobat format)

Abstract: This paper examines the properties of stationary-state general equilibrium in a monocentric city with durable housing. On the demand side, identical households choose location, housing quality and quantity (floor area), and other goods. On the supply side, developers choose the structural density and time path of quality (which depends on construction quality and maintenance) of buildings. Under a certain set of assumptions, existence and uniqueness of equilibrium are proved, and its comparative static/dynamic properties are determined.

306. Peter Gottschalk, Mary Joyce (Bureau of Labor Statistics), "" (8/95: 66Kb, Adobe Acrobat format)

305. Richard Arnott, André de Palma (University of Geneva), Robin Lindsey (University of Alberta), "Recent Developments in the Bottleneck Model" (8/95)

304. Richard Arnott, Marvin Kraus, "" (6/95: 99Kb, Adobe Acrobat format)

303. Chong-en Bai, Chenggang Xu (London School of Economics), "" (8/95: 462Kb, Adobe Acrobat format)

Abstract: This paper provides a theoretical framework to analyze workers' incentives under different ownership. It shows that the workers' effort and expected income are higher and the monitoring intensity is lower in the employee-owned firm than in the capitalist firm. Unlike in previous models, the advantage of employee ownership here does not depend on the size of the firm. It also shows that the advantage of employee ownership increases as the workers' reservation wage decreases and as the monitoring cost and productivity uncertainty increase. Finally, it discusses the relevance of the theory to employee stock-ownership programs (ESOPs) and profit sharing.

302. Richard Arnott, Paul Anglin (University of Windsor), "" (4/95: 248Kb, Adobe Acrobat format)

Abstract: Many people believe that prevailing commission rates for residential real estate brokers are "too high" but do not offer a formal model. This paper presents a general equilibrium model of the housing market in which real estate brokers serve as matching intermediaries. We use this model to construct an illustrative example which is "calibrated" using data representative of a typical housing market.

301. Richard Arnott, Ralph Braid (Wayne State University), "" (2/95: 231Kb, Adobe Acrobat format)

Abstract: This paper presents a filtering model of the housing market which is similar to Sweeney's (1974b), except that the maintenance technology is such that housing can be maintained at a constant quality level as well as downgraded, and population at each income level grows continuously over time. In equilibrium, at each moment of time, some housing is allowed to deteriorate in quality, and other housing is maintained in a steady-state interval of qualities.

300. Bruce E. Hansen, "Rethinking the Univariate Approach to Unit Root Testing: Using Covariates to Increase Power" (5/95: now published in Econometric Theory, 1995, 11:1148-1172)

Abstract: In the context of testing for a unit root in a univariate time series, the convention is to ignore information in related time series. This paper shows that this convention is quite costly, as large power gains can be achieved by including correlated stationary covariates in the regression equation. The paper derives the asymptotic distribution of ordinary least squares (OLS) estimates of the largest autoregressive root and its t statistic. The asymptotic distribution is not the conventional "Dickey-Fuller" distribution, but a convex combination of the Dickey-Fuller distribution and the standard normal, the mixture depending on the correlation between the equation error and the regression covariates. The local asymptotic power functions associated with these test statistics suggest enormous gains over the conventional unit root tests. A simulation study and empirical application illustrate the potential of the new approach.
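
Illustration: the covariate-augmented unit root regression described above, i.e. an augmented Dickey-Fuller regression with correlated stationary covariates added, returning the t statistic on the lagged level. The lag length and simulated data are assumptions, and the statistic must be compared with the mixture (Dickey-Fuller/normal) critical values derived in the paper, which are not reproduced here.

    import numpy as np

    def covariate_adf_t(y, covariates, p=1):
        # Regress dy_t on (const, y_{t-1}, dy_{t-1},...,dy_{t-p}, covariates_t)
        # and return the t statistic on y_{t-1}.  Critical values are NOT the
        # usual Dickey-Fuller ones; see the paper's mixture distribution.
        y = np.asarray(y, dtype=float)
        Z = np.asarray(covariates, dtype=float).reshape(len(y), -1)
        dy = np.diff(y)
        rows = np.arange(p, len(dy))
        cols = [np.ones(len(rows)), y[rows]]              # constant, lagged level
        cols += [dy[rows - j] for j in range(1, p + 1)]   # lagged differences
        X = np.column_stack(cols + [Z[rows + 1]])         # covariates dated t
        d = dy[rows]
        b, *_ = np.linalg.lstsq(X, d, rcond=None)
        u = d - X @ b
        s2 = u @ u / (len(rows) - X.shape[1])
        cov = s2 * np.linalg.inv(X.T @ X)
        return b[1] / np.sqrt(cov[1, 1])

    # hypothetical example: a unit-root series plus a correlated stationary covariate
    rng = np.random.default_rng(7)
    shock = rng.standard_normal(400)
    y = np.cumsum(shock)
    w = 0.8 * shock + 0.2 * rng.standard_normal(400)
    print(covariate_adf_t(y, w, p=2))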



299. Bruce E. Hansen, "" (5/95: 413Kb, Adobe Acrobat format; published in Economic Journal, 1996, 106:1398-1413).

Abstract: This article reviews David Hendry's Econometrics: Alchemy or Science?

298. Chong-en Bai, Yijiang Wang (University of Minnesota), "" (8/95: 116Kb, Adobe Acrobat format)

Abstract: This paper studies the soft budget constraint problem in a principal-agent model. The agent screens projects and makes initial investments in the projects that have passed the screening. He then learns the types of the funded projects and decides whether to close some of the ex post inefficient ones. Closing projects sends an unfavorable signal about the agent's screening effort. Under the ex ante efficient contract, the agent has an incentive to refinance some of the ex post inefficient projects.

297. Bruce E. Hansen, "" (4/95: 281Kb, Adobe Acrobat format; published in Journal of Business and Economic Statistics, 1997, 15:60-67).

Abstract: Numerical approximations to the asymptotic distributions of recently proposed tests for structural change are presented. This enables easy yet accurate calculation of asymptotic p-values.

296. Bruce E. Hansen, "" (3/95: 116Kb, Adobe Acrobat format). Note: This erratum corrects an error in Hansen, Journal of Applied Econometrics (1992), and was published in Journal of Applied Econometrics, 1996, 11:195-198.

295. Bruce E. Hansen, "" (11/94: 660Kb, Adobe Acrobat format; published in Econometric Theory, 1996, 12:347-359).

Abstract: This paper establishes stochastic equicontinuity for classes of mixingales. Attention is restricted to Lipschitz-continuous parametric functions. Unlike some other empirical process theory for dependent data, our results do not require bounded functions, stationary processes, or restrictive dependence conditions. Applications are given to martingale difference arrays, strong mixing arrays, and near epoch dependent arrays.

294. Douglas Marcouiller, S.J., Veronica Ruiz de Castilla (University of Texas, Austin), Christopher Woodruff (University of California, San Diego), "" (3/95: 231Kb, Adobe Acrobat format; published in Economic Development and Cultural Change, 1997, 45, 367-392.)

Abstract: Using comparable micro-level data from three countries, we ask what type of person works in the informal sector and whether informal workers earn lower wages than observationally equivalent workers in the formal sector. The characteristics of informal workers are similar across countries. Surprisingly, when we control for these personal characteristics, we find a significant wage premium associated with formal employment in El Salvador and Peru but a premium associated with work in the informal sector in Mexico. A model of endogenous selection offers little help in explaining the differences in wage patterns. The research casts doubt on the received wisdom that the informal sector, always and everywhere, is a poorly-paid but easily-entered refuge for those who have no other employment opportunities.

293. Fabio Schiantarelli, "" (8/95: 1.2 Mb, Adobe Acrobat format)

Revised version (with discussion by Steven Fazzari and Donald Hester) published in Is Bank Lending Important for the Transmission of Monetary Policy?, J. Peek and E. Rosengren, eds., Federal Reserve Bank of Boston Conference Series No. 39, June 1995.

292. Joseph F. Quinn, Michael Kozy, "" (5/95: 66Kb, Adobe Acrobat format)

291. T. Christopher Canavan, "" (7/95: 215Kb, Adobe Acrobat format)

Abstract: This paper presents a model in which "instrument uncertainty" (that is, an uncertain mapping from monetary policy to macroeconomic outcomes) may mitigate the inflationary bias problem that arises when efficient monetary policy rules are time-inconsistent. If the relation between monetary policy and macroeconomic outcomes is uncertain, the private sector has an incentive to scrutinize the past for clues about this relationship. This learning creates a link between past government behavior and present inflation expectations that the government can exploit to enhance its credibility. The model implies that the two conventional arguments for simple rules in monetary policy (one stressing the central bank's poor forecasting abilities and the other stressing the perils of discretion) may work at cross-purposes. Moreover, it provides an explanation of the cyclical behavior of inflation due to political cycles and of the correlation between the level and variance of inflation.

290. James Anderson, "" (6/95: 99Kb, Adobe Acrobat format; published in Economic Journal, 1998, 108:1111-1125)

Abstract: This paper provides benchmarks of trade restrictiveness and year-on-year changes in trade restrictiveness using the Trade Restrictiveness Index. These benchmark measures stand in sharp contrast to standard measures. For a 28 country sample the TRI is used to compare trade policy in a recent year with free trade. Trade weighted average tariffs substantially underestimate restrictiveness measured by the 'uniform tariff equivalent' (the inverse of the TRI minus one), with the degree of underestimate positively correlated with the dispersion of the tariff structure. The rank correlation of the 'uniform tariff equivalent' and the average tariff in the sample is high, but the error implied by using the average tariff instead of the uniform tariff equivalent is substantial and variable. For a 7 case sample, year-on-year recent changes in trade policy are evaluated with the TRI and with standard measures. Here, the correlation of the TRI and changes in the standard measures is close to zero, essentially because tariff means and variances often do not move together. These conclusions appear to be robust with respect to missing data problems. The magnitude of the TRI is not very sensitive to elasticity of substitution variation, but is sensitive to the assumptions used to treat NTBs.

289. Peter Gottschalk, Kathleen M. Lang, "" (6/95: 52Kb, Adobe Acrobat format)

288. Christopher F. Baum, Basma Bekdache (Wayne State University), "" (revised 6/96: 215 Kb, Adobe Acrobat format); published in H. Amman et al., eds., Computational Approaches to Economic Problems, Kluwer Academic Publishers, 1997.

Abstract: In this paper, we test the multivariate model of securities' excess returns formulated by Engle et al. (1990) on an expanded set of maturities. By applying their methodology to the entire Treasury term structure, we consider the applicability of a parsimonious common factor approach to the dynamics of short-, medium-, and long-term interest rates. We extend their methodology to incorporate asymmetric GARCH representations, in which the slope of the yield curve (and its sign) affects the evolution of the conditional variance of excess returns in fixed-income and equity markets. We find this approach quite successful in explaining the comovements of excess returns on the spectrum of Treasury issues for the 1962-1992 period.
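
Illustration: a univariate asymmetric (GJR-style) GARCH fit with the arch package, showing the kind of asymmetric conditional-variance specification the abstract refers to. This is far simpler than the paper's multivariate common-factor model, in which the slope and sign of the yield curve enter the conditional variance; the simulated returns and the choice of the arch package are assumptions, not the paper's code.

    import numpy as np
    from arch import arch_model

    # hypothetical excess-return series (in percent)
    rng = np.random.default_rng(8)
    returns = rng.standard_normal(1000)

    # p=1, o=1, q=1 gives a GJR-GARCH(1,1): the 'o' term lets negative
    # shocks raise the conditional variance more than positive ones.
    model = arch_model(returns, mean="Constant", vol="GARCH", p=1, o=1, q=1)
    result = model.fit(disp="off")
    print(result.params)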

287. Frank M. Gollop, Kelly A. Chaston, Kathleen M. Lang, "The Battle Against Major Air Pollutants: Some Wartime Statistics" (4/95)

286. Fabio Schiantarelli, Alessandro Sembenelli (CERIS-CNR), "Form of Ownership and Financial Constraints: Panel Data Evidence from Leverage and Investment Equations" (4/95)

Abstract: This paper analyzes the effects of the form of ownership on the substitutability between internal and external sources of finance. In particular, we test whether financial constraints are more severe for independent firms and whether there are differences between members of large national business groups and subsidiaries of foreign multinational corporations. The results obtained from leverage and investment equations estimated for a panel of Italian companies imply that independent firms face more severe financial constraints. Moreover, such constraints are greater when cash flow decreases. Members of national groups and subsidiaries of multinational corporations are less sensitive to cash flow in their investment decisions. Leverage equations suggest, however, that there are interesting differences between the two latter categories of firms. In particular, agency costs arising from the conflict between managers and shareholders are more important for subsidiaries of multinational corporations.

285. Frank M. Gollop, "The Pin Factory Revisited: Diversification and Productivity Growth" (11/94)

284. Fabio Schiantarelli, Xiaoqiang Hu (Claremont McKenna College), "Investment and Financing Constraints: A Switching Regression Approach Using U.S. Firm Panel Data" (4/95)

Abstract: In this paper we develop a switching regression model of investment, in which the probability of a firm being financially constrained is endogenously determined. This approach allows one to address the potential problem of static and dynamic misclassification encountered when firms are sorted using a criterion chosen a priori.
The empirical results obtained for US panel data suggest that the probability of being constrained depends upon variables that capture each firm's creditworthiness, and that it is also related to general macroeconomic conditions and to the tightness of monetary policy.

283. Robert G. Murphy, "Macroeconomic Policy Implications of Oil in Colombia" (11/1994)

Abstract: This paper develops and applies a small econometric model to illustrate the likely short-term macroeconomic effects on the economy of Colombia resulting from the recent discovery and planned development of new oil resources. In performing the analysis, the paper considers various assumptions concerning government fiscal, monetary, and exchange-rate policies so as to assess the ability of policy to influence the effects that the oil discovery will have on the economy. Comparisons of these alternative policy simulations strongly suggest that appropriate macroeconomic policies can significantly reduce the negative consequences of the "Dutch disease," the symptoms of which are reflected in an over-valued exchange rate and declining non-oil export and import-competing sectors. The simulations also demonstrate, however, that some degree of relative price adjustment will be needed for the Colombian economy. In particular, attempts to limit relative price adjustment through the real exchange rate simply force the requisite relative price adjustment to occur through domestic price inflation, which reduces the real purchasing power of the peso. Managing this tradeoff between domestic price inflation and real appreciation of the exchange rate should be the overriding concern of macroeconomic policy.

282. Richard Arnott, "" (09/1994: 1.1 Mb, Adobe Acrobat format)

Abstract: Economists' favorite remedy for traffic congestion is road pricing. Not only is road pricing based on sound economic principles, but also given current technology it could be implemented at reasonable cost and in a flexible and sophisticated manner. But there are serious obstacles to the widespread adoption of road pricing. There are problems of phase-in: the fixed costs of introducing any system of road pricing, as well as the problems of coordinating road pricing across jurisdictions, including standardization and the treatment of out-of-towners. Political acceptability is an even more serious obstacle. How can congestion pricing be "sold" to economically unsophisticated voters who are justifiably suspicious of any new government taxes and charges? This paper will not argue against road pricing, though it will point out some of the difficulties associated with the policy that economists have tended to ignore or to gloss over. Rather, it will examine some of the alternatives to road pricing. More specifically, it will focus on two related questions, one positive, one normative, on the assumption that congestion pricing is not introduced, at least on city streets. The positive question: What are the likely effects of policies other than road pricing on alleviating road congestion? The normative question: What mix of policies (road pricing excluded) would be most effective in alleviating traffic congestion? Throughout the focus will be on urban traffic congestion. Alternatives to road pricing can be grouped into five categories: 1. Expansion and upgrading of existing road capacity; 2. Expansion and upgrading of mass transit; 3. Regulation; 4. Information; 5. Non-road transport pricing. While the emphasis of the paper will be on qualitative analysis, there will be some attempts at quantification via back-of-the-envelope calculations.

281. Christopher F. Baum and John Barkoulas, "" (10/1994)

A revised version (January 1996) of this paper, now entitled "Time-Varying Risk Premia in the Foreign Currency Futures Basis," is available in Adobe Acrobat format (192 K); published in Journal of Futures Markets, 16:7, 735-755.

Abstract (revised January 1996): Significant time-varying risk premia exist in the foreign currency futures basis, and these risk premia are meaningfully correlated with common macroeconomic risk factors from equity and bond markets. The stock index dividend yield and the bond default and term spreads in the U.S. markets help forecast the risk premium component of the foreign currency futures basis. The specific source of risk matters, but the relationships are robust across currencies. The currency futures basis is positively associated with the dividend yield and negatively associated with the spread variables. These correlations cannot be attributed to the expected spot price change component of the currency futures basis, thus establishing the presence of a time-varying risk premium component in the currency futures basis.

280. Richard Arnott, "The Economics of Residential Real Estate Brokerage" (06/1994)

279. Chongen Bai, Shan Li, "Capital Structure and Product Market Strategy" (07/1994)

Abstract: This paper develops a general framework to analyze the relationship between a firm's capital structure and its product market strategy. Based on the nature of the firm's agency problem and the characteristics of the product market, it presents a taxonomy of whether debt makes a firm tough or soft in product market competition and of how strategic considerations affect the firm's leverage. We then review the related literature and point out unexplored linkages between capital structure and product market strategies. Finally, we discuss the empirical implications of our theoretical results.

278. Martin Fleming, John Jordan, and Kathleen Lang, "Macroeconomic Policy and Methodological Misdirection in the National Income and Product Accounts" (09/1994)

277. Chongen Bai and Yijiang Wang, "Specific Human Capital Investment and Turnover Under Uncertainty" (07/1994)

Abstract: An equilibrium model of labor contracts under asymmetric information is developed. A profit-maximizing firm offers a wage but retains the right to lay off the worker based on its private observation of the worker's productivity ex post. The worker invests in specific human capital, unobservable to the firm, to improve the retention probability. It is shown that, under not very restrictive conditions, productivity uncertainty has adverse effects on the firm's wage offer to the worker, the worker's investment in firm-specific human capital, employment stability, and average productivity. A comparison between American and Japanese firms is made to explore the implication of the finding.

276. Chongen Bai, "Specific Human Capital Investment and Wage Profiles" (11/1992)

275. Olin Liu (International Monetary Fund) and Christopher F. Baum, "" (10/1994: 1.8 Mb, Adobe Acrobat format)

Abstract: This paper develops and tests a nonlinear general equilibrium model of the term structure of interest rates based on the framework of Cox, Ingersoll and Ross (CIR, 1985). The contributions of this paper to the literature are both theoretical and empirical. The theoretical advantages of the general equilibrium model developed in this paper over the CIR model are that (a) the risk premium is endogenously derived as a nonlinear function of the instantaneous interest rate; (b) the nonlinear model shows that the term premium need not be strictly increasing in maturity as in CIR's model, but can be either increasing or humped, a result that is consistent with recent findings by Fama (1984) and McCulloch (1987); and (c) yields of different maturities are not perfectly correlated, but exhibit positive correlations. A partial differential equation for valuing the discount bond price is presented, and a closed-form expression is derived. The term structure of interest rates derived from this nonlinear model may be increasing, decreasing, humped or inverted, depending on parameter values.
In an empirical application of the model, we develop a strategy for estimation which permits analysis of the model's temporal stability. Our model, like that of CIR, expresses the underlying stochastic process as a highly nonlinear function of two fundamental, time-invariant parameters. Many researchers have found that general equilibrium models such as CIR's provide quite poor explanations of the evolution of the term structure of interest rates. As an alternative strategy to that of fitting the fundamental parameters, we employ nonlinear system estimation of the unrestricted reduced-form parameters with a moving-window strategy in order to capture the term structure volatility caused by factors other than the instantaneous interest rate. We purposefully do not impose any law of motion on the estimated volatilities. This methodology is shown to have strong predictive power for the observed term structure of interest rates, both in-sample and out-of-sample.
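
Illustration: the closed-form zero-coupon bond price in the baseline CIR (1985) square-root model, which is the benchmark this paper generalizes. The paper's nonlinear, endogenously derived risk premium is not reproduced here, and the parameter values below are arbitrary.

    import numpy as np

    def cir_bond_price(r, tau, kappa, theta, sigma):
        # Baseline CIR discount bond price P = A(tau) * exp(-B(tau) * r) for
        # risk-neutral dynamics dr = kappa*(theta - r) dt + sigma*sqrt(r) dW.
        g = np.sqrt(kappa ** 2 + 2.0 * sigma ** 2)
        denom = (g + kappa) * (np.exp(g * tau) - 1.0) + 2.0 * g
        B = 2.0 * (np.exp(g * tau) - 1.0) / denom
        A = (2.0 * g * np.exp((kappa + g) * tau / 2.0) / denom) ** (2.0 * kappa * theta / sigma ** 2)
        return A * np.exp(-B * r)

    # implied continuously compounded yields at a few maturities (arbitrary parameters)
    taus = np.array([0.25, 1.0, 5.0, 10.0])
    prices = cir_bond_price(0.05, taus, kappa=0.2, theta=0.06, sigma=0.07)
    print(-np.log(prices) / taus)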

274. T. Christopher Canavan, "" (07/1994, PDF)

Abstract: Standard models of hyperinflation use a money demand function based on asset-market considerations: households adjust their real balances according to expected inflation, which is the negative of the real rate of return to money. But these models yield inaccurate and sometimes counterintuitive predictions. One is that if a hyperinflation is a price-level bubble, then hyperinflation is possible at any rate of money growth. Another is that, for some equilibria, an increase in a government's reliance on seignorage reduces rather than raises the steady-state inflation rate. This paper proposes an alternative way to look at hyperinflation based on a careful description of the microeconomics of monetary exchange. Money is primarily an institution required to finance consumption and only incidentally a financial asset. The decision to accept money is a decision to engage in monetary exchange, and a hyperinflation occurs when most households choose to abstain from monetary exchange. The macroeconomic implications of this model are more appealing than those of the traditional models. First, while a hyperinflation in my model may have the same properties as a price-level bubble, this "bubble" is very unlikely when money growth is low and inevitable when money grows too quickly. Second, a greater reliance on seignorage increases the rate of inflation, and can ultimately cause a hyperinflation. The model also mimics the non-monotonic path of real balances as inflation accelerates. Finally, the model suggests that it may be very difficult to restore a currency's place in exchange after a hyperinflation.

273. James E. Anderson, "Effective Protection Redux" (07/1994)

Abstract: This paper rehabilitates effective protection. The usual definition of the effective rate of protection is the percentage change in value added per unit induced by the tariff structure. The problem is that in general equilibrium this measure corresponds to no economically interesting magnitude. The effective rate of protection for sector j is defined here as the uniform tariff which is equivalent to the actual differentiated tariff structure in its effects on the rents to residual claimants in sector j. This definition applies to general as well as partial equilibrium economic structures, has obvious relevance for political economy models and seems to correspond to the motivation for the early effective protection literature. Like the earlier effective rate formula, the concept is operational using the widely available set of Computable General Equilibrium (CGE) models.

272. James E. Anderson, "Measuring the Welfare Impact of Fiscal Policy" (08/1994)

Abstract: This paper is a guide to welfare cost measurement with all the basic elements of fiscal policy active in a representative consumer economy. By developing a simple dual framework which nevertheless is more general than the special cases usually employed, we are able to clear up several confusions in the literature on the welfare cost of tax changes. We argue for the natural dominance of a single measure of welfare cost. We do a similar analysis of the parallel literature on government project evaluation, and are able to clear up the confusing concept of the marginal social cost of funds.

271. Christopher F. Baum and Basma Bekdache, "Comparing Alternative Models of the Term Structure of Interest Rates" (06/1994) This paper has been revised and reissued as.

270. Richard Arnott, Marvin Kraus, "When are Anonymous Congestion Charges Consistent with Marginal Cost Pricing?"
This paper has been revised and reissued as Working Paper #354.

269. James E. Anderson, "Strategic Lobbying and Antidumping" (10/1993)

Abstract: Anti-dumping is often defended as a pressure valve which reduces more illiberal forms of protectionist pressure. In the domino dumping model of Anderson (1992, 1993) this need not be true as exporters dump to obtain market access in the event of a VER. The contribution of this paper is to show that anti-dumping opens a channel for strategic lobbying through which lobbying commitments can have favorable effects on the decisions of exporting firms, and through which antidumping enforcement can encourage lobbying. Thus a "de-politicizing" institution can perversely be responsible for politicizing trade policy all the more.

268. Fabio Schiantarelli, Izak Atiyas, Gerard Caprio, Jr., John Harris, and Andrew Weiss, "Credit Where Credit is Due? How Much, For Whom, and What Difference Does it Make? A Review of the Macro and Micro Evidence on the Real Effects of Financial Reform" (12/1993)

267. Fabio Schiantarelli, Mustafa Caglayan, Paul Beaudry and Michael Devereaux, "Trends and Cycle Variations in the Cross-Sectional Distribution of Debt for U.K. Companies: Some Stylized Facts" (03/1994)

266. Fabio Schiantarelli, Andrew Weiss, Fidel Jaramillo and Miranda Siregar, "Financial Liberalization and the Efficiency of Investment Allocation" (02/1994)

Abstract: In this paper we investigate whether financial liberalization improves the efficiency with which investment funds are allocated. We develop a simple measure of efficiency and apply it to firm level panel data for Indonesia and Ecuador. We find evidence that financial liberalization has indeed improved the allocation of investment, particularly in Indonesia.

265. Harold Petersen and James A. Repetti (BC Law School), "The Relationship Of Taxation To Corporate Dividends" (12/1993)

264. Joseph F. Quinn, Scott A. Bass (Syracuse) and Richard V. Burkhauser, "Towards Pro-Work Policies And Programs For Older Americans" (12/1993)

263. John Jordan, Kathleen Lang and Martin Fleming (Cahners Publishing), "Measurement Error in U.S. National Income and Product Accounts: Its Nature and Impact on Forecasts" (12/1993)

262. John Barkoulas, Alpay Filiztekin and Robert G. Murphy, "The Long-Run Relationship Between Saving And Investment: Stylized Fact Or Fiction?" (12/1993; published as "Time Series Evidence on the Saving-Investment Relationship," Applied Economics Letters, 1996, 3, 77-80)

Abstract: The high correlation between domestic saving and investment rates in cross-country regressions has been interpreted by some authors as evidence that world capital markets are not integrated. Our paper reexamines the long-run saving-investment relationship across OECD countries using cointegration methods. This approach enables us to provide evidence regarding this relationship at a disaggregated level, that is, for each country separately. It also accounts for the non-stationarity of the underlying time series. In order to estimate the long-run saving-investment correlation as well as to correct for simultaneous-equation bias, a non-linear single-equation error-correction model is used. The results qualify the conclusions of previous studies by suggesting that saving and investment rates are not highly correlated in the long run for most OECD countries.

261. James E. Anderson, "Strategic Lobbying" (12/1993)

260. Joseph F. Quinn, "Is Early Retirement an Economic Threat?" (12/1993)

259. Jane Marrinan and Eric Van Wincoop (Boston University and IGIER), "Public And Private Savings And Investments" (12/1993)

Abstract: Several studies have shown that models with perfect international capital mobility can generate high correlations between aggregate savings and investment, as observed in the data. In this paper we decompose aggregate saving and investment into their two component parts, private and public. This leads to some striking observations. In almost all of the OECD countries we investigate during the 1975-1989 period, the private sector saving-investment gap closely mirrors the government sector saving-investment gap. Moreover, unlike the large aggregate saving-investment correlations, private sector saving-investment correlations are on average close to zero. The paper investigates these and other moments associated with public and private saving and investment in the context of models with perfect capital mobility. The paper devotes significant attention to modeling the government sector. Rules for taxation, government consumption and investment are specified, estimated, and fed into the model simulations. We find that while models with fiscal, technology and interest rate shocks are able to generate negative correlations between the public and private sector saving-investment gaps, these correlations still fall significantly short of the very negative correlations observed in the data. Moreover, the models are not able to generate correlations between private saving and investment that are much lower than those between total saving and investment.

258. James E. Anderson, "Measuring Trade Restrictiveness In A Simple CGE Model" (12/1993)

257. Joseph F. Quinn, "Retirement And The Labor Force Behavior Of The Elderly" (12/1993)

256. Richard Arnott and Kenneth Small (University of California, Irvine), "The Economics Of Traffic Congestion" (12/1993)

255. Joseph F. Quinn and Richard V. Burkhauser (Syracuse), "Changing Policy Signals: How To Increase Work Effort At Older Age" (12/1993)

254. Richard Arnott and Joseph Stiglitz (Stanford), "Price Equilibrium, Efficiency, And Decentralizability In Insurance Markets With Moral Hazard" (12/1993)

Abstract: In this paper we investigate the descriptive and normative properties of competitive equilibrium with moral hazard when firms offer "price contracts" which allow clients to purchase as much insurance as they wish at the quoted price. We show that a price equilibrium always exists and is one of three types: (i) a zero-profit price equilibrium, with zero profit, zero effort, and full insurance; (ii) a positive-profit price equilibrium, with positive profit, positive effort, and partial insurance; or (iii) a zero-insurance price equilibrium, with zero insurance, zero profit, and positive effort. Suppose a client purchases an additional unit of insurance from an insurer and consequently reduces accident-avoidance effort. This will lower the profitability of the insurance the client has obtained from other insurers. In setting price, however, the insurer neglects this effect. Thus, price insurance entails an externality. We show under what circumstances this externality can be fully internalized by a linear tax on insurance sales. Actual insurance contracts lie in the middle ground between exclusive quantity contracts (where an individual is effectively constrained to purchase all his insurance from one firm) and price contracts. We argue that our analysis of price contracts sheds light on the welfare properties of actual insurance contracts. Notably, since the externality we identify will still be operative, the taxation of insurance sales is typically desirable.

253. Joseph F. Quinn, "Poverty And Income Security Among Older Persons" (12/1993)

252. James E. Anderson and J. Peter Neary (University College Dublin), "The Trade Restrictiveness Of The Multi-Fibre Arrangement" (12/1993)

251. Joseph F. Quinn, "Capitalism and Wealth Creation" (12/1993)

221. Fabio Schiantarelli, Andrew Weiss (Boston University) and Fidel Jaramillo (Facultad Latinoamericana de Ciencias Sociales), "" (12/1993: 626 Kb, Adobe Acrobat format)

Abstract: Using a large panel of Ecuadorian firms, this paper analyzes the role of capital market imperfections for investment decisions, and investigates whether the financial reforms introduced in the 80's have succeeded in relaxing financial constraints. The model allows both for an increasing cost of borrowing as the degree of leverage increases, and for a ceiling on the latter. The econometric results suggest that both types of capital market imperfections are important for small and young firms, but not for large ones. Moreover, the estimated equations do not provide evidence that financial reform in Ecuador has helped to relax these financial constraints.

198. Donald K. Richter, John Griffin and Richard Arnott, "Computation of Dynamic User Equilibria in a Model of Peak Period Traffic Congestion with Heterogenous Commuters" (04/1990: 632 Kb, Adobe Acrobat format)
