Archive: Working Papers 526-800
You can search the BC Economics Working Papers by author, title, keyword, JEL category, and abstract contents.
800. David Dillenberger and, "" (05/2012: PDF)
Abstract: Machina (2009, 2012) lists a number of situations where standard models of ambiguity aversion are unable to capture plausible features of ambiguity attitudes. Most of these problems arise in choice over prospects involving three or more outcomes. We show that the recursive non-expected utility model of Segal (1987) is rich enough to accommodate all these situations.
799. Christopher Conlon (Columbia) and, "" (rev. 08/2012: PDF; published in American Economic Journal: Microeconomics, 5:4, 1-3)
Abstract: Incomplete product availability is an important feature of many markets, and ignoring changes in availability may bias demand estimates. We study a new dataset from a wireless inventory system on vending machines to track product availability every four hours. The data allow us to account for product availability when estimating demand, and provide valuable variation for identifying substitution patterns when products stock out. We develop a procedure that allows for changes in product availability when availability is only observed periodically. We find significant differences in demand estimates: the corrected model predicts significantly larger impacts of stock-outs on profitability.
798. Christopher Conlon (Columbia) and, ""
Abstract: Product availability impacts many industries such as transportation, events, and retail, yet little empirical evidence documents the importance of stocking decisions for firm profits, vertical relationships, or consumers. We conduct several experiments, exogenously removing top-selling products from a set of vending machines and analyzing substitution patterns and profit impacts of the changed product availability using nonparametric analyses and structural demand estimation. We find substantial switching to alternate products, and evidence of misaligned incentives between upstream and downstream firms in the choice of which products to carry. We discuss the trade-offs of both empirical approaches for analyzing product availability effects generally.
797. Alon Harel (Hebrew University) and, "" (02/2012, PDF)
Abstract: Since Becker (1971), a common argument against asymmetric norms that promote minority rights over those of the majority is that such policies reduce total welfare. While this may be the case, we show that there are simple environments where aggregate sum of individual utilities is actually maximized under asymmetric norms that favor minorities. We thus maintain that without information regarding individual utilities one cannot reject or promote segregation-related policies based on utilitarian arguments.
796., "" (04/2012, PDF)
Abstract: Preferences may arise from regret, i.e., from comparisons with alternatives forgone by the decision maker. We show that when the choice set consists of pairwise statistically independent lotteries, transitive regret-based behavior is consistent with betweenness preferences and with a family of preferences that is characterized by a consistency property. Examples of consistent preferences include CARA, CRRA, and anticipated utility.
795.and Yoto V. Yotov (Drexel University), "" (01/2012: PDF)
Abstract: This paper provides striking confirmation of the restrictions of the structural gravity model of trade. Structural forces predicted by theory explain 95% of the variation of the fixed effects used to control for them in the recent gravity literature, fixed effects that in principle could reflect other forces. This validation opens avenues to inferring unobserved sectoral activity and multilateral resistance variables by equating fixed effects with structural gravity counterparts. Our findings also provide important validation of a host of general equilibrium comparative static exercises based on the structural gravity model.
794., Mustafa Caglayan (Heriot-Watt University) and Oleksandr Talavera (University of Sheffield), "" (rev. 10/2013: PDF; forthcoming, Manchester School)
Abstract: This paper empirically examines the role of diversification in export markets on firm-level R&D activities. We show that geographical sales diversification across different regions of the world induces UK firms to increase their R&D expenditures, as firms must innovate and develop new products to maintain a competitive edge over their rivals. This finding is robust to a battery of sensitivity checks. Furthermore, we find no evidence of reverse causality between R&D and sales diversification.
793.,,and Luis Serven (World Bank), "" (rev. 02/2016: PDF)
Abstract: We show how to relate the welfare of a country's infinitely-lived representative consumer to observable aggregate data. To a first order, welfare is summarized by total factor productivity and by the capital stock per capita. These variables suffice to calculate welfare changes within a country, as well as welfare differences across countries. The result holds regardless of the type of production technology and the degree of market competition. It applies to open economies as well, if total factor productivity is constructed using domestic absorption, instead of gross domestic product, as the measure of output. It also requires that total factor productivity be constructed with prices and quantities as perceived by consumers, not firms. Thus, factor shares need to be calculated using after-tax wages and rental rates and they will typically sum to less than one. These results are used to calculate welfare gaps and growth rates in a sample of developed countries with high-quality total factor productivity and capital data. Under realistic scenarios, the U.K. and Spain had the highest growth rates of welfare during the sample period 1985-2005, but the U.S. had the highest level of welfare.
792. Meghan Skira, "" (rev. 08/2013: PDF)
Abstract: This paper formulates and estimates a dynamic discrete choice model of elder parent care and work to analyze how caregiving affects a woman’s current and future labor force participation and wages. Intertemporal tradeoffs, such as decreased future earning capacity due to a current reduction in labor market work, are central to the decision to provide care. The existing literature, however, overlooks such long-term considerations. I depart from the previous literature by modeling caregiving and work decisions in an explicitly intertemporal framework. The model incorporates dynamic elements such as the health of the elderly parent, human capital accumulation and job offer availability. I estimate the model on a sample of women from the Health and Retirement Study by efficient method of moments. The estimates indicate that intertemporal tradeoffs matter considerably. In particular, women face low probabilities of returning to work or increasing work hours after a caregiving spell. Using the estimates, I simulate several government sponsored elder care policy experiments: a longer unpaid leave than currently available under the Family and Medical Leave Act of 1993; a paid work leave; and a caregiver allowance. The leaves encourage more work among intensive care providers since they guarantee a woman can return to her job, while the caregiver allowance discourages work. A comparison of the welfare gains generated by the policies shows that half the value of the paid leave can be achieved with the unpaid leave, and the caregiver allowance generates gains comparable to the unpaid leave.
791.and Krishna Pendakur (Simon Fraser University), "" (rev. 07/2013: PDF; previously circulated as "Generalized Random Coefficients With Equivalence Scale Applications")
Abstract: We model unobserved preference heterogeneity in demand systems via random Barten scales in utility functions. These Barten scales appear as random coefficients multiplying prices in demand functions. Consumer demands are nonlinear in prices and may have unknown functional structure. We therefore prove identification of Generalized Random Coefficients models, defined as nonparametric regressions where each regressor is multiplied by an unobserved random coefficient having an unknown distribution. Using Canadian data, we estimate energy demand functions with and without random coefficient Barten scales. We find that not accounting for this unobserved preference heterogeneity substantially biases estimated consumer-surplus costs of an energy tax.
790. Michele De Nadai (University of Padova) and, "" (rev. 07/2013: PDF)
Abstract: Measurement errors are often correlated, as in surveys where respondents' biases or tendencies to err affect multiple reported variables. We extend Schennach (2007) to identify moments of the conditional distribution of a true Y given a true X when both are measured with error, the measurement errors in Y and X are correlated, and the true unknown model of Y given X has nonseparable model errors. We also provide a nonparametric sieve estimator of the model, and apply it to nonparametric Engel curve estimation. In our application measurement errors on the expenditures of a good Y are by construction correlated with measurement errors in total expenditures X. This feature of most consumption data sets has been ignored in almost all previous demand applications. We find accounting for this feature casts doubt on Hildenbrand's (1994) "increasing dispersion" assumption.
789. Yingying Dong (University of California-Irvine),and Thomas Tao Yang, "" (rev. 05/2012: PDF; forthcoming, Canadian Journal of Economics)
Abstract: We discuss the relative advantages and disadvantages of four types of convenient estimators of binary choice models when regressors may be endogenous or mismeasured, or when errors are likely to be heteroskedastic. For example, such models arise when treatment is not randomly assigned and outcomes are binary. The estimators we compare are the two stage least squares linear probability model, maximum likelihood estimation, control function estimators, and special regressor methods. We specifically focus on models and associated estimators that are easy to implement. Also, for calculating choice probabilities and regressor marginal effects, we propose the average index function (AIF), which, unlike the average structural function (ASF), is always easy to estimate.
788., "" (rev. 02/2012: PDF; published as "Exchange Rate Pass-Through and Credit Constraints," Journal of Monetary Economics, 2013, 60:1, 25-38)
Abstract: The macroeconomic evidence on the short-term impact of exchange rates on exports and prices is notoriously weak. In this paper I examine the micro-foundations of this disconnect by looking at firms' export and price setting decisions in response to fluctuations in exchange rates and credit conditions using German firm survey data. Firm-level data on pricing and export expectations enables me to measure the instantaneous response of each firm to changing financial constraints and the EUR/USD exchange rate, which avoids endogeneity issues. I find that primarily large firms cause the exchange rate "puzzles" in aggregate data. The exchange rate disconnect disappears for financially constrained firms. For these firms, the pass-through rate of exchange rate changes to prices is more than twice the rate of unconstrained firms. Similarly, their export volumes are about twice as sensitive to exchange rate fluctuations. Credit therefore affects not only exports via trade finance, but also international relative prices by constraining the scope of feasible pricing policies. The effect of borrowing constraints is particularly strong during the recent financial crisis.
787. Chiu Yu Ko, "" (10/2011: PDF)
Abstract: This paper extends Bernheim and Whinston's (1986) menu auction model under transferable utilities to a framework with non-transferable utilities and budget constraints. Under appropriate definitions of equilibria, it is shown that every truthful Nash equilibrium (TNE) is a coalition-proof Nash equilibrium (CPNE) and that the set of TNE payoffs and the set of CPNE payoffs are equivalent, as in a transferable utility framework. The existence of a CPNE is assured in contrast with the possible non-existence of Nash equilibrium under the definition by Dixit, Grossman, and Helpman (1997). Moreover, the set of CPNE payoffs is equivalent to the bidder-optimal weak core.
786. Daniel Kim (RAND Corporation),, Michael Ganz (Abt Bio-Pharma Solutions, Inc.), S.V. Subramanian (Harvard School of Public Health), and Ichiro Kawachi (Harvard School of Public Health), "" (10/2011: PDF; published, Social Science & Medicine, 73:12, 1689-1697, 2011)
Abstract: Past observational studies of the associations of area-level/contextual social capital with health have revealed conflicting findings. However, interpreting this rapidly growing literature is difficult because estimates using conventional regression are prone to major sources of bias including residual confounding and reverse causation. Instrumental variable (IV) analysis can reduce such bias. Using data on up to 167,344 adults in 64 nations in the European and World Values Surveys and applying IV and ordinary least squares (OLS) regression, we estimated the contextual effects of country-level social trust on individual self-rated health. We further explored whether these associations varied by gender and individual levels of trust. Using OLS regression, we found higher average country-level trust to be associated with better self-rated health in both women (beta=0.051, 95% confidence interval 0.011 to 0.091, P=0.01) and men (beta=0.038, 0.0002 to 0.077, P=0.049). IV analysis yielded qualitatively similar results, although the estimates were more than double in size (in women, using country population density and corruption as instruments: beta=0.119, 0.028 to 0.209, P=0.005; in men: beta=0.115, 0.025 to 0.204, P=0.01). The estimated health effects of raising the percentage of a country's population that trusts others by 10 percentage points were at least as large as the estimated health effects of an individual developing trust in others. These findings were robust to alternative model specifications and instruments. Conventional regression and to a lesser extent IV analysis suggested that these associations are more salient in women and in women reporting social trust. In a large cross-national study, our findings, including those using instrumental variables, support the presence of beneficial effects of higher country-level trust on self-rated health. 
Past findings for contextual social capital using traditional regression may have underestimated the true associations. Given the close linkages between self-rated health and all-cause mortality, the public health gains from raising social capital within countries may be large.
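The contrast the abstract draws between OLS and instrumental-variable estimates can be sketched in a few lines. This is a hedged illustration with made-up numbers and a single synthetic instrument, not the paper's data or specification (the paper instruments country-level trust with population density and corruption):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
u = rng.normal(size=n)      # unobserved confounder: affects both trust and health
z = rng.normal(size=n)      # instrument: shifts trust but not health directly
trust = 0.8 * z + 0.6 * u + rng.normal(size=n)
health = 0.5 * trust - 0.9 * u + rng.normal(size=n)   # true causal effect: 0.5

# OLS of health on trust is contaminated by the confounder u.
X = np.column_stack([np.ones(n), trust])
beta_ols = np.linalg.lstsq(X, health, rcond=None)[0][1]

# 2SLS: project trust on the instrument, then regress health on the projection.
Z = np.column_stack([np.ones(n), z])
trust_hat = Z @ np.linalg.lstsq(Z, trust, rcond=None)[0]
X2 = np.column_stack([np.ones(n), trust_hat])
beta_iv = np.linalg.lstsq(X2, health, rcond=None)[0][1]

print(f"OLS:  {beta_ols:.3f} (pulled away from 0.5 by the confounder)")
print(f"2SLS: {beta_iv:.3f} (close to the true 0.5)")
```

In this invented setup OLS is biased downward because the confounder enters trust and health with opposite signs; in the paper's application the IV estimates were instead roughly double the OLS ones, which the authors read as conventional regression underestimating the true association.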
785. Murat Kurt (University of Pittsburgh), Mark S. Roberts (University of Pittsburgh), Andrew J. Schaefer (University of Pittsburgh), and, "Valuing Prearranged Paired Kidney Exchanges: A Stochastic Game Approach" (10/2011)
Abstract: End-stage renal disease (ESRD) is the ninth-leading cause of death in the U.S. Transplantation is the most viable renal replacement therapy for ESRD patients, but there is a severe disparity between the demand for kidneys for transplantation and the supply. This shortage is further complicated by incompatibilities in blood-type and antigen matching between patient-donor pairs. Paired kidney exchange (PKE), a cross-exchange of kidneys among incompatible patient-donor pairs, overcomes many difficulties in matching patients with incompatible donors. In a typical PKE, transplantation surgeries take place simultaneously so that no donor may renege after her intended recipient receives the kidney. Therefore, in a PKE, the occurrence of a transplantation requires compatibility among the pairs' willingnesses to exchange. We consider an arbitrary number of autonomous patients with probabilistically evolving health statuses in a prearranged PKE, and model their transplant timing decisions as a discrete-time non-zero-sum noncooperative stochastic game. We explore necessary and sufficient conditions for patients' decisions to be a stationary-perfect equilibrium, and formulate a mixed-integer linear programming representation of equilibrium constraints, which provides a characterization of the socially optimal stationary-perfect equilibria. We carefully calibrate our model using large-scale, nationally representative clinical data, and empirically confirm that randomized strategies, which are less consistent with clinical practice and with patient rationality, do not yield a significant social welfare gain over pure strategies. We also quantify the social welfare loss due to patient autonomy and demonstrate that maximizing the number of transplants may be undesirable. Our results highlight the importance of the timing of an exchange and of disease severity in matching patient-donor pairs.
784.and Parag A. Pathak (MIT), "" (01/2011; PDF)
Abstract: In Fall 2009, officials from Chicago Public Schools changed their assignment mechanism for coveted spots at selective college preparatory high schools midstream. After asking about 14,000 applicants to submit their preferences for schools under one mechanism, the district asked them to re-submit their preferences under a new mechanism. Officials were concerned that "high-scoring kids were being rejected simply because of the order in which they listed their college prep preferences" under the abandoned mechanism. What is somewhat puzzling is that the new mechanism is also manipulable. This paper introduces a method to compare mechanisms based on their vulnerability to manipulation. Under our notion, the old mechanism is more manipulable than the new Chicago mechanism. Indeed, the old Chicago mechanism is at least as manipulable as any other plausible mechanism. A number of similar transitions between mechanisms took place in England after the widely popular Boston mechanism was ruled illegal in 2007. Our approach provides support for these and other recent policy changes involving matching mechanisms.
783., "" (09/2011; PDF)
Abstract: Motivated by historically low retention rates of graduates at USMA and ROTC, the Army recently introduced branch-for-service incentive programs in which cadets could bid an additional three years of active duty service obligation to obtain higher priority for their desired career specialties. The full potential of this highly innovative program is not realized, due to the ROTC's choice of a poorly behaved cadet-branch matching mechanism. Not only does the ROTC mechanism effectively block the access of a large fraction of moderately high-skilled cadets to key career branches, but it is also highly vulnerable to preference manipulation and encourages effort reduction, potentially compromising the Army's human capital accumulation. Building on recent advances in matching markets, we propose a design that eliminates each of these deficiencies and also benefits the Army by mitigating several policy problems that the Army has identified. In contrast to the ROTC mechanism, our design utilizes market principles more elaborately, and it can be interpreted as a hybrid between a market mechanism and a priority-based allocation mechanism.
782.and Tobias B. Switzer (USAF), "" (05/2011; PDF)
Abstract: Branch selection is a key decision in a cadet's military career. Cadets at USMA can increase their branch priorities at a fraction of slots by extending their service agreement. This real-life matching problem fills an important gap in market design literature. Although priorities fail a key substitutes condition, the agent-optimal stable mechanism is well-defined, and in contrast to the current USMA mechanism it is fair, stable, and strategy-proof. Adoption of this mechanism benefits cadets and the Army. This new application shows that matching with contracts model is practically relevant beyond traditional domains that satisfy the substitutes condition.
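The agent-optimal stable mechanism referred to above is computed by agent-proposing deferred acceptance. The sketch below is a simplified, hypothetical version: it ignores the paper's matching-with-contracts structure (where a cadet can bid extra service time for higher priority) and uses plain quota-based priorities; all cadet and branch names are invented.

```python
def deferred_acceptance(cadet_prefs, branch_priorities, quotas):
    """cadet_prefs: cadet -> ordered list of acceptable branches;
    branch_priorities: branch -> cadets from highest to lowest priority;
    quotas: branch -> number of slots."""
    rank = {b: {c: i for i, c in enumerate(order)}
            for b, order in branch_priorities.items()}
    next_choice = {c: 0 for c in cadet_prefs}   # index of next branch to propose to
    held = {b: [] for b in branch_priorities}   # tentative assignments per branch
    free = list(cadet_prefs)
    while free:
        c = free.pop()
        if next_choice[c] >= len(cadet_prefs[c]):
            continue                            # list exhausted: cadet stays unmatched
        b = cadet_prefs[c][next_choice[c]]
        next_choice[c] += 1
        held[b].append(c)
        held[b].sort(key=lambda x: rank[b][x])  # best-priority cadets first
        if len(held[b]) > quotas[b]:
            free.append(held[b].pop())          # reject the lowest-priority cadet
    return {c: b for b, cs in held.items() for c in cs}

# Toy instance: three cadets, two branches with one slot each.
prefs = {"c1": ["armor", "aviation"], "c2": ["armor"], "c3": ["aviation", "armor"]}
prios = {"armor": ["c2", "c1", "c3"], "aviation": ["c1", "c3", "c2"]}
match = deferred_acceptance(prefs, prios, {"armor": 1, "aviation": 1})
print(match)   # stable outcome: c2 gets armor, c1 gets aviation, c3 is unmatched
```

Under responsive quota priorities like these, the cadet-proposing version is fair, stable, and strategy-proof for cadets; the paper's contribution is showing the analogous mechanism remains well-defined even when the branch-for-service feature violates the usual substitutes condition.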
781.and, "" (rev. 06/2013: PDF)
Abstract: Although a pilot national live-donor kidney exchange program was recently launched in the US, the kidney shortage is increasing faster than ever. A new solution paradigm is able to incorporate compatible pairs in exchange. In this paper, we consider an exchange framework that has both compatible and incompatible pairs, and patients are indifferent over compatible pairs. Only two-way exchanges are permitted due to institutional constraints. We explore the structure of Pareto-efficient matchings in this framework. The mathematical structure of this model turns out to be quite novel. We show that under Pareto-efficient matchings, the same number of patients receive transplants, and it is possible to construct Pareto-efficient matchings that match the same incompatible pairs while matching the least number of compatible pairs. We extend the celebrated Gallai-Edmonds Decomposition in the combinatorial optimization literature to our new framework. We also conduct comparative static exercises on how this decomposition changes as new compatible pairs join the pool.
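The claim that every Pareto-efficient matching serves the same number of patients can be checked by brute force on a toy pool. The sketch below is illustrative only (invented pairs and compatibility edges, not the paper's algorithm): a compatible pair would be transplanted by its own donor in any case, so in this two-way setting Pareto comparisons reduce to set inclusion over the incompatible pairs who are matched.

```python
from itertools import combinations

incompatible = {"I1", "I2", "I3"}   # pairs who need a two-way exchange to transplant
edges = [("I1", "I2"), ("I2", "I3"), ("I1", "I3")]   # feasible two-way exchanges

def all_matchings(edges):
    """Every subset of edges in which no pair appears twice."""
    out = []
    for k in range(len(edges) + 1):
        for sub in combinations(edges, k):
            nodes = [n for e in sub for n in e]
            if len(nodes) == len(set(nodes)):
                out.append(sub)
    return out

def served(matching):
    """Incompatible pairs who receive a transplant under this matching."""
    return frozenset(n for e in matching for n in e) & incompatible

outcomes = {served(m) for m in all_matchings(edges)}
# Pareto-efficient outcomes: no other matching serves a strict superset.
maximal = [s for s in outcomes if not any(s < t for t in outcomes)]
print(sorted(sorted(s) for s in maximal))
# three distinct efficient outcomes, each serving exactly 2 patients
```

On this triangle of mutually compatible exchanges there are several distinct Pareto-efficient matchings, but, in line with the result quoted above, they all transplant the same number of patients.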
780.and Yoto V. Yotov (Drexel University), "" (09/2011; PDF)
Abstract: This paper infers the terms of trade effects of Free Trade Agreements (FTAs) with the structural gravity model. Using panel data methods to resolve two-way causality between trade and FTAs, we estimate direct FTA effects on bilateral trade volume in 2-digit manufacturing goods from 1990 to 2002. We deduce the terms of trade changes implied by these volume effects for 40 countries plus a rest-of-the-world aggregate. Some gain over 10%, some lose less than 0.2%. Overall, using a novel measure of the change in iceberg melting, global efficiency rises 0.62%.
779. Anthony Creane (Michigan State University), Chiu Yu Ko (Boston College) and, "" (rev. 04/2013, PDF; published, Games and Economic Behavior, 82, 254-268, 2013)
Abstract: We examine a firm that can license its production technology to a rival when firms are heterogeneous in production costs. We show that a complete technology transfer from one firm to another always increases joint profit under weakly concave demand when at least three firms remain in the industry. A jointly profitable transfer may reduce social welfare, although a jointly profitable transfer from the most efficient firm always increases welfare. We also consider two auction games under complete information: a standard first-price auction and a menu auction by Bernheim and Whinston (1986). With a natural refinement of equilibria, we show that the resulting licensees are ordered by degree of efficiency: menu auction, simple auction, and joint-profit-maximizing licensees, in (weakly) descending order.
778. Anthony Hannagan (Boston College) and, "" (08/2011, PDF)
Abstract: Diamond and Mirrlees (1971) and Dasgupta and Stiglitz (1972) show that production efficiency is achieved under the optimal commodity tax when profit income is zero. Here, we consider the simplest possible model to analyze production efficiency in the presence of profit income: a tax reform problem in an economy with a representative consumer, two goods, and two firms with decreasing returns to scale technologies. We show that differentiating a uniform producer tax according to the inverse elasticity rule, while keeping government revenue constant, reduces additional distortions caused by the presence of profit income and improves social welfare.
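The inverse elasticity rule invoked in the abstract is the standard Ramsey prescription that proportional tax rates vary inversely with demand elasticities. A minimal numerical sketch, with assumed (invented) elasticities and an arbitrary revenue scale:

```python
# Assumed absolute demand elasticities for two taxed goods (illustrative only).
elasticities = {"good_1": 0.5, "good_2": 2.0}
scale = 0.10   # common factor, pinned down in practice by the revenue requirement

# Inverse elasticity rule: optimal proportional rates t_i proportional to 1/e_i.
rates = {g: scale / e for g, e in elasticities.items()}
print(rates)   # the less elastic good bears the higher tax rate
```

Here the inelastic good carries a 20% rate and the elastic good 5%, a distortion-minimizing pattern; the paper's point is that differentiating a producer tax in this direction also mitigates the extra distortions created by profit income.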
777., "" (rev. 04/2013, PDF; published, Journal of Urban Economics, 75, 68-79, 2013)
Abstract: Housing developments (condos and suburban developments) are not necessarily homogeneous. Developers provide different types of units with various sizes and other characteristics catering to different types of customers. In this paper, we allow local consumption externalities within each development: some consumers would be happy to pay high prices for the most prestigious units in the development, while some are happy to have modest units as long as the prices are low. We consider land developers who seek the optimal mix of units in developments to maximize their profits. We show that there exists an equilibrium, and that every equilibrium is Pareto efficient as long as consumers and developers are optimistic in a certain sense. We provide examples in which mixed developments are more profitable to the developers. We relate this work to the widely used hedonic pricing model of Rosen (1974), and to an equilibrium concept under asymmetric information due to Rothschild and Stiglitz (1976).
776.and Ryusuke Shinohara (Shinshu University), "" (08/2011, PDF; published, Journal of Public Economic Theory, 16, 173-195, 2014)
Abstract: We consider a public good provision game with voluntary participation. Agents participating in the game provide a public good and pay the fees according to a mechanism (allocation rule), while nonparticipants can free-ride on the participants. We examine how the equilibrium public good provision level is affected by enlarging the population of an economy. We introduce a condition for an allocation rule, the asymptotic uniform continuity in replication (AUCR), which requires that small changes in the population must yield only small changes in the public good provision and which is satisfied by many mechanisms. We show that under AUCR, the equilibrium level of the public good converges to zero as the economy is replicated in the sense of Milleron (1972).
775.and Sanjay K. Chugh (University of Maryland), "" (08/2011, PDF)
Abstract: We study Ramsey-optimal fiscal policy in an economy in which product varieties are the result of forward-looking investment decisions by firms. There are two main results. First, depending on the particular form of variety aggregation in preferences, firms' dividend payments may be either subsidized or taxed in the long run. This policy balances monopoly incentives for product creation with consumers' welfare benefit of product variety. In the most empirically relevant form of variety aggregation, socially efficient outcomes entail a substantial tax on dividend income, removing the incentive for over-accumulation of capital, which takes the form of variety. Second, optimal policy induces dramatically smaller, but efficient, fluctuations of both capital and labor markets than in a calibrated exogenous policy. Decentralization requires zero intertemporal distortions and constant static distortions over the cycle. The results relate to Ramsey theory, which we show by developing welfare-relevant concepts of efficiency that take into account product creation.
774.and Brent Bundick (FRB, Kansas City), "" (rev. 11/2015, PDF)
Abstract: Can increased uncertainty about the future cause a contraction in output and its components? An identified uncertainty shock in the data causes significant declines in output, consumption, investment, and hours worked. Standard general-equilibrium models with flexible prices cannot reproduce this comovement. However, uncertainty shocks can easily generate comovement with countercyclical markups through sticky prices. Monetary policy plays a key role in offsetting the negative impact of uncertainty shocks during normal times. Higher uncertainty has even more negative effects if monetary policy can no longer perform its usual stabilizing function because of the zero lower bound. We calibrate our uncertainty shock process using fluctuations in implied stock market volatility and show that the model with nominal price rigidity is consistent with empirical evidence from a structural vector autoregression. We argue that increased uncertainty about the future likely played a role in worsening the Great Recession.
773. Ariel Rubinstein (University of Tel Aviv) and, "" (rev. 02/2012, PDF)
Abstract: We investigate the procedure of "random sampling" where the alternatives are random variables. When comparing any two alternatives, the decision maker samples each of the alternatives once and ranks them according to the comparison between the two realizations. Our main result is that when applied to three alternatives, the procedure yields a cycle with a probability bounded above by 8/27. Bounds are also obtained for other related procedures.
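The random sampling procedure is easy to simulate. The Monte Carlo sketch below uses three invented discrete lotteries (not a worst-case construction), so its estimated cycle probability should land strictly below the paper's 8/27 bound:

```python
import random

random.seed(1)

# Three invented discrete lotteries (equally likely values, no ties across them).
A, B, C = [0, 6, 7], [1, 5, 9], [2, 3, 10]

def prefers(x, y):
    """One pairwise comparison: fresh independent draws, larger realization wins."""
    return random.choice(x) > random.choice(y)

trials = 200_000
cycles = 0
for _ in range(trials):
    ab, bc, ca = prefers(A, B), prefers(B, C), prefers(C, A)
    if ab == bc == ca:        # A>B>C>A or A<B<C<A: an intransitive cycle
        cycles += 1
print(f"estimated cycle probability: {cycles / trials:.3f} (bound 8/27 = {8/27:.3f})")
```

For these particular lotteries the exact cycle probability works out to 180/729, roughly 0.25, comfortably under the 8/27 bound of about 0.296.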
772., "" (02/2011, PDF)
Abstract: This paper uses a New Keynesian model with banks and deposits, calibrated to match the US economy, to study the macroeconomic effects of policies that pay interest on reserves. While their effects on output and inflation are small, these policies require important adjustments in the way that the monetary authority manages the supply of reserves, as liquidity effects vanish and households' portfolio shifts increase banks' demand for reserves when short-term interest rates rise. Money and monetary policy remain linked in the long run, however, since policy actions that change the price level must change the supply of reserves proportionately.
771., "" (rev. 04/2012, PDF)
Abstract: This paper asks the question, "How should the level of mass transit service be adjusted when road pricing is introduced for a substitute auto mode?" The reference point for the introduction of road pricing is second-best optimization in transit. Because this involves below-marginal-cost pricing in transit, it is efficient for road pricing to be accompanied by an increase in the transit fare and a reduction in service. This runs counter to the usual view of using the toll revenue generated by road pricing to provide a higher level of transit service.
770. Barak Medina (Hebrew University), Shlomo Naeh (Hebrew University) and, "" (rev. 02/2012, PDF)
Abstract: Transitivity is a fundamental requirement for consistency. Legal systems, especially when composed over time and by different agencies, may encounter non-transitive cycles. This paper discusses a new solution to such cycles, namely setting a hierarchy of the relevant rules or preferences. The hierarchy determines the sequence in which the rules or preferences are applied, and thus makes it possible to avoid non-transitive cycles. The paper provides a formal generalization of this solution, and demonstrates its possible application to anti-discrimination laws. It is also shown that this solution can be traced to the Rabbinic literature, starting with the Mishnah and the Talmud (1st–5th c. CE).
769. Joseph M. Ostroy (UCLA) and, "" (09/2010, PDF)
Abstract: We show that efficient anonymous incentive compatible (dominant strategy) mechanisms for public goods eliminate externalities, i.e., each individual is unable to change the welfare of anyone else. The characterization is used to derive existence and non-existence results for models with a finite number of individuals and to explain existence results in the continuum. A similar characterization and conclusions are demonstrated for private goods in [7]. However, unlike private goods, elimination of externalities with public goods implies that individuals cannot change the outcome. Hence, such mechanisms provide only weak incentives for truth-telling.
768. Alon Harel (Hebrew University) and, "" (01/2011, PDF)
Abstract: This paper defends the use of asymmetric norms which grant greater privileges to minorities than to majorities. The norms we discuss include norms facilitating the establishment or prohibition of minority-only or majority-only institutions, neighborhoods, or associations. While traditionally the primary arguments favoring minorities' privileges have been based on considerations of fairness or justice, we show that there are simple environments where asymmetric norms would maximize the aggregate sum of individual utilities. A utilitarian may therefore support the establishment of black colleges or Hassidic-only neighborhoods while at the same time opposing the exclusion of blacks or Jews from white or Christian neighborhoods.
767. Aline Bütikofer (University of Bern),and, "" (rev. 01/2011, PDF)
Abstract: Using data on elderly individuals and couples, we estimate a collective model of household consumption of a variety of goods, showing how resources are shared between husband and wife, and how this allocation is affected by retirement and health status. We identify the extent to which shared consumption of some goods by elderly married couples reduces their costs of living relative to living alone. We also identify the fraction of household resources consumed by wives versus husbands, taking this jointness of some consumption into account. The results are relevant for household bargaining models and for a variety of welfare calculations.
766., "" (rev. 01/2012, PDF)
Abstract: Slow mean reversion of real exchange rates is commonly considered a result of border frictions that remain despite integration of financial and goods markets. This paper shows that even if border frictions decline, a contemporaneous decline in output shock variance can in fact slow down mean reversion. It proposes a new method of estimating border cost from time-series data only, without relying on within-country variation. Applying this method to the real exchange rate of final goods and a novel measure of the real exchange rate for productive capabilities, such as technology and know-how, yields very different border cost estimates. During the years 1974–2008, a relocation reduces productive capability by 22% for the average country pair, whereas it reduces final goods by only 15%. The real exchange rate for final goods takes more than two years to revert to purchasing power parity, more than twice as long as productive capabilities.
765.and Viktors Stebunovs (Board of Governors of the Federal Reserve System), "" (12/2010, PDF)
Abstract: This paper studies the domestic and international effects of the transition to an interstate banking system implemented by the U.S. since the late 1970s in a dynamic, stochastic, general equilibrium model with endogenous producer entry. Interstate banking reduces the degree of local monopoly power of financial intermediaries. We show that an economy that implements this form of deregulation experiences increased producer entry, real exchange rate appreciation, and a current account deficit. The rest of the world experiences a long-run increase in GDP and consumption. Less monopoly power in financial intermediation results in less volatile business creation, reduced markup countercyclicality, and weaker substitution effects in labor supply in response to productivity shocks. Bank market integration thus contributes to a moderation of firm-level and aggregate output volatility. In turn, trade and financial ties between the two countries in our model also allow the foreign economy to enjoy lower GDP volatility in most scenarios we consider. The results of the model are consistent with features of the U.S. and international business cycle after the U.S. began its transition to interstate banking.
764., Mark E Schaffer (Heriot-Watt University) and Steven Stillman (University of Waikato), "" (12/2010, PDF; published,Journal of Economic Surveys, 2011, 25:2, 380-394)
Abstract: We review the Stata statistical package and evaluate its suitability for applied research.
763., "" (11/2010: PDF)
Abstract: Although theologians and economists communicate their ideas to different professional audiences in different ways, we agree on many basic points. We agree, for instance, that all too often, a large gap appears between "what is" and "what should be." We agree, more specifically, that unregulated markets lead to undesirable and perhaps even disastrous environmental degradation. And we view with great suspicion government policies that redistribute wealth perversely, away from the needy and towards the affluent. But while economists share theologians' concerns about the problems that economic development brings, we can also point to benefits that come with rising income levels.
762. Alev Atak (Queen Mary), Oliver Linton (London School of Economics) and, "" (09/2010: PDF)
Abstract: This paper is concerned with developing a semiparametric panel model to explain the trend in UK temperatures and other weather outcomes over the last century. We work with the monthly averaged maximum and minimum temperatures observed at the twenty-six Meteorological Office stations. The data form an unbalanced panel. We allow the trend to evolve in a nonparametric way so that we obtain a fuller picture of the evolution of common temperature in the medium timescale. Profile likelihood estimators (PLE) are proposed and their statistical properties are studied. The proposed PLE has improved asymptotic properties compared with the sequential two-step estimators. Finally, forecasting based on the proposed model is studied.
761. Zongwu Cai (University of North Carolina at Charlotte) and, "" (11/2010: PDF)
Abstract: We study quantile regression estimation for dynamic models with partially varying coefficients so that the values of some coefficients may be functions of informative covariates. Estimation of both parametric and nonparametric functional coefficients is proposed. In particular, we propose a three-stage semiparametric procedure. Both consistency and asymptotic normality of the proposed estimators are derived. We demonstrate that the parametric estimators are root-n consistent and the estimation of the functional coefficients is oracle. In addition, efficiency of parameter estimation is discussed and a simple efficient estimator is proposed. A simple and easily implemented test of the varying-coefficient hypothesis is proposed. A Monte Carlo experiment is conducted to evaluate the performance of the proposed estimators.
760., "" (09/2010: PDF)
Abstract: The paper is based on a talk given in April 2010. It traces the roots of the recent financial crisis from development of the limited liability corporation, to separation of ownership from control, tax incentives for debt financing, investment bankers moving to the corporate form, government backstopping the lenders, and the failure of the quants. Finally, it considers the Minsky hypothesis, under which financial markets are inherently unstable.
759. Yingying Dong (California State University, Irvine) and, "" (rev. 12/2012; PDF; previously circulated as "Regression Discontinuity Marginal Threshold Treatment Effects")
Abstract: Regression discontinuity models, where the probability of treatment jumps discretely when a running variable crosses a threshold, are commonly used to nonparametrically identify and estimate a local average treatment effect. We show that the derivative of this treatment effect with respect to the running variable is nonparametrically identified and easily estimated. Then, given a local policy invariance assumption, we show that this derivative equals the change in the treatment effect that would result from a marginal change in the threshold, which we call the marginal threshold treatment effect (MTTE). We apply this result to Manacorda (2012), who estimates a treatment effect of grade retention on school outcomes. Our MTTE identifies how this treatment effect would change if the threshold for retention was raised or lowered, even though no such change in threshold is actually observed.
758. Geoffrey Dunbar (Simon Fraser University),and Krishna Pendakur (Simon Fraser University), "" (rev. 01/2012, PDF; published,American Economic Review, 2013, 103, 438-471)
Abstract: The share of household resources devoted to children is hard to identify, because consumption is measured at the household level, and goods can be shared. Using semiparametric restrictions on individual preferences within a collective model, we identify how total household resources are divided up among household members, by observing how each family member's expenditures on a single private good like clothing vary with income and family size. Using data from Malawi we show how resources devoted to wives and children vary by family size and structure, and we find that standard poverty indices understate the incidence of child poverty.
757. Juan Carlos Escanciano (Universidad Carlos III de Madrid), Stefan Hoderlein (Boston College), (Boston College), Oliver Linton (London School of Economics), and Sorawoot Srisuma (University of Surrey), "Nonparametric Euler Equation Identification and Estimation" (rev. 03/2020; PDF)
Abstract: We consider nonparametric identification and estimation of pricing kernels, or equivalently of marginal utility functions up to scale, in consumption based asset pricing Euler equations. Ours is the first paper to prove nonparametric identification of Euler equations under low level conditions (without imposing functional restrictions or just assuming completeness). We also propose a novel nonparametric estimator based on our identification analysis, which combines standard kernel estimation with the computation of a matrix eigenvector problem. Our estimator avoids the ill-posed inverse issues associated with nonparametric instrumental variables estimators. We derive limiting distributions for our estimator and for relevant associated functionals. A Monte Carlo shows a satisfactory finite sample performance for our estimators.
756. Juan Carlos Escanciano (Indiana University), David Jacho-Chávez (Indiana University) and, "" (rev. 01/2012; PDF; previously circulated as "Uniform Convergence for Semiparametric Two Step Estimators and Tests")
Abstract: A new uniform expansion is introduced for sums of weighted kernel-based regression residuals from nonparametric or semiparametric models. This result is useful for deriving asymptotic properties of semiparametric estimators and test statistics with data-dependent bandwidth, random trimming, and estimated weights. An extension allows for generated regressors, without requiring the calculation of functional derivatives. Example applications are provided for a binary choice model with selection, including a new semiparametric maximum likelihood estimator, and a new directional test for correct specification of the average structural function. An extended Appendix contains general results on uniform rates for kernel estimators, additional applications, and primitive sufficient conditions for high level assumptions.
755. Sokbae Lee (University College London) and, "" (rev. 06/2011; PDF)
Abstract: We provide new conditions for identification of accelerated failure time competing risks models. These include Roy models and some auction models. In our set up, unknown regression functions and the joint survivor function of latent disturbance terms are all nonparametric. We show that this model is identified given covariates that are independent of latent errors, provided that a certain rank condition is satisfied. We present a simple example in which our rank condition for identification is verified. Our identification strategy does not depend on identification at infinity or near zero, and it does not require exclusion assumptions. Given our identification, we show estimation can be accomplished using sieves.
754., "" (rev. 09/2012; PDF)
Abstract: Credit limit variability is a crucial aspect of the consumption, savings, and debt decisions of households in the United States. Using a large panel, this paper first demonstrates that individuals gain and lose access to credit frequently and often have their credit limits reduced unexpectedly. Credit limit volatility is larger than most estimates of income volatility and varies over the business cycle. While typical models of intertemporal consumption fix the credit limit, I introduce a model with variable credit limits. Variable credit limits create a reason for households to hold both high interest debts and low interest savings at the same time, since the savings act as insurance. Simulating the model using the estimates of credit limit volatility, I show that it explains all of the credit card puzzle: why around a third of households in the United States hold both debt and liquid savings at the same time. The approach also offers an important new channel through which financial system uncertainty can affect household decisions.
753., "" (rev. 12/2014; PDF)
Abstract: How does banking affect development, and does banking affect all sectors equally, or change the structure of the economy? Since banking tends to grow with the rest of the economy, these questions are difficult to answer. This paper examines the growth of the national banking system from 1870 to 1900, an important period in the financial and economic development of the United States. I create a new data set on individual banks and place them geographically. Minimum capital requirements limited the expansion of banks, and I use these requirements to identify the effects of additional banking. Banking was very important: the opening of a bank with the minimum capital increased total production by 12% for counties close to the dividing line between getting a bank and not. Both manufacturing and farming benefited, suggesting that the commercial, as opposed to investment, activities of banks were very important. Banks increased the inequality in farm size after a decade, largely through the expansion of larger farms, but had no effect on yields. Although the literature on banking often focuses on investment, commercial banking, either through direct currency creation or bills of exchange to facilitate the movement of goods, appears to be an important part of banking activities during development.
752.and Hajo Holzmann (University of Karlsruhe), "" (08/2008; PDF)
Abstract: In this paper we are concerned with analyzing the behavior of a semiparametric estimator which corrects for endogeneity in a nonparametric regression by assuming mean independence of residuals from instruments only. Because it is common in many applications, we focus on the case where endogenous regressors and additional instruments are jointly normal, conditional on exogenous regressors. This leads to a severely ill-posed inverse problem. In this setup, we show first how to test for conditional normality. More importantly, we then establish how to exploit this knowledge when constructing an estimator, and we derive results characterizing the large sample behavior of such an estimator. In addition, in a Monte Carlo experiment we analyze the finite sample behavior of the proposed estimator. Our application comes from consumer demand. We obtain new and interesting findings that highlight both the advantages, and the difficulties of an approach which leads to ill-posed inverse problems. Finally, we discuss the somewhat problematic relationship between nonparametric instrumental variable models, and the recently emphasized issue of unobserved heterogeneity in structural models.
751., "" (12/2008; PDF)
Abstract: This paper discusses new developments in nonparametric econometric approaches related to empirical modeling of demand decisions. It shows how diverse recent approaches are, and what new modeling options arise in practice. We review work on nonparametric identification using nonseparable functions, semi- and nonparametric estimation approaches involving inverse problems, and nonparametric testing approaches. We focus on classical consumer demand systems with continuous quantities, and do not consider approaches that involve discrete consumption decisions as are common in empirical industrial organization. Our intention is to give a subjective account of the usefulness of these various methods for applications in the field.
750.and Joachim Winter (University of Munich), "" (06/2009; PDF)
Abstract: This paper considers measurement error from a new perspective. In surveys, response errors are often caused by the fact that respondents recall past events and quantities imperfectly. We explore the consequences of recall errors for such key econometric issues as the identification of marginal effects or economic restrictions in structural models. Our identification approach is entirely nonparametric, using Matzkin-type nonseparable models that nest a large class of potential structural models. We establish that measurement errors due to poor recall are generally likely to exhibit nonstandard behavior, in particular be nonclassical and differential, and we provide means to deal with this situation. Moreover, our findings suggest that conventional wisdom about measurement errors may be misleading in many economic applications. For instance, under certain conditions left-hand side recall errors will be problematic even in the linear model, and quantiles will be less robust than means. Finally, we apply the main concepts put forward in this paper to real world data, and find evidence that underscores the importance of focusing on individual response behavior.
749.Berthold R. Haag (HypoVereinsbank),and Sonya Mihaleva (Brown University), "" (09/2009; PDF)
Abstract: Homogeneity of degree zero has often been rejected in empirical studies that employ parametric models. This paper proposes a test for homogeneity that does not depend on the correct specification of the functional form of the empirical model. The test statistic we propose is based on kernel regression and extends nonparametric specification tests to systems of equations with weakly dependent data. We discuss a number of practically important issues and further extensions. In particular, we focus on a novel bootstrap version of the test statistic. Moreover, we show that the same test also allows one to assess the validity of functional form assumptions. When we apply the test to British household data, we find homogeneity generally well accepted. In contrast, we reject homogeneity with a standard almost ideal parametric demand system. Using our test for functional form, we find, however, that it is precisely this functional form assumption that is rejected. Our findings indicate that the rejections of homogeneity obtained thus far are due to misspecification of the functional form and not due to incorrectness of the homogeneity assumption.
748., "" (06/2009; PDF)
Abstract: Rationality places strong restrictions on individual consumer behavior. This paper is concerned with assessing the validity of the integrability constraints imposed by standard utility maximization, arising in classical consumer demand analysis. More specifically, we characterize the testable implications of negative semidefiniteness and symmetry of the Slutsky matrix across a heterogeneous population without assuming anything about the functional form of individual preferences. In the same spirit, homogeneity of degree zero is also considered. Our approach employs nonseparable models and is centered around a conditional independence assumption, which is sufficiently general to allow for endogenous regressors. It is the only substantial assumption a researcher has to specify in this model, and has to be evaluated with particular care. Finally, we apply all concepts to British household data: We show that rationality is an acceptable description for large parts of the population, regardless of whether we test on single or composite households.
747., "" (rev. 09/2014; PDF)
Abstract: In this paper we consider endogenous regressors in the binary choice model under a weak median exclusion restriction, but without further specification of the distribution of the unobserved random components. Our reduced form specification with heteroscedastic residuals covers various heterogeneous structural binary choice models. As a particularly relevant example of a structural model where no semiparametric estimator has yet been analyzed, we consider the binary random utility model with endogenous regressors and heterogeneous parameters. We employ a control function IV assumption to establish identification of a slope parameter beta by the mean ratio of derivatives of two functions of the instruments. We propose an estimator based on direct sample counterparts, and discuss the large sample behavior of this estimator. In particular, we show root-n consistency and derive the asymptotic distribution. In the same framework, we propose tests for heteroscedasticity, overidentification and endogeneity. We analyze the small sample performance through a simulation study. An application of the model to discrete choice demand data concludes this paper.
746.and Halbert White (University of California-San Diego), "" (07/2009; PDF)
Abstract: This paper is concerned with extending the familiar notion of fixed effects to nonlinear setups with infinite dimensional unobservables like preferences. The main result is that a generalized version of differencing identifies local average structural derivatives (LASDs) in very general nonseparable models, while allowing for arbitrary dependence between the persistent unobservables and the regressors of interest even if there are only two time periods. These quantities specialize to well known objects like the slope coefficient in the semiparametric panel data binary choice model with fixed effects. We extend the basic framework to include dynamics in the regressors and time trends, and show how distributional effects as well as average effects are identified. In addition, we show how to handle endogeneity in the transitory component. Finally, we adapt our results to the semiparametric binary choice model with correlated coefficients, and establish that average structural marginal probabilities are identified. We conclude this paper by applying the last result to a real world data example. Using the PSID, we analyze the way in which the lending restrictions for mortgages eased between 2000 and 2004.
745.and Jörg Stoye (New York University), "" (08/2009; PDF)
Abstract: This paper explores the empirical content of the weak axiom of revealed preference (WARP) for repeated cross-sectional data or for panel data where individuals experience preference shocks. Specifically, in a heterogeneous population, think of the fraction of consumers violating WARP as the parameter of interest. This parameter depends on the joint distribution of choices over different budget sets. Repeated cross-sections do not reveal this distribution but only its marginals. Thus, the parameter is not point identified but can be bounded. We frame this as a copula problem and use copula techniques to analyze it. The bounds, as well as some nonparametric refinements of them, correspond to intuitive behavioral assumptions in the two goods case. With three or more goods, these intuitions break down, and plausible assumptions can have counterintuitive implications. Inference on the bounds is an application of partial identification through moment inequalities. We implement our analysis with the British Family Expenditure Survey (FES) data. Upper bounds are frequently positive but lower bounds not significantly so, hence FES data are consistent with WARP in a heterogeneous population.
744.Halbert White (University of California-San Diego),and Xun Lu (Hong Kong University of Science and Technology), "" (08/2010; PDF)
Abstract: The causal notions embodied in the concept of Granger causality have been argued to belong to a different category than those of Judea Pearl's Causal Model, and so far their relation has remained obscure. Here, we demonstrate that these concepts are in fact closely linked by showing how each relates to straightforward notions of direct causality embodied in settable systems, an extension and refinement of the Pearl Causal Model designed to accommodate optimization, equilibrium, and learning. We then provide straightforward practical methods to test for direct causality using tests for Granger causality.
743., "" (06/2010: PDF)
742.Onur Kesten (Carnegie Mellon University), Morimitsu Kurino (Maastricht University) and, "" (rev. 05/2011: PDF)
Abstract: This paper studies the problem of assigning a set of indivisible objects to a set of agents when monetary transfers are not allowed. We offer two characterizations of the prominent lottery assignment mechanism called the probabilistic serial. We show that it is the only mechanism satisfying non-wastefulness and ordinal fairness. Our second result shows that a direct ordinal mechanism satisfies ordinal efficiency, envy-freeness, and upper invariance if and only if it is the probabilistic serial.
741., "" (rev. 05/2011; PDF; published as "The effects of financial development in the short and long run: Theory and evidence from India", 2013.Journal of Development Economics104, 56-72)
Abstract: Although many think of extending financial access as a means of reducing poverty, empirical studies have produced contradictory results. One problem that many studies share is that they cover only a short time frame and do not examine dynamic effects. I show that allowing effects to differ over time is crucial to understanding changes in access to credit when precautionary motives are important. While the introduction of credit initially creates a boom in consumption and reduces poverty, in the long term credit reduces mean consumption because access to credit reduces the steady state stock of wealth. Using consistent consumption data that cover a much longer time period than most studies, my empirical findings show that increased access to bank branches in rural India increased consumption initially, but consumption later fell, although the long term effect was still slightly positive.
740.Alessandro Barattieri,and, "" (06/2010; PDF; published,American Economic Journal: Macroeconomics, 6, 70-101)
Abstract: Nominal wage stickiness is an important component of recent medium-scale structural macroeconomic models, but to date there has been little microeconomic evidence supporting the assumption of sluggish nominal wage adjustment. We present evidence on the frequency of nominal wage adjustment using data from the Survey of Income and Program Participation (SIPP) for the period 1996-1999. The SIPP provides high-frequency information on wages, employment and demographic characteristics for a large and representative sample of the US population. The main results of the analysis are as follows. 1) After correcting for measurement error, wages appear to be very sticky. In the average quarter, the probability that an individual will experience a nominal wage change is between 5 and 18 percent, depending on the samples and assumptions used. 2) The frequency of wage adjustment does not display significant seasonal patterns. 3) There is little heterogeneity in the frequency of wage adjustment across industries and occupations. 4) The hazard of a nominal wage change first increases and then decreases, with a peak at 12 months. 5) The probability of a wage change is positively correlated with the unemployment rate and with the consumer price inflation rate.
739., "" (06/2010; PDF)
Abstract: John Commons' influence in American labor economics was eclipsed after World War II by a resurgent neoclassical labor economics that gradually relegated Commons' institutional orientation to the periphery of economic discourse. A common opinion is that the work of institutional economists in the Commons tradition was largely descriptive and lacked theoretical content. Commons, however, regarded his Institutional Economics as a work of economic theory. This paper contains a description of the theoretical core of Institutional Economics and an evaluation of it from the perspective of its potential usefulness in the teaching of modern labor economics. Part I describes the theoretical perspective of neoclassical economic theory in order to clarify the institutional perspective by contrast. Part II describes Commons’ alternative perspective. Part III presents the conclusions derived from this comparison of the two alternative perspectives.
738., "" (05/2010; PDF)
Abstract: The method of indirect least squares (ILS) using a proxy for a discrete instrument is shown to identify a weighted average of local treatment effects. The weights are nonnegative if and only if the proxy is intensity preserving for the instrument. A similar result holds for instrumental variables (IV) methods such as two stage least squares. Thus, one should carefully interpret estimates for causal effects obtained via ILS or IV using an error-laden proxy of an instrument, a proxy for an instrument with missing or imputed observations, or a binary proxy for a multivalued instrument. Favorably, the proxy need not satisfy all the assumptions required for the instrument. Specifically, an individual's proxy can depend on others' instrument, and the proxy need not affect the treatment nor be exogenous. In special cases, such as with a binary instrument, ILS using any suitable proxy for an instrument identifies local average treatment effects.
737.Onur Kesten (Carnegie Mellon University) and, "" (rev. 06/2012: PDF)
Abstract: A new centralized mechanism was introduced in New York City and Boston to assign students to public schools in district school-choice programs. This mechanism was advocated for its superior fairness property, among others, over the mechanisms it replaced. In this paper, we introduce a new framework for investigating school-choice matching problems and two ex-ante notions of fairness in lottery design, strong ex-ante stability and ex-ante stability. This framework generalizes known one-to-many two-sided and one-sided matching models. We first show that the new NYC/Boston mechanism fails to satisfy these fairness properties. We then propose two new mechanisms, the fractional deferred-acceptance mechanism, which is ordinally Pareto dominant within the class of strongly ex-ante stable mechanisms, and the fractional deferred-acceptance and trading mechanism, which satisfies equal treatment of equals and constrained ordinal Pareto efficiency within the class of ex-ante stable mechanisms.
736.Michael Belongia (University of Mississippi) and, "" (05/2010; PDF)
Abstract: This paper extends a New Keynesian model to include roles for currency and deposits as competing sources of liquidity services demanded by households. It shows that, both qualitatively and quantitatively, the Barnett critique applies: While a Divisia aggregate of monetary services tracks the true monetary aggregate almost perfectly, a simple-sum measure often behaves quite differently. The model also shows that movements in both quantity and price indices for monetary services correlate strongly with movements in output following a variety of real and nominal shocks. Finally, the analysis characterizes the optimal monetary policy response to shocks that originate in an explicitly-modeled financial sector.
735., "" (04/2010; PDF)
Abstract: With an estimated New Keynesian model, this paper compares the "great recession" of 2007-09 to its two immediate predecessors in 1990-91 and 2001. The model attributes all three downturns to a similar mix of aggregate demand and supply disturbances. The most recent series of adverse shocks lasted longer and became more severe, however, prolonging and deepening the great recession. In addition, the zero lower bound on the nominal interest rate prevented monetary policy from stabilizing the US economy as it had previously; counterfactual simulations suggest that without this constraint, output would have recovered sooner and more quickly in 2009.
734.Halbert White (University of California-San Diego) and, "" (08/2008; PDF)
Abstract: This paper demonstrates the extensive scope of an alternative to standard instrumental variables methods, namely covariate-based methods, for identifying and estimating effects of interest in general structural systems. As we show, commonly used econometric methods, specifically parametric, semi-parametric, and nonparametric extremum or moment-based methods, can all exploit covariates to estimate well-identified structural effects. The systems we consider are general, in that they need not impose linearity, separability, or monotonicity restrictions on the structural relations. We consider effects of multiple causes; these may be binary, categorical, or continuous. For continuous causes, we examine both marginal and non-marginal effects. We analyze effects on aspects of the response distribution generally, defined by explicit or implicit moments or as optimizers (e.g., quantiles). Key for identification is a specific conditional exogeneity relation. We examine what happens in its absence and find that identification generally fails; nevertheless, as we show, local and near-identification results still hold.
733. Halbert White (University of California-San Diego) and, "" (03/2010; PDF)
Abstract: We give two new approaches to testing a conditional form of exogeneity. This condition ensures unconfoundedness and identification of effects of interest in structural systems. As these approaches do not rely on the absence of causal effects of treatment under the null, they complement earlier methods of Rosenbaum (1987) and Heckman and Hotz (1989).
732., "" (01/2010; forthcoming in The Gravity Model in International Trade: Advances and Applications, Steven Brakman and Peter Bergeijk, eds.)
Abstract: The high trade costs inferred from gravity are rarely used in the wide class of trade models. Two related problems explain this omission of a key explanatory variable. First, national seller and buyer responses to trade costs depend on their incidence rather than on the full cost. Second, the high dimensionality of bilateral trade costs requires aggregation for most practical uses in interpretation or standard trade modeling. This paper provides an intuitive description of a resolution to the aggregation and incidence problems. For each product, it is as if each province or country sells to a world market containing all buyers and buys from that market containing all sellers, the incidence of aggregated bilateral trade costs being divided between sellers and buyers according to their location. Measures of incidence described here give intuitive insight into the consequences of geography, illustrated with results from Anderson and Yotov (2008). The integration of the incidence measures with standard general equilibrium structure opens the way to richer applied general equilibrium models and better empirical work on the origins of comparative advantage.
731. and Will Martin (World Bank), "" (01/2010; forthcoming, Journal of Public Economic Theory)
Abstract: The fact that raising taxes can increase taxed labor supply through income effects is frequently used to justify greater public good provision than indicated by traditional, compensated analyses. We develop a model including multiple public goods and taxes and derive consistent measures of the marginal benefit of public goods and their marginal social cost inclusive of tax distortions using both compensated and uncompensated measures of the Marginal Cost of Funds (MCF). Our analysis confirms that the desirability of tax financed public projects is independent of whether compensated or uncompensated methods are used. The main innovation shows that the costs or benefits of providing particular public goods should be adjusted by a simple, benefit multiplier not previously seen in the literature if an uncompensated MCF is used.
730. Oleg Badunenko (DIW Berlin), and Dorothea Schäfer (DIW Berlin), "" (03/2010, PDF)
Abstract: The paper investigates whether the presence and tenure of Private Equity (PE) investment in European companies improves their performance. Previous studies documented the unambiguous merit of a buyout during the 1980s and 1990s for listed firms in the US and UK markets. This study analyzes such influences in both listed and unlisted European firms during 2002-2007. Our analysis suggests that short-term PE investments have, on average, a detrimental effect on firm performance. The performance of a firm that has PE backing is lower than that of a firm without PE backing in the first year of PE investment. Such an effect disappears if PE investments remain in the firm for an uninterrupted six-year term.
729. Fuhito Kojima (Stanford University) and, "" (rev. 10/2011: PDF)
Abstract: The Boston mechanism is a popular student-placement mechanism in school-choice programs around the world. We provide two characterizations of the Boston mechanism. We introduce a new axiom, respect of preference rankings. A mechanism is the Boston mechanism for some priority if and only if it respects preference rankings and satisfies consistency, resource monotonicity, and an auxiliary invariance property. In environments where each type of object has exactly one unit, as in house allocation, a characterization is given by respect of preference rankings, individual rationality, population monotonicity, and the auxiliary invariance property.
728.,,and Luis Serven (World Bank), "" (rev. 06/2010: PDF)
Abstract: We prove that the change in welfare of a representative consumer is summarized by the current and expected future values of the standard Solow productivity residual. The equivalence holds if the representative household maximizes utility while taking prices parametrically. This result justifies TFP as the right summary measure of welfare (even in situations where it does not properly measure technology) and makes it possible to calculate the contributions of disaggregated units (industries or firms) to aggregate welfare using readily available TFP data. Based on this finding, we compute firm and industry contributions to welfare for a set of European OECD countries (Belgium, France, Great Britain, Italy, Spain), using industry-level (EU-KLEMS) and firm-level (Amadeus) data. After adding further assumptions about technology and market structure (firms minimize costs and face common factor prices), we show that welfare change can be decomposed into three components that reflect, respectively, technical change, aggregate distortions, and allocative efficiency. Then, using theoretically appropriate firm-level data, we assess the importance of each of these components as sources of welfare improvement in the same set of European countries.
727., "" (11/2009: PDF)
Abstract: This paper studies the effects of international openness and contracting institutions on vertical integration. It first derives a number of predictions regarding the interactions between trade barriers, contracting costs, technology intensity, and the extent of vertical integration from a simple model with incomplete contracts. Then it investigates these predictions using a new dataset of over 14,000 firms from 45 developing countries. Consistent with theory, the effect of technology intensity of domestic producers on their likelihood to vertically integrate is decreasing in the quality of domestic contracting institutions and in international openness. Contract enforcement costs are particularly high in developing countries, and their effects on the vertical structure of technologically intensive firms may have significant welfare costs. If improving domestic contracting institutions is not feasible, an equivalent solution is to increase openness to international trade. This would discipline domestic suppliers, reducing the need for vertical integration.
726., Atreya Chakraborty (University of Massachusetts-Boston), Liyan Han (Beihang University) and Boyan Liu (Beihang University), "" (11/2009: PDF; published, Applied Economics, 2012, 44:4, 515-525)
Abstract: We find that U.S. corporations' demand for liquidity is sensitive to two important factors: uncertainty facing the firm and the quality of corporate governance. Following prior research, we find that both factors have important influences on firms' cash holdings. Our results also indicate that the interactions between uncertainty and governance measures are significant. From a policy perspective, these new findings indicate both governance and the nature of uncertainty may play an important role in managing liquidity risks. Policy recommendations may not only be limited to changes in financial policy but may also include changes in corporate governance.
725. and Roger Koenker (University of Illinois Urbana-Champaign), "" (03/2009: PDF)
Abstract: Conditional quantile estimation is an essential ingredient in modern risk management. Although GARCH processes have proven highly successful in modeling financial data, it is generally recognized that it would be useful to consider a broader class of processes capable of representing more flexibly both asymmetry and tail behavior of conditional returns distributions. In this paper, we study estimation of conditional quantiles for GARCH models using quantile regression. Quantile regression estimation of GARCH models is highly nonlinear; we propose a simple and effective two-step approach of quantile regression estimation for linear GARCH time series. In the first step, we employ a quantile autoregression sieve approximation for the GARCH model by combining information over different quantiles; second stage estimation for the GARCH model is then carried out based on the first stage minimum distance estimation of the scale process of the time series. Asymptotic properties of the sieve approximation, the minimum distance estimators, and the final quantile regression estimators employing generated regressors are studied. These results are of independent interest and have applications in other quantile regression settings. Monte Carlo and empirical application results indicate that the proposed estimation methods outperform some existing conditional quantile estimation methods.
724. and Chi Wan (Carleton University), "" (rev. 03/2010: PDF; published, Applied Financial Economics, 20, 1163-1171, 2010)
Abstract: This paper empirically investigates the impact of macroeconomic uncertainty on the spreads of individual firms' credit default swaps (CDS). While existing literature acknowledges the importance of the levels of macroeconomic factors in determining CDS spreads, we find that the second moments of these factors--macroeconomic uncertainty--have significant explanatory power over and above that of traditional macroeconomic factors such as the risk-free rate and the Treasury term spread.
723. Matteo Iacoviello (Federal Reserve Board of Governors) and Marina Pavan (Universitat Jaume I), "" (rev. 09/2011: PDF)
Abstract: We study housing and debt in a quantitative general equilibrium model. In the cross-section, the model matches the wealth distribution, the age profiles of homeownership and mortgage debt, and the frequency of housing adjustment. In the time-series, the model matches the procyclicality and volatility of housing investment, and the procyclicality of mortgage debt. We use the model to conduct two experiments. First, we investigate the consequences of higher individual income risk and lower downpayments, and find that these two changes can explain, in the model and in the data, the reduced volatility of housing investment, the reduced procyclicality of mortgage debt, and a small fraction of the reduced volatility of GDP. Second, we use the model to look at the behavior of housing investment and mortgage debt in an experiment that mimics the Great Recession: we find that countercyclical financial conditions can account for large drops in housing activity and mortgage debt when the economy is hit by large negative shocks.
722., Jaewoo Lee (IMF) and Alessandro Rebucci (Inter-American Development Bank), "" (10/2009, PDF)
Abstract: International financial integration has greatly increased the scope for changes in a country's net foreign asset position through the "valuation channel" of external adjustment, namely capital gains and losses on the country's external assets and liabilities. We examine this valuation channel theoretically in a dynamic equilibrium portfolio model with international trade in equity that encompasses complete and incomplete asset market scenarios. By separating asset prices and quantities in the definition of net foreign assets, we can characterize the first-order dynamics of both valuation effects and net foreign equity holdings. First-order excess returns are unanticipated and i.i.d. in our model, but capital gains and losses on equity positions feature persistent, anticipated dynamics in response to productivity shocks. The separation of prices and quantities in net foreign assets also enables us to characterize fully the role of capital gains and losses versus the current account in the dynamics of macroeconomic aggregates. Specifically, we disentangle the roles of excess returns, capital gains, and portfolio adjustment for consumption risk sharing when financial markets are incomplete, showing how these different channels contribute to dampening (or amplifying) the impact response of the cross-country consumption differential to shocks and to keeping it constant in subsequent periods.
721. and Chiu Yu Ko, "" (rev. 04/2012, PDF; published, Games and Economic Behavior)
Abstract: This paper considers a resource allocation mechanism that utilizes a profit-maximizing auctioneer/matchmaker in the Kelso-Crawford (1982) (many-to-one) assignment problem. We consider general and simple (individualized price) message spaces for firms' reports following Milgrom (2010). We show that in the simple message space, (i) the matchmaker's profit is always zero and an acceptable assignment is achieved in every Nash equilibrium, and (ii) the sets of stable assignments and strong Nash equilibria are equivalent. By contrast, in the general message space, the matchmaker may make a positive profit even in a strong Nash equilibrium. This shows that restricting message space not only reduces the information requirement but also improves resource allocation.
720. Anthony Creane (Michigan State University) and, "" (11/2009, PDF)
Abstract: Markets are often characterized with firms of differing capabilities with more efficient firms licensing their technology to lesser firms. We examine the effects that the amount of the technology transferred, and the characteristics of the partner have on this licensing. We find that a partial technology transfer can be the joint-profit minimizing transfer; no such transfer then is superior. However, under weakly concave demand, a complete transfer always increases joint profits so long as there are at least three firms in the industry. We also establish a "Goldilocks" condition in partner selection: it is neither too efficient nor too inefficient. Unfortunately, profitable transfers between sufficiently inefficient firms reduce welfare, while transfers from relatively efficient firms increase welfare. However, an efficient firm might not select the least efficient partner, though it is the social-welfare-maximizing partner.
719. and Se-il Mun (Kyoto University), "" (11/2009, PDF; published, Regional Science and Urban Economics, 40, 173-186, 2010)
Abstract: It is often argued in the US that HOV (high occupancy vehicle) lanes are wasteful and should be converted to HOT (high occupancy/toll) lanes. In this paper, we construct a simple model of commuters using a highway with multiple lanes, in which commuters are heterogeneous in their carpool organization costs. We first look at the HOV lanes and investigate under what conditions introducing HOV lanes is socially beneficial. Then we examine whether converting HOV lanes to HOT lanes improves the efficiency of road use. It is shown that the result depends on functional form and parameter values. We also discuss the effect of alternative policies: simple congestion pricing without lane division, and congestion pricing with HOV lanes. The analysis using a specific functional form is presented to explicitly obtain the conditions determining the rankings of HOV, HOT, and other policies based on aggregate social cost.
718. Muriel Niederle (Stanford University), Alvin E. Roth (Harvard University) and, "" (09/2008: PDF)
Abstract: Markets sometimes unravel, with offers becoming inefficiently early. Often this is attributed to competition arising from an imbalance of demand and supply, typically excess demand for workers. However this presents a puzzle, since unraveling can only occur when firms are willing to make early offers and workers are willing to accept them. We present a model and experiment in which workers’ quality becomes known only in the late part of the market. However, in equilibrium, matching can occur (inefficiently) early only when there is comparable demand and supply: a surplus of applicants, but a shortage of high quality applicants.
717.and, "" (08/2008: PDF)
716. Hervé Crès (Institut d’Études Politiques de Paris) and, "" (09/2008: PDF)
Abstract: When aggregating individual preferences through the majority rule in an n-dimensional spatial voting model, the 'worst-case' scenario is a social choice configuration where no political equilibrium exists unless a super majority rate as high as 1 − 1/n is adopted. In this paper we assume that a lower d-dimensional (d < n) linear map spans the possible candidates’ platforms. These d 'ideological' dimensions imply some linkages between the n political issues. We randomize over these linkages and show that a 50%-majority equilibrium almost surely exists in the above worst-case scenario as n grows to infinity. Moreover, the equilibrium is the mean voter. The speed of convergence (toward 50%) of the super majority rate guaranteeing existence of equilibrium is computed for d = 1 and 2.
715. Marek Pycia (UCLA) and, "" (rev. 03/2014: PDF; previously circulated as "A Theory of House Allocation and Exchange Mechanisms")
Abstract: Allocation and exchange of discrete resources such as kidneys, school seats, and many other resources for which agents have single-unit demand is conducted via direct mechanisms without monetary transfers. Incentive compatibility and efficiency are primary concerns in designing such mechanisms. We show that a mechanism is individually strategy-proof and always selects the efficient outcome with respect to some Arrovian social welfare function if and only if the mechanism is group strategy-proof and Pareto efficient. We construct the full class of these mechanisms and show that each of them can be implemented by endowing agents with control rights over resources. This new class, which we call trading cycles, contains new mechanisms as well as known mechanisms such as top trading cycles, serial dictatorships, and hierarchical exchange. We illustrate how one can use our construction to show what can and what cannot be achieved in a variety of allocation and exchange problems, and we provide an example in which the new trading-cycles mechanisms strictly Lorenz dominate all previously known mechanisms.
714. Francesco Giavazzi (Bocconi University), and Michel Serafinelli (University of California, Berkeley), "" (rev. 02/2012: PDF; previously circulated as "Culture, Policies and Labor Market Outcomes"; forthcoming, Journal of the European Economic Association)
Abstract: We study whether cultural attitudes towards gender, the young, and leisure are significant determinants of the employment rates of women and of the young, and of hours worked. We do this controlling for policies, institutions and other structural characteristics of the economy which may influence labor market outcomes. We identify a country-specific effect of culture exploiting the evolution of cultural attitudes over time within countries as well as across countries. We also address the endogeneity of attitudes, policies and institutions, and allow for the persistent nature of labor market outcomes. We find that culture matters for women's employment rates and for hours worked. However, policies, in particular employment protection legislation and taxes, are also important, and their quantitative impact is substantial.
713., "" (rev. 08/2010: PDF)
Abstract: This paper constructs a two-country stochastic growth model in which neutral and investment-specific technology shocks are nonstationary but cointegrated across economies. It uses this model to interpret data showing that while real investment has grown faster than real consumption in the United States since 1970, the opposite has been true in the Euro Area. The model, when estimated with these data, reveals that the EA missed out on the rapid investment-specific technological change enjoyed in the US during the 1990s; the EA, however, experienced more rapid neutral technological progress while the US economy stagnated during the 1970s.
712., Mustafa Caglayan (Heriot-Watt University) and Oleksandr Talavera (University of Sheffield), "" (rev. 07/2012: PDF; published, Review of International Economics, 21:3, 459-474, 2013)
Abstract: The paper explores factors that lead to accumulation or decumulation of firms' cash reserves. In particular, we empirically examine whether additional future fixed capital and R&D investment expenditures induce firms to change their liquidity ratio while considering the role of market imperfections. Implementing a dynamic framework on a panel of US, UK and German firms, we find that firms in all three countries make larger adjustments to cash holdings when they plan additional future R&D rather than fixed capital investment expenditures. This behavior is particularly prevalent among financially constrained firms.
711. Sushil Bikhchandani (UCLA) and, "" (rev. 10/2009: PDF)
Abstract: Preferences may arise from regret, i.e., from comparisons with alternatives forgone by the decision maker. We ask whether regret-based behavior is consistent with non-expected utility theories of transitive choice. We show that the answer is no. If choices are governed by ex ante regret and elation then non-expected utility preferences must be intransitive.
710., "" (rev. 9/2009: PDF; published, Japanese Economic Review, 61, 35-63, 2010)
Abstract: Scotchmer and Wooders (1987) show that efficient clubs are homogeneous when consumers are divisible in Berglas's (1976) anonymous crowding model. However, if consumers are not divisible or if clubs have multiple facilities with economies of scope, mixed clubs are efficient. In such a model, we consider clubs with multiple membership policies for different types of consumers, and show the existence and efficiency of equilibrium with nonlinear policies. We employ entrepreneurial equilibrium, an equilibrium concept with profit-seeking entrepreneurs. In our model, club managers and members of clubs care only about the members' actions, not their types. The equilibrium is efficient in our adverse selection model due to this "anonymity" of crowding effects. Our theorem can be regarded as showing the existence of a core allocation that satisfies the envy-free property in the absence of nonanonymous crowding effects.
709. Ted Juhl (University of Kansas) and, "" (06/2009: PDF)
Abstract: Several widely used tests for a changing mean exhibit nonmonotonic power in finite samples due to "incorrect" estimation of nuisance parameters under the alternative. In this paper, we study the issue of nonmonotonic power in testing for changing mean. We investigate the asymptotic power properties of the tests using a new framework where alternatives are characterized as having "large" changes. The asymptotic analysis provides a theoretical explanation to the power problem. Modified tests that have monotonic power against a wide range of alternatives of structural change are proposed. Instead of estimating the nuisance parameters based on ordinary least squares residuals, the proposed tests use modified estimators based on nonparametric regression residuals. It is shown that tests based on the modified long-run variance estimator provide an improved rate of divergence of the tests under the alternative of a change in mean. Tests for structural breaks based on such an estimator are able to remain consistent while still retaining the same asymptotic distribution under the null hypothesis of constant mean.
708., "" (01/2009: PDF)
Abstract: Quantile regression has important applications in risk management, portfolio optimization, and asset pricing. The current paper studies estimation, inference and financial applications of quantile regression with cointegrated time series. In addition, a new cointegration model with varying coefficients is proposed. In the proposed model, the value of cointegrating coefficients may be affected by the shocks and thus may vary over the innovation quantile. The proposed model may be viewed as a stochastic cointegration model which includes the conventional cointegration model as a special case. It also provides a useful complement to cointegration models with (G)ARCH effects. Asymptotic properties of the proposed model and limiting distribution of the cointegrating regression quantiles are derived. In the presence of endogenous regressors, fully-modified quantile regression estimators and augmented quantile cointegrating regression are proposed to remove the second order bias and nuisance parameters. Regression Wald tests are constructed based on the fully modified quantile regression estimators. An empirical application to stock index data highlights the potential of the proposed method.
707. Yingying Dong (California State University, Fullerton) and, "" (rev. 07/2010: PDF)
Abstract: Suppose V and U are two independent mean zero random variables, where V has an asymmetric distribution with two mass points and U has a symmetric distribution. We show that the distributions of V and U are nonparametrically identified just from observing the sum V+U, and provide a rate root n estimator. We apply these results to the world income distribution to measure the extent of convergence over time, where the values V can take on correspond to country types, i.e., wealthy versus poor countries. We also extend our results to include covariates X, showing that we can nonparametrically identify and estimate cross section regression models of the form Y=g(X,D*)+U, where D* is an unobserved binary regressor.
706. and Ken Rogoff (Harvard University), "" (04/2009: PDF)
Abstract: We test for changes in price behavior in the longest crude oil price series available (1861-2008). We find strong evidence for changes in persistence and in volatility of price across three well defined periods. We argue that historically, the real price of oil has tended to be highly persistent and volatile whenever rapid industrialization in a major world economy coincided with uncertainty regarding access to supply. We present a modified commodity storage model that fully incorporates demand, and further can accommodate both transitory and permanent shocks. We show that when demand is subject to persistent growth shocks, storage takes on a speculative role instead of its classic mitigating role. This result helps to account for the increased volatility of oil price we observe in these periods.
705., Mustafa Caglayan (University of Sheffield) and Oleksandr Talavera (University of East Anglia), "" (rev. 02/2010: PDF; published, Journal of Banking and Finance, 34, 2709-2719, 2010)
Abstract: This paper analyzes the effects of parliamentary election cycles on the Turkish banking system. Using annual bank-level data representing all banks in Turkey during 1963-2007, we find that there are meaningful differences in the structure of assets, liabilities and financial performance across different stages of the parliamentary election cycle. However, we find that government-owned banks operate similarly to both domestic and foreign-owned private sector banks before, during and after elections. Our estimates also show that government-owned banks underperform their domestic and foreign-owned private sector counterparts.
704. Eugene Choo (University of Calgary), and Aloysius Siow (University of Toronto), "" (08/2008)
Abstract: We develop and estimate an empirical collective model with endogenous marriage formation, participation, and family labor supply. Intra-household transfers arise endogenously as the transfers that clear the marriage market. The intra-household allocation can be recovered from observations on marriage decisions. Introducing the marriage market in the collective model allows us to independently estimate transfers from labor supplies and from marriage decisions. We estimate a semi-parametric version of our model using 2000 US Census data. Estimates of the model using marriage data are much more consistent with the theoretical predictions than estimates derived from labor supply.
703., "" (05/2008)
Abstract: Predation---extortion or theft---imposes significant endogenous costs on trade, with rich implications for trade policy. The model of this paper shows that the response of trade to liberalization depends on the strength of enforcement against predators. Efficient commercial policy may either tax or subsidize trade. The Mercantilist predilection for trade monopoly and for subsidy has a rationale. Insecurity induces an international externality alternative to that of the standard terms-of-trade effect. Tolerance or intolerance of smuggling can be rational depending on the weakness or strength of enforcement, illustrated by the switch from the former to the latter by Britain in regard to its North American colonies.
702., "" (02/2008; published, International Economic Review, 50, 903-927, 2009)
Abstract: Much empirical work requires the aggregation of policies. This paper provides methods of policy aggregation that are consistent with two common objectives of empirical work. One is to preserve real income. The other is to preserve the real volume of activity in one or more parts of the economy. Trade policy aggregation is an acute example of the aggregation problem with thousands of highly dispersed trade barriers to be aggregated. An application to India shows that the standard atheoretic method of aggregation is seriously misleading compared to the consistent method.
701., "" (12/2008)
Abstract: Are bigger markets safer? How should government policy respond to terrorist threats? Trade draws potential terrorists and economic predators into productive activity, but trade also draws terrorist attacks. Larger trade reduces the risk of terrorist attack when the wage elasticity is high, associated with low ratios of predators to prey and high wages; but it may increase the risk of terrorist attack when the wage elasticity is low, associated with high ratios of predators to prey. Anti-terrorist trade policy should always promote trade in simultaneous play. A government first-mover advantage and an inelastic wage may imply trade restriction. Tolerance of smuggling may improve security. Better enforcement should ordinarily be provided for bigger, inherently safer and higher wage markets.
700., "" (12/2008)
Abstract: The aggregated incidence of bilateral trade costs is derived from the gravity model. Incidence is equivalent to a TFP penalty. Sectoral and national differences in TFP have sharp implications for the equilibrium pattern of production and trade in a specific factors model of production. Unskilled labor is intersectorally mobile. Skilled labor acquires sector specific skills. Productivity shocks cause incidence shocks that induce ex post inefficient allocation of skilled labor. Below (above) average TFP sectors produce less and have below (above) average skill premia. Ex ante efficient allocation is lower in sectors with riskier TFP incidence, despite risk neutrality.
699., "" (12/2008)
Abstract: Does globalization widen inequality or increase income risk? Globalization amplifies the effect of idiosyncratic relative productivity shocks. But wider markets reduce the effect of economy-wide supply shocks on world prices. Both forces are at work in the specific factors continuum model of this paper. Ex post equilibrium exhibits positive (negative) premia for export (import-competing) sector specific factors. Globalization widens inequality in North and South. Globalization increases personal income risk from idiosyncratic productivity shocks, but reduces aggregate shock risk acting on the factoral terms of trade. Both forces have their greatest impact on the poorest and least impact for the richest trading sectors, while the distribution in nontraded sectors is unaffected.
698. and Yoto Yotov (Drexel University), "" (12/2008; forthcoming, American Economic Review)
Abstract: Neglected properties of the structural gravity model offer a theoretically consistent method to calculate the incidence of estimated trade costs, disaggregated by commodity and region, and re-aggregated into forms useful for economic geography. For Canada's provinces, 1992-2003, incidence is on average some five times higher for sellers than for buyers. Sellers' incidence falls over time due to specialization, despite constant gravity coefficients. This previously unrecognized globalizing force drives big reductions in 'constructed home bias', the disproportionate share of local trade; and large but varying gains in real GDP. Aggregation biases gravity coefficients downward.
697. Robert Moffitt (Johns Hopkins University) and, "" (12/2008)
Abstract: We estimate the trend in the transitory variance of male earnings in the U.S. using the Michigan Panel Study of Income Dynamics from 1970 to 2004. Using both an error components model as well as simpler but more approximate methods, we find that the transitory variance increased substantially in the 1980s and then remained at this new higher level through 2004. We also find a strong cyclical component to the transitory variance. Its increase accounts for between 30 and 65 percent of the total rise in cross-sectional variance, depending on the time period. The cross-sectional variance has recently increased, but this reflects a rise in the variance of the permanent component, not the transitory component. Increases in transitory variance occurred for less educated workers in the early 1980s and for more educated workers in the later 1980s and early 1990s.
696., Erika McEntarfer (U.S. Treasury) and Robert Moffitt (Johns Hopkins University), "" (12/2008)
Abstract: We estimate the trend in the transitory variance of male earnings in the U.S. from 1991 to 2005 using an administrative data set of Unemployment Insurance wage reports, the Longitudinal Employer-Household Dynamics (LEHD) data set, and compare the findings to those of Moffitt and Gottschalk (2008) obtained from the Michigan Panel Study of Income Dynamics (PSID). Despite substantial differences between the LEHD and the PSID in the levels of cross-sectional variances of male earnings, the changes over time in transitory variances obtained from estimating two of the models in Moffitt and Gottschalk are quite similar in the two data sets. Specifically, over the 1991-2003 period, transitory variances fell slightly, and then rose slightly, returning in 2003 to approximately the same level as in 1991. Overall, the analysis of the LEHD data confirms the findings based on the PSID that the transitory variance did not show a trend net of cycle over this period.
695. and Mustafa Caglayan (University of Sheffield), "" (11/2008: 248 Kb, PDF)
Abstract: Empirical evidence obtained from data covering Eurozone countries, other industrialized countries, and newly industrialized countries (NICs) over 1980–2006 shows that exchange rate uncertainty has a consistent positive and significant effect on the volatility of bilateral trade flows. A one standard deviation increase in exchange rate uncertainty leads to an eight per cent increase in trade volatility. These effects differ markedly for trade flows between industrialized countries and NICs, and are not mitigated by the presence of the Eurozone. Contrary to earlier findings, our results also suggest that exchange rate uncertainty does not affect the volume of trade flows of either industrialized countries or NICs.
694. and Krishna Pendakur (Simon Fraser University), "" (05/2008: 158 Kb, PDF)
Abstract: The structural consumer demand methods used to estimate the parameters of collective household models are typically either very restrictive and easy to implement or very general and difficult to estimate. In this paper, we provide a middle ground. We adapt the very general framework of Browning, Chiappori and Lewbel (2007) by adding a simple restriction that recasts the empirical model from a highly nonlinear demand system with price variation to a slightly nonlinear Engel curve system. Our restriction has an interpretation in terms of the behaviour of household scale economies and is testable. Our method identifies the levels of (not just changes in) household resource shares, and a variant of equivalence scales called indifference scales. We apply our methodology to Canadian expenditure data.
693. Francis X. Diebold (Pennsylvania) and, "" (rev. 04/2012: PDF; published, Review of Economic Studies, 80:4, 1304-1337)
Abstract: We introduce the financial economics of market microstructure into the financial econometrics of asset return volatility estimation. In particular, we use market microstructure theory to derive the cross-correlation function between latent returns and market microstructure noise, which feature prominently in the recent volatility literature. The cross-correlation at zero displacement is typically negative, and cross-correlations at nonzero displacements are positive and decay geometrically. If market makers are sufficiently risk averse, however, the cross-correlation pattern is inverted. We derive model-based volatility estimators, which we apply to stock and oil prices. Our results are useful for assessing the validity of the frequently-assumed independence of latent price and microstructure noise, for explaining observed cross-correlation patterns, for predicting as-yet undiscovered patterns, and for microstructure-based volatility estimation.
692. and Halbert White (University of California-San Diego), "" (rev. 11/2009: PDF)
Abstract: This paper examines the ways in which structural systems can yield observed variables, other than the cause or treatment of interest, that can play an instrumental role in identifying and estimating causal effects. We focus specifically on the ways in which structures determine exclusion restrictions and conditional exogeneity relations that act to ensure identification. We show that by carefully specifying the structural equations and by extending the standard notion of instrumental variables, one can identify and estimate causal effects in the endogenous regressor case for a broad range of economically relevant structures. Some of these have not previously been recognized. Our results thus create new opportunities for identifying and estimating causal effects in non-experimental situations. Our results for more familiar structures provide new insights. For example, we extend results of Angrist, Imbens, and Rubin (1996) by taking into account an important distinction between cases where Z is an observed exogenous instrument and those where it is a proxy for an unobserved exogenous instrument. A main message emerging from our analysis is the central importance of sufficiently specifying the causal relations governing the unobservables, as these play a crucial role in creating obstacles or opportunities for identification. Because our results exhaust the possibilities for identification, we ensure that there are no other opportunities for identification based on exclusion restrictions and conditional independence relations still to be discovered. To accomplish this characterization, we introduce notions of conditioning and conditional extended instrumental variables (EIVs). These are not proper instruments, as they are endogenous. They nevertheless permit identification and estimation of causal effects. We analyze methods using these EIVs either singly or jointly.
691. Xiaohong Chen (Yale University), Roger Koenker (University of Illinois at Urbana-Champaign) and, "" (10/2008, PDF)
Abstract: Parametric copulas are shown to be attractive devices for specifying quantile autoregressive models for nonlinear time-series. Estimation of local, quantile-specific copula-based time series models offers some salient advantages over classical global parametric approaches. Consistency and asymptotic normality of the proposed quantile estimators are established under mild conditions, allowing for global misspecification of parametric copulas and marginals, and without assuming any mixing rate condition. These results lead to a general framework for inference and model specification testing of extreme conditional value-at-risk for financial time series data.
690., Dorothea Schäfer (DIW Berlin) and Oleksandr Talavera (Aberdeen Business School), "" (rev. 09/2010, PDF; published, Journal of International Money and Finance, 2011, 30, 678-691)
Abstract: We estimate firms' cash flow sensitivity of cash to empirically test how the financial system's structure and activity level influence their financial constraints. For this purpose we merge Almeida et al. (2004), a path-breaking new design for evaluating a firm's financial constraints, with Levine (2002), who paved the way for comparative analysis of financial systems around the world. We conjecture that a country's financial system, both in terms of its structure and its level of development, influences the cash flow sensitivity of cash of constrained firms but leaves unconstrained firms unaffected. We test our hypothesis with a large international sample of 80,000 firm-years from 1989 to 2006. Our findings reveal that both the structure of the financial system and its level of development matter. Bank-based financial systems provide the constrained firms with easier access to external financing.
689. and Halbert White (University of California-San Diego), "" (rev. 07/2010: PDF; previously circulated as "Independence and Conditional Independence in Causal Systems")
Abstract: We study the interrelations between (conditional) independence and causal relations in settable systems. We provide definitions in terms of functional dependence for direct, indirect, and total causality as well as for (indirect) causality via and exclusive of a set of variables. We then provide necessary and sufficient causal and stochastic conditions for (conditional) dependence among random vectors of interest in settable systems. Immediate corollaries ensure the validity of Reichenbach's principle of common cause and its informative extension, the conditional Reichenbach principle of common cause. We relate our results to notions of d-separation and D-separation in the artificial intelligence literature.
688., Atreya Chakraborty (University of Massachusetts Boston) and Boyan Liu (Beihang University), "" (08/2008: PDF; published, International Journal of Finance & Economics, 15:1, 22-30, 2010)
Abstract: We investigate the relationship between a firm’s measures of corporate governance, macroeconomic uncertainty and changes in leverage. Recent research highlights the role of governance in financing decisions. Previous research also indicates that macroeconomic uncertainty affects a firm’s ability to borrow. In this paper we investigate how both these channels of influence affect firms' financing decisions. Our findings show that macroeconomic uncertainty has an important role to play, both by itself and in interaction with a measure of corporate governance.
687. Shlomo Naeh (Hebrew University) and, "" (05/2008: 156 Kb, PDF)
Abstract: Transitivity is a fundamental axiom in economics that appears in consumer theory, decision under uncertainty, and social choice theory. While the appeal of transitivity is obvious, observed choices sometimes contradict it. This paper shows that treatments of violations of transitivity already appear in the rabbinic literature, starting with the Mishnah and the Talmud (1st–5th centuries CE). This literature offers several solutions that are similar to those used in the modern economic literature, as well as some other solutions that may be adopted in modern situations. We analyze several examples: one where nontransitive relations are acceptable; one where a violation of transitivity leads to problems with extended choice functions; and a third where a nontransitive cycle is deliberately created (to enhance justice).
686., Mustafa Caglayan (University of Sheffield) and Oleksandr Talavera (Robert Gordon University), "" (06/2008: PDF; published, Economics Letters, 2010, 106, 25-27)
Abstract: We investigate the impact of debt on U.S. manufacturing firms' capital investment behavior as the underlying firm-specific and market-level uncertainty changes. The influence of leverage on capital investment may be stimulating or mitigating depending on the effects of uncertainty.
685., Carsten Kowalczyk (Tufts University) and Tomas Sjöström (Rutgers University), "" (06/2008, PDF; published, Review of International Economics, 17, 304-309, 2009)
Abstract: This paper shows nonemptiness of the core of a customs union game with a status quo equilibrium with tariffs by employing an appropriate notion of the core as in Kowalczyk and Sjöström (1994, Economica). Specifically, we find that if customs unions may have no effects on non-member countries as in Ohyama (1972, Keio Economic Studies) and Kemp and Wan (1976, Journal of International Economics) then a subset of countries forming such a customs union does not block global free trade when accompanied by so-called Grinols transfers (Grinols, 1981, Journal of International Economics).
684. Michael Giandrea (U.S. Bureau of Labor Statistics), Kevin Cahill (The Analysis Group) and, "" (04/2008: 121 Kb, PDF)
Abstract: What role does self-employment play in the retirement process? Older Americans are staying in the labor force longer than prior trends would have predicted, and many change jobs later in life. These job transitions are often within the same occupation or across occupations within wage-and-salary employment. The transition can also be out of wage-and-salary work and into self-employment. Indeed, national statistics show that self-employment becomes more prevalent with age, partly because self-employment provides older workers with opportunities not found in traditional wage-and-salary jobs, such as flexibility in hours worked and independence. This paper analyzes transitions into and out of self-employment among older workers who have had career jobs. We utilize the Health and Retirement Study, a nationally-representative dataset of older Americans, to investigate the prevalence of self-employment among older workers who made a job transition later in life and to explore the factors that determine the choice of wage-and-salary employment or self-employment. We find that post-career transitions into and out of self-employment are common and that health status, career occupation, and financial variables are important determinants of these transitions. As older Americans and the country as a whole face financial strains in retirement income in the years ahead, self-employment may be a vital part of the pro-work solution.
683. Zvi Safra (Tel Aviv University) and, "" (5/2008: PDF)
Abstract: A reasonable level of risk aversion with respect to small gambles leads to a high, and absurd, level of risk aversion with respect to large gambles. This was demonstrated by Rabin for expected utility theory. Later, Safra and Segal extended this result by showing that similar arguments apply to almost all non-expected utility theories, provided they are Gateaux differentiable. In this paper we drop the differentiability assumption and by restricting attention to betweenness theories we show that much weaker conditions are sufficient for the derivation of similar calibration results.
682. Zvi Safra (Tel Aviv University) and, "" (5/2008: PDF; published, Econometrica, 76:5, 1143-1166, 2008)
Abstract: Rabin proved that a low level of risk aversion with respect to small gambles leads to a high, and absurd, level of risk aversion with respect to large gambles. Rabin’s arguments strongly depend on expected utility theory, but we show that similar arguments apply to general non-expected utility theories.
681. Taiji Furusawa (Hitotsubashi University) and, "" (rev. 08/2010: PDF; published, Theoretical Economics, 6, 219-256, 2011)
Abstract: We consider a (pure) public goods provision problem with voluntary participation in a quasi-linear economy. We propose a new hybrid solution concept, the free-riding-proof core (FRP-Core), which endogenously determines a contribution group, a public good provision level, and its cost-sharing. The definition of the FRP-Core is based on the credibility of coalitional deviations. The FRP-Core is always nonempty in a public goods economy but does not usually achieve global efficiency. The FRP-Core has support from both cooperative and noncooperative games. In particular, it is equivalent to the set of perfectly coalition-proof Nash equilibria (Bernheim, Peleg and Whinston, 1987 JET) of a dynamic game with a participation decision followed by a common agency game. We illustrate the properties of the FRP-Core with an example. We also show that the equilibrium level of public good shrinks to zero as the economy is replicated.
680. Susanne Schennach (University of Chicago), Halbert White (University of California-San Diego) and, "" (rev. 12/2009: PDF; previously circulated as "Estimating average marginal effects in nonseparable structural systems")
Abstract: We study the scope of local indirect least squares (LILS) methods for nonparametrically estimating average marginal effects of an endogenous cause X on a response Y in triangular structural systems that need not exhibit linearity, separability, or monotonicity in scalar unobservables. One main finding is negative: in the fully nonseparable case, LILS methods cannot recover the average marginal effect. LILS methods can nevertheless test the hypothesis of no effect in the general nonseparable case. We provide new nonparametric asymptotic theory, treating both the traditional case of observed exogenous instruments Z and the case where one observes only error-laden proxies for Z.
679. and Andrei Zlate, "" (12/2007: 3 Mb, PDF)
Abstract: This paper explores the production characteristics of three important U.S. state government services (public higher education, public welfare, and state psychiatric hospitals) during the last half of the twentieth century. We estimate translog cost functions for the three services and find that their production attributes are similar in a number of respects. First, production exhibits substantial economies of scale; unexploited scale economies are so severe that the average state operates on the negative portion of its marginal cost curve. Second, the analysis of technical change indicates that public education, welfare, and hospitals are affected by severe technical regression in all states, in both the long run and short run. Third, production of all three services is overcapitalized in most states; the provision of these services is not long-run efficient. Finally, we show that the Baumol-Oates cost disease of lagging productivity growth is rampant in all three services; only the short-run productivity growth in education matches the performance of the private sector, as technical regression is more than offset by the productivity-enhancing scale effect of increased enrollments.
678. Yingyao Hu (Johns Hopkins University) and, "" (rev. 06/2009: 204 Kb, PDF; previously titled "Identifying the Returns to Lying When the Truth is Unobserved")
Abstract: Consider an observed binary regressor D and an unobserved binary variable D*, both of which affect some other variable Y. This paper considers nonparametric identification and estimation of the effect of D on Y, conditioning on D*=0. For example, suppose Y is a person's wage, the unobserved D* indicates if the person has been to college, and the observed D indicates whether the individual claims to have been to college. This paper then identifies and estimates the difference in average wages between those who falsely claim college experience versus those who tell the truth about not having college. We estimate this average effect of lying to be about 6% to 20%. Nonparametric identification without observing D* is obtained either by observing a variable V that is roughly analogous to an instrument for ordinary measurement error, or by imposing restrictions on model error moments.
677. Anthony Creane (Michigan State University) and, "" (rev. 06/2008: PDF; published, International Journal of Industrial Organization, 27, 379-389, 2009)
Abstract: In 1984 GM and Toyota began the joint production of automobiles to much controversy over its anti-competitive effects. The argument for the joint production was the considerable efficiency gains GM would obtain. Since then, the anti-trust controversy has died, but a question remains: why would the most efficient manufacturer (Toyota) transfer to its largest rival the knowledge to transform itself into a very efficient rival? We examine when such transfers could be unilaterally profitable, finding that they can serve as a credible way to make the market more competitive, forcing high cost firms to exit (or preventing future entry). This is not without a cost to Toyota since such a transfer also makes the remaining rivals more efficient. Despite this, we find a sufficient (but not necessary) condition for it to be profitable to predate "by proxy": the market satisfies an entry equilibrium condition. Further, we find that it is then optimal to predate on every firm that is vulnerable, and so a market with many firms can become a duopoly. Profitable predation implies higher prices, to the detriment of consumers. Yet the improved production efficiency outweighs this loss, resulting in enhanced social welfare. In contrast, profitable non-predatory joint production (or technology transfers) may reduce welfare. Paradoxically, the potential for predation could encourage entry ex ante.
676. Xiaohong Chen (Yale University), Yingyao Hu (Johns Hopkins University) and, "" (08/2007: 276 Kb, PDF)
Abstract: This paper considers identification and estimation of a nonparametric regression model with an unobserved discrete covariate. The sample consists of a dependent variable and a set of covariates, one of which is discrete and arbitrarily correlates with the unobserved covariate. The observed discrete covariate has the same support as the unobserved covariate, and can be interpreted as a proxy or mismeasure of the unobserved one, but with a nonclassical measurement error that has an unknown distribution. We obtain nonparametric identification of the model given monotonicity of the regression function and a rank condition that is directly testable given the data. Our identification strategy does not require additional sample information, such as instrumental variables or a secondary sample. We then estimate the model via the method of sieve maximum likelihood, and provide root-n asymptotic normality and semiparametric efficiency of smooth functionals of interest. Two small simulations are presented to illustrate the identification and the estimation results.
675. Xiaohong Chen (Yale University), Yingyao Hu (Johns Hopkins University) and, "" (07/2007: 132 Kb, PDF)
Abstract: This note considers nonparametric identification of a general nonlinear regression model with a dichotomous regressor subject to misclassification error. The available sample information consists of a dependent variable and a set of regressors, one of which is binary and error-ridden with misclassification error that has unknown distribution. Our identification strategy does not parameterize any regression or distribution functions, and does not require additional sample information such as instrumental variables, repeated measurements, or an auxiliary sample. Our main identifying assumption is that the regression model error has zero conditional third moment. The results include a closed-form solution for the unknown distributions and the regression function.
674. Susanne Schennach (University of Chicago), Yingyao Hu (Johns Hopkins University) and, "" (07/2007: 277 Kb, PDF)
Abstract: This note establishes that the fully nonparametric classical errors-in-variables model is identifiable from data on the regressor and the dependent variable alone, unless the specification is a member of a very specific parametric family. This family includes the linear specification with normally distributed variables as a special case. This result relies on standard primitive regularity conditions taking the form of smoothness and monotonicity of the regression function and nonvanishing characteristic functions of the disturbances.
673., "" (09/2005: PDF; published, World Economy, 31, 141-157, 2008)
Abstract: Economic integration lowers one form of trade costs, tariffs, and stimulates changes in other trade costs. This paper offers a model in which integration may raise or lower the important trade cost associated with insecurity. The model can help to explain the varied experience with integration and it points to the usefulness of combining enforcement policy integration with trade policy integration.
672., "" (07/2007: PDF; published, Economic Theory, 41, 105-131, 2009)
Abstract: Contract enforcement is probabilistic, but the probability depends on rules and processes. A stimulus to trade may induce traders to alter rules or processes to improve enforcement. In the model of this paper, such a positive knock-on effect occurs when the elasticity of supply of traders is sufficiently high. Negative knock-on is possible when the elasticity is low. Enforcement strategies in competing markets are complements (substitutes) if the supply of traders is sufficiently elastic (inelastic). The model provides a useful structure of endogenous enforcement that gives promise of explaining patterns of institutional development.
671. Erich Battistin (University of Padova), Richard Blundell (University College London) and, "" (07/2007: 577 Kb, PDF)
Abstract: Significant departures from log normality are observed in income data, in violation of Gibrat's law. We identify a new empirical regularity, which is that the distribution of consumption expenditures across households is, within cohorts, closer to log normal than the distribution of income. We explain these empirical results by showing that the logic of Gibrat's law applies not to total income, but to permanent income and to marginal utility. These findings have important implications for welfare and inequality measurement, aggregation, and econometric model analysis.
670. Michael Giandrea (U.S. Bureau of Labor Statistics), Kevin Cahill (The Analysis Group) and, "" (rev. 12/2008: PDF; previously circulated as "An Update on Bridge Jobs: the HRS War Babies")
Abstract: Are today's youngest retirees following in the footsteps of their older peers with respect to gradual retirement? Recent evidence from the Health and Retirement Study (HRS) suggests that most older Americans with full-time career jobs later in life transitioned to another job prior to complete labor force withdrawal. This paper explores the retirement patterns of a younger cohort of individuals from the HRS known as the "War Babies." These survey respondents were born between 1942 and 1947 and were 57 to 62 years of age at the time of their fourth bi-annual HRS interview in 2004. We compare the War Babies to an older cohort of HRS respondents and find that, for the most part, the War Babies have followed the gradual-retirement trends of their slightly older predecessors. Traditional one-time, permanent retirements appear to be fading, a sign that the impact of changes in the retirement income landscape since the 1980s continues to unfold.
669., "" (rev. 11/2008; 150 Kb, PDF)
Abstract: Shape invariance is a property of demand functions that is convenient for semiparametric demand modelling. All known shape invariant demands are derived from utility functions that, up to monotonic transformation, are called IB/ESE (independent of base - equivalence scale exact) utility functions, because they yield IB/ESE equivalence scales, which are widely used in welfare calculations. This paper provides a counterexample, i.e., a shape invariant demand system that is not derived from a transform of IB/ESE utility. A general theorem is then provided that characterizes all shape invariant demand systems. The usual practice of equating shape invariance with the IB/ESE utility class is shown to be not quite right, but it can be made valid by testing for the small class of exceptions noted here. In particular, all the exceptions have rank two, so any rank three or higher shape invariant system must be derived from transforms of IB/ESE utility.
668. and, "" (rev. 06/2009; PDF)
Abstract: Estimation of demand systems with many goods is empirically difficult because demand functions depend, flexibly and usually nonlinearly, on the prices of all goods. The standard solution is to impose strong, empirically questionable behavioral restrictions on price elasticities via separability. This paper proposes an alternative based on applying statistical dimension reduction methods to the price vector, and deriving the resulting restrictions on demand functions that remain due to Slutsky symmetry and other implications of utility maximization. The results permit estimation of the effects of income and of prices of some goods on the demand functions for every good without imposing any separability. We illustrate the results by reporting estimates of the effects of gasoline prices on the demands for many goods.
667., Mark E. Schaffer (Heriot-Watt University) and Steven Stillman (Motu Economic and Public Policy Research), "" (09/2007; 410 Kb, PDF; published, Stata Journal, 7:4, 465-506, 2007)
Abstract: We extend our 2003 paper on instrumental variables (IV) and GMM estimation and testing and describe enhanced routines that address HAC standard errors, weak instruments, LIML and k-class estimation, tests for endogeneity and RESET and autocorrelation tests for IV estimates.
666. Eren Inci, "" (05/2007: 747 Kb, PDF)
Abstract: This paper focuses on the quality of entrepreneurs when individuals, who differ in terms of entrepreneurial ability and wealth, choose between entrepreneurship and wage-earning. A loan is required to become an entrepreneur. Four wealth classes form endogenously. Banks' inability to identify the ability of individuals leads them to offer pooling contracts to the poor and the lower-middle classes. Regardless of ability, all poor class individuals become workers and all lower-middle class individuals become entrepreneurs. Banks are able to offer separating contracts to the upper-middle and the rich classes. High-ability individuals in these wealth classes become entrepreneurs and their low-ability counterparts become workers. Equilibrium contracts may entail cross-subsidies within or between occupations. In some economies, a small success tax on entrepreneurs used to subsidize workers can increase the average quality of entrepreneurs and welfare by changing the thresholds of the wealth classes. In some others a reverse policy is required. Since the aggregate level of investment is fixed, the reason for these policies is not under- or overinvestment by entrepreneurs, as it often is in previous literature.
665. Richard Arnott and John Rowse (University of Calgary), "" (04/2007: 241 Kb, PDF)
Abstract: This paper develops and calibrates a model of downtown parking in a city without mass transit, and applies it to investigate downtown parking policy. There is curbside and garage parking and traffic congestion. Spatial competition between private parking garages determines the equilibrium garage parking fee and spacing between parking garages. Curbside parking is priced below its social opportunity cost. Cruising for parking adjusts to equalize the full prices of on- and off-street parking, and contributes to traffic congestion. The central result is that raising curbside parking fees appears to be a very attractive policy since it generates efficiency gains that may be several times as large as the increased revenues raised.
664., James G. Bohn and Atreya Chakraborty (University of Massachusetts Boston), "" (rev. 05/2016: PDF; forthcoming, International Review of Law and Economics)
Abstract: We examine the relationship between outcomes of securities fraud class action lawsuits (SFCAs) and corporate board turnover rates. Our results indicate that the strength of the allegations in a lawsuit affects board turnover. Turnover rates for each type of board member (outsiders, insiders, and CEOs) are higher when a lawsuit is settled than when it is dismissed. Turnover rates of outside directors are more sensitive to the outcome of the SFCA among firms with higher levels of external blockholdings and those with greater institutional ownership. These results support the view that firms act to impose sanctions on those individuals associated with fraudulent activities.
663. Giuseppe Fiori, Giuseppe Nicoletti (OECD), Stefano Scarpetta (OECD) and, "" (rev. 08/2008: 310 Kb, PDF)
Abstract: This paper provides a systematic empirical investigation of the effect of product market liberalization on employment when there are interactions between policies and institutions in product and labor markets. Using panel data for OECD countries over the period 1980-2002, we present evidence that product market deregulation is more effective at the margin when labor market regulation is high. Moreover, there is evidence in our sample that product market deregulation promotes labor market deregulation. We show that these results are mostly consistent with the basic predictions of a standard bargaining model (e.g. Blanchard and Giavazzi (2003)), once one allows for a full specification of the fallback position of the unions.
662., "" (04/2007: 256 Kb, PDF; published, American Economic Review, June 2009)
Abstract: Post-1980 U.S. data trace out a stable long-run money demand relationship of Cagan's semi-log form between the M1-income ratio and the nominal interest rate, with an interest semi-elasticity of 1.79. Integrating under this money demand curve yields estimates of the welfare cost of modest departures from Friedman's zero nominal interest rate rule for the optimum quantity of money that are quite small. The results suggest that the Federal Reserve's current policy, which generates low but still positive rates of inflation, provides an adequate approximation in welfare terms to the alternative of moving all the way to the Friedman rule.
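The calculation the abstract describes can be sketched in the style of Lucas (2000); the notation here is illustrative rather than taken from the paper itself. With a Cagan semi-log money demand for the money-income ratio m and nominal interest rate i, the welfare cost of a positive nominal rate is the consumer surplus lost relative to the Friedman rule (i = 0):

```latex
% Cagan semi-log form: m = M1/(PY), i = nominal interest rate,
% \xi = interest semi-elasticity (estimated at 1.79), A = scale constant
m(i) = A e^{-\xi i}
% Welfare cost: area under the demand curve net of the transfer i\,m(i)
w(i) = \int_0^i m(x)\,dx \;-\; i\,m(i)
     = \frac{A}{\xi}\bigl(1 - e^{-\xi i}\bigr) \;-\; i A e^{-\xi i}
```

Expanding for small \xi i gives w(i) \approx (A\xi/2)\,i^2, so the welfare cost is quadratic near i = 0, which is consistent with the abstract's finding that modest departures from the Friedman rule cost little in welfare terms.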
661. Richard Arnott and Elizaveta Shevyakhova, "" (04/2007: 612 Kb, PDF)
Abstract: Under tenancy rent control, rents are regulated within a tenancy but not between tenancies. This paper investigates the effects of tenancy rent control on housing quality, maintenance, and rehabilitation. Since the discounted revenue received over a fixed-duration tenancy depends only on the starting rent, intuitively the landlord has an incentive to spruce up the unit between tenancies in order to 'show' it well, but little incentive to maintain the unit well during the tenancy. The paper formalizes this intuition, and presents numerical examples illustrating the efficiency loss from this effect.
660. Richard Arnott, "" (03/2007: 244 Kb, PDF)
Abstract: Consider an urban economy with two types of externalities, negative traffic congestion externalities and positive agglomeration externalities deriving from non-market interaction. Suppose that urban travel can be tolled, that non-market interaction cannot be subsidized, and that non-market interaction is stimulated by a reduction in travel costs. Then the optimal toll is below the congestion externality cost. This paper explores this line of reasoning.
659. and Stefano Neri (Banca D'Italia), "" (rev. 10/2009: PDF; forthcoming, American Economic Journal: Macroeconomics)
Abstract: Using U.S. data and Bayesian methods, we quantify the contribution of the housing market to business fluctuations. The estimated model, which contains nominal and real rigidities and collateral constraints, is used to address two questions. First, what shocks drive the housing market? We find that the upward trend in real housing prices of the last 40 years can be explained by slow technological progress in the housing sector. Over the business cycle instead, housing demand and housing technology shocks account for roughly one-quarter each of the volatility of housing investment and housing prices. Monetary factors account for about 20 percent, but they played a major role in the housing market cycle at the turn of the century. Second, do fluctuations in the housing market propagate to other forms of expenditure? We find that the spillovers from the housing market to the broader economy are non-negligible, concentrated on consumption rather than business investment, and they have become more important over time, to the extent that financial innovation has increased the marginal availability of funds for credit-constrained agents.
658., and Scott Schuh (Federal Reserve Bank of Boston), "" (rev. 10/2009: PDF)
Abstract: We build and estimate a two-sector (goods and services) dynamic general equilibrium model with two types of inventories: finished goods (output) inventories yield utility services while materials (input) inventories facilitate the production of goods. The model, which contains neutral and inventory-specific technology shocks and preference shocks, is estimated by Bayesian methods. The estimated model replicates the volatility and cyclicality of inventory investment and inventory-target ratios. When estimated over subperiods, the results suggest that changes in the volatility of inventory shocks, or in structural parameters associated with inventories, play a minor role in the reduction of the volatility of output.
657., Mustafa Caglayan (University of Sheffield), Dorothea Schäfer (DIW Berlin) and Oleksandr Talavera (DIW Berlin), "" (rev. 02/2008: 133 Kb, PDF; published, Economics of Transition, 16(3), 2008, 537-557; earlier version published (in German) as "Ukrainische Banken: Politische Patronage von Bedeutung", Wochenbericht Nr. 23/2007, DIW Berlin, pp. 367-371.)
Abstract: This paper empirically investigates the link between political patronage and bank performance for Ukraine during 2003Q3-2005Q2. We find significant differences between politically affiliated and non-affiliated banks. The data suggest that affiliated banks have significantly lower interest margins. The gap between affiliated banks' and non-affiliated banks' capitalization ratios is narrowing over time. Parliamentary deputies might use financial institutions to achieve political goals, which reduces their banks' performance.
656. Margarita Sapozhnikov, "" (11/2006: 280 Kb, PDF)
Abstract: It has long been thought that government antitrust policy has an effect on aggregate merger and acquisition activity, but the empirical support for this hypothesis has been weak and inconsistent. This paper uses a new empirical specification and a new dataset on mergers and acquisitions to provide support for this conjecture. Regression analysis shows that government policy has a significant influence on mergers and that the nature of the effects depends on the type of merger. Fitting the time series to a two-state Markov switching model shows that the conglomerate and horizontal time series have followed different dynamics over the last half century, which is most likely caused by the dissimilar treatment of the two types of merger by the government. Only the conglomerate merger and acquisition time series is well described by a two-state Markov switching model. In contrast, the horizontal time series has a break in the early 1980s that may be attributed to the dramatic change in government policy.
655., "" (rev. 01/2008: 270 Kb, PDF; published, Regional Science and Urban Economics, 2008, 38:461-471)
Abstract: This paper establishes the existence and efficiency of equilibrium in a local public goods economy with spatial structures by formalizing Hamilton's (1975, Urban Studies) elaboration of Tiebout's (1956, JPE) tale. We use a well-known equilibrium concept from Rothschild and Stiglitz (1976, QJE) in a market with asymmetric information, and show that Hamilton's zoning policy plays an essential role in proving existence and efficiency of equilibrium. We use an idealized large economy following Ellickson, Grodal, Scotchmer and Zame (1999, Econometrica) and Allouch, Conley and Wooders (2004). Our theorem applies directly to the existence and efficiency of a discrete approximation of mono- or multi-centric city equilibrium in urban economics with commuting time costs, even allowing for multiple qualities of (collective) residences, provided that externalities due to traffic congestion are not present.
654. and Margarita Sapozhnikov, "" (rev. 01/2008: 264 Kb, PDF; published, Games and Economic Behavior, 2008, 64:193-218)
Abstract: In a Shapley-Shubik assignment problem with a supermodular output matrix, we consider games in which each firm makes a take-it-or-leave-it salary offer to one applicant, and a match is made only when the offer is accepted by her. We consider both one-shot and multistage games. In either game, we show that there can be many equilibrium salary vectors which are higher or lower than the minimal competitive salary vector. If we exclude artificial equilibria, applicants' equilibrium salary vectors are bounded above by the minimal competitive salary vector, while firms' equilibrium payoff vectors are bounded below by the payoff vector under the minimal competitive salary vector. This suggests that adopting the minimal competitive salary vector as the equilibrium outcome in decentralized markets does not have a strong justification.
653. Barış K. Yörük, "" (10/2006: 520 Kb, PDF)
Abstract: People tend to contribute to a charity only when they are asked to. Although this so-called 'power of asking' is a well-known technique among fundraisers, the existing literature does not pay much attention to the role of donation requests in charitable giving. We estimate the causal effects of charitable solicitations on both the propensity to give and the amount of charitable contributions using a unique data set, which was designed to measure giving behavior in the United States. In order to address the endogeneity of donation requests due to non-random solicitation of charitable donors, we link this data set to IRS data on charitable organizations and the 2000 Census and propose identifying instruments. After controlling for the endogeneity, we find that people are both more likely to contribute to a charity and donate more when they are asked to. This effect is robust under different specifications and with different sets of instruments, and is much larger than the estimates from univariate models. Furthermore, we argue that some identifiable characteristics of individuals are associated with a higher probability of being solicited. In particular, we find some evidence that income, age, education, and race play significant roles in explaining the selection of potential charitable donors.
652. David Jacho-Chavez (Indiana University), and Oliver Linton (London School of Economics), "" (rev. 11/2008: 3.3 Mb, PDF)
Abstract: Let r(x,z) be a function that, along with its derivatives, can be consistently estimated nonparametrically. This paper discusses identification and consistent estimation of the unknown functions H, M, G and F, where r(x, z) = H[M (x, z)] and M(x,z) = G(x) + F(z). An estimation algorithm is proposed for each of the model's unknown components when r(x, z) represents a conditional mean function. The resulting estimators use marginal integration, and are shown to have a limiting Normal distribution with a faster rate of convergence than unrestricted nonparametric alternatives. Their small sample performance is studied in a Monte Carlo experiment. We empirically apply our results to nonparametrically estimate and test generalized homothetic production functions in four industries within the Chinese economy.
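One way to see the identification logic in this class of models (a standard argument; the notation below is ours, not necessarily the paper's): because r(x,z) = H[G(x) + F(z)], the unknown link function H cancels from ratios of partial derivatives,

```latex
\frac{\partial r(x,z)/\partial x}{\partial r(x,z)/\partial z}
  \;=\; \frac{H'\!\bigl[G(x)+F(z)\bigr]\,G'(x)}{H'\!\bigl[G(x)+F(z)\bigr]\,F'(z)}
  \;=\; \frac{G'(x)}{F'(z)}
```

so G and F are identified, up to location and scale normalizations, from derivatives of r alone, and H is then traced out from r = H[G(x) + F(z)]. The marginal-integration estimators mentioned in the abstract replace raw derivative ratios with averaged, better-behaved estimates of the same objects.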
651. and Krishna Pendakur (Simon Fraser University), "" (rev. 11/2008: 2.2 Mb, PDF; published, American Economic Review, June 2009)
Abstract: We invent Implicit Marshallian Demands, a new type of demand function that combines desirable features of Hicksian and Marshallian demand functions. We propose and estimate the Exact Affine Stone Index (EASI) Implicit Marshallian Demand system. Like the Almost Ideal Demand (AID) system, EASI budget shares are linear in parameters given real expenditures. However, unlike the AID, EASI demands can have any rank and its Engel curves can be polynomials or splines of any order in real expenditures. EASI error terms equal random utility parameters to account for unobserved preference heterogeneity. EASI demand functions can be estimated using ordinary GMM, and, like AID, an approximate EASI model can be estimated by linear regression.
650., "" (09/2006: 180 Kb, PDF)
Abstract: My goal here is to provide some synthesis of recent results regarding unobserved heterogeneity in nonlinear and semiparametric models, using as a context Matzkin (2005a) and Browning and Carro (2005), which were the papers presented in the Modeling Heterogeneity session of the 2005 Econometric Society World Meetings in London. These papers themselves consist of enormously heterogeneous content, ranging from high theory to Danish milk, which I will attempt to homogenize. The overall theme of this literature is that, in models of individual economic agents, errors at least partly reflect unexplained heterogeneity in behavior, and hence in tastes, technologies, and so on. Economic theory can imply restrictions on the structure of these errors, and in particular can generate nonadditive or nonseparable errors, which has profound implications for model specification, identification, estimation, and policy analysis.
649. and Minh Huynh (U.S. Social Security Administration), "" (08/2006: 681 Kb, PDF; forthcoming, Review of Economics and Statistics)
Abstract: Measures of inequality and mobility based on self-reported earnings reflect attributes of both the joint distribution of earnings across time and the joint distribution of measurement error and earnings. While classical measurement error would increase measures of inequality and mobility, there is substantial evidence that measurement error in earnings is not classical. In this paper we present the analytical links between non-classical measurement error and measures of inequality and mobility. The empirical importance of non-classical measurement error is explored using the Survey of Income and Program Participation matched to tax records. We find that the effects of non-classical measurement error are large. However, these non-classical effects are largely offsetting when estimating mobility. As a result, SIPP estimates of mobility are similar to estimates based on tax records, though SIPP estimates of inequality are smaller than estimates based on tax records.
648., "" (08/2006: 234 Kb, PDF)
Abstract: The paper derives the optimal organizational response of a bank (the principal) which faces a risk of collusion between the credit manager (the agent) and the credit-seeking firms. The bank can deter collusion either through internal incentives or by distorting the credit contracts. The model thus explicitly takes into account the interaction between internal (collusion) risks and external (default) risks in the optimal design of the internal organization as well as of the credit contracts. We investigate this question in two settings. In the first one, we adopt the standard assumption that the agent is always willing to collude (is corruptible) if that increases his monetary payoff. In the second one, he is corruptible with some probability only, and honest otherwise. A novel feature of our approach is to allow for screening among corruptible and honest agents. We find that if the probability that the agent is honest is sufficiently large, collusion occurs in equilibrium.
647. and Nicholas G. Tresp (UBS Investment Bank), "" (rev. 08/2006: 248 Kb, PDF)
Abstract: This paper reconsiders the role of economic policy in determining the effectiveness of foreign aid for generating economic growth in developing countries. We update and modify the data set originally used by Burnside and Dollar (2000) in order to more fully consider the critique presented by Easterly et al. (2004). Our findings suggest that the relationship among foreign aid, government policy, and economic growth is tenuous and depends importantly on the subset of countries included in the analysis. Good policy enhances the effectiveness of foreign aid in spurring growth when we use the original set of countries included in Burnside and Dollar, but this relationship disappears for an expanded set of countries. Because the relationship among aid, policy, and growth is likely to be nonlinear, we present an alternative probit model emphasizing growth thresholds. Our results from this alternative analysis confirm the conclusions of Easterly et al., finding little support for the view that good policy increases the probability that foreign aid contributes to growth.
646., Mustafa Caglayan (University of Sheffield) and Oleksandr Talavera (DIW Berlin), "" (02/2007: 140 Kb, PDF; published, Economics Letters, 98:3, 282-287, 2008)
Abstract: We investigate the impact of measures of uncertainty on firms' capital investment behavior using a panel of U.S. firms. Increases in firm-specific and CAPM-based measures have a significant negative impact on investment spending, while market-based uncertainty has a positive impact.
645. Zvi Safra (Tel Aviv University) and, "" (07/2006: 204 Kb, PDF)
Abstract: Rabin proved that a low level of risk aversion with respect to small gambles leads to a high, and absurd, level of risk aversion with respect to large gambles. Rabin's arguments strongly depend on expected utility theory, but we show that similar arguments apply to almost all non-expected utility theories.
644., "" (07/2006: 2.6 Mb, PDF)
643. and Jörgen W. Weibull (Stockholm School of Economics), "" (07/2006: 560 Kb, PDF)
Abstract: Recognizing that individualism, or weak family ties, may be favorable to economic development, we ask how family ties interact with climate to determine individual behavior and whether there is reason to believe that the strength of family ties evolves differently in different climates. For this purpose, we develop a simple model of the interaction between two individuals who are more or less altruistic towards each other. Each individual exerts effort to produce a consumption good under uncertainty. Outputs are observed and each individual chooses how much, if any, of his or her output to share with the other. We analyze how the equilibrium outcome depends on altruism and climate for ex ante identical individuals. We also consider (a) "coerced altruism," that is, situations where a social norm dictates how output is to be shared, (b) the effects of insurance markets, and (c) the role of institutional quality. The evolutionary robustness of altruism is analyzed and we study how this depends on climate.
642. and Scott Schuh (Federal Reserve Bank of Boston), "" (04/2006: 378 Kb, PDF)
Abstract: A two-sector real business cycle model, estimated with postwar U.S. data, identifies shocks to the levels and growth rates of total factor productivity in distinct consumption- and investment-goods-producing technologies. This model attributes most of the productivity slowdown of the 1970s to the consumption-goods sector; it suggests that a slowdown in the investment-goods sector occurred later and was much less persistent. Against this broader backdrop, the model interprets the more recent episode of robust investment and investment-specific technological change during the 1990s largely as a catch-up in levels that is unlikely to persist or be repeated anytime soon.
641. and Mustafa Caglayan (University of Sheffield), "" (rev. 07/2008: 212 Kb, PDF; published, Journal of International Money and Finance, 2010, 29:79-93)
Abstract: We present an empirical investigation of the hypotheses that exchange rate uncertainty may have an impact on both the volume and variability of trade flows by considering a broad set of industrial countries' bilateral real trade flows over the period 1980-1998. Similar to the findings of earlier theoretical and empirical research, our first set of results shows that the impact of exchange rate uncertainty on trade flows is indeterminate. Our second set of results provides novel findings that exchange rate uncertainty has a consistent positive and significant effect on the volatility of bilateral trade flows, helping us better understand macroeconomic volatility.
640. and, "" (02/2006: 260 Kb, PDF; forthcoming as "House Allocation with Existing Tenants: A Characterization", Games and Economic Behavior)
Abstract: We analyze mechanisms for kidney exchange with good samaritan donors, where exchange is feasible not only among donor-patient pairs but also between such pairs and non-directed altruistic donors. We show that the 'you request my donor-I get your turn' mechanism (Abdulkadiroglu and Sonmez [1999]) is the only mechanism that is Pareto efficient, individually rational, strategy-proof, weakly neutral and consistent.
639. Atila Abdulkadiroglu (Columbia University), Parag A. Pathak (Harvard University), Alvin E. Roth (Harvard University) and, "" (01/2006: 1.1 Mb, PDF)
Abstract: In July 2005 the Boston School Committee voted to replace the existing Boston school choice mechanism with a deferred acceptance mechanism that simplifies the strategic choices facing parents. This paper presents the empirical case against the previous Boston mechanism, a priority matching mechanism, and the case in favor of the change to a strategy-proof mechanism. Using detailed records on student choices and assignments, we present evidence both of sophisticated strategic behavior among some parents, and of unsophisticated strategic behavior by others. We find evidence that some parents pay close attention to the capacity constraints of different schools, while others appear not to. In particular, we show that many unassigned students could have been assigned to one of their stated choices with a different strategy under the current mechanism. This interaction between sophisticated and unsophisticated players identifies a new rationale for strategy-proof mechanisms based on fairness, and was a critical argument in Boston's decision to change the mechanism. We then discuss the considerations that led to the adoption of a deferred acceptance mechanism as opposed to the (also strategy-proof) top trading cycles mechanism.
638., Mustafa Caglayan (University of Sheffield) and Oleksandr Talavera (Aberdeen Business School), "" (rev. 04/2008: 211 Kb, PDF; published, Oxford Economic Papers, 62 (2010))
Abstract: We investigate the analytical and empirical linkages between cash flow, uncertainty and firms' capital investment behavior. Our empirical approach constructs measures of own- and market-specific uncertainty from firms' daily stock returns and S&P 500 index returns along with a CAPM-based risk measure. Our results indicate that even in the presence of important firm-specific variables, uncertainty is an important determinant of firms' investment behavior. Depending on the measure of uncertainty used, investment may be stimulated or curtailed by the effects of uncertainty on its own or through its interaction with cash flow.
637., Dorothea Schäfer (DIW Berlin) and Oleksandr Talavera (DIW Berlin), "" (rev. 08/2006: 188 Kb, PDF)
Abstract: This paper investigates the link between the optimal level of non-financial firms' liquid assets and industry-level uncertainty. We develop a structural model of a firm's value maximization problem that predicts that as industry-level uncertainty increases the firm will increase its optimal level of liquidity. We test this hypothesis using a panel of German firms drawn from the Bundesbank's balance sheet database and show that greater uncertainty at the industry level causes firms to increase their cash holdings. The strength of these effects differs among subsamples of firms with different characteristics.
636., Dorothea Schäfer (DIW Berlin) and Oleksandr Talavera (DIW Berlin), "" (rev. 04/2007: 164 Kb, PDF)
Abstract: The paper adopts the methodology of the empirical finance literature to analyze whether liability maturity structure has an impact on firm performance. A comparison is made between two countries, the US and Germany, with different types of financial systems. We find that German firms that rely more heavily on short-term liabilities are likely to be more profitable. The link between liability maturity structure and profitability does not appear in the results from the US sample, which reflects the importance of institutional factors.
635. Mariano Kulish (Reserve Bank of Australia), "" (11/2005: 760 Kb, PDF)
Abstract: This paper studies two roles that long-term nominal interest rates can play in the conduct of monetary policy in a New Keynesian model. The first allows long-term rates to enter the reaction function of the monetary authority. The second considers the possibility of using long-term rates as instruments of policy. It is shown that in both cases a unique rational expectations equilibrium exists. Reacting to movements in long yields does not improve macroeconomic performance as measured by the loss function. However, long-term rates turn out to be better instruments when the relative concern of the monetary authority for inflation volatility is high.
634., Mustafa Caglayan (University of Sheffield), Andreas Stephan (European University Viadrina & DIW Berlin) and Oleksandr Talavera (DIW Berlin), "" (rev. 10/2006: 307 Kb, PDF; published, Economic Modelling, 25 (2008), 833-849)
Abstract: This paper investigates the link between the optimal level of non-financial firms' liquid assets and uncertainty. We develop a partial equilibrium model of precautionary demand for liquid assets showing that firms change their liquidity ratio in response to changes in either macroeconomic or idiosyncratic uncertainty. We test this proposition using a panel of non-financial US firms drawn from the COMPUSTAT quarterly database covering the period 1993-2002. The results indicate that firms increase their liquidity ratios when macroeconomic uncertainty or idiosyncratic uncertainty increases.
633. Zvi Safra (Tel Aviv University) and, "" (12/2005: 304 Kb, PDF)
Abstract: Rabin proved that a low level of risk aversion with respect to small gambles leads to a high, and absurd, level of risk aversion with respect to large gambles. Rabin's arguments strongly depend on expected utility theory, but we show that similar arguments apply to almost all non-expected utility theories and even to theories dealing with uncertainty. The set of restrictions needed in order to avoid such absurd behavior may suggest that the assumption of universality of preferences over final wealth is too strong.
632. J. Michael Finger (World Bank) and Andrei Zlate, "" (11/2005: 272 Kb, PDF; published, Journal of World Investment and Trade, 6:4, 2005)
Abstract: Maintaining an economically sensible trade policy is often a matter of managing pressures for exceptions--for protection for a particular industry. Good policy becomes a matter of managing interventions so as to strengthen the politics of openness and liberalization--of avoiding rather than of imposing such restrictions in the future. In the 1990s, antidumping measures emerged as the instrument of choice to accomplish this, despite the fact that they satisfy neither of these criteria. The economics of antidumping is ordinary protection; it considers the impact on the domestic interests that will benefit while excluding the domestic interests that will bear the costs. Its unfair trade rhetoric undercuts rather than supports a policy of openness. As to what would be better, the key issue in a domestic policy decision should be the impact on the domestic economy. Antidumping reform depends less on the good will of WTO delegates toward the "public interest" than on those business interests that are currently treated by trade law as bastards insisting that they be given the same standing as the law now recognizes for protection seekers.
631. Luigi Benfratello (Università di Torino), and Alessandro Sembenelli (Università di Torino), "" (rev. 06/2007: 328 Kb, PDF)
Abstract: This paper contains a detailed empirical investigation of the effect of local banking development on firms' innovative activities, using a rich data set on innovation at the firm level for a large number of Italian firms over the 1990s. There is evidence that banking development affects the probability of process innovation, particularly for small firms and for firms in high(er) tech sectors and in sectors more dependent upon external finance. There is also some evidence that banking development reduces the cash flow sensitivity of fixed investment spending, particularly for small firms, and that it increases the probability they will engage in R&D.
630., "" (11/2005: 144 Kb, PDF)
629., "" (rev. 10/2007: 553 Kb, PDF; forthcoming, Journal of Money, Credit and Banking)
Abstract: I construct an economy with heterogeneous agents that mimics the time-series behavior of the earnings distribution in the United States from 1963 to 2003. Agents face aggregate and idiosyncratic shocks and accumulate real and financial assets. I estimate the shocks that drive the model using data on income inequality, aggregate income, and measures of financial liberalization. I show how the model economy can replicate two empirical facts: the trend and cyclical behavior of household debt, and the diverging patterns in consumption and wealth inequality over time. While business cycle fluctuations can account for the short-run changes in household debt, its prolonged rise of the 1980s and the 1990s can be quantitatively explained only by the concurrent increase in income inequality.
628., "" (10/2005: 158 Kb, PDF)
Abstract: The monetary transmission mechanism describes how policy-induced changes in the nominal money stock or the short-term nominal interest rate impact on real variables such as aggregate output and employment. Specific channels of monetary transmission operate through the effects that monetary policy has on interest rates, exchange rates, equity and real estate prices, bank lending, and firm balance sheets. Recent research on the transmission mechanism seeks to understand how these channels work in the context of dynamic, stochastic, general equilibrium models.
627., "" (09/2005: 437 Kb, PDF)
Abstract: This paper looks at parking policy in dense urban districts ("downtown"), where spatial competition between parking garages is a key feature. The paper has four parts. The first looks at the "parking garage operator's problem". The second derives the equilibrium in the parking garage market when there is no on-street parking, compares the equilibrium to the social optimum, and examines parking policy in this context. The third considers how the presence of on-street parking alters the analysis, and the fourth extends the analysis to include mass transit.
626. Kevin E. Cahill (Tinari Economics Group), Michael D. Giandrea (U.S. Bureau of Labor Statistics) and, "" (09/2005: 372 Kb, PDF)
Abstract: This paper investigates whether permanent, one-time retirements are coming to an end just as the trend towards earlier and earlier retirements did nearly 20 years ago. We explore how common bridge jobs are among today's retirees, and how uncommon traditional retirements have become. Methods: Using data from the Health and Retirement Study (HRS), we explore the work histories and retirement patterns of a cohort of retirees aged 51 to 61 in 1992 over a ten-year time period in both a cross-sectional and longitudinal context. Bridge job determinants are examined using bivariate comparisons and a multinomial logistic regression model of the bridge job decision. Results: We find that one-half to two-thirds of the HRS respondents with full-time career jobs take on bridge jobs before exiting the labor force completely. We also find that bridge job behavior is most common among younger respondents, respondents without defined-benefit pension plans, and respondents at the lower- and upper-end of the wage distribution. Implications: The evidence suggests that changes in the retirement income landscape since the 1980s appear to be taking root. Going forward, traditional retirements will be the exception rather than the rule.
625. Arturo Galindo (Universidad de los Andes), and Andrew Weiss (Boston University), "" (10/2005: 256 Kb, PDF; forthcoming, Journal of Development Economics)
Abstract: Using firm-level panel data from twelve developing countries, we explore whether financial liberalization improves the efficiency with which investment funds are allocated. We develop a summary index of the efficiency of investment allocation that measures whether investment funds are going to firms with a higher marginal return to capital. We examine the relationship between this index and various measures of financial liberalization and find that liberalization increases the efficiency with which investment funds are allocated. This finding holds after various robustness checks and is consistent with firm-level evidence of a stronger association between investment and fundamentals after financial liberalization.
624. Amrita Bhattacharyya, "" (rev. 11/2005: 256 Kb, PDF)
Abstract: This paper analyzes advertising incentives and strategies in specialized markets, where consumers' decisions are dictated by experts. By analyzing the market-stealing and market-expanding aspects of advertising, this study shows that in a sub-game perfect equilibrium only some (and not all) firms may choose to advertise to consumers. From the welfare perspective, consumer advertising is socially optimal when advertising has only market-expanding effects. Furthermore, a simple game-theoretic model shows that when only some firms advertise to consumers, the crucial determinant of advertising is the number of advertisers. In particular, with increased competition from rival advertisers, each firm's advertising decreases. Modeling specific features of the U.S. prescription drugs market, the theoretical analysis suggests that the wide variation in direct-to-consumer advertising (DTCA) by U.S. pharmaceutical companies, both within and across drug classes, is due to differences in disease familiarity and heterogeneity in patients' types. Using annual, brand-level DTCA expenditure data for prescription drugs, the empirical results provide evidence of the negative impact of competition on advertising.
623., "" (rev. 08/2008: 204 Kb, PDF)
Abstract: The main purpose of this paper is to provide a critical overview of the recent empirical contributions that use cross-country data to study the effect of product market regulation and reform on a country’s macroeconomic performance. After a review of the theoretical literature and of relevant micro-econometric evidence, the paper discusses the main data and methodological issues related to empirical work on this topic. It then critically evaluates the cross-country evidence on the effect of product market regulation on markups, firm dynamics, investment, employment, innovation, productivity and output growth. A summary of what we learn from the econometric results concludes the paper.
622. Yan Chen (University of Michigan - Ann Arbor) and, "" (10/2004: 196 Kb, PDF; forthcoming, Journal of Economic Theory)
Abstract: We present an experimental study of three school choice mechanisms. The Boston mechanism is influential in practice, while the two alternative mechanisms, the Gale-Shapley and Top Trading Cycles mechanisms, have superior theoretical properties in terms of incentives and efficiency. Consistent with theory, this study indicates a high preference manipulation rate under the Boston mechanism. As a result, efficiency under Boston is significantly lower than that of the two competing mechanisms in the designed environment. However, contrary to theory, Gale-Shapley outperforms the Top Trading Cycles mechanism and generates the highest efficiency. Our results suggest that replacing the Boston mechanism with either the Gale-Shapley or Top Trading Cycles mechanism might significantly improve efficiency; however, the efficiency gains are likely to be more profound when parents are educated about the incentive compatibility of these mechanisms.
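The Gale-Shapley mechanism compared in this experiment is the student-proposing deferred acceptance algorithm. A minimal sketch follows; it is an illustration, not the authors' experimental implementation, and the student names, preference lists, and tie-breaking here are hypothetical (real school-choice systems add priorities and lottery tie-breaks).

```python
# Student-proposing deferred acceptance (Gale-Shapley), minimal sketch.
# student_prefs: {student: [schools, best first]}
# school_prefs:  {school: [students, best first]}
# capacities:    {school: number of seats}

def deferred_acceptance(student_prefs, school_prefs, capacities):
    # rank[s][st] = position of student st on school s's list (lower = better)
    rank = {s: {st: i for i, st in enumerate(prefs)}
            for s, prefs in school_prefs.items()}
    held = {s: [] for s in school_prefs}           # tentative acceptances
    next_choice = {st: 0 for st in student_prefs}  # pointer into each list
    free = list(student_prefs)                     # students yet to be placed
    while free:
        st = free.pop()
        if next_choice[st] >= len(student_prefs[st]):
            continue                               # list exhausted: unmatched
        school = student_prefs[st][next_choice[st]]
        next_choice[st] += 1
        held[school].append(st)
        # The school keeps its best applicants up to capacity, rejects the rest.
        held[school].sort(key=lambda x: rank[school][x])
        while len(held[school]) > capacities[school]:
            free.append(held[school].pop())
    return {st: sch for sch, sts in held.items() for st in sts}

# Hypothetical example: three students, school X has one seat, Y has two.
prefs = {"Ann": ["X", "Y"], "Bob": ["X", "Y"], "Cal": ["X", "Y"]}
schools = {"X": ["Bob", "Ann", "Cal"], "Y": ["Ann", "Bob", "Cal"]}
match = deferred_acceptance(prefs, schools, {"X": 1, "Y": 2})
```

Because acceptances are only tentative until everyone is placed, no student can gain by misreporting preferences; that strategy-proofness is the incentive property the abstract contrasts with the Boston mechanism.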
621. Alvin E. Roth (Harvard University), and, "" (05/2005: 492 Kb, PDF; published as "Efficient Kidney Exchange: Coincidence of Wants in Markets with Compatibility-Based Preferences", American Economic Review, 97(3): 828-851, 2007)
Abstract: Patients needing kidney transplants may have willing donors who cannot donate to them because of blood or tissue incompatibility. Incompatible patient-donor pairs can exchange donor kidneys with other such pairs. The situation facing such pairs resembles models of the "double coincidence of wants," and relatively few exchanges have been consummated by decentralized means. As the population of available patient-donor pairs grows, the frequency with which exchanges can be arranged will depend in part on how exchanges are organized. We study the potential frequency of exchanges as a function of the number of patient-donor pairs, and the size of the largest feasible exchange. Developing infrastructure to identify and perform 3-way as well as 2-way exchanges will have a substantial effect on the number of transplants, and will help the most vulnerable patients. Larger than 3-way exchanges have much smaller impact. Larger populations of patient-donor pairs increase the percentage of patients of all kinds who can find exchanges.
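Finding the largest set of simultaneous 2-way exchanges amounts to maximum-cardinality matching on a graph whose vertices are incompatible patient-donor pairs and whose edges join mutually compatible pairs. A toy sketch of that reduction follows; the simplified ABO blood-type rule and the brute-force search are illustrative assumptions, not the authors' method (production exchanges use optimal matching algorithms and richer compatibility data).

```python
# Toy 2-way kidney exchange: pairs are (patient_blood_type, donor_blood_type).

def can_give(donor, patient):
    """Simplified ABO rule: O gives to anyone; AB receives from anyone;
    otherwise donor and patient types must match."""
    return donor == "O" or patient == "AB" or donor == patient

def max_two_way_exchanges(pairs):
    """Maximum number of simultaneous 2-way exchanges, by brute force
    over the compatibility graph (fine for small pools)."""
    n = len(pairs)
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if can_give(pairs[i][1], pairs[j][0])    # donor i -> patient j
             and can_give(pairs[j][1], pairs[i][0])]  # donor j -> patient i
    def best(avail, es):
        if not es:
            return 0
        (i, j), rest = es[0], es[1:]
        skip = best(avail, rest)                      # leave this edge out
        if i in avail and j in avail:                 # or use it, if both free
            return max(skip, 1 + best(avail - {i, j}, rest))
        return skip
    return best(frozenset(range(n)), edges)
```

Allowing 3-way exchanges enlarges the feasible set from matchings to covers by cycles of length up to three, which is why the abstract finds a substantial gain from 2-way to 3-way and little beyond.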
620. Alvin E. Roth (Harvard University), and, "" (08/2004: 492 Kb, PDF; published, Journal of Economic Theory, 125: 151-188, 2005)
Abstract: In connection with an earlier paper on the exchange of live donor kidneys (Roth, Sönmez, and Ünver 2004) the authors entered into discussions with New England transplant surgeons and their colleagues in the transplant community, aimed at implementing a Kidney Exchange program. In the course of those discussions it became clear that a likely first step will be to implement pairwise exchanges, between just two patient-donor pairs, as these are logistically simpler than exchanges involving more than two pairs. Furthermore, the experience of these surgeons suggests to them that patient and surgeon preferences over kidneys should be 0-1, i.e. that patients and surgeons should be indifferent among kidneys from healthy donors whose kidneys are compatible with the patient. This is because, in the United States, transplants of compatible live kidneys have about equal graft survival probabilities, regardless of the closeness of tissue types between patient and donor (unless there is a rare perfect match). In the present paper we show that, although the pairwise constraint eliminates some potential exchanges, there is a wide class of constrained-efficient mechanisms that are strategy-proof when patient-donor pairs and surgeons have 0-1 preferences. This class of mechanisms includes deterministic mechanisms that would accommodate the kinds of priority setting that organ banks currently use for the allocation of cadaver organs, as well as stochastic mechanisms that allow considerations of distributive justice to be addressed.
619. Haluk Ergin (MIT) and, "" (09/2005: 288 Kb, PDF; forthcoming, Journal of Public Economics)
Abstract: Many school districts in the U.S. use a student assignment mechanism that we refer to as the Boston mechanism. Under this mechanism a student loses his priority at a school unless his parents rank it as their first choice. Therefore parents are given incentives to rank high on their list the schools where the student has a good chance of getting in. We characterize the Nash equilibria of the induced preference revelation game. An important policy implication of our result is that a transition from the Boston mechanism to the student-optimal stable mechanism would lead to unambiguous efficiency gains.
618. and, "" (09/2005: 288 Kb, PDF; published, International Economic Review, 51, 99-123, 2010)
Abstract: Mechanisms that rely on course bidding are widely used at business schools to allocate seats in oversubscribed courses. Bids play two key roles under these mechanisms: they are used to infer student preferences, and they are used to determine who has the bigger claim on course seats. We show that these two roles may easily conflict, and that preferences induced from bids may significantly differ from the true preferences. Therefore, while these mechanisms are promoted as market mechanisms, they do not necessarily yield market outcomes. The two conflicting roles of bids are a potential source of efficiency loss, part of which can be avoided simply by asking students to state their preferences in addition to bidding, thus separating the two roles of the bids. While there may be multiple market outcomes under this proposal, there is a market outcome that Pareto dominates any other market outcome.
617. Will Martin (World Bank) and, "" (09/2005: 184 Kb, PDF; updated version available as WP 731)
Abstract: The fact that raising taxes can increase taxed labor supply through income effects is frequently used to justify much lower measures of the marginal welfare cost of taxes, and greater public good provision, than indicated by traditional, compensated analyses. We confirm that this difference remains substantial with newer elasticity estimates, but show that either compensated or uncompensated measures of the marginal cost of funds can be used to evaluate the costs of taxation---and will provide the same result---as long as the income effects of both taxes and public good provision are incorporated in a consistent manner.
616. Kim C. Border (California Institute of Technology), Paolo Ghirardato (Università di Torino) and, "" (rev. 12/2005: 148 Kb, PDF)
Abstract: This note shows that if the space of events is sufficiently rich and the subjective probability function of each individual is non-atomic, then there is a sigma-algebra of events over which everyone will have the same probability function, and moreover, the range of these probabilities is the whole [0, 1] segment.
615. and Alex Stein (Benjamin N. Cardozo School of Law), "" (rev. 07/2006: 436 Kb, PDF)
Abstract: This is the first article to examine the effects of ambiguity aversion on the criminal process. Ambiguity aversion is a person's rational attitude towards probability's indeterminacy. When a person is averse towards such ambiguities, he increases the probability of the unfavorable outcome to reflect that fear. This observation is particularly true about a criminal defendant who faces a jury trial. Neither the defendant nor the prosecution knows whether the jury will convict the defendant. Their best estimation relies on a highly generalized probability that attaches to a broad category of similar cases. The prosecution, as a repeat player, is predominantly interested in the conviction rate that it achieves over a long series of cases. It therefore can depend on this general probability as an adequate predictor of this rate. The defendant only cares about his individual case and cannot depend on this general probability. From the defendant's perspective, his individual probability of conviction is ambiguous. The defendant consequently increases this probability to reflect his fear of that ambiguity. Because most defendants are ambiguity-averse, while the prosecution is not, the criminal process systematically involves and is thoroughly affected by asymmetric ambiguity-aversion.
614. Maria Arbatskaya (Emory University) and, "" (rev. 05/2011, PDF; forthcoming, International Journal of Industrial Organization)
Abstract: This paper compares the equilibrium outcomes in search markets with and without referrals. Although it seems clear that consumers would benefit from honest referrals, it is not at all clear whether firms would unilaterally provide information about competing offers, since such information could encourage the consumer to purchase the product elsewhere. In a model of a horizontally differentiated product and sequential consumer search, we show that valuable referrals can arise in equilibrium: a firm will give referrals to consumers whose ideal product is sufficiently far from the firm's offering. It is found that equilibrium prices are higher in markets with referrals. Although referrals can make consumers worse off, referrals lead to a Pareto improvement as long as the search cost is not too low relative to product heterogeneity. The effects of referral fees and third-party referrals are examined, and policy implications are drawn.
613., "" (rev. 09/2006: 214 Kb, PDF)
Abstract: This paper provides numerically trivial estimators for short panels of either binary choices or of linear models that suffer from confounded, nonignorable sample selection. The estimators allow for fixed effects, endogenous regressors, lagged dependent variables, and heteroskedastic errors with unknown distribution. The estimators, which converge at rate root n, are based on variants of the Honoré and Lewbel (2002) panel binary choice model and Lewbel's (2005) cross section sample selection model.
612., "" (06/2005: 238 Kb, PDF)
Abstract: This tutorial discusses a number of elementary Stata programming constructs and how they may be used to automate and robustify common data management, estimation and graphics tasks. Those used to the syntax of other statistical packages or programming languages must adopt a different mindset when working with Stata to take full advantage of its capabilities. Some of Stata's most useful commands for handling repetitive tasks (-forvalues-, -foreach-, -egen-, -local- and -matrix-) are commonly underutilized by users unacquainted with their power and ease of use. While relatively few users may develop ado-files for circulation to the user community, nearly all will benefit from learning the rudiments of the -program-, -syntax- and -return- statements when they are faced with the need to perform repetitive analyses. Worked examples making use of these commands are presented and discussed in the tutorial.
611., "" (10/2004: 710 Kb, PDF; published, Review of Economics and Statistics, 87:3, 556-568, 2005)
Abstract: This paper presents a new method to correct for measurement error in wage data and applies this method to an old question: how much downward wage flexibility is there in the U.S.? We apply standard methods developed by Bai and Perron (1998b) to identify structural breaks in time-series data. Applying these methods to wage histories allows us to identify when each person experienced a change in nominal wages. The length of the period of constant nominal wages is left unrestricted and is allowed to differ across individuals, as are the size and direction of the nominal wage change. We apply these methods to data from the Survey of Income and Program Participation. The evidence we provide indicates that the probability of a cut in nominal wages is substantially overstated in data that are not corrected for measurement error.
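The break-detection idea can be shown in miniature. The sketch below is a drastic simplification of the Bai-Perron procedure, assuming a single break in a mean-shift model with no formal tests: it picks the break date that minimizes the total sum of squared deviations from the two segment means. The wage series used is hypothetical.

```python
# Least-squares location of a single structural break in a series:
# choose the split point minimizing within-segment sum of squared errors.

def sse(xs):
    """Sum of squared deviations from the segment mean."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs)

def one_break(series):
    """Return (break_index k, total SSE) for the split series[:k] / series[k:]."""
    best_k, best_val = None, float("inf")
    for k in range(1, len(series)):
        val = sse(series[:k]) + sse(series[k:])
        if val < best_val:
            best_k, best_val = k, val
    return best_k, best_val

# Hypothetical noisy wage history: constant near 10, then a raise to near 12.
wages = [10.0, 10.1, 9.9, 10.0, 12.0, 12.1, 11.9, 12.0]
k, _ = one_break(wages)  # estimated date of the nominal wage change
```

Treating the wage as constant between estimated breaks is what filters out reporting noise; the full procedure additionally allows multiple breaks per history and tests for their number.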
610. Amrita Bhattacharyya, "" (rev. 11/2005: 266 Kb, PDF)
Abstract: This paper studies the usefulness of advertising to both consumers and experts in specialized markets like the prescription drugs, travel and real-estate markets, where consumers' purchasing decisions are influenced by experts (e.g., doctors, travel agents and real-estate agents). Inspired by the features of the prescription drugs market, the study shows that direct-to-consumer advertising (DTCA) does not substitute for advertising directed to physicians even when physician advertising is only persuasive in nature. Furthermore, the paper analyzes possible advertising equilibria in a two-firm setting and finds that it is possible to have a sub-game perfect, non-symmetric Nash equilibrium in which only one firm advertises to the consumers and the other firm becomes a free-rider when (i) the number of patients who are aware of the treatment is very low, and (ii) there are very few patients who insist on a particular drug. Otherwise, for familiar diseases a non-advertising equilibrium is most likely. Finally, consumer advertising can have welfare-improving implications depending on the disease types and patient characteristics.
609. Shihe Fu, "" (02/2005: 177 Kb, PDF)
Abstract: Existing studies have explored only one or two of the mechanisms through which human capital externalities percolate, and only at macrogeographic levels. This paper, using the 1990 Massachusetts census data, tests four mechanisms at the microgeographic level in the Boston metropolitan area labor market. We propose that individual workers can learn from their occupational and industrial peers in the same local labor market through four channels: depth of human capital stock, Marshallian labor market externalities, Jacobs labor market externalities, and thickness of the local labor market. We find that all types of human capital externalities are significant across census tracts and blocks. Marshallian labor market externalities and the effect of labor market thickness in terms of industry employment density are significant at the block level. The mechanisms of knowledge spillovers vary across industries and occupations. Different types of externalities attenuate at different speeds over geographic distances. The effect of labor market thickness -- in terms of industry employment density -- decays rapidly beyond 1.5 miles from the block centroid; the effect of human capital depth decays rapidly beyond three miles; Jacobs externalities decay very slowly, indicating a certain degree of urbanization economies. We conclude that knowledge spillovers are very localized within a microgeographic scope in cities, which we call "Smart Café Cities."
608. and Eren Inci, "" (02/2005: 402 Kb, PDF; published, Journal of Urban Economics, 60, 418-442, 2006)
Abstract: This paper presents a downtown parking model that integrates traffic congestion and saturated on-street parking. We assume that the stock of cars cruising for parking adds to traffic congestion. Two major results emerge from the model, one of which is robust. The robust one is that, whether or not the amount of on-street parking is optimal, it is efficient to raise the on-street parking fee to the point where cruising for parking is eliminated without parking becoming unsaturated. The other is that, if the parking fee is fixed at a sub-optimal level, it is second-best optimal to increase the amount of curbside allocated to parking until cruising for parking is eliminated without parking becoming unsaturated.
607., "" (01/2005: 444 Kb, PDF)
Abstract: This paper estimates a New Keynesian model to draw inferences about the behavior of the Federal Reserve's unobserved inflation target. The results indicate that the target rose from 1 1/4 percent in 1959 to over 8 percent in the mid-to-late 1970s before falling back below 2 1/2 percent in 2004. The results also provide some support for the hypothesis that over the entire postwar period, Federal Reserve policy has systematically translated short-run price pressures set off by supply-side shocks into more persistent movements in inflation itself, although considerable uncertainty remains about the true source of shifts in the inflation target.
606. Taiji Furusawa (Hitotsubashi University) and, "" (rev. 01/2005: 329 Kb, PDF; forthcoming, Japanese Economic Review)
Abstract: The paper investigates the network of bilateral free trade agreements (FTAs) in the context of a network formation game with transfers. Furusawa and Konishi (2002) show that without international transfers, countries with different industrialization levels may not sign an FTA, so that the global free trade network, in which every pair of countries signs an FTA, is not pairwise stable in general. We show in this paper that even if the world consists of fairly asymmetric countries, the global free trade network is pairwise stable when transfers between FTA signatories are allowed. Moreover, it is the unique pairwise stable network unless industrial commodities are highly substitutable for one another.
605., "" (12/2003: 277 Kb, PDF; published in Death and Dollars: The Role of Gifts and Bequests in America, Brookings Institution Press)
Abstract: Despite recent advances in data collection and the growing number of empirical studies that examine private intergenerational transfers, there still exist significant gaps in our knowledge. Who transfers what to whom, and why do they do it? I argue that some of these gaps could be filled by departing from the standard parent-child framework and concentrating instead on fathers, mothers, sons and daughters in a way that accounts for fundamental--and sometimes obvious--male-female differences in concerns and objectives in family life. Elementary sex differences in reproductive biology constitute the basic building blocks of studies of family behavior in many disciplines, but despite recent progress they get far less attention than they deserve in economic studies of the family. I explore, separately, the implications of three basic biological facts for intergenerational transfer behavior. The first is paternity uncertainty: how does it affect the incentives of fathers, mothers and of various grandparents to invest in children? The second is differing reproductive prospects of sons versus daughters: when are sons a better investment than daughters and vice versa? The third is conflict: how much acrimony might we expect to occur in families, and why? In examining these issues I also explore household survey data from the United States. This preliminary evidence is consistent with non-biological as well as biological explanations of behavior. Nonetheless, the biological focus confers two advantages, by generating falsifiable predictions and by illuminating new avenues for empirical work. There is enormous potential for further micro-data-based empirical work in this area.
604. Yingying Dong (California State University-Fullerton) and, "" (rev. 06/2012, PDF)
Abstract: This paper provides a few variants of a simple estimator for binary choice models with endogenous or mismeasured regressors, or with heteroskedastic errors. Unlike control function methods, which are generally only valid when endogenous regressors are continuous, the estimators proposed here can be used with limited, censored, continuous, or discrete endogenous regressors, and they allow for latent errors having heteroskedasticity of unknown form, including random coefficients. The variants of special regressor based estimators we provide are numerically trivial to implement. We illustrate these methods with an empirical application estimating migration probabilities within the US.
603. Marcus Berliant (Washington University in St. Louis) and, "" (05/2004: 342 Kb, PDF; published, Public Choice, 125, 129-149, 2005)
Abstract: Which issues are discussed by candidates in an election campaign? Why are some issues never discussed? Model tractability is lost quickly when dealing with these questions, partly because of the multidimensional voting inherent in models of multiple issues. Our model features two candidates for office who can talk about any subset of issues, allowing uncertainty both on the part of voters and candidates, and taking candidates to be office motivated. Candidates move first and simultaneously, announcing any positions they choose on any issues. To us, salience is simply the discussion of an issue in a campaign. If both candidates and voters are expected utility maximizers, we find salience results, in that candidates typically want to talk about everything (or they are indifferent between talking and nonsalience). Leaving the expected utility framework, we present an example using "Knightian uncertainty" or "maxmin expected utility with multiple priors" of Gilboa-Schmeidler to illustrate how robust nonsalience and salience of issues might be generated.
602., Andreas Stephan (European University Viadrina & DIW Berlin) and Oleksandr Talavera (DIW Berlin), "" (rev. 07/2007: PDF; published, Economic Inquiry, 47 (2009), 216-225)
Abstract: This paper investigates the link between the optimal level of non-financial firms' leverage and macroeconomic uncertainty. We develop a structural model of a firm's value maximization problem that predicts that as macroeconomic uncertainty increases the firm will decrease its optimal level of borrowing. We test this proposition using a panel of non-financial US firms drawn from the COMPUSTAT quarterly database covering the period 1991-2001. The estimates confirm that as macroeconomic uncertainty increases, firms decrease their levels of leverage. Furthermore, we demonstrate that our results are robust with respect to the inclusion of the index of leading indicators.
601. and J. Peter Neary (University College Dublin), "" (08/2004: PDF; published, Journal of International Economics, 2007, 71, 187-205)
Abstract: We show that the effects of tariff changes on welfare and import volume can be fully characterised by their effects on the generalised mean and variance of the tariff distribution. Using these tools, we derive new results for welfare- and market-access-improving tariff changes, which imply two "cones of liberalisation" in price space. Because welfare is negatively but import volume positively related to the generalised variance, the cones do not intersect, which poses a dilemma for trade policy reform. Finally, we show that generalised and trade-weighted moments are mutually proportional when the trade expenditure function is CES.
600., "" (09/2004: 96 Kb, PDF)
599. and Marc J. Melitz (Harvard University), "" (05/2004: 484 Kb, PDF; available)
Abstract: We develop a stochastic, general equilibrium, two-country model of trade and macroeconomic dynamics. Productivity differs across individual, monopolistically competitive firms in each country. Firms face a sunk entry cost in the domestic market and both fixed and per-unit export costs. Only relatively more productive firms export. Exogenous shocks to aggregate productivity and entry or trade costs induce firms to enter and exit both their domestic and export markets, thus altering the composition of consumption baskets across countries over time. In a world of flexible prices, our model generates endogenously persistent deviations from PPP that would not exist absent our microeconomic structure with heterogeneous firms. It provides an endogenous, microfounded explanation for a Harrod-Balassa-Samuelson effect in response to aggregate productivity differentials and deregulation. Finally, the model successfully matches several moments of U.S. and international business cycles.
598., "" (rev. 09/2004: 148 Kb, PDF; published, Stata Journal, 2005, 5:46-63)
Abstract: This paper discusses the use of Stata for the analysis of time series and panel data. The evolution of time-series capabilities in Stata is reviewed. Facilities for data management, graphics, and econometric analysis from both official Stata and the user community are discussed. A new routine to provide moving-window regression estimates, -rollreg-, is described, and its use is illustrated.
597. Eren Inci, "" (rev. 10/2006: 298 Kb, PDF; published, International Tax and Public Finance, 16, 797-821, 2009)
Abstract: This paper examines R&D tax incentives in oligopolistic markets. We characterize the conditions under which tax incentives reach the socially desirable level of firm-financed R&D spending. The outcome of the market depends not only on the level of technological spillover in the industry but also on the degree of strategic interaction between the firms. One major result emerges from the model: The socially desirable level of R&D investment is not necessarily reached by subsidizing R&D. When the sector spillover is sufficiently low, the government might want to tax R&D investments, and this result does not necessarily arise because firms are overinvesting in R&D. There are also cases in which an R&D tax is desirable even though firms are underinvesting in R&D compared with the first-best optimum. In practice, this theoretical finding calls for a lower sales tax combined with an R&D subsidy in oligopolistic industries with high technological spillovers, and a lower sales tax combined with an R&D tax in oligopolistic industries with low technological spillovers.
596. István Kónya and Hiroshi Ohashi (University of Tokyo), "" (06/2004: 198 Kb, PDF)
Abstract: The paper examines the evolution of consumption patterns in Organization for Economic Co-operation and Development (OECD) countries from 1985 to 1999. Estimation of demand function parameters uncovers consistent evidence that differences in consumption patterns have recently diminished between the countries. The paper further uncovers an empirical relationship indicating that increased bilateral trade and foreign direct investment (FDI) have contributed to the convergence of consumption patterns. The result is robust to the consideration of endogeneity in the trade and FDI variables.
595., "" (05/2004: 160 Kb, PDF)
Abstract: In models with heterogeneous agents, issues of distribution and redistribution jump to the fore, raising the question: which policies--monetary or fiscal--work most effectively in transferring income from one group to another? To begin answering this question, this note works through a series of examples using Townsend's turnpike model. Two basic results emerge. First, the zero lower bound on nominal interest rates often appears as an obstacle to redistribution by monetary means. Second, assumptions made about the government's ability to raise tax revenue without distortion and to discriminate between agent types in distributing that tax revenue play a large role in determining whether agents prefer to redistribute income by monetary or fiscal means.
594. and Maurizio Zanardi (Tilburg University), "" (02/2004: PDF; published, Public Choice, 141, 129-50, 2009)
Abstract: Much economic policy is deliberately shifted away from direct political processes to administrative processes - political pressure deflection. Pressure deflection poses a puzzle to standard political economy models which suggest that having policies to 'sell' is valuable to politicians. The puzzle is solved here by showing that incumbents will favor pressure deflection since it can deter viability of a challenger, essentially like entry deterrence. U.S. trade policy since 1934 provides a prime example, especially antidumping law and its evolution.
593. and Eric van Wincoop (University of Virginia), "" (04/2004: 728 Kb, PDF; published, Journal of Economic Literature, 42, 691-751, 2004)
Abstract: This paper surveys the measurement of trade costs - what we know, and what we don't know but may usefully attempt to find out. Partial and incomplete data on direct measures of costs go together with inference on implicit costs from trade flows and prices. Total trade costs in rich countries are large. The ad valorem tax equivalent is about 170% when pushing the data very hard. Poor countries face even higher trade costs. There is a lot of variation across countries and across goods within countries, much of which makes economic sense. Theory looms large in our survey, providing interpretation and perspective on the one hand and suggesting improvements for the future on the other hand. Some new results are presented to apply and interpret gravity theory properly and to handle aggregation appropriately.
592. Rasim Özcan, "" (04/2004: 324 Kb, PDF)
Abstract: This paper models an auction game in which two identical licenses for participating in an oligopolistic market are sold in a sequential auction. There is no incumbent. The auction for the first license is a standard first-price, sealed-bid type with an exogenously set reserve price, while the second uses the price of the first unit as the reserve price. This auction rule mimics the Turkish Global Mobile Telecommunications license auction of 2000. For some parameter values of the model, this auction setup generates revenue less than or equal to that from selling the monopoly right through a second-price, sealed-bid auction. However, for other parameter values, the seller may obtain higher revenues.
591. Steve Bond (Oxford University and IFS), Asli Leblebicioglu and, "" (rev. 08/2007: 376 Kb, PDF)
Abstract: We present evidence that an increase in investment as a share of GDP predicts a higher growth rate of output per worker, not only temporarily, but also in the steady state. These results are found using pooled annual data for a large panel of countries, using pooled data for non-overlapping five-year periods, or allowing for heterogeneity across countries in regression coefficients. They are robust to model specifications and estimation methods. The evidence that investment has a long-run effect on growth rates is consistent with the main implication of certain endogenous growth models, such as the AK model.
590., "" (02/2004: 137 Kb, PDF)
Abstract: This paper takes a simple, informal suggestion by Broome and a more explicit suggestion by Kamm for how to deal with asymmetric claims and shows how they can be interpreted as consistent with two different social welfare functions: the sum of square roots of individual utilities, and the product of utilities. These functions are then used to analyze more complicated situations; I show that the first yields more intuitive results, and a better compromise between efficiency and justice, than the second.
589., "" (rev. 09/2004: 236 Kb, PDF)
Abstract: If borrowing capacity of indebted households is tied to the value of their home, house prices should enter a correctly specified aggregate Euler equation for consumption. I develop a simple two-agent, dynamic general equilibrium model in which home (collateral) values affect debt capacity and consumption possibilities for a fraction of the households. I then derive an aggregate consumption Euler equation and estimate its structural parameters. The results provide robust support for housing prices as a driving force of consumption fluctuations.
588. Martin Browning (University of Copenhagen), Pierre-André Chiappori (University of Chicago) and, "" (rev. 09/2010 PDF)
Abstract: How much income would a woman living alone require to attain the same standard of living that she would have if she were married? What percentage of a married couple's expenditures are controlled by the husband? How much money does a couple save on consumption goods by living together versus living apart? We propose and estimate a collective model of household behavior that permits identification and estimation of concepts such as these. We model the household in terms of the utility functions of its members, a bargaining or social welfare function, and a consumption technology function. We demonstrate generic nonparametric identification of the model, and hence of a version of adult equivalence scales that we call "indifference scales," as well as consumption economies of scale, the household's resource sharing rule or members' bargaining power, and other related concepts.
587., "" (rev. 12/2010, PDF; published, Journal of Business and Economic Statistics, 2012, 30, 67-80)
Abstract: This paper proposes a new method of obtaining identification in mismeasured regressor models, triangular systems, and simultaneous equation systems. The method may be used in applications where other sources of identification such as instrumental variables or repeated measurements are not available. Associated estimators take the form of two stage least squares or generalized method of moments. Identification comes from a heteroskedastic covariance restriction that is shown to be a feature of many models of endogeneity or mismeasurement. Identification is also obtained for semiparametric partly linear models, and associated estimators are provided. Set identification bounds are derived for cases where point identifying assumptions fail to hold. An empirical application estimating Engel curves is provided.
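A common implementation of this kind of heteroskedasticity-based identification builds an instrument from first-stage residuals. The sketch below is purely illustrative, not the paper's own code: it simulates a triangular system with an endogenous regressor and no outside instrument, then applies ordinary two-stage least squares with the constructed instrument (Z - Zbar)*e2_hat. All variable names and parameter values are assumptions for the simulation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

# Illustrative triangular system (names and values are assumptions):
#   Y1 = b0 + b1*Z + g*Y2 + e1
#   Y2 = d0 + d1*Z + e2,  with cov(e1, e2) != 0, so OLS on Y2 is biased
Z = rng.normal(size=n)
u = rng.normal(size=n)                         # common factor causing endogeneity
e1 = u + rng.normal(size=n)
e2 = -u + np.exp(Z / 2) * rng.normal(size=n)   # e2 heteroskedastic in Z
Y2 = 1.0 + 0.5 * Z + e2
g_true = 0.7
Y1 = 2.0 + 1.0 * Z + g_true * Y2 + e1

X = np.column_stack([np.ones(n), Z])

def ols(y, M):
    return np.linalg.lstsq(M, y, rcond=None)[0]

# Step 1: first-stage residuals from regressing Y2 on the exogenous X.
e2_hat = Y2 - X @ ols(Y2, X)

# Step 2: constructed instrument; valid when E[Z*e1*e2] = 0 and
# e2 is heteroskedastic in Z (E[Z*e2^2] != 0).
iv = (Z - Z.mean()) * e2_hat

# Step 3: standard 2SLS, instruments [1, Z, iv] for regressors [1, Z, Y2].
R = np.column_stack([np.ones(n), Z, Y2])
W = np.column_stack([np.ones(n), Z, iv])
R_hat = W @ np.linalg.lstsq(W, R, rcond=None)[0]   # project regressors on instruments
beta_2sls = np.linalg.lstsq(R_hat, Y1, rcond=None)[0]

beta_ols = ols(Y1, R)
print("OLS gamma:", beta_ols[2], " 2SLS gamma:", beta_2sls[2], " truth:", g_true)
```

In this simulation the OLS coefficient on Y2 is pulled away from the truth by the common factor u, while the constructed-instrument 2SLS estimate recovers it without any conventional excluded instrument.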
586. Øivind A. Nilsen (University of Bergen), Kjell G. Salvanes (Norwegian School of Economics and Business Administration) and, "" (10/2003: 302 Kb, PDF)
Abstract: In this paper we analyze the pattern of employment adjustment using a rich panel of Norwegian plants. The data suggest that the frequency of episodes of zero net employment changes is inversely related to plant size. We develop and estimate a simple "q" model of labor demand, allowing for the presence of fixed, linear and convex components in adjustment costs. The econometric evidence supports the existence of purely fixed components, unrelated to plant size. As a result, the range of inaction is wider for smaller plants. The quadratic components of costs are also important. Finally, in most specifications both fixed and convex costs are higher for employment contractions.
585. and Oliver Linton (London School of Economics), "" (rev. 09/2006: 308 Kb, PDF; previously titled "Nonparametric Estimation of Homothetic and Homothetically Separable Functions")
Abstract: For vectors z and w and scalar v, let r(v,z,w) be a function that can be nonparametrically estimated consistently and asymptotically normally, such as a distribution, density, or conditional mean regression function. We provide consistent, asymptotically normal nonparametric estimators for the functions G and H, where r(v,z,w)=H[vG(z),w], and some related models. This framework encompasses homothetic and homothetically separable functions, and transformed partly additive models r(v,z,w)=h[v+g(z),w] for unknown functions g and h. Such models reduce the curse of dimensionality, provide a natural generalization of linear index models, and are widely used in utility, production, and cost function applications. We also provide an estimator of G that is oracle efficient, achieving the same performance as an estimator based on local least squares knowing H.
584. Raffaella Giacomini, Andreas Gottschling (Deutsche Bank), Christian Haefke (Universitat Pompeu Fabra) and Halbert White (University of California, San Diego), "" (09/2002: 780 Kb, PDF)
Abstract: We derive a new family of probability densities that have the property of closed-form integrability. This flexible family finds a variety of applications, of which we illustrate density forecasting from models of the AR-ARCH class for U.S. inflation. We find that the hypernormal distribution for the model's disturbances leads to better density forecasts than the ones produced under the assumption that the disturbances are Normal or Student's t.
583. Raffaella Giacomini, "" (06/2002: 380 Kb, PDF)
Abstract: This paper proposes tests for comparing the accuracy of density forecasts. The evaluation makes use of scoring rules, which are loss functions defined over the density forecast and the realizations of the variable. In particular, a logarithmic scoring rule leads to the development of asymptotic and bootstrap 'weighted likelihood ratio' tests. I conclude with an application to S&P500 daily returns, comparing the performance of density forecasts obtained from GARCH models with different distributional assumptions.
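Under the logarithmic scoring rule, comparing two density forecasts reduces to a Diebold-Mariano-style t-test on log-score differentials. The sketch below is a simplified, unweighted illustration on synthetic data, not the paper's S&P 500 application; the Student-t versus Normal comparison and all parameter values are assumptions.

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n = 4000
df = 5  # degrees of freedom of the true data-generating density (assumption)

# Illustrative "daily returns": drawn from a Student-t, so the t-based
# density forecast is correctly specified and the Normal one is not.
y = rng.standard_t(df, size=n)

def t_logpdf(x, v):
    # log density of a standard Student-t with v degrees of freedom
    return (math.lgamma((v + 1) / 2) - math.lgamma(v / 2)
            - 0.5 * math.log(v * math.pi) - (v + 1) / 2 * np.log1p(x**2 / v))

def norm_logpdf(x, var):
    # log density of a mean-zero Normal with variance var
    return -0.5 * math.log(2 * math.pi * var) - x**2 / (2 * var)

# Log scores of two fixed density forecasts (in practice these would be
# one-step-ahead forecasts from rolling GARCH estimates):
score_t = t_logpdf(y, df)
score_n = norm_logpdf(y, df / (df - 2))  # Normal with matched variance

# Log-score differential; a positive mean favors the t forecast. The
# studentized mean is asymptotically N(0,1) under equal predictive ability
# (a HAC variance estimator would be used with serially correlated scores).
d = score_t - score_n
t_stat = d.mean() / (d.std(ddof=1) / math.sqrt(n))
print(f"mean log-score difference = {d.mean():.4f}, t-stat = {t_stat:.2f}")
```

With heavy-tailed data the correctly specified t density earns a higher average log score, and the statistic signals rejection of equal predictive ability in favor of the t forecast.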
582. Raffaella Giacomini and Clive W.J. Granger (University of California, San Diego), "" (07/2002: 292 Kb, PDF; published, Journal of Econometrics, 2004, 118, 7-26)
Abstract: In this paper we compare the relative efficiency of different methods of forecasting the aggregate of spatially correlated variables. Small sample simulations confirm the asymptotic result that improved forecasting performance can be obtained by imposing a priori constraints on the amount of spatial correlation in the system. We also show that ignoring spatial correlation, even when it is weak, leads to highly inaccurate forecasts.
581., "" (10/2003: 218 Kb, PDF; published, International Journal of Forecasting, 2004, 20, 151-161)
580. Luisa Lambertini and Giovanni Peri (Università Bocconi and EUI), "" (04/2001: 425 Kb, PDF)
Abstract: In the transitional phase towards full economic integration, European countries have the possibility of re-shaping the continental geography of specialization. We use an Economic Geography model of industrial agglomeration to show how fiscal incentives can be critical in this phase. Unlike other work, we concentrate on the role of indirect taxation and sector-specific state aid, which remain important in the EU but little studied. While it is obvious that tax incentives could be used to attract some industries, it is not obvious that, in a general equilibrium analysis, such use of taxes is welfare-improving. In the paper, we show that the optimal policy is to levy asymmetric taxes on the two sectors only during the phase of intermediate transport costs, when such a measure induces welfare-improving agglomerations.
579. Luisa Lambertini, "" (11/2001: 294 Kb, PDF)
Abstract: We study investment in education in an overlapping generation model with altruism where credit market imperfections ration borrowing and cause persistent underinvestment in human capital. We characterize the optimal government policy and the policy that would emerge under majority voting in response to a technological change that raises the returns to education. The optimal government policy consists of a transfer of resources from future to current generations to finance investment in education and an increase in consumption for the current old generation. The policy chosen under majority voting accomplishes a generational transfer only if a majority of individuals are credit constrained. We consider two policy instruments: a labor income tax and an education subsidy. Current voters prefer a reduction in the current income tax rate to an education subsidy, as the former can finance an increase in their consumption.
578. Luisa Lambertini, "" (06/2003: 361 Kb, PDF)
Abstract: This paper tests empirically the strategic explanation of budget deficits suggested by Tabellini and Alesina and by Persson and Svensson. Tabellini and Alesina suggest that governments with different political orientations provide different public goods. The model predicts that: a) public good provision follows a political pattern; b) the incumbent that anticipates her defeat at the next election runs budget deficits to tie the hands of the future government. Persson and Svensson suggest that liberal governments prefer more public good provision than conservative ones. The model predicts that: a) the conservative (liberal) incumbent that anticipates her defeat at the next election runs budget deficits (surpluses); b) budget imbalances have a political color. Using U.S. and pooled data for sixteen OECD countries, we find little evidence that the incumbent's probability of being voted out of office explains budget deficits, that the provision of public goods follows a political pattern or that budget imbalances have a political color.
577. Luisa Lambertini, "" (10/2001: 361 Kb, PDF)
Abstract: The history of international lending shows that countries default on external debt when their economies experience a downturn. This paper presents a theoretical model of international lending that is consistent with this evidence. In this model, output is stochastic, international capital markets are incomplete because borrowing can only occur via issuing bonds, and borrowers cannot commit to repay loans. Self-fulfilling and solvency debt crises arise when borrowers experience low output realizations; moreover, when lenders are atomistic, self-fulfilling crises may arise for debt levels that do not cause default when lenders are non-atomistic. Alternative reforms to eliminate liquidity crises are analyzed. An international lender of last resort can eliminate liquidity crises provided it implements full bailouts via purchasing debt at its market price.
576. Luisa Lambertini and José Tavares (Universidade Nova, Lisbon), "" (08/2003: 171 Kb, PDF)
Abstract: We study monetary and exchange-rate policies around successful and unsuccessful fiscal adjustments and find that successful adjustments are preceded by large nominal exchange rate depreciations, whereas unsuccessful adjustments are preceded by appreciations. Pre-adjustment depreciation is a significant and quantitatively important predictor of the success of adjustment. Our results are robust to the inclusion of other determinants of the success of adjustment and to the definition of the depreciation period, of the persistence of the adjustment, and of the exchange rate. Monetary policy does not affect the success of fiscal adjustments. This result is confirmed when the sample is divided into countries that follow a fixed exchange rate policy and those that do not: for both cases it is exchange rate depreciations that affect the likelihood of success. Our results suggest that the adoption of a single currency will make successful fiscal adjustments more difficult to attain within EMU.
575. Avinash Dixit (Princeton) and Luisa Lambertini, "" (06/2003: 218 Kb, PDF format; forthcoming, American Economic Review)
Abstract: We consider monetary-fiscal interactions when the monetary authority is more conservative than the fiscal. With both policies discretionary, (1) Nash equilibrium yields lower output and higher price than the ideal points of both authorities, (2) of the two leadership possibilities, fiscal leadership is generally better. With fiscal discretion, monetary commitment yields the same outcome as discretionary monetary leadership for all realizations of shocks. But fiscal commitment is not similarly negated by monetary discretion. Second-best outcomes require either joint commitment, or identical targets for both authorities -- output socially optimal and price level appropriately conservative -- or complete separation of tasks.
574. Francesco Zanetti, "" (rev. 05/2004: 287 Kb, PDF format)
Abstract: This paper investigates to what extent a New Keynesian, monetary model with the addition of a microfounded, non-Walrasian labor market based on union bargaining is able to replicate key aspects of the European business cycle. The presence of a representative union makes it possible to explain two features of the cycle. First, it generates an endogenous mechanism which produces persistent responses of the economy to both supply and demand shocks. Second, labor unionization causes a lower elasticity of marginal costs to output. This leads to lower inflation volatility. The model can replicate the negative correlation between productivity shocks and employment in the data. Model simulations show that the unionized framework reproduces European business cycle statistics better than a model with a competitive labor market.
573. Marina Pavan, "" (rev. 05/2005: 446 Kb, PDF format)
Abstract: This paper estimates a dynamic model of durable and non-durable consumption choice and default behavior in an economy where risky borrowing is allowed and bankruptcy protection is regulated by law. I exploit the substantial difference in the generosity of bankruptcy exemptions across the U.S. states to assess the role of durable goods as both informal collateral for unsecured debt and self-insurance against bad shocks to earnings. The model accounts for the equilibrium effects of bankruptcy protection on both consumer saving behavior and the credit market. In addition to providing reasonable estimates of the discount rate and risk aversion, I find that the generosity of bankruptcy protection does change both the incentives and the ability of households to accumulate durable wealth. The more generous the bankruptcy regulation, the lower the net durable wealth held by households in the first half of the lifecycle before retirement. Minimizing the default rate alone would require removing bankruptcy protection entirely; the optimal level of exemption, however, is positive but low.
572. Raffaella Giacomini and Halbert White (University of California, San Diego), "" (04/2003: 657 Kb, PDF format)
Abstract: We argue that the current framework for predictive ability testing (e.g., West, 1996) is not necessarily useful for real-time forecast selection, i.e., for assessing which of two competing forecasting methods will perform better in the future. We propose an alternative framework for out-of-sample comparison of predictive ability which delivers more practically relevant conclusions. Our approach is based on inference about conditional expectations of forecasts and forecast errors rather than the unconditional expectations that are the focus of the existing literature. We capture important determinants of forecast performance that are neglected in the existing literature by evaluating what we call the forecasting method (the model and the parameter estimation procedure), rather than just the forecasting model. Compared to previous approaches, our tests are valid under more general data assumptions (heterogeneity rather than stationarity) and estimation methods, and they can handle comparison of both nested and non-nested models, which is not currently possible. To illustrate the usefulness of the proposed tests, we compare the forecast performance of three leading parameter-reduction methods for macroeconomic forecasting using a large number of predictors: a sequential model selection approach, the "diffusion indexes" approach of Stock and Watson (2002), and the use of Bayesian shrinkage estimators.
571. Raffaella Giacomini and Ivana Komunjer (California Institute of Technology), "" (06/2003: 582 Kb, PDF format)
Abstract: This paper proposes a method for comparing and combining conditional quantile forecasts in an out-of-sample framework. We construct a Conditional Quantile Forecast Encompassing (CQFE) test as a Wald-type test of superior predictive ability. Rejection of CQFE provides a basis for combination of conditional quantile forecasts. Two central features of our implementation of the principle of encompassing are, first, the use of the 'tick' loss function and, second, a conditional, rather than unconditional approach to out-of-sample evaluation. Some of the advantages of the conditional approach are that it allows the forecasts to be generated by using general estimation procedures and that it is applicable when the forecasts are based on both nested and non-nested models. The test is also relatively easy to implement using standard GMM techniques. An empirical application to Value-at-Risk evaluation illustrates the usefulness of our method.
570., M. Utku Ünver (Koç University), "" (rev. 09/2004: 346 Kb, PDF format; published, Journal of Economic Theory, 129, 57-80 (2006))
Abstract: It is known that in two-sided many-to-many matching problems, pairwise-stable matchings may not be immune to group deviations, unlike in many-to-one matching problems (Blair 1988). In this paper, we show that pairwise stability is equivalent to credible group stability when one side has responsive preferences and the other side has categorywise-responsive preferences. A credibly group-stable matching is immune to any "executable" group deviations with an appropriate definition of executability. Under the same preference restriction, we also show the equivalence between the set of pairwise-stable matchings and the set of matchings generated by coalition-proof Nash equilibria of an appropriately defined strategic-form game.
569. Taiji Furusawa (Hitotsubashi University) and, "" (08/2003: 124 Kb, PDF format; published, Economics Letters, 2004)
Abstract: We propose a decomposition of social welfare when consumers' preferences are described by quasi-linear utility functions. In our decomposition, social welfare is expressed as the sum of consumers' gross utilities and trade surplus of non-numeraire goods, whose consumption enters utility functions non-linearly. This decomposition is useful especially when we assess the impact of trade liberalization on individual countries.
568., Carsten Kowalczyk (Fletcher School, Tufts) and Tomas Sjöström, "" (07/2003: 276 Kb, PDF format)
Abstract: All countries would agree to immediate global free trade if countries were compensated for any terms-of-trade losses with transfers from countries whose terms-of-trade improve, and if customs unions were required to have no effects on non-member countries. Global free trade with transfers is in the core of a Kemp-Wan-Grinols customs union game.
567., "" (06/2003: 236 Kb, PDF format; published, Journal of Policy Analysis and Management, 2005)
Abstract: Some public policies aimed at integrating welfare recipients into the world of work are predicated on the premise that getting welfare recipients to work will change their beliefs about how they will be treated in the labor market. This paper explores the rationale for these policies and concludes that a plausible argument can be made on the basis of concepts developed by social psychologists and by economists. The prediction that work affects beliefs is tested using a unique data set that allows us to estimate the causal effect. We find that exogenous increases in work induced by an experimental tax credit led to the predicted changes in self-efficacy.
566., Mustafa Caglayan (University of Sheffield) and Neslihan Ozkan (University of Bristol), "" (06/2003: 128 Kb, PDF format)
Abstract: In this paper we hypothesize that greater macroeconomic uncertainty would cause firms to increasingly turn to their suppliers as a source of finance, making greater use of trade credit. We test this hypothesis using a panel of non-financial firms drawn from the annual COMPUSTAT database and show that an increase in macroeconomic uncertainty leads to a narrowing of the cross-sectional distribution of firms' trade credit-to-sales ratios.
565., Talan B. İsçan (Dalhousie University) and Alessandro Rebucci (International Monetary Fund), "" (06/2003: 716 Kb, PDF format)
Abstract: We develop a two-country, dynamic general equilibrium model that links cross-country differences in net foreign asset and consumption dynamics to differences in discount factors and steady-state levels of productivity. We compare the results of the model to those of VARs for the G3 economies. We identify country-specific productivity shocks by assuming that productivity does not respond contemporaneously to other variables in these VARs. We identify global productivity shocks by estimating the VARs in common trend representation after testing for and imposing model-based, long-run cointegration restrictions. We then compare the model's predictions for net foreign asset and consumption dynamics in response to productivity shocks with the estimated VAR impulse responses. We find that the two sources of heterogeneity we consider go some way toward reconciling the consumption smoothing hypothesis with the data and explaining variations in net foreign asset and consumption dynamics across countries.
563. Alon Harel (Hebrew University School of Law), Zvi Safra (Tel Aviv University) and, "" (06/2003: 161 Kb, PDF format)
562. Ingela Alger and Régis Renault (GEMMA, Université de Caen), "" (rev. 11/2004: 210 Kb, PDF format)
Abstract: We study the implications of honesty when it requires pre-commitment. Within a two-period hidden information problem, an agent learns his match with the assigned task in period 2 and, if honest, reveals it to the principal if he has committed to it. The principal may offer a menu of contracts to screen ethics. Both honest and dishonest agents are willing to misrepresent their ethics. The principal and dishonest agents benefit from an increased likelihood of honesty as long as honesty is likely enough. Honest agents always profit from ethics uncertainty if a good match is likely. This is also true if dishonesty is likely enough, in which case an honest agent receives the same surplus as a dishonest one.
561., Mustafa Caglayan (University of Sheffield) and Neslihan Ozkan (University of Bristol), "" (rev. 07/2007: PDF format; published, Manchester School, 81:2, 202-225, 2013)
Abstract: In this paper we re-examine commercial banks' lending behavior taking into account changes in the stance of monetary policy in conjunction with changes in financial sector uncertainty. Using a very large data set covering all banks in the US between 1986 and 2000, we show that financial uncertainty has an important and significant role in the monetary policy transmission mechanism that varies across bank categories and the strength of banks' balance sheets. We find support for the existence of a bank lending channel among US banks.
560. and Sheldon Danziger (University of Michigan), "" (05/2003: 2.5 Mb, PDF format; published, Review of Income and Wealth, 2005)
Abstract: This paper tracks distributional changes over the last quarter of the twentieth century. We focus on three conceptually distinct distributions: the distribution of wages, the distribution of annual earnings and the distribution of total family income adjusted for family size. We show that all three distributions became less equal during the last half of the 1970s and the 1980s. This was, however, not the case during the 1990s. Wage inequality stabilized, earnings inequality declined and family income inequality actually continued to rise. We decompose changes in family income inequality over the last quarter century and show that roughly half of the increase is accounted for by changes in the distribution of earnings. This suggests that further research on family income inequality should pay as much attention to changes in the distribution of other income sources as to factors affecting the labor market.
559., "" (05/2003: 206 Kb, PDF format)
558. John Barkoulas (University of Tennessee) and, "" (05/2003: 4 Mb, PDF format; published, Journal of Forecasting, 2006, 25, 291-302)
Abstract: Several studies have tested for long-range dependence in macroeconomic and financial time series but very few have assessed the usefulness of long-memory models as forecast generating mechanisms. This study tests for fractional differencing in the U.S. monetary indices (simple sum and divisia) and compares the out-of-sample fractional forecasts to benchmark forecasts. The long-memory parameter is estimated using Robinson's Gaussian semiparametric and multivariate log-periodogram methods. The evidence amply suggests that the monetary series possess a fractional order between one and two. Fractional out-of-sample forecasts are consistently more accurate (with the exception of the M3 series) than benchmark autoregressive forecasts but the forecasting gains are not generally statistically significant. In terms of forecast encompassing, the fractional model encompasses the autoregressive model for the divisia series but neither model encompasses the other for the simple sum series.
557. and Susanne M. Schennach (University of Chicago), "" (rev. 03/2005: 348 Kb, PDF format; forthcoming, Journal of Econometrics)
Abstract: We consider estimation of means of functions that are scaled by an unknown density, or equivalently, integrals of conditional expectations. The "ordered data" estimator we provide is root n consistent, asymptotically normal, and is numerically extremely simple, involving little more than ordering the data and summing the results. No sample size dependent smoothing is required. A similarly simple estimator is provided for the limiting variance. The proofs include new limiting distribution results for functions of nearest neighbor spacings. Potential applications include endogenous binary choice, willingness to pay, selection, and treatment models.
556., "" (rev. 09/2006: 389 Kb, PDF format)
Abstract: This paper considers identification and estimation of the marginal effect of a mismeasured binary regressor in a nonparametric regression, or the conditional average effect of a binary treatment or policy on some outcome where treatment may be misclassified. Misclassification probabilities and the true probability of treatment are also nonparametrically identified. Misclassification occurs when treatment is measured with error, that is, some units are reported to have received treatment when they actually have not, and vice versa. The identifying assumption is existence of a variable that affects the decision to treat (the binary regressor) and satisfies some conditional independence assumptions. This variable could be an instrument or a second mismeasure of treatment. Estimation is either ordinary GMM or a proposed local GMM, which can be used generally to nonparametrically estimate functions based on conditional moment restrictions. An empirical application estimating returns to schooling is provided.
555. Helen Connolly (Northeastern University), and Katherine Newman (Kennedy School, Harvard University), "" (rev. 07/2005: 1.4 Mb, PDF format)
554. and Raoul Minetti (Michigan State University), "" (rev. 12/2003: 330 Kb, PDF format)
Abstract: We examine the international transmission of business cycles in a two-country economy in which credit contracts are imperfectly enforceable. In our economy, foreign lenders differ from domestic lenders in their ability to recover value from borrowers' assets and, therefore, to protect themselves against contractual non-enforceability. The relative importance of domestic and foreign credit frictions changes over the cycle. This induces entrepreneurs to adjust their debt exposure and allocation of collateral between domestic and foreign lenders in response to exogenous productivity shocks. We show that such a model can explain positive output correlations across countries. The model also appears consistent with econometric evidence on asset values and domestic and foreign debt exposure.
553. Richard Arnott and, "" (01/2003)
Abstract: This paper is concerned with the application of microeconomic theory to resource allocation in the transportation sector. The basic questions it addresses are how transportation should be priced and how capacity should be determined. Three models, the traditional highway pricing and investment model, the highway bottleneck model, and the traditional model of mass transit pricing and service, are employed to develop principles common to all transportation modes. This paper has been published as a chapter with the same title in Randolph W. Hall, ed., Handbook of Transportation Science, 2nd ed., Kluwer Academic Publishers, 2002.
552., Mustafa Caglayan (University of Sheffield), Neslihan Ozkan (University of Bristol) and Oleksandr Talavera (DIW Berlin), "" (rev. 12/2005: 211 Kb, PDF format; published, Review of Financial Economics, 15, 289-304, 2006)
Abstract: This paper empirically investigates whether changes in macroeconomic volatility affect the efficient allocation of non-financial firms' liquid assets. We argue that higher uncertainty will hamper managers' ability to accurately predict firm-specific information and induce them to implement similar cash management policies. Conversely, when the macroeconomic environment becomes more tranquil, each manager will have the latitude to behave more idiosyncratically as she can adjust liquid assets based on the specific requirements of the firm, bringing about a more efficient allocation of liquid assets. Our empirical analysis provides support for these predictions.
551., "" (12/2002: 77 Kb, PDF format; published,Journal of Urban Economics, 2003, 54, 511-530)
Abstract: This paper considers the second-best policy problem that arises when auto travel is priced below its marginal cost and there is a substitute mass transit mode. We analyze the problem by combining a model of a rail line based on Kraus and Yoshida (JUE, 2002) with the highway bottleneck model. The model involves a transit authority which optimizes, in addition to the fare, two dimensions of transit capacity. These are (1) the number of train units serving the route and (2) the capacity of an individual train unit. Under a very weak condition, second-best optimality involves expanding both dimensions of transit capacity. The larger of the effects is on train size.
550. Gary Burtless (Brookings Institution) and, "" (12/2002: 134 Kb, Adobe Acrobat format)
Abstract: One of the most important labor market developments of the last century was the sustained trend toward earlier retirement among American men. This trend came to at least a temporary halt in the mid-1980s. Since then, male participation rates at older ages have stabilized or even increased slightly, while older women's participation rates have begun rising dramatically. The dominant factor driving the trend toward earlier male retirement was a long-term increase in economic wealth, which permitted workers to enjoy rising living standards even as they spent a growing percentage of their lives outside the workforce. The expansion of Social Security and of employer-sponsored pension plans, and the introduction of mandatory retirement rules, also encouraged earlier retirement over much of the last century. In recent years, many public policies and private institutions that encourage early retirement have been modified. Mandatory retirement was outlawed in most jobs. Social Security is no longer growing more generous, and coverage under company pension plans is no longer rising. In addition, both Social Security and private pensions have become more "age neutral," meaning that they provide either weaker incentives or no incentives to retire at particular ages, such as age 62 or age 65. Finally, the scheduled rise in Social Security's normal retirement age over the next two decades will encourage later retirements, at least modestly. An open question is whether further changes are needed. Given that labor force growth is slowing and Americans are enjoying longer and healthier lives, efforts to encourage people to work longer could have important benefits both for individuals and for the national economy. On the other hand, rising labor productivity, increased work effort, and more saving during the pre-retirement years could allow Americans to enjoy higher living standards even if they choose to spend more years in retirement. 
If opinion polls are to be believed, most workers favor preserving options for early retirement, even if it means heavier contributions to the retirement system during their working careers.
549. Alberto Alesina (Harvard University), Silvia Ardagna (Wellesley College), Giuseppe Nicoletti (OECD) and, "" (12/2002: 391 Kb, Adobe Acrobat format)
Abstract: One commonly held view about the difference between continental European countries and other OECD economies, especially the United States, is that the heavy regulation of the former reduces their growth. Using newly assembled data on regulation in several sectors of many OECD countries, we provide substantial and robust evidence that various measures of regulation in the product markets, concerning in particular entry barriers, are negatively related to investment. The policy implication of our analysis is clear: regulatory reforms that liberalize entry are very likely to spur investment.
548. Taiji Furusawa (Yokohama National University) and, "" (rev. 11/2006: 296 Kb, Adobe Acrobat format; published, Journal of International Economics, 72, 310-335, 2007)
Abstract: The paper examines the formation of free trade agreements (FTAs) as a network formation game. We consider a general n-country model in which (possibly asymmetric) countries trade differentiated industrial commodities as well as a numeraire good. We show that if all countries are symmetric, the complete FTA network is pairwise stable and it is the unique stable network if industrial commodities are not highly substitutable. We also compare FTAs and customs unions (CUs) as to which of these two regimes facilitates global trade liberalization, emphasizing the fact that unlike in the case of a CU, each signatory of an FTA can have a new FTA without consent of other member countries.
547. István Kónya, "" (11/2002: 290 Kb, Adobe Acrobat format)
Abstract: The paper presents a model that analyses the role of cultural differences in international trade. The decision to study foreign cultures and languages is incorporated into a simple trade model, which captures some basic properties of cultural and language barriers. First, cultural costs differ from physical ones in that they can be eliminated by learning. Second, learning a language has economies of scale, thus smaller countries tend to invest more into learning. Third, learning decisions within one country impose an externality on trading partners, since learning by one party makes communication easier also for the other one. This implies that learning decisions are in general inefficient, and the paper derives the connection between the equilibrium and optimal outcomes. Finally, because of the substitutability of learning among countries, a policy where a country discourages learning - "cultural protectionism" - can be rationalized. Under certain conditions, such a policy can improve global welfare, not just that of the protecting country.
546. István Kónya, "" (11/2002: 179 Kb, Adobe Acrobat format)
Abstract: The paper analyzes the population dynamics of a country that has two ethnic groups, a minority and a majority. Minority members can choose whether or not to assimilate into the majority. If the minority is small, the long-run outcome is full assimilation. When the minority is large, the unique long-run equilibrium is the initial situation. For intermediate minority sizes multiple equilibria are possible, including the full- and no-assimilation ones. The paper also solves the social planner's problem, which indicates that the country can end up in an inefficient steady state. Even if the steady state is the optimal one, the equilibrium path will be suboptimal. Two extensions to the basic model are considered. The first one allows for a comparison between a multicultural and a "melting pot" society. The second one introduces population growth and studies the interplay between exogenous and endogenous changes in the minority's size.
545., Mark E. Schaffer (Heriot-Watt University) and Steven Stillman (Motu Economic Policy Research), "" (rev. 02/2003: 384 Kb, Adobe Acrobat format; published, Stata Journal, 3(1), 1-31, 2003)
Abstract: We discuss instrumental variables (IV) estimation in the broader context of the generalized method of moments (GMM), and describe an extended IV estimation routine that provides GMM estimates as well as additional diagnostic tests. Stand-alone test procedures for heteroskedasticity, overidentification, and endogeneity in the IV context are also described.
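The two-stage least squares (2SLS) special case of the GMM estimation framework the abstract describes can be sketched with synthetic data. This is a generic numpy illustration, not the authors' Stata routine; all variable names and the data-generating process are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
z = rng.normal(size=(n, 2))                  # two excluded instruments
u = rng.normal(size=n)                       # structural error
# x is endogenous: it loads on the instruments and on the error u
x = z @ np.array([1.0, 0.5]) + 0.8 * u + rng.normal(size=n)
y = 2.0 * x + u                              # true coefficient on x is 2.0

Z = np.column_stack([np.ones(n), z])         # instrument matrix (with constant)
X = np.column_stack([np.ones(n), x])         # regressor matrix (with constant)

# Stage 1: fitted values of X from the instruments; Stage 2: regress y on them.
P = Z @ np.linalg.solve(Z.T @ Z, Z.T @ X)
beta_2sls = np.linalg.solve(P.T @ X, P.T @ y)

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)  # biased benchmark
```

With these parameter values, `beta_2sls[1]` recovers the true coefficient of 2.0 up to sampling error, while `beta_ols[1]` is biased upward because x is positively correlated with u.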
544. Michael T. Belongia (University of Mississippi) and, "" (10/2002: 200 Kb, Adobe Acrobat format)
Abstract: Traditionally, the effects of monetary policy actions on output are thought to be transmitted via monetary or credit channels. Real business cycle theory, by contrast, highlights the role of real price changes as a source of revisions in spending and production decisions. Motivated by the desire to focus on the effects of price changes in the monetary transmission mechanism, this paper incorporates a direct measure of the real own-price of money into an estimated vector autoregression and a calibrated real business cycle model. Consistent with this new view of the monetary transmission mechanism, both approaches reveal that movements in the own-price of money are strongly related to movements in output.
543. Helen Connolly (Northeastern University) and, "" (10/2002: 382 Kb, PDF; forthcoming, Labour: Review of Labour Economics and Industrial Relations)
Abstract: This paper examines the decisions faced by workers who must decide whether or not to accept job offers that may differ from their current jobs in both initial wages and wage growth. Introducing heterogeneity in slopes as well as intercepts in a search framework has the immediate implication that agents must look ahead to determine the expected reward of higher wage growth. Under the assumption of homogeneous wage growth, it is hard to explain why an agent would accept a job that required a wage cut. Allowing for heterogeneity in wage growth opens the possibility that these are transitions to jobs in which the initial cut in pay is more than offset by expected future wage growth. Such transitions that lead to an increase in the value function should be classified as transitions to "better" jobs even if they entail an initial cut in wages. We use the SIPP to estimate a structural model, which allows us to recover parameters of the wage offer distributions and the probability of involuntary terminations. With these primitives we are able to classify job changes according to their impact on the value function. We find that roughly a third of transitions that entail a wage cut are actually transitions to "better" jobs when the classification is based on the change in the value function.
542., "" (rev. 08/2004: 382 Kb, Adobe Acrobat format)
Abstract: I develop a general equilibrium model with sticky prices, credit constraints, nominal loans and asset (house) prices. Changes in house prices modify agents' borrowing capacity through collateral value; changes in nominal prices affect real repayments through debt deflation. Monetary shocks move asset and nominal prices in the same direction, and are amplified and propagated over time. The "financial accelerator" is not constant across shocks: nominal debt stabilises supply shocks, making the economy less volatile when the central bank controls the interest rate. I discuss the role of equity, debt indexation and household and firm leverage in the propagation mechanism. Finally, I find that monetary policy should not respond to asset prices as a means of reducing output and inflation volatility.
541.and Raoul Minetti (Michigan State University and LSE), "" (rev. 08/2003: 198 Kb, Adobe Acrobat format)
Abstract: This paper tests a credit channel of monetary policy (especially a bank-lending channel) in the housing market. We argue that the relevance of the credit channel depends on the structural features of the housing finance system, in particular efficiency and institutional organisation. We employ a structural VAR approach to analyse this issue in four European housing markets (Finland, Germany, Norway and the UK). Our results support the existence of a broad credit channel and, in some contexts, of a bank-lending channel. More importantly, the findings show across countries a clear-cut relationship between presence of a credit (bank-lending) channel, efficiency of housing finance and type of institutions active in mortgage provision.
540., "" (10/2002: 612 Kb, Adobe Acrobat format)
Abstract: A structural vector autoregressive approach identifies the main macroeconomic factors behind fluctuations in house prices in France, Germany, Italy, Spain, Sweden and the UK. Quarterly GDP, house prices, money, inflation and interest rates are characterised by a multivariate process driven by supply, nominal, monetary, inflationary and demand shocks. Tight money leads to a fall in real house prices; house price responses are hump-shaped; the responses of house prices and, to a lesser extent, GDP to a monetary shock can be partly justified by the different housing and financial market institutions across countries; transitory shocks drive a significant part of short-run house price fluctuations.
539.and François Ortalo-Magnè (LSE), "" (10/2002: 273 Kb, Adobe Acrobat format)
Abstract: This paper investigates the benefits of allowing households to compensate the portfolio distortion due to their housing consumption through investments in housing price derivatives. Focusing on the London market, we show that a major loss from over-investment in housing is that households are forced to hold a very risky portfolio. However, the strong performance of the London housing market means that little is lost in terms of expected returns. Even households with limited wealth are better off owning their home rather than renting and investing in financial assets, as long as they are willing to face the financial risk involved. In this context, access to housing price derivatives would benefit most poor homeowners looking to limit their risk exposure. It would also benefit wealthier investors looking for the high returns provided by housing investments without the costs of direct ownership of properties. Comparisons with French, Swedish and US data provide a broader perspective on our findings.
538.and Raoul Minetti (Michigan State University and LSE), "" (10/2002: 260 Kb, Adobe Acrobat format)
Abstract: We analyse the impact of financial liberalisation on the link between monetary policy and house prices. We present a simple model of a small open economy subject to credit constraints. The model shows that the higher the degree of financial liberalisation, the stronger is the impact of interest rate shocks on house prices. We then use vector autoregressions to study the role of monetary policy shocks in house price fluctuations in Finland, Sweden and UK, characterised by financial liberalisation episodes over the last twenty years. We find that the response of house prices to interest rate surprises is bigger and more persistent in periods characterised by more liberalised financial markets.
537. Arturo Galindo (Inter-American Development Bank) and, "" (09/2002: 235 Kb, Adobe Acrobat format)
Abstract: This paper summarizes and discusses new evidence on the nature, extent, evolution and consequences of financing constraints in Latin America. The countries covered are: Argentina, Colombia, Costa Rica, Ecuador, Mexico, and Uruguay. All the new contributions share the characteristics of being based on microdata. Most of the data sources are firms' balance sheets. For Argentina information on debt contracts and credit history is also available, while for Costa Rica personal information on entrepreneurs was also collected. Some of the papers investigate the determinants of firms' financing choices, and the consequences of access or debt composition on performance. Other papers attempt to assess the severity of financing constraints, by focusing on firms' investment choices. All the papers (but one) were part of the project "Determinants and Consequences of Financial Constraints Facing Firms in Latin America and the Caribbean," financed by the IADB. However, other recent micro-econometric contributions are discussed as well. The results suggest that access to credit (and its cost) depends not only upon favorable balance sheet characteristics, but also upon the closeness of the relationship between firms and banks as well as credit history. Access to long-term loans and to loans denominated in foreign currency is positively related to the size and tangibility of firms' assets and negatively related to measures of country risk. Moreover, firms that have foreign participation appear to be less financially constrained in their investment decisions. The same is true for firms that are associated with business groups. On the whole, it appears that financial liberalization tends to relax financial constraints for firms that were previously constrained, while financial crises tighten them. However, firms that have more access to external sources of finance via, for instance, exports or ownership links, appear to suffer less in the post-crisis period.
The paper concludes with a discussion of the policy implications of these results.
536., "" (08/2002: 655 Kb, Adobe Acrobat format)
Abstract: In a New Keynesian model, technology and cost-push shocks compete as terms that stochastically shift the Phillips curve. A version of this model, estimated via maximum likelihood, points to the cost-push shock as far more important than the technology shock in explaining the behavior of output, inflation, and interest rates in the postwar United States data. These results weaken the links between the current generation of New Keynesian models and the real business cycle models from which they were originally derived; they also suggest that Federal Reserve officials have often faced difficult trade-offs in conducting monetary policy.
535., "" (rev. 10/2003: 453 Kb, Adobe Acrobat format)
Abstract: This paper shows that many estimators of thresholds in ordered response models exist, because binary choice location estimators can be converted into threshold estimators. A new threshold estimator is proposed that is consistent under more general conditions. An extension to random thresholds is provided.
534., "" (rev. 03/2004: 480 Kb, Adobe Acrobat format)
Abstract: This paper presents a new method to correct for measurement error in wage data and applies this method to address an old question. How much downward wage flexibility is there in the U.S.? We apply standard methods developed by Bai and Perron (1998b) to identify structural breaks in time series data. Applying these methods to wage histories allows us to identify when each person experienced a change in nominal wages. The length of the period of constant nominal wages is left unrestricted and is allowed to differ across individuals, as is the size and direction of the nominal wage change. We apply these methods to data from the Survey of Income and Program Participation. The evidence we provide indicates that the probability of a cut in nominal wages is substantially overstated in data that is not corrected for measurement error.
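The underlying idea of the wage-history correction — choosing the date at which a piecewise-constant series shifts — can be illustrated with a minimal single-break search that minimizes the sum of squared residuals over candidate break dates. This is a toy sketch with invented data, not the Bai and Perron multiple-break procedure the paper applies:

```python
import numpy as np

def single_break(y, trim=2):
    """Return the index minimizing the SSR of a one-break mean-shift model."""
    n = len(y)
    best_k, best_ssr = None, np.inf
    for k in range(trim, n - trim):           # candidate break dates
        left, right = y[:k], y[k:]
        ssr = ((left - left.mean())**2).sum() + ((right - right.mean())**2).sum()
        if ssr < best_ssr:
            best_k, best_ssr = k, ssr
    return best_k

wage = np.array([10.0] * 8 + [10.5] * 8)      # nominal wage: one raise at t = 8
print(single_break(wage))                     # → 8
```

In this noiseless example the SSR is exactly zero at the true break date, so the search recovers it; with measurement error, the same criterion picks the best-fitting date rather than treating every wage wiggle as a genuine change.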
533. Charles T. Carlstrom (Federal Reserve Bank of Cleveland), Timothy S. Fuerst (Bowling Green State University) and, "" (rev. 09/2004: 232 Kb, Adobe Acrobat format)
Abstract: What inflation rate should the central bank target? We address determinacy issues related to this question in a two-sector model in which prices can differ in equilibrium. We assume that the degree of nominal price stickiness can vary across the sectors and that labor is immobile. The contribution of this paper is to demonstrate that a modified Taylor Principle holds in this environment. If the central bank elects to target sector one, and if it responds with a coefficient greater than unity to price movements in this sector, then this policy rule will ensure determinacy across all sectors. The results of this paper have at least two implications. First, the equilibrium-determinacy criterion does not imply a preference to any particular measure of inflation. Second, since the Taylor Principle applies at the sectoral level, there is no need for a Taylor Principle at the aggregate level.
532. Richard Arnott and Petia Petrova, "" (04/2002: 265 Kb, Adobe Acrobat format)
Abstract: Consider an atomistic developer who decides when and at what density to develop his land, under a property value tax system characterized by three time-invariant tax rates: the tax rate on pre-development land value; the tax rate on post-development residual site value; and the tax rate on structure value. Arnott (2002) identified the subset of property value tax systems which are neutral. This paper investigates the relative efficiency of four idealized, non-neutral property value tax systems: (i) the Canadian property tax system; (ii) the simple property tax system; (iii) the residual site value tax system; (iv) the differentiated property tax system, under the assumption of a constant rental growth rate.
531., "" (01/2002: 109 Kb, Adobe Acrobat format; published in Programming Languages and Systems in Computational Economics and Finance, Soren S. Nielsen, ed.)
Abstract: We describe the Stata software environment, and illustrate how it may be profitably employed for applied economic research. Stata stands between "point and click" statistical packages and matrix languages in terms of extensibility and ease of use, and provides web-accessible features that enhance collaborative research and instruction.
530., "" (04/2002: 28 Kb, Adobe Acrobat format)
Abstract: Kydland and Prescott first identified the inflationary bias that results when a central bank does not precommit to a monetary policy rule. Subsequent work, published over the past twenty five years, demonstrates that this inflationary bias can be minimized by appointing central bankers whose preferences or incentives differ systematically from those of society as a whole. Subsequent work also shows that central bankers may optimally choose to maintain their reputations as inflation fighters. The literature to date, however, says remarkably little about how central bankers establish their reputations, or build credibility for their policies, in the first place.
529. David Heyd (Hebrew University) and, "" (03/2002: 298 Kb, Adobe Acrobat format)
Abstract: The article suggests a formal model of a two-tier voting procedure, which unlike traditional voting systems does not presuppose that every vote counts the same. In deciding a particular issue, voters are called in the first round to assign categories of their fellow-citizens with differential voting power (or weights) according to the special position or concern individuals are perceived as having with regard to that issue. In the second stage, voters vote on the issue itself according to their substantive view, and their votes are counted in the light of the differential weights assigned in the first round. We analyze the formal and the philosophical reasons that support the model.
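The second-stage count the abstract describes reduces to a weighted tally: each substantive vote is counted with the weight its voter category received in the first round. A minimal sketch, with categories, weights, and votes all invented for illustration:

```python
# First round: categories of voters receive issue-specific weights.
weights = {"residents": 2.0, "commuters": 1.0, "others": 0.5}

# Second round: substantive votes, counted with the first-round weights.
votes = [("residents", "yes"), ("commuters", "no"),
         ("commuters", "no"), ("others", "yes")]

tally = {"yes": 0.0, "no": 0.0}
for category, choice in votes:
    tally[choice] += weights[category]

winner = max(tally, key=tally.get)  # here "yes" wins 2.5 to 2.0
```

Note how the weighting can reverse a raw head count: "no" has two votes to "yes"'s two here, but the heavier "residents" weight tips the weighted outcome.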
528. István Kónya, "" (10/2001: 190 Kb, Adobe Acrobat format)
Abstract: The paper presents a model that can explain how regional differences emerge in a country as a consequence of foreign trade. The model is based on the widely used increasing returns/transportation costs framework. In addition to the conventional elements, heterogeneous households and imperfect labor mobility are added. The results indicate that for a small economy international trade leads to human capital reallocation, and thus more regional inequality than without labor heterogeneity. Even small migration flows can lead to large inequalities in per capita incomes, if the most skilled workers move. The model also sheds some light on the relative importance of fundamentals and historical factors.
527., "" (03/2002: 272 Kb, Adobe Acrobat format)
Abstract: I show that endogenously persistent output dynamics are not a puzzle for the standard sticky-price model once openness of the economy is taken into account. I make this point using a two-country, monetary model of macroeconomic interdependence under internationally incomplete asset markets with stationary net foreign asset dynamics. If asset markets are incomplete, price stickiness generates endogenous persistence in the cross-country GDP differential by introducing persistence in the dynamics of the relative price differential between the two economies. This results in the dependence of the current real GDP differential on its past value, as well as on the stock of net foreign assets accumulated in the previous period. The elasticity of the current GDP differential to its past value is sizable for standard parameter values, implying a quantitatively significant persistence effect through this channel. Endogenous persistence yields hump-shaped responses of GDP to productivity and monetary policy shocks.
526. Maria Laura Parisi (Università di Padova),, and Alessandro Sembenelli (Università di Torino), "" (02/2002: 217 Kb, Adobe Acrobat format)
Abstract: By exploiting a rich firm level database, this paper presents novel empirical evidence on the impact that the introduction of process and product innovations exerts on productivity, as well as on the role played by R&D and fixed capital investment in enhancing the likelihood of introducing innovations at the firm level. Our results imply that process innovation has a large impact on productivity. Furthermore, R&D spending is strongly positively associated with the probability of introducing a new product, whereas fixed capital spending increases the likelihood of introducing a process innovation. The latter result reflects the fact that new technologies are frequently embodied in new capital goods. However, the effect of fixed investment on the probability of introducing a process innovation is magnified by R&D spending internal to the firm. This implies that, in our sample, R&D affects productivity growth by facilitating the absorption of new technologies.