Research | Jiang Han: Using the difference-in-differences design with panel data in international business research: progress, potential issues, and practical suggestions
The collaborative study by Professor Jiang Han of the School of Management and Economics, The Chinese University of Hong Kong, Shenzhen, titled "Using the Difference-in-Differences Design with Panel Data in International Business Research: Progress, Potential Issues, and Practical Suggestions," reviews how the difference-in-differences (DID) methodology has been applied with panel data in international business research. The study shows that the DID design is gaining popularity in the field, but it also warns that violations of its underlying assumptions can produce severe estimation biases.
The paper has recently been accepted and published in the Journal of International Business Studies, a top international journal in the field.

Author

Jiang Han
Assistant Dean (Education), Assistant Professor, School of Management and Economics, CUHK-Shenzhen
Research Area
Social networks and social capital; corporate governance and strategic leadership; entrepreneurship and venture financing; global supply chains and cross-border M&A
Abstract
Difference-in-differences (DID) is a popular design of causal inference in social science research. The rationale is to identify a particular event (the treatment) that influences some subjects (the treated group) but not others (the control group) and compare the differences in an outcome of interest between the treated and control groups before and after the treatment. This design constitutes a quasi-experiment that helps rule out confounding effects. As such, when applied with panel data (or, time-series cross-sectional data), DID design offers robust causal inference in terms of Granger causality between the time series of the treatment and that of the outcome of interest.
This rationale of DID design aligns well with the nature of international business (hereafter, IB) research. Firms' IB practices are jointly shaped by location-specific factors in different countries and by firm- and industry-specific factors in global value networks. IB research therefore often seeks to unveil the cause-and-effect implications (or treatment effects) of critical events that shift these factors at the country, industry, or firm level for firms' global strategies and international operations. Moreover, because the impacts of such critical events in IB contexts often last for years and affect different firms at different time points, IB research commonly relies on panel data to capture these treatment effects. In this regard, DID design offers a superior empirical solution in line with these key features of IB research. Correspondingly, ever more studies using DID design with panel data have been published over the past decade. For example, among the 131 empirical studies published in the Journal of International Business Studies between 2020 and 2022, a sizable portion (18 studies, 13.0%) applied DID design with panel data as at least part of their empirical methodologies.
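The DID logic described in the abstract can be illustrated with a minimal simulation. The sketch below is not from the paper: the group sizes, baseline levels, trend, and effect size are all made-up assumptions. It shows the basic two-group, two-period estimator, i.e., the treated group's before/after change minus the control group's before/after change, which differences out both the groups' level gap and the common time trend.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 500           # hypothetical number of firms per group (assumption)
true_effect = 2.0 # hypothetical treatment effect (assumption)

# Control firms: some baseline outcome plus a common time trend of +1.0.
ctrl_pre = rng.normal(10.0, 1.0, n)
ctrl_post = ctrl_pre + 1.0 + rng.normal(0.0, 1.0, n)

# Treated firms: a different baseline level (allowed under DID), the same
# common trend, plus the treatment effect after the event.
treat_pre = rng.normal(12.0, 1.0, n)
treat_post = treat_pre + 1.0 + true_effect + rng.normal(0.0, 1.0, n)

# DID estimator: difference of the before/after differences across groups.
did = (treat_post.mean() - treat_pre.mean()) - (ctrl_post.mean() - ctrl_pre.mean())
print(f"DID estimate: {did:.2f}")
```

Note that the naive before/after change in the treated group alone would conflate the treatment effect with the common time trend; subtracting the control group's change is what makes the comparison quasi-experimental, provided the parallel-trends assumption holds.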