WASEDA RILAS JOURNAL

classification of these children as ADHD patients is a social construct that was only made possible through social negotiation between doctors, health care officials, parents, and other related sectors of society. Under different (more permissive or more negligent, depending on your value judgment) circumstances, some (not all) ADHD children would have been regarded as just noisy, curious kids, typical for their age. When we start to look at a “normal” child-development process with stricter standards for behavioral characteristics, we are able to invent a new category of disease and its associated risks.

When the risk itself is a social construct to some extent, an effective measure to forestall it must contain social restructuring and negotiations, and this is easier said than done. One cannot simply add some intricate control system to the risky technological system and expect it to take care of all the associated risks. Social institutions and negotiation protocols must be installed to continuously redefine acceptable levels of risk, and legitimate methods for dealing with risk must be determined. Further, the question remains as to whether it is really worth trying to manage risks rather than simply dropping the system entirely. As we will see in the following sections, these issues do arise in risk management in action, despite official efforts to limit the issues to purely technical matters. The pragmatic considerations are unavoidable.

Risk itself can be indeterminate in the sense that we do not have sufficient statistics to determine economically rational policies that deal with potential disasters. The necessary probabilities might be missing due either to our epistemic limits or to the intrinsically uncertain nature of disasters. Under these circumstances, we can no longer rely on the clear-cut answers of cost-benefit analysis. We have to combine a number of heterogeneous factors, including explicit value judgment, to arrive at a social consensus. Politics in its original sense of “science of government” is badly needed here.

I shall examine two cases of risk management in South Korea, one related to American beef and the other to nuclear power plants. The first is a case of dramatic failure, and the other of continued interactions between multiple interest groups. I shall argue that successful and democratically justified risk management of science-technology systems should start from their essentially Janus-faced nature. The unpredictability of technological consequences requires us to be even more cautious when it comes to calculating the expected utilities and costs of a complex technological system. The implications of the precautionary principle will also be discussed, especially in the context of national and international responses to global warming and climate change.

2. The American Beef Crisis and the Failure of the Deficit Model

In 2008, a very unusual public movement took place in South Korea. A public demonstration was organized to protest the unclear dealings of the Korean government with regard to the import of American beef. The risk of BSE (Bovine Spongiform Encephalopathy, a.k.a. “mad cow disease”) was called into question by civil sectors with the support of scientific experts, and the Korean government, backed by their own scientists, vehemently denied the existence of any risk.
The public demonstration (accompanied by candles distributed by volunteers) mobilized a hundred thousand citizens across the country (see Figure 1). The so-called “American beef crisis” and its associated Candlelight Demonstration have multiple aspects and actors, including international risk-governance regimes such as the OIE (Office International des Epizooties), and to discuss them all here is not my intention. I am going to focus on one prominent event involving the dramatic failure of an old-fashioned model of science communication: the deficit model.

Korean government officials and their supporting scientists emphatically asserted from the very start that American beef was “safe from BSE.” Later, when challenged, they retreated to the more defensible position that the health risk of American beef associated with BSE was extremely low, low enough to be rationally ignored. This transition from the simplistic language of “safe” to the more technical-sounding “low risk” reveals how government officials conceptualized the general public in their risk management. They tended to think that the public was quite volatile (they were right about this, but in the wrong way), and likely to be irrationally worried about scientifically negligible risks. They also seemed to believe that more information would bring more trouble rather than more rational behavior. Administrators seemed to think that if the public were provided with detailed