
Causes and Explanations: A Structural-Model Approach. Part II: Explanations


arXiv:cs/0208034v3 [cs.AI] 19 Nov 2005

Joseph Y. Halpern
Cornell University, Dept. of Computer Science
Ithaca, NY 14853
halpern@cs.cornell.edu
http://www.cs.cornell.edu/home/halpern

Judea Pearl
Dept. of Computer Science
University of California, Los Angeles
Los Angeles, CA 90095
judea@cs.ucla.edu
http://www.cs.ucla.edu/~judea

February 1, 2008

Abstract

We propose new definitions of (causal) explanation, using structural equations to model counterfactuals. The definition is based on the notion of actual cause, as defined and motivated in a companion paper. Essentially, an explanation is a fact that is not known for certain but, if found to be true, would constitute an actual cause of the fact to be explained, regardless of the agent's initial uncertainty. We show that the definition handles well a number of problematic examples from the literature.

1 Introduction

The automatic generation of adequate explanations is a task essential in planning, diagnosis, and natural language processing. A system doing inference must be able to explain its findings and recommendations to evoke a user's confidence. However, getting a good definition of explanation is a notoriously difficult problem, which has been studied for years. (See [Chajewska and Halpern 1997; Gärdenfors 1988; Hempel 1965; Pearl 1988; Salmon 1989] and the references therein for an introduction to and discussion of the issues.)

In Part I of this paper [Halpern and Pearl 2004] we give a definition of actual causality using structural equations. Here we show how the ideas behind that definition can be used to give an elegant definition of (causal) explanation that deals well with many of the problematic examples discussed in the literature. The basic idea is that an explanation is a fact that is not known for certain but, if found to be true, would constitute an actual cause of the explanandum (the fact to be explained), regardless of the agent's initial uncertainty.

Note that our definition involves causality and knowledge. Following Gärdenfors [1988], we take the notion of explanation to be relative to an agent's epistemic state. What counts as an explanation for one agent may not count as an explanation for another agent. We also follow Gärdenfors in allowing explanations to include (fragments of) a causal model. To borrow an example from Gärdenfors, an agent seeking an explanation of why Mr. Johansson has been taken ill with lung cancer will not consider the fact that he worked for years in asbestos manufacturing a part of an explanation if he already knew this fact. For such an agent, an explanation of Mr. Johansson's illness may include a causal model describing the connection between asbestos fibers and lung cancer. On the other hand, for someone who already knows the causal model but does not know that Mr. Johansson worked in asbestos manufacturing, the explanation would involve Mr. Johansson's employment but would not mention the causal model. Where our definition differs from that of Gärdenfors (and all others in the literature) is in the way it is formulated in terms of the underlying notions of knowledge, causal models, actual causation, and counterfactuals. The definition is not based on probabilistic dependence, "statistical relevance", or logical implication, and thus is able to deal with the directionality inherent in common explanations. While it seems reasonable to say "the height of the flagpole explains the length of the shadow", it would sound awkward if one were to explain the former with the latter. Our definition is able to capture this distinction easily.

The best judges of the adequacy of an approach are the intuitive appeal of the definitions and how well it deals with examples; we believe that this paper shows that our approach fares well on both counts.

The remainder of the paper is organized as follows. In Section 2, we review the basic definitions of causal models based on structural equations, which are the basis for


our definitions of causality and explanation, and then review the definition of causality from the companion paper. We have tried to include enough detail here to make the paper self-contained, but we encourage the reader to consult the companion paper for more motivation and discussion. In Section 3 we give the basic definition of explanation, under the assumption that the causal model is known. In Section 4, probability is added to the picture, to give notions of partial explanation and explanatory power. The general definition, which dispenses with the assumption that the causal model is known, is discussed in Section 5. We conclude in Section 6 with some discussion.

2 Causal Models and the Definition of Actual Causality: A Review

To make this paper self-contained, this section repeats material from the companion paper; we review the basic definitions of causal models, as defined in terms of structural equations, the syntax and semantics of a language for reasoning about causality and explanations, and the definition of actual cause.

2.1 Causal models

The use of structural equations as a model for causal relationships is standard in the social sciences, and seems to go back to the work of Sewall Wright in the 1920s (see [Goldberger 1972] for a discussion); the particular framework that we use here is due to Pearl [1995], and is further developed in [Pearl 2000].

The basic picture is that the world is described by random variables, some of which may have a causal influence on others. This influence is modeled by a set of structural equations. Each equation represents a distinct mechanism (or law) in the world, which may be modified (by external actions) without altering the others. In practice, it seems useful to split the random variables into two sets: the exogenous variables, whose values are determined by factors outside the model, and the endogenous variables, whose values are ultimately determined by the exogenous variables. It is these endogenous variables whose values are described by the structural equations.

Formally, a signature S is a tuple (U, V, R), where U is a set of exogenous variables, V is a finite set of endogenous variables, and R associates with every variable Y ∈ U ∪ V a nonempty set R(Y) of possible values for Y (that is, the set of values over which Y ranges). A causal (or structural) model over signature S is a tuple M = (S, F), where F associates with each variable X ∈ V a function denoted F_X such that F_X : (×_{U∈U} R(U)) × (×_{Y∈V−{X}} R(Y)) → R(X). F_X determines the value of X given the values of all the other variables in U ∪ V. For example, if F_X(Y, Z, U) = Y + U (which we usually write as X = Y + U), then if Y = 3 and U = 2, then X = 5, regardless of how Z is set.
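As an illustration, the running equation X = Y + U can be coded directly. The dictionary-based representation and the `solve` helper below are our own sketch, not notation from the paper; the fixed-point iteration terminates with the unique solution because the model is recursive (acyclic).

```python
# Minimal sketch of a causal model M = (S, F): exogenous U, endogenous X, Y, Z,
# with X = Y + U as in the text (Y's own mechanism sets it to 3 here).
exogenous = {"U"}
endogenous = {"X", "Y", "Z"}

# One structural equation F_X per endogenous variable, mapping the values of
# all the other variables to a value for X.
F = {
    "X": lambda vals: vals["Y"] + vals["U"],   # X = Y + U (ignores Z)
    "Y": lambda vals: 3,
    "Z": lambda vals: 0,
}

def solve(F, context):
    """Unique solution of a recursive model, by fixed-point iteration."""
    vals = dict(context)
    for name in endogenous:
        vals.setdefault(name, 0)               # arbitrary initial values
    for _ in range(len(F) + 1):                # enough passes for an acyclic model
        for name, f in F.items():
            vals[name] = f(vals)
    return vals

print(solve(F, {"U": 2})["X"])   # 5: X = Y + U = 3 + 2, regardless of Z
```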


These equations can be thought of as representing processes (or mechanisms) by which values are assigned to variables. Hence, like physical laws, they support a counterfactual interpretation. For example, the equation above claims that, in the context U = u, if Y were 4, then X would be u + 4 (which we write as (M, u) |= [Y←4](X = u + 4)), regardless of what values X, Y, and Z actually take in the real world.

The counterfactual interpretation and the causal asymmetry associated with the structural equations are best seen when we consider external interventions (or spontaneous changes), under which some equations in F are modified. An equation such as x = F_X(u⃗, y) should be thought of as saying that in a context where the exogenous variables have values u⃗, if Y were set to y by some means (not specified in the model), then X would take on the value x, as dictated by F_X. The same does not hold when we intervene directly on X; such an intervention amounts to assigning a value to X by external means, thus overruling the assignment specified by F_X.

For those more comfortable with thinking of counterfactuals in terms of possible worlds, this modification of equations may be given a simple "closest world" interpretation: the solution of the equations obtained by replacing the equation for Y with the equation Y = y, while leaving all other equations unaltered, gives the closest "world" to the actual world where Y = y.

We can describe (some salient features of) a causal model M using a causal network. This is a graph with nodes corresponding to the random variables in V and an edge from a node labeled X to one labeled Y if F_Y depends on the value of X. Intuitively, variables can have a causal effect only on their descendants in the causal network; if Y is not a descendant of X, then a change in the value of X has no effect on the value of Y. Causal networks are similar in spirit to Lewis's neuron diagrams [1973], but there are significant differences as well (see Part I for a discussion). In this paper, we restrict attention to what are called recursive (or acyclic) equations; these are ones that can be described with a causal network that is a directed acyclic graph (that is, a graph that has no cycle of edges). It should be clear that if M is a recursive causal model, then there is always a unique solution to the equations in M, given a setting u⃗ for the variables in U. Such a setting is called a context. Contexts will play the role of possible worlds when we model uncertainty. For future reference, a pair (M, u⃗) consisting of a causal model and a context is called a situation.

Example 2.1: Suppose that two arsonists drop lit matches in different parts of a dry forest, and both cause trees to start burning. Consider two scenarios. In the first, called the disjunctive scenario, either match by itself suffices to burn down the whole forest. That is, even if only one match were lit, the forest would burn down. In the second scenario, called the conjunctive scenario, both matches are necessary to burn down the forest; if only one match were lit, the fire would die down before the forest was consumed.

We can describe the essential structure of these two scenarios using a causal model with four variables:

• an exogenous variable U that determines, among other things, the motivation and state of mind of the arsonists. For simplicity, assume that R(U) = {u00, u10, u01, u11}; if U = u_ij, then the first arsonist intends to start a fire iff i = 1 and the second arsonist intends to start a fire iff j = 1. In both scenarios U = u11.

• endogenous variables ML1 and ML2, each either 0 or 1, where MLi = 0 if arsonist i doesn't drop the match and MLi = 1 if he does, for i = 1, 2.

• an endogenous variable FB for forest burns down, with values 0 (it doesn't) and 1 (it does).

Both scenarios have the same causal network (see Figure 1); they differ only in the equation for FB. For the disjunctive scenario we have F_FB(u, 1, 1) = F_FB(u, 0, 1) = F_FB(u, 1, 0) = 1 and F_FB(u, 0, 0) = 0 (where u ∈ R(U)); for the conjunctive scenario we have F_FB(u, 1, 1) = 1 and F_FB(u, 0, 0) = F_FB(u, 1, 0) = F_FB(u, 0, 1) = 0.

In general, we expect that the causal model for reasoning about forest fires would involve many other variables; in particular, variables for other potential causes of forest fires, such as lightning and unattended campfires. Here we focus on that part of the causal model that involves forest fires started by arsonists. Since for causality we assume that all the relevant facts are given, we can assume here that it is known that there were no unattended campfires and there was no lightning, which makes it safe to ignore that portion of the causal model.

Denote by M1 and M2 the (portion of the) causal models associated with the disjunctive and conjunctive scenarios, respectively. The causal network for the relevant portion of M1 and M2 is described in Figure 1. The diagram emphasizes that the value of FB is determined by the values of ML1 and ML2 (which in turn are determined by the value of the exogenous variable U).

[Figure 1: The causal network for M1 and M2. The network has edges from U to ML1 and to ML2, and from ML1 and ML2 to FB.]
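The two equations for FB can be written out as explicit lookup tables. The snippet below is our own encoding of F_FB from the text; since FB does not actually depend on u, the exogenous argument is suppressed.

```python
# F_FB for the disjunctive scenario (either match suffices) and the
# conjunctive scenario (both matches needed), keyed by (ML1, ML2).
F_FB_disjunctive = {(1, 1): 1, (0, 1): 1, (1, 0): 1, (0, 0): 0}
F_FB_conjunctive = {(1, 1): 1, (0, 1): 0, (1, 0): 0, (0, 0): 0}

# In context u11 both arsonists drop their matches, so ML1 = ML2 = 1 and
# the forest burns down in both scenarios.
print(F_FB_disjunctive[(1, 1)])   # 1
print(F_FB_conjunctive[(1, 1)])   # 1

# With only one match dropped, the scenarios differ.
print(F_FB_disjunctive[(1, 0)])   # 1
print(F_FB_conjunctive[(1, 0)])   # 0
```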

Given a causal model M = (S, F), a vector X⃗ of variables in V, and a vector x⃗ of values for them, let M_{X⃗←x⃗} = (S, F^{X⃗←x⃗}), where F^{X⃗←x⃗}_Y is obtained from F_Y by setting the values of the variables in X⃗ to x⃗. Intuitively, this is the causal model that results when the variables in X⃗ are set to x⃗ by some external action that affects only the variables in X⃗; we do not model the action or its causes explicitly. For example, if M1 is the model for the disjunctive scenario in Example 2.1, then (M1)_{ML1←0} is the model where FB = ML2: if the first match is not dropped, then there is a fire if and only if the second match is dropped. Similarly, (M2)_{ML1←0} is the model where FB = 0: if the first match is not dropped, then there is no fire in the conjunctive scenario. Note that if M is a recursive causal model, then there is always a unique solution to the equations in M_{X⃗←x⃗}, for all X⃗ and x⃗.
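An intervention M_{X⃗←x⃗} amounts to replacing the equation of each intervened variable with a constant before solving. The sketch below is our own encoding of the two interventions just described (the `solve` helper and variable ordering are illustrative choices, not the paper's notation).

```python
def solve(equations, u, intervention=None):
    """Evaluate a recursive model; `intervention` maps variables to forced values."""
    intervention = intervention or {}
    vals = {"u": u}
    for var in ["ML1", "ML2", "FB"]:           # a topological order of the network
        if var in intervention:
            vals[var] = intervention[var]      # overrule F_var by external action
        else:
            vals[var] = equations[var](vals)
    return vals

# Disjunctive model M1 and conjunctive model M2; context u = (i, j) makes
# arsonist 1 drop a match iff i = 1 and arsonist 2 iff j = 1.
M1 = {"ML1": lambda v: v["u"][0], "ML2": lambda v: v["u"][1],
      "FB":  lambda v: max(v["ML1"], v["ML2"])}
M2 = {"ML1": lambda v: v["u"][0], "ML2": lambda v: v["u"][1],
      "FB":  lambda v: min(v["ML1"], v["ML2"])}

# (M1)_{ML1<-0} in context u11: fire iff the second match is dropped.
print(solve(M1, (1, 1), {"ML1": 0})["FB"])   # 1
# (M2)_{ML1<-0} in context u11: no fire in the conjunctive scenario.
print(solve(M2, (1, 1), {"ML1": 0})["FB"])   # 0
```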

2.2 Syntax and Semantics

Given a signature S = (U, V, R), a formula of the form X = x, for X ∈ V and x ∈ R(X), is called a primitive event. A basic causal formula is one of the form [Y1←y1, ..., Yk←yk]ϕ, where

• ϕ is a Boolean combination of primitive events;
• Y1, ..., Yk are distinct variables in V; and
• yi ∈ R(Yi).

Such a formula is abbreviated as [Y⃗←y⃗]ϕ. The special case where k = 0 is abbreviated as ϕ. Intuitively, [Y1←y1, ..., Yk←yk]ϕ says that ϕ holds in the counterfactual world that would arise if Yi were set to yi, i = 1, ..., k. A causal formula is a Boolean combination of basic causal formulas.

A causal formula ϕ is true or false in a causal model, given a context. We write (M, u⃗) |= ϕ if ϕ is true in causal model M given context u⃗. (M, u⃗) |= [Y⃗←y⃗](X = x) if the variable X has value x in the unique (since we are dealing with recursive models) solution to the equations in M_{Y⃗←y⃗} in context u⃗ (that is, the unique vector of values for the endogenous variables that simultaneously satisfies all equations F^{Y⃗←y⃗}_Z, Z ∈ V − Y⃗, with the variables in U set to u⃗). We extend the definition to arbitrary causal formulas in the obvious way.

Thus, in Example 2.1, we have (M1, u11) |= [ML1←0](FB = 1) and (M2, u11) |= [ML1←0](FB = 0). In the disjunctive model, in the context where both arsonists drop a match, if the first arsonist does not drop a match, the forest still burns down. On the other hand, in the conjunctive model, if the first arsonist does not drop a match (in the same context), the forest does not burn down.

Note that the structural equations are deterministic. We later add probability to the picture by putting a probability on the set of contexts (i.e., on the possible worlds). This probability is not needed in the definition of causality, but will be useful in the discussion of explanation.

2.3 The Definition of Cause

With all this notation in hand, we can now give a definition of actual cause ("cause" for short). We want to make sense out of statements of the form "event A is an actual cause of event ϕ (in context u⃗)". The picture here is that the context (and the structural equations) are given. Intuitively, they encode the background knowledge. All the relevant events are known. The only question is picking out which of them are the causes of ϕ or, alternatively, testing whether a given set of events can be considered the cause of ϕ. The types of events that we allow as actual causes are ones of the form X1 = x1 ∧ ... ∧ Xk = xk, that is, conjunctions of primitive events; we typically abbreviate this as X⃗ = x⃗. The events that can be caused are arbitrary Boolean combinations of primitive events. We do not believe that we lose much by disallowing disjunctive causes here. Disjunctive explanations, however, are certainly of interest.

Roughly speaking, X⃗ = x⃗ is a sufficient cause of ϕ in situation (M, u⃗) if (a) X⃗ = x⃗ ∧ ϕ is true in (M, u⃗), and (b) there are some set x⃗′ of values of X⃗ and some other variables W⃗ (satisfying some constraints) such that setting W⃗ to w⃗′ and changing X⃗ to x⃗′ results in ¬ϕ being true (that is, (M, u⃗) |= [X⃗←x⃗′, W⃗←w⃗′]¬ϕ). Part (b) is very close to the standard counterfactual definition of causality, advocated by Lewis [1973] and others: X⃗ = x⃗ is the cause of ϕ if, had X⃗ not been equal to x⃗, ϕ would not have been the case. The difference is that we allow setting some other variables W⃗ to w⃗′. As we said, the formal definition puts some constraints on W⃗. Among other things, it must be the case that setting X⃗ to x⃗ is enough to force ϕ to be true when W⃗ is set to w⃗′. That is, it must be the case that (M, u⃗) |= [X⃗←x⃗, W⃗←w⃗′]ϕ. (See the appendix for the formal definition.)

X⃗ = x⃗ is an actual cause of ϕ in (M, u⃗) if it is a sufficient cause with no irrelevant conjuncts. That is, X⃗ = x⃗ is an actual cause of ϕ in (M, u⃗) if it is a sufficient cause and no subset of X⃗ is also a sufficient cause. Eiter and Lukasiewicz [2002] and, independently, Hopkins [2001] have shown that actual causes are always single conjuncts. As we shall see, this is not the case for explanations.

Returning to Example 2.1, note that ML1 = 1 is an actual cause of FB = 1 in both the conjunctive and the disjunctive scenarios. This should be clear in the conjunctive scenario: setting ML1 to 0 results in the forest not burning down. To see that ML1 = 1 is also a cause in the disjunctive scenario, let W⃗ be ML2. Note that (M1, u11) |= [ML1←0, ML2←0](FB = 0), so the counterfactual condition is satisfied. Moreover, (M1, u11) |= [ML1←1, ML2←0](FB = 1); that is, in the disjunctive scenario, if the first arsonist drops the match, that is enough to burn down the forest, no matter what the second arsonist does. In either scenario, both arsonists dropping a lit match constitutes a sufficient cause for the forest fire, as does the first arsonist dropping a lit match and sneezing. Given an actual cause, a sufficient cause can be obtained by adding arbitrary conjuncts to it.

Although each arsonist is a cause of the forest burning down in the conjunctive scenario, under reasonable assumptions about the knowledge of the agent wanting an explanation, each arsonist alone is not an explanation of the forest burning down. Both arsonists together provide the explanation in that case; identifying arsonist 1 would only trigger a further quest for the identity of her accomplice. In the disjunctive scenario, each arsonist alone is an explanation of the forest burning down.
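The rough definition above can be prototyped by brute force on the arsonist models. The sketch below is ours: it checks only clauses (a) and (b) as stated in the text and omits the additional constraints on W⃗ from the formal definition in the appendix, so it should be read as a first approximation that happens to agree with the formal notion on this simple example.

```python
from itertools import combinations, product

ENDOG = ["ML1", "ML2"]                       # FB is determined by these

def solve(fb, u, setting):
    """Unique solution in context u = (ml1, ml2), under intervention `setting`."""
    vals = {v: setting.get(v, u[i]) for i, v in enumerate(ENDOG)}
    vals["FB"] = fb(vals["ML1"], vals["ML2"])
    return vals

def assignments(vs):
    return [dict(zip(vs, combo)) for combo in product([0, 1], repeat=len(vs))]

def sufficient_cause(fb, u, cause, phi):
    actual = solve(fb, u, {})
    # (a) X = x and phi both hold in (M, u)
    if any(actual[v] != x for v, x in cause.items()) or not phi(actual):
        return False
    rest = [v for v in ENDOG if v not in cause]
    # (b) some witness W with values w' and alternative values x' for X make
    #     phi false, while [X<-x, W<-w'] keeps phi true
    for r in range(len(rest) + 1):
        for W in combinations(rest, r):
            for w in assignments(list(W)):
                for x_alt in assignments(list(cause)):
                    if not phi(solve(fb, u, {**x_alt, **w})) and \
                       phi(solve(fb, u, {**cause, **w})):
                        return True
    return False

def actual_cause(fb, u, cause, phi):
    """Sufficient cause none of whose proper nonempty subsets is sufficient."""
    if not sufficient_cause(fb, u, cause, phi):
        return False
    proper = [dict(c) for r in range(1, len(cause))
              for c in combinations(cause.items(), r)]
    return not any(sufficient_cause(fb, u, p, phi) for p in proper)

disjunctive = lambda m1, m2: max(m1, m2)     # either match suffices
conjunctive = lambda m1, m2: min(m1, m2)     # both matches needed
fire = lambda s: s["FB"] == 1
u11 = (1, 1)

print(actual_cause(disjunctive, u11, {"ML1": 1}, fire))                # True
print(actual_cause(conjunctive, u11, {"ML1": 1}, fire))                # True
print(sufficient_cause(conjunctive, u11, {"ML1": 1, "ML2": 1}, fire))  # True
print(actual_cause(conjunctive, u11, {"ML1": 1, "ML2": 1}, fire))      # False
```

The last two lines illustrate the Eiter-Lukasiewicz/Hopkins observation: both arsonists together are a sufficient cause, but each single conjunct is already sufficient, so the conjunction is not an actual cause.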

We hope that this informal discussion of the definition of causality will suffice for readers who want to focus mainly on explanation. For completeness, we give the formal definitions of sufficient and actual cause in the appendix. The definition is motivated, discussed, and defended in much more detail in Part I, where it is also compared with other definitions of causality. In particular, it is shown to avoid a number of problems that have been identified with Lewis's account (e.g., see [Pearl 2000, Chapter 10]), such as commitment to transitivity of causes. For the purposes of this paper, we ask that the reader accept our definition of causality. We note that, to some extent, our definition of explanation is modular in its use of causality, in that another definition of causality could be substituted for the one we use in the definition of explanation (provided it was given in the same framework).

3 Explanation: The Basic Definition

As we said in the introduction, many definitions of causal explanation have been given in the literature. The "classical" approaches in the philosophy literature, such as Hempel's [1965] deductive-nomological model and Salmon's [1989] statistical relevance model (as well as many other approaches), because they are based on logical implication and probabilistic dependence, respectively, fail to exhibit the directionality inherent in common explanations. Despite all the examples in the philosophy literature on the need for taking causality and counterfactuals into account, and the extensive work on causality defined in terms of counterfactuals in the philosophy literature, as Woodward [2001] observes, philosophers have been reluctant to build a theory of explanation on top of a theory of causality. The concern seems to be one of circularity. In this section, we give a definition of explanation based on the definition of causality discussed in Section 2.3. Circularity is avoided because the definition of explanation invokes strictly formal features of a causal model that do not in turn depend on the notion of explanation.


Our definition of causality assumed that the causal model and all the relevant facts were given; the problem was to figure out which of the given facts were causes. In contrast, the role of explanation is to provide the information needed to establish causation. Roughly speaking, we view an explanation as a fact that is not known for certain but, if found to be true, would constitute a genuine cause of the explanandum (the fact to be explained), regardless of the agent's initial uncertainty. Thus, as we said in the introduction, what counts as an explanation depends on what one already knows (or believes; we largely blur the distinction between knowledge and belief in this paper). As a consequence, the definition of an explanation should be relative to the agent's epistemic state (as in Gärdenfors [1988]). It is also natural, from this viewpoint, that an explanation will include fragments of the causal model M or reference to the physical laws underlying the connection between the cause and the effect.

The definition of explanation is motivated by the following intuitions. An individual in a given epistemic state K asks why ϕ holds. What constitutes a good answer to his question? A good answer must (a) provide information that goes beyond K and (b) be such that the individual can see that it would, if true, be (or be very likely to be) a cause of ϕ. We may also want to require that (c) ϕ be true (or at least probable). Although our basic definition does not insist on (c), it is easy to add this requirement.

How do we capture an agent's epistemic state in our framework? For ease of exposition, we first consider the case where the causal model is known and only the context is uncertain. (The minor modifications required to deal with the general case are described in Section 5.) In that case, one way of describing an agent's epistemic state is by simply describing the set of contexts the agent considers possible.

Definition 3.1 (Explanation): Given a structural model M, X⃗ = x⃗ is an explanation of ϕ relative to a set K of contexts if the following conditions hold:

EX1. (M, u⃗) |= ϕ for each context u⃗ ∈ K. (That is, ϕ must hold in all contexts the agent considers possible; the agent considers what she is trying to explain as an established fact.)

EX2. X⃗ = x⃗ is a sufficient cause of ϕ in (M, u⃗) for each u⃗ ∈ K such that (M, u⃗) |= X⃗ = x⃗.

EX3. X⃗ is minimal; no subset of X⃗ satisfies EX2.

EX4. (M, u⃗) |= ¬(X⃗ = x⃗) for some u⃗ ∈ K and (M, u⃗′) |= X⃗ = x⃗ for some u⃗′ ∈ K. (This just says that the agent considers a context possible where the explanation is false, so the explanation is not known to start with, and considers a context possible where the explanation is true, so that it is not vacuous.)
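Conditions EX1-EX4 can be checked mechanically once a sufficient-cause test is fixed. The sketch below is our own: it uses the rough clause-(a)/(b) sufficient-cause test from Section 2.3 (not the full formal definition) and the disjunctive arsonist scenario, with K containing the three contexts in which one or both arsonists acted.

```python
from itertools import combinations, product

ENDOG = ["ML1", "ML2"]

def solve(u, setting):
    vals = {v: setting.get(v, u[i]) for i, v in enumerate(ENDOG)}
    vals["FB"] = max(vals["ML1"], vals["ML2"])     # disjunctive scenario
    return vals

def assignments(vs):
    return [dict(zip(vs, combo)) for combo in product([0, 1], repeat=len(vs))]

def sufficient(u, cause, phi):
    """Rough clauses (a)/(b) only; the appendix adds further constraints on W."""
    actual = solve(u, {})
    if any(actual[v] != x for v, x in cause.items()) or not phi(actual):
        return False
    rest = [v for v in ENDOG if v not in cause]
    return any(not phi(solve(u, {**xa, **w})) and phi(solve(u, {**cause, **w}))
               for r in range(len(rest) + 1)
               for W in combinations(rest, r)
               for w in assignments(list(W))
               for xa in assignments(list(cause)))

def satisfies(u, cause):
    return all(solve(u, {})[v] == x for v, x in cause.items())

def is_explanation(K, cause, phi):
    ex1 = all(phi(solve(u, {})) for u in K)                 # phi established
    holds = [u for u in K if satisfies(u, cause)]
    ex2 = all(sufficient(u, cause, phi) for u in holds)     # cause where it holds
    proper = [dict(c) for r in range(1, len(cause))
              for c in combinations(cause.items(), r)]
    ex3 = not any(all(sufficient(u, p, phi)                 # minimality
                      for u in K if satisfies(u, p)) for p in proper)
    ex4 = 0 < len(holds) < len(K)                           # not known, not vacuous
    return ex1 and ex2 and ex3 and ex4

K = [(1, 0), (0, 1), (1, 1)]            # one, the other, or both arsonists
fire = lambda s: s["FB"] == 1
print(is_explanation(K, {"ML1": 1}, fire))             # True
print(is_explanation(K, {"ML2": 1}, fire))             # True
print(is_explanation(K, {"ML1": 1, "ML2": 1}, fire))   # False: EX3 fails
```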

… why B happened." Clearly, A is not an explanation of why B happened relative to the epistemic state after A has been discovered, since at that point A is known. However, A can legitimately be considered an explanation of B relative to the epistemic state before A was discovered. Interestingly, as we shall see, although there is a cause for every event ϕ (although sometimes it may be the trivial cause ϕ), one of the effects of the requirement EX4 is that some events may have no explanation. This seems to us quite consistent with standard usage of the notion of explanation. After all, we do speak of "inexplicable events".

What does the definition of explanation tell us for the arsonist example? What counts as a cause is, as expected, very much dependent on the causal model and the agent's epistemic state. If the causal model has only arsonists as the cause of the fire, there are two possible explanations in the disjunctive scenario: (a) arsonist 1 did it and (b) arsonist 2 did it (assuming K consists of three contexts, where either 1, 2, or both set the fire). In the conjunctive scenario, no explanation is necessary, since the agent knows that both arsonists must have lit a match if arsonists are the only possible cause of the fire (assuming that the agent considers the two arsonists to be the only possible arsonists). That is, if the two arsonists are the only possible cause of the fire and the fire is observed, then K can consist of only one context, namely, the one where both arsonists started the fire. No explanation can satisfy EX4 in this case.

Perhaps more interesting is to consider a causal model with other possible causes, such as lightning and unattended campfires. Since the agent knows that there was a fire, in each of the contexts in K, at least one of the potential causes must have actually occurred. If there is a context in K where only arsonist 1 dropped a lit match (and, say, there was lightning), another where only arsonist 2 dropped a lit match, and a third where both arsonists dropped matches, then, in the conjunctive scenario, ML1 = 1 ∧ ML2 = 1 is an explanation of FB = 1, but neither ML1 = 1 nor ML2 = 1 by itself is an explanation (since neither by itself is a cause in all contexts in K that satisfy the formula). On the other hand, in the disjunctive scenario, both ML1 = 1 and ML2 = 1 are explanations.

Consider the following example, due to Bennett (see [Sosa and Tooley 1993, pp. 222–223]), which is analyzed in Part I.

Example 3.2: Suppose that there was a heavy rain in April and electrical storms in the following two months; and in June the lightning took hold. If it hadn't been for the heavy rain in April, the forest would have caught fire in May. The first question is whether the April rains caused the forest fire. According to a naive counterfactual analysis, they do, since if it hadn't rained, there wouldn't have been a forest fire in June. In our framework, it is not the case that the April rains caused the fire, but they were a cause of there being a fire in June, as opposed to May. This seems to us intuitively right.

The situation can be captured using a model with three endogenous random variables:

• AS for "April showers", with two values: 0 standing for did not rain heavily in April and 1 standing for rained heavily in April;

• ES for "electric storms", with four possible values: (0,0) (no electric storms in either May or June), (1,0) (electric storms in May but not June), (0,1) (storms in June but not May), and (1,1) (storms in both May and June);

• and F for "fire", with three possible values: 0 (no fire at all), 1 (fire in May), or 2 (fire in June).

We do not describe the context explicitly. Assume its value u⃗ is such that it ensures that there is a shower in April, there are electric storms in both May and June, there is sufficient oxygen, there are no other potential causes of fire (like dropped matches), no other inhibitors of fire (alert campers setting up a bucket brigade), and so on. That is, we choose u⃗ so as to allow us to focus on the issue at hand and to ensure that the right things happened (there was both fire and rain). We will not bother writing out the details of the structural equations; they should be obvious, given the story (at least, for the context u⃗), and this is also the case for all the other examples in this section. The causal network is simple: there are edges from AS to F and from ES to F. As observed in Part I, each of the following holds.

• AS = 1 is a cause of the June fire (F = 2).

• AS = 1 is not a cause of the fire (F = 1 ∨ F = 2). If ES is set to (0,1), (1,0), or (1,1), then there will be a fire (in either May or June) whether AS = 0 or AS = 1. On the other hand, if ES is set to (0,0), then there is no fire, whether AS = 0 or AS = 1.

• ES = (1,1) is a cause of both F = 2 and (F = 1 ∨ F = 2). Having electric storms in both May and June caused there to be a fire.

• AS = 1 ∧ ES = (1,1) is a sufficient cause of F = 2; each individual conjunct is an actual cause.

Now consider the problem of explanation. Suppose that the agent knows that there was an electric storm, but does not know when, and does not know whether there were April showers. Thus, K consists of six contexts, one corresponding to each combination of the values (1,0), (0,1), and (1,1) of ES and the values 0 and 1 of AS. Then it is easy to see that AS = 1 is not an explanation of fire (F = 1 ∨ F = 2), since it is not a cause of fire in any context in K. Similarly, AS = 0 is not an explanation of fire. On the other hand, each of ES = (1,1), ES = (1,0), and ES = (0,1) is an explanation of fire.

Now suppose that we are looking for an explanation of the June fire. Then the set K can consist only of contexts compatible with there being a fire in June. Suppose that K consists of three contexts, one where AS = 1 and ES = (0,1), one where AS = 1 and ES = (1,1), and one where AS = 0 and ES = (0,1). In this case, each of AS = 1, ES = (0,1), and ES = (1,1) is an explanation of the June fire. (In the case of AS = 1, we need to consider the setting where ES = (1,1).)


Finally, if the agent knows that there was an electric storm in May and June and heavy rain in April (so that K consists of only one context), then there is no explanation of either fire or the fire in June. Formally, this is because it is impossible to satisfy EX4. Informally, this is because the agent already knows why there was a fire in June.

… the actual world as part of the model, we could also require that u⃗ ∈ K. This condition would entail that K represents the agent's knowledge rather than the agent's beliefs: the actual context is one of the ones the agent considers possible.

4 Partial Explanations and Explanatory Power

Not all explanations are considered equally good. Some explanations are more likely than others. One way to define the "goodness" of an explanation is by bringing probability into the picture. Suppose that the agent has a probability on the set K of possible contexts. In this case, we can consider the probability of the set of contexts where the explanation X⃗ = x⃗ is true. For example, if the agent has reason to believe that electric storms are quite common in both May and June, then the set of contexts where ES = (1,1) holds would have greater probability than the set where either ES = (1,0) or ES = (0,1) holds. Thus, ES = (1,1) would be considered a better explanation.

Formally, suppose that there is a probability Pr on the set K of possible contexts. Then the probability of explanation X⃗ = x⃗ is just Pr(X⃗ = x⃗). While the probability of an explanation clearly captures some important aspects of how good the explanation is, it is only part of the story. The other part concerns the degree to which an explanation fulfills its role (relative to ϕ) in the various contexts considered. This becomes clearer when we consider partial explanations. The following example, taken from [Gärdenfors 1988], is one where partial explanations play a role.

Example 4.1: Suppose I see that Victoria is tanned and I seek an explanation. Suppose that the causal model includes variables for "Victoria took a vacation in the Canary Islands", "sunny in the Canary Islands", and "went to a tanning salon". The set K includes contexts for all settings of these variables compatible with Victoria being tanned. Note that, in particular, there is a context where Victoria went both to the Canaries (and didn't get tanned there, since it wasn't sunny) and to a tanning salon. Gärdenfors points out that we normally accept "Victoria took a vacation in the Canary Islands" as a satisfactory explanation of Victoria being tanned and, indeed, according to his definition, it is an explanation. Victoria taking a vacation is not an explanation (relative to the set K of contexts) in our framework, since there is a context u⃗* ∈ K where Victoria went to the Canary Islands but it was not sunny, and in u⃗* the actual cause of her tan is the tanning salon, not the vacation. Thus, EX2 is not satisfied. However, intuitively, it is "almost" satisfied, since it is satisfied by every context in K in which Victoria goes to the Canaries other than u⃗*. The only complete explanation according to our definition is "Victoria went to the Canary Islands and it was sunny." "Victoria went to the Canary Islands" is a partial explanation, in a sense to be defined below.

… explanation. Roughly speaking, the complete explanation may involve exogenous factors, which are not permitted in explanations. Assume, for example, that going to a tanning salon was not an endogenous variable in the model; instead, the model simply had an exogenous variable Us that could make Victoria suntanned even in the absence of sun in the Canary Islands. Likewise, assume that the weather in the Canary Islands was also part of the background context. In this case, Victoria's vacation would still be a partial explanation of her suntan, since the context where it fails to be a cause (no sun in the Canary Islands) is fairly unlikely, but we cannot add conjuncts to this event to totally exclude that context from the agent's realm of possibilities. Indeed, in this model there is no (complete) explanation for Victoria's tan; it is inexplicable! Inexplicable events are not so uncommon, as the following example shows.

Example 4.2: Suppose that the sound on a television works but there is no picture. Furthermore, the only cause of there being no picture that the agent is aware of is the picture tube being faulty. However, the agent is also aware that there are times when there is no picture even though the picture tube works perfectly well; intuitively, there is no picture "for inexplicable reasons". This is captured by the causal network described in Figure 2, where T describes whether or not the picture tube is working (1 if it is and 0 if it is not) and P describes whether or not there is a picture (1 if there is and 0 if there is not). The exogenous variable U0 determines the status of the picture tube: T = U0. The

[Figure 2: The television with no picture. The network has edges from U0 to T, from T to P, and from U1 to P.]

exogenous variable U1 is meant to represent the mysterious "other possible causes". If U1 = 0, then whether or not there is a picture depends solely on the status of the picture tube; that is, P = T. On the other hand, if U1 = 1, then there is no picture (P = 0) no matter what the status of the picture tube. Thus, in contexts where U1 = 1, T = 0 is not a cause of P = 0. Now suppose that K includes a context u⃗01 where U0 = 0 and U1 = 1, and a context u⃗11 where U0 = 1 and U1 = 1. The only cause of P = 0 in both u⃗01 and u⃗11 is P = 0 itself. (T = 0 is not a cause of P = 0 in u⃗01, since P = 0 even if T is set to 1.) As a result, there is no explanation of P = 0 relative to an epistemic state K that includes u⃗01 and u⃗11. (EX4 excludes the vacuous explanation P = 0.) On the other hand, T = 0 is a cause of P = 0 in all contexts in K satisfying T = 0 other than u⃗01. If the probability of u⃗01 is small (capturing the intuition that it is unlikely that more than one thing goes wrong with a television at once), then we are entitled to view T = 0 as a quite good partial explanation of P = 0 with respect to K.


Note that if we modify the causal model here by adding an endogenous variable, say I, corresponding to the "inexplicable" cause U1 (with equation I = U1), then I = 1 is a cause of P = 0 in the context u⃗11 (and both I = 1 and T = 0 are causes of P = 0 in u⃗01). In this model, I = 1 is an explanation of P = 0.

In Example 4.1, if the agent believes that it is sunny in the Canary Islands with probability .9 (that is, the probability that it was sunny given that Victoria is suntanned and that she went to the Canaries is .9), then Victoria going to the Canaries is a partial explanation of her being tanned with goodness .9. The relevant set K′ consists of those contexts where it is sunny in the Canaries. Similarly, in Example 4.2, if the agent believes that the probability of both the picture tube being faulty and the other mysterious causes being operative is .1, then T = 0 is a partial explanation of P = 0 with goodness .9 (with K′ consisting of all the contexts in K other than that doubly unlucky one).
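The goodness-0.9 computation for the television example can be carried out directly. The sketch below is ours: the particular probabilities over the contexts (U0, U1) are illustrative assumptions, chosen so that the doubly unlucky context (tube faulty and the mysterious cause operative) has probability 0.1 given T = 0, and the cause test is a simple but-for check.

```python
def P(u0, u1):
    """P = T unless U1 = 1 (mysterious cause operative); T = U0."""
    return 0 if u1 == 1 else u0

# Illustrative probabilities for the contexts in K (all satisfy P = 0).
Pr = {(0, 0): 0.72, (0, 1): 0.08, (1, 1): 0.20}

def t0_is_cause(u0, u1):
    """But-for test: T = 0 holds, and setting T <- 1 restores the picture."""
    return u0 == 0 and P(1, u1) != 0

t0 = [u for u in Pr if u[0] == 0]                       # contexts where T = 0
goodness = sum(Pr[u] for u in t0 if t0_is_cause(*u)) / sum(Pr[u] for u in t0)
print(round(goodness, 3))   # 0.9
```

With these numbers, T = 0 is a cause in every T = 0 context except the one where the mysterious cause is also operative, giving goodness 0.72 / 0.80 = 0.9.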

A full explanation is clearly a partial explanation with goodness 1, but we are often satisfied with partial explanations X⃗ = x⃗ that are not as good, especially if they have high probability (i.e., if Pr(X⃗ = x⃗) is high). Note that, in general, there is a tension between the goodness of an explanation and its probability.

These ideas also lead to a definition of explanatory power. Consider Example 2.1 yet again, and suppose that there is an endogenous random variable O corresponding to the presence of oxygen. Now if O = 1 holds in all the contexts that the agent considers possible, then O = 1 is excluded as an explanation by EX4. If the agent knows that there is oxygen, then the presence of oxygen cannot be part of an explanation. But suppose that O = 0 holds in one context that the agent considers possible, albeit a very unlikely one (for example, there may be another combustible gas). In that case, O = 1 becomes a very good partial explanation of the fire. Nevertheless, it is an explanation with, intuitively, very little explanatory power. How can we make this precise?

Suppose that there is a probability distribution Pr⁻ on a set K⁻ of contexts that includes K. Pr⁻ intuitively represents the agent's "pre-observation" probability, that is, the agent's prior probability before the explanandum ϕ is observed or discovered. Thus, Pr is the result of conditioning Pr⁻ on ϕ, and K consists of the contexts in K⁻ that satisfy ϕ. Gärdenfors identifies the explanatory power of the (partial) explanation X⃗ = x⃗ of ϕ with Pr⁻(ϕ | X⃗ = x⃗) (see [Chajewska and Halpern 1997; Gärdenfors 1988]). If this probability is higher than Pr⁻(ϕ), then the explanation makes ϕ more likely. Note that since K consists of all the contexts in K⁻ where ϕ is true, Gärdenfors' notion of explanatory power is equivalent to Pr⁻(K | X⃗ = x⃗).

Gärdenfors' definition clearly captures some important features of our intuition. For example, under reasonable assumptions about Pr⁻, O = 1 has much lower explanatory power than, say, ML1 = 1. Learning that there is oxygen in the air certainly has almost no effect on an agent's prior probability that the forest burns down, while learning that an arsonist dropped a match almost certainly increases it. However, Gärdenfors' definition still has a problem. It basically confounds correlation with causation. For example, according to this definition, the barometer falling is an explanation of it raining with high explanatory power.

We would argue that a better measure of the explanatory power of X⃗ = x⃗ is Pr⁻(K_{X⃗=x⃗, ϕ} | X⃗ = x⃗). Note that the two definitions agree in the case that X⃗ = x⃗ is a full explanation (since then K_{X⃗=x⃗, ϕ} is just K, the set of contexts in K⁻ where ϕ is true). In particular, they agree that O = 1 has very low explanatory power, while ML1 = 1 has high explanatory power. The difference between the two definitions arises if there are contexts where ϕ and X⃗ = x⃗ both happen to be true, but X⃗ = x⃗ is not a cause of ϕ. In Example 4.1, the context u⃗* is one such context, since in u⃗*, Victoria went to the Canary Islands, but this was not an explanation of her getting tanned, since it was not sunny. Because of this difference, for us, the falling barometer has 0 explanatory power as far as explaining the rain. Even though the barometer falls in almost all contexts where it rains (assume that there are contexts where it rains and the barometer does not fall, perhaps because it is defective, so that the barometer falling at least satisfies EX4), the barometer falling is not a cause of the rain in any context. Making the barometer rise would not result in the rain stopping!
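To make the contrast concrete, here is a small numeric sketch. All contexts, weights, and variable names below are our own hypothetical choices, not taken from the text; it compares Gärdenfors' measure Pr⁻(ϕ | X⃗ = x⃗) with the causal measure Pr⁻(K_{X⃗=x⃗, ϕ} | X⃗ = x⃗), where X⃗ = x⃗ is "the barometer falls" and ϕ is "it rains":

```python
# Each context: (prior weight, it rains?, the barometer falls?, and whether
# "barometer falls" is an actual cause of "it rains" in that context --
# never, on any causal reading).
contexts = [
    (0.30, True,  True,  False),  # rain and falling barometer (correlated)
    (0.02, True,  False, False),  # rain, but the barometer is defective
    (0.08, False, True,  False),  # barometer falls without rain
    (0.60, False, False, False),  # neither
]

pr_x = sum(p for p, rain, bar, cause in contexts if bar)

# Gardenfors' explanatory power: Pr-(rain | barometer falls) -- high.
gardenfors = sum(p for p, rain, bar, cause in contexts if bar and rain) / pr_x

# The causal variant: Pr-(contexts where the falling barometer is a cause
# of the rain | barometer falls) -- zero, since it is a cause in no context.
causal = sum(p for p, rain, bar, cause in contexts if bar and cause) / pr_x

print(gardenfors > 0.7, causal == 0.0)  # True True
```

The first measure is high because rain and the falling barometer are correlated; the second is 0 because no context makes the falling barometer a cause of the rain.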

Again, (partial) explanations with higher explanatory power typically are more refined and, hence, less likely than explanations with less explanatory power. There is no obvious way to resolve this tension. (See [Chajewska and Halpern 1997] for more discussion of this issue.)

As this discussion suggests, our definition shares some features with that of Gärdenfors [1988]. Like him, we consider explanation relative to an agent's epistemic state. Gärdenfors also considers a "contracted" epistemic state characterized by the distribution Pr⁻. Intuitively, Pr⁻ describes the agent's beliefs before discovering ϕ. (More accurately, it describes an epistemic state as close as possible to Pr where the agent does not ascribe probability 1 to ϕ.) If the agent's current epistemic state came about as a result of observing ϕ, then we can take Pr to be the result of conditioning Pr⁻ on ϕ. However, Gärdenfors does not necessarily assume such a connection between Pr and Pr⁻. In any case, for Gärdenfors, X⃗ = x⃗ is an explanation of ϕ relative to Pr if (1) Pr(ϕ) = 1, (2) 0 < Pr(X⃗ = x⃗) < 1, and (3) Pr⁻(ϕ | X⃗ = x⃗) > Pr⁻(ϕ). (1) is the probabilistic analogue of EX1. Clearly, (2) is the probabilistic analogue of EX4. Finally, (3) says that learning the explanation increases the likelihood of ϕ. Gärdenfors focuses on the explanatory power of an explanation, but does not take into account its prior probability. As pointed out by Chajewska and Halpern [1997], Gärdenfors' definition suffers from another defect: if there is an explanation of Y⃗ = y⃗ at all, then for all events X⃗ = x⃗ such that 0 < Pr(X⃗ = x⃗ ∧ Y⃗ = y⃗) < 1, the conjunction X⃗ = x⃗ ∧ Y⃗ = y⃗ is also an explanation of Y⃗ = y⃗, and it has the highest possible explanatory power. (Note that, in our definition, EX3 blocks X⃗ = x⃗ ∧ Y⃗ = y⃗ from being a cause of Y⃗ = y⃗.)

In contrast to Gärdenfors' definition, the dominant approach to explanation in the AI literature, the maximum a posteriori (MAP) approach (see, for example, [Henrion and Druzdzel 1990; Pearl 1988; Shimony 1991]), focuses on the probability of the explanation, that is, what we have denoted Pr(X⃗ = x⃗).⁶ The MAP approach is based on the intuition that the best explanation for an observation is the state of the world (in our setting, the context) that is most probable given the evidence. There are various problems with this approach (see [Chajewska and Halpern 1997] for a critique). Most of them can be dealt with, except for the main one: it simply ignores the issue of explanatory power. An explanation like O = 1 has a very high probability, even though it is intuitively irrelevant to the forest burning down. To remedy this problem, more intricate combinations of the quantities Pr(X⃗ = x⃗), Pr⁻(ϕ | X⃗ = x⃗), and Pr⁻(ϕ) have been suggested to quantify the causal relevance of X⃗ = x⃗ on ϕ but, as argued by Pearl [2000, p. 221], without taking causality into account, no such combination of parameters can work.
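The defect noted by Chajewska and Halpern can be seen in a few lines of code. The sketch below uses a made-up prior over two binary variables; it shows that conjoining an arbitrary event X = 1 with the explanandum Y = 1 satisfies Gärdenfors' condition (2) and vacuously attains explanatory power 1:

```python
# Hypothetical prior Pr- over worlds (x, y) for two binary variables X and Y;
# the numbers are made up for illustration.
prior = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.3}

def pr(event):
    """Prior probability of the set of worlds satisfying `event`."""
    return sum(p for world, p in prior.items() if event(world))

phi = lambda w: w[1] == 1                  # explanandum: Y = 1
conj = lambda w: w[0] == 1 and w[1] == 1   # candidate "explanation": X = 1 and Y = 1

# Gardenfors' condition (2): the candidate is neither certain nor impossible.
print(0 < pr(conj) < 1)                    # True
# Its explanatory power Pr-(phi | X=1 and Y=1) is trivially the maximum,
# so condition (3) holds whenever Pr-(phi) < 1.
power = pr(lambda w: conj(w) and phi(w)) / pr(conj)
print(power)                               # 1.0
```

Nothing about this calculation depends on X being causally relevant to Y, which is exactly the problem.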

5 The General Definition

In general, an agent may be uncertain about the causal model, so an explanation will have to include information about it. (Gärdenfors [1988] and Hempel [1965] make similar observations, although they focus not on causal information, but on statistical and nomological information; we return to this point below.) It is relatively straightforward to extend our definition of explanation to accommodate this provision. Now an epistemic state K consists not only of contexts, but of pairs (M, u⃗) consisting of a causal model M and a context u⃗. (Recall that such a pair is a situation.) Intuitively, now an explanation should consist of some causal information (such as "prayers do not cause fires") and the facts that are true. Thus, a (general) explanation has the form (ψ, X⃗ = x⃗), where ψ is an arbitrary formula in our causal language and, as before, X⃗ = x⃗ is a conjunction of primitive events. We think of the ψ component as consisting of some causal information (such as "prayers do not cause fires", which corresponds to a conjunction of statements of the form (F = i) ⇒ [P ← x](F = i), where P is a random variable describing whether or not prayer takes place). The first component in a general explanation is viewed as restricting the set of causal models. To make this precise, given a causal model M, we say ψ is valid in M, and write M |= ψ, if (M, u⃗) |= ψ for all contexts u⃗ consistent with M. With this background, it is easy to state the general definition.

Definition 5.1: (ψ, X⃗ = x⃗) is an explanation of ϕ relative to a set K of situations if the following conditions hold:

EX1. (M, u⃗) |= ϕ for each situation (M, u⃗) ∈ K.

EX2. X⃗ = x⃗ is a sufficient cause of ϕ in (M, u⃗) for all (M, u⃗) ∈ K such that (M, u⃗) |= X⃗ = x⃗ and M |= ψ.

EX3. (ψ, X⃗ = x⃗) is minimal; there is no pair (ψ′, X⃗′ = x⃗′) ≠ (ψ, X⃗ = x⃗) satisfying EX2 such that {M′ ∈ M(K) : M′ |= ψ′} ⊇ {M′ ∈ M(K) : M′ |= ψ}, where M(K) = {M : (M, u⃗) ∈ K for some u⃗}, X⃗′ ⊆ X⃗, and x⃗′ is the restriction of x⃗ to the variables in X⃗′. Roughly speaking, this says that no subset of X⃗ provides a sufficient cause of ϕ in more contexts than those where ψ is valid.

EX4. (M, u⃗) |= ¬(X⃗ = x⃗) for some (M, u⃗) ∈ K and (M′, u⃗′) |= X⃗ = x⃗ for some (M′, u⃗′) ∈ K.

Example 5.2: Using this general definition of explanation, let us consider Scriven's [1959] famous paresis example, which has caused problems for many other formalisms. Paresis develops only in patients who have been syphilitic for a long time, but only a small number of patients who are syphilitic in fact develop paresis. Furthermore, according to Scriven, no other factor is known to be relevant in the development of paresis.⁷ This description is captured by a simple causal model M_P. There are two endogenous variables, S (for syphilis) and P (for paresis), and two exogenous variables, U1, the background factors that determine S, and U2, which intuitively represents "disposition to paresis", that is, the factors that determine, in conjunction with syphilis, whether paresis actually develops. An agent who knows this causal model and that a patient has paresis does not need an explanation of why: he knows without being told that the patient must have syphilis and that U2 = 1. On the other hand, for an agent who does not know the causal model (i.e., considers a number of causal models of paresis possible), (ψ_P, S = 1) is an explanation of paresis, where ψ_P is a formula that characterizes M_P.
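The paresis model M_P is simple enough to write out as code. The sketch below is our own encoding, not code from the paper; the equations S = U1 and P = S ∧ U2 are the natural reading of the description above:

```python
from itertools import product

# Structural equations of the paresis model M_P:
# endogenous S (syphilis) and P (paresis), exogenous U1 and U2.
def evaluate(u1, u2):
    s = u1           # S = U1: background factors determine syphilis
    p = s and u2     # P = S & U2: paresis needs syphilis and the disposition
    return s, p

# In every context where the explanandum P = 1 holds, S = 1 and U2 = 1 are
# forced -- which is why an agent who knows M_P needs no explanation of why
# a patient has paresis.
for u1, u2 in product([0, 1], repeat=2):
    s, p = evaluate(u1, u2)
    if p == 1:
        print((u1, u2), "->", (s, p))  # only the context (1, 1) is printed
```

An agent uncertain about the causal model, by contrast, considers other equation sets possible, and for such an agent the pair (ψ_P, S = 1) carries genuine information.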

⁷ Apparently there are now other known factors, but this does not change the import of the example.

⁸ We remark that Gärdenfors also considers two types of probability measures, one on his analogue of situations, and one on the worlds in a model. Like here, the probability measure on worlds in a model allows explanations with statistical information, while the probability on situations allows him to define his notion of explanatory power.

6 Discussion

We have given a formal definition of explanation in terms of causality. As we mentioned earlier, there are not too many formal definitions of explanation in terms of causality in the literature. One of the few exceptions is given by Lewis [1986], who defends the thesis that "to explain an event is to provide some information about its causal history". While this view is compatible with our definition, there is no formal definition given to allow for a careful comparison between the approaches. In any case, if we were to define causal history in terms of Lewis's [1973] definition of causality, we would inherit all the problems of that definition. Our definition avoids these problems.

We have mentioned one significant problem of the definition already: dealing with disjunctive explanations. Disjunctions cause problems in the definition of causality, which is why we do not deal with them in the context of explanation. As we pointed out earlier, it may be possible to modify the definition of causality so as to be able to deal with disjunctions without changing the structure of our definition of explanation. In addition, our definition gives no tools for dealing with the inherent tension between explanatory power, goodness of partial explanations, and the probability of the explanation. Clearly this is an area that requires further work.

A Appendix: The Formal Definition of Causality

To keep this part of the paper self-contained, we reproduce here the formal definition of actual causality from Part I.

Definition A.1: (Actual cause) X⃗ = x⃗ is an actual cause of ϕ in (M, u⃗) if the following three conditions hold:

AC1. (M, u⃗) |= (X⃗ = x⃗) ∧ ϕ. (That is, both X⃗ = x⃗ and ϕ are true in the actual world.)

AC2. There exists a partition (Z⃗, W⃗) of V with X⃗ ⊆ Z⃗ and some setting (x⃗′, w⃗′) of the variables in (X⃗, W⃗) such that if (M, u⃗) |= (Z = z*) for all Z ∈ Z⃗, then

(a) (M, u⃗) |= [X⃗ ← x⃗′, W⃗ ← w⃗′]¬ϕ. In words, changing (X⃗, W⃗) from (x⃗, w⃗) to (x⃗′, w⃗′) changes ϕ from true to false;

(b) (M, u⃗) |= [X⃗ ← x⃗, W⃗′ ← w⃗′, Z⃗′ ← z⃗*]ϕ for all subsets W⃗′ of W⃗ and all subsets Z⃗′ of Z⃗. In words, setting any subset of the variables in W⃗ to their values in w⃗′ should have no effect on ϕ as long as X⃗ is kept at its current value x⃗, even if all the variables in an arbitrary subset of Z⃗ are set to their original values in the context u⃗.

AC3. X⃗ is minimal; no subset of X⃗ satisfies conditions AC1 and AC2. Minimality ensures that only those elements of the conjunction X⃗ = x⃗ that are essential for changing ϕ in AC2(a) are considered part of a cause; inessential elements are pruned.

X⃗ = x⃗ is a sufficient cause of ϕ in (M, u⃗) if AC1 and AC2 hold, but not necessarily AC3.

We remark that in Part I, a slight generalization of this definition is considered. Rather than allowing all settings of X⃗ and W⃗ in AC2, the more general definition presupposes a set of allowable settings. All the settings used in AC2 must come from this allowable set. It is shown that by generalizing in this way, it is possible to avoid a number of problematic examples.
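For small models with binary variables, Definition A.1 can be checked by brute force. The sketch below is our own illustration, not code from the paper: it tests AC1 and AC2 (i.e., sufficient causehood) by enumerating partitions (Z⃗, W⃗) and settings, on a toy forest-fire model with structural equation FF = ML ∧ O (match lit and oxygen present):

```python
from itertools import product, chain, combinations

def solve(equations, context, interventions):
    """Compute all variable values in an acyclic model: start from the
    exogenous context, then repeatedly apply the structural equations;
    intervened variables are pinned to their intervention value."""
    values = dict(context)
    for _ in equations:  # enough passes for any acyclic dependency order
        for var, eq in equations.items():
            values[var] = interventions[var] if var in interventions else eq(values)
    return values

def subsets(xs):
    return chain.from_iterable(combinations(xs, r) for r in range(len(xs) + 1))

def is_sufficient_cause(equations, context, X, x, phi):
    """Check AC1 and AC2 of Definition A.1 by exhaustive enumeration."""
    actual = solve(equations, context, {})
    if any(actual[v] != x[v] for v in X) or not phi(actual):
        return False  # AC1 fails
    others = [v for v in equations if v not in X]
    for W in subsets(others):  # X must lie in Z, so partition the rest
        Z = [v for v in others if v not in W]
        for vals in product([0, 1], repeat=len(X) + len(W)):
            setting = dict(zip(list(X) + list(W), vals))  # candidate (x', w')
            if phi(solve(equations, context, setting)):
                continue  # AC2(a) fails for this setting
            # AC2(b): with X back at x, any W' <- w' and Z' <- z* keeps phi
            if all(phi(solve(equations, context,
                             {**{v: x[v] for v in X},
                              **{v: setting[v] for v in Wp},
                              **{v: actual[v] for v in Zp}}))
                   for Wp in subsets(W) for Zp in subsets(Z)):
                return True
    return False

# Toy model: the forest fire FF occurs iff the match is lit (ML) and there
# is oxygen (O); both hold in the actual context.
eqs = {"ML": lambda v: v["u1"], "O": lambda v: v["u2"],
       "FF": lambda v: v["ML"] and v["O"]}
ctx = {"u1": 1, "u2": 1}
print(is_sufficient_cause(eqs, ctx, ["ML"], {"ML": 1}, lambda v: v["FF"] == 1))  # True
print(is_sufficient_cause(eqs, ctx, ["O"], {"O": 1}, lambda v: v["FF"] == 1))    # True
```

Both the lit match and the oxygen come out as sufficient causes of the fire in this context, as the definition intends; a variable with no influence on FF would fail AC2 for every partition.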

Acknowledgments: Thanks to Riccardo Pucella and Vicky Weissman for useful comments.

References

Chajewska, U. and J. Y. Halpern [1997]: Defining explanation in probabilistic systems. In Proc. Thirteenth Conference on Uncertainty in Artificial Intelligence (UAI '97), pp. 62–71.

Eiter, T. and T. Lukasiewicz [2002]: Complexity results for structure-based causality. Artificial Intelligence 142:1, pp. 53–89.

Gärdenfors, P. [1988]: Knowledge in Flux. Cambridge, Mass.: MIT Press.

Goldberger, A. S. [1972]: Structural equation methods in the social sciences. Econometrica 40:6, pp. 979–1001.

Halpern, J. Y. and J. Pearl [2004]: Causes and explanations: A structural-model approach. Part I: Causes. British Journal for Philosophy of Science.

Hempel, C. G. [1965]: Aspects of Scientific Explanation. Free Press.

Henrion, M. and M. J. Druzdzel [1990]: Qualitative propagation and scenario-based approaches to explanation of probabilistic reasoning. In Uncertainty in Artificial Intelligence 6, pp. 17–32.

Hopkins, M. [2001]: A proof of the conjunctive cause conjecture. Unpublished manuscript.

Lewis, D. [1986]: Causal explanation. In Philosophical Papers, Volume II, pp. 214–240. New York: Oxford University Press.

Lewis, D. K. [1973]: Counterfactuals. Cambridge, Mass.: Harvard University Press.

Pearl, J. [1988]: Probabilistic Reasoning in Intelligent Systems. San Francisco: Morgan Kaufmann.

Pearl, J. [1995]: Causal diagrams for empirical research. Biometrika 82:4, pp. 669–710.

Pearl, J. [2000]: Causality: Models, Reasoning, and Inference. New York: Cambridge University Press.

Salmon, W. C. [1989]: Four Decades of Scientific Explanation. Minneapolis: University of Minnesota Press.

Scriven, M. J. [1959]: Explanation and prediction in evolutionary theory. Science 130, pp. 477–482.

Shimony, S. E. [1991]: Explanation, irrelevance and statistical independence. In Proceedings, Ninth National Conference on Artificial Intelligence (AAAI '91), pp. 482–487.

Sosa, E. and M. Tooley (Eds.) [1993]: Causation. Oxford Readings in Philosophy. Oxford: Oxford University Press.

Woodward, J. [2001]: Explanation. In The Blackwell Guide to the Philosophy of Science. To appear.
