PhilPapers: works by Suzanne Kawamleh (1 filter applied)
  1. Against explainability requirements for ethical artificial intelligence in health care. Suzanne Kawamleh - 2023 - AI and Ethics 3 (3):901-916.
  2. Can Machines Learn How Clouds Work? The Epistemic Implications of Machine Learning Methods in Climate Science. Suzanne Kawamleh - 2021 - Philosophy of Science 88 (5):1008-1020.
    Scientists and decision makers rely on climate models for predictions concerning future climate change. Traditionally, physical processes that are key to predicting extreme events are either directly represented or indirectly represented. Scientists are now replacing physically based parameterizations with neural networks that do not represent physical processes directly or indirectly. I analyze the epistemic implications of this method and argue that it undermines the reliability of model predictions. I attribute the widespread failure in neural network generalizability to the lack of process representation. The representation of climate processes adds significant and irreducible value to the reliability of climate model predictions.
  3. Confirming (climate) change: a dynamical account of model evaluation. Suzanne Kawamleh - 2022 - Synthese 200 (2):1-26.
    Philosophers of science have offered various accounts of climate model evaluation, which have largely centered on model-fit assessment. However, despite the widespread prevalence of process-based evaluation in climate science practice, this sort of model evaluation has been undertheorized by philosophers of science. In this paper, I aim to expand this narrow philosophical view of climate model evaluation by providing a philosophical account of process evaluation that is rooted in a close examination of scientific practice. I propose dynamical adequacy as a metric by which scientists test and evaluate models that represent and produce key regional climate processes and features. I argue that process-based evaluation confirms the adequacy of a regional climate model for simulating and predicting future changes of a specific regional climate feature. I offer a case study of how, in practice, scientists establish the reliability of model projections by assessing the dynamical adequacy of such models. I also show how process-based evaluation mitigates some well-known shortcomings of model-fit assessment and supports the adequacy and reliability of climate model projections against philosophical objections, like confirmational holism. Such adequacy is especially important for decision makers who need reliable regional model projections to guide their climate change mitigation and adaptation policies.