Accuracy and uncertainty

The text on this page is taken from an equivalent page of the IEHIAS-project.

The results of assessments clearly need to be accurate if they are to provide reliable and robust information to policy-makers and other stakeholders concerned about environmental health issues. Accuracy, however, is not an absolute, but a matter of degree. All assessments are to some degree uncertain (and thus contain inaccuracies), if only because the phenomena being studied are themselves variable and somewhat unpredictable. What matters is whether the results are accurate enough to serve their purpose. To know this, we need to understand, and be able to evaluate, the uncertainties involved in the assessment and their implications for the consequences of any decisions that might be made.

Identifying how uncertainties might arise, where they come from, and how they affect the results is therefore an important element of any assessment. This is not something that can be left to the end, and 'bolted on' to the analysis once the assessment is complete. To do so would not only risk missing important uncertainties, which were not evident in the results, but would also mean that the accuracy of the results might be unnecessarily compromised: uncertainties that could have been avoided or reduced by judicious changes in the assessment procedures would have been allowed to develop and survive. Instead, careful examination of potential sources of uncertainty needs to be carried out during the Design stage, in order to:

  1. identify likely areas of uncertainty;
  2. map out how these uncertainties might persist and propagate through the assessment process;
  3. assess the likely scale of the uncertainties involved;
  4. evaluate the potential overall effect of the various uncertainties, in combination, on the assessment results;
  5. define and evaluate possible ways of reducing or eliminating the uncertainties.
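Steps 2-4 are commonly carried out by Monte Carlo simulation: sampling each uncertain input from an assumed distribution, propagating the samples through the assessment model, and summarising the spread of the results. The sketch below illustrates this for a deliberately simple toy impact model; the model form, distributions and parameter values are illustrative assumptions, not part of the IEHIAS methodology.

```python
import math
import random

def attributable_cases(population, exposure, slope):
    # Toy impact model (an assumption for illustration):
    # cases = population * exposure * exposure-response slope.
    return population * exposure * slope

random.seed(1)
n = 10_000
results = []
for _ in range(n):
    # Hypothetical uncertain inputs:
    exposure = random.lognormvariate(math.log(10.0), 0.3)  # µg/m³
    slope = random.gauss(1e-4, 2e-5)  # cases per person per µg/m³
    results.append(attributable_cases(100_000, exposure, slope))

# Summarise the combined effect of the input uncertainties (step 4).
results.sort()
median = results[n // 2]
lo, hi = results[int(0.025 * n)], results[int(0.975 * n)]
print(f"median: {median:.0f} cases, 95% interval: [{lo:.0f}, {hi:.0f}]")
```

Repeating such a simulation with one input fixed at its best estimate shows how much that input contributes to the overall spread, which feeds directly into step 5.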

As with most elements of Design, this is not a one-off procedure. The first four steps require reiteration, for example to explore the possible consequences of any changes in the assessment methodology suggested in step 5. The whole process also needs to be repeated at intervals throughout the assessment, to ensure that the initial evaluation is still valid and that unforeseen uncertainties have not arisen as the assessment has progressed.

None of this is easy, for uncertainties take many forms and, by their very nature, arise largely from lack of prior knowledge, e.g. about the state of the system under consideration, how it operates, or how these processes can be simulated and represented. Uncertainty analysis is therefore an attempt to explore and describe the unknown. Users also often have a shaky appreciation of both the conceptual and statistical characteristics of uncertainty, so they need help in understanding what the results of uncertainty analysis actually mean. For these reasons, special attention needs to be given to ensuring that:

  1. a clear conceptual framework is used for characterising uncertainties;
  2. methods of specifying and evaluating the uncertainties are consistent, robust and reliable;
  3. the results of the uncertainty analysis are reported in an understandable and unambiguous form.

Further information on concepts and methods for characterising uncertainty is provided by the links below.


Integrated Environmental Health Impact Assessment System
IEHIAS is a website developed by two large EU-funded projects, Intarese and Heimtsa. The content from the original website was moved to Opasnet.