
Traits of a successful Monitoring, Evaluation and Reporting program

Nick Marsh – Yorb Pty Ltd


Supporting adaptive water planning and management – what does that mean?

1. Assessing compliance with water plan volumetric specifications
2. Assessing ecological condition
3. Ecological condition in relation to watering regimes
4. Testing hypothesised hydro-ecological associations
5. What about non-flow drivers such as land management and water quality?
6. Short-term issues – drought/flood response

• We often have a legislative requirement to do only 1) – volumetric compliance (a minimal check is sketched below) – but we stretch the budget to achieve all six.
• So how do you get the best return on MER investment?
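As a minimal sketch of item 1, a volumetric compliance check might compare recorded annual diversions against a plan cap. The figures, units and variable names below are illustrative assumptions, not values from any particular water plan.

```python
import pandas as pd

# Minimal sketch of a volumetric compliance check, assuming a hypothetical
# record of annual diversions and an illustrative plan cap (both in GL/year).
diversions = pd.Series({2011: 940.0, 2012: 1020.0, 2013: 880.0}, name="diversion_GL")
plan_limit = 1000.0  # illustrative cap from a water plan

within_cap = diversions <= plan_limit
print(within_cap)  # True where the annual take is within the cap
print("Years exceeding the cap:", list(diversions[diversions > plan_limit].index))
```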


Traits of MER Programs

1. Data storage, management and dissemination
2. Reporting: what, who, how
3. Risks and Hazards: does my bad = your bad?
4. Scalability (both in time and space)
5. Co-ordination with other data sets in space and time
6. Representativeness
7. Statistical power: is the sample size adequate?


1) Data storage, management and dissemination

• Are the underlying data sets easily accessible for use by public, private or research projects?
• Excellent data sets that are hidden from the public domain severely limit their value.
• The excuses:
  1. We didn't budget for a good data storage and retrieval system
  2. It might be used for evil if it gets into the wrong hands
  3. The Bureau of Meteorology will do it
• Example solutions:
  • Vic Data warehouse
  • EHMP data portal


1) Data storage continued: data mashup ... Big Data ...

• The opportunities of combining datasets from novel sources are largely untapped in NRM.
• Consider the finance sector – potential data sources to inform their decision making include stock and commodity data, international long-term weather forecasts, and community trends expressed through blogs and tweets.
• Making datasets easy to access allows them to be applied in unexpected and potentially useful ways. How does the international cotton price affect wetland health?
• Data storage and delivery should focus on providing maximum volume and ease of access. Let the researchers decide its value.
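A minimal sketch of such a mashup, assuming two hypothetical, openly accessible CSV files (a wetland condition series and a cotton price series). The file and column names are illustrative; the point is only that easy access makes this kind of exploratory join trivial.

```python
import pandas as pd

# Hypothetical mashup: join a wetland condition series with a commodity price
# series. File and column names are illustrative only.
wetland = pd.read_csv("wetland_condition.csv", parse_dates=["date"])
cotton = pd.read_csv("cotton_price.csv", parse_dates=["date"])

# Resample both to annual values so they line up in time
wetland_annual = wetland.set_index("date")["condition_score"].resample("YS").mean()
cotton_annual = cotton.set_index("date")["price_usd"].resample("YS").mean()

# The mashup itself: align the series and look for an association
combined = pd.concat(
    {"condition": wetland_annual, "cotton_price": cotton_annual}, axis=1
).dropna()
print(combined.corr())
```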


2) Reporting

• What is the required rigor of the analysis? Presenting 'condition' scores requires a different level of data collection and analysis to presenting a 'trend' or testing a hypothesis.
• Reporting should define the minimum data collection, not the other way around.
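To make the difference in rigor concrete, here is a minimal sketch of a trend test (a Mann-Kendall-style test of condition scores against time). The scores are made-up numbers; a single year's condition score could not support a trend claim like this.

```python
from scipy.stats import kendalltau

# Illustrative condition scores over seven years (made-up numbers)
years = [2008, 2009, 2010, 2011, 2012, 2013, 2014]
scores = [0.62, 0.60, 0.58, 0.55, 0.56, 0.52, 0.50]

# Mann-Kendall-style trend test: Kendall's tau of score against time
tau, p_value = kendalltau(years, scores)
print(f"tau = {tau:.2f}, p = {p_value:.3f}")
```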

3) Risks and Hazards

• Is my bad = your bad?
• At large spatial scales, differences in defining harm can be big – is low dissolved oxygen in Tasmanian streams the same as low dissolved oxygen in Northern Territory streams?
• The European Union Water Framework Directive developed monitoring and reporting requirements based on 'equivalency' panels, so reporting of adverse conditions has the same meaning.


4) Scalability (both in time and space)

• Can the program be expanded and contracted in line with monitoring requirements and budgetary limitations?
• ... is it OK to report at a catchment level every other year rather than at reach level?
• ... what statistical power is lost if one third of the sites are not sampled for a year?
• Are there well documented protocols and training programs that allow a wide pool of monitoring expertise, so the program is not expertise-limited?


5) Co-ordination with other data sets in space and time

[Figure: gauge sites and EPA reaches]

• Do the monitoring programs coincide with other data collection programs, and could this add value to the MER programs?
  • hydrometric measurement stations
  • regions of Lidar surveys
  • vegetation mapping studies
  • research programs


6) Representativeness

• What are the approaches for site selection, and does the method of site selection restrict future use of the data?
• Example – non-random site selection; how 'dry' sites are recorded. A stratified random selection is sketched below.
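A minimal sketch of one representative design (stratified random selection of sites by catchment), assuming a hypothetical site register file and column names. It illustrates the idea only; it is not the selection method of any particular program.

```python
import pandas as pd

# Minimal sketch of stratified random site selection, assuming a hypothetical
# site register with 'site_id' and 'catchment' columns.
sites = pd.read_csv("site_register.csv")

# Draw up to five sites per catchment; the fixed seed keeps the draw repeatable
# and documentable, which matters for later reuse of the data.
selected = (
    sites.groupby("catchment", group_keys=False)
         .apply(lambda g: g.sample(n=min(len(g), 5), random_state=42))
)
print(selected[["site_id", "catchment"]])
```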

7) Adequate statistical power

• Adequate statistical power to detect the target effect size: is the sample size adequate? The measurement of any parameter is only useful if the sample size is large enough to detect a change that would trigger some regulatory response.
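A minimal power calculation sketch using statsmodels; the effect size, significance level and power targets are illustrative assumptions, not program requirements. The second call also speaks to the scalability question above: how much power remains if a third of the sites go unsampled for a year.

```python
from statsmodels.stats.power import TTestIndPower

# Illustrative assumptions: a 'medium' effect size (Cohen's d = 0.5),
# alpha = 0.05 and a target power of 0.8.
analysis = TTestIndPower()
n_required = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Sites needed per group: {n_required:.0f}")

# Power remaining if a third of those sites are not sampled in a given year
power_reduced = analysis.solve_power(
    effect_size=0.5, alpha=0.05, nobs1=n_required * 2 / 3
)
print(f"Power with one third of sites unsampled: {power_reduced:.2f}")
```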


Compare a few programs

Trait                               FARWH   EHMP   California    Chesapeake Bay  Chesapeake Bay  EU Water Framework
                                                   Water Board   (Estuarine)     (Freshwater)    Directive

Purpose (X indicates purpose of program)
Collection                          X   X
Integration                         X       X      X             X               X               X
Reporting                           X       X      X             X               X               X

Traits (L = low emphasis, M = medium emphasis, H = high emphasis in program)
Data storage, management,
and dissemination                   L       H      M             H               L               L
Reporting                           M       H      M             H               M               L
Risks and Hazards                   L       M      L             M               L               M
Scalability                         H       M      M             M               M               H
Co-ordination with other data sets  M       M      L             H               L               L
Representativeness                  N/A     H      N/A           H               H               N/A
Statistical power                   N/A     H      N/A           N/A             H               N/A


Monitoring purpose

Early Warning – measure: Pressure and Response
  Pressure, e.g. Cease to Flow, No. Zero Flow days
  Response, e.g. Critical NDVI values

Compliance – measure: Both
  Pressure, e.g. Cease to Flow, ∆ Baseflow Index
  Response, e.g. Water quality thresholds

Diagnostic
  Pressure, e.g. ∆ Baseflow Index, ∆ Water quality
  Response, e.g. SIGNAL Score, Trait representation

MER Trait                                    Early Warning   Compliance   Diagnostic
Data storage, management and dissemination   Critical        Critical     Important
Reporting                                    Critical        Critical     N/A
Risks and Hazards                            Critical        Important    Important
Scalability                                  Important       Critical     Important
Co-ordination with other data sets
in space and time                            Important       N/A          Critical
Representativeness                           Important       Important    Important
Adequate statistical power                   Important       Critical     Critical
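As a sketch of how the pressure metrics above might be computed from a daily flow record: counting zero-flow (cease-to-flow) days and estimating a Baseflow Index with a single-pass Lyne-Hollick digital filter. The flow series, cease-to-flow threshold and filter parameter are illustrative assumptions; operational programs typically use an agreed, documented variant.

```python
import numpy as np
import pandas as pd

def zero_flow_days(flow, threshold=0.0):
    """Number of days at or below a cease-to-flow threshold."""
    return int((flow <= threshold).sum())

def baseflow_index(flow, alpha=0.925):
    """Baseflow Index from a single-pass Lyne-Hollick digital filter.
    The filter parameter and single pass are simplifying assumptions."""
    q = np.asarray(flow, dtype=float)
    quick = np.zeros_like(q)
    for t in range(1, len(q)):
        quick[t] = alpha * quick[t - 1] + (1 + alpha) / 2.0 * (q[t] - q[t - 1])
        quick[t] = min(max(quick[t], 0.0), q[t])  # keep baseflow between 0 and Q
    base = q - quick
    return base.sum() / q.sum() if q.sum() > 0 else float("nan")

# Illustrative daily flow record (ML/day) - synthetic data, not a real gauge
daily_flow = pd.Series(np.random.default_rng(1).gamma(shape=0.5, scale=20.0, size=365))
print("No. zero-flow days:", zero_flow_days(daily_flow, threshold=0.1))
print("Baseflow Index:", round(baseflow_index(daily_flow), 2))
```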


MER for adaptive water planning and management (AWPM)

• Data storage and dissemination should be a high priority
• Centres of distribution and agreed measures of 'equivalence', but no need to centralise the collection and reporting
• Water resource planning and management is driven by environmental requirements
• Diagnostic reporting is critical – get long-term biological data sets into the public domain

                                             AWPM
Purpose
Collection                                   X
Integration                                  X
Reporting                                    X

Traits
Data storage, management, and dissemination  H
Reporting                                    M
Risks and Hazards                            H
Scalability                                  H
Co-ordination with other data sets           H
Representativeness                           H
Statistical power                            H
