Grant Selection and Performance Task Force

May 11, 2016

 

Present: Ed Barrett, Rob Liscord, David Wihry, Joe Young

Absent: Kevin Bois, Laura Hudson, Nicole Pellenz

Staff: Michael Ashmore, Pelin Fitzpatrick

Guests: John Portela

Minutes:

  1. Support Grant Performance Measure updates from Regional Conference

John Portela participated in this portion of the meeting to share his experience and thoughts about the National Service Atlantic Region Conference he attended in Springfield (staff members MA and PF also attended).

JP made the following observations: the conference, though not particularly designed for Commissioners, was informative and was, especially, an opportunity to meet with programs and staff members from other areas of the Region.  JP described his participation in a session with Bill Basl, Director of AmeriCorps, noting Basl's interest in leadership and why it is critical to take the time to step outside of day-to-day work and focus on leadership tasks.  He also attended sessions on Asset-Based Community Development, external marketing, and OIG findings.  He described the recent OIG findings regarding NACHC, noting that there is no connection to the Maine Commission and that NACHC no longer operates in Maine.

JP went on to describe the post-conference session with ASC and representatives of other states in the region concerning performance measures for Commission Support Grants.  He described the session as robust and informative, relaying that the representatives from Maine were able to push for consideration of capacity in rural states and states with limited Commission staffing resources.  In particular, the discussion centered on the draft PMs sent out by ASC.  The Maine representatives stated that they did not want Commissions to be assumed to manage only AmeriCorps grants simply because other Commission activities might not be common to all Commissions or might be difficult to measure.

DW thanked JP for his willingness to report out to the Grants Task Force and to take the time to attend the conference in Springfield. JP, in turn, thanked DW for his ongoing participation in these discussions with ASC.

  2. Review of Risk Management Tool testing results

The members reviewed the results of the “test drive” of the proposed Risk Management tool.  MA and PF used the tool to assess current grantees to see whether the resulting scores matched their sense of each grantee’s overall risk and whether the draft scoring scale had appropriate benchmark score splits.

MA reported that, overall, the tool worked very well.  It was reasonably simple to execute once the data sources were identified and could be applied to a grantee in 15 to 20 minutes.  The tool was used on five current grantees and one past grantee.

Two questions would be assessed based on interview contact with a sample of site supervisors and members; those interview tools are not yet developed.  Members suggested updating the split points once these are in place.

Observations included the fact that new grantees would not have enough data to be realistically scored, although existing policy indicates that all new grantees would be considered high risk.  The recommendation was to add a section to the form identifying new grantees and noting their high-risk status.  It was also noted that fixed-price grants do not get scored on two of the included metrics, thereby lowering their overall score.  Since fixed-price grants are designed to be lower risk, this result is appropriate, but one member suggested giving them at least a single point in those areas.  Scoring of EMDC was also discussed, as it was a challenge to score them given the shared history of their program management; it was noted that they would also be a new grantee and be considered high risk.  Members suggested that the medium-risk score band be narrowed by lowering the high-risk band to 80 points.

Timing of the assessment was also discussed.  Members settled on developing a procedure in which the tool would be used immediately following the submission of Fall Grantee Progress Reports to CNCS, since the data accumulated for that reporting would be readily available.  Next steps are to align the current risk management policy with the form and its use and to develop the questionnaires for supervisors and members.

  3. Dates for Formula Continuation Review

AmeriCorps State Formula Applications were submitted last week and must be reviewed by the June Commission meeting.  Key review dates are noted here:

5-12-16        Continuation applications due with associated documents

5-17-16        Materials posted for review

5-23-16        All Staff Assessments posted

6-6-16         Consensus meeting to develop recommendations and establish any conditions

6-8-16         GTF confirms recommendations of review panel

6-10-16        Materials posted with Commission agenda

6-17-16        Commission votes on continuation requests

 

  4. Mid-Year GPRs

MA shared copies of the submitted Grantee Progress Report summaries sent to CNCS before the April 30 deadline.  He summarized and explained the information included in the documents, noting that performance measure data is rarely complete at this time of year because data collection is ongoing.  He also pointed out that some programs had “red” indicators on exit approval timing because they had no exits to date, so the system defaults to that indication.

He also noted that total enrollment is the lowest it has ever been in Maine and that this is a problem for programs across the country.  He described the work at the most recent AmeriCorps Technical Assistance meeting to improve recruitment going into the upcoming program year.

  5. Disallowance and MCC Audit update

MA informed the group that, while attending the Regional Conference, he attended a session on disallowance resulting from errors in sub-recipient background check processing.  He shared that CNCS has moved away from 100% disallowance of funds related to record check errors in favor of a risk management process that assigns a value to each error based on its total risk and the level of mitigation the program had or has in place; for example, the frequency of incidence and the type of error(s) made influence the total amount of disallowance.  CNCS has issued draft guidance on how to apply this in the field and has developed a form to use in determining the total disallowance.  MA noted that MCC had a total of 18 errors in processing background checks over the past two grant years and that he would be using the guidance to develop the total disallowance.