Grant Selection and Performance Task Force

PRESENT: Ed Barrett, Rob Meinders, Matt L’Italien, Becky Hayes-Boober, Michael Moran, Jamie McFaul, Maryalice Crofton

The task force convened by videoconference at 8:30 am.

State process comparison of RFP vs. RFA. State Procurement Services has recognized the differences between grants and competitive bids for goods/services. They have designated Request for Applications as the grant format and developed a different site for grants. This happened over the last year.

Through Procurement Services, we have been operating under RFP rules for years and have needed constant waivers. Under RFP rules, cost is the most significant factor in scoring, and money was awarded by rank score (highest point score first) until funds ran out. With the RFA process, instead of holding one or two bidders' conferences, we would have more coaching opportunities for applicants.

We are proposing to move to the RFA process, which has significant implications. First, the funding priorities have a stronger role in scoring. New language in the RFA will reflect new Commission funding discretion, potential applicants would be coached through a series of training sessions (which would be recorded), and applicants would still have the right to appeal decisions. When making final recommendations to the Commission for funding, the task force would first identify all applicants eligible for consideration (i.e., remove any that should not be funded). It would then separate applicants into two categories: those that match the funding priorities, and other. It would first consider awards to those in the funding priorities category. In the event there are funds not allocated there, the "other" category would be considered.

We plan to standardize the things that we have control over, especially in the AmeriCorps Formula competition. Staff have a series of recommendations for task force consideration. 
•    AmeriCorps formula operating grants, both rural and standard: standardize the narrative and collect only what is required by federal statute and regulations. The logic models give condensed information, not the level of detail task force members have wanted, such as how members will be supervised. We would continue requiring performance measures for member development and community engagement.
•    Planning grant narrative content: no change.
Staff question: is there value in asking whether this work has been done in the past by other volunteers or employees? The concern is two-fold. First, this certification sits in a part of the application that applicants complete at the last minute using check boxes, and the language is buried in a lengthy legal pop-up screen. Second, as ARP funding starts to wane over the next few years, organizations may begin looking to AmeriCorps to step in where ARP had covered employees. This happened with ARRA. Task force members concluded the question should be added as a certification, like the labor union concurrence, and submitted as an additional document.
•    Assessing program sustainability: find a better way to assess applicant capability and stability. The number of issues that do not surface until a program is well into its first year is becoming a problem. In the last rural grant competition, the task force experimented with a structured interview (AmeriCorps readiness) rather than submitted documents. The recommendation is to adopt the interview format and use it in all formula and planning grant competitions.

Task force members discussed the merits of doing the interviews as a panel or having applicants interviewed and recorded for later viewing by the task force. The decision was to go with recorded interviews and continue using the structured interview questions developed for the trial period.

Task force members raised two additional issues. It was agreed the interviews would not be done for continuation years, only during the initial application, and would include recompeting organizations. Separately, the consideration of budget and cost was raised. With the deliberate increase in fixed amount grants, scoring the financial plan seems impossible. Staff noted that, in fixed amount grants, insight into sufficient resources shows up under Source of Funds: the task force can look at the degree of specificity about where funds will come from and how much is speculative versus committed. Staff acknowledged the confusion that comes with shifting from reviewing expenses to reviewing supporting funds.

The next discussion item was a review of the tech review scores. The criteria here come partly from the Code of Federal Regulations, in a section beginning "states shall also consider." The Commission's criteria also come into play. Task force members had previously raised concerns about how some elements were joined together as one scoring item. The staff proposal for modifying points across categories addresses that issue by splitting the item into three, gives funding priorities more weight, and standardizes the distribution across planning and operating grants.

Task force members discussed the point distribution and made some adjustments to better emphasize the importance of some elements. See chart below for final distribution.

New Grant Task Force Tech Review point distribution

Category                                                              Planning     Operating (Standard + Rural)
Funding Priority alignment                                            30           25
Program Model                                                         n/a          10
Commission Preferences (rural, partnership, marginalized community)   30 (10 ea)   15 (5 ea)
Financial Plan (for grant)                                            10           10
Fiscal Systems                                                        15           15
Past Performance                                                      n/a          10
Grant Readiness                                                       15           15

The new distribution will be used in the RFAs to be released this summer.

Meeting was adjourned at 9:35 a.m.