How to Submit a Proposal to the Maryland Space Grant Consortium

Email your proposal to Dr. Matthew Collinge at: collinge@jhu.edu


Anyone may propose; there is no prescribed format for proposals. Proposals should be no longer than a few pages, and must include a detailed budget identifying any non-Federal matching funds. Proposals must have a NASA interest, and may be in any area (education, public understanding of NASA objectives, space-based research, and the like) that addresses NASA’s major education goals:

  • Strengthen NASA and the Nation’s future workforce
  • Attract and retain students in STEM disciplines
  • Engage Americans in NASA’s mission

A proposal’s probability of being accepted is enhanced if:

  • Proposal is from a Maryland institution or individual.
  • Proposal is for less than about $20,000 (the smaller the better).
  • You already have half of what you need from non-Federal funds, so you are making a 100% non-Federal match to what you are requesting. (The match can be in-kind, and can include regular staff salaries.)
  • The proposal involves support for STEM students (and/or, aims at attracting/retaining such students). STEM is “Science, Technology, Engineering, and Mathematics.”
  • The proposal has a NASA Center connection, especially Goddard Space Flight Center.
  • The proposal involves more than one Maryland institution.
  • The proposal has a strong diversity emphasis (involvement of under-represented groups).
  • The proposal includes a detailed, well-founded Evaluation Plan (this is required).

No-no’s:

  • No overhead may be charged (MDSGC does not charge overhead to NASA).
  • No equipment may be purchased (Federal regulation for training grants).
  • Do not request support for regular staff salaries.
  • Only US citizens (students/faculty) may receive direct NASA Space Grant support.

Proposals are peer-reviewed by the Program Committee of the Maryland Space Grant Consortium. We pride ourselves on quick action and minimal bureaucracy. The proposer may be asked to present their proposal in person at a Program Committee meeting, and/or to report results at a Program Committee meeting following the conclusion of the project. A final report is required.

SOME HINTS FOR EVALUATING SMALL PROJECTS

The Maryland Space Grant Consortium (MDSGC) requires that all of the projects and programs that it funds provide a method of evaluation for the effort.  This description should be sufficiently detailed that the Program Committee can feel confident the program will be evaluated effectively to determine whether it met its stated goals.

It is important to realize that many proposal teams will not have expertise in program evaluation.  MDSGC does have such expertise available and will be happy to discuss the issues with potential proposers.  Another good source of information about evaluation for small projects is posted on the Space Telescope Science Institute’s website (STScI is a member of MDSGC).  The article by Bonnie Eisenhamer of STScI can be found at http://ideas.stsci.edu/Evaluation.shtml

Another important consideration is that MDSGC grants are small grants, which means that you cannot spend huge resources on an extensive evaluation, such as hiring a professional external evaluator.  So what can you do that is reasonable for a small grant?  The above URL is focused on answering that question for the (now defunct) IDEAS program, which had grants of similar size, so it is quite useful.  A general rule of thumb is that 10% of a program’s resources should be devoted to evaluation, but you will have to be flexible in planning for a small program.

Before you think about your evaluation, you need to have a clear plan of what your program is.  You need to identify the target audience – who will be the beneficiaries of this program?  Then, what is the outcome that you want to happen with this audience?

One of the most common mistakes for proposers is to indicate a broad audience, thinking that the more people they target the better.  Actually, you need to be very specific, so that a modest program has achievable goals.  When a proposal indicates that it will serve all K-12 students, it receives poor marks from reviewers.  These audiences have very different needs and require different approaches.  Similarly, there are differences in audience needs between students majoring in science, technology, engineering, and mathematics (STEM) fields and general education students taking an introductory STEM class.  Your goals for different audiences should be appropriate for each audience.

Evaluation is frequently divided into two types: formative and summative.  A formative evaluation is one that is done during the course of the project, and is generally aimed at improving the process or product during development.  This could include such classical activities as prototype testing, formal reviews, beta testing with end-users, etc.  The key here is that you use the results of the evaluation to improve the product or service being developed.  A second type of formative evaluation occurs while you are conducting your program.  Suppose, for example, that you are developing an internship experience for college engineering freshmen.  Midway through the program, you assess how well the program that you have designed is meeting the students’ needs.  This is commonly done in university classes by a questionnaire that the students fill out.  You might use such a method.  By doing this during the internship, you are able to make mid-course corrections based on the information that you receive.

For many small projects, the activity is of short duration and there isn’t a great deal of opportunity to assess the program mid-stream.  You may be compelled to use only a summative evaluation.  Essentially this is one that is done after the project is complete and evaluates how successful it was.  Once again, you have to have defined your desired outcomes in order to decide how to design an evaluation that determines if you attained them.

Formal vs. Informal Education
There are clearly significant differences in the audience and goals between formal education activities and informal education activities such as those that occur at museums, planetaria, and public observatories.  In the informal education arena you usually do not have control over who participates (the general public, perhaps), and so your goals have to be commensurate with a “walk-in” audience.  Some short post-experience interviews or quick questionnaires can be used to get a sampling of responses to the exhibit, show, etc.  During development, a focus group can be very useful for formative evaluation.

Common Mistakes Made in Selecting Goals and Objectives

Too Broad: Many proposals suggest very broad and lofty goals that cannot be met by a modest project.  If you are conducting, for example, an event for middle school students, then an indication that your goal is to improve their understanding and appreciation of STEM subjects is far too broad for a single event (or even for a huge number of events!).  You should identify the specific content that you wish the students to learn as a result of their participation.

Giving a long list of goals for a small project is equally unlikely to be successful.  Select one or a very few specific ones.  Specific goals that identify the expected outcome are what work.

Unmeasurable: This category covers a number of common problems.  Your objective must be something that can be measured, that you are able to measure, and for which you have identified how you will measure it.

If you indicate that you will improve your participants’ knowledge of some content, then you will need a mechanism that measures their level of knowledge before their involvement and again afterwards.  This will require a mechanism to follow up with the participants.  In some cases you may have continuing contact with this audience and may have such a mechanism, but in others (people participating in a public event, for example) you may not have such an opportunity.
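
As a concrete illustration of this pre/post mechanism, the short Python sketch below computes each participant’s knowledge gain from quiz scores taken before and after an activity.  All of the scores are invented for the example; a real evaluation would substitute your own quiz results.

```python
# Illustrative sketch of a pre/post knowledge comparison.
# The quiz scores below are hypothetical example data: the same
# short content quiz is given before and after the activity.
pre_scores = [3, 4, 2, 5, 3]    # correct answers out of 10, before
post_scores = [6, 7, 5, 8, 6]   # same participants, after

# Per-participant gain, then the average gain across the group.
gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
mean_gain = sum(gains) / len(gains)
print(f"mean gain: {mean_gain:.1f} points out of 10")
```

Even a tally this simple documents whether knowledge improved, which is the kind of concrete, measurable evidence the Program Committee is looking for in an Evaluation Plan.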

If your stated goal is to excite the public about, for example, NASA missions, then you will need to know how you will determine that you have accomplished this goal.  This is hard without obtaining interviews or other direct feedback from participants.  Some useful indicators may be whether participants return for subsequent experiences or seek out more sources of information.  How will you know this?  Do you have a mechanism to determine it?

Consider the question: Was the workshop useful to you?  A yes-or-no response supplies some information, but not much that is useful for interpreting whether the goals of the workshop were met; yes-or-no answers can also be too limiting.  That is why you will frequently see people use a Likert questionnaire format.  In this format, a statement is made and the respondent is asked to strongly agree, agree, neither agree nor disagree, disagree, or strongly disagree with the statement.
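
To show how Likert responses can be summarized, here is a minimal Python sketch that tallies responses to a single item and computes a common numeric summary.  The responses and the scoring convention (strongly agree = 5 down to strongly disagree = 1) are hypothetical examples, not a prescribed method.

```python
# Illustrative sketch: tallying responses to one Likert item.
# The responses below are hypothetical example data.
from collections import Counter

SCALE = ["strongly agree", "agree", "neither agree nor disagree",
         "disagree", "strongly disagree"]

responses = ["agree", "strongly agree", "agree", "disagree",
             "agree", "neither agree nor disagree"]

# Count how many respondents chose each level of the scale.
counts = Counter(responses)
for level in SCALE:
    print(f"{level}: {counts.get(level, 0)}")

# One common convention maps the scale to 5..1 and averages.
score = {level: 5 - i for i, level in enumerate(SCALE)}
mean = sum(score[r] for r in responses) / len(responses)
print(f"mean score: {mean:.2f}")
```

Mapping the scale to numbers and averaging is a common convention, but for a small program, reporting the full distribution of responses is often more informative than a single mean.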

Is it the correct question?
The data collected must answer the question of whether the goal is met.  All too often program developers ask about whether or not the participants enjoyed the activity.  If your goal was for the participants to have enjoyed themselves, then this is appropriate, but it does not tell you whether or not the activity was useful to the participants or met any other goals.
