Judging Criteria

Judging at ISEF Affiliated Fairs

Every Intel ISEF affiliated fair has its own methodology for judging projects. We provide the following tips and judging criteria as suggested aids in your process. The following points may be of value to you and your judges as they go out to review and score projects.

  • Examine the quality of the finalist’s work, and how well the finalist understands his or her project and area of study. The physical display is secondary to the student’s knowledge of the subject. Look for evidence of laboratory, field or theoretical work, not just library research or gadgeteering.
  • Judges should keep in mind that a science fair is not only a competition, but also an educational and motivating experience for the students. The high point of the fair experience for most students is their judging interviews.
  • Students may have worked on a research project for more than one year. However, for the purpose of judging, ONLY research conducted within the current year is to be evaluated. Although previous work is important, it should not unduly impact the judging of this year’s project.
  • As a general rule, judges represent professional authority to finalists. For this reason, judges should use an encouraging tone when asking questions, offering suggestions or giving constructive criticism. Judges should not criticize, treat lightly, or display boredom toward projects they personally consider unimportant. Always give credit to the finalist for completing a challenging task and/or for their success in previous competitions.
  • Compare projects only with those competing at this fair and not with projects seen in other competitions or scholastic events.
  • It is important in the evaluation of a project to determine how much guidance was provided to the student in the design and implementation of his or her research. When research is conducted in an industrial or institutional setting, the student should have documentation, most often the Intel ISEF Form 1C, that provides a forum for the mentor or supervisor to discuss the project. Judges should review this information in detail when evaluating research.
  • Please be discreet when discussing winners or making critical comments in elevators, restaurants, or elsewhere, as students or adult escorts might overhear. Results are confidential until announced at the awards ceremony.


Evaluation Criteria for Category Judging
The criteria and questions below are used by the Grand Awards Judges of the Intel ISEF and are suggested as a guide for your category judging. Scientific Thought and Engineering Goals are separated into IIa. and IIb. to be used appropriately by category. There are also added questions for team projects.


I. Creative Ability (Individual - 30, Team - 25)

1. Does the project show creative ability and originality in:

• the questions asked?
• the approach to solving the problem?
• the analysis of the data?
• the interpretation of the data?
• the use of equipment?
• the construction or design of new equipment?

2. Creative research should support an investigation and help answer a question in an original way.
3. A creative contribution promotes an efficient and reliable method for solving a problem. When evaluating projects, it is important to distinguish between gadgeteering and ingenuity.


II a. Scientific Thought (Individual - 30, Team - 25)
For an engineering project, the more appropriate questions are those found in IIb. Engineering Goals.


1. Is the problem stated clearly and unambiguously?
2. Was the problem sufficiently limited to allow a plausible approach? Good scientists can identify important problems capable of solution.
3. Was there a procedural plan for obtaining a solution?
4. Are the variables clearly recognized and defined?
5. If controls were necessary, did the student recognize their need and were they correctly used?
6. Are there adequate data to support the conclusions?
7. Does the finalist or team recognize the data’s limitations?
8. Does the finalist/team understand the project’s ties to related research?
9. Does the finalist/team have an idea of what further research is warranted?
10. Did the finalist/team cite scientific literature, or only popular literature (e.g., local newspapers, Reader’s Digest)?


II b. Engineering Goals (Individual - 30, Team - 25)
1. Does the project have a clear objective?
2. Is the objective relevant to the potential user’s needs?
3. Is the solution workable? Acceptable to the potential user? Economically feasible?
4. Could the solution be utilized successfully in design or construction of an end product?
5. Is the solution a significant improvement over previous alternatives?
6. Has the solution been tested for performance under the conditions of use?


III. Thoroughness (Individual - 15, Team - 12)
1. Was the purpose carried out to completion within the scope of the original intent?
2. How completely was the problem covered?
3. Are the conclusions based on a single experiment or replication?
4. How complete are the project notes?
5. Is the finalist/team aware of other approaches or theories?
6. How much time did the finalist or team spend on the project?
7. Is the finalist/team familiar with scientific literature in the studied field?


IV. Skill (Individual - 15, Team - 12)
1. Does the finalist/team have the required laboratory, computation, observational and design skills to obtain supporting data?
2. Where was the project performed? (e.g., home, school laboratory, university laboratory) Did the student or team receive assistance from parents, teachers, scientists, or engineers?
3. Was the project completed under adult supervision, or did the student/team work largely alone?
4. Where did the equipment come from? Was it built independently by the finalist or team? Was it obtained on loan? Was it part of a laboratory where the finalist or team worked?


V. Clarity (Individual - 10, Team - 10)
1. How clearly does the finalist discuss his/her project and explain the purpose, procedure, and conclusions? Watch out for memorized speeches that reflect little understanding of principles.
2. Does the written material reflect the finalist’s or team’s understanding of the research?
3. Are the important phases of the project presented in an orderly manner?
4. How clearly are the data presented?
5. How clearly are the results presented?
6. How well does the project display explain the project?
7. Was the presentation done in a forthright manner, without tricks or gadgets?
8. Did the finalist/team perform all the project work, or did someone help?


VI. Teamwork (Team Projects Only - 16)
1. Are the tasks and contributions of each team member clearly outlined?
2. Was each team member fully involved with the project, and is each member familiar with all aspects?
3. Does the final work reflect the coordinated efforts of all team members?

Addendum to Category Judging
Projects are to be judged for a First, Second, Third, or Honorable Mention ribbon relative to the ISEF judging criteria, taking into consideration the grade level of the student. Each project is to be judged relative to itself and the criteria, rather than relative to others in its category. Thus, all projects in a category could be awarded a first place ribbon (rare) or no project could be awarded a first place (also rare).

A project which deserves a first place ribbon is one in which all criteria are well satisfied. A second place might be awarded to a project somewhat deficient in one or more categories. A third place might be awarded to a project which is seriously deficient in one or more categories, but which displays a significant effort or learning experience (e.g., a "good try"). Projects which fail to meet any of the above would be awarded the Honorable Mention ribbon; however, this should be used sparingly.

The decision of the judges should be made by circling the appropriate level on the front of the judging card, and constructive criticism and/or encouragement should be offered in the space for comments on the back of the cards. Comments are extremely important to the students, as they offer tangible evidence of the interest of the judges in their work.