RELATED APPLICATIONS The present application relates to, and claims priority from, U.S. Provisional Application No. 60/544,657, filed on Feb. 14, 2004, and entitled “Method and System for Improving Performance on Standardized Examinations.” The provisional application names Tarek A. Fadel and Kingsley Martin as joint inventors. The provisional application is incorporated by reference herein in its entirety, including the specification, drawings, claims, abstract, and the like.
FEDERALLY SPONSORED RESEARCH OR DEVELOPMENT [Not Applicable]
MICROFICHE/COPYRIGHT REFERENCE [Not Applicable]
BACKGROUND OF THE INVENTION The present invention generally relates to a computerized learning approach. In particular, the present invention relates to a system and method for improving performance on timed examinations.
Many standardized multiple choice tests exist, such as the Multistate Bar Examination (“MBE”), that rigorously test abilities in several different areas in a time-limited manner. For instance, the MBE tests six substantive areas of law with multiple-choice questions. The MBE forms a significant portion of the bar examination in most of the states in the United States. The ability of test-takers to achieve a passing score on this portion of the bar examination is critical: it is the difference between being able to practice law in a particular state and not. Consequently, those who desire to pass the bar examination spend a great deal of time studying for the MBE.
Known study approaches make use of written materials and rely on a user's discipline and drive to keep the user working. Several bar review courses provide potential examinees with workbooks having several hundred practice questions that the user can work through as he/she sees fit. The workbooks also contain answers and explanations for the answers.
A major problem with these known and traditional approaches is that they do not, and cannot, force the user to study in a consistent, systematic and effective way. As a result, users typically study in a haphazard way, which varies with their mood, desire and drive. The danger with these conventional approaches is that users tend not to develop a consistent problem-solving approach, but instead develop and utilize inefficient and undesirable study habits. Another serious problem is that users also tend not to fully understand a question, why one answer choice is correct, and why the other choices are incorrect. Therefore, a method and system for preparing for an examination that assists a user in developing a consistent problem-solving approach and aids the user in understanding the question and answer is highly desirable.
Additionally, users preparing for a test may spend too much time or too little time preparing for a certain topic or subject that will be on the test. Doing practice questions from a written booklet may cause the user to repeatedly answer questions in areas where the student does not need additional work. If the booklet of questions is divided into sections, the user may study a section in the beginning of his/her preparation, but never revisit that section or the questions in the section as his/her preparation progresses. Therefore, a method and system for preparing for an examination that aids a user in efficiently preparing for an exam without ignoring topics the user has already mastered is highly desirable.
Another major problem is that test-takers may spend too little or too much time on a question when taking an actual exam or a practice exam. Generally, a test-taker has only a limited amount of time to answer each question on an exam. Spending too much time on a question means that the test-taker will not be able to spend enough time on a future question. Spending too little time on a question reduces the likelihood that the test-taker properly read the question and reviewed the answer choices. Constantly checking how much time has elapsed may cause the test-taker to lose precious time when taking the exam. Therefore, a method and system for preparing for an examination that aids a user in developing skills to efficiently manage time on the examination is highly desirable.
Thus, there is a need for a method and a system for preparing a user for an examination that offers greater efficiency and effectiveness by allowing the user to study and prepare to take the test in a consistent, systematic, and timely way.
BRIEF SUMMARY OF THE INVENTION A preferred embodiment of the present invention provides a method and system for improving performance on examinations. The method and system include presenting a user with a plurality of questions, timing the user's response time for the plurality of questions, identifying an optimal answer time at which the user is able to answer a maximum percentage of questions correctly, and informing the user of the optimal answer time. An embodiment of the method may also include providing the user with a suggested change in an actual time taken to answer a question, wherein the suggested change in time is the optimal answer time minus the actual time taken to answer the question. The plurality of questions may be multiple choice questions with a plurality of answer choices, at least one of which is a correct answer. Furthermore, an embodiment of the method may also include providing a graph that illustrates what percentage of questions answered in a given period of time are answered correctly.
Certain embodiments may also include weighting the chance of receiving a question in a particular topic based on the user's past performance in answering questions in that topic.
BRIEF DESCRIPTION OF SEVERAL VIEWS OF THE DRAWINGS
FIG. 1 illustrates an examination performance analysis and improvement system in accordance with an embodiment of the present invention.
FIG. 2 illustrates a flow diagram for a method for improving performance on examinations in accordance with an embodiment of the present invention.
FIG. 3 depicts a flow diagram for a method for performance analysis and benchmarking in accordance with an embodiment of the present invention.
FIG. 4 illustrates a flow diagram for a method for timing analysis and benchmarking in accordance with an embodiment of the present invention.
FIG. 5 illustrates a flow diagram for a method for calculating and displaying the user's optimal answer time in accordance with an embodiment of the present invention.
FIG. 6 depicts an example of a user interface displaying a question, a plurality of answer choices, and a question timer in accordance with an embodiment of the present invention.
FIG. 7 depicts an example of a user interface displaying a correct answer, whether a user got a question correct, the user's actual time spent on the question, a suggested change in actual time spent on the question, an estimated change in chances of getting the question correct, and explanations for the correct answer and why the other choices were incorrect according to an embodiment of the present invention.
FIG. 8 depicts an example of a toolbar offering a user several modes of reviewing the user's performance in accordance with an embodiment of the present invention.
FIG. 9 depicts an example of a subject benchmark chart in accordance with an embodiment of the present invention.
FIG. 10 depicts an example of a subject performance chart classified by major topic in accordance with an embodiment of the present invention.
FIG. 11 depicts an example of a timing performance chart in accordance with an embodiment of the present invention.
FIG. 12 depicts an example of a timing analysis chart in accordance with an embodiment of the present invention.
FIG. 13 depicts an example of a settings menu in accordance with an embodiment of the present invention.
FIG. 14 depicts an example of a subject performance chart classified by major topic and subtopic in accordance with an embodiment of the present invention.
FIG. 15 illustrates a flow diagram for a method of adaptive learning used in accordance with an embodiment of the present invention.
FIG. 16 depicts an example of a timing performance chart in accordance with an embodiment of the present invention.
The foregoing summary, as well as the following detailed description of certain embodiments of the present invention, will be better understood when read in conjunction with the appended drawings. For the purpose of illustrating the invention, certain embodiments are shown in the drawings. It should be understood, however, that the present invention is not limited to the arrangements and instrumentality shown in the attached drawings.
DETAILED DESCRIPTION OF THE INVENTION For the purpose of illustration only, the following detailed description references a certain embodiment of a Multistate Bar Examination (“MBE”) performance analysis and improvement system and method. It is understood that the present invention may be used with other test performance analysis and improvement systems, such as performance improvement systems for the certified public accountant (“CPA”) examination, the Law School Admission Test (“LSAT”), the real estate examination, the patent bar examination, medical boards, nursing boards, dental boards, social work licensing examinations, the ACT examination, the SAT examination, and other examinations, for example.
FIG. 1 illustrates an examination performance analysis and improvement system 100 according to an embodiment of the present invention. The examination performance analysis and improvement system 100 includes a user interface 120, a processing unit 130, and a storage unit 140.
The user 110 is anyone using the system 100, such as a student preparing for the MBE, for example.
The user interface 120 receives input from and/or transmits output to a plurality of sources, such as the user 110 (for example, a student) and the processing unit 130. The user interface 120 includes any device capable of transmitting data to and receiving data from a user. For example, the user interface 120 includes desktop personal computers, touchscreens, personal digital assistants (PDAs), laptop personal computers, or any Internet-enabled computer or device.
The processing unit 130 receives input from and/or transmits output to a plurality of sources, such as the user interface 120 and the storage unit 140. The processing unit 130 includes any device, whether in hardware or software, that can be configured to perform the functions of the processing unit 130 described herein. Moreover, the processing unit 130 includes any processor currently known to one of ordinary skill in the art, whether implemented in hardware or software. For example, the processing unit 130 can include an Internet-enabled server, personal desktop, or laptop computer.
The storage unit 140 includes disk drives, removable memories, such as “flash memories,” hard disk drives, any internal storage on a computer or Internet server, and any device, whether in hardware or software, that can be configured to perform the functions of the storage unit.
The user interface 120, the processing unit 130, and the storage unit 140 may be implemented in hardware or software as a single system, for example, as a personal desktop or laptop computer or a specialized processing system, or may be implemented in separate networked systems, for example. For example, the user interface 120 may be a personal computer, and the processing unit 130 and storage unit 140 may be combined in one Internet server or separated among multiple Internet servers that communicate with the user interface 120. Alternatively, for example, the user interface 120, the processing unit 130, and the storage unit 140 may all be in the same personal computer. In another embodiment, the user interface 120, the processing unit 130, and the storage unit 140 may each be distributed among a multitude of systems. The user interface 120, the processing unit 130, and the storage unit 140 may communicate via known and established wired and/or wireless communication methods, including network connections and the Internet.
In operation, the user 110 enrolls in a preparation program for a test by sending registration information through the user interface 120 to the processing unit 130. The user 110 sends payment information, such as credit card information, checking account numbers, or some other form of payment information, for example, through the user interface 120 to the processing unit 130. Alternatively, the user 110 may mail this information to an operator that operates the processing unit 130. The operator may then enter the payment information into the processing unit 130. The processing unit 130 then stores the registration information and payment information in the storage unit 140. After verifying that an appropriate payment has been made, the processing unit 130 may then send a username 145 and password 150 to the user 110 through the user interface 120. In an alternate embodiment, the user 110 may purchase software or hardware and load or install the software or hardware into a processing unit.
The username 145 may become active a set time before a scheduled administration of the test for which the program is preparing the user 110. For example, the username 145 may become active on December 1st if the user 110 is registered for the February MBE, or on May 1st if the user 110 is registered for the July MBE. After the username 145 becomes active, the user 110 may log into the processing unit 130 by using the user interface 120.
The storage unit 140 stores a set of questions 142 from the tests for which the user 110 is registered for each improvement program. For example, the storage unit 140 may store practice questions licensed from the National Conference of Bar Examiners (“NCBE”). All the questions may be unoriginal, for example, thereby allowing the user 110 to have greater confidence in the accuracy of the practice questions and their similarity to actual MBE questions. In an embodiment, the licensed questions may be derived from previously administered MBEs, chosen by the NCBE. The questions may be of a multitude of types, such as multiple choice, true/false, short answer, or other formats, for example.
In addition, the storage unit 140 stores pluralities of answer choices 144 that correspond to each of the questions contained in the set of questions 142, correct answers 146 that correspond to each question stored in the set of questions 142, and explanations for the correct answers 148, which include a discussion of why the rest of the choices are incorrect. Additionally, the questions may fall in different categories of major topics and subtopics. The major topic 150 and subtopic 152 areas for each question may also be stored in the storage unit 140. For example, the MBE covers six major topic areas of law: Torts, Contracts, Constitutional Law, Evidence, Real Property, and Criminal Law. The subtopics are part of the major topics. For example, Constitutional Law may have four subtopics: Individual Rights, Judicial Review, Relations Between Federal and State Governments, and Separation of Powers. The subtopic of a question may also be stored in the storage unit 140. For example, if the question concerned First Amendment rights, the storage unit 140 may store the question 142, the plurality of answer choices 144, the correct answer 146, the explanation for the correct answer and why the other choices were incorrect 148, the major topic area 150: Constitutional Law, and the subtopic area 152: Individual Rights. It will be understood that multiple categorizations for each question may be stored in the storage unit 140. The above-described information may be stored as an array or as a database.
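For illustration only, the stored question record described above might be laid out as follows. This is a minimal sketch in Python; the field names and the use of a dictionary are illustrative assumptions, not part of the described system, which may equally use an array or a database.

    # Hypothetical layout for one record in the set of questions 142;
    # the field names are illustrative only.
    question_record = {
        "question_id": 1,
        "text": "...",                                   # text of question 158
        "answer_choices": ["A ...", "B ...", "C ...", "D ..."],  # choices 144
        "correct_answer": "B",                           # correct answer 146
        "explanation": "...",                            # explanation 148
        "major_topic": "Constitutional Law",             # major topic area 150
        "subtopic": "Individual Rights",                 # subtopic area 152
    }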
Because the user 110 may be studying and learning different subjects at different times, the system 100 allows the user 110 to choose the subject areas in which the user wishes to receive questions. As shown in FIG. 13, the processing unit 130 may begin the preparation program by presenting to the user 110 through the user interface 120 a settings menu 1310 that includes topic selection boxes 1320, where each topic selection box corresponds to a major topic 150 from which the user 110 can receive questions, a “show timer” selection box 1330, a “show question number and subject” selection box 1335, and a “save settings” button 1340.
Using the user interface 120, the user 110 selects the major topics 150 from which to receive questions by selecting the topic selection box 1320 that corresponds to the desired major topic 150. If the user 110 wishes the elapsed time to be displayed on the user interface 120 while answering a question, then the user 110 may activate that function by selecting the “show timer” selection box 1330. If the user 110 wishes the question number and subject to be displayed, then the user 110 may activate that function by selecting the “show question number and subject” selection box 1335. The user 110 saves his/her settings by clicking on the “save settings” button 1340. The settings are saved in the storage unit 140. The user's 110 selections are communicated to the processing unit 130, which then randomly selects a displayed question 154 from the set of questions 142 that corresponds to the selected major topics 150. The question is selected using the method for adaptive learning, which randomly selects questions for a user preparing for an examination from topics with weighted ranges, described below and in FIG. 15.
In an alternative embodiment, the processing unit 130 may also present the user 110 with the option of restricting the subtopics from which the questions are chosen. For example, the user 110, using the user interface 120, selects the subtopics 152 from which to receive questions by placing a checkmark in the subtopic selection box 1322 that corresponds to the desired subtopic 152 and clicking on the “save selections” button 1345. The settings are saved in the storage unit 140. The user's 110 selections are communicated to the processing unit 130, which randomly selects a displayed question 154 from the set of questions 142 that corresponds to the selected subtopics 152. The question is selected using the method for adaptive learning, which randomly selects questions for a user preparing for an examination from topics with weighted ranges, described below and in FIG. 15.
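For illustration only, one way to implement the weighted random selection described above is sketched below in Python. The particular weighting scheme (one weight per selected topic, with heavier weights drawn more often) is an assumption for this sketch; the patent defers the actual method to the discussion of FIG. 15.

    import random

    def select_question(questions, topic_weights):
        """Randomly pick a question, weighting major topics by the user's history.

        questions: list of question records (see the sketch above)
        topic_weights: dict mapping each selected major topic 150 to a weight;
                       topics with heavier weights are drawn more often
        """
        topics = list(topic_weights)
        weights = [topic_weights[t] for t in topics]
        topic = random.choices(topics, weights=weights, k=1)[0]
        pool = [q for q in questions if q["major_topic"] == topic]
        return random.choice(pool)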
As shown in FIG. 6, for example, the user interface 120 receives the displayed question 154 from the processing unit 130 and displays it for the user 110. The displayed question 154 includes a question identifier 156, text of question 158, a time indicator 159, and question answer choices 160a, 160b, 160c, and 160d. In an embodiment, the question identifier 156 includes a question number 162, a major topic indicator 164, and a subtopic indicator 166. The displayed question 154 further includes an “answer” button 167, a “pass” button 168, and a “stop for now” button 169.
In an embodiment, the displayed question 154 is a multiple choice question, and the user 110 may cause the processing unit 130 to reduce the emphasis of a particular answer choice 160 on the user interface 120 by inputting a selection on the user interface 120. The emphasis may be reduced in a multitude of ways, such as by turning the answer choice a faded gray, unhighlighting an answer choice, reducing the font of an answer choice, unbolding an answer choice, italicizing an answer choice, or another change in the appearance of the answer choice on the user interface 120, for example. For example, the user 110 may accomplish this by clicking with a mouse on the answer choice instead of a check box next to the answer choice on the user interface 120. This allows the user 110 to eliminate answers from the possible choices. This emulates the process in which a user would engage when taking a handwritten exam or practice questions and crossing out certain answers that the user 110 feels are incorrect. Even if the user 110 chooses to eliminate a particular answer choice 160 by reducing its emphasis, however, the user 110 may still choose that answer as correct and submit that answer.
The user 110 selects an answer choice 160 on the user interface 120 by selecting the answer indicator 163 that corresponds to that respective answer choice and by selecting the “answer” icon. The user interface 120 communicates the selected answer choice to the processing unit 130. The processing unit 130 stops timing the answer and stops the time indicator 159 displayed to the user 110 through the user interface 120. The processing unit 130 stores how long the user 110 took to answer the question in the storage unit 140. The processing unit 130 retrieves the correct answer 146 to the displayed question 154 from the storage unit 140. The processing unit 130 then compares the selected answer choice and the correct answer 146 to see if the answer is correct or incorrect. The processing unit 130 stores whether the answer is correct in the storage unit 140.
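A minimal sketch of this timing, grading, and recording flow, under the same illustrative assumptions as the sketches above (the function and field names are hypothetical):

    import time

    def grade_answer(question, selected_choice, start_time, user_history):
        """Stop timing, grade the selected choice, and record the result."""
        elapsed = time.monotonic() - start_time            # time taken to answer
        correct = (selected_choice == question["correct_answer"])
        user_history.append({                              # persisted in storage unit 140
            "question_id": question["question_id"],
            "major_topic": question["major_topic"],
            "elapsed": elapsed,                            # in seconds
            "correct": correct,
        })
        return correct, elapsed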
The processing unit 130 times how long the user 110 takes to answer each displayed question 154. If the user has activated the “show timer” function on the settings menu 1310, the processing unit 130 communicates the elapsed time at set time intervals to the user interface 120 for display to the user 110 through the time indicator 159 while the user 110 is answering the displayed question 154. The set intervals may be any amount of time, such as seconds, minutes, hours, or a fraction thereof, for example.
To display the elapsed time through the time indicator 159, the user interface 120 may display the time to the user 110 in a multitude of ways, such as presenting a graphical representation of a clock, a graphical representation of an hourglass, a bar graph, or a digital timer, for example, that times how long the user 110 takes to answer a question, as shown in FIG. 6, for example. Alternatively, the user interface 120 may present a graphical representation of a clock or dial that rotates according to increases in elapsed time. For an embodiment, see the discussion of FIG. 2 and method 200, below.
As shown in FIG. 7, after the user 110 has answered a question, the processing unit 130 communicates answer data 171 to the user interface 120 for display to the user 110. The answer data includes a question identifier 156, a time to answer indicator 170, an answer indicator 172, an explanation indicator 176, and a question repeater 177. The time to answer indicator 170 displays how long it took the user to answer the question. The answer indicator 172 displays the correct answer to the answered question. The explanation indicator 176 displays an explanation of why the answer is correct and why the other choices are incorrect. If, for example, the question is a multiple-choice question with four choices, the processing unit 130 displays through the user interface 120 to the user 110 why the other three answers are incorrect. The question repeater 177 repeats the text of question 158 and question answer choices 160a, 160b, 160c, and 160d.
The processing unit 130 also calculates an optimal answer time (“OAT”) 600, which is the optimal amount of time that the user 110 should spend on each question. The processing unit 130 calculates an OAT 600 by using a series of steps, such as in FIG. 4 and FIG. 5, for example (see below for a detailed description of certain embodiments). The processing unit 130 then communicates the OAT 600 to the user interface 120 for display to the user 110.
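The patent defers the exact OAT calculation to FIGS. 4 and 5. Purely as an illustration, one plausible approach is to bucket the user's past answers by elapsed time and take the bucket with the highest accuracy; the 10-second bucket width below is an assumption, not a parameter taken from the described system.

    def optimal_answer_time(user_history, bucket_seconds=10):
        """Return the start (in seconds) of the elapsed-time bucket in which
        the user has answered the highest percentage of questions correctly."""
        buckets = {}  # bucket index -> [correct answers, total answers]
        for rec in user_history:
            b = int(rec["elapsed"] // bucket_seconds)
            counts = buckets.setdefault(b, [0, 0])
            counts[0] += rec["correct"]
            counts[1] += 1
        best = max(buckets, key=lambda b: buckets[b][0] / buckets[b][1])
        return best * bucket_seconds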
The processing unit 130 also calculates the difference between the time actually spent answering a given question and the OAT, and calculates the difference in the user's 110 past percentage of correct answers at those two answer times. The processing unit 130 displays an additional time indicator 178 to the user 110 through the user interface 120, which indicates the amount of additional time the user 110 should have spent reviewing the problem before answering and how much the user's 110 chance of answering the question correctly would have increased, such as in FIG. 7, for example. By giving the user 110 this type of feedback, the user 110, through practice, may begin to develop internal timing which may assist the user 110 in performing better on the actual exam for which the user 110 is preparing.
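Given an OAT, the two figures described above reduce to simple arithmetic; a sketch under the same assumptions as the OAT sketch (the bucketed accuracy lookup is illustrative):

    def suggested_change(oat_seconds, actual_seconds):
        """Positive: the user should spend more time; negative: less."""
        return oat_seconds - actual_seconds

    def estimated_accuracy_gain(user_history, oat_seconds, actual_seconds,
                                bucket_seconds=10):
        """Past accuracy at the OAT bucket minus past accuracy at the
        bucket containing the actual answer time, in percentage points."""
        def bucket_accuracy(t):
            recs = [r for r in user_history
                    if int(r["elapsed"] // bucket_seconds) == int(t // bucket_seconds)]
            return 100.0 * sum(r["correct"] for r in recs) / len(recs) if recs else 0.0
        return bucket_accuracy(oat_seconds) - bucket_accuracy(actual_seconds)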
In an alternative embodiment, the processing unit 130 may also compute and communicate to the user interface 120 for display to the user 110 an average answer time of all the users who have answered the question correctly, either historically or for that session of the examination. Alternatively, the processing unit 130 may also compute and communicate to the user interface 120 for display to the user 110 the average time of all users, either historically or for that examination session, that have answered the question regardless of whether their answer was correct.
In an embodiment, the processing unit 130 may retrieve all the answers to this question given by all the users of the program from the storage unit 140 and then find the percentage of correct answers. In an alternative embodiment, the processing unit 130 may retrieve the answers to this question via wireless and/or wired communication with all the users' data storage units, such as the user's storage unit 140, or with a centralized data storage unit, and find the percentage of correct answers.
The answer data 171 also includes two icons—a “next question” button 180 and a “stop for now” button 182—available to the user 110. The user 110 may spend as much time as necessary to read the explanatory answer and then choose to continue to the next question by selecting the “next question” icon 180 through the user interface 120 or to quit the program by selecting the “stop for now” icon 182 through the user interface 120. When the user 110 selects an icon, the user interface 120 then communicates the user's 110 selection to the processing unit 130.
If the user 110 selects the “stop for now” icon 182, then the processing unit 130 stores in the storage unit 140 user data 112, such as the questions the user 110 has answered already, the times the user 110 took to answer the questions, the answers the user 110 gave, and whether the answers the user 110 gave were correct. If the user 110 selects the “next question” icon 180, the processing unit 130 randomly selects another question from the set of questions 142 stored in the storage unit 140, and then communicates the question to the user interface 120 for display to the user 110 using the method 1500 described below and in FIG. 15.
The system 100 also provides several analysis and performance tools. The user 110 selects “performance” through a toolbar displayed on the user interface 120, such as in FIG. 8, for example. If the user 110 selects “performance” using the user interface 120, the user interface 120 communicates the user's 110 choice to the processing unit 130. In an embodiment, the processing unit 130 then communicates to the user interface 120 for display to the user 110 a performance analysis center (PAC) that includes four buttons: (1) a “subject benchmark” button 191; (2) a “subject performance” button 192; (3) a “timing performance” button 193; and (4) a “timing analysis” button 194, such as in FIG. 8, for example.
Giving the user 110 an opportunity to review an analysis of his/her performance allows the user to develop and utilize efficient and desirable study habits. The instant feedback may save the user 110 time, because the user 110 does not need to go back and review answers after completing an entire practice test. Additionally, the user 110 may not be able to devote a three-hour block of time to take a practice test for the MBE, for example. The user 110 is still able to time himself/herself taking the test because each individual question is timed and recorded in an effort to make sure the user 110 does not run out of time on the actual examination.
If the user 110 selects the “subject benchmark” icon, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves from the storage unit 140 user data 112, which contains an identification of each question answered by the user 110, an indication of whether the user 110 answered each question correctly, and the major topic and subtopic from which each of the questions came. Then, the processing unit 130 calculates the total number of questions answered, the overall percentage of questions answered correctly, and the percentage of questions answered correctly for each major topic, and creates a subject benchmark chart 700, which visually displays the percentage of questions answered correctly by the user 110 in each major topic as well as the average percentage of questions answered correctly by all users in each major topic. The processing unit 130 then communicates the subject benchmark chart to the user interface 120 for display to the user 110, such as in FIG. 9, for example. Also, see FIG. 3 and the description of FIG. 3 below for a description of how the processing unit 130 may create a subject benchmark chart.
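For illustration, the per-topic percentages behind the subject benchmark chart 700 can be computed directly from the stored answer records; a sketch using the hypothetical record layout from the earlier sketches:

    from collections import defaultdict

    def topic_accuracy(user_history):
        """Percentage of questions answered correctly, per major topic and overall."""
        per_topic = defaultdict(lambda: [0, 0])  # topic -> [correct, total]
        for rec in user_history:
            counts = per_topic[rec["major_topic"]]
            counts[0] += rec["correct"]
            counts[1] += 1
        by_topic = {t: 100.0 * c / n for t, (c, n) in per_topic.items()}
        total_correct = sum(c for c, _ in per_topic.values())
        total = sum(n for _, n in per_topic.values())
        overall = 100.0 * total_correct / total if total else 0.0
        return by_topic, overall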
As shown in FIG. 9, the subject benchmark chart 700 includes a user indicator 710, a total number of questions answered indicator 720, an overall accuracy indicator 730, a graph field 740, a “Low Bar Passing Grade” indicator 750, a “High Bar Passing Grade” indicator 760, and an average major topic accuracy indicator 770.
The user indicator 710 identifies the current user. The total number of questions answered indicator 720 displays the total number of questions answered by the user 110. The overall accuracy indicator 730 displays the overall percentage of questions answered correctly by the user 110. The graph field 740 displays a bar graph 780 for each major topic 150 that shows the user's 110 performance in each major topic 150. As shown in FIG. 9, each major topic 150 has a corresponding bar graph 780 next to it that represents the percentage of questions the user has answered correctly in that major topic 150.
In an alternative embodiment, only the user's 110 performance by major topic is displayed initially on the user interface 120. If the user 110 selects any one of the major topics through the user interface 120, the processing unit 130 calculates the total number of questions answered, the overall percentage of questions answered correctly, and the percentage of questions answered correctly for each major topic and subtopic, and creates a new subject benchmark chart 700, which visually displays the percentage of questions answered correctly by the user 110 in each major topic and subtopic as well as the average percentage of questions answered correctly by all users in each major topic and subtopic. The processing unit 130 then communicates the subject benchmark chart 700 to the user interface 120 for display to the user 110.
As discussed above, the subject benchmark chart 700 also has an average major topic accuracy indicator 770 for each major topic that represents the average performance of all users of the program. This enables the user 110 viewing the subject benchmark chart 700 displayed on the user interface 120 to gauge his/her performance in comparison to all users enrolled in the program. In an embodiment, the users are the total number of users for that session of the examination, for example, all users studying for the February MBE. In another embodiment, the total users can be the total number of users to have used the system 100 to study for that particular examination, for example, all users that have studied for the MBE using the system.
In an embodiment, the processing unit 130 retrieves from the storage unit 140 all the answers given by all the current users of the program for each major topic and subtopic and then calculates the percentage of correct answers given for each major topic and subtopic. In an alternative embodiment, the processing unit 130 may retrieve the answers to questions in each major topic and subtopic given by all the current users of the program via wireless and/or wired communication with all the users' storage units or with a centralized data storage unit.
In an alternative embodiment, the user 110 can select the time period for which a subject benchmark chart 700 is created and displayed. As shown in FIG. 9, the subject benchmark chart 700 also includes a week drop-down box 701 and a month drop-down box 702. By using either drop-down box, the user can limit the time period for which the subject benchmark chart 700 is created. Once the user 110 has selected a specific time range, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves the user data 112 from the storage unit 140, generates the subject benchmark chart 700, and communicates it to the user interface 120 for display to the user 110.
If the user 110 selects the “subject performance” icon 192, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves from the storage unit 140 user data 112, which contains an identification of each question answered by the user 110, an indication of whether the user 110 answered each question correctly, and the major topic and subtopic from which each of the questions came. Then, the processing unit 130 calculates the percentage of correct answers for each major topic and each subtopic and creates a subject performance chart 800 indicating the percentage of correct answers for each major topic and each subtopic. The processing unit 130 then communicates the subject performance chart 800 to the user interface 120 for display to the user 110, such as in FIG. 10, for example.
As shown in FIG. 10, the subject performance chart 800 includes a user indicator 810, a total number of questions answered indicator 820, an overall accuracy indicator 830, a total number of questions answered in each major topic indicator 845, a graph field 850, and a major topic accuracy indicator 860.
The user indicator 810 identifies the current user. The total number of questions answered indicator 820 displays the total number of questions answered by the user 110. The overall accuracy indicator 830 displays the overall percentage of questions answered correctly by the user 110. The total number of questions answered in each major topic indicator 845 displays the number of questions answered in each major topic 150. The graph field 850 displays a bar graph 880 that shows the user's 110 performance in each major topic 150, as shown in FIG. 10. Each major topic 150 has a corresponding bar graph next to it that represents the percentage of questions the user has answered correctly in that major topic 150. The major topic accuracy indicator 860 indicates the percentage of questions answered correctly by the user 110 for that major topic. For example, in FIG. 10, the user 110 has answered 55% of the Criminal Law questions correctly.
The subject performance chart 800 initially displays the percentage of questions answered correctly by the user 110 in each major topic. If the user 110 clicks on any major topic through the user interface 120, the processing unit 130 refreshes the user interface 120 and communicates the performance in the subtopics for all major topics that the user 110 has selected, as shown in FIG. 14.
In an embodiment, the user 110 may select a major topic by clicking on the major topic 150 identified on the subject performance chart 800. If the user 110 clicks on a major topic 150, the processing unit 130 generates a new subject performance chart 800 and communicates it to the user interface 120 for display to the user 110. For example, as shown in FIG. 14, the subject performance chart 800 displays the percentage of questions answered correctly by the user 110 in each subtopic under the major topic “Evidence.”
In an alternative embodiment, the user 110 may select a “+” symbol that is located near the major topic identifier. If the user 110 selects the “+” sign on the user interface 120, then the “+” sign may turn to a “−” sign and the subtopic areas for that major topic may appear below the selected major topic, for example. The average of the performances in each subtopic may make up the total percentage for the major topic. Alternatively, the percentage of all the questions answered correctly by the user 110 in the major topic may constitute the topic's total percentage, such as in FIG. 10, for example.
In an alternative embodiment, the user 110 can select the time period for which a subject performance chart 800 is created and displayed. As shown in FIG. 10, the subject performance chart 800 also includes a week drop-down box 801 and a month drop-down box 802. By using either drop-down box, the user can limit the time period for which the subject performance chart 800 is created. Once the user 110 has selected a specific time range, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves the user data 112 from the storage unit 140, generates the subject performance chart 800, and communicates it to the user interface 120 for display to the user 110.
If the user 110 selects the “timing performance” icon, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves from the storage unit 140 user data 112, which contains an identification of each question answered by the user 110, an indication of whether the user 110 answered each question correctly, the major topic and subtopic from which each of the questions came, and the time to answer each question. The processing unit 130 creates a timing performance chart 900 and communicates the chart to the user interface 120 for display to the user 110, such as in FIG. 11, for example. The processing unit 130 may display the chart on the user interface 120 as a bar graph with each row representing a different time segment, and the length of each bar representing the percentage of correct answers in that time segment.
As shown in FIG. 11, the timing performance chart 900 includes a user indicator 910, a total number of questions answered indicator 920, an overall accuracy indicator 930, a graph field 940, a “minutes tracked” selector 980, and an “increments per minute” selector 990.
The user indicator 910 identifies the current user. The total number of questions answered indicator 920 displays the total number of questions answered by the user 110. The overall accuracy indicator 930 displays the overall percentage of questions answered correctly by the user 110. The graph field 940 also displays a bar graph 950 that shows the user's 110 performance for each increment of time 970, a verbal description of accuracy 945, and a percentage of questions answered correctly in each time increment indicator 960.
In an embodiment, the timing performance chart 900 displays several options for the user 110 to choose from through the user interface 120. The user 110 may choose different time increments 970. For example, as shown in FIG. 11, the timing performance chart 900 displays the percentage of questions answered correctly when the answer time is between 0 minutes and 1 minute, 1 minute and 2 minutes, and 2 minutes and 3 minutes.
The user 110 can change the time increments 970 displayed by the timing performance chart 900 by selecting different intervals through the “minutes tracked” selector 980. For example, the user 110 can select 4 minutes using the “minutes tracked” selector, and the processing unit 130 creates a new timing performance chart 900 and communicates it to the user interface 120 for display to the user 110. The new timing performance chart displays the percentage of questions answered correctly when the answer time is between 0 minutes and 1 minute, 1 minute and 2 minutes, 2 minutes and 3 minutes, and 3 minutes and 4 minutes.
Using the “increments per minute” selector 990, the user 110 can break down each time increment into different increments per minute. For example, a one-minute increment can be broken down into 6-, 10-, 15-, or 30-second increments.
In an alternative embodiment, the user 110 can select the time period for which a timing performance chart 900 is created and displayed. As shown in FIG. 11, the timing performance chart 900 also includes a week drop-down box 901 and a month drop-down box 902. By using either drop-down box, the user can limit the time period for which the timing performance chart 900 is created. Once the user 110 has selected a specific time range, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves the user data 112 from the storage unit 140, generates the timing performance chart 900, and communicates it to the user interface 120 for display to the user 110.
In an embodiment, the user 110 may select a time increment 970 by clicking on it. If the user 110 clicks on the time increment from 0 minutes to 1 minute, the processing unit 130 generates a new timing performance chart 900 and communicates it to the user interface 120 for display to the user 110. For example, as shown in FIG. 16, the timing performance chart 900 displays the percentage of questions answered correctly in 6-second increments when the answer time is between 0 minutes and 1 minute.
If the user 110 selects the “timing analysis” icon, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves from the storage unit 140 user data 112, which contains an identification of each question answered by the user 110, an indication of whether the user 110 answered each question correctly, the major topic 150 and subtopic 152 from which each of the questions came, and the time to answer each question. The processing unit 130 generates a timing analysis chart 1000, which displays the user's 110 performance at set intervals in the form of a line graph. The processing unit 130 communicates the chart to the user interface 120 for display to the user 110, such as in FIG. 12, for example. This way, the user 110 may view at what point the user 110 is achieving the highest rate of accuracy. Also, see FIG. 4 and the description of FIG. 4 below for a description of how the processing unit 130 may create a timing analysis chart.
The timing analysis chart 1000 includes a user indicator 1010, a total number of questions answered indicator 1020, an overall accuracy indicator 1030, a graph field 1040, an average answer time indicator 1050, an estimated completion time indicator 1060, an under examination time indicator 1070, an average amount of time when answer is correct indicator 1080, an average amount of time when answer is incorrect indicator 1090, an optimal answer time (“OAT”) indicator 1100, an estimated completion time at highest rate indicator 1110, an over/under examination time indicator 1120, an examination answer time (“EAT”) line 1130, an optimal answer time (“OAT”) line 1140, a trend line 1150, and time increment indicators 1160.
The user indicator 1010 identifies the current user. The total number of questions answered indicator 1020 displays the total number of questions answered by the user 110. The overall accuracy indicator 1030 displays the overall percentage of questions answered correctly by the user 110. The graph field 1040 displays a trend line 1150 that charts the user's 110 accuracy rate for 10-second time increments, with percentage as the x-axis and time as the y-axis. In an alternative embodiment, the time increment can be any increment of time.
The average answer time indicator 1050 displays the user's 110 average time to answer each question. The user's 110 average time to answer each question is determined by dividing the sum of all the times the user 110 has taken to answer all the questions presented during examination preparation by the number of questions the user 110 has completed.
The estimated completion time indicator 1060 displays the estimated time it will take the user 110 to complete the examination, based on the number of questions on the examination and the average answer time. The estimated completion time is determined by multiplying the user's 110 average answer time by the number of questions on the examination for which the user 110 is preparing.
The under examination time indicator 1070 displays the amount of time under the allotted time that the user will finish the examination. The under examination time is determined by subtracting the user's 110 estimated completion time from the time allotted for the examination for which the user 110 is preparing.
The average amount of time when answer is correct indicator 1080 displays the average time it takes the user 110 to answer when the answer provided is correct. The average amount of time when an answer is correct is determined by dividing the sum of all the times the user has taken to answer the questions answered correctly by the number of questions the user has answered correctly.
The average amount of time when answer is incorrect indicator 1090 displays the average time it takes the user 110 to answer when the answer provided is incorrect. The average amount of time when an answer is incorrect is determined by dividing the sum of all the times the user has taken to answer the questions answered incorrectly by the number of questions the user has answered incorrectly.
The optimal answer time (“OAT”) indicator 1100 displays the time range at which the user 110 answers the highest percentage of questions correctly.
The estimated completion time at highest rate indicator 1110 displays how long it will take the user 110 to complete the given examination if the user 110 answers each question of the given examination in the range of the optimal answer time (“OAT”). The estimated completion time at highest rate is determined by multiplying the OAT by the number of questions on the examination for which the user 110 is preparing.
The over/under examination time indicator 1120 displays how many minutes over or under the allotted time the user 110 will complete the given examination if the user 110 answers each question of the given examination in the range of the optimal answer time. The over/under examination time is determined by subtracting the estimated completion time at highest rate from the allotted time for the examination for which the user 110 is preparing.
The examination answer time (“EAT”) line 1130 indicates the time allotted for each question by the entity giving the examination. The EAT may vary according to the test the user 110 is preparing for; however, the general equation is the total amount of time for the test divided by the total number of questions on the test. For example, the MBE has two sessions lasting 180 minutes each with 100 questions per session. The EAT for the MBE is 1.8 minutes because 180 divided by 100 is 1.8. If the EAT is 1.8 minutes, then the time performance graph may have a vertical line at 1.8 minutes, such as in FIG. 12, for example.
The optimal answer time line 1140 indicates the time range at which the user 110 answers the highest percentage of questions correctly. For example, if the user 110 is achieving his/her highest level of accuracy at 1.1 minutes, the processing unit may generate a vertical line on the graph field 1040 at the 1.1-minute marker. The user 110 may want to maintain timing that is at or below the 1.8-minute indicator line to be able to answer all the questions on the test.
The time increment indicators 1160 indicate the percentage of questions answered correctly by the user 110 at each time increment.
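All of the timing-analysis indicators described above are simple arithmetic over the stored answer records. A consolidated sketch, assuming elapsed times have been converted to minutes and using the MBE figures from the text (200 questions, 360 minutes total) as illustrative defaults:

    def timing_analysis(user_history, oat_minutes, exam_questions=200, exam_minutes=360):
        """Compute the indicators shown on the timing analysis chart 1000."""
        times = [r["elapsed"] for r in user_history]
        correct = [r["elapsed"] for r in user_history if r["correct"]]
        incorrect = [r["elapsed"] for r in user_history if not r["correct"]]

        avg_time = sum(times) / len(times)                  # indicator 1050
        est_completion = avg_time * exam_questions          # indicator 1060
        under_time = exam_minutes - est_completion          # indicator 1070
        avg_correct = sum(correct) / len(correct)           # indicator 1080
        avg_incorrect = sum(incorrect) / len(incorrect)     # indicator 1090
        est_at_oat = oat_minutes * exam_questions           # indicator 1110
        over_under = exam_minutes - est_at_oat              # indicator 1120
        eat = exam_minutes / exam_questions                 # EAT line 1130: 360/200 = 1.8
        return (avg_time, est_completion, under_time, avg_correct,
                avg_incorrect, est_at_oat, over_under, eat)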
In another embodiment, the time performance graph may also have two lines running parallel to one another and horizontally on the graph—a “Low Bar” line 1170 and a “High Bar” line 1180. The lower of the two lines indicates the “Low Bar” accuracy rate 1170. The upper line indicates the “High Bar” accuracy rate 1180. The purpose of these bars is to represent to the user 110 at what level he/she should be performing. The user 110 may make it a goal to fall between the two lines.
In an alternative embodiment, the user 110 can select the time period for which a timing analysis chart 1000 is created and displayed. As shown in FIG. 12, the timing analysis chart 1000 also includes a week drop-down box 1001 and a month drop-down box 1002. By using either drop-down box, the user can limit the time period for which the timing analysis chart 1000 is created. Once the user 110 has selected a specific time range, the user interface 120 communicates the user's 110 choice to the processing unit 130. The processing unit 130 retrieves the user data 112 from the storage unit 140, generates the timing analysis chart 1000, and communicates it to the user interface 120 for display to the user 110.
The remainder of the detailed description references users, user interfaces, processing units, and data storages. The users may be similar to the user 110, for example. The user interfaces may be similar to the user interface 120, for example. The processing units may be similar to the processing unit 130, for example. The data storages may be similar to the storage unit 140, for example.
FIG. 2 illustrates a flow diagram 200 for a method for improving performance on examinations in accordance with an embodiment of the present invention. First, at step 210, randomly select a question and question answer choices from a pool of questions stored in a storage unit for the user's examination program, and display the question and question answer choices to a user through a user interface. In addition to the question and question answer choices, also display to the user a choice to answer the question, stop the questioning process, or pass and receive a new question. All may be displayed on a computer monitor.
In an embodiment, the displayed question 154 is randomly selected from the pool of questions 142 stored in the storage unit 140 for the user's 110 examination program. The question is selected using the method for adaptive learning, which randomly selects questions for a user preparing for an examination from topics with weighted ranges, described below and in FIG. 15. For example, the processing unit 130 may randomly select and display a displayed question 154 to the user 110 through the user interface 120. As described above, the displayed question 154 includes a question identifier 156, text of question 158, a time indicator 159, question answer choices 160a, 160b, 160c, and 160d, an “answer” icon 167, a “pass” icon 168, and a “stop for now” icon 169.
In an alternative embodiment, only questions for which the user has not provided answers may be in the pool of questions 142 stored in the storage unit 140. Alternatively, the user 110 may restrict the major topic and subtopic areas from which a question is chosen. One or more storage units may store the questions, pluralities of answer choices for the questions, correct answers to the questions, and explanations for the correct answer and why the remaining choices are incorrect.
Additionally, three options are presented to the user 110 through the user interface 120. The user may select “Answer,” “Pass,” or “Stop for Now.” In order to select “Answer,” the user needs to have selected one of the question answer choices 160a through 160d.
In an embodiment, after the user has answered a set number of questions in a subtopic, such as ten, for example, method 200 may reduce the frequency with which questions from that subtopic are selected and given to the user if the user has correctly answered a certain percentage of the questions in that subtopic, such as 75%, for example. The purpose of continuing to deliver occasional questions in those subtopics once the user has reached a requisite level of knowledge is to prevent the user from forgetting about the subtopics that were mastered early in the studying process. At the same time, because questions on those topics are not asked as often, the user will not unnecessarily waste time answering questions in subjects the user already knows well.
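As an illustration of this frequency reduction, the weighted selection sketched earlier can be combined with a mastery rule like the following; the ten-question and 75% thresholds come from the example above, while the specific down-weighting factor is an assumption:

    def adjust_subtopic_weight(answered, correct, base_weight=1.0,
                               min_questions=10, mastery_pct=75.0,
                               reduction_factor=0.25):
        """Down-weight a mastered subtopic without removing it entirely,
        so it is still asked occasionally."""
        if answered >= min_questions and 100.0 * correct / answered >= mastery_pct:
            return base_weight * reduction_factor
        return base_weight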
In an embodiment, the user may be registered in the MBE program, and the question may be a multiple choice question from the MBE program pool, such as in FIG. 6, for example. Preferably, the question and its answer choices are very similar to the questions and answers which actually appear on the MBE. The questions and answers may be displayed in a format and font that are very close to those used in the MBE. The closer the appearance and the format of the question and its answers to those of the MBE, the more comfortable the user may be on the actual MBE exam.
Next, at step 220, start a question timer. The timer operates to keep track of the amount of time elapsed from the time a question and answer choices are displayed to a user until the user selects and enters an answer choice.
In an embodiment, the question timer 221 operates to keep track of the amount of time elapsed from the time the displayed question 154 is displayed to the user 110 until the user 110 selects and enters answer choice 160a, 160b, 160c, or 160d. Because the MBE is a severely time-limited exam—two 180-minute sessions with 100 questions each—keeping track of the user's 110 performance for each question is very important.
At step 230, display the elapsed time to the user through a user interface, such as the user interface 120. The elapsed time reflects the value of the question timer at certain intervals. In an embodiment, the intervals may be tenths of a second, such as in FIG. 6, for example. Alternatively, the intervals may be seconds, minutes, hours, or a fraction thereof, for example. The display of the timer may take on a multitude of forms, such as a clock face, a stopwatch, a bar graph, a digital timer, an hourglass, a dial, or some other graphical technique, for example. By displaying a visual indication of the elapsed time, the user becomes sensitized to the amount of time he/she spends on answering questions and how he/she is doing time-wise with respect to a pre-determined duration of time. Alternatively, an audio signal may be used. An audio signal has the advantage of not distracting the user visually while the user reviews the question; however, the sound may be more distracting to the user and does not allow the user to quickly check the elapsed time.
Next, at step 235, determine whether the user's OAT has passed. The OAT is calculated using a method for calculating and displaying the user's optimal answer time in accordance with an embodiment of the present invention, as shown in FIG. 4 and FIG. 5 and described below. If the OAT has not passed, the method 200 moves to decision 240. If the OAT has passed, the method goes on to step 250.
The decision at step 235 may act as an alarm for the user when the OAT has passed. In an alternative embodiment, decision 235 may check for a multitude of checkpoints, such as the program's EAT, a time set by the user, a checkpoint a certain length of time before or after the OAT, or other predetermined times, for example. In that alternative embodiment, if one of the checkpoints has passed, method 200 may then go on to step 250 to inform the user.
At step 250, inform the user that the user's OAT has passed. The user may be informed or given an indication through a multitude of methods using a user interface, for example the user interface 120, such as an audible noise, a message displayed to the user, a graphical representation of a traffic light signal with an illuminated red light, a stop sign, an hourglass half empty, a filled bar graph, a graphical representation of a timer, or some other graphical, physical, or audio technique, for example. In an embodiment, the traffic light may have an illuminated green light when the question is first displayed, an illuminated yellow light when the OAT is a set time away from the then current elapsed time on the question timer, and an illuminated red light when the question timer has a time equal to or greater than the OAT.
After informing the user at step 250, method 200 goes to step 240. In an alternative embodiment, at step 250, method 200 may also inform the user when any of a multitude of checkpoints has passed, such as the program's EAT, a time set by the user, a checkpoint a certain length of time before or after the OAT, or other predetermined times, for example. By giving the user this type of feedback, the user, through practice, may begin to develop internal timing which may assist the user in performing better on the actual exam for which the user is preparing. Additionally, the user may not have to constantly check how much time has elapsed on the examination.
At step 240, determine whether the user has selected one of three options presented to the user: “answer,” “pass,” or “stop.” If the user has not yet selected one of the three choices, return to step 230 and continue to display the elapsed time at a set interval. If the user has selected one of “answer,” “pass,” or “stop,” go to step 255 and stop the question timer. The question timer is stopped at this time so that only the time the user takes to answer the question is measured.
At step 260, determine whether the user selected “answer,” “pass,” or “stop.” If the user selected “pass,” go to step 210. If the user selected “stop,” go to step 290. If the user selected “answer,” go to step 270.
At step 270, store question information in storage. In an alternative embodiment, the question information may be stored in centralized data storage for all users. In an embodiment, question information may consist of a multitude of different information, such as the question asked, the major topic and subtopic of the question, the answer chosen, whether the answer was correct, the OAT at the time the answer was given, the date and time the answer was given, the number of total questions the user has answered at that point, the amount of time taken to answer the question, or other information about the question or the user, for example. The question information may be stored in the storage unit 140.
At step 280, compare the answer selected by the user against the correct answer stored in storage and display results to the user through a user interface. In an embodiment, step 280 displays the following to the user on a monitor, for example: whether the user's answer was correct, the explanation for the correct answer and why the other answer choices were incorrect, the elapsed time the user took to answer the question from the question timer, the difference between the OAT and the elapsed time, a suggestion as to whether the user should try to reduce or increase the time spent on answering the question, and the likely change in the user's performance should the user decide to follow the suggested change in time. In an embodiment, if the user answered the question correctly, the method may not display a suggested change in time or a likely change in performance if the elapsed time for the question is less than the EAT (the average time allotted to each question in an actual examination). Alternatively, if the user answered the question correctly, the method may not display a suggested change in time or a likely change in performance if the elapsed time for the question is less than the OAT. At step 280, two choices, "stop" and "next question," may be displayed to a user through a user interface, such as a computer monitor for example.
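One way the post-answer suggestion of step 280 could be derived is sketched below in Python; the rule shown follows the EAT-based embodiment described above, and all names are hypothetical:

    def time_suggestion(answered_correctly, elapsed, oat, eat):
        """Return a suggested change in answer time, or None if no suggestion applies."""
        if answered_correctly and elapsed < eat:
            return None        # correct and under the EAT: no change suggested
        if elapsed > oat:
            return "reduce"    # spending more time than the optimal answer time
        if elapsed < oat and not answered_correctly:
            return "increase"  # answering too quickly and incorrectly
        return None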
Then, at step 285, determine whether the user selected "stop" or "next question." If the user selected "next question," go to step 210. If the user selected "stop," go to step 290.
Finally, at step 290, method 200 stores the user information. The user information may include a multitude of different information, such as the number of questions the user answered, the last question the user answered, whether the user was presented a question and then elected to "pass" or to "stop," and other user information, for example. Storing the last question the user answered may allow the user to start where he/she left off the next time the user uses the program.
FIG. 3 illustrates a flow diagram 300 for performance analysis and benchmarking in accordance with an embodiment of the present invention, which measures a user's accuracy rate compared to passing grades and benchmarked to other users.
First, at step 310, create and store an array. The array has a number of rows equal to a number of major topic and subtopic areas in the user's preparation program.
Next, at step 320, calculate the user's accuracy rate for each topic area. The number of the user's correct answers in a particular topic area divided by the total number of questions answered by the user in that particular topic area equals the user's accuracy rate for that topic area.
At step 325, calculate the user's overall accuracy rate. The total number of questions answered correctly by the user divided by the total number of questions presented to the user equals the user's overall accuracy rate.
At step 330, calculate the accuracy rate for all users answering the questions in an examination period for each topic area. The total number of correct answers in a topic area provided in a given examination period by all users divided by the number of all the answers in a topic area provided during that examination period by all users equals the accuracy rate for all users for each topic in a given examination period.
At step 335, compute the overall accuracy rate of all users for the given examination period. The total number of questions answered correctly by all users in the given examination period divided by the total number of questions presented to all users during the given examination period equals the overall accuracy rate of all users.
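For illustration, steps 320 through 335 could be computed as in the following Python sketch, which assumes each answer record carries a topic and a correctness flag; the record shape and function name are assumptions:

    from collections import defaultdict

    def accuracy_rates(answers):
        """Return (accuracy rate per topic, overall accuracy rate) for answer
        records shaped like {"topic": str, "correct": bool}."""
        correct = defaultdict(int)
        total = defaultdict(int)
        for answer in answers:
            total[answer["topic"]] += 1
            if answer["correct"]:
                correct[answer["topic"]] += 1
        per_topic = {topic: correct[topic] / total[topic] for topic in total}
        overall = sum(correct.values()) / len(answers) if answers else 0.0
        return per_topic, overall

The same calculation serves steps 320 and 325 when given one user's answers, and steps 330 and 335 when given the pooled answers of all users in an examination period.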
An examination period may last however long a preparation program is active. For example, a user (such as a student) signed up for the MBE program for the July MBE may have an examination period from April 1st of the same year to the date of the MBE in July. All users enrolled in the program may answer questions in the different topic areas of the program.
At step 340, display a summary of the user's accuracy in each topic area compared to the passing grade range for each topic area. The summary may take the form of the bar graph shown in FIG. 9. The passing grade range may be defined by a "low-bar" and a "high-bar." For example, the "low-bar" may be 60% and the "high-bar" may be 75%, and both bars may be represented by vertical lines on a subject benchmark chart, such as in FIG. 9, for example. Displaying the passing grade range may allow the user to quickly assess strengths and weaknesses across the major topic areas.
Next, at step 350, display the user's overall accuracy rate compared to the overall accuracy rate of all the users answering questions for an exam period.
Finally, at step 360, display a summary of the user's accuracy rate in each topic area compared to the average accuracy rate of all users answering questions for an exam period for the same major topics 150 and subtopics 152. For example, the average performance of all the users answering questions for an exam period may be represented by a box in each row of a topic analysis bar graph.
Steps 320, 325, 330, and 335 may be conducted simultaneously by a processing unit, such as the processing unit 130 described above.
Steps 340, 350, and 360 may also be conducted simultaneously by a processing unit, such as the processing unit 130 described above.
FIG. 4 illustrates a flow diagram 400 for a method of timing analysis and benchmarking in accordance with an embodiment of the present invention, which measures a user's accuracy rate over a range of time taken to answer a question compared to an examination answer time ("EAT") and a user's optimal answer time ("OAT"). The OAT is the time range in which the user's accuracy rate is the highest.
First, at step 410, create and store an array. The array has a number of rows equal to a number of time segments, plus one additional row to accumulate all answers exceeding a maximum time. A time segment may be any length of time, such as seconds, minutes, hours, or any fraction thereof, for example. The processing unit 130 may decide the appropriate time segment, or a user may select the time segment, for example. The number of time segments is determined by the maximum time the user or the processing unit chooses to display: the actual number of time segments is that maximum time divided by the length of a time segment. The array has a number of columns which hold the user's performance in each time segment.
Next, at step 420, retrieve the user's past questions, answers, and lengths of time to answer each question and total the user's correct and incorrect answers according to each time segment.
At step 430, compute the user's accuracy rate for each time segment. The number of the user's correct answers in a time segment divided by the total number of questions answered by the user in the time segment equals the user's accuracy rate for that given time segment.
Then, at step 440, display the user's accuracy rate for each time segment on a graph with an x-axis and a y-axis on a user interface, plotting accuracy on the x-axis and the time segments on the y-axis, such as in FIG. 12, for example.
Next, at step 450, compare the user's performance to a passing grade range. The passing grade range may be defined by a "low-bar" and a "high-bar," which may be displayed. For example, the "low-bar" may be 60% and the "high-bar" may be 75%, and both bars may be represented by horizontal lines on a time performance graph, such as in FIG. 12, for example. Displaying the passing grade range may allow the user to quickly assess the appropriate amount of time required to read and answer a question.
At step 460, calculate the user's estimated time required to complete the examination based on the user's average time to answer each question and the number of questions on the test. The sum of all the times the user has taken to answer all the questions presented during the examination preparation divided by the number of questions the user has completed equals the user's average time to answer each question. The user's average time to answer each question multiplied by the total number of questions in the examination equals the user's estimated time required to complete the examination.
At step 465, calculate the user's optimal answer time ("OAT"). The OAT is calculated by performing the following calculation: for each time segment, multiply the user's accuracy rate in that time segment by 100 and then by the total number of questions answered by the user in that time segment. Then, determine the time segment with the highest value; that time range is the user's OAT.
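A Python sketch of the OAT calculation of steps 410 through 465 might look as follows, assuming answers are recorded as (seconds taken, correct?) pairs; the names and data shapes are illustrative only:

    def optimal_answer_time(answers, segment_len, max_time):
        """Return the (start, end) of the time segment with the highest
        accuracy-rate * 100 * answer-count value, per step 465.
        answers: iterable of (seconds_taken, was_correct) pairs."""
        n_segments = int(max_time // segment_len)
        # One row per segment, plus one to accumulate answers exceeding max_time.
        correct = [0] * (n_segments + 1)
        total = [0] * (n_segments + 1)
        for seconds, was_correct in answers:
            row = min(int(seconds // segment_len), n_segments)
            total[row] += 1
            if was_correct:
                correct[row] += 1

        def segment_value(row):
            if total[row] == 0:
                return 0.0
            return (correct[row] / total[row]) * 100 * total[row]

        best = max(range(n_segments + 1), key=segment_value)
        return best * segment_len, (best + 1) * segment_len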
Then, at step 470, calculate the estimated time for completion at the highest accuracy rate, which is the OAT multiplied by the total number of questions on the examination for which the user is preparing.
At step 480, calculate the underrun or overrun, which is the estimated time for completion at the highest accuracy rate minus the total allotted time for the examination for which the user is preparing.
If the total allotted examination time is less than the estimated time for completion at a highest accuracy rate, then there is an overrun. An overrun reduces the available time to answer questions, lowering the user's performance rate. If an overrun occurs, the user needs to work to reduce their OAT.
If the estimated time for completion at the highest accuracy rate is less than the total allotted examination time, then there is an underrun, and the user's performance should be equal to the user's highest accuracy rate. The underrun is time that may be spent reviewing questions while still attaining the highest accuracy rate, thus yielding better overall performance.
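Steps 460 through 480 reduce to simple arithmetic, as in the hedged Python sketch below; the names are assumed, as is at least one answered question:

    def timing_summary(answer_times, exam_question_count, oat, allotted_time):
        """Return (estimated completion time, completion time at the OAT,
        overrun (positive) or underrun (negative)). All times share one unit."""
        average_time = sum(answer_times) / len(answer_times)   # step 460
        estimated_completion = average_time * exam_question_count
        completion_at_oat = oat * exam_question_count          # step 470
        over_or_underrun = completion_at_oat - allotted_time   # step 480
        return estimated_completion, completion_at_oat, over_or_underrun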
At step 490, display the user's average time to answer each question, the user's estimated time to complete the examination, the user's OAT, the user's time to complete the examination at the OAT, and the overrun or underrun.
FIG. 5 illustrates a flow diagram 500 for a method for calculating and displaying the user's optimal answer time in accordance with an embodiment of the present invention. First, at step 510, create and store an array. The array may be stored in storage as a virtual table. The array has a number of rows equal to a number of time segments, plus one additional row to accumulate all answers exceeding a maximum time. A time segment may be any length of time, such as seconds, minutes, hours, or any fraction thereof, for example. A processing unit may decide the appropriate time segment, or a user may select the time segment, for example. The number of time segments is determined by the maximum time the user or the processing unit chooses to display: the actual number of time segments is that maximum time divided by the length of a time segment. The array has a number of columns which hold the user's performance in each time segment.
Next, at step 520, calculate the total number of correct answers and incorrect answers for each time segment.
At step 530, calculate the user's accuracy rate for each time segment. The number of the user's correct answers given in a time segment divided by the total number of questions answered by the user in the time segment equals the user's accuracy rate for each time segment.
Then, at step 540, calculate the user's optimal answer time ("OAT"). The OAT is calculated by performing the following calculation: for each time segment, multiply the user's accuracy rate in that time segment by 100 and then by the total number of questions answered by the user in that time segment. Then, determine the time segment with the highest value; that time range is the user's OAT.
In another embodiment, a processing unit may loop through all the answers that a user has given and find the percentage of correct answers for each time segment. The highest of those percentages is the maximum accuracy rate. In an alternative embodiment, the OAT may be calculated by not only finding the accuracy rate of each time segment, but also weighing how long ago the user answered each question. A user's OAT may adjust over time. For example, if a user takes five minutes per question before studying and, after studying, is able to answer questions in one minute, the user's historical OAT may be too high. In another embodiment, method 500 may only consider questions answered within a certain period of time before the calculation of the OAT, or may only consider a certain number of the questions answered by the user most recently. Alternatively, answers could be given a weight according to how long ago they were answered or how many questions the user has answered after that question. Additionally, some questions may be harder than others. Thus, in an embodiment, an answer time may be given less weight according to the percentage of users that got the answer wrong. The OAT may not be accurate until a significant number of answers from the user have been accumulated; that number may depend on the user's program. In another embodiment, as a default until the user has answered a certain number of questions, the OAT may be set to the same amount as the examination answer time ("EAT").
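As one concrete reading of the recency-weighted alternative, each answer's contribution could decay exponentially with its age, as in the Python sketch below; the half-life decay scheme, names, and record shape are assumptions, since the disclosure leaves the exact weighting open:

    from datetime import datetime

    def weighted_optimal_answer_time(answers, segment_len, max_time,
                                     half_life_days=14.0, now=None):
        """answers: iterable of (seconds_taken, was_correct, answered_at),
        where answered_at is a datetime. Returns the OAT segment (start, end)."""
        now = now or datetime.now()
        n_segments = int(max_time // segment_len)
        weighted_correct = [0.0] * (n_segments + 1)
        weighted_total = [0.0] * (n_segments + 1)
        for seconds, was_correct, answered_at in answers:
            age_days = (now - answered_at).total_seconds() / 86400.0
            weight = 0.5 ** (age_days / half_life_days)  # recent answers weigh more
            row = min(int(seconds // segment_len), n_segments)
            weighted_total[row] += weight
            if was_correct:
                weighted_correct[row] += weight

        def segment_value(row):
            if weighted_total[row] == 0.0:
                return 0.0
            return (weighted_correct[row] / weighted_total[row]) * 100 * weighted_total[row]

        best = max(range(n_segments + 1), key=segment_value)
        return best * segment_len, (best + 1) * segment_len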
At step 550, decide whether the OAT is less than the EAT.
At step 560, display the OAT, the EAT, and whether the OAT is greater than or less than the EAT. The OAT and EAT may be displayed on a user interface. The display may be in a multitude of forms, such as a graph, a message setting out the expected performance and the OAT, or other graphical technique, for example.
FIG. 15 illustrates a flow diagram 1500 for a method for adaptive learning in accordance with an embodiment of the present invention, which randomly selects questions for a user preparing for an examination from topics with weighted ranges.
First, at step 1510, create and store an array with the number of rows equal to the number of topics on the examination for which the user is preparing and a column value equal to a preassigned range of values. The range of values can be an equal range of values initially, such as 100, which can equal the total number of questions available in each topic. For example, for an examination with six topics, topic 1 has a range of values from 1-100, topic 2 has a range of values from 101-200, topic 3 has a range of values from 201-300, topic 4 has a range of values from 301-400, topic 5 has a range of values from 401-500, and topic 6 has a range of values from 501-600. Therefore, the range of values is from 1 to 600.
At step 1520, generate a random number in the range of values assigned to the topic areas.
At step 1530, select the topic area by finding the value range in which the random number falls. For example, based on the values discussed in step 1510 above, if the random number 134 is generated, then select topic 2.
At step 1540, create a set of questions from the topic selected in step 1530 that have not been answered.
At step 1550, generate a random number in the range of the total number of questions in the unanswered set of questions.
At step 1560, select the question in the set of unanswered questions whose number equals the random number and display it to the user through the user interface.
At step 1570, update the range of values for the topic area from which the previous question was selected based on whether the user provided a correct or incorrect answer. The total number in the new range of values for a topic will be equal to the inverse proportion of the user's accuracy factor for that topic, multiplied by a weighting factor, plus a minimum value: ((1 − accuracy factor) * (weighting factor)) + (minimal value). The total number of correct answers provided by a user in a topic divided by the total number of questions in the topic equals the user's accuracy factor for that topic. The weighting factor determines the total range of values in each topic, where a higher weighting factor allows the random number generator to generate a number for a larger range of values. For an examination with six topics and approximately 1200 questions, the weighting factor is 80. The minimal value sets the smallest range of values allocated to each major topic in order to allow the adaptive learning method to select randomly questions from all topics, including topics in which the user is performing well. For an examination with six topics and approximately 1200 questions, the minimal value is 5.
For example, if the user answered correctly the question selected in step 1560 from topic 2, the user's accuracy factor for topic 2 would be 0.01, which is 1 divided by 100, where 1 is the total number of questions answered correctly by the user from topic 2 and 100 is the total number of questions in topic 2. Applying the equation described above and using the weighting factor and minimal value described in step 1570, the total number in the new range of values for topic 2 would be 84, where ((1 − 0.01) * (80)) + (5) = 84.2, rounded to the nearest whole number.
At step 1580, update the range of values for all of the topics in the array. If no answer has been provided for a particular topic, allocate the maximum range of values. For example, based on steps 1510-1570 described above, topic 1 has a range of values from 1-100, topic 2 has a range of values from 101-184, topic 3 has a range of values from 185-284, topic 4 has a range of values from 285-384, topic 5 has a range of values from 385-484, and topic 6 has a range of values from 485-584.
After step 1580, return to step 1520.
The range values for each topic are reset to the preassigned range of values when all the questions have been answered.
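The adaptive learning loop of steps 1510 through 1580 could be sketched in Python as follows, using the weighting factor (80) and minimal value (5) given above; the helper names and data shapes are assumptions:

    import random

    WEIGHTING_FACTOR = 80
    MINIMAL_VALUE = 5
    MAX_RANGE = 100  # preassigned range size, equal to the questions per topic

    def topic_range_size(correct_answers, answered, questions_in_topic):
        """Step 1570: new range size from the topic's accuracy factor. Topics
        with no answers keep the maximum range (step 1580)."""
        if answered == 0:
            return MAX_RANGE
        accuracy_factor = correct_answers / questions_in_topic
        return round((1 - accuracy_factor) * WEIGHTING_FACTOR + MINIMAL_VALUE)

    def pick_topic(range_sizes):
        """Steps 1520-1530: draw a random number over the combined ranges and
        return the index of the topic whose range contains it."""
        draw = random.randint(1, sum(range_sizes))
        upper_bound = 0
        for topic_index, size in enumerate(range_sizes):
            upper_bound += size
            if draw <= upper_bound:
                return topic_index
        raise AssertionError("draw fell outside the combined ranges")

    def pick_question(unanswered_questions):
        """Steps 1540-1560: choose uniformly among the topic's unanswered questions."""
        return random.choice(unanswered_questions)

With six topics and a single correct answer in topic 2, topic_range_size yields 84 for topic 2 and 100 for the others, reproducing the cumulative ranges given in step 1580.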
Steps 1510 through 1580 may be performed by a processing unit, such as processing unit 130.
While the invention has been described with reference to certain embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from its scope. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.