CROSS-REFERENCE TO RELATED APPLICATIONS

This application claims priority under 35 U.S.C. §119(e) to U.S. Provisional Application Ser. No. 61/107,621, filed on Oct. 22, 2008, which is hereby expressly incorporated by reference in its entirety.
BACKGROUND

1. Field
This invention relates to computing devices and, more particularly, to systems and methods of providing user interfaces for computing devices.
2. Description of the Related Art
In many computer uses, a user selects from a menu displayed on an interface such as a screen. Such selection can be achieved by, for example, a cursor based input. An interface device such as a mouse can move the cursor to a desired location for activating an icon of the menu.
In many situations, such cursor movement can cover significant distances on the screen. Repetition of cursor movements can result in user fatigue, frustration, and repetitive motion injury. Additionally, while each individual movement to a menu of a software application may require little time, repeated use of the menu over time results in a significant amount of cumulative time spent, reducing user productivity and efficiency.
SUMMARY

In one embodiment, a method for providing a user interface on a computing device comprises displaying a first menu on a display of a computing device, the first menu having a first plurality of icons arranged in an icon region that extends substantially around an initial position of a cursor, wherein the icon region defines a central region within the icon region that includes the initial cursor position. In one embodiment, the method further comprises detecting movement of the cursor to a second position within the central region, wherein the second position of the cursor is near a first icon of the first plurality of icons or includes at least a portion of the first icon, changing an appearance of the first icon in response to detecting movement of the cursor to the second position, wherein the change in appearance indicates that the icon is temporarily selected, initiating a first action associated with the first icon in response to detecting an input from the user indicating that the first icon should be permanently selected, wherein at least some of the method is performed by the computing device.
In one embodiment, a method for providing a user interface on a computing device comprises displaying a first menu on a display of the computing device, the first menu having a plurality of icons arranged substantially around a current position of a cursor, the plurality of icons defining a central region of the display between the plurality of icons and including the current position of the cursor, receiving a first input indicative of movement of the cursor, determining which of the plurality of icons is to be temporarily selected based at least in part on a pattern of the first input within the central region, and temporarily selecting the determined icon.
In one embodiment, a computing system comprises a display screen, an input device configured to facilitate interaction with a user, and a processor configured to execute software code that causes the computing system to display a menu on the display screen, the menu having a plurality of icons arranged about a home region, detect an input facilitated by the input device and indicative of the user's desire to at least temporarily select one of the icons, and determine which of the icons is to be at least temporarily selected based at least in part on a pattern of the input, the pattern involving at least a part of the home region.
In one embodiment, a method for providing a user interface on a computing device comprises displaying a first menu on a display of a computing device, the first menu having a first plurality of icons arranged in an icon region that extends substantially around an interaction position, wherein the interaction position comprises an area of the display where a user or an apparatus controlled by a user touched the display, a current position of a cursor, or a predetermined position on the display. In one embodiment, the method further comprises receiving a first user-initiated input indicative of movement from the interaction position, and in response to the movement, selecting an icon associated with a direction of the first user-initiated input, wherein at least some of the method is performed by the computing device.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1a is a block diagram illustrating one embodiment of a computing system that may be used to implement certain systems and methods described herein.
FIG. 1b illustrates an example of a graphical menu and an example of mouse activity that could be used to initiate its display.
FIG. 1c illustrates mouse activity that could be used to temporarily select an icon within the graphical menu of FIG. 1b.
FIG. 1d illustrates mouse activity that could be used to permanently select the temporarily selected icon of FIG. 1c.
FIG. 1e illustrates how icons within a graphical menu, and icons of a second graphical menu, can be selected in response to exemplary movements of a cursor.
FIG. 2a illustrates an example use of a graphical menu on a handheld device, such as a cellular phone, PDA, or tablet computer.
FIG. 2b further illustrates the use of a graphical menu on a handheld device, such as a cellular phone, PDA, or tablet computer.
FIG. 2c illustrates an example use of a graphical menu on another handheld device that has the ability to monitor its position or movement.
FIG. 3 is a diagram illustrating screen regions of a sample graphical menu, where movement of the cursor between the screen regions in certain manners may be used to determine which icon within the graphical menu has been temporarily and/or permanently selected by the user.
FIG. 4a is a diagram illustrating another embodiment of a graphical menu including screen regions that may be used to determine which icon within the graphical menu has been selected by the user.
FIG. 4b is a diagram illustrating another embodiment of a graphical menu including screen regions that may be used to determine which icon within the graphical menu has been selected by the user.
FIG. 5a is a diagram illustrating another embodiment of a graphical menu.
FIG. 5b illustrates an icon with multiple icon location points.
FIG. 5c illustrates a graphical menu including icons with multiple icon location points.
FIG. 6a is a flowchart illustrating one embodiment of a method for operating a graphical menu.
FIG. 6b is a flowchart illustrating another embodiment of a method for operating a graphical menu.
FIG. 7a illustrates an exemplary graphical menu superimposed on a homogenous screen.
FIG. 7b illustrates an exemplary graphical menu superimposed on a complex screen output of a program that called the graphical menu.
FIG. 7c illustrates sample user interactions with the graphical menu illustrated in FIG. 7b.
DETAILED DESCRIPTION OF CERTAIN EMBODIMENTS

Embodiments of the user interface will now be described with reference to the accompanying figures, wherein like numerals refer to like elements throughout. The terminology used in the description presented herein is not intended to be interpreted in any limited or restrictive manner, simply because it is being utilized in conjunction with a detailed description of certain specific embodiments of the invention. Furthermore, embodiments of the user interface may include several novel features, no single one of which is solely responsible for its desirable attributes or which is essential to practicing the inventions herein described.
People spend large amounts of time interacting with computers and computer-like devices such as cell phones, PDAs, gaming devices and portable media players. There is a need for improved ways of interacting with these and other devices that: improve speed and efficiency; reduce repetitive motion injury; are more intuitive; and/or operate well on small display screens.
Various systems and methods described herein address some or all of these issues with embodiments of a flexible graphical menu and an efficient method of interacting with it. While embodiments of the user interface will be illustrated using display of a graphical menu, sound could be used as a supplement or replacement for display of the graphical menu, as will be discussed below.
User Interfaces and Menus

User interfaces are described herein for depicting data on a display device of a computer, where the term “computer” is meant to include any of the computing devices described above, as well as any other electronic device that includes a display and/or other audio output device. Depending on the embodiment, the user interfaces described herein may provide one or more of several advantages. For example, a user interface may include a graphical menu that appears on demand so it does not take up room on the display screen until it is needed. This reduces screen clutter and is especially useful with small screens as there is no need to devote screen pixels to display the menu until it is needed. In another example, the user does not have to move the screen cursor large distances to initiate display of the graphical menu.
In yet another example, the graphical menu appears in a home region, which includes an area surrounding a current cursor position in one embodiment, or other area with which the user is likely interfacing. Therefore, the user does not need to direct his attention to other areas of the screen, which may provide a particular advantage when users are concentrating on analyzing content of the screen. In yet another example, the user can pick an icon (e.g., that is representative of a function that may be performed by a software application) within a graphical menu with only minimal mouse movement. In some embodiments, it is not necessary for the user to position the cursor over an icon or click on it, but only move slightly toward it. This may increase user speed and efficiency. In addition, the reduction in mouse movement has the potential to reduce repetitive motion injury, particularly in applications where users interface with computers for many hours per day, for example: radiologists reading medical imaging exams on Picture Archive and Communication Systems; office workers who spend hours per day with email, word processing, and spreadsheet applications; web surfing; and/or computer gaming.
In another example, the systems and methods described herein may provide visual and/or auditory feedback as to which of the items in a graphical menu has been chosen and the user can vary mouse position and dynamically change the selected icon. In yet another example, once a user learns the relative positions of icons within a graphical menu, there is no need for the user to visually examine the presented menu; rather, the user may rapidly choose the desired icon by moving the mouse (or other input device) in the remembered direction (or pattern of directions) of the desired icon(s).
The present disclosure is presented generally in the following structure. Some terms as used herein are defined for clarity. An embodiment of an exemplary computing system, which is representative of any computing system on which user interfaces may be displayed and interfaced with by a user, is described with reference to FIG. 1a. FIGS. 1b-1e illustrate sample conceptual configurations of menus, and exemplary navigation thereof. Embodiments of the user interface systems and methods for use on computing devices with small screens or other systems without a mouse, such as a cell phone, PDA, gaming device, MP3 or media player, or tablet PC, are described in conjunction with FIG. 2a and FIG. 2b. An example embodiment on a handheld device that can sense movement or position, such as an Apple iPhone or iTouch, is described in conjunction with FIG. 2c. Methods for determining icon selection within a graphical menu based on cursor position are described in conjunction with FIGS. 3, 4a-4b, and 5a-5c. FIGS. 6a and 6b are flowcharts illustrating operation of a computing device according to embodiments. Another embodiment including computer screen examples is discussed in conjunction with FIGS. 7a-7c. Other contemplated embodiments are discussed, including use of sound as a supplement to or replacement for display of a graphical menu.
DEFINITIONS OF CERTAIN TERMS

A “graphical menu” can include one or more graphical or textual objects, such as icons, where each of the objects is representative of a particular menu option.
An “icon” can be a component of a graphical menu that could be anything displayed on the screen that is visually distinguishable, such as a picture, button, frame, drawing, text, etc.
An “initial cursor position” can include a screen location of a cursor at the time the graphical menu system is initiated. The graphical menu is typically displayed around the initial cursor position and sufficient movement from this position is typically required for an icon to be selected.
A “home region” is the region around the initial cursor position, and including the initial cursor position. The home region may extend different distances from the initial cursor position, such as from just a few millimeters to a few centimeters or more on the display device. Depending on the embodiment, the home region may be centered around the initial cursor position or may be offset such that the initial cursor position is closer to one edge (e.g., a top edge) of the home region than to an opposite edge (e.g., the bottom edge) of the home region. A home region may also be determined based on a location where a user has interfaced with a display device, where there may not be a cursor at all, such as a location where a touchscreen was touched by a finger or stylus of the user or where the finger or stylus moved in a predetermined pattern on the touchscreen.
A “temporarily selected icon” can include an icon within a graphical menu that has been temporarily chosen by the user, but has not yet been selected such that the particular menu option associated with the temporarily selected icon has not yet been initiated. Rather, the graphical menu is displayed so that the user can confirm that the temporarily selected icon is the desired icon. If the user is not satisfied with the indicated temporary selection, the user can choose a different icon within the graphical menu or choose no icon. A temporarily selected icon may be displayed in such a way as to allow the user to visually distinguish it from icons that are not temporarily selected.
A “permanently selected icon” (or simply “selected icon”) can include an icon that has been selected by the user. When an icon is permanently selected, a software function associated with the icon is initiated by the program or operating system. An icon may be permanently selected in various manners, depending on the embodiment, some of which are described in further detail below.
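The home-region geometry defined above can be given a concrete, illustrative shape. The sketch below is not part of the disclosure; the class name, field names, and the rectangular shape are assumptions made purely for illustration (a home region could equally be circular or irregular), and it shows only the containment test and the offset placement described in the definition.

```python
from dataclasses import dataclass

@dataclass
class HomeRegion:
    """Illustrative rectangular home region around an initial
    interaction position (cursor location or touch point). The region
    may be offset so the initial position sits closer to one edge
    (e.g., the top) than to the opposite edge."""
    cx: float     # initial cursor / touch x
    cy: float     # initial cursor / touch y
    up: float     # extent above the initial position
    down: float   # extent below the initial position
    left: float   # extent to the left
    right: float  # extent to the right

    def contains(self, x: float, y: float) -> bool:
        # Screen coordinates: y grows downward.
        return (self.cx - self.left <= x <= self.cx + self.right and
                self.cy - self.up <= y <= self.cy + self.down)

# A region offset so the initial position is nearer the top edge:
region = HomeRegion(cx=100, cy=100, up=5, down=15, left=10, right=10)
```

A point still inside the region (e.g., `region.contains(100, 110)`) would leave any temporary selection unchanged, while a point outside it would begin icon selection.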
Computing Systems

In some embodiments, the computing devices, computing systems, mobile devices, workstations, computer clients and/or servers described herein may comprise various combinations of components, such as the exemplary combinations of components illustrated in FIGS. 1a-1e. Discussion herein of one or more specific types of computing devices should be construed to include any other type of computing device. Thus, a discussion of a method performed by a mobile computing device is also contemplated for performance on a desktop workstation, for example.
FIG. 1a is a block diagram illustrating one embodiment of a computing system 100 that may be used to implement certain systems and methods described herein. For example, the computing system 100 may be configured to execute software modules that cause the display of a menu around an area of focus (e.g., a current cursor position or a position on a touch screen that is touched by a finger or stylus) on a display device 104. Below is a description of exemplary components of the computing system 100.
The computing system 100 includes, for example, a personal computer that is IBM, Macintosh, or Linux/Unix compatible. In one embodiment, the computing system 100 comprises a server, a desktop computer, a laptop computer, a mobile computer, a cell phone, a personal digital assistant, a gaming system, a kiosk, an audio player, any other device that utilizes a graphical user interface (including office equipment, automobiles, airplane cockpits, household appliances, automated teller machines, self-service checkouts at stores, information and other kiosks, ticketing kiosks, vending machines, industrial equipment, etc.) and/or a television, for example. In one embodiment, the exemplary computing system 100 includes a central processing unit (“CPU”) 105, which may include one or more conventional or proprietary microprocessors. The computing system 100 further includes a memory 108, such as one or more random access memories (“RAM”) for temporary storage of information, a read only memory (“ROM”) for permanent storage of information, and a mass storage device 102, such as a hard drive, diskette, flash memory drive, or optical media storage device. The modules of the computing system 100 may be connected using a standard based bus system. In different embodiments, the standard based bus system could be Peripheral Component Interconnect (“PCI”), PCI Express, Accelerated Graphics Port (“AGP”), Microchannel, Small Computer System Interface (“SCSI”), Industrial Standard Architecture (“ISA”) and Extended ISA (“EISA”) architectures, for example. In addition, the functionality provided for in the components and modules of computing system 100 may be combined into fewer components and modules or further separated into additional components and modules.
The computing system 100 is generally controlled and coordinated by operating system software, such as Windows 95, Windows 98, Windows NT, Windows 2000, Windows XP, Windows Vista, Windows 7, Windows Mobile, Unix, Linux (including any of the various variants thereof), SunOS, Solaris, mobile phone operating systems, or other compatible operating systems. In Macintosh systems, the operating system may be any available operating system, such as MAC OS X or iPhone OS. In other embodiments, the computing system 100 may be controlled by a proprietary operating system. Conventional operating systems control and schedule computer processes for execution, perform memory management, provide file system, networking, and I/O services, and provide a user interface, such as a graphical user interface (“GUI”), among other things.
The exemplary computing system 100 includes one or more input/output (I/O) devices and interfaces 110, such as a keyboard, trackball, mouse, drawing tablet, joystick, game controller, touchscreen (e.g., capacitive or resistive touchscreen), touchpad, accelerometer, and printer, for example. The computing system also includes a display device 104 (also referred to herein as a display screen), which may also be one of the I/O devices 110 in the case of a touchscreen, for example. In other embodiments, the display device 104 may include an LCD, OLED, or other thin screen display surface, a monitor, television, projector, or any other device that visually depicts user interfaces and data to viewers. The display device 104 provides for the presentation of GUIs, application software data, and multimedia presentations, for example. The computing system 100 may also include one or more multimedia devices, such as speakers, video cards, graphics accelerators, and microphones, for example.
In the embodiment of FIG. 1a, the I/O devices and interfaces 110 may provide a communication interface to various external devices. For example, the computing system 100 may be electronically coupled to a network, such as one or more of a LAN, WAN, or the Internet, for example, via a wired, wireless, or combination of wired and wireless, communication link(s). Such a network may allow communication with various other computing devices and/or other electronic devices via wired or wireless communication links.
In the embodiment of FIG. 1a, the computing system 100 also includes a user interface module 106 that may be executed by the CPU 105. This module may include, by way of example, components, such as software components, object-oriented software components, class components and task components, processes, functions, attributes, procedures, subroutines, segments of program code, drivers, firmware, microcode, circuitry, data, databases, data structures, tables, arrays, and variables. In the embodiment shown in FIG. 1a, the computing system 100 is configured to execute the user interface module 106, among others, in order to provide user interfaces to the user, such as via the display device 104, and monitor input from the user, such as via a touchscreen sensor of the display device 104 and/or one or more I/O devices 110, in order to navigate through various menus of a software application, for example.
In general, the word “module,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions, possibly having entry and exit points, written in a programming language, such as, for example, Java, Javascript, ActionScript, Visual Basic, Lua, C, C++, or C#. A software module may be compiled and linked into an executable program, installed in a dynamic link library, or may be written in an interpreted programming language such as, for example, BASIC, Perl, or Python. It will be appreciated that software modules may be callable from other modules or from themselves, and/or may be invoked in response to detected events or interrupts. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules may be comprised of connected logic units, such as gates and flip-flops, and/or may be comprised of programmable units, such as programmable gate arrays or processors. The modules described herein are preferably implemented as software modules, but may be represented in hardware or firmware. Generally, the modules described herein refer to logical modules that may be combined with other modules or divided into sub-modules despite their physical organization or storage.
In other embodiments, the computing system may include fewer or additional components than are illustrated in FIG. 1a. For example, a mobile computing device may not include a mass storage device 102, and the display device 104 may also be the I/O device 110 (e.g., a capacitive touchscreen). In some embodiments, two or more of the components of the computing system 100 may be implemented in one or more field programmable gate arrays (FPGAs) or application specific integrated circuits (ASICs), for example.
Examples of Systems and Methods

In FIG. 1b, view 120 illustrates a mouse 130 comprising a right button 132. In view 120, a user depresses the right mouse button 132 of mouse 130, with depression of the right mouse button illustrated with arrow 134. In one embodiment, depressing the right mouse button 132 initiates display of a graphical menu 140 on the display screen centered around initial cursor position 125 on the display device. In other embodiments, other operations may be performed on the mouse 130 (or other input device) in order to initiate display of the graphical menu 140. In the embodiment of FIG. 1b, the graphical menu 140 comprises one or more icons (in this example, eight octagonal icons labeled 141-148). Graphical menus and their component icons can vary in appearance and functionality, as will be described below.
The example graphical menu 140 may be displayed on top of whatever else might be displayed on the display screen, with some portions of the graphical menu transparent in some embodiments. In the example of FIG. 1b, the graphical menu 140 is displayed so that it is centered around the initial cursor position 125.
For the purposes of the series of events illustrated in FIG. 1b, FIG. 1c, and FIG. 1d, it is assumed that by default, display of the graphical menu 140 is centered on initial cursor position 125 (e.g., the cursor position when the user initiated display of the graphical menu, such as by right clicking the mouse 130).
FIG. 1c illustrates in view 122 a mouse movement that could be used to temporarily select the icon 142 (FIG. 1b), such that the icon 142a (FIG. 1c) is temporarily selected. As illustrated in view 122, the user continues action 134 of depressing the right mouse button 132 and, in this example, moves the mouse 130 superiorly and to the right, along the path depicted by arrow 136. This movement of the mouse causes cursor 170 to move superiorly and to the right on the display device on which the graphical menu 140 is displayed. Thus, FIG. 1c illustrates cursor 170 moved from the initial cursor position 125 towards icon 142a.
As the cursor 170 approaches a portion of the graphical menu, an icon within the graphical menu is temporarily chosen and displayed in such a way as to visually distinguish it from unselected icons within the graphical menu. Thus, the graphical menu 140 shows the temporarily selected icon 142a displayed in a way that differentiates it from its original appearance as icon 142 (FIG. 1b). In the example of FIGS. 1b and 1c, icon 142 in FIG. 1b has changed to icon 142a in FIG. 1c by changing background and font colors of the icon 142, in order to indicate that icon 142 has been temporarily selected. There are many ways that an icon could change to depict that it is temporarily selected and differentiate it from icons that are not chosen. For example, an icon may become animated when temporarily selected, may display a modified or different image or text, or may be transformed in any other manner.
As noted above, in this exemplary embodiment the user is not required to position the cursor 170 directly over an icon in order to select that icon. As will be discussed in more detail below, only minimal movement toward an icon may be required to select it, increasing efficiency and decreasing necessary mouse movement and the potential for repetitive motion injury.
FIG. 1d demonstrates how the user indicates that the temporarily selected icon 142a (FIG. 1c) is permanently selected, which represents a final choice for this interaction; the graphical menu is then no longer displayed. As illustrated in view 124, the user releases mouse button 132 such that the button moves in a direction depicted by arrow 138 (e.g., releasing the right button 132). Thus, in the embodiment of FIGS. 1b, 1c, and 1d, the interaction comprises depressing the right button 132 to initiate display of the graphical menu, moving the cursor 170 towards (and/or partially or fully over) a desired icon to temporarily select it, and releasing the right button 132 to permanently select the desired icon and initiate execution of an operation associated with it.
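The direction-based temporary selection described in FIGS. 1b-1d can be sketched as an angle-to-sector mapping. The sketch below is illustrative only: the function name, the 8-icon clockwise layout starting straight up, and the 10-pixel threshold are assumptions, not values from the disclosure.

```python
import math

def pick_icon(initial, cursor, n_icons=8, threshold=10.0):
    """Map cursor movement to a temporarily selected icon index.

    Icons are assumed evenly spaced around the initial cursor position,
    with icon 0 directly above and indices increasing clockwise. The
    cursor only has to move `threshold` pixels toward an icon; it never
    has to reach or cover the icon itself.
    """
    dx = cursor[0] - initial[0]
    dy = initial[1] - cursor[1]  # flip: screen y grows downward
    if math.hypot(dx, dy) < threshold:
        return None  # cursor still within the central/home region
    angle = math.degrees(math.atan2(dx, dy)) % 360  # 0 deg = straight up
    sector = 360.0 / n_icons
    return int(((angle + sector / 2) % 360) // sector)

# Superior-and-rightward movement, as in FIG. 1c, lands in sector 1:
# pick_icon((0, 0), (30, -30)) returns 1.
```

Calling this on every mouse-move event yields the currently temporarily selected icon; a button release while the result is non-None would then make the selection permanent.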
In the embodiment illustrated in FIG. 1b, graphical menu 140 is displayed symmetrically around initial cursor position 125. However, in another embodiment where there is a default icon choice, for example, the graphical menu could be asymmetrically positioned around the initial cursor position such that an icon is chosen by default. In one embodiment, the graphical menu 140 may be positioned such that a default icon is closer to the initial cursor position when the graphical menu 140 is initially displayed. With reference to FIG. 1c, for example, if the initial cursor position is the position of cursor 170 shown in FIG. 1c, rather than position 125 indicated in the figure, the menu 140 may be initially displayed so that icon 142a is temporarily selected as a default. Depending on the embodiment, any of the icons in the graphical menu may be chosen by default, such as in response to options established by a user or based on frequency of use of respective icons, for example.
FIG. 1e illustrates how icons within a graphical menu, and display of a second graphical menu, can be selected in response to movements of a cursor. There is no limit to the number of choices that can be presented to the user using the graphical menus discussed herein. For example, the permanent selection of an icon in one graphical menu could initiate display of another graphical menu, as will be discussed in further detail with reference to FIG. 1e. This could be repeated so that selection of an icon in a second graphical menu could open a third graphical menu, and the process may be repeated ad infinitum to present further graphical menus. One of the selections in a graphical menu could be to return to a previous graphical menu.
In FIG. 1e, screen regions 160, 162, 164 and 166 represent the same physical screen region but at different stages in the navigation of a primary graphical menu (stages 160, 162) and a secondary graphical menu (stages 164, 166). Region 161 is a magnification of central region 169 of screen region 160, with its approximate size and location illustrated by a dashed rectangle 169 within region 160. Magnified central regions 163, 165, and 167 of screen regions 162, 164, and 166, respectively, are also shown, with the corresponding magnified regions having the same relationship as region 161 to screen region 160.
In FIG. 1e, screen region 160 shows display of graphical menu 140 including icons labeled A-H that are arranged in an icon region surrounding the initial cursor position of cursor 170, depicted in both the dashed rectangle 169 and the magnified region 161 that represents the content of the same dashed rectangle 169. In one embodiment, display of the graphical menu 140 was initiated by user actions.
In screen region 162, the user has moved the cursor 170 superiorly and to the right along path 172, depicted in magnified region 163. In this embodiment, movement of cursor 170 toward icon 152 has caused icon 152 to be temporarily selected and its appearance has changed so that it can be visually differentiated from unselected icons, such as icons 141 and 143. As illustrated, icon 152 is temporarily selected before the cursor reaches the icon 152. In other embodiments, temporary selection of an icon may not occur until at least a predetermined portion of the cursor covers an icon. Various criteria for determining when icons are temporarily and/or permanently selected are discussed below.
In the example shown in FIG. 1e, permanent selection of icon 152, such as by releasing the right mouse button when icon 152 is temporarily selected, for example, results in display of a new graphical menu 180. Depending on the embodiment, the display of graphical menu 180 shown in screen region 164 could be configured to occur in the following example circumstances: (1) display of the second graphical menu 180 could occur as soon as the icon 152 (or other icon associated with display of the graphical menu 180) in the first graphical menu 150 is temporarily selected, (2) display of the second graphical menu 180 could occur after the icon 152 is permanently selected, such as by releasing the right mouse button or with one of the other techniques described herein, or (3) display of the graphical menu 180 could occur after a time delay. This would allow the user to reposition the cursor 170 if an undesired icon is temporarily selected (e.g., rather than immediately replacing graphical menu 150 with graphical menu 180 when the undesired icon is temporarily selected). A selected time delay, such as 100 milliseconds, for example, could be set such that permanent selection of an icon, and display of a second menu in this example, would occur after an icon is temporarily selected for at least 100 milliseconds.
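The time-delay variant above amounts to a dwell timer on the temporary selection: the action fires only once the same icon has stayed temporarily selected for the delay period. The sketch below is one illustrative way to implement that; the class name is an assumption, and time is passed in explicitly rather than read from a system clock so the behavior is easy to verify.

```python
class DwellSelector:
    """Promote a temporary selection to a permanent one only after it
    has been held for `delay_ms`, letting the user move the cursor away
    from an undesired icon before its action is initiated."""

    def __init__(self, delay_ms=100):
        self.delay_ms = delay_ms
        self.current = None  # the currently (temporarily) selected icon
        self.since = None    # timestamp when that selection began

    def update(self, icon, now_ms):
        """Report the temporarily selected icon (or None) at time now_ms.

        Returns the icon once it has been held for delay_ms, else None.
        Changing icons restarts the timer.
        """
        if icon != self.current:
            self.current, self.since = icon, now_ms
            return None
        if icon is not None and now_ms - self.since >= self.delay_ms:
            return icon
        return None
```

For example, holding icon "B" from t=0 ms returns None at 0 and 50 ms but "B" at 120 ms; switching to "C" at 130 ms restarts the 100 ms wait.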
The screen region 164 depicts display of the secondary graphical menu 180 and removal of graphical menu 150, such as in response to one of the above-indicated interactions with icon 152. In this embodiment, the secondary graphical menu 180 is centered around a new initial cursor position, the position of the cursor in screen region 162 at the time that icon 152 was permanently selected. As discussed elsewhere herein, graphical menus can vary in their appearance, and graphical menu 180 happens to have 4 square icons. Screen region 165 depicts a magnification of screen region 164, as described above.
Screen regions 166 and 167 illustrate what happens when the user moves the cursor inferiorly from the position illustrated in screen regions 164 and 165 along the path illustrated by arrow 174. Cursor movement inferiorly has caused the temporary selection of an icon 187 within graphical menu 186. Graphical menu 186 is the same as graphical menu 180 except that an icon has been temporarily selected. Specifically, graphical menu 186 has a temporarily selected icon 187 displayed in a way that differentiates it from unselected icons such as 182 and 183.
Implementation on Cell Phones, PDAs, Tablet PCs
Some computing systems with displays do not utilize a mouse for navigation, and the user interfaces described herein can be implemented with other forms of navigation. For example, FIGS. 2a and 2b illustrate the implementation of an enhanced user interface using a stylus, but other navigation devices could be utilized, such as a finger-controlled touch screen or directional navigation buttons, for example. In the description below, the device in FIG. 2a and FIG. 2b will be referred to as a cell phone, but it could be a PDA, tablet PC, or other device with a display screen 212. While the pointing device illustrated is a stylus 214, it could alternatively be the user's finger or other object.
In FIG. 2a, the user initiates an action that causes display of graphical menu 230. In this implementation, display of the graphical menu 230 is initiated by detection of a particular motion path 220 on the input screen 212. In this embodiment, motion path 220 comprises roughly the path that the user would use to draw the number “6” using the stylus 214. The user interface software that executes on the cell phone 210 could be configured to display other graphical menus for other input tracings. For example, the interface software could be configured to display a different graphical menu in response to the user tracing a path similar to the letter “L” or any other pattern. Display of graphical menus could be initiated in many other ways, as will be described herein.
In this example, graphical menu 230 has eight hexagonal icons and is displayed centered about where the input pattern was completed on the display screen. Thus, in the embodiment of FIG. 2a, the initial cursor position is a terminal position of tracing 220. In other embodiments, the graphical menu 230 may be centered elsewhere, such as a start position of the tracing 220 or some intermediate position of the tracing 220, for example. Icon 237 is one of the eight icons within graphical menu 230.
The user can temporarily select an icon within the graphical menu 230 by moving the stylus 214 toward the desired icon. FIG. 2b shows an example where the user has moved the stylus 214 toward the left along path 222. This movement causes temporary selection of the closest icon, in this case the icon 237, which changes appearance in FIG. 2b in response to being temporarily selected, allowing it to be visually differentiated from the other unselected icons in the graphical menu 240, for example unselected icon 246.
FIG. 2c illustrates the use of a graphical menu 270 on another handheld device 260 that has the ability to monitor its position (e.g., orientation) or movement. The device 260 is a handheld device such as a cell phone (e.g., iPhone), PDA, tablet PC, portable music or media player, gaming device, or other handheld device with a display screen. In another embodiment, device 260 could be an input device, such as a Wii controller or 3D mouse, where the screen is on another device.
In order to use this graphical menu system in the way that will be described in FIG. 2c, device 260 includes technology that allows it to sense its position and/or motion, such as one or more accelerometers. Device 260 has display screen 262 and may have one or more input devices 264 and 265, which could include buttons or other input devices. Device 260 is depicted in view 250 in an arbitrary orientation (e.g., position held by the user). As will be described, its position will be changed by the user in views 252 and 254 in order to indicate selection of icons.
In view 250 of FIG. 2c, graphical menu 270 is displayed on screen 262 and includes icons 271-274. The user initiated some action to cause the graphical menu to be displayed, for example one of the other techniques described herein. Additional ways the user could initiate display of graphical menu 270 include pressing a button, for example button 264, voice or other audible commands, touching the screen with two fingers in a predetermined pattern and/or location, or some positioning of device 260, such as shaking it side to side.
In view 252 of FIG. 2c, x, y, and z axes are illustrated to indicate repositioning of the device by the user. The x axis and y axis are in the plane of screen 262 of device 260, along its short and long axes respectively, and the z axis is perpendicular to the screen. Motion path 285 illustrates that the user is physically rotating the device toward his left about the y axis. Device 260 detects movement of the device 260 along motion path 285 and temporarily selects the icon within the graphical menu that is positioned in the detected direction from the point of view of the center of the graphical menu. In this case, because the motion path 285 comprises rotation of the device 260 toward the left, the left icon 274 of the graphical menu 270 is temporarily selected. Once it is temporarily selected, one or more characteristics of icon 274 are changed (as shown by the dark background of icon 274 in view 252) in order to allow it to be visually differentiated from the remaining unselected icons.
In view 254 of FIG. 2c, x, y, and z axes are again illustrated as in view 252. As illustrated by motion path 286, the user is rotating the device downward (toward him, rotating about the x axis). In response to this movement, the computing device temporarily selects the icon 273 at the bottom of the graphical menu 270. In this case, selected icon 273 is illustrated in a way that differentiates it from the remaining unselected icons. While views 252 and 254 illustrate rotation around specific axes, the user may rotate the device in any arbitrary direction to allow temporary selection of an icon at any position on the screen.
In one embodiment, an icon that is temporarily selected may be permanently selected without further user interaction (e.g., by maintaining the device 260 in an orientation that temporarily selects the icon for a predetermined time period, so that there is effectively no separate temporary selection), by pressing a button, such as one of buttons 264 and 265, or by any other input that may be provided by the user of the device 260.
In the example depicted in FIG. 2c, detection of “pouring” motions (e.g., view 252 shows the device being tilted as if the user is pouring into icon 274) can be facilitated by tilt sensor(s) and/or accelerometer(s). In certain embodiments, a sufficient number of such detection components can be provided so as to allow motion-based temporary selection and/or permanent selection of icons having both x and y components. For example, suppose that a fifth icon is provided between icons A and B (in the first quadrant of the x-y plane). Then, a tilting motion toward such an icon can be detected by sensing a combination of motions about the y and x axes (motions opposite to motion paths 285 and 286).
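The combined two-axis detection described above can be sketched as follows; the function name, thresholds, and sign conventions are hypothetical assumptions for illustration, not taken from the specification:

```python
def tilt_to_icon(rot_y, rot_x, threshold=0.2):
    """Map sensed rotation to an icon position, as in FIG. 2c.

    rot_y is rotation about the y axis (negative assumed to mean
    tilting left, as in motion path 285); rot_x is rotation about
    the x axis (negative assumed to mean tilting toward the user,
    as in motion path 286). When both components exceed the
    threshold, a diagonal icon (e.g., a fifth icon between two
    axis-aligned icons) is selected. Returns None within the
    deadzone, so small hand jitter selects nothing.
    """
    horiz = "left" if rot_y < -threshold else "right" if rot_y > threshold else None
    vert = "bottom" if rot_x < -threshold else "top" if rot_x > threshold else None
    if horiz and vert:
        return vert + "-" + horiz  # e.g., "top-right"
    return horiz or vert
```

A jerk-based variant (next paragraph) could reuse the same mapping with acceleration components in place of rotations.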
There are a number of other motion-based user inputs that can be implemented to achieve similar results. For example, a device can be jerked slightly toward an icon that the user wants to temporarily (or permanently) select. For such motions, one or more accelerometers can be provided and configured to detect two-dimensional motion along a plane, such as a plane substantially parallel to the device screen.
Methods for Determining Icon Selection
There are many possible methods for determining whether a user has selected an icon within a graphical menu. Several will be described herein, but others are contemplated that would provide the same or similar functionality.
FIG. 3 is a diagram illustrating screen regions 320-328 of a graphical menu 150, where movement of a cursor 303 into certain screen regions may be used to determine which icon within the graphical menu has been selected (or temporarily selected) by the user. Using such a mapping scheme, an icon within the graphical menu may be selected when the user positions the cursor 303 within a screen region corresponding to that icon.
In this example, graphical menu 150 depicts eight icons: 141, 152, 143, 144, 145, 146, 147, and 148. The screen is divided into multiple regions 320-328, with regions 321-328 corresponding to respective icons and region 320 centered on the initial cursor position at the time the graphical menu was displayed, around which the graphical menu is arranged. In this embodiment, each of the regions 321-328 includes at least a portion of its respective icon. In the example shown, region 321 corresponds to icon 141, region 322 to icon 152, region 323 to icon 143, region 324 to icon 144, region 325 to icon 145, region 326 to icon 146, region 327 to icon 147, and region 328 to icon 148.
Determining whether the cursor falls within a region is straightforward in this example, as the regions are bounded by horizontal lines 331 and 333 and vertical lines 335 and 337, which may be represented by x and y coordinates in a computer system. For the purposes of illustration, lines 335, 337, 331, and 333 are labeled with “x1”, “x2”, “y1”, and “y2”, respectively, in order to indicate their positions in the coordinate system of screen 310 as follows:
Vertical line 335 is at position x1.
Vertical line 337 is at position x2.
Horizontal line 331 is at position y1.
Horizontal line 333 is at position y2.
In this embodiment, if the position of cursor 303 at any given time is represented by coordinates (x, y), determining the region in which the cursor is positioned can be accomplished as follows:
If x>x1 and x<x2 and y≧y2, then in region 321.
If x>x1 and x<x2 and y>y1 and y<y2, then in region 320.
If x>x1 and x<x2 and y≦y1, then in region 325.
If x≧x2 and y≧y2, then in region 322.
If x≧x2 and y>y1 and y<y2, then in region 323.
If x≧x2 and y≦y1, then in region 324.
If x≦x1 and y≧y2, then in region 328.
If x≦x1 and y>y1 and y<y2, then in region 327.
If x≦x1 and y≦y1, then in region 326.
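The nine comparisons above can be collapsed into a single table lookup. The sketch below assumes a coordinate system in which y increases upward (as the inequalities imply) and uses the region numbering of FIG. 3; it is illustrative, not code from the specification:

```python
def region_for_cursor(x, y, x1, x2, y1, y2):
    """Return the FIG. 3 region number for cursor position (x, y).

    The screen is split by vertical lines at x1 < x2 and horizontal
    lines at y1 < y2; region 320 is the central home region around
    the initial cursor position, and regions 321-328 correspond to
    the eight icons.
    """
    col = "middle" if x1 < x < x2 else ("right" if x >= x2 else "left")
    row = "middle" if y1 < y < y2 else ("top" if y >= y2 else "bottom")
    grid = {
        ("top", "left"): 328,    ("top", "middle"): 321,    ("top", "right"): 322,
        ("middle", "left"): 327, ("middle", "middle"): 320, ("middle", "right"): 323,
        ("bottom", "left"): 326, ("bottom", "middle"): 325, ("bottom", "right"): 324,
    }
    return grid[(row, col)]
```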
FIG. 4a is a diagram illustrating another embodiment of a graphical menu, including screen regions that may be used to determine which icon within the graphical menu has been selected by the user. In this embodiment, screen 402 includes a graphical menu having eight hexagonal icons. Position 404 indicates the initial cursor position when the graphical menu was rendered at its current position on the screen.
In this example, the screen 402 is divided into radial regions 421-428 and a central home region 420 centered in the graphical menu. In this embodiment, radial regions 421-428 each correspond to a respective icon in the graphical menu. For example, region 421 corresponds to icon 431, region 422 corresponds to icon 432, and region 423 corresponds to icon 433.
When cursor 410 is positioned within home region 420, no icon is selected. When the user moves the cursor out of home region 420 and into another region, the icon within that region is temporarily selected. In this case, cursor 410 has been moved by the user into region 422. This has caused temporary selection of the corresponding icon 432, which is displayed in such a way as to differentiate it from the unselected icons within the graphical menu. In this example the temporarily selected icon is displayed darker and the letter inside it is displayed with a different font and color, but there are many other ways that a temporarily selected icon could be visually distinguished from unselected icons.
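One way to implement the radial scheme of FIG. 4a is to measure the angle from the home position to the cursor and divide the plane into equal angular sectors outside a circular home region. Equal sectors and a circular home region are illustrative assumptions here; the specification also permits regions of differing size and shape:

```python
import math

def radial_region(cursor, home, home_radius, n_regions=8):
    """Return None while the cursor is within the circular home
    region; otherwise return the index (0..n_regions-1) of the
    radial region containing the cursor, measured counterclockwise
    from the positive x axis, with region 0 centered on that axis."""
    dx, dy = cursor[0] - home[0], cursor[1] - home[1]
    if math.hypot(dx, dy) <= home_radius:
        return None  # cursor in home region: no icon selected
    angle = math.atan2(dy, dx) % (2 * math.pi)
    sector = 2 * math.pi / n_regions
    # Shift by half a sector so sector boundaries fall between icons.
    return int(((angle + sector / 2) // sector) % n_regions)
```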
FIG. 4b is a diagram illustrating another embodiment of a user interface, including screen regions 461-468 that may be used to determine which icon within a graphical menu has been selected by the user, where the user interface includes asymmetric icons and asymmetric screen regions used for detection of icon selection. In the embodiment of FIG. 4b, the graphical menu comprises eight icons within screen region 440. This example demonstrates that unselected icons within a graphical menu may differ in appearance, including features such as size and shape. In addition, the size and/or shape of the screen regions associated with each icon in a graphical menu may differ.
The screen regions associated with each icon can differ in size and shape. This may be advantageous in cases where some icons are more commonly chosen than others. More commonly selected icons might be assigned larger associated screen regions to make it easier for the user to select those areas and therefore the respective icons. While the relative size of the various screen regions could vary independently of the size of the icons in the graphical menu, in this example icons 473 and 477 are larger than the other icons in the graphical menu, and their associated screen regions 463 and 467 are also larger than the other screen regions.
As in a previous example, home region 460 is centered where the cursor was positioned at the time the graphical menu was displayed. When cursor 170 is positioned within the home region, no icon is selected. In other embodiments, however, a default icon may be temporarily selected even when the cursor 170 is initially positioned within the home region 460.
The remainder of screen region 440 is divided into eight regions, one corresponding to each of the icons within the graphical menu, with screen regions 461-468 depicted using underlined text in the figure. For example, region 461 is associated with unselected icon 471, region 462 with unselected icon 472, region 463 with selected icon 473, and region 467 with unselected icon 477.
In this example, the user has positioned cursor 170 in region 463, causing temporary selection of icon 473. Icon 473 had an appearance similar to icon 477 when it was unselected, but upon temporary selection, the icon 473 changed its appearance to allow it to be differentiated from the unselected icons. Temporarily selected icon 473 is darker than the unselected icons, and the letter within it is displayed in a different font that is larger, bold, and white instead of black.
FIG. 5a is a diagram illustrating another embodiment of a graphical menu. In this embodiment, instead of dividing the screen into regions, this technique uses the distance between the cursor and the icons to determine whether, and which, icon is selected.
For the purposes of describing this technique, the initial cursor position 504 is the screen position at which the graphical menu was initially displayed.
In the technique depicted in FIG. 5a, each icon is associated with a single position (icon location point). In the example shown, graphical menu 150 has eight hexagonal icons, including icons 141, 152, and 143. In this example, each icon's location point is at the icon center. However, an icon's location point could be assigned to any position within the icon, or even outside of the icon. In the example shown, the icon location point for icon 143 is at position 514.
The location of user-controlled cursor 510 is screen position 507 in the figure. In the figure, the distance between each icon's location point and the cursor position 507 is depicted by a dashed line. For example, dashed line 516, which would be invisible to the user of the graphical menu, represents the distance between cursor position 507 and icon location point 514 of icon 143.
Determining whether an icon has been selected and, if so, which one, can be accomplished by using the distances between the cursor position and the icon location points. Whether any icon has been selected can be determined in a number of ways. Two non-limiting examples are described below.
With one technique, the distance between the cursor position 507 and the initial cursor position 504 is determined, depicted in the figure as dashed line 512. This distance is compared to a threshold distance; when the distance is above the threshold distance (e.g., the cursor is close enough to an icon), an icon is temporarily selected. Once it is determined that the cursor is positioned such that an icon should be temporarily selected, the computing system determines which of the icons is the selected icon. In one embodiment, a particular icon is identified for temporary selection by assessing the distances between the cursor location 507 and the icon location points, and then selecting the icon with the smallest cursor-to-icon-location-point distance. It is possible that two or more icons might have equal cursor-to-icon-location-point distances. This situation may be resolved in several ways, for example: (1) no icon is selected until the user repositions the cursor so that the choice is unique, (2) icons are assigned priorities so that the highest-priority icon is chosen in the case of distance ties, or (3) an icon is randomly chosen from among the tied group.
In the example shown, distance 518 is the smallest cursor-position-to-icon-location distance, causing icon 152 to be temporarily selected. Note that the appearance of selected icon 152 differs from the other unselected icons in graphical menu 150.
In another embodiment, rather than first performing the step of determining whether an icon is selected, such as based on a distance between the cursor and an initial cursor position, and then determining which specific icon has been selected, the computing device may repeatedly recalculate the distances between the cursor position 507 and one or more icon locations until one of the distances falls below a threshold.
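The first technique, a neutral threshold around the initial cursor position followed by a nearest-location-point test, can be sketched as below. The function name and data layout are illustrative assumptions, and ties here resolve to the first icon encountered, whereas the specification lists several tie-breaking options:

```python
import math

def select_icon(cursor, initial, icon_points, threshold):
    """Temporary selection per FIG. 5a (single location point per
    icon). icon_points maps icon id -> (x, y) location point.
    Returns None while the cursor remains within the threshold
    distance of the initial cursor position; otherwise returns the
    icon whose location point is nearest the cursor."""
    if math.dist(cursor, initial) <= threshold:
        return None  # cursor has not moved far enough to select anything
    return min(icon_points, key=lambda i: math.dist(cursor, icon_points[i]))
```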
FIG. 5b and FIG. 5c illustrate another technique in which distance is used to determine whether an icon is temporarily or permanently selected and, if so, which one. The technique illustrated in FIG. 5a utilizes a single icon location point for each icon. However, multiple icon location points can be utilized for each icon.
FIG. 5b illustrates icon 530 with multiple icon location points 531-535 positioned at its vertices. However, icon location points can be assigned at any positions within an icon, along its edge, or outside of it.
FIG. 5c illustrates a graphical menu 540 having four icons, 541-544. As in FIG. 5b, each of these icons has five icon location points, one at each of its vertices. The position 507 of cursor 510 is illustrated in the figure. Dashed line 548 depicts the distance between cursor position 507 and one of the icon location points, 535, of icon 543. The figure illustrates dashed lines between cursor position 507 and several of the other icon location points of the icons in the graphical menu. In practice, the distance from the cursor location to every icon location point may be determined.
The process of determining whether and which icon would be selected with multiple icon location points per icon may be similar to that used with a single icon location point per icon. In the case of multiple icon location points per icon, the cursor-to-icon-location distance for each icon is the minimum of the distances from its icon location points to the cursor location.
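With multiple location points, each icon's distance becomes the minimum over its points; otherwise the logic is unchanged. As before, the interface is an illustrative assumption:

```python
import math

def select_icon_multi(cursor, initial, icon_points, threshold):
    """Variant of the FIG. 5a technique for icons with multiple
    location points (FIG. 5b/5c). icon_points maps icon id -> list
    of (x, y) points; an icon's distance to the cursor is the
    minimum over its location points."""
    if math.dist(cursor, initial) <= threshold:
        return None  # cursor still near the initial position
    return min(
        icon_points,
        key=lambda i: min(math.dist(cursor, p) for p in icon_points[i]),
    )
```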
Flowcharts
FIG. 6a is a flowchart 600 illustrating one embodiment of a method that could be used for display of, and interaction of a user with, a graphical menu. As discussed above with respect to FIG. 1e, graphical menus can be cascaded, as discussed in reference to the flowchart in FIG. 6b. There is no limit to the number of levels that could be implemented in such a cascade or tree of graphical menus. The method of FIG. 6a may be performed on any suitable computing device, such as one of the computing devices discussed above with reference to FIG. 1a. Depending on the embodiment, the methods of FIGS. 6a and 6b may include fewer or additional blocks, and the blocks may be performed in a different order than is illustrated.
In flowchart 600, a graphical menu module that is configured to display graphical menus is first initiated in block 610. The graphical menu module may include software code that is executed by a computing device, such as a mobile or desktop computing device. The graphical menu module is configured to cause the computing device to display the graphical menu and detect interactions of the user (e.g., a cursor controlled by the user, or a stylus or finger touching the display screen). In one embodiment, the graphical menu module comprises a standalone software application that interfaces with other software applications on a computing device. Alternatively, the graphical menu module may be incorporated into another software application, such as a word processor, graphics application, image viewing application, or any other software application. Alternatively, the graphical menu module could be part of the operating system. Depending on the embodiment, initiation of the graphical menu module might occur as a result of a user's action, for example depression of the right mouse button, or could occur automatically as a result of a program or the operating system initiating the system to obtain user input. In response to initiation of the graphical menu module, a graphical menu is provided via a display device, such as a monitor or screen of a mobile computing device.
In block 612, the graphical menu module determines whether the user has finished using the displayed graphical menu. For example, by releasing the right mouse button (possibly indicating a desire to permanently select a temporarily selected icon and initiate execution of a process associated with the icon), the user may indicate that he has finished with the current instance of the graphical menu system. Alternatively, a user may indicate that he is finished with a graphical menu by moving a cursor off of the graphical menu, such as outside an area of the screen where the graphical menu is displayed. In one embodiment, certain graphical menus may “time out,” so that if no user input is received for a predetermined time period, the user is considered to be finished with the graphical menu and the method continues to block 620.
If the user is not finished using the displayed graphical menu, in block 614 the system determines whether an icon is temporarily selected using, for example, one of the techniques described herein.
If no icon has been temporarily selected, in block 616 the graphical menu is displayed with no icon shown as selected (e.g., no icon is shaded or otherwise distinguished from the other icons). The method then loops back to block 612 and again senses whether the user is finished with the graphical menu.
If an icon has been temporarily selected, in block 618 the graphical menu is displayed with the selected icon displayed in a way that differentiates it from the unselected icons, as described herein. The method then loops back to block 612 and again senses whether the user is finished with the graphical menu.
If the graphical menu module determines that the user is finished in block 612, the method branches to block 620 and determines whether an icon was permanently selected. If an icon is determined to have been permanently selected, the method branches to block 622, where the graphical menu module returns the identity of the permanently selected icon to the program or operating system that the graphical menu module is configured to interact with. In another embodiment, permanent selection of an icon initiates display of a secondary menu comprising a plurality of icons about the current cursor position. Thus, blocks 612-624 may be repeated with respect to the secondary menu in order to determine whether an icon of the secondary menu is selected. The process may be repeated any number of times in relation to any number of different menus that may be navigated to via other graphical menus.
At block 620, if no icon has been permanently selected, the method branches to block 624, where the graphical menu module returns an indication that no icon of the graphical menu was permanently selected to the program or operating system that the graphical menu module is configured to interact with.
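The loop of blocks 612 through 624 can be sketched as an event-driven function. The event tuples and the selection callback are hypothetical interfaces, and the drawing of blocks 616/618 is reduced to a comment:

```python
def run_graphical_menu(events, select_fn):
    """Sketch of flowchart 600 (FIG. 6a). `events` yields
    (cursor_position, finished) pairs from the input device, and
    `select_fn(cursor_position)` returns the temporarily selected
    icon id or None (e.g., one of the region or distance techniques
    described above). Returns the permanently selected icon id, or
    None if the user finished with no icon selected."""
    temp = None
    for cursor, finished in events:
        temp = select_fn(cursor)  # block 614: any temporary selection?
        # blocks 616/618: redraw the menu, highlighting `temp` (omitted)
        if finished:              # block 612: user done with the menu
            return temp           # blocks 620-624: report the result
    return temp
```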
FIG. 6b is a flowchart 602 with logic similar to that of FIG. 6a, except for blocks 630 and 632. In the case that no icon was selected, a secondary menu is displayed in block 630. The user can pick from this secondary menu in block 632. This secondary menu could be a graphical menu, as described herein, or could be a conventional menu, as illustrated in FIG. 7c.
FIG. 7a illustrates a graphical menu superimposed on a homogeneous screen. In this embodiment, the graphical menu 710 comprises eight square icons, 711-718. Cursor 720 is positioned near icon 711, causing icon 711 to be selected using one of the techniques described herein. The selected icon 711 appears color-inverted with respect to the other unselected icons in the graphical menu, allowing the user to easily differentiate the selected icon from unselected icons.
FIG. 7b illustrates a graphical menu superimposed on the complex screen output of a program that called the graphical menu. In particular, the graphical menu is superimposed on the contents of the screen 709, in this case a grayscale image. In addition, the cursor is superimposed on top of both the graphical menu and the underlying image on the screen. In this example, the cursor position is near the initial cursor position and no icon within the graphical menu is selected. In one embodiment, the graphical menu may have some transparency, as illustrated in this figure. In one embodiment, the level of transparency can be selected by the user.
FIG. 7c illustrates user interactions with the graphical menu illustrated in FIG. 7a or FIG. 7b. A screen region 730 on which the graphical menu is superimposed is similar to screen regions 708 and 709 in FIG. 7a and FIG. 7b, respectively. For clarity, the various components of screen region 730 are not annotated, as they are analogous to those illustrated in FIG. 7a and FIG. 7b. Screen region 730 illustrates a graphical menu having eight unselected icons superimposed on a screen region that is, in this case, a grayscale image. The cursor is displayed as well, as before. In this case the cursor is positioned in a region within the graphical menu and no icon within the graphical menu has yet been temporarily selected.
View 731, which is associated with screen region 730, illustrates mouse activity that could have been used to initiate display of the graphical menu within screen region 730. Exemplary mouse 725 includes one or more buttons. In this example, depression of the right mouse button 726, illustrated by arrow 732, causes display of the graphical menu.
View 741 is similar to view 731. While continuing to depress the right mouse button, illustrated by arrow 732, the user moves the mouse to the right, illustrated by motion path 742. This causes rightward motion of the cursor, repositioning it from its position in screen view 730 to that in screen view 740, which in turn causes selection of an icon, as described previously. In comparing the graphical menu in screen view 740 to that in screen view 730, it can be seen that an icon has changed its appearance, indicating to the user that it has been temporarily selected.
View 751 is similar to view 741 but illustrates further mouse movement. In view 751, the user moves mouse 725 superiorly, illustrated by mouse path 752. As in the example illustrated in views 740 and 741, this results in repositioning of the cursor and selection of a different icon within the graphical menu.
View 761 illustrates the case where there is little or no cursor repositioning compared to the original position of the cursor in view 730. In this case, the net movement of the mouse is insufficient for an icon to be selected within the graphical menu, as discussed previously. In this example, release of mouse button 726, illustrated by arrow 762, results in the display of a different menu from which the user can choose. Thus, the user may be presented with a first graphical menu in response to a first action (e.g., depressing the mouse button) and may be presented with a second menu in response to a second action that follows the first action (e.g., releasing the mouse button without temporarily selecting an icon).
Other Contemplated Embodiments
For some of the embodiments illustrated herein, the user input device is a mouse. However, any input device or combination of input devices could be used to control the graphical menus described herein, including: mouse, trackball, keyboard, touch screen, 3D mouse, foot controls, pointing sticks, touchpad, graphics tablet, joystick, brain-computer interfaces, eye-tracking systems, Wii remote, jog dial, and/or steering wheel.
A graphical menu module could be implemented in many situations. For example, a graphical menu module can be implemented within a computer program where the user might use it to choose among options. In a word processing program it might be used to allow the user to choose font options, such as bold, italic, and underline. In a PACS system it might be used to allow users to choose among various predefined display settings for an image.
In another example, a graphical menu module can be implemented within a computer program to allow selection of various operations. For example, in a word processing program it could be used to choose among operations like copy, paste, delete, and indent. In a PACS system it could be used to choose among different operations such as window/level, choose series, region of interest, zoom, pan, and others.
In another example, a graphical menu module can be implemented within an operating system where it could be used to choose among operations like “launch web browser,” “launch word processing program,” “launch spreadsheet program,” and so on.
In another example, a graphical menu module can be implemented as an add-in program or standalone driver, such as a mouse driver, that would allow the use of the system in cases where it had not been directly implemented within a program or operating system. The system may be configured to send key strokes or other inputs to the program or operating system for which it was configured.
The appearance and operation of graphical menus used in the system could vary in many ways. For example, the graphical menu, the icons it includes, and how it operates could vary depending on the context in which it was launched. Graphical menus could differ depending on the program or operating system that utilized the system. In addition they could be configurable by the user. In another example, graphical menus could contain one or more icons. In another example, icons within graphical menus could take many forms, including a computer graphic, a picture, and/or text. In another example, icons within a graphical menu could vary in appearance and size. In another example, different methods could be used to allow a user to visually differentiate selected from unselected icons within a graphical menu. For example, the icons could be differentiated by icon appearance, size, color, brightness, features of text font (such as size, bold, italics), and/or motion or blinking (e.g., the selected icon could blink or shake back and forth on the display).
While depression of the right button of a mouse is used in several examples to initiate display of a graphical menu, many other ways are contemplated to initiate display of a graphical menu. For example, such initiation can be via a key on a keyboard, a button on any input device (with example input devices listed herein), a mouse gesture, a gesture on a touch screen with a finger or stylus, physical motion of the device (for example, shaking a handheld device), a result of picking an icon on another graphical menu, and/or a result of a computer or system operation, rather than the result of the user initiating the action. For example, a computer program or operating system might require the user to provide input and in that case display a graphical menu. In another example, a computer with a battery that is running low might display a graphical menu allowing the user to choose among: continue working, shut down, save all open documents, and initiate hibernation.
After an icon is temporarily selected within a graphical menu, several examples herein illustrate the user permanently selecting that icon by releasing the right mouse button. However, there are many ways that a user could permanently select an icon from a graphical menu, including: removing a stylus or finger from a touch screen, pressing a button or key, a mouse gesture, sound input, cursor movement (for example, slight movement from the initial cursor position toward an icon might result in the icon being temporarily selected, and further movement toward the icon might result in the icon being permanently selected and termination of display of the graphical menu), elapsed time (the system could be configured such that a temporarily selected icon would be permanently selected after it had been temporarily selected for a predetermined time duration, say 100 milliseconds), and/or positioning of the cursor over the icon or a predetermined portion of the icon.
Sound
Sound could be used in several ways with this technique to supplement the use of a graphical menu or substitute for the display of a graphical menu. For example, when any icon is temporarily selected, a sound, such as a beep, could be played.
In another example, when no icon is temporarily selected (e.g., when the user moves the cursor back toward its initial cursor position after temporarily selecting an icon), a sound could be played. This could be different from the sound played when an icon is selected (e.g., temporary selection of an icon could cause a single beep, and subsequent cursor movement that resulted in no icon being selected could result in a double beep).
In another example, different sounds could be played for different icons, including spoken words. This could allow the user to accurately verify selection of an icon without the need for visual verification. For example, a graphical menu within a word processing program might have four choices: “cut”, “copy”, “paste”, and “look up”. As the user repositions the cursor, these options could be spoken. If one of these was chosen and the user repositioned the cursor to another, the sound associated with the new choice would be spoken. If the user repositioned the cursor so that none were chosen, a different phrase could be spoken, such as “no selection”.
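The word processing example above can be sketched as a simple mapping from icons to spoken phrases, with a fallback phrase when no icon is selected. The `speak` function is a placeholder standing in for whatever audio or text-to-speech facility the system provides; all names here are illustrative.

```python
# Illustrative sketch: associate each icon in the word-processing example
# with a spoken phrase, and speak "no selection" when the cursor is
# positioned so that no icon is chosen.
ICON_PHRASES = {
    "cut": "cut",
    "copy": "copy",
    "paste": "paste",
    "look up": "look up",
}

def speak(phrase):
    # Placeholder for a platform text-to-speech or audio-clip call.
    print(phrase)

def announce_selection(icon):
    """Speak and return the phrase for the current selection state.
    icon is None when no icon is temporarily selected."""
    phrase = ICON_PHRASES.get(icon, "no selection")
    speak(phrase)
    return phrase
```

Calling `announce_selection` on every change in the temporarily selected icon gives the audible feedback described above, which also supports the display-free operation discussed next.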
In another example, a system using sound could be constructed in which visual display of the graphical menu was not required. This might be helpful, for example, for blind users, or for drivers or pilots who would want to choose from a menu of options without directing their attention to a display screen.
SUMMARY
All of the processes described above may be embodied in, and fully automated via, software code modules executed by one or more general purpose or specially configured computers. The code modules may be stored in any type of computer-readable medium or other computer storage device. Some or all of the methods may alternatively be embodied in specialized computer hardware. In addition, the components referred to herein may be implemented in hardware, software, firmware, or a combination thereof.
Any process descriptions, elements, or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process. Alternate implementations are included within the scope of the embodiments described herein in which elements or functions may be deleted, executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those skilled in the art.
One skilled in the relevant art will appreciate that the methods and systems described above may be implemented by one or more computing devices, including a memory for storing computer executable components for implementing the processes shown, as well as a processing unit for executing such components. It will further be appreciated that the data and/or components described above may be stored on a computer readable medium and loaded into a memory of a computing device using a drive mechanism, such as a CD-ROM, DVD-ROM, or network interface, for reading such computer readable medium. Further, the components and/or data can be included in a single device or distributed in any manner.
The foregoing description details certain embodiments of the invention. It will be appreciated, however, that no matter how detailed the foregoing appears in text, the invention can be practiced in many ways. As is also stated above, it should be noted that the use of particular terminology when describing certain features or aspects of the invention should not be taken to imply that the terminology is being re-defined herein to be restricted to including any specific characteristics of the features or aspects of the invention with which that terminology is associated.