CROSS REFERENCE TO RELATED APPLICATION
This application claims the benefit of U.S. Provisional Application No. 62/129,639, filed on Mar. 6, 2015, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
This description relates generally to processing device user interfaces, and more particularly to compact portable processing device user interfaces for online marketing.
BACKGROUND
Touchscreens are touch-sensitive electronic visual displays that receive tactile input information entered using a human digit, a special glove, or a stylus. A typical touchscreen can sense touch events including contact or movement on the surface of the screen, such as taps, long touches, swipes, pinches, flicks, other gestures, marks, lines, or geometric shapes. In general, touchscreens enable users to interact directly with images displayed on the screen, rather than through an intermediate device, such as a mouse or a touchpad.
Some existing touchscreens implement resistive touch-sensing technology, while other existing touchscreens implement capacitive, surface acoustic wave, infrared or optical technologies to sense touch events. Touchscreens have been used as input devices in tablet computers, mobile phones, and gaming consoles.
A currently emerging area of application is in compact wearable processing devices, such as wrist-wearable devices, in which the touchscreens typically are of relatively small size. The reduced size of touchscreens on wearable devices creates drawbacks for existing user interface implementations. On the other hand, the constant presence of wearable devices presents opportunities for combining multiple utilities in new ways.
SUMMARY
According to one embodiment of the present invention, a method for navigating time-based offers includes rendering on a touch display a display image of a watch dial, symbols corresponding to each hour on the watch dial, and one or more time indicators, where each symbol corresponds to a time-based offer. The method also includes determining whether or not a touch event sensed at the touch display complies with a predefined gesture, and assigning, in response to the touch event complying with the predefined gesture, an action based on the predefined gesture, a display context, and a temporal context.
According to another embodiment of the present invention, a method for navigating time-based offers includes rendering a composite display image comprising a watch dial, a bezel circumferentially surrounding the watch dial and including symbols corresponding to each hour of the watch dial, each symbol corresponding to a time-based offer, and a seconds indicator circumferentially surrounded by the watch dial. The method also includes determining whether or not a first touch event complies with a predefined gesture and assigning, in response to the first touch event complying with the predefined gesture, an action associated with the predefined gesture, a display context, and a temporal context.
According to yet another embodiment of the present invention, a user interface for navigating time-based offers includes a rendition module that renders a display image of a watch dial and symbols corresponding to each hour, each symbol corresponding to a time-based offer. The user interface also includes a recognition module that determines whether or not a touch event complies with a predefined gesture, and an interpretation module that assigns, in response to the touch event complying with the predefined gesture, an action associated with the predefined gesture, a display context, and a temporal context.
The details of one or more embodiments of the disclosure are set forth in the accompanying drawings and the description below. Other features, objects, and advantages of the disclosure will be apparent from the description and drawings, and from the claims.
DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic view illustrating an exemplary user interface manager in accordance with an embodiment of the present disclosure.
FIG. 2 is a schematic view depicting an exemplary general computing system that can implement the user interface manager of FIG. 1 in accordance with an embodiment of the present disclosure.
FIG. 3 is an illustration of an exemplary compact processing device that can implement the user interface manager of FIG. 1 in accordance with an embodiment of the present disclosure.
FIG. 4 is an illustration of an exemplary communications network that can be employed by the user interface manager of FIG. 1 in accordance with an embodiment of the present disclosure.
FIG. 5A is an illustration of an exemplary watch face image in accordance with an embodiment of the present disclosure.
FIG. 5B is an illustration of an exemplary current offer highlight display image in accordance with an embodiment of the present disclosure.
FIG. 5C is an illustration of an exemplary offer detail display image in accordance with an embodiment of the present disclosure.
FIG. 6A is an illustration of an exemplary noncurrent offer highlight display image in accordance with an embodiment of the present disclosure.
FIG. 6B is an illustration of another exemplary offer detail display image in accordance with an embodiment of the present disclosure.
FIG. 6C is an illustration of another exemplary noncurrent offer highlight display image in accordance with an embodiment of the present disclosure.
FIG. 6D is an illustration of another exemplary offer detail display image in accordance with an embodiment of the present disclosure.
FIG. 7 is a flowchart representing an exemplary method of displaying time-based special offers synchronized with a visual time display to market products and/or services in accordance with an embodiment of the present disclosure.
FIG. 8 is a flowchart representing an exemplary method of displaying time-based special offers with a watch face display image to market products and/or services in accordance with an embodiment of the present disclosure.
FIG. 9 is a flowchart representing an exemplary method of displaying a time-based offer highlight display image with a visual time display to market products and/or services in accordance with an embodiment of the present disclosure.
FIG. 10 is a flowchart representing an exemplary method of displaying a time-based offer detail image to market products and/or services in accordance with an embodiment of the present disclosure.
DETAILED DESCRIPTION
An embodiment of the present disclosure provides a user interface for conveniently displaying special offers for consumer purchases, including time-based offers, which are synchronized with a visual time display, such as an analog watch dial. The user interface enables user interactions with a touch display, such as a touch screen or touch panel, to manipulate the display of time-related offerings of products and/or services.
An embodiment of the present disclosure is shown in FIG. 1, which illustrates an exemplary user interface manager 10 that employs a touch display gesture response process in order to display time-based special offers synchronized with a visual time display to market products and services. The user interface manager 10 includes an acquisition module 12, a recognition module 14, an interpretation module 16, a layout module 18, a composition module 20, and a rendition module 22.
The acquisition module 12 acquires positional information regarding touch events over time from a touch display of a compact processing device, such as a mobile device or a wearable device. The recognition module 14 determines the category or type of the touch event. Touch events include sensed contact or movement on the surface of the touch display that correspond to taps, long touches, swipes, pinches, flicks, other gestures, marks, lines, geometric shapes, or the like.
For example, in some embodiments, the recognition module 14 compares the acquired positional information to multiple gesture templates to determine whether the touch event fits into any of various predetermined event envelopes. As another example, in an embodiment, the recognition module 14 analyzes the acquired positional information with respect to a list of predetermined event definitions, or rules, to determine whether the touch event fits into a predefined category or type of gesture.
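Purely as an illustrative, non-limiting sketch, a rule-based classifier of this kind might distinguish taps, long presses, and swipes by contact duration and displacement. The Kotlin sketch below uses hypothetical type names and threshold values that are not taken from the disclosure.

```kotlin
import kotlin.math.hypot

// Hypothetical sample of a touch point: screen coordinates plus a timestamp.
data class TouchSample(val x: Float, val y: Float, val timeMs: Long)

enum class Gesture { TAP, LONG_PRESS, SWIPE, UNKNOWN }

// Classify a sequence of acquired touch samples against simple predefined rules.
// The displacement and duration thresholds are illustrative assumptions only.
fun classify(samples: List<TouchSample>): Gesture {
    if (samples.size < 2) return Gesture.UNKNOWN
    val first = samples.first()
    val last = samples.last()
    val durationMs = last.timeMs - first.timeMs
    val displacement = hypot(last.x - first.x, last.y - first.y)
    return when {
        displacement > 40f -> Gesture.SWIPE      // sustained movement across the screen
        durationMs >= 500 -> Gesture.LONG_PRESS  // little movement, long contact time
        else -> Gesture.TAP                      // little movement, short contact time
    }
}
```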
If the touch event is determined to comply with a known event type or category, then the interpretation module 16 evaluates the context of the touch event with respect to the display image and time to decide on the appropriate action to be carried out. For example, in some embodiments, the interpretation module 16 identifies the display image that initially was rendered on the touch display at the moment in time that the touch event was initiated. In certain embodiments, the interpretation module 16 further identifies a sequential history of one or more display images rendered immediately preceding the current display image rendered at the time the touch event was initiated. In various embodiments, the temporal context includes, for example, the time at which a selection is made, a countdown associated with a sale, or whether the selected offer is available during the current hour or corresponds to a noncurrent hour. The interpretation, or meaning, to be associated with the touch event depends in each case on the touch display context and the temporal context when the user initiated or completed the event.
Thus, once the context is determined, the interpretation module 16 associates, or assigns, a specific meaning to the event. As a result, when a user performs a single-digit swipe from right-to-left on the touch display, the intended significance of the swipe event may depend, for example, on the product image currently displayed when the swipe was initiated. In some instances, the intended significance of the swipe event may also depend on the preceding product or other image that was rendered immediately before the currently displayed product image.
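As a non-limiting sketch, the assignment performed by the interpretation module 16 can be pictured as a lookup keyed on the gesture type, the display context, and the temporal context. The context and action names below are hypothetical, and the Gesture type is reused from the previous sketch.

```kotlin
// Hypothetical display contexts for the screens described below: the watch face,
// the offer highlight image, and the offer detail image.
enum class DisplayContext { WATCH_FACE, OFFER_HIGHLIGHT, OFFER_DETAIL }

// Hypothetical temporal context: whether the touched offer belongs to the current hour.
data class TemporalContext(val isCurrentHourOffer: Boolean)

enum class Action { SHOW_CURRENT_HIGHLIGHT, SHOW_NONCURRENT_HIGHLIGHT, SHOW_DETAIL, SHOW_WATCH_FACE, NONE }

// Assign a meaning to a recognized gesture based on the display and temporal context.
fun interpret(gesture: Gesture, display: DisplayContext, temporal: TemporalContext): Action = when {
    gesture == Gesture.TAP && display == DisplayContext.WATCH_FACE ->
        Action.SHOW_CURRENT_HIGHLIGHT
    gesture == Gesture.LONG_PRESS && display == DisplayContext.OFFER_HIGHLIGHT && temporal.isCurrentHourOffer ->
        Action.SHOW_DETAIL
    gesture == Gesture.TAP && display == DisplayContext.OFFER_HIGHLIGHT && !temporal.isCurrentHourOffer ->
        Action.SHOW_NONCURRENT_HIGHLIGHT
    gesture == Gesture.TAP && display == DisplayContext.OFFER_DETAIL ->
        Action.SHOW_WATCH_FACE
    else -> Action.NONE
}
```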
The layout module 18 creates one or more image layouts, or views, in response to the significance, meaning or signification assigned to the touch event by the interpretation module 16. Each layout, or view, can occupy the entire display screen or only a portion of the display screen. Thus, in some instances, the views are designed for simultaneous rendering on different portions of the display screen. In other instances, one view is designed to be superimposed over another view. For example, an added view can be wholly opaque, partly opaque and partly translucent, or wholly translucent, such that another layout can be simultaneously rendered through part or all of the added view.
The composition module 20 combines the individual layouts, or views, to be displayed together, and adds other content, such as textual content. For example, the composition module 20 may superimpose a product image associated with a temporary offer over a portion of a watch face showing the current time and add the reduced price of the item to compose a complete composite display image.
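A minimal sketch of such superimposition follows, assuming layers modeled as grayscale pixel buffers with an overall opacity; this is an illustrative simplification rather than the disclosed implementation.

```kotlin
// Layers modeled as grayscale pixel buffers with an overall opacity; a real
// compositor would operate on full-color frame buffers, so this is only a sketch.
data class Layer(val pixels: FloatArray, val opacity: Float)

// Superimpose a partly translucent layer over a base layer by per-pixel blending.
fun composite(base: FloatArray, overlay: Layer): FloatArray =
    FloatArray(base.size) { i ->
        base[i] * (1f - overlay.opacity) + overlay.pixels[i] * overlay.opacity
    }

fun main() {
    val watchFace = FloatArray(4) { 0.2f }                          // subdued dial pixels
    val offerImage = Layer(FloatArray(4) { 0.9f }, opacity = 0.8f)  // bright, mostly opaque product image
    println(composite(watchFace, offerImage).joinToString())        // roughly 0.76 for each pixel
}
```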
The rendition module 22 performs formatting procedures on the display image and forwards the final display image to the compact processing device to be visually rendered on the touch display. For example, in some embodiments, the rendition module 22 translates the image file from one image file format to another image file format to ensure compatibility with the touch display of the compact processing device. Various embodiments are compatible with numerous image file formats known in the art. In addition, in some embodiments, the rendition module 22 performs a data compression procedure on the display image.
As illustrated in FIG. 2, an exemplary general computing device 20 that can be employed in the user interface manager 10 of FIG. 1 includes a processor 32, a memory 34, a touch display 36, a storage 38, and a network interface 40. The various components of the computing device 20 are coupled by a local data link 42, which in various embodiments incorporates, for example, an address bus, a data bus, a serial bus, a parallel bus, or any combination of these.
The computing device 20 can be used, for example, to implement the functions of the user interface manager 10 of FIG. 1. Programming code, such as source code, object code or executable code, stored on a computer-readable medium, such as the storage 38 or a peripheral storage component coupled to the computing device 20, can be loaded into the memory 34 and executed by the processor 32 in order to perform the functions of the user interface manager 10. In various embodiments, the computing device 20 can include, for example, a mobile device, such as a personal digital assistant (PDA), a cellular telephone, a smart phone, a wearable device, or the like, with a relatively compact touch display.
Referring to FIG. 3, an exemplary compact processing device 50 is shown with an attached adjustable strap 52 having a latching mechanism 54 for securing the compact processing device 50 to a bodily appendage, such as a human user's wrist, arm or leg, in a wearable configuration. The compact processing device 50 includes a compact touch display screen 56 user interface to render display images and receive tactile input information, such as touch events. In various embodiments, the tactile input information can be entered, for example, using a human user's digit, a special glove, a stylus, or the like, in accordance with touch display technologies known in the art.
As shown in FIG. 4, the computing device 20 or the compact processing device 50 can be communicatively coupled to a communications network 60. For example, in some configurations, the compact processing device 50 communicates with a remote server 62 to access data, such as marketing information, product information, service information or pricing information from a remote database 64, such as an online shopping website data center. In various embodiments, the communication network 60 can include any viable combination of devices and systems capable of linking computer-based systems, such as the Internet; an intranet or extranet; a local area network (LAN); a wide area network (WAN); a direct cable connection; a private network; a public network; an Ethernet-based system; a token ring; a value-added network; a telephony-based system, including, for example, T1 or E1 devices; an Asynchronous Transfer Mode (ATM) network; a wired system; a wireless system; an optical system; a combination of any number of distributed processing networks or systems or the like.
Referring now to FIG. 5A, an exemplary watch face display image 70 user interface rendered on the compact touch display screen 56 includes a watch dial 72, or clock dial, with numbers 74 one through twelve correlating to hours at even intervals, beginning to the right of the top dead center position and increasing in the clockwise direction. The watch dial 72 is further divided into sixty equal divisions indicated by markings 76, such as dashes, around the watch dial 72 correlating to minutes and seconds.
Of course, in various embodiments, the watch face display image 70 can have other numerical indicators correlating to hours, such as Roman numerals, or may not include numerical indicators. Similarly, in some embodiments the watch dial 72 can be divided into a different number of divisions, for example, twelve divisions corresponding to the hours on the watch dial 72, or the watch dial 72 may not include division markings 76.
The watch face display image 70 also includes an hour indicator 78, such as an hour hand, a minute indicator 80, such as a minute hand, a seconds indicator 82, such as an incremental circular indicator, and a digital time indicator 84 of the current time, for example, in hours and minutes. Once again, in other embodiments, the watch dial 72 can include any type of time indicator, including incremental circular hour or minute indicators, a second hand, or any other analog or digital form that can effectively communicate the passage of time.
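For illustration only, the angles of the hour, minute, and seconds indicators can be derived from the current time as in the following sketch; the mapping shown is conventional analog-dial arithmetic, not a requirement of the disclosure.

```kotlin
// Angles of the hour, minute, and seconds indicators, measured clockwise in
// degrees from the twelve o'clock position.
data class DialAngles(val hourDeg: Float, val minuteDeg: Float, val secondDeg: Float)

fun dialAngles(hour: Int, minute: Int, second: Int): DialAngles {
    val secondDeg = second * 6f                      // 360 degrees / 60 seconds
    val minuteDeg = minute * 6f + second * 0.1f      // 360 degrees / 60 minutes, advanced by seconds
    val hourDeg = (hour % 12) * 30f + minute * 0.5f  // 360 degrees / 12 hours, advanced by minutes
    return DialAngles(hourDeg, minuteDeg, secondDeg)
}

fun main() {
    // At 10:37:17 the hour hand sits at roughly 318.5 degrees, the minute hand
    // at roughly 223.7 degrees, and the seconds indicator at 102.0 degrees.
    println(dialAngles(10, 37, 17))
}
```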
In addition, the watch face display image 70 includes a virtual bezel with marketing symbols 86, such as product or service icons, corresponding to discrete time periods, for example, each hour index on the watch dial 72. The marketing symbols 86 correspond to time-based offers, such as time-sensitive sales, for example, hourly special offers of products or services. For example, in an embodiment the marketing symbols 86 represent a "product of the hour" available for consumer purchase at a reduced price during a discrete time period, such as a music download or compact disc, an article of women's clothing, a camera, a mobile phone, a personal computer, an automotive accessory, a gift item, an article of men's clothing, a timepiece, a handbag, a household item, or jewelry. In other embodiments, the marketing symbols 86 may represent any product, service or category of products or services to be offered. Of course, in yet other embodiments the time-based offers can correspond to any discrete time period, including hourly offers, daily offers, quarter-hourly offers, or offers associated with any other equal or differing discrete periods of time.
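A hedged sketch of placing the twelve bezel symbols at the hour positions around the dial follows; the center coordinates and bezel radius are illustrative assumptions.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// Center of the bezel symbol for a given hour index (1 through 12), placed on a
// circle of the given radius around the dial center. Screen coordinates are
// assumed, with y increasing downward, so twelve o'clock points straight up.
fun symbolCenter(hourIndex: Int, centerX: Float, centerY: Float, bezelRadius: Float): Pair<Float, Float> {
    val angle = (hourIndex % 12) * (2 * PI / 12) - PI / 2  // radians, clockwise from twelve o'clock
    val x = centerX + bezelRadius * cos(angle).toFloat()
    val y = centerY + bezelRadius * sin(angle).toFloat()
    return x to y
}

fun main() {
    for (h in 1..12) {
        val (x, y) = symbolCenter(h, centerX = 160f, centerY = 160f, bezelRadius = 140f)
        println("hour $h -> (%.1f, %.1f)".format(x, y))
    }
}
```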
The watch face display image 70 further includes an end-of-offer indicator 88, or "end-of-sale" indicator, such as a digital indication of the time remaining, in hours, minutes and seconds, during which the special offer will be available. Once again, in other embodiments, the end-of-offer indicator 88 may include any useful representation of the time, for example, the time remaining for the special offer or the time at which the special offer will terminate.
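Assuming hourly offers that end at the top of the hour, the end-of-offer indicator 88 can be driven by a countdown such as the following sketch; the HH:MM:SS formatting and the hourly boundary are assumptions for illustration.

```kotlin
// Time remaining until the top of the next hour, formatted as HH:MM:SS.
fun timeRemainingInHour(minute: Int, second: Int): String {
    val elapsed = minute * 60 + second  // seconds elapsed in the current hour
    val remaining = 3600 - elapsed      // seconds until the hourly offer ends
    return "%02d:%02d:%02d".format(remaining / 3600, (remaining % 3600) / 60, remaining % 60)
}

fun main() {
    println(timeRemainingInHour(37, 17))  // prints 00:22:43
}
```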
Areas of the watch face display image 70 are sensitized, or defined and associated with types or categories of touch events and related responsive actions. For example, the area including and immediately surrounding each marketing symbol 86 is sensitized to be activated by tap or swipe gestures over the bezel.
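One possible way to realize such sensitized areas, sketched below, is to hit-test each touch point against the bezel ring and map it to the nearest hour symbol; the radii and sector width are illustrative assumptions.

```kotlin
import kotlin.math.atan2
import kotlin.math.hypot

// Decide whether a touch point falls within the sensitized bezel ring and, if so,
// which hour symbol it selects. The radii and the 30-degree sectors are assumptions.
fun hitBezelSymbol(x: Float, y: Float, centerX: Float, centerY: Float,
                   innerRadius: Float, outerRadius: Float): Int? {
    val dx = x - centerX
    val dy = y - centerY
    val r = hypot(dx, dy)
    if (r < innerRadius || r > outerRadius) return null   // outside the sensitized ring

    // Angle clockwise from twelve o'clock, normalized to the range [0, 360).
    val deg = (Math.toDegrees(atan2(dy.toDouble(), dx.toDouble())) + 90.0 + 360.0) % 360.0
    val hour = ((deg + 15.0) / 30.0).toInt() % 12         // nearest hour sector
    return if (hour == 0) 12 else hour
}

fun main() {
    println(hitBezelSymbol(160f, 25f, 160f, 160f, 120f, 155f))   // near the top of the bezel: 12
    println(hitBezelSymbol(295f, 160f, 160f, 160f, 120f, 155f))  // right-hand edge of the bezel: 3
}
```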
Referring to FIG. 5B, an exemplary current product offer highlight display image 90 user interface rendered on the compact touch display screen 56 includes a watch dial 72, numbers 74 one through twelve correlating to hours, markings 76 correlating to minutes, and marketing symbols 86, or icons, corresponding to each hour indicator. However, the marketing symbol 86 at the ten o'clock location, or position, is a highlighted symbol 92, for example, rendered in a brighter tone, with greater contrast or color intensity than the marketing symbols 86 at the one o'clock through nine o'clock, eleven o'clock and twelve o'clock positions.
In various embodiments, the highlighted symbol 92 may be distinguished from the remaining marketing symbols 86 in any manner that permits the touch display to render differentiated symbols, such as a different color, brightness, contrast, intensity, or an animation scheme, for example, blinking, or the like.
In addition, the offer highlight display image 90 includes an offer image 94 in the center of the watch dial 72, in place of some or all of the time indicators, such as the hour and minute indicators 78, 80 of the watch face display image 70 of FIG. 5A. For example, the offer image 94 represents a product, such as a "product of the hour," or service available at a reduced price during the hour corresponding to the highlighted symbol 92, for example from 10:00:00 a.m. until 10:59:59 a.m.
In some instances, the highlighted symbol 92 and the offer image 94 correspond to the current hour at the moment in time the offer highlight display image 90 is viewed. In other instances, the highlighted symbol 92 and the offer image 94 correspond to a marketing symbol 86 or number 74 other than the current hour, as selected by a user by way of a touch event.
When the highlighted symbol 92 and the offer image 94 correspond to the current hour at the time the offer highlight display image 90 is being rendered, the marking 76 of FIG. 5A that corresponds to the current hour is transformed into a highlighted marking 96, such as an enlarged, brightened dot of a distinct color or other more prominent shape. Further, in an embodiment, the offer highlight display image 90 retains the seconds indicator 82, such as an incremental circular indicator, partially surrounding the offer image 94 in the center of the watch dial 72. The highlighted marking 96 and the seconds indicator can advantageously emphasize a sense of urgency associated with the time-based special offer.
Areas of the product offer highlight display image 90 are sensitized, or defined and associated with types or categories of touch events and related responsive actions. For example, the area including and immediately surrounding each marketing symbol 86 is sensitized to be activated by tap or swipe gestures over the bezel, and the central portion of the highlight display image 90 is sensitized to be activated by tap gestures over the offer image 94.
Referring to FIG. 5C, an exemplary offer detail display image 100 user interface rendered on the compact touch display screen 56 includes an enlarged offer detail image 102, such as a product image or a service image corresponding to a product or service offer selected by the user by way of a touch event on a previously rendered display screen. Because of the relatively small size of the compact touch display screen 56, the offer detail image 102 rendered in the offer detail display image 100 occupies a substantial portion of the screen space, for example, a full-screen view, greater than ninety percent of the screen space, greater than seventy-five percent of the screen space, greater than half of the screen space, or the like.
In an embodiment, the watch dial 72, numbers 74, markings 76, digital time indicator 84, marketing symbols 86 and end-of-offer indicator 88 of FIG. 5A, as well as the time indicators, are replaced by the offer detail image 102. In other embodiments, any portion or all of these features may be partially or fully visible through, around or along with the offer detail image 102. For example, in one embodiment, portions of the offer detail image 102 may be partially opaque or translucent, allowing the watch dial 72 to be visible through portions of the offer detail image 102. In another embodiment, the digital time indicator 84 and the end-of-offer indicator 88, for example, may remain partially or fully visible through or below the offer detail image 102.
In addition, the offer detail display image 100 includes price information 104 and a purchase button 106, for example, superimposed over a portion of the offer detail image 102. For example, in an embodiment, the price information 104 includes a textual display of the crossed-out regular price of the product or service as well as the time-based special offer price for the product or service.
In one embodiment, the area surrounding the price information is distinguished by a unique color to indicate an area of the compact touch display screen 56 that may be activated by a touch event in order to place an order for the product or service. In other embodiments, the price information 104 or the purchase button 106 may be separate images, and each of these may be either superimposed over the offer detail image 102 or rendered beside, above or below the offer detail image 102. Further, in an embodiment, the purchase button 106 is animated to flash back and forth between the price information 104 and a marketing message, such as, "Buy Now," to encourage the user to immediately purchase the product or service.
Areas of the offer detail display image 100 are sensitized, or defined and associated with types or categories of touch events and related responsive actions. For example, the area including the purchase button 106 is sensitized to be activated by tap gestures, and the detail display image 100 is sensitized to be activated by tap or swipe gestures.
Referring to FIG. 6A, an exemplary noncurrent product offer highlight display image 110 user interface rendered on the compact touch display screen 56 is similar to the current product offer highlight display image 90 of FIG. 5B. This example includes an offer image 112 of a household item special offer corresponding to the eleven o'clock position. However, in the noncurrent version, the marking 76 associated with the hour or position corresponding to the displayed offer highlight display image 110 is not highlighted.
Referring to FIG. 6B, another exemplary offer detail display image 120 user interface rendered on the compact touch display screen 56 is similar to the offer detail display image 100 of FIG. 5C. This example includes an enlarged offer detail image 122 of the household item special offer.
Referring to FIG. 6C, another exemplary noncurrent product offer highlight display image 130 user interface rendered on the compact touch display screen 56 is similar to the offer highlight display image 110 of FIG. 6A. This example includes another offer image 132 of a jewelry item special offer corresponding to the twelve o'clock position.
Referring to FIG. 6D, yet another exemplary offer detail display image 140 user interface rendered on the compact touch display screen 56 is similar to the offer detail display image 120 of FIG. 6B. This example includes another enlarged offer detail image 142 of the jewelry item special offer.
Referring now to FIG. 7, an exemplary process flow is illustrated that may be performed, for example, by the user interface manager 10 of FIG. 1 to implement an embodiment of the method described in this disclosure for displaying time-based special offers synchronized with a visual time display to market products and services. The process begins at block 150, where a display image is rendered on a touch display. For example, in one instance, the display image depicts a watch face; in another instance, the display image presents a highlight image, or offer image; in yet another instance, the display image presents an offer detail image.
In block 152, positional information is acquired from the touch display over a period of time. A touch event is detected, in block 154, and assigned to a category or type of gesture based on compliance with predefined definitions, or rules. In block 156, the display context and time context regarding the gesture are evaluated. A specific significance, meaning, or signification is assigned to the touch event, in block 158, based on the gesture type, display context and temporal context.
In accordance with the signification associated with the touch event, one or more actions are carried out. For example, in block 160, image layouts are created in accordance with the signification associated with the touch event. In some embodiments, a remote server or database is accessed in order to retrieve marketing information, product or service information, pricing information, or the like, for inclusion in the image layouts. For example, in a preferred embodiment, time-based product and pricing information is retrieved from an online shopping website data center and included in the image layouts.
A composite image is created from the layouts in block 162. For example, in various instances, multiple image layouts representing components of a watch face are superimposed, or a product highlight image layout is superimposed over a watch face. The display image is updated, or re-rendered, with the composite image in block 164.
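The loop formed by blocks 150 through 164 can be summarized, purely as an illustrative skeleton with placeholder types, screen names, and rules, as follows.

```kotlin
// Placeholder interface standing in for the touch display; acquireSamples returns
// positional information gathered over a period of time.
interface TouchDisplay {
    fun render(image: String)
    fun acquireSamples(): List<Pair<Float, Float>>
}

// One pass per touch event: classify the gesture (block 154), interpret it against
// the currently displayed image (blocks 156-158), then build, compose, and re-render
// the resulting image (blocks 160-164). The rules below are placeholders.
fun runInterfaceLoop(display: TouchDisplay, passes: Int) {
    var current = "watchFace"
    display.render(current)                                    // block 150
    repeat(passes) {
        val samples = display.acquireSamples()                 // block 152
        val gesture = if (samples.size > 5) "swipe" else "tap" // crude placeholder classification
        current = when (gesture to current) {
            "tap" to "watchFace" -> "offerHighlight"           // FIG. 8 path
            "tap" to "offerHighlight" -> "offerDetail"         // simplified; FIG. 9 uses a long press
            "swipe" to "offerDetail" -> "offerDetail"          // FIG. 10 next/previous offer
            "tap" to "offerDetail" -> "watchFace"              // FIG. 10 return path
            else -> current
        }
        display.render(current)                                // block 164
    }
}

fun main() {
    val fake = object : TouchDisplay {
        override fun render(image: String) = println("render: $image")
        override fun acquireSamples() = listOf(0f to 0f, 1f to 1f)  // always looks like a tap
    }
    runInterfaceLoop(fake, passes = 3)
}
```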
In an embodiment, an essentially opaque, black layout provides the backdrop to the watch dial layout, the bezel layout, and the offer highlight image layout. The majority of the bezel layout, including the noncurrent-hour symbols, and the majority of the watch dial are shown in relatively subdued tones with relatively low luminescence, for example, medium or dark gray. The current-hour symbol and the offer highlight image, on the other hand, are rendered in relatively bright tones with relatively high luminescence to provide sharp contrast with the opaque periphery and subdued elements. As a result, the current-hour symbol and offer highlight image stand out from the remainder of the screen, for example, providing a pseudo-three-dimensional effect.
Referring to FIG. 8, an exemplary process flow is illustrated that may be performed, for example, by the user interface manager 10 of FIG. 1 to implement an embodiment of the method described in this disclosure for displaying time-based special offers synchronized with a visual time display to market products and/or services. The process begins at block 170, where a watch face image is rendered on a touch display, such as the exemplary watch face shown in FIG. 5A.
A touch event is detected in block 172. A determination is made, in block 174, regarding whether or not the touch event complies with a predefined gesture associated with a tap on the area of the touch display that corresponds to the central portion of the watch face surrounded by the watch dial. If so, since the watch face was currently displayed at the time the gesture was initiated, then in block 176 a highlight image, such as a product offer image or a service offer image that corresponds to the current hour, is rendered on the touch display.
Referring to FIG. 9, an exemplary process flow is illustrated that may be performed, for example, by the user interface manager 10 of FIG. 1 to implement an embodiment of the method described in this disclosure for displaying time-based special offers synchronized with a visual time display to market products and services. The process begins at block 180, where a highlight image is rendered on a touch display, such as the exemplary product offer and watch dial composite image shown in FIG. 5B.
A touch event is detected in block 182. A determination is made, in block 184, regarding whether or not the touch event complies with a predefined gesture associated with a long press on the area of the touch display that corresponds to the product or service offer image, which is circumferentially surrounded, for example, by the seconds indicator and watch dial. If so, since the highlight image was currently displayed at the time the gesture was initiated, then in block 186 an offer detail image, such as the product image or service image that corresponds to the current hour, is rendered on the touch display.
If not, another determination is made, in block 188, regarding whether or not the touch event complies with a predefined gesture associated with a long press on the area of the touch display that corresponds to the highlighted symbol, or icon, that corresponds to the current hour. If so, since the highlight image was currently displayed at the time the gesture was initiated, then in block 190 the offer detail image that corresponds to the current hour is rendered on the touch display.
If not, yet another determination is made, in block 192, regarding whether or not the touch event complies with a predefined gesture associated with a tap on the area of the touch display that corresponds to another symbol that corresponds to another hour. If so, since the highlight image was currently displayed at the time the gesture was initiated, then in block 194 the highlight image that corresponds to the selected symbol is rendered on the touch display, such as the exemplary noncurrent product offer highlight display image 110 of FIG. 6A.
If not, an additional determination is made, in block 196, regarding whether or not the touch event complies with a predefined gesture associated with a swipe across an area of the touch display that corresponds to the symbols corresponding to two or more hours in sequence. For example, in various instances the touch event may include a clockwise swipe across multiple symbols or a counterclockwise swipe across multiple symbols. If so, since the highlight image was currently displayed at the time the gesture was initiated, then in block 198 the sequence of highlight images with offer images corresponding to the passed-over symbols is rendered on the touch display while the swipe is performed, and the highlight image corresponding to the final passed-over symbol remains on the touch display after the swipe has been terminated.
For example, when a swipe gesture is performed in an arc across the symbols corresponding to the ten o'clock, eleven o'clock and twelve o'clock positions, respectively, along the virtual bezel shown in FIG. 5B, the highlight images shown in FIGS. 5B, 6A and 6C are shown in sequence on the touch display as the swipe gesture passes over each of the correlated symbols. If the swipe gesture is terminated before reaching the symbol at the one o'clock position, then the highlight image that corresponds to the twelve o'clock position remains on the touch display.
If not, a further determination is made, in block 200, regarding whether or not the touch event complies with a predefined gesture associated with a tap on the area of the touch display that corresponds to the highlight image. If so, then in block 202 the watch face is once again rendered on the touch display. If not, a further determination is made, in block 204, regarding whether or not the touch event complies with a predefined gesture associated with a tap on the area of the touch display that corresponds to the highlighted symbol. If so, then in block 206 the watch face is once again rendered on the touch display. Otherwise, monitoring of the touch display continues until another touch event is detected in block 182.
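The decision sequence of blocks 184 through 206 can be collected into a single handler, sketched below with hypothetical gesture and target names that are not part of the disclosure.

```kotlin
// Possible outcomes of a touch event on the offer highlight display image.
sealed class HighlightResult
data class ShowDetail(val hour: Int) : HighlightResult()
data class ShowHighlight(val hour: Int) : HighlightResult()
object ShowWatchFace : HighlightResult()
object KeepMonitoring : HighlightResult()

// gesture: "tap", "longPress", or "swipe"; target: "offerImage", "currentSymbol",
// "otherSymbol", or "highlightImage". sweptHours lists the hour symbols passed
// over by a swipe, in order.
fun handleHighlightGesture(
    gesture: String,
    target: String,
    currentHour: Int,
    touchedHour: Int,
    sweptHours: List<Int> = emptyList()
): HighlightResult = when {
    gesture == "longPress" && target == "offerImage" -> ShowDetail(currentHour)        // block 186
    gesture == "longPress" && target == "currentSymbol" -> ShowDetail(currentHour)     // block 190
    gesture == "tap" && target == "otherSymbol" -> ShowHighlight(touchedHour)          // block 194
    gesture == "swipe" && sweptHours.isNotEmpty() -> ShowHighlight(sweptHours.last())  // block 198
    gesture == "tap" && target == "highlightImage" -> ShowWatchFace                    // block 202
    gesture == "tap" && target == "currentSymbol" -> ShowWatchFace                     // block 206
    else -> KeepMonitoring
}

fun main() {
    println(handleHighlightGesture("swipe", "bezel", currentHour = 10, touchedHour = 10,
                                   sweptHours = listOf(10, 11, 12)))  // ShowHighlight(hour=12)
}
```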
Referring to FIG. 10, an exemplary process flow is illustrated that may be performed, for example, by the user interface manager 10 of FIG. 1 to implement an embodiment of the method described in this disclosure for displaying time-based special offers synchronized with a visual time display to market products and services. The process begins at block 210, where an offer detail image is rendered on a touch display, such as the exemplary detail product offer image shown in FIG. 5C.
A touch event is detected in block 212. A determination is made, in block 214, regarding whether or not the touch event complies with a predefined gesture associated with a swipe from left-to-right across the area of the touch display that corresponds to the offer detail image. If so, since the offer detail image was currently displayed at the time the gesture was initiated, then in block 216 a next offer detail image, such as the product image or service image that corresponds to the successive hour, is rendered on the touch display. For example, if the detail product image associated with the eleven o'clock hour is initially displayed, then the next detail product image associated with the twelve o'clock hour is rendered in response to a left-to-right swipe gesture.
If not, another determination is made, in block 218, regarding whether or not the touch event complies with a predefined gesture associated with a swipe from right-to-left across the area of the touch display that corresponds to the offer detail image. If so, since the offer detail image was currently displayed at the time the gesture was initiated, then in block 220 a previous offer detail image, such as the product image or service image that corresponds to the preceding hour, is rendered on the touch display. For example, if the detail product image associated with the eleven o'clock hour is initially displayed, then the previous detail product image associated with the ten o'clock hour is rendered in response to a right-to-left swipe gesture.
If not, an additional determination is made, in block 222, regarding whether or not the touch event complies with a predefined gesture associated with a tap over an area of the touch display that corresponds to the purchase button layout. If so, since the offer detail image was currently displayed at the time the gesture was initiated, then in block 224 a purchase, or order, transaction is carried out. For example, a purchase and sale transaction is performed between the user, or consumer, and the product or service provider.
If not, a further determination is made, in block 226, regarding whether or not the touch event complies with a predefined gesture associated with a tap on the area of the touch display that corresponds to a central portion of the touch display over the offer detail image. If so, since the offer detail image was currently displayed at the time the gesture was initiated, then in block 228 the watch face is once again rendered on the touch display.
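Similarly, the decision sequence of blocks 214 through 228 can be sketched as a handler over the offer detail image, with the hour arithmetic wrapping around the twelve-hour dial; the names below are illustrative only.

```kotlin
// Hour arithmetic wrapping around the twelve-hour dial.
fun nextHour(hour: Int): Int = if (hour == 12) 1 else hour + 1
fun previousHour(hour: Int): Int = if (hour == 1) 12 else hour - 1

// gesture: "swipeLeftToRight", "swipeRightToLeft", or "tap"; target: "purchaseButton"
// or "detailImage"; displayedHour is the hour of the offer detail image on screen.
fun handleDetailGesture(gesture: String, target: String, displayedHour: Int): String = when {
    gesture == "swipeLeftToRight" -> "show detail for hour ${nextHour(displayedHour)}"      // block 216
    gesture == "swipeRightToLeft" -> "show detail for hour ${previousHour(displayedHour)}"  // block 220
    gesture == "tap" && target == "purchaseButton" -> "carry out purchase transaction"      // block 224
    gesture == "tap" && target == "detailImage" -> "show watch face"                        // block 228
    else -> "keep monitoring"
}

fun main() {
    println(handleDetailGesture("swipeLeftToRight", "detailImage", 11))  // show detail for hour 12
    println(handleDetailGesture("swipeRightToLeft", "detailImage", 11))  // show detail for hour 10
    println(handleDetailGesture("tap", "purchaseButton", 11))            // carry out purchase transaction
}
```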
Aspects of this disclosure are described herein with reference to flowchart illustrations or block diagrams, in which each block or any combination of blocks can be implemented by computer program instructions. The instructions may be provided to a processor of a general purpose computer, special purpose computer, mobile programming device, or other programmable data processing apparatus to produce a machine or article of manufacture, such that the instructions, when executed by the processor, create means for implementing the functions, acts or events specified in each block or combination of blocks in the diagrams.
In this regard, each block in the flowchart or block diagrams may correspond to a module, segment, or portion of code that includes one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functionality associated with any block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or blocks may sometimes be executed in reverse order.
A person of ordinary skill in the art will appreciate that aspects of this disclosure may be embodied as a device, system, method or computer program product. Accordingly, aspects of this disclosure, generally referred to herein as circuits, modules, components or systems, may be embodied in hardware, in software (including firmware, resident software, micro-code, etc.), or in any combination of software and hardware, including computer program products embodied in a computer-readable medium having computer-readable program code embodied thereon. In the context of this disclosure, a computer readable storage medium may include any tangible medium that is capable of containing or storing program instructions for use by or in connection with a data processing system, apparatus, or device.
It will be understood that various modifications may be made. For example, useful results still could be achieved if steps of the disclosed techniques were performed in a different order, and/or if components in the disclosed systems were combined in a different manner and/or replaced or supplemented by other components. Accordingly, other implementations are within the scope of the following claims.