BACKGROUND
Smartphones, desktop computers, and other computing devices can run a wide variety of software programs. Many of these programs are designed to interact with specific peripherals. For example, many smartphone apps are designed to operate on computing devices that include, or are connected to, multi-touch-capable touchscreens. Similarly, some graphics applications are configured to operate with pen tablets instead of, or in addition to, mice or other pointing devices. The user experience of an application can depend significantly on which peripherals are used to interact with the application.
SUMMARY
This disclosure describes devices, systems, methods, and computer-readable media for launching or operating an application in response to a user-input accessory selected by the user. In some examples, a computing device can include a wireless interrogator configured to wirelessly detect a first identifier associated with a tagged user-input accessory in operational proximity to the wireless interrogator and wirelessly detect a second identifier associated with a tagged object in operational proximity to the wireless interrogator. The computing device can include a force sensor having a sensing surface. The computing device can determine a software application corresponding to the first identifier and execute the determined software application. The computing device can detect a force exerted against the sensing surface by the object and provide to the software application information of the second identifier and information of the detected force. Example techniques described herein can detect spatially-varying forces across the sensing surface. In some example techniques described herein, a location or shape of the object can be determined and provided to the software application. In some example techniques described herein, a representation of the object can be presented for display in a user interface. In some example techniques described herein, the software application can display a user interface corresponding to the first identifier.
This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter. The term “techniques,” for instance, can refer to devices, systems, methods, computer-readable instructions, engines, modules, algorithms, hardware logic, and/or operations as permitted by the context described above and throughout the document.
BRIEF DESCRIPTION OF THE DRAWINGS
The detailed description is described with reference to the accompanying figures. In the figures, the left-most digit(s) of a reference number identify the figure in which the reference number first appears. The same reference numbers in different figures indicate similar or identical items.
FIG. 1 is a block diagram depicting an example environment for implementing application selection and operation as described herein.
FIG. 2 is a block diagram depicting an example computing device configured to participate in application selection and operation according to various examples described herein.
FIG. 3 is a perspective view illustrating an example computing device and example uses thereof.
FIG. 4 is a dataflow diagram depicting example module interactions during application selection and operation.
FIG. 5 is a flow diagram that illustrates example processes for application selection and operation.
FIG. 6 is a flow diagram that illustrates example processes for application operation.
FIG. 7 is a flow diagram that illustrates example processes for application operation based on spatial data of one or more objects.
FIG. 8 is a flow diagram that illustrates example processes for user-interface selection based on a user-input accessory.
DETAILED DESCRIPTION
Overview
Examples described herein provide techniques and constructs to launch or operate an application in response to a user-input accessory (UIA) selected, e.g., by an entity, such as an operator entity or a user. In some examples, an application specific to the UIA can be launched. In some examples, user interfaces presented by the application can be adjusted based on the UIA. Either app-specific launching or user-interface customization can improve operational efficiency compared to using an application not designed for that UIA. A UIA is a computer peripheral, an attachment to a computer or computer peripheral, or an accessory (electronic or not) usable with a computer or computer peripheral. A UIA of any of these types is configured to afford specific user inputs or types of user inputs, i.e., to express or represent to a user the possibility of providing those inputs or types of user inputs. For example, a UIA showing a depiction of a painter's palette affords selecting a color, e.g., to be used in a computer paint program such as GIMP. Further examples of user-input accessories are described below.
The user experience of an application can change significantly, and operational efficiency with the application can be reduced, when the application is used with a peripheral other than that for which the application was designed. Some examples described herein permit using an application with the peripheral for which that application was designed. Some examples described herein permit automatically executing a software application or automatically configuring a software application based on a connected peripheral, reducing the time required to begin using the peripheral and increasing operational efficiency. Some examples described herein permit effectively manipulating virtual objects using corresponding physical objects, e.g., placed on a force-sensing surface. For example, physical objects, such as wooden or plastic blocks carrying radio-frequency identification (RFID) tags, can be used with a UIA affording placing or arranging objects on the UIA. For example, the UIA can show outlines indicating where blocks should be placed, or can be arranged horizontally to serve as a tray or other support for blocks. Further examples of objects are described below.
As used herein, the terms “application,” “app,” and “software program” refer generally to any software or portion thereof running on a computing device and responsive to or associated with user-input accessories as described herein. Examples of apps can include smartphone downloadable programs, desktop-computer programs such as word processors and spreadsheets, smartphone and other embedded operating systems, programs included with such operating systems such as shells or device-management subsystems, and embedded programs (e.g., firmware or software) running on sensor devices such as Internet of Things (IoT) devices. As used herein, the terms “application,” “app,” and “software program” also encompass hardwired logic included in a computing device and configured to respond to presence of, or signals from, a UIA as described herein.
Some examples, scenarios, and techniques for application responsiveness to user-input accessories are presented in greater detail in the following description of the figures.
Illustrative Environment
FIG. 1 shows an example environment 100 in which examples of devices or systems responsive to user-input accessories can operate or in which program-launching or -operation methods such as described below can be performed. In the illustrated example, various devices and/or components of environment 100 include computing devices 102(1)-102(N) (individually or collectively referred to herein with reference 102), where N is any integer greater than or equal to 1. Although illustrated as, e.g., desktop computers, laptop computers, tablet computers, or cellular phones, computing devices 102 can include a diverse variety of device categories, classes, or types and are not limited to a particular type of device.
By way of example and not limitation,computing devices102 can include, but are not limited to, automotive computers such as vehicle control systems, vehicle security systems, or electronic keys for vehicles (e.g.,102(1), represented graphically as an automobile); smartphones, mobile phones, mobile phone-tablet hybrid devices, personal data assistants (PDAs), or other telecommunication devices (e.g.,102(2)); portable or console-based gaming devices or other entertainment devices such as network-enabled televisions, set-top boxes, media players, cameras, or personal video recorders (PVRs) (e.g.,102(3), represented graphically as a gamepad); desktop computers (e.g.,102(4)); laptop computers, thin clients, terminals, or other mobile computers (e.g.,102(5)); tablet computers or tablet hybrid computers (e.g.,102(N)); server computers or blade servers such as Web servers, map/reduce servers or other computation engines, or network-attached-storage units; wearable computers such as smart watches or biometric or medical sensors; implanted computing devices such as biometric or medical sensors; fixed sensors, such as IoT sensors, configured to monitor time, environmental conditions, vibration, motion, or other attributes of the world or structures or devices therein, e.g., bridges or dams; computer navigation client computing devices, satellite-based navigation system devices including global positioning system (GPS) devices and other satellite-based navigation system devices; or integrated components for inclusion in computing devices or appliances configured to participate in or carry out application selection or operation as described herein.
Different computing devices102 or types ofcomputing devices102 can use different peripherals and can have different uses for those peripherals. Examples of peripherals can include user-input accessories such as those discussed below. For example, portable devices such as computing devices102(2) and102(N) can use peripherals designed to cooperate with the small sizes of those devices. Larger devices such as computing devices102(4) and102(5) can use peripherals that take advantage of physical desktop space, e.g., pen tablets.
In some examples,computing devices102 can communicate with each other and/or with other computing devices via one ormore networks104. In some examples,computing devices102 can communicate with external devices vianetworks104. For example,networks104 can include public networks such as the Internet, private networks such as an institutional or personal intranet, cellular networks, or combinations of private and public networks.Networks104 can also include any type of wired or wireless network, including but not limited to local area networks (LANs), wide area networks (WANs), satellite networks, cable networks, Wi-Fi networks, WiMAX networks, mobile communications networks (e.g., 3G, 4G, and so forth) or any combination thereof.Networks104 can utilize communications protocols, such as, for example, packet-based or datagram-based protocols such as Internet Protocol (IP), Transmission Control Protocol (TCP), User Datagram Protocol (UDP), other types of protocols, or combinations thereof. Moreover,networks104 can also include a number of devices that facilitate network communications or form a hardware infrastructure for the networks, such as switches, routers, gateways, access points, firewalls, base stations, repeaters, backbone devices, and the like.Networks104 can also include devices that facilitate communications betweencomputing devices102 using bus protocols of various topologies, e.g., crossbar switches, INFINIBAND switches, or FIBRE CHANNEL switches or hubs.
Different networks have different characteristics, e.g., bandwidth, latency, accessibility (open, announced but secured, or not announced), or coverage area. The type ofnetwork104 used for any given connection between, e.g., acomputing device102 and a computing cluster can be selected based on these characteristics and on the type of interaction. For example, a low-power, low-bandwidth network can be selected for IoT sensors, and a low-latency network can be selected for smartphones.
In some examples,networks104 can further include devices that enable connection to a wireless network, such as a wireless access point (WAP). Examples support connectivity through WAPs that send and receive data over various electromagnetic frequencies (e.g., radio frequencies), including WAPs that support Institute of Electrical and Electronics Engineers (IEEE) 802.11 standards (e.g., 802.11g, 802.11n, and so forth), other standards, e.g., BLUETOOTH, cellular-telephony standards such as GSM, LTE, or WiMAX, or multiples or combinations thereof.
Still referring to the example ofFIG. 1, details of an example computing device102(N) are illustrated atinset106. The details of example computing device102(N) can be representative of others ofcomputing devices102. However, individual ones of thecomputing devices102 can include additional or alternative hardware and/or software components. Computing device102(N) can include one ormore processing units108 operably connected to one or more computer-readable media110 such as via a bus112, which in some instances can include one or more of a system bus, a data bus, an address bus, a Peripheral Component Interconnect (PCI) Express (PCIe) bus, a PCI bus, a Mini-PCI bus, and any variety of local, peripheral, or independent buses, or any combination thereof. In some examples,plural processing units108 can exchange data through an internal bus112 (e.g., PCIe), rather than or in addition tonetwork104. While in this example theprocessing units108 are described as residing on the computing device102(N), theprocessing units108 can also reside ondifferent computing devices102 in some examples. In some examples, at least two of theprocessing units108 can reside ondifferent computing devices102. In such examples,multiple processing units108 on thesame computing device102 can use a bus112 of thecomputing device102 to exchange data, while processingunits108 ondifferent computing devices102 can exchange data vianetworks104.
Processingunits108 can be or include one or more single-core processors, multi-core processors, central processing units (CPUs), graphics processing units (GPUs), general-purpose GPUs (GPGPUs), or hardware logic components configured, e.g., via specialized programming from modules or Application Programming Interfaces (APIs), to perform functions described herein. For example, and without limitation, illustrative types of hardware logic components that can be used in or as processingunits108 include Field-programmable Gate Arrays (FPGAs), Application-specific Integrated Circuits (ASICs), Application-specific Standard Products (ASSPs), System-on-a-chip systems (SOCs), Complex Programmable Logic Devices (CPLDs), Digital Signal Processors (DSPs), and other types of customizable processors. For example, aprocessing unit108 can represent a hybrid device, such as a device from ALTERA or XILINX that includes a CPU core embedded in an FPGA fabric. These or other hardware logic components can operate independently or, in some instances, can be driven by a CPU. In some examples, at least some ofcomputing devices102 can include a plurality ofprocessing units108 of multiple types. For example, theprocessing units108 in computing device102(N) can be a combination of one or more GPGPUs and one or more FPGAs.Different processing units108 can have different execution models, e.g., as is the case for graphics processing units (GPUs) and central processing unit (CPUs). In some examples, processingunits108, computer-readable media110, and modules or engines stored on computer-readable media110 can together represent an ASIC, FPGA, or other logic device configured to carry out the functions of such modules or engines. In some examples, an engine as described herein can include one or more modules.
Computer-readable media described herein, e.g., computer-readable media 110, includes computer storage media and/or communication media. Computer storage media includes tangible storage units such as volatile memory, nonvolatile memory, and/or other persistent and/or auxiliary computer storage media, removable and non-removable computer storage media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules, or other data. Computer storage media includes tangible or physical forms of media included in a device or hardware component that is part of a device or external to a device, including but not limited to random-access memory (RAM), static random-access memory (SRAM), dynamic random-access memory (DRAM), phase change memory (PRAM), read-only memory (ROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), flash memory, compact disc read-only memory (CD-ROM), digital versatile disks (DVDs), optical cards or other optical storage media, magnetic cassettes, magnetic tape, magnetic disk storage, magnetic cards or other magnetic storage devices or media, solid-state memory devices, storage arrays, network attached storage, storage area networks, hosted computer storage or memories, storage devices, and/or storage media that can be used to store and maintain information for access by a computing device 102.
In contrast to computer storage media, communication media can embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave, or other transmission mechanism. As defined herein, computer storage media does not include communication media.
In some examples, computer-readable media110 can store instructions executable by theprocessing units108 that, as discussed above, can represent a processing unit incorporated incomputing device102. Computer-readable media110 can additionally or alternatively store instructions executable by external processing units such as by an external CPU or external processor of any type discussed above. In some examples at least oneprocessing unit108, e.g., a CPU, GPU, or hardware logic device, is incorporated incomputing device102, while in some examples at least oneprocessing unit108, e.g., one or more of a CPU, GPU, or hardware logic device, is external tocomputing device102.
Computer-readable media110 can store, for example, executable instructions of anoperating system114, alaunching engine116, aninteraction engine118, and other modules, programs, or applications that are loadable and executable by processingunits108. In some examples not shown, one or more of theprocessing units108 in one of thecomputing devices102 can be operably connected to computer-readable media110 in a different one of thecomputing devices102, e.g., viacommunications interface120 andnetwork104. For example, program code to perform steps of flow diagrams herein can be downloaded from a server, e.g., computing device102(4), to a client, e.g., computing device102(N), e.g., via thenetwork104, and executed by one ormore processing units108 in computing device102(N). For example, the computer-executable instructions stored on the computer-readable media110 can upon execution configure a computer such as acomputing device102 to perform operations described herein with reference to theoperating system114, the launchingengine116, or theinteraction engine118.
Computer-readable media110 of thecomputing device102 can store anoperating system114. In some examples,operating system114 is not used (commonly referred to as a “bare metal” configuration). In some examples,operating system114 can include components that enable or direct thecomputing device102 to receive data via various inputs (e.g., user controls, network or communications interfaces, memory devices, or sensors), and process the data using theprocessing units108 to generate output. Theoperating system114 can further include one or more components that present the output (e.g., display an image on an electronic display, store data in memory, transmit data to another computing device, etc.). Theoperating system114 can enable a user to interact with apps or with modules of theinteraction engine118 using a user interface (omitted for brevity). Additionally, theoperating system114 can include components that perform various functions generally associated with an operating system, e.g., storage management and internal-device management.
Computing device102 can also include one ormore communications interfaces120 to enable wired or wireless communications betweencomputing devices102 and othernetworked computing devices102 involved in application selection or operation, or other computing devices, overnetworks104.Such communications interfaces120 can include one or more transceiver devices, e.g., network interface controllers (NICs) such as Ethernet NICs or other types of transceiver devices, to send and receive communications over a network. Theprocessing units108 can exchange data through respective communications interfaces120. In some examples, thecommunications interface120 can be a PCIe transceiver, and thenetwork104 can be a PCIe bus. In some examples, thecommunications interface120 can include, but is not limited to, a transceiver for cellular (3G, 4G, or other), WI-FI, Ultra-wideband (UWB), BLUETOOTH, or satellite transmissions. Thecommunications interface120 can include a wired I/O interface, such as an Ethernet interface, a serial interface, a Universal Serial Bus (USB) interface, an INFINIBAND interface, or other wired interfaces. For simplicity, these and other components are omitted from the illustratedcomputing device102.
In some examples, a computing device 102 such as the computing device 102(N) can include or be connected with a wireless interrogator 122. The wireless interrogator 122 can be configured to wirelessly detect respective identifiers associated with tagged objects, e.g., by transmitting interrogation signals and receiving responses from tagged objects in operational proximity to the wireless interrogator 122. In some examples, the wireless interrogator 122 can include an RFID or near-field communications (NFC) reader configured to wirelessly detect the identifiers of RFID-tagged or NFC-tagged objects of the tagged objects. In some examples, the wireless interrogator 122 can include a reader configured to wirelessly detect the identifiers of ones of the tagged objects having transceivers for BLUETOOTH, BLUETOOTH Low-Energy (BLE), or other personal-area-networking (PAN) technologies. In some examples, the wireless interrogator 122 can include an optical detector configured to locate and decode visual indicia on or in surfaces of the tagged objects, e.g., barcodes, specific colors, or specific patterns.
In some examples, a computing device 102 such as the computing device 102(N) can include or be connected with a force sensor 124. The force sensor 124 can have a sensing surface. The force sensor 124 can be configured to sense force, pressure, or other mechanical actions on or against the sensing surface. In some examples, the force sensor 124 can be configured to detect spatially-varying forces across the sensing surface. Example force sensors useful with various examples are described in U.S. Pat. No. 9,001,082 to Rosenberg et al. (“'082”) assigned to SENSEL, Inc., incorporated herein by reference. For example, the force sensor 124 can include a two-dimensional variable-impedance array having column and row electrodes interconnected with nonzero impedances, e.g., as shown in FIG. 3 of '082. In some examples, the force sensor 124 can include one or more optical, resistive, or capacitive touch sensors or one or more capacitive or inductive proximity sensors. In some examples, the force sensor 124 can include one or more deformable membranes or strain gauges.
In some examples, the wireless interrogator 122 can be configured to detect a user-input accessory (UIA) 126 or an object 128 arranged in operational proximity to the wireless interrogator 122, e.g., in a sensing range thereof. In the illustrated example, the UIA 126 is associated with (e.g., is attached to or includes) a tag 130, e.g., an RFID tag. Also in the illustrated example, the object 128 is associated with a tag 132, e.g., an RFID tag. The wireless interrogator 122 can detect identifiers (e.g., class, object, vendor, product, or unique identifiers) or other information stored in the tags 130 or 132, e.g., when the tags 130 or 132 are in operational proximity to the wireless interrogator 122, e.g., in the sensing range thereof. In some examples, the wireless interrogator 122 can store or update information in the tags 130 or 132. For example, the wireless interrogator 122 can wirelessly detect a first identifier associated with the tagged user-input accessory 126 in operational proximity to the wireless interrogator 122, and wirelessly detect a second identifier associated with the tagged object 128 in operational proximity to the wireless interrogator 122. Further examples are discussed below with reference to FIG. 3.
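For illustration only, the following listing is a non-limiting Python sketch of how a host-side routine might poll the wireless interrogator 122 for tag identifiers and distinguish an accessory identifier from an object identifier. The reader object, its inventory() call, and the "UIA:" identifier prefix are hypothetical and do not correspond to any particular RFID or NFC reader API.

import time

class WirelessInterrogatorPoller:
    """Polls an RFID/NFC reader for tag identifiers in operational proximity."""

    def __init__(self, reader, accessory_prefix="UIA:"):
        self.reader = reader                      # hypothetical vendor reader object
        self.accessory_prefix = accessory_prefix  # assumed identifier convention
        self.seen = set()

    def poll_once(self, on_accessory, on_object):
        """Read all currently visible tag IDs and dispatch any newly seen ones."""
        for tag_id in set(self.reader.inventory()) - self.seen:  # inventory() is assumed
            self.seen.add(tag_id)
            if tag_id.startswith(self.accessory_prefix):
                on_accessory(tag_id)   # first identifier, e.g., for UIA 126
            else:
                on_object(tag_id)      # second identifier, e.g., for object 128

    def poll_forever(self, on_accessory, on_object, interval_s=0.5):
        """Poll on a schedule, e.g., every 0.5 s."""
        while True:
            self.poll_once(on_accessory, on_object)
            time.sleep(interval_s)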
In some examples, theforce sensor124 can be configured to detect theUIA126 or theobject128 arranged in operational proximity to theforce sensor124. For example, as graphically represented by the open-headed arrows inFIG. 1, theUIA126 or theobject128 can exert a force against a sensing surface of theforce sensor124. Theforce sensor124 can detect, e.g., the presence, magnitude, magnitude per area (i.e., pressure), or direction of such a force, whether exerted against the sensing surface directly or indirectly. Example forces exerted directly against the sensing surface are graphically represented by solid arrow shafts; example forces exerted indirectly against the sensing surface (e.g., by theobject128 through the UIA126) are graphically represented by stippled arrow shafts. For example, theobject128 can be arranged above and supported by theUIA126 or theforce sensor124, and exert gravitational force (weight) against theforce sensor124. Examples of theforce sensor124 are discussed in more detail below with reference toFIG. 4. In some examples, theUIA126 can include a pad configured to overlie theforce sensor124 or the sensing surface of theforce sensor124. The pad can include, e.g., a deformable sheet, e.g., comprising a thermoplastic material. In some examples, theUIA126 andforce sensor124 can be arranged in any orientation. For example, theUIA126 andforce sensor124 can be arranged horizontally, e.g., to supportobject128 thereupon. In another example, theUIA126 andforce sensor124 can be arranged vertically, e.g., to detect lateral forces exerted by or viaobject128.
In some examples, theUIA126 can be selected by an entity. In some examples, theUIA126 can be placed by the entity into operational proximity to, or operational arrangement with, thewireless interrogator122 or theforce sensor124. The entity can be, for example, a robot or other robotic operator, which may not be directly controlled by a human user. For example, the entity can be a manufacturing robot, e.g., configured to pick and place theUIA126. In various examples, the entity can be a robotic assistance device such as a powered prosthesis or exoskeleton or a robotic remote manipulator (“waldo”). In some examples, the operator can be a human user and theUIA126 can be selected by the human user.
In some examples, the computing device102(N) is communicatively connected, e.g., via thenetwork104, with acomputing device134, e.g., a cloud or other server. In some examples, as indicated, computing devices, e.g.,computing devices102 and134, can intercommunicate to participate in or carry out application selection or operation as described herein. Thecomputing device134 can include components shown atinset106. As shown atinset136, thecomputing device134 can additionally or alternatively include one or more computer-readable media (omitted for brevity) including amapping store138 holding one or more mappings from identifiers of user-input accessories such as theUIA126 to indications of software applications corresponding to those identifiers. Examples of themapping store138 and the mappings are discussed in more detail below with reference toFIG. 4.
In some examples, thecomputing device134 or one or more of thecomputing devices102 can be computing nodes in a computing cluster, e.g., a cloud service such as MICROSOFT AZURE. Cloud computing permits computing resources to be provided as services rather than a deliverable product. For example, in a cloud-computing environment, computing power, software, information, and/or network connectivity are provided (for example, through a rental agreement) over a network, such as the Internet. In the illustrated example,computing devices102 can be clients of a cluster including thecomputing device134 and can submit jobs, e.g., including identifiers of UIAs126, to the cluster and/or receive job results, e.g., indications of software applications, from the cluster. Computing devices in the cluster can, e.g., share resources, balance load, increase performance, or provide fail-over support or redundancy.
Illustrative Components
FIG. 2 is an illustrative diagram that shows example components of a computing device 200, which can represent computing devices 102 or 134, and which can be configured to participate in application selection or operation according to various examples described herein. Computing device 200 can implement a launching engine 202, which can represent launching engine 116, FIG. 1. Computing device 200 can implement an interaction engine 204, which can represent interaction engine 118, FIG. 1. Computing device 200 can implement a mapping engine 206. Computing device 200 can implement a software application 208, e.g., configured for interaction with a user.
In some examples, e.g., of acomputing device134 providing a mapping service, thecomputing device200 can implementmapping engine206 but not launchingengine202,interaction engine204, orsoftware application208. In some examples, e.g., of acomputing device102 making use of a mapping service, thecomputing device200 can implement launchingengine202,interaction engine204, orsoftware application208 but not mappingengine206. In some examples, e.g., of acomputing device102 implementing both a mapping service and the use thereof, thecomputing device200 can implement launchingengine202,interaction engine204,mapping engine206, andsoftware application208.
Thecomputing device200 can include or be connected to acommunications interface210, which can representcommunications interface120. For example,communications interface210 can include a transceiver device such as a network interface controller (NIC) to send and receive communications over a network104 (shown in phantom), e.g., as discussed above. As such, thecomputing device200 can have network capabilities. For example, thecomputing device200 can exchange data withother computing devices102 or134 (e.g., laptops, computers, and/or servers) via one ormore networks104, such as the Internet.
Thecomputing device200 can include or be connected to user-interface hardware212. User-interface hardware212 can include adisplay214 or other device(s) configured to present user interfaces, e.g., as described below.Display214 can include an organic light-emitting-diode (OLED) display, a liquid-crystal display (LCD), a cathode-ray tube (CRT), or another type of visual display.Display214 can be a component of a touchscreen, or can include a touchscreen. In addition to or instead of thedisplay214, the user-interface hardware212 can include various types of output devices configured for communication to a user or to anothercomputing device200. Output devices can be integral or peripheral tocomputing device200. Examples of output devices can include a display, a printer, audio speakers, beepers, or other audio output devices, a vibration motor, linear vibrator, or other haptic output device, and the like. In some examples, theinteraction engine206 is operatively coupled to thedisplay214 or another output device.
User-interface hardware212 can include a user-operable input device216 (graphically represented as a gamepad). User-operable input device216 can include various types of input devices, integral or peripheral tocomputing device200. The input devices can be user-operable, or can be configured for input fromother computing device200. Examples of input devices can include, e.g., a keyboard, keypad, a mouse, a trackball, a pen sensor or smart pen, a light pen or light gun, a game controller such as a joystick or game pad, a voice input device such as a microphone, voice-recognition device, or speech-recognition device, a touch input device, a gestural input device such as a touchscreen, a grip sensor, an accelerometer, another haptic input, a visual input device such as one or more cameras or image sensors, and the like.
In some examples,computing device102 can include one ormore measurement units218.Measurement units218 can detect physical properties or status ofcomputing device200 or its environment. Examples ofmeasurement units218 can include units to detect motion, temperature, force, pressure, light, sound, electromagnetic radiation (e.g., for wireless networking), or any detectable form of energy or matter in or within sensing range ofcomputing device200. For example, themeasurement units218 can include thewireless interrogator122 and/or theforce sensor124,FIG. 1. Individual measurement units of themeasurement units218 can be configured to output data corresponding to at least one physical property, e.g., a physical property of thecomputing device200, such as acceleration, or of an environment of thecomputing device200, such as temperature or humidity.
In some examples, e.g., of asmartphone computing device200,measurement units218 can include an accelerometer, a microphone, or front- and rear-facing cameras. In some examples,measurement units218 can include a motion sensor, a proximity detector (e.g., for nearby life forms, people, or devices), a light sensor (e.g., a CdS photoresistor or a phototransistor), a still imager (e.g., a charge-coupled device, CCD, or complementary metal-oxide-semiconductor, CMOS, sensor), a video imager (e.g., CCD or CMOS), a microphone, a fingerprint reader, a retinal scanner, an iris scanner, or a touchscreen (e.g., in or associated with a display in user-interface hardware212, such as display214).
In some examples,computing device102 can include one ormore sensors220. Components ofcommunications interface210, e.g., transceivers for BLUETOOTH, WI-FI, RFID, NFC, or LTE, can be examples ofsensors220. Such components can be used to, e.g., detect signals corresponding to characteristics of accessible networks. Such signals can also be detected by automatically locating information in a table of network information (e.g., cell-phone tower locations), or by a combination of detection by component ofcommunications interface120 and table lookup. Input components of user-interface hardware212, e.g., touchscreens or phone mouthpieces, can also be examples ofsensors220.Measurement units218 can also be examples ofsensors220. In some examples, a particular device can simultaneously or selectively operate as part of two or more ofcommunications interface210, user-interface hardware212, and one ormore measurement units218. For example, a touchscreen can be an element of user-interface hardware212 and used to present information and receive user commands. Signals from the same touchscreen can also be used in determining a user's grip oncomputing device200. Accordingly, that touchscreen in this example is also asensor220.
Computing device200 can further include one or more input/output (I/O) interfaces222 by whichcomputing device200 can communicate with input, output, or I/O devices (for clarity, some not depicted). Examples of such devices can include components of user-interface hardware212 orsensors220.Computing device200 can communicate via I/O interface222 with suitable devices or using suitable electronic/software interaction methods. Input data, e.g., of user inputs on user-operable input device216, can be received via I/O interfaces222, and output data, e.g., of user interface screens, can be provided via I/O interfaces222 to display214, e.g., for viewing by a user.
Thecomputing device200 can include one ormore processing units224, which can representprocessing units108. In some examples, processingunits224 can include or be connected to amemory226, e.g., a random-access memory (RAM) or cache. Processingunits224 can be operably coupled, e.g., via the I/O interface222, to the user-interface hardware212 and/or thesensors220. Processingunits224 can be operably coupled to at least one computer-readable media228, discussed below. Processingunits224 can include, e.g., processing unit types described above such as CPU- or GPGPU-type processing units.
Theprocessing units224 can be configured to execute modules of the plurality of modules. For example, the computer-executable instructions stored on the computer-readable media228 can upon execution configure a computer such as acomputing device200 to perform operations described herein with reference to the modules of the plurality of modules, e.g., modules of thelaunching engine202,interaction engine204,mapping engine206, orsoftware application208. The modules stored in the computer-readable media228 can include instructions that, when executed by the one ormore processing units224, cause the one ormore processing units224 to perform operations described below. Examples of modules in computer-readable media228 are discussed below. Computer-readable media228 can also include an operating system, e.g.,operating system114.
In the illustrated example, computer-readable media228 includes adata store230. In some examples,data store230 can include data storage, structured or unstructured, such as a database (e.g., a Structured Query Language, SQL, or NoSQL database) or data warehouse. In some examples,data store230 can include a corpus or a relational database with one or more tables, arrays, indices, stored procedures, and so forth to enable data access.Data store230 can store data for the operations of processes, applications, components, or modules stored in computer-readable media228 or computer instructions in those modules executed by processingunits224. In some examples, the data store can store computer program instructions (omitted for brevity), e.g., instructions corresponding to apps, to processes described herein, or to other software executable by processingunits224, amapping store232, which can represent themapping store138, or any combination thereof. In some examples, the computer program instructions include one or more program modules executable by theprocessing units224, e.g., program modules of an app.
In some examples, theprocessing units224 can access the modules on the computer-readable media228 via abus234, which can represent bus112,FIG. 1. I/O interface222 and communications interface210 can also communicate withprocessing units224 viabus234.
The launching engine 202 stored on computer-readable media 228 can include one or more modules, e.g., shell modules or API modules, which are illustrated as an accessory-identifying module 236, an app-determining module 238, and a spawning module 240.
The interaction engine 204 stored on computer-readable media 228 can include one or more modules, e.g., shell modules or API modules, which are illustrated as an object-identifying module 242, a force-detecting module 244, and a force-analysis module 246.
Themapping engine206 stored on computer-readable media228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as amapping module248.
Thesoftware application208 stored on computer-readable media228 can include one or more modules, e.g., shell modules, or API modules, which are illustrated as a user-interface (UI)presentation module250, a representation-determiningmodule252, and an object-presentation module254.
The launchingengine202,interaction engine204,mapping engine206, orsoftware application208 can be embodied in a cloud service, or on acomputing device102 controlled by a user, or any combination thereof. Module(s) stored on the computer-readable media228 can implement inter-process communications (IPC) functions or networking functions to communicate with servers orother computing devices102 or134.
In thelaunching engine202, theinteraction engine204, themapping engine206, or thesoftware application208, the number of modules can vary higher or lower, and modules of various types can be used in various combinations. For example, functionality described associated with the illustrated modules can be combined to be performed by a fewer number of modules or APIs or can be split and performed by a larger number of modules or APIs. For example, the object-identifyingmodule242 and the force-detectingmodule244, or the accessory-identifyingmodule236 and the force-detectingmodule244, can be combined in a single module that performs at least some of the example functions described below of those modules. In another example, the accessory-identifyingmodule236 and the object-identifyingmodule242 can be combined in a single module that performs at least some of the example functions described below of those modules. In still another example, the app-determiningmodule238 and themapping module248 can be combined in a single module that performs at least some of the example functions described below of those modules. These and other combined modules can be shared by or accessible to more than one of thelaunching engine202, theinteraction engine204, themapping engine206, or thesoftware application208. In some examples, computer-readable media228 can include a subset ofmodules236,238,240,242,244,246,248,250,252, or254.
FIG. 3 is a perspective view illustrating an example computing device 300, and example uses thereof. The computing device 300 can represent computing devices 102, 134, or 200. Computing device 300 can additionally or alternatively represent a peripheral, e.g., a user-operable input device 216, communicatively connectable with a computing device 102, 134, or 200.
Computing device300 includes the force sensor124 (FIG. 1; omitted here for brevity) having asensing surface302. TheUIA126 is arranged over thesensing surface302 in this example; however, other arrangements can be used. Thecomputing device300 includes mountingfeatures304 configured to retain theUIA126 in operational relationship with thesensing surface302. TheUIA126 can include mating features (omitted for brevity) configured to attach, affix, or otherwise hold to the mounting features304. The mounting features304 or the mating features can include, e.g., one or more magnets, pins, sockets, snaps, clips, buttons, zippers, adhesives (permanent, semi-permanent, or temporary), nails, screws, bolts, studs, points, or kinematic mounts or mating features of any of those. In some examples, theUIA126 can include batteries or other electronics. In some examples, theUIA126 can omit batteries or other power supplies and operate, e.g., based on inductively-coupled power transfers from thecomputing device300.
In the illustrated example, theforce sensor124 includes a plurality ofsensing elements306 distributed across thesensing surface302, graphically represented as ellipses. For clarity, only one of thesensing elements306 is labeled. The sensing elements can be distributed regularly, e.g., in a grid arrangement (as illustrated), irregularly, randomly, or according to any other pattern or arrangement. The sensing elements can include, e.g., resistive or capacitive touch sensors or strain gauges.
In the illustrated example, theUIA126 includes the outlines308(1)-308(3) (individually or collectively referred to herein with reference308) of three abutting irregular pentagons. ThisUIA126 can be used, e.g., to assist young children in developing fine-motor skills or learning how to arrange shapes. In this and other examples, theUIA126 can include outlines (or other tactile or visual indicia, and likewise throughout this paragraph) of positions or orientations of objects or groups of objects. Such aUIA126 can be used, e.g., to teach shape identification, sorting, or arranging, or relationships between entities represented by the objects. For example, theUIA126 can include the relative positions of the planets in a depiction of the solar system, and the objects can represent the planets. In some examples, theUIA126 can include the outline of a completed tangram puzzle but not the inner lines showing how the tangram pieces should be oriented. In some examples, theUIA126 can include a schematic outline of a molecule, e.g., a DNA double helix, and the objects can include representations of portions of the molecule. These and other examples can permit using thecomputing device300 with theUIA126 as an educational tool.
In some examples, theUIA126 can include outlines or other indicia of tools or input interfaces such as, e.g., a painter's palette, a painting or drawing canvas, a music controller such as a MIDI (Musical Instrument Digital Interface) controller, a piano, an organ, a drum pad, a keyboard such as a QWERTY keyboard, a video game controller, or a multiple-choice answer form (e.g., having respective buttons for choices “A,” “B,” and so on). These and other examples can permit using thecomputing device300 with theUIA126 as an input interface that can supplement or replace a keyboard or pointing device.
In some examples, theUIA126 can include combinations of any of the above examples. In some examples, theUIA126 can include multiple regions and at least two of the regions can have indicia corresponding to respective, different ones of the above examples.
In some examples, the UIA 126 can correspond with a design environment. The software application 208 can present representations of one or more objects arranged over the UIA 126. For example, multiple gears can be placed on the UIA 126, and the software application 208 can present an animated representation showing how the gears would interlock and interact while turning. In another example, objects representing parts of a robot can be placed on the UIA 126, and the software application 208 can present an animated representation showing the robot having those parts assembled according to the spatial relationships between the objects on the UIA 126. As the user moves objects on the UIA 126, the software application 208 can reconfigure the representation of the object accordingly. In still another example, building blocks can be placed on the UIA 126, and the software application 208 can present a representation showing how the blocks can stack or interlock to form a structure. In some examples, data of the magnitude of force can be used to determine whether multiple objects are stacked on top of each other. In some examples, a database of object attributes can be queried with the object identifier to determine a 3-D model or extent of the object or a representation corresponding to the object.
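For illustration only, the following Python sketch shows one non-limiting way force-magnitude data and an object-attribute lookup could be combined to estimate whether identical objects are stacked. The OBJECT_ATTRIBUTES table, tag identifiers, and weights are hypothetical placeholders for a database of object attributes.

# Hypothetical attribute table keyed by tag identifier; a real system could
# instead query a database for a 3-D model or extent of each object.
OBJECT_ATTRIBUTES = {
    "tag-314": {"name": "pentagon block A", "weight_newtons": 0.6},
    "tag-316": {"name": "pentagon block B", "weight_newtons": 0.6},
}

def estimate_stack_count(tag_id, measured_force_newtons, tolerance=0.15):
    """Estimate how many identical objects are stacked at one location by
    comparing the measured normal force to a single object's weight."""
    attrs = OBJECT_ATTRIBUTES.get(tag_id)
    if attrs is None:
        return None                      # unknown object; defer to a database query
    unit_weight = attrs["weight_newtons"]
    count = max(round(measured_force_newtons / unit_weight), 0)
    residual = abs(measured_force_newtons - count * unit_weight)
    return count if residual <= tolerance * unit_weight else None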
Returning to the illustrated example of shape placement, twoobjects310 and312, which can representobject128,FIG. 1, are illustrated over the corresponding outlines on theUIA126. In the illustrated example, objects310 and312 includerespective tags314 and316, e.g., RFID or NFC tags, carrying identifiers of therespective objects310 and312. In some examples, objects310 and312 have identifiers printed on surfaces thereof, e.g., in the form of barcodes or specific colors or patterns. When theUIA126 or either of theobjects310 or312 approaches thecomputing device300, thewireless interrogator122 can determine a respective identifier of thatUIA126 or that one of theobjects310 and312.
In various examples, when either of the objects 310 and 312 is placed on the sensing surface 302 or on the UIA 126 arranged over the sensing surface 302, a force can be exerted against the sensing surface 302, e.g., because of gravity or a user's pressing the object 310 or 312 against the sensing surface 302. The force sensor 124 can detect and provide information of this force, as discussed below with reference to the force-analysis module 246, FIG. 4.
FIG. 4 is a dataflow diagram400 illustrating example interactions between the modules illustrated inFIG. 2, e.g., during application selection and operation. Some of the modules make use of amapping store138 or232, e.g., holding mappings between identifiers of UIAs126 and software applications such assoftware application208,FIG. 2.
In some examples, the accessory-identifying module 236 can be configured to, using the wireless interrogator 122, detect a first identifier corresponding to the UIA 126 in operational proximity to the wireless interrogator 122. For example, the accessory-identifying module 236 can read the identifier of the UIA 126 from an RFID or NFC tag or another storage device in, on, or associated with the UIA 126. The accessory-identifying module 236 can be configured to detect (or attempt to detect) the first identifier, e.g., automatically when or after the UIA 126 enters operational proximity to the wireless interrogator 122 (e.g., a sensing range of the wireless interrogator 122), in response to an actuation of a user-input control such as a “connect” or “activate” button, or on a schedule (e.g., every 0.5 s).
In some examples, the app-determining module 238 can be configured to determine a software application 208 corresponding to the first identifier. For example, the app-determining module 238 can determine the software application 208 using the stored mapping(s) in mapping store 232 of identifiers to software applications, e.g., by querying the mapping store 232 with the first identifier. In some examples, the app-determining module 238 can be configured to look up software applications in a centralized location, e.g., as is done to map file extensions to applications in MICROSOFT WINDOWS using the Registry (in which HKEY_CLASSES_ROOT has entries mapping extensions to programmatic identifiers, ProgIDs, and entries mapping ProgIDs to Component Object Model, COM, class identifiers, CLSIDs, and entries mapping CLSIDs to filenames of software applications), or as is done to map USB Vendor ID (VID)/Product ID (PID) pairs to device drivers using INF driver-information files.
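For illustration only, the following Python sketch shows a non-limiting local lookup of a first identifier in a mapping store, here modeled as an SQLite table; the table name, column names, and file name are assumptions rather than a required schema.

import sqlite3

def determine_application(mapping_db_path, first_identifier):
    """Return the application indication mapped to a UIA identifier, or None."""
    connection = sqlite3.connect(mapping_db_path)
    try:
        row = connection.execute(
            "SELECT app_indication FROM mappings WHERE accessory_id = ?",
            (first_identifier,),
        ).fetchone()
        return row[0] if row else None
    finally:
        connection.close()

# Example use: resolve a UUID first identifier to, e.g., a reverse domain name.
# app = determine_application("mappings.db",
#                             "{5258ac20-5173-11e5-9f4f-0002a5d5c51b}")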
In some examples, the app-determining module 238 can be configured to determine the software application 208 corresponding to the first identifier by transmitting the first identifier via the communications interface 210, FIG. 2. For example, the app-determining module 238 can transmit the first identifier to a mapping module 248, e.g., executing on a computing device 134, FIG. 1. The mapping module 248 can determine the software application 208 using the mapping(s) stored in mapping store 232, and transmit an indication of the determined software application 208. The app-determining module 238 can then receive the indication of the determined software application 208 via the communications interface 210.
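For illustration only, the following Python sketch shows a non-limiting client-side query of a remote mapping service such as one hosted by the mapping module 248; the endpoint URL, the JSON field names, and the use of HTTP as a transport are assumptions, not requirements.

import json
import urllib.request

def query_mapping_service(service_url, first_identifier, timeout_s=5):
    """POST the first identifier to a remote mapping service and return the
    indicated software application (e.g., a package name or file path)."""
    request = urllib.request.Request(
        service_url,
        data=json.dumps({"accessory_id": first_identifier}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=timeout_s) as response:
        reply = json.load(response)
    return reply.get("app_indication")   # assumed field in the service reply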
In some examples using themapping module248, as well as in some examples in which the app-determiningmodule238 directly queries themapping store232, themapping store232 can store data indicating which software application corresponds to the first identifier. The first identifier (and other identifiers herein) can include one or more of, e.g., a universally unique identifier (UUID) or globally unique identifier (GUID); a uniform resource locator (URL), uniform resource name (URN), uniform resource identifier (URI), Digital Object Identifier (DOI), domain name, or reverse domain name; a unique hardware identifier such as an Ethernet Media Access Control (MAC) address, a 1-WIRE device serial number, or a GS1 Electronic Product Code (EPC), e.g., in URI form or stored in EPC Binary Encoding on, e.g., an RFID tag; a platform-specific app identifier (e.g., a GOOGLE PLAY package name); or a public key or public-key hash. The indication data can include one or more of, e.g., a name, filename, file path, or registry path; a MICROSOFT WINDOWS ProgID, CLSID, or COM interface ID (IID), or a corresponding identifier in Common Object Request Broker Architecture (CORBA) or another object system; or an identifier of any of the types listed above with reference to the first identifier.
In a specific, nonlimiting example, the first identifier can be a UUID. The first identifier can be stored as a 128-bit value, e.g., the hexadecimal value 0x5258AC20517311E59F4F0002A5D5C51B, or as a human-readable string, e.g., “{5258ac20-5173-11e5-9f4f-0002a5d5c51b}”. Themapping store232 can include a mapping associating the 128-bit first identifier, in either form or another form, with a specific software program, e.g., the reverse domain name “com.microsoft.exchange.mowa” to identify the MICROSOFT OUTLOOK WEB APP for ANDROID.
In some examples, mappings stored in the mapping store 232 can be cryptographically signed, e.g., using signatures according to the Public-Key Cryptography Standard (PKCS) #1 or another public-key cryptosystem. For example, a hash such as a Secure Hash Algorithm-256 bit (SHA-256) hash can be computed for a mapping in the mapping store 232. The hash can then be encrypted with a predetermined private key. The encrypted hash can be stored in the mapping store 232 in association with the hashed mapping.
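For illustration only, the following Python sketch (using the third-party "cryptography" package) shows one non-limiting way a mapping entry could be signed with an RSA private key over a SHA-256 hash, consistent with PKCS #1 v1.5 signatures; how the mapping is serialized into bytes is an assumption.

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding

def sign_mapping(private_key_pem, mapping_bytes):
    """Sign a serialized mapping entry (identifier -> application indication)
    with RSA over SHA-256, producing a PKCS #1 v1.5 signature."""
    private_key = serialization.load_pem_private_key(private_key_pem, password=None)
    return private_key.sign(mapping_bytes, padding.PKCS1v15(), hashes.SHA256())

# The returned signature bytes can be stored in the mapping store 232
# alongside the mapping they cover.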
In some examples using cryptographically-signed mappings, the app-determining module 238 can select only software applications indicated in mappings having valid cryptographic signatures. For example, the app-determining module 238 can locate a candidate one of the mappings in the mapping store 232 based at least in part on the identifier of the UIA 126. The app-determining module 238 can decrypt the signature of the candidate mapping using a predetermined public key and compare the decrypted hash to the hash of the candidate mapping. If the hashes match, the candidate mapping was provided by a holder of the private key corresponding to the predetermined public key. In this situation, the app-determining module 238 can determine that the candidate mapping is valid and that the indicated software application 208 can be used. Using cryptographically-signed mappings can improve system security by reducing the chance that a mapping for, e.g., a malware program or Trojan horse will be selected by the app-determining module 238. In some examples, the app-determining module 238 can provide a warning, e.g., an “are you sure?” prompt, before selecting a software application indicated in a mapping not having a valid cryptographic signature.
In some examples, determining a software application corresponding to the first identifier can include selecting one of the mappings, e.g., in the mapping store 232, corresponding to the first identifier. A cryptographic signature of the selected one of the mappings can then be verified. If the cryptographic signature is not verified, a different one of the mappings can be selected from the mapping store 232, a warning can be presented, e.g., via a user interface, or processing can terminate without determining an application. In some examples, the spawning module 240 or the mapping module 248 can perform these or other functions of the app-determining module 238 related to verifying cryptographic signatures of mappings, or any two or more of the spawning module 240, the mapping module 248, and the app-determining module 238 can divide these functions between them. In some examples, in response to verification of the cryptographic signature, the spawning module 240 can execute the software application indicated in the verified mapping.
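For illustration only, the following Python sketch shows a non-limiting verify-then-select flow over candidate mappings, again using the "cryptography" package; the (mapping bytes, signature, application indication) tuple layout is an assumption about how mappings and signatures are stored.

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding

def select_verified_application(public_key, candidate_mappings):
    """Return the application indicated by the first candidate mapping whose
    signature verifies under the predetermined public key, else None.
    candidate_mappings yields (mapping_bytes, signature, app_indication)."""
    for mapping_bytes, signature, app_indication in candidate_mappings:
        try:
            public_key.verify(signature, mapping_bytes,
                              padding.PKCS1v15(), hashes.SHA256())
            return app_indication          # valid mapping: safe to execute
        except InvalidSignature:
            continue                       # try another mapping, or warn the user
    return None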
In some examples, the spawning module 240 can be configured to execute the determined software application 208. For example, the spawning module 240 can include a loader, such as a relocating or dynamic-linking loader, configured to read processor-executable instructions from a disk image of the determined software application 208 and place those instructions in main memory to be executed.
In some examples, the spawning module 240 can be configured to determine whether the determined software application 208 is available to be executed. Applications can be available if, e.g., they are loaded on local storage of computing device 102 or on storage accessible to computing device 102 via a network. In some examples, if the determined software application 208 is not available to be executed, the spawning module 240 can execute an installation package of the determined software application 208 to make the determined software application 208 available, and can then execute the determined software application 208. In some examples, if the determined software application 208 is not available to be executed, the spawning module 240 can execute a user interface configured to permit the user to locate, download, or purchase the determined software application 208. In some examples, the user interface is configured to, when executed, present an indication of the determined software application 208; receive payment information; and, at least partly in response to the received payment information (e.g., based at least in part on exchanging payment and authorization data with a payment server), download the determined software application. The spawning module 240 can then execute the determined software application 208 once the determined software application 208 is available. This can permit, e.g., distributing a set of mappings, e.g., via automatic updates to operating system 114. The mappings can be distributed without regard to which applications are installed on a particular computer. This can provide increased user efficiency, since a user employing a new UIA 126 for the first time can be automatically prompted to download the determined software application 208. This can save network bandwidth and user time compared to requiring the user to manually locate the determined software application 208.
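For illustration only, the following Python sketch shows a non-limiting availability check and launch, with a placeholder callback standing in for the locate/download/purchase user interface; using shutil.which() as the availability test is an assumption about how availability is determined.

import shutil
import subprocess

def spawn_application(app_name, prompt_to_install):
    """Execute the determined application, prompting the user to locate,
    download, or purchase it first if it is not available locally."""
    resolved = shutil.which(app_name)
    if resolved is None:
        # Not installed: hand off to an installation or store user interface.
        if not prompt_to_install(app_name):
            return None                    # user declined; nothing to execute
        resolved = shutil.which(app_name)  # re-check after installation
    return subprocess.Popen([resolved]) if resolved else None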
In some examples, the object-identifying module 242 can be configured to, using the wireless interrogator 122, detect a second identifier corresponding to an object in operational proximity to the wireless interrogator 122. The object-identifying module 242 can detect the second identifier in any of the ways for detecting identifiers described above with reference to the accessory-identifying module 236, e.g., periodically polling for tags using an RFID reader.
In some examples, the force-detecting module 244 can be configured to detect a force exerted against the sensing surface 302 of the force sensor 124 by the object. This can be done, e.g., as discussed above with reference to FIGS. 2 and 3. In some examples, the object-identifying module 242 or the force-detecting module 244 can be configured to provide information of the second identifier and information of the detected force, e.g., to the software application 208 or to the force-analysis module 246. The information can include, e.g., spatial data of the object (e.g., the object 310 or 312, FIG. 3). For example, the information can be transmitted via a kernel-mediated or other interprocess-communication channel such as a socket, pipe, or callback, or can be retrieved by the software application via an API call. In some examples, the force sensor 124 can be configured to detect spatially-varying forces across the sensing surface, e.g., using a plurality of sensing elements distributed across the sensing surface. The information of the detected force can include information of the spatially-varying forces.
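For concreteness, one possible way to hand the second identifier and force information to the software application is a small JSON message over a local socket, as sketched below; the socket path and message fields are assumptions, and a pipe, callback, or API call would serve equally well.

import json
import socket

def publish_force_event(second_id, force_map, sock_path="/tmp/force_events.sock"):
    """Send the object's identifier and force data over a Unix-domain socket."""
    message = {
        "object_id": second_id,                        # second identifier
        "rows": len(force_map),
        "cols": len(force_map[0]) if force_map else 0,
        "forces": force_map,                           # spatially-varying readings
    }
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect(sock_path)
        s.sendall(json.dumps(message).encode("utf-8"))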
In some examples, the force-analysis module 246 can be configured to determine a location of the object based at least in part on the detected spatially-varying forces. The force-analysis module 246 can then provide to the software application 208 information of the determined location. In some examples, the spatial data of the object 310 or 312 can include location data of the object, e.g., with respect to the sensing surface 302 or the UIA 126. For example, bitmap data of magnitudes of the spatially-varying forces can be processed using object- or feature-detection image-processing algorithms such as edge tracing, gradient matching, the Speeded Up Robust Features (SURF) feature detector, or (e.g., for a single object) location-weighted summing. Once a shape is located, its centroid can be determined using location-weighted summing over the pixels of the bitmap data considered to lie within the shape.
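A minimal NumPy sketch of location-weighted summing over a force bitmap is shown below; the noise threshold is an illustrative assumption.

import numpy as np

def object_centroid(force_map, threshold=0.05):
    """Location-weighted centroid (x, y) of force readings above a noise threshold."""
    f = np.asarray(force_map, dtype=float)
    f = np.where(f > threshold, f, 0.0)      # suppress readings at or below noise level
    total = f.sum()
    if total == 0:
        return None                          # nothing pressing on the sensor
    ys, xs = np.indices(f.shape)
    return (float((xs * f).sum() / total), float((ys * f).sum() / total))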
In some examples, the force-analysis module 246 can be configured to determine a shape of the object based at least in part on the detected spatially-varying forces. The force-analysis module 246 can then provide to the software application 208 information of the determined shape. In some examples, the spatial data of the object can include shape data of the object, e.g., coordinates of an outline of the shape or a raster map of force readings corresponding to the shape(s) of surface(s) of the object exerting force against the sensing surface 302. For example, shapes can be detected in bitmap force-magnitude data as noted above with respect to shape location. Pixels considered to lie within the shape can then be outlined, e.g., by fitting cubic or quadratic Bézier curves or other polynomials to the edges defined by borders between those pixels and pixels not considered to lie within the shape. The Sobel, Canny, or other edge detectors can be used to locate edges. The potrace, Autotrace, or other algorithms can be used to trace, e.g., edges or centerlines of pixel regions to provide the shape information. In some examples, determined bitmap or outline data of a shape can be compared to a catalog of known shapes to provide shape data such as “triangle” or “square.”
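As one illustration of outline extraction from bitmap force-magnitude data, the sketch below traces iso-contours at a noise threshold using scikit-image's marching-squares routine; the threshold, minimum contour length, and library choice are assumptions, and the edge detectors or tracers named above could be substituted.

import numpy as np
from skimage import measure

def object_outlines(force_map, threshold=0.05, min_points=8):
    """Return outline coordinates (row, col) for regions pressing on the sensor."""
    f = np.asarray(force_map, dtype=float)
    contours = measure.find_contours(f, threshold)        # marching squares at threshold
    return [c for c in contours if len(c) >= min_points]  # drop tiny, likely-noise contours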
In some examples, information associated with the identifier of an object can be used together with the force data for that object to determine the location of that object. For example, when the wireless interrogator detects multiple objects and the force sensor has discrete regions of force data for those objects, the determined shapes of the objects can be correlated with the identifiers, e.g., using a database lookup or a shape catalog, to determine spatial data associated with each of the identifiers.
In some examples, the spatial data can include one or more bitmap representations of the detected spatially-varying forces or portions thereof. For example, the spatial data can include a raw bitmap (rasterized) representation of the spatially-varying forces, e.g., detected by a regular grid of force sensors. The bitmap can represent the entire active area of the force sensor 124 or only a portion thereof, e.g., a portion in which force greater than a threshold or noise level is present, or a bounding polygon of such a portion. The spatial data can additionally or alternatively include values calculated from force data, e.g., the size or center of a bounding box of a portion in one or more dimensions, the location in one or more dimensions of the highest force in a portion, or the average magnitude or direction of force in a portion.
In some examples, the spatial data can include one or more vector representations of the detected spatially-varying forces or portions thereof. For example, the spatial data can include coordinates of contours of constant force magnitude, coordinates of outlines of areas experiencing a force above a threshold, or coordinates or axis values of centroids or bounding boxes of such areas.
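The following NumPy sketch derives a few of the example values named above (bounding box, peak-force location, and average magnitude of the active portion) from a raw force bitmap; the noise threshold is again an assumption.

import numpy as np

def spatial_summary(force_map, threshold=0.05):
    """Derive example spatial data from a force bitmap."""
    f = np.asarray(force_map, dtype=float)
    active = f > threshold
    if not active.any():
        return None
    rows, cols = np.nonzero(active)
    peak_r, peak_c = np.unravel_index(np.argmax(f), f.shape)
    return {
        "bbox": (int(rows.min()), int(cols.min()), int(rows.max()), int(cols.max())),
        "peak": (int(peak_r), int(peak_c)),          # location of the highest force
        "mean_force": float(f[active].mean()),       # average magnitude in the portion
    }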
In some examples, the UI-presentation module 250 of the software application 208 can be configured to present for display a user interface. The user interface can be presented for display on a display 214 of user-interface hardware 212, or on another display. The user interface can have at least one content element or presentation element determined based at least in part on the identifier of the UIA 126, e.g., detected by the accessory-identifying module 236. In some examples, content elements can include, e.g., text box controls, labels, images, and other areas or representations conveying content not predetermined by the operating system 114 or the software application 208. In some examples, presentation elements can include, e.g., window borders, menus, shell controls, color schemes, sounds, and other areas or representations conveying structural or other information predetermined by the operating system 114 or the software application 208, or constant from the point of view of the software application 208. The user interface can include one or more screens or windows holding, e.g., content elements or presentation elements; different screens, windows, or elements can be visible at different times. For example, the user interface can show and hide different sets of screens, windows, or elements at different times, e.g., in response to user inputs via the UIA 126. Further examples of user interfaces are discussed below.
Throughout this discussion, references to presenting user interfaces and representations “for display” are examples and not limiting. User interfaces and representations can alternatively or additionally be presented audibly, e.g., via speakers or headphones, or in a haptic or tactile manner, e.g., using force-feedback devices such as a SPACEBALL or using refreshable braille displays.
In some examples, the UI-presentation module 250 of the software application 208 can be configured to present for display the user interface including a plurality of content elements in an arrangement based at least in part on the detected identifier of the UIA 126. For example, the sequence, relative or absolute sizes, or relative or absolute positions of the content elements can be determined based at least in part on the identifier of the UIA 126. In some examples, the software application 208 can include a table, program module, or other data or instructions mapping the identifier of the UIA 126 to a list of relative positions of the content elements. The list of relative positions can include an indication of which user-interface screen or window should hold each content element. For example, the order or type of questions on a test or survey for a particular user can be determined using the list of relative positions corresponding to the identifier of that user's UIA 126.
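A table mapping accessory identifiers to arrangements can be as simple as the hypothetical dictionary below; the identifiers and element keys are made up for illustration.

# Hypothetical table mapping accessory identifiers to content-element order.
LAYOUTS = {
    "uia-survey-a": ["q3", "q1", "q2"],
    "uia-survey-b": ["q1", "q2", "q3"],
}

def arrange_content(accessory_id, elements):
    """Return content elements in the arrangement mapped to the accessory."""
    order = LAYOUTS.get(accessory_id, sorted(elements))   # fall back to a fixed order
    return [elements[key] for key in order if key in elements]

# Example: arrange_content("uia-survey-a", {"q1": "...", "q2": "...", "q3": "..."})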
In some examples, the representation-determining module 252 of the software application 208 can be configured to determine a representation of an object. The representation can be based at least in part on an identifier of the object, e.g., detected by the object-identifying module 242. For example, the representation can include a two-dimensional (2-D) or three-dimensional (3-D) model of the object determined, e.g., by querying a database of models using the identifier of the object. The database can be stored locally on the computing device 102 running the software application 208, or can be stored on a computing device 134.
In some examples, the object-presentation module 254 of the software application 208 can be configured to present for display the representation of the object. The presented representation can be arranged in the presented user interface based at least in part on spatial data of the object. The spatial data of the object can include at least location data of the object or shape data of the object. For example, the representation can be positioned based on the location data of the object. The representation can be rotated or flipped based on the shape data of the object. For example, in FIG. 3, the objects corresponding to outlines 308(1) and 308(3) are identical, but one is flipped along the long axis of the object compared to the other. Since the objects are not bilaterally symmetrical about the long axis, the shape data can be used to determine which side of the object is against the force sensor. The representation can then be oriented accordingly. In some examples, the representation can be presented with colors, shades, outlines, or other marks or attributes corresponding to the magnitude or direction of the force. For example, as the user presses object 310 more forcefully into the sensing surface 302, the representation can change color from green to yellow or red.
In some examples, the object-presentation module 254 can be configured to present the representation of the object in a three-dimensional virtual environment of the user interface. The three-dimensional virtual environment can be a subset of a four- or higher-dimensional virtual environment. In some examples, a geometry of the representation of the object in a first dimension or set of dimensions, e.g., an extent or a position of the representation of the object in the first dimension or set of dimensions, is determined based at least in part on the spatial data of the object. In some examples, a geometry (e.g., an extent or a position) of the representation of the object in a second dimension or set of dimensions is determined based at least in part on the identifier of the object. The first and second dimensions can be different, e.g., orthogonal or otherwise nonparallel, or the first set of dimensions can differ in at least one dimension from the second set of dimensions.
For example, in a virtual environment having a horizontal X-Y plane and a vertical Z axis orthogonal to the plane, the position of the representation of the first object in X, Y, or both can be determined based on the location of the object 310 with respect to the UIA 126. The height of the representation, e.g., the Z-axis extent of the representation, can be determined based on an identifier of the object. For example, the height can be retrieved from a datastore mapping identifiers to heights. This can permit modeling three-dimensional environments with the force sensor 124 providing data in only two dimensions.
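A minimal sketch of this two-dimensional-sensor, three-dimensional-model approach follows; the identifier-to-height datastore, identifiers, and units are assumptions.

# Hypothetical datastore mapping object identifiers to model heights (Z extents).
OBJECT_HEIGHTS = {"block-a": 2.0, "block-b": 4.5}

def place_representation(object_id, centroid_xy, default_height=1.0):
    """Build a 3-D pose: X/Y from the force-sensor centroid, Z extent from the identifier."""
    x, y = centroid_xy                           # from the spatial data (two dimensions)
    height = OBJECT_HEIGHTS.get(object_id, default_height)
    return (x, y, 0.0, height)                   # object rests on the X-Y plane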
In some examples, the object-presentation module 254 can be configured to determine that the spatial data of the object correspond to a spatial target. In the example of FIG. 3, the software application 208 can store data of the locations of the outlines 308 of the pentagons on the UIA 126. The spatial target can correspond to the location of outline 308(1), and the object can be object 310. The object-presentation module 254 can determine that the object 310 has been placed according to the outline 308(1). In response, the object-presentation module 254 can present for display a success indicator, e.g., a highlight, outline, star, or other indicator that the object 310 has been correctly positioned with respect to the UIA 126.
In some examples, the UI-presentation module 250 or the object-presentation module 254 can be configured to present for display a goal representation arranged in the user interface based at least in part on the spatial target. For example, an outline or shaded area shaped as the object 310 or the corresponding outline 308 can be displayed on screen. The goal representation can be positioned with respect to the representation of the object in a way corresponding to the spatial relationship between the object 310 and the corresponding outline 308. For example, when the object 310 is to the right of the outline 308(1), the representation of the object can be presented arranged to the right of the goal representation. The representations of the object or the goal can be presented with colors or other attributes determined based at least in part on, e.g., the distance between the object and the spatial target, or the degree of similarity or difference between an orientation of the object and an orientation of the spatial target. For example, if the object 310 is spaced apart from, or rotated or flipped with respect to, the spatial target, the goal representation can be highlighted more brightly than when the object 310 is close to or oriented as the spatial target.
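For illustration, a spatial-target test of the kind described above might compare position and orientation against tolerances, as in the sketch below; the tolerance values and angle convention are assumptions, and the returned error terms could drive the highlight color or brightness.

import math

def matches_target(obj_pos, obj_angle, target_pos, target_angle,
                   pos_tol=3.0, angle_tol=10.0):
    """Decide whether an object's spatial data corresponds to a spatial target.

    Positions are (x, y) in sensor units; angles are degrees.
    Returns (success, distance, angle_error) so the caller can also
    scale feedback intensity from the error terms.
    """
    distance = math.dist(obj_pos, target_pos)
    angle_error = abs((obj_angle - target_angle + 180.0) % 360.0 - 180.0)
    success = distance <= pos_tol and angle_error <= angle_tol
    return success, distance, angle_error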
Some examples present representations of multiple objects 310, 312. Functions described above with reference to spatial targets, goal representations, and success indicators can be performed for any number of objects. For example, a success indicator can be displayed when a plurality of objects have been positioned correctly with respect to respective spatial targets, whether or not goal indicator(s) were displayed for any particular one(s) of the objects.
In some examples involving multiple objects 310, 312, the representation-determining module 252 can be configured to determine that a second object, e.g., object 312, is in a selected spatial relationship with the object, e.g., object 310. The determination can be made based at least in part on the spatial data of the object and spatial data of the second object, e.g., on respective location data. The spatial data of the second object can include, e.g., location data of the second object or shape data of the second object. The selected spatial relationship can be represented, e.g., in data store 230, and can include, e.g., proximity of shapes, edges or vertices; proximity of defined connection points; similarity of orientation or of position along one axis; or coaxial positioning.
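One selected spatial relationship, proximity of defined connection points combined with similar orientation, could be tested as sketched below; the record fields and tolerances are illustrative assumptions.

import math

def in_selected_relationship(a, b, connect_tol=2.0, angle_tol=5.0):
    """Check whether two objects satisfy a selected spatial relationship.

    'a' and 'b' are hypothetical records with 'connection_point' (x, y) and
    'angle' (degrees) derived from each object's spatial data and identifier.
    """
    close = math.dist(a["connection_point"], b["connection_point"]) <= connect_tol
    aligned = abs((a["angle"] - b["angle"] + 180.0) % 360.0 - 180.0) <= angle_tol
    return close and aligned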
In some examples, e.g., in response to the determination that the object and the second object are in the selected spatial relationship, the representation-determining module 252 can determine a representation of the second object. The representation can be based at least in part on an identifier of the second object.
In some examples, e.g., in response to the determination that the object and the second object are in the selected spatial relationship, the object-presentation module 254 can present for display the representation of the second object. The representation of the second object can be arranged in the user interface, e.g., based at least in part on the spatial data of the object and the spatial data of the second object. In the example of FIG. 3, when objects 310 and 312 are placed and oriented according to the outlines 308(1) and 308(2), the respective representations of objects 310 and 312 can be presented together in the user interface, or with an indication that they are linked or correctly placed with respect to each other. Such an indication can include, e.g., a green or blue highlight indicating where the representations of objects 310 and 312 abut. Example functions described herein with respect to the object and the second object can be applied for any number of objects.
In some examples, the UI-presentation module 250 is responsive to the UIA 126 or type of UIA 126 detected, as graphically represented by the dashed arrow (the dashes are only for clarity). In some examples, individual UIAs 126 are associated with specific user interfaces. In some examples, individual types of UIAs 126 (e.g., USB device classes or other groupings of UIAs) are associated with specific user interfaces. For example, a UIA 126 having a 12-key piano keyboard can correspond to a first user interface of an application, and a UIA 126 having a 24-key piano keyboard can correspond to a second user interface of that application. In some examples, the application can, e.g., remove controls or UI features not relevant to the current UIA 126, improving operational efficiency, e.g., of users or other entities, by reducing screen clutter and by reducing the amount of time users have to search to locate a desired feature. Relationships between UIAs 126, types of UIAs 126, and user interfaces can be stored in any combination of memory in the UIA 126, memory in the computing device 102, or remote storage such as that hosted by the computing device 134.
In some examples, in response to detection by the accessory-identifying module 236 of a first user-input accessory, the UI-presentation module 250 can present for display a first user interface of an application. The first user interface can include at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory. In response to detection by the accessory-identifying module 236 of a second, different user-input accessory, the UI-presentation module 250 can present for display a second, different user interface of the application. The second user interface can have at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory. For example, the UI-presentation module 250 can retrieve a UI layout or a set of UI resources based at least in part on the identifier of the UIA 126, and arrange UI elements according to the retrieved layout or resources.
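As a sketch of retrieving a UI layout or set of UI resources by accessory identifier, consider the hypothetical registry below; the identifiers, file names, and control lists are made up for this example.

# Hypothetical registry of UI definitions keyed by accessory identifier.
UI_DEFINITIONS = {
    "uia-basic": {"layout": "basic.ui", "controls": ["core"]},
    "uia-pro":   {"layout": "pro.ui",   "controls": ["core", "advanced"]},
}

def select_ui(accessory_id):
    """Pick the UI definition for the detected accessory; unknown accessories fall back to the basic layout."""
    return UI_DEFINITIONS.get(accessory_id, UI_DEFINITIONS["uia-basic"])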
In some examples, the two user interfaces can include a basic interface and a “pro” interface. In some examples, the first user interface can include one or more user-interface controls. The second user interface can include the one or more user-interface controls from the first user interface, and one or more additional user-interface controls. For example, in a word-processing program, the first user interface can include a text-editing control and buttons for bold, italic, and underline. The second user interface can include the text-editing control and the buttons for bold, italic, and underline, and also buttons to invoke complex paragraph- and page-formatting dialogs.
In some examples, the two user interfaces can include a limited-functionality interface and a full-functionality (or less-limited-functionality) interface. In some examples, the first user interface can include one or more user-interface controls presented in a disabled state. For example, the user-interface controls can be grayed out, nonresponsive to events or attempts to actuate them, or fixed in value. The second user interface can include the one or more user-interface controls presented in an enabled state. For example, in the second user interface, the controls can be presented in colors used for other active controls, can respond to events, or can vary in value. For example, in a digital audio workstation (DAW) application, the first user interface can include a mixer with four volume sliders that are responsive to user inputs and four volume sliders that are locked at −∞ dB. The second user interface can include all eight volume sliders responsive to user inputs.
In some examples, the two user interfaces can present elements of the user interface in respective, different orders. In some examples, the first user interface can include one or more content elements presented in a first arrangement, e.g., a first order, sequence, or spatial layout. The second user interface can include the one or more content elements presented in a second, different arrangement. For example, the content elements can include survey questions or test questions, and the questions can be presented in different orders to users with different UIAs 126. This can provide randomization of the population of users answering the questions and reduce bias in the answers due to the ordering of the questions. In some examples, the order of elements can be stored in a database and the UI-presentation module 250 can query the database with the identifier of the UIA 126.
Illustrative Processes
FIG. 5 is a flow diagram that illustrates an example process 500 for selecting and operating an application using a computing device, e.g., computing device 200, FIG. 2. Example functions shown in FIG. 5 and other flow diagrams and example processes herein can be implemented on or otherwise embodied in one or more computing devices 102 or 134, e.g., using software running on such devices. In some examples, functions shown in FIG. 5 can be implemented in an operating system (OS), a hardware abstraction layer (HAL), or a device-management subsystem.
For the sake of illustration, the example process 500 is described below with reference to components of environment 100, FIG. 1, processing unit 224 and other components of computing device 200, FIG. 2, or computing device 300, FIG. 3, that can carry out or participate in the steps of the exemplary method. However, other processing units such as processing unit 108 and/or other components of computing devices 102 can carry out steps of described example processes such as process 500. Similarly, exemplary methods shown in FIGS. 6, 7, and 8 are also not limited to being carried out by any particularly-identified components.
The order in which the operations are described in each example flow diagram or process is not intended to be construed as a limitation, and any number of the described operations can be combined in any order and/or in parallel to implement each process. In some nonlimiting examples, either of blocks 506 and 508 can be performed before the other, or block 508 can be performed before block 504. Moreover, the operations in each of FIGS. 5, 6, 7, and 8 can be implemented in hardware, software, and/or a combination thereof. In the context of software, the operations represent computer-executable instructions that, when executed by one or more processors, cause one or more processors to perform the recited operations. In the context of hardware, the operations represent logic functions implemented in circuitry, e.g., datapath-control and finite-state-machine sequencing functions.
At block 502, in some examples, the accessory-identifying module 236 can wirelessly detect a first identifier corresponding to a user-input accessory, e.g., by reading an RFID tag on the UIA 126. This can be done, e.g., as described above with reference to the wireless interrogator 122, FIG. 1, or the accessory-identifying module 236, FIG. 4.
At block 504, in some examples, the object-identifying module 242 can wirelessly detect a second identifier corresponding to an object, e.g., by reading an RFID tag on the object. This can be done, e.g., as described above with reference to the wireless interrogator 122, FIG. 1, or the object-identifying module 242, FIG. 4.
At block 506, the force-analysis module 246 can determine spatial data of the object using the force sensor 124. The spatial data can include at least location data of the object or shape data of the object. This can be done, e.g., as discussed above with reference to the force sensor 124 or the force-analysis module 246, FIG. 4.
At block 508, the spawning module 240 can execute a software application corresponding to the first identifier, e.g., by calling CreateProcess on a Windows system or fork() and exec() on a UNIX system. This can be done, e.g., as discussed above with reference to the spawning module 240, FIG. 4. As noted above, in some examples, if the software application corresponding to the first identifier is not available, block 508 can include executing an installation package of the software application or executing a user interface configured to permit the user to locate, download, or purchase the software application. Block 508 can then include executing the software application when the installation, download, or purchase is complete.
At block 510, the force-analysis module 246 can provide the spatial data to the software application. This can be done, e.g., using IPC or other techniques discussed above with reference to the force-analysis module 246, FIG. 4.
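Tying blocks 502-510 together, a purely illustrative orchestration might look like the sketch below; the interrogator and force-sensor driver methods are hypothetical, and the helper functions are the sketches introduced earlier in this description, not required components.

def process_500(interrogator, force_sensor, mapping_store, public_key_pem):
    """Illustrative composition of blocks 502-510 under assumed driver APIs."""
    first_id = interrogator.read_accessory_tag()          # block 502 (hypothetical driver call)
    second_id = interrogator.read_object_tag()            # block 504 (hypothetical driver call)
    force_map = force_sensor.read_frame()                 # raw force bitmap (hypothetical)
    spatial = object_centroid(force_map)                  # block 506
    app_id = select_application(mapping_store, first_id, public_key_pem)
    process = spawn({"command": app_id})                  # block 508
    publish_force_event(second_id, force_map)             # block 510
    return process, spatial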
FIG. 6 is a flow diagram that illustrates an example process 600 for operation of an application, e.g., by presenting a representation of a physical object using a computing device, e.g., computing device 200, FIG. 2.
At block 602, the UI-presentation module 250 can present for display a user interface. The user interface can have at least one content element or presentation element determined based at least in part on an identifier of a UIA 126. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
In some examples, block 602 can include presenting a plurality of content elements in an arrangement based at least in part on the identifier of the user-input accessory. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
At block 604, the representation-determining module 252 can determine a representation of an object. The representation can be determined based at least in part on an identifier of the object. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
At block 606, the object-presentation module 254 can present for display the representation of the object. The representation can be presented arranged in the user interface based at least in part on spatial data of the object. The spatial data of the object can include at least location data of the object or shape data of the object. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
In some examples, block 606 can include presenting the representation of the object in a three-dimensional virtual environment of the user interface. An extent or a position of the representation of the object in a first dimension can be determined based at least in part on the spatial data of the object. An extent or a position of the representation of the object in a second, different dimension can be determined based at least in part on the identifier of the object. For example, the location of the representation can be determined based on the spatial data, and the height of the representation can be retrieved from a database queried by the identifier of the object. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
FIG. 7 is a flow diagram that illustrates an example process 700 for operating an application based on spatial data of one or more objects, e.g., by presenting a representation of a physical object using a computing device, e.g., computing device 200, FIG. 2. Block 602 can also be used with process 700 but is omitted from FIG. 7 for brevity. Block 604 can be followed by block 606, block 702, or block 710. The dash styles of lines in FIG. 7 are solely for clarity.
At block 702, the force-detecting module 244 or the force-analysis module 246 can receive the spatial data of the object from a force sensor having a plurality of sensing elements distributed across a sensing surface. This can be done, e.g., as described above with reference to the force-detecting module 244 or the force-analysis module 246, FIG. 4. Block 702 can be followed by block 606, which can be followed by block 704 or block 706.
At block 704, the UI-presentation module 250 or the object-presentation module 254 can present for display a goal representation arranged in the user interface based at least in part on a spatial target. This can be done, e.g., as described above with reference to the outlines 308, FIG. 3, or the UI-presentation module 250 or the object-presentation module 254, FIG. 4. The goal representation can be used for, e.g., educational software applications 208 such as shape-sorting applications.
At block 706, the UI-presentation module 250 or the object-presentation module 254 can determine whether the spatial data of the object correspond to a spatial target. If so, the next block can be block 708. This determination can be made whether or not a goal representation was displayed (block 704). For example, a goal representation can be displayed for a shape-sorting application and omitted for a tangram application. This can be done, e.g., as described above with reference to the outlines 308, FIG. 3, or the UI-presentation module 250 or the object-presentation module 254, FIG. 4.
At block 708, in response to a positive determination at block 706, the UI-presentation module 250 or the object-presentation module 254 can present for display a success indicator. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
At block 710, the UI-presentation module 250 or the representation-determining module 252 can determine whether a second object is in a selected spatial relationship with the object based at least in part on the spatial data of the object and spatial data of the second object. If so, the next block can be block 712. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
At block 712, the representation-determining module 252 can determine a representation of the second object, the representation based at least in part on an identifier of the second object. This can be done, e.g., as described above with reference to the representation-determining module 252, FIG. 4.
At block 714, the object-presentation module 254 can present for display the representation of the second object arranged in the user interface based at least in part on the spatial data of the object and the spatial data of the second object. The spatial data of the second object can include at least location data of the second object or shape data of the second object. This can be done, e.g., as described above with reference to the object-presentation module 254, FIG. 4.
FIG. 8 is a flow diagram that illustrates an example process 800 for selecting a user interface for an application using a computing device, e.g., computing device 200, FIG. 2. In some examples, the user interface can be selected based upon a user-input accessory. This can provide the user an efficient experience with the particular UIA 126 that user has selected.
At block 802, the accessory-identifying module 236 can detect a UIA 126. This can be done, e.g., by wirelessly detecting an identifier of the UIA 126, e.g., as discussed above with reference to the wireless interrogator 122, FIG. 1.
At block 804, the accessory-identifying module 236, the app-determining module 238, or the UI-presentation module 250 can determine whether the detected UIA 126 is a first UIA or a second, different UIA, e.g., as discussed above with reference to FIG. 4. Block 804 can also distinguish between more than two UIAs or types of UIAs. Block 804 can be followed by, e.g., block 806 or block 808 (or other blocks omitted for brevity) depending on the type of the UIA 126.
At block 806, in response to detection of the first UIA (blocks 802, 804), the UI-presentation module 250 can present for display a first user interface of an application. The first user interface can have at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory.
At block 808, in response to detection of the second UIA (blocks 802, 804), the UI-presentation module 250 can present for display a second user interface of the application. The second user interface can have at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory. The second user interface can be different from the first user interface. For example, the second user interface can have at least one content element or presentation element the first user interface lacks, or vice versa.
In some examples, the first user interface can include one or more user-interface controls and the second user interface can include the one or more user-interface controls and one or more additional user-interface controls. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
In some examples, the first user interface can include one or more user-interface controls presented in a disabled state and the second user interface can include the one or more user-interface controls presented in an enabled state. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
In some examples, the first user interface can include one or more content elements presented in a first arrangement and the second user interface can include the one or more content elements presented in a second, different arrangement. This can be done, e.g., as described above with reference to the UI-presentation module 250, FIG. 4.
Example Clauses
A: A device comprising: a wireless interrogator configured to: wirelessly detect a first identifier associated with a tagged user-input accessory in operational proximity to the wireless interrogator; and wirelessly detect a second identifier associated with a tagged object in operational proximity to the wireless interrogator; a force sensor having a sensing surface; one or more computer-readable media having stored thereon a plurality of modules; and one or more processing units operably coupled to the wireless interrogator, the force sensor, and at least one of the computer-readable media, the processing unit adapted to execute modules of the plurality of modules comprising: a launching engine configured to: determine a software application corresponding to the first identifier; and execute the determined software application; and an interaction engine configured to: detect a force exerted against the sensing surface by the object; and provide to the software application information of the second identifier and information of the detected force.
B: A device as recited in paragraph A, wherein the force sensor is configured to detect spatially-varying forces across the sensing surface.
C: A device as recited in paragraph B, wherein the interaction engine is further configured to: determine a location of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined location.
D: A device as recited in paragraph B or C, wherein the interaction engine is further configured to: determine a shape of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined shape.
E: A device as recited in any of paragraphs B-D, wherein the force sensor comprises a plurality of sensing elements distributed across the sensing surface.
F: A device as recited in any of paragraphs A-E, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings.
G: A device as recited in any of paragraphs A-F, further comprising a communications interface, wherein the launching engine is further configured to determine the software application corresponding to the first identifier by transmitting the first identifier via the communications interface and receiving an indication of the software application via the communications interface.
H: A device as recited in any of paragraphs A-G, wherein the wireless interrogator comprises a radio-frequency identification (RFID) reader configured to wirelessly detect the identifiers of RFID-tagged ones of the tagged objects.
I: A device as recited in any of paragraphs A-H, further comprising a mounting feature configured to retain the user-input accessory in operational relationship with the sensing surface.
J: A device as recited in any of paragraphs A-I, wherein the user-input accessory comprises a pad configured to overlie the force sensor.
K: A device as recited in any of paragraphs A-J, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings, the determining including selecting, from the one or more mappings, a candidate mapping corresponding to the first identifier; and verifying a cryptographic signature of the candidate mapping.
L: A device as recited in any of paragraphs A-K, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute an installation package of the determined software application.
M: A device as recited in any of paragraphs A-L, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute a user interface configured to permit the user to locate, download, or purchase the determined software application.
N: A device as recited in paragraph M, wherein the user interface is configured to, when executed, present an indication of the determined software application; receive payment information; and, at least partly in response to the received payment information, download the determined software application.
O: A device as recited in any of paragraphs A-N, wherein the user-input accessory includes a pad configured to overlie the sensing surface.
P: A computer-implemented method comprising: presenting for display a user interface having at least one content element or presentation element determined based at least in part on an identifier of a user-input accessory; determining a representation of an object, the representation based at least in part on an identifier of the object; and presenting for display the representation of the object arranged in the user interface based at least in part on spatial data of the object, wherein the spatial data of the object comprises at least location data of the object or shape data of the object.
Q: A computer-implemented method as recited in paragraph P, further comprising: determining that the spatial data of the object correspond to a spatial target; and in response, presenting for display a success indicator.
R: A computer-implemented method as recited in paragraph Q, further comprising presenting for display a goal representation arranged in the user interface based at least in part on the spatial target.
S: A computer-implemented method as recited in any of paragraphs P-R, wherein: the presenting the representation of the object comprises presenting the representation of the object in a three-dimensional virtual environment of the user interface; a geometry of the representation of the object in a first dimension is determined based at least in part on the spatial data of the object; and a geometry of the representation of the object in a second, different dimension is determined based at least in part on the identifier of the object.
T: A computer-implemented method as recited in any of paragraphs P-S, further comprising: determining that a second object is in a selected spatial relationship with the object based at least in part on the spatial data of the object and spatial data of the second object; determining a representation of the second object, the representation based at least in part on an identifier of the second object; and presenting for display the representation of the second object arranged in the user interface based at least in part on the spatial data of the object and the spatial data of the second object, wherein the spatial data of the second object comprises at least location data of the second object or shape data of the second object.
U: A computer-implemented method as recited in any of paragraphs P-T, wherein the presenting the user interface comprises presenting a plurality of content elements in an arrangement based at least in part on the identifier of the user-input accessory.
V: A computer-implemented method as recited in any of paragraphs P-U, further comprising receiving the spatial data of the object from a force sensor having a plurality of sensing elements distributed across a sensing surface.
W: A computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations as recited in any of paragraphs P-V.
X: A device comprising: a processor; and a computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution by the processor configuring the device to perform operations as recited in any of paragraphs P-V.
Y: A system comprising: means for processing; and means for storing having thereon computer-executable instructions, the computer-executable instructions including means to configure the system to carry out a method as recited in any of paragraphs P-V.
Z: A computer-readable medium having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations comprising: detecting a first user-input accessory and, in response, presenting for display a first user interface of an application, the first user interface having at least one content element or presentation element determined based at least in part on a first identifier corresponding to the first user-input accessory; and detecting a second, different user-input accessory and, in response, presenting for display a second, different user interface of the application, the second user interface having at least one content element or presentation element determined based at least in part on a second identifier corresponding to the second user-input accessory.
AA: A computer-readable medium as recited in paragraph Z, wherein the first user interface includes one or more user-interface controls and the second user interface includes the one or more user-interface controls and one or more additional user-interface controls.
AB: A computer-readable medium as recited in paragraph Z or AA, wherein the first user interface includes one or more user-interface controls presented in a disabled state and the second user interface includes the one or more user-interface controls presented in an enabled state.
AC: A computer-readable medium as recited in any of paragraphs Z-AB, wherein the first user interface includes one or more content elements presented in a first arrangement and the second user interface includes the one or more content elements presented in a second, different arrangement.
AD: A device comprising: a processor; and a computer-readable medium as recited in any of paragraphs Z-AC.
AE: A system comprising: means for processing; and a computer-readable medium as recited in any of paragraphs Z-AC, the computer-readable medium storing instructions executable by the means for processing.
AF: A device comprising: one or more computer-readable media having stored thereon a plurality of modules; and one or more processing units operably coupled to at least one of the computer-readable media, the processing unit adapted to execute modules of the plurality of modules comprising: a launching engine configured to: detect a first identifier corresponding to a user-input accessory; determine a software application corresponding to the first identifier; and execute the determined software application; and an interaction engine configured to: detect a second identifier corresponding to an object; detect a force exerted by the object; and provide to the software application information of the second identifier and information of the detected force.
AG: A device as recited in paragraph AF, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings, the determining including selecting, from the one or more mappings, a candidate mapping corresponding to the first identifier; and verifying a cryptographic signature of the candidate mapping.
AH: A device as recited in paragraph AF or AG, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute an installation package of the determined software application.
AI: A device as recited in any of paragraphs AF-AH, wherein the launching engine is configured to, if the determined software application is not available to be executed, execute a user interface configured to permit the user to locate, download, or purchase the determined software application.
AJ: A device as recited in paragraph AI, wherein the user interface is configured to, when executed, present an indication of the determined software application; receive payment information; and, at least partly in response to the received payment information, download the determined software application.
AK: A device as recited in any of paragraphs AF-AJ, further comprising a sensing surface, wherein the interaction engine is configured to detect spatially-varying forces exerted by the object across the sensing surface.
AL: A device as recited in paragraph AK, wherein the interaction engine is further configured to: determine a location of the object based at least in part on the detected spatially-varying forces; and provide, to the software application (e.g., the determined software application), information of the determined location.
AM: A device as recited in paragraph AK or AL, wherein the interaction engine is further configured to: determine a shape of the object based at least in part on the detected spatially-varying forces; and provide, to the software application, information of the determined shape.
AN: A device as recited in any of paragraphs AK-AM, further comprising a plurality of sensing elements distributed across the sensing surface.
AO: A device as recited in any of paragraphs AK-AN, wherein the user-input accessory includes a pad configured to overlie the sensing surface.
AP: A device as recited in any of paragraphs AF-AO, wherein the one or more computer-readable media have stored thereon one or more mappings of identifiers to respective software applications and the launching engine is further configured to determine the software application corresponding to the first identifier using the one or more mappings.
AQ: A device as recited in any of paragraphs AF-AP, further comprising a communications interface, wherein the launching engine is further configured to determine the software application corresponding to the first identifier by transmitting the first identifier via the communications interface and receiving an indication of the software application via the communications interface.
AR: A device as recited in any of paragraphs AF-AQ, further comprising a radio-frequency identification (RFID) reader configured to wirelessly detect the identifiers of RFID-tagged ones of the tagged objects.
AS: A device as recited in any of paragraphs AF-AR, further comprising a mounting feature configured to retain the user-input accessory in operational relationship with the sensing surface.
AT: A computer-implemented method comprising: wirelessly detecting a first identifier corresponding to a user-input accessory; wirelessly detecting a second identifier corresponding to an object; determining spatial data of the object using a force sensor, wherein the spatial data comprises at least location data of the object or shape data of the object; executing a software application corresponding to the first identifier; and providing the spatial data to the software application.
AU: A computer-implemented method as recited in paragraph AT, wherein the determining spatial data comprises detecting spatially-varying forces across the sensing surface using the force sensor.
AV: A computer-implemented method as recited in paragraph AU, wherein the determining spatial data comprises determining the location data of the object based at least in part on the detected spatially-varying forces.
AW: A computer-implemented method as recited in paragraph AU or AV, wherein the determining spatial data comprises determining the shape data of the object based at least in part on the detected spatially-varying forces.
AX: A computer-implemented method as recited in any of paragraphs AT-AW, wherein the executing the software application comprises determining the software application corresponding to the first identifier using one or more stored mappings of identifiers to respective software applications.
AY: A computer-implemented method as recited in any of paragraphs AT-AX, wherein the executing the software application comprises transmitting the first identifier via a communications interface and receiving an indication of the software application via the communications interface.
AZ: A computer-implemented method as recited in any of paragraphs AT-AY, wherein the wirelessly detecting the first identifier comprises retrieving the first identifier from a radio-frequency identification (RFID) tag of the user-input accessory.
BA: A computer-implemented method as recited in any of paragraphs AT-AZ, wherein the wirelessly detecting the second identifier comprises retrieving the second identifier from an RFID tag of the object.
BB: A computer-implemented method as recited in any of paragraphs AT-BA, wherein the executing the software application comprises selecting, from one or more stored mappings of identifiers to respective software applications, a candidate mapping corresponding to the first identifier; verifying a cryptographic signature of the candidate mapping; and, in response to the verifying, executing the software application indicated in the candidate mapping.
BC: A computer-implemented method as recited in any of paragraphs AT-BB, wherein the executing the software application comprises executing an installation package of the software application in response to the software application being unavailable to be executed.
BD: A computer-implemented method as recited in any of paragraphs AT-BC, wherein the executing the software application comprises, if the software application is not available to be executed, executing a user interface configured to permit the user to locate, download, or purchase the determined software application.
BE: A computer-implemented method as recited in paragraph BD, wherein the executing the user interface includes presenting an indication of the determined software application; receiving payment information; and, at least partly in response to the received payment information, downloading the determined software application.
BF: A computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution configuring a computer to perform operations as recited in any of paragraphs AT-BE.
BG: A device comprising: a processor; and a computer-readable medium, e.g., a computer storage medium, having thereon computer-executable instructions, the computer-executable instructions upon execution by the processor configuring the device to perform operations as recited in any of paragraphs AT-BE.
BH: A system comprising: means for processing; and means for storing having thereon computer-executable instructions, the computer-executable instructions including means to configure the system to carry out a method as recited in any of paragraphs AT-BE.
CONCLUSION
Application selection and operation techniques described herein can reduce the amount of time required to locate and execute a software application corresponding to a user-input accessory. This can provide increases in operational efficiency, e.g., in user efficiency and satisfaction. Application operation techniques described herein can provide rapid feedback or evaluation regarding the placement of objects on a force sensor. This can reduce the effort required for users or other entities to manipulate visual objects, increasing operational efficiency in such manipulations.
Although the techniques have been described in language particular to structural features or methodological acts, it is to be understood that the appended claims are not necessarily limited to the features or acts described. Rather, the features and acts are described as example implementations of such techniques.
The operations of the example processes are illustrated in individual blocks and summarized with reference to those blocks. The processes are illustrated as logical flows of blocks, each block of which can represent one or more operations that can be implemented in hardware, software, or a combination thereof. In the context of software, the operations represent computer-executable instructions stored on one or more computer-readable media that, when executed by one or more processors, enable the one or more processors to perform the recited operations. Generally, computer-executable instructions include routines, programs, objects, modules, components, data structures, and the like that perform particular functions or implement particular abstract data types. The order in which the operations are described is not intended to be construed as a limitation, and any number of the described operations can be executed in any order, combined in any order, subdivided into multiple sub-operations, and/or executed in parallel to implement the described processes. The described processes can be performed by resources associated with one or more computing devices 102, 134, 200, or 300 such as one or more internal or external CPUs or GPUs, and/or one or more pieces of hardware logic such as FPGAs, DSPs, or other types described above.
All of the methods and processes described above can be embodied in, and fully automated via, software code modules executed by one or more general purpose computers or processors. The code modules can be stored in any type of computer-readable storage medium or other computer storage device. Some or all of the methods can be embodied in specialized computer hardware.
Conditional language such as, among others, “can,” “could,” “might” or “may,” unless specifically stated otherwise, is understood within the context to present that certain examples include, while other examples do not include, certain features, elements and/or steps. Thus, such conditional language is not generally intended to imply that certain features, elements and/or steps are in any way required for one or more examples or that one or more examples necessarily include logic for deciding, with or without user input or prompting, whether certain features, elements and/or steps are included or are to be performed in any particular example. Conjunctive language such as the phrases “X, Y or Z” and “at least one of X, Y or Z,” unless specifically stated otherwise, is to be understood to present that an item, term, etc. can be either X, Y, or Z, or a combination thereof.
The disclosure includes combinations of the examples described herein. References to a particular “example” and the like refer to features that are present in at least one example or configuration of what is within the scope of the disclosed subject matter. Separate references to “an example” or “particular examples” or the like do not necessarily refer to the same example or examples; however, such examples are not mutually exclusive, unless specifically indicated. The use of singular or plural in referring to “example,” “examples,” “method,” “methods” and the like is not limiting.
Any routine descriptions, elements or blocks in the flow diagrams described herein and/or depicted in the attached figures should be understood as potentially representing modules, segments, or portions of code that include one or more executable instructions for implementing particular logical functions or elements in the routine. Alternative implementations are included within the scope of the examples described herein in which elements or functions can be deleted, or executed out of order from that shown or discussed, including substantially synchronously or in reverse order, depending on the functionality involved as would be understood by those skilled in the art. It should be emphasized that many variations and modifications can be made to the above-described examples, the elements of which are to be understood as being among other acceptable examples. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.