RELATED APPLICATIONS

This application claims priority to U.S. Provisional Application No. 62/249,884, entitled “Compound Data Types,” filed on Nov. 2, 2015; U.S. Provisional Application No. 62/249,869, entitled “Rich Data Types,” filed on Nov. 2, 2015; U.S. Provisional Application No. 62/357,292, entitled “Compound Data Objects,” filed on Jun. 30, 2016; U.S. Provisional Application No. 62/357,284, entitled “Rich Data Types,” filed on Jun. 30, 2016; and U.S. Provisional Application No. 62/357,363, entitled “Dynamic Data Associated with Cells in Spreadsheets,” filed on Jun. 30, 2016; the entire disclosures of which are hereby incorporated herein by reference.
BACKGROUND

Today, while spreadsheet data can be charted, objects within a spreadsheet, such as images, sound files, videos, etc., are not handled as first-class data and cannot be charted. Although modern reports often include infographics or other objects for facilitating the presentation of data, current spreadsheets are unable to generate such robust reports through charting. Accordingly, current spreadsheets are ill-suited for providing such new features in a visual, sensory-driven world.
It is with respect to these and other general considerations that embodiments have been described. Also, although relatively specific problems have been discussed, it should be understood that the embodiments should not be limited to solving the specific problems identified in the background.
SUMMARY

The disclosure generally relates to a system and methods for charting streaming data and/or attributes of streaming data in a spreadsheet. In aspects, when streaming data is associated with a spreadsheet, the streaming data may be treated as a new type of data within the spreadsheet. In some aspects, streaming data may be associated within a single cell or within a range of cells. In further aspects, both streaming data and additional data may be associated with a single cell. According to the present methods, the streaming data itself, as well as parameters of the streaming data such as a protocol, a format, a packet rate, a packet size, etc., may be retrieved (e.g., from metadata) and incorporated into a report (e.g., a chart) using a spreadsheet charting function. In further aspects, streaming data may be passed to third party services for processing, which may analyze and return additional parameters for charting, such as health or other statistics associated with the streaming data, analysis and/or predictions based on the streaming data, compilations of the streaming data, and the like.
In aspects, a method for charting streaming data associated with a spreadsheet is provided. The method includes receiving a selection of one or more cells within a spreadsheet and identifying streaming data associated with the selected one or more cells. The method further includes receiving a charting function and charting the streaming data.
In further aspects, a system is provided that includes a processing unit and a memory storing computer executable instructions that, when executed by the processing unit, cause the system to receive a selection of one or more cells of a spreadsheet and identify streaming data associated with the selected one or more cells. The computer executable instructions further causing the system to retrieve a plurality of values associated with the streaming data and chart the plurality of values.
In further aspects, a system is provided that includes a processing unit and a memory storing computer executable instructions that, when executed by the processing unit, cause the system to receive a selection of one or more cells of a spreadsheet and identify streaming data associated with the selected one or more cells. The computer executable instructions further causing the system to identify at least one device associated with the streaming data, retrieve one or more parameters associated with the streaming data, and chart the one or more parameters.
In still further aspects, a system is provided that includes a processing unit and a memory storing computer executable instructions that, when executed by the processing unit, cause the system to perform a method. The method includes receiving a selection of one or more cells within a spreadsheet and identifying an object associated with the selected one or more cells, where the object is associated with one or more parameters. The method further includes receiving a selection of a charting function and creating a chart based on incorporating the object into the chart or charting the one or more parameters of the object.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS

Non-limiting and non-exhaustive examples are described with reference to the following Figures.
FIG. 1 illustrates a system for creating a chart in a spreadsheet application based at least in part on an object within the spreadsheet, according to an example embodiment.
FIG. 2 illustrates a method for charting an object associated with a spreadsheet, according to an example embodiment.
FIG. 3 illustrates a method for customizing an image incorporated into a chart, according to an example embodiment.
FIG. 4 illustrates a method for manipulating an image associated with a spreadsheet that is incorporated into a chart, according to an example embodiment.
FIG. 5 illustrates a method for incorporating an image into a chart in response to satisfaction of a condition, according to an example embodiment.
FIG. 6 illustrates a method for selecting and incorporating an image into a chart, according to an example embodiment.
FIG. 7 illustrates a method for playing an audio file incorporated into a chart, according to an example embodiment.
FIG. 8 illustrates a method for manipulating an audio file associated with a spreadsheet that is incorporated into a chart, according to an example embodiment.
FIG. 9 illustrates a method for transcribing a chart into speech and incorporating an audio file of the speech into the chart, according to an example embodiment.
FIG. 10 illustrates a method for selecting and incorporating an audio file into a chart, according to an example embodiment.
FIG. 11 illustrates a method for incorporating an audio file into a chart in response to satisfaction of a condition, according to an example embodiment.
FIG. 12 illustrates a method for playing a video incorporated into a chart, according to an example embodiment.
FIG. 13 illustrates a method for charting streaming data that is associated with a spreadsheet, according to an example embodiment.
FIG. 14A illustrates an interface showing at least one image associated with one or more cells of a spreadsheet, according to an example embodiment.
FIG. 14B illustrates an interface showing a selection of cells associated with images in a spreadsheet, according to an example embodiment.
FIG. 14C illustrates an interface for selecting a charting function, according to an example embodiment.
FIG. 14D illustrates a bar chart incorporating images, according to an example embodiment.
FIG. 15A illustrates an interface showing a selected image associated with a compound data type represented by a record, according to an example embodiment.
FIG. 15B illustrates an interface for selecting a charting function, according to an example embodiment.
FIG. 15C illustrates a line graph charting values associated with images in compound data types, according to an example embodiment.
FIG. 16 illustrates a bar chart incorporating images in response to satisfaction of a condition, according to a first example embodiment.
FIG. 17 illustrates a bar chart incorporating a plurality of images within a single bar, according to an example embodiment.
FIG. 18 illustrates a bar chart incorporating an image in response to satisfaction of a condition, according to a second example embodiment.
FIG. 19A illustrates an interface showing a selected image associated with a compound data type represented by a record, according to an example embodiment.
FIG. 19B illustrates a popup interface for selecting a charting function for images in a spreadsheet, according to an example embodiment.
FIG. 19C illustrates a map chart with a card view of an image, according to an example embodiment.
FIG. 20A illustrates an interface showing audio files associated with cells of a spreadsheet, according to an example embodiment.
FIG. 20B illustrates an interface showing a selected audio file associated with a compound data type represented by a record, according to an example embodiment.
FIG. 20C illustrates a bar chart incorporating a plurality of audio files within a single bar, according to an example embodiment.
FIG. 21A illustrates an interface showing an audio file associated with a spreadsheet, the spreadsheet providing a charting menu specific to the audio file, according to an example embodiment.
FIG. 21B illustrates a scatter plot incorporating visual representations for a plurality of audio files as data points, according to an example embodiment.
FIG. 21C illustrates a scatter plot with one or more popup menus for performing transcription, according to an example embodiment.
FIG. 21D illustrates a scatter plot showing an audio transcription associated with a data point, according to an example embodiment.
FIG. 22A illustrates an interface showing a UI element for viewing and interacting with a plurality of images associated with a cell in a spreadsheet, according to an example embodiment.
FIG. 22B illustrates a map chart with an incorporated image, according to an example embodiment.
FIG. 22C illustrates a map chart with one or more popup menus for performing transcription, according to an example embodiment.
FIG. 22D illustrates a map chart including an audio transcription of the map chart, according to an example embodiment.
FIG. 23A illustrates a popup interface for selecting a charting function for images in a spreadsheet, according to an example embodiment.
FIG. 23B illustrates a bar chart incorporating a plurality of images in a single bar, according to an example embodiment.
FIG. 24A illustrates an interface showing videos and additional data associated with one or more cells of a spreadsheet, according to an example embodiment.
FIG. 24B illustrates a bar chart incorporating a video, according to an example embodiment.
FIG. 25A illustrates an interface for associating streaming data within a cell, according to a first example embodiment.
FIG. 25B illustrates an interface for associating streaming data within a cell, according to a second example embodiment.
FIG. 25C illustrates a line graph of heartrate values streamed from a device, according to an example embodiment.
FIG. 26 is a block diagram illustrating example physical components of a computing device with which aspects of the disclosure may be practiced.
FIGS. 27A and 27B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.
FIG. 28 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
FIG. 29 illustrates a tablet computing device for executing one or more aspects of the present disclosure.
DETAILED DESCRIPTION

In the following detailed description, references are made to the accompanying drawings that form a part hereof, and in which are shown, by way of illustration, specific embodiments or examples. These aspects may be combined, other aspects may be utilized, and structural changes may be made without departing from the present disclosure. Embodiments may be practiced as methods, systems or devices. Accordingly, embodiments may take the form of a hardware implementation, an entirely software implementation, or an implementation combining software and hardware aspects. The following detailed description is therefore not to be taken in a limiting sense, and the scope of the present disclosure is defined by the appended claims and their equivalents.
As described above, the disclosure generally relates to a system and methods for charting streaming data and/or attributes of streaming data in a spreadsheet. In aspects, when streaming data is associated with a spreadsheet, the streaming data may be treated as a new type of data within the spreadsheet. In some aspects, streaming data may be associated within a single cell or within a range of cells. In further aspects, both streaming data and additional data may be associated with a single cell (e.g., within a compound data type, as described herein). According to the present methods, the streaming data itself, as well as parameters of the streaming data such as a protocol, a format, a packet rate, a packet size, etc., and/or spreadsheet data (e.g., values in cells, user comments, etc.) may be incorporated into a report (e.g., a chart) using a spreadsheet charting function. It is with respect to these and other general considerations that embodiments have been made.
FIG. 1 illustrates a system for creating a chart in a spreadsheet application based at least in part on an object within the spreadsheet, according to an example embodiment.
System 100 may include one or more client computing devices 104 (e.g., client computing devices 104A and 104B) that may execute a client version of a spreadsheet application capable of charting objects associated with a spreadsheet. For example, charting objects may include incorporating an object into a chart and/or charting attributes associated with the object. A chart may include any type of chart, graph, table, or report, such as a bar chart, map chart, scatter plot, line graph, tree chart, pie chart, radar chart, and the like, in any suitable number of dimensions. Objects associated with a spreadsheet may include, for example, images, audio files, videos, streamed data, and the like. Streaming data may refer to any type of data provided via a communications connection (e.g., via Bluetooth®, cellular, WAN, LAN, wired or wireless media, etc.) over some period of time. For instance, streaming data may refer to streaming audio (e.g., podcast, music, audio book), streaming video (e.g., live sports broadcast, YouTube® video, third-party hosted video, multiple frames transmitted from a camera, recorded video transmitted from a mobile device or video recorder, etc.), data feeds (e.g., Twitter® feed, stock ticker, fitness data from a wearable device, medical data from a medical device, diagnostic data from a mechanical device, etc.), and the like. An object may be “associated” with the spreadsheet by being embedded in a cell of the spreadsheet, anchored to a cell of the spreadsheet, referenced by a formula, name, hyperlink or pointer within the spreadsheet, positioned within the same row or column as a selected cell of the spreadsheet, and the like.
In some examples, the client spreadsheet application may execute locally on a client computing device 104. In other examples, a client spreadsheet application (e.g., a mobile app on a thin client computing device 104) may operate in communication (e.g., via network 106) with a corresponding server version of spreadsheet application 110 executing on one or more server computing devices, e.g., server computing device 108. In still other aspects, rather than executing a client version of a spreadsheet application, the one or more client computing devices 104 may remotely access, e.g., via a browser over network 106, the spreadsheet application 110 implemented on the server computing device 108 or multiple server computing devices (e.g., server computing devices 122A and 122B in a distributed computing environment such as a cloud computing environment).
As illustrated by FIG. 1, a server version of spreadsheet application 110 is implemented by server computing device 108. As should be appreciated, the server version of spreadsheet application 110 may also be implemented in a distributed environment (e.g., cloud computing environment) across a plurality of server computing devices (e.g., server computing devices 122A and 122B). Moreover, as should be appreciated, either a client or a server version of the spreadsheet application 110 may be capable of charting objects associated with a spreadsheet. While a server version of the spreadsheet application 110 and associated components 112-120 are shown and described, this should not be understood as limiting. Rather, a client version of spreadsheet application 110 may similarly implement components 112-120 on a client computing device 104.
In a basic configuration, the one or more client computing devices 104 are personal or handheld computers having both input elements and output elements operated by one or more users 102 (e.g., user 102A and user 102B). For example, the one or more client computing devices 104 may include one or more of: a mobile telephone; a smart phone; a tablet; a phablet; a smart watch; a wearable computer; a personal computer; a desktop computer; a laptop computer; a gaming device/computer (e.g., Xbox®); a television; a household appliance; and the like. This list is exemplary only and should not be considered as limiting. Any suitable client computing device for executing a client spreadsheet application and/or remotely accessing spreadsheet application 110 may be utilized.
In some aspects, network 106 is a computer network such as an enterprise intranet and/or the Internet. In this regard, the network 106 may include a Local Area Network (LAN), a Wide Area Network (WAN), the Internet, and wireless and wired transmission mediums. In further aspects, server computing device 108 may communicate with some components of the system via a local network (e.g., an enterprise intranet), whereas server computing device 108 may communicate with other components of the system via a wide area network (e.g., the Internet). In addition, the aspects and functionalities described herein may operate over distributed systems (e.g., cloud computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet.
As described above, the spreadsheet application 110 may be implemented on a server computing device 108. In a basic configuration, server computing device 108 may include at least a processing unit and a system memory for executing computer-readable instructions. In some aspects, server computing device 108 may comprise one or more server computing devices 108 in a distributed environment (e.g., cloud computing environment). Server computing device 108 may provide data, including spreadsheet data, objects (e.g., images, audio files, videos, streaming data, and the like), object data and/or object attributes associated with the spreadsheet to and from the one or more client computing devices 104 and/or one or more other server computing devices (e.g., server computing devices 122A and/or 122B) via network 106.
As noted above, an object associated with a spreadsheet may include, for instance, an image, an audio file, a video, a link to streamed data, and the like. In other examples, an object may refer to any discrete data structure. For instance, objects may include shapes (e.g., a circle, triangle, square, etc.), diagrams (e.g., flow diagram, chart, tree, etc.)—essentially anything. With respect to images, each image may be stored as an image file in a file format identified by a file extension, such as .jpeg, .png, .gif, .tiff, etc., and may be retrieved based on a file locator, which may be a uniform resource locator (URL) identifying a file path to a local storage location or a remote storage location. In aspects, an image may be defined by image data (e.g., raw pixel data, an array of pixel values, or other data for rendering the image) and image attributes (e.g., opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc.). In some aspects, each image attribute may be defined by an attribute-value pair. That is, an image attribute (e.g., image height) may be paired with a value for that attribute (e.g., 1.04 inches) for a particular image. In other aspects, image attributes may be organized in any suitable structured format, e.g., an array of values, a record with an array of fields, a table, an array of vectors, etc.
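By way of non-limiting illustration, the attribute-value pairing described above may be sketched as follows (the attribute names and values shown are hypothetical examples, not part of any particular implementation):

```python
# Illustrative sketch: image attributes stored as attribute-value pairs.
# All names and values here are hypothetical.
image_attributes = {
    "opacity": 1.0,
    "resolution": (1920, 1080),
    "aspect_ratio": "16:9",
    "height_inches": 1.04,   # e.g., image height paired with its value
    "file_name": "elephant1.jpeg",
}

def get_attribute(attrs, name, default=None):
    """Return the value paired with an attribute name, if present."""
    return attrs.get(name, default)

print(get_attribute(image_attributes, "height_inches"))  # 1.04
```

The same pairing could equally be organized as a record with an array of fields, a table, or an array of vectors, as noted above.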
With respect to audio files, sound waves may be digitally encoded (e.g., by pulse-code modulation), in some cases processed (e.g., filtered, edited, etc.) and/or compressed (e.g., based on a codec to reduce file size), and stored as an audio file in a file format identified by a file extension, such as .wav, .wma, .aiff, .m4a, .snd, .mp3, .omf, etc. For example, a microphone of a mobile device may record (or capture) sound waves (e.g., of a conversation) and may convert the sound waves into an analog electric signal. An analog-to-digital converter (ADC) may then convert the analog signal into a digital signal, e.g., generally using pulse-code modulation. In some cases, the ADC may be available on the mobile device, while in other cases the analog signal may be downloaded from the mobile device and converted to a digital signal on another device (e.g., personal or server computing device). The digital signal may be processed and/or compressed and stored in a file format (e.g., audio data), as detailed above. Later, when an audio file is played, the digital signal may be converted back to an analog electrical signal using a digital-to-audio converter (DAC) for transmission to a speaker. An audio file may be defined by audio data (e.g., digital data encoding soundwaves) and audio attributes (e.g., frequency, amplitude, sampling rate, codec, bitrate, volume, pitch, speed, channel, audio effects, author/artist, creation date and/or time, file name, file size, duration, etc.).
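The relationship between the audio attributes described above (sampling rate, bit depth, channels, duration) and the size of an uncompressed pulse-code-modulated recording can be sketched as follows; the function name is illustrative only:

```python
def pcm_size_bytes(sampling_rate_hz, bit_depth, channels, duration_s):
    """Uncompressed PCM audio size: rate x sample width x channels x time."""
    return int(sampling_rate_hz * (bit_depth // 8) * channels * duration_s)

# One minute of 16-bit stereo audio sampled at 44.1 kHz:
size = pcm_size_bytes(44_100, 16, 2, 60)
print(size)  # 10584000 bytes (roughly 10 MB before compression)
```

This illustrates why a codec may be applied to reduce file size before the digital signal is stored in a format such as .mp3 or .wma.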
A video may refer to moving images, with or without an audio track, and a video file may encode data for rendering the moving images and playing the audio track, if included. The term “moving images” generally refers to an array of images (e.g., individual frames) that are shot in sequence over a time period (e.g., capture rate) and are then spliced together and “played” (or displayed) consecutively at a certain rate (e.g., frame rate). However, in aspects, any array of images (whether related or not) may be spliced together and played at a frame rate to create a video (or a slideshow). An audio track refers to any type of audio, e.g., speech, music, sounds, or any combination thereof, that is associated with a video, whether synchronized with individual frames or not. In some cases, an audio track may be recorded with a video (e.g., on a mobile device, video recorder, movie camera, etc.). In other aspects, an audio track may be added to a video at a later time and may be synchronized with individual frames of the video, or not. In aspects, a video file may include video data (e.g., an array of pixel values for rendering each individual frame of a video and/or modulated data for reproducing soundwaves of an audio track). A video file may further include video attributes (e.g., frame rate, aspect ratio, duration, resolution, bits per frame, video size, synchronization data, etc.), individual frame attributes (e.g., aspect ratio, color space, bitrate, etc.) and/or audio attributes (e.g., pitch, volume, speed, etc.).
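The frame-rate relationship described above, in which an array of frames is played consecutively at a certain rate, can be sketched with the following hypothetical helpers:

```python
def video_duration_seconds(frame_count, frame_rate_fps):
    """Duration of a clip whose frames are played at a constant frame rate."""
    return frame_count / frame_rate_fps

def frames_for(duration_s, frame_rate_fps):
    """Number of frames needed to fill a duration at a given frame rate."""
    return int(duration_s * frame_rate_fps)

print(video_duration_seconds(720, 24))  # 30.0 seconds
print(frames_for(10, 30))               # 300 frames
```

The same arithmetic applies whether the frames are related moving images or an arbitrary array of images spliced into a slideshow, as noted above.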
As detailed above, streamed data may include any type of data received over some type of communications connection for some period of time. Streamed data may include, for instance, streaming video, streaming audio or a data feed that is associated with the spreadsheet based on a hyperlink to a streaming device (e.g., camera, mobile device, medical device, fitness device, etc.), a URL referencing a third party service (e.g., YouTube®, Netflix®, Twitter®, Pandora®, Spotify®, etc.), or the like. For instance, references to URLs within the spreadsheet may include: =GETFEED(“https://twitter.com/hashtag/Fastcars?src=hash&lang=en”) or =GETSTREAM(“https://www.amazon.com/East/dp/B0152ZY7KQ/ref=sr_1_1?s=instant-video&ie=UTF8&qid=1466705933&sr=1-1&keywords=fast+cars”), etc. In some cases, the URL may specify a parameter to be ‘on’ or ‘play’ by default. In a first example, the function would return a handle to the data stream that may just show a blank screen or a first frame of data. In the case where the optional parameter is ‘true’ (it may be false by default to avoid performance issues), then the function would return the data stream and the data stream would start refreshing and playing its data. Syntax may include: “=GETVIDEO(“http://foo.com/bar/feed”, TRUE)”. Alternatively, in the case where the data stream is not played immediately, the data stream may only play on demand (via UI control, or via some other calling function or feature that points at that stream object and asks it to play). In a case where video is streamed directly from a camera, there may be additional parameters passed to control the device itself, e.g., OFF, STANDBY, RESET, etc. In other examples, streaming data may be associated with a cell without using a globally unique name or URL by selecting an “Insert” operation in the toolbar and using a dialog filtered to video types or streaming data types (e.g., streaming video, streaming audio, live data feeds, etc.) to find and insert the streaming data.
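The GETVIDEO behavior described above, returning a handle that does not play by default, may be sketched as follows. The StreamHandle class and its fields are assumptions made for illustration and do not reflect any actual spreadsheet implementation:

```python
# Hypothetical sketch of a GETVIDEO-style spreadsheet function.
class StreamHandle:
    """Handle to a data stream; shows a blank screen or first frame until played."""
    def __init__(self, url, playing=False):
        self.url = url
        self.playing = playing  # False by default to avoid performance issues

    def play(self):
        self.playing = True

    def stop(self):
        self.playing = False

def GETVIDEO(url, play=False):
    """Return a handle to the stream; starts refreshing/playing only if play=True."""
    return StreamHandle(url, play)

h = GETVIDEO("http://foo.com/bar/feed", True)
print(h.playing)  # True: the stream starts playing immediately
```

When the optional parameter is omitted, the handle may instead be played on demand via a UI control or another calling function, consistent with the alternative described above.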
In further aspects, the dynamic data (e.g., video file, live data feed, streaming audio or streaming video) and attributes (e.g., image attributes, audio attributes and/or video attributes) may be associated with additional data (e.g., data describing the content of the dynamic data, a text transcription of an audio track, or any other data) in a single compound data type. Additionally, compound data types can hold multiple data streams and/or multiple videos, etc. Moreover, an application program interface (API) may be provided that can talk back to the data stream. This type of functionality allows two things: first, the data stream may be triggered and/or controlled (in the case of attached devices); and second, the data stream may be played back, e.g., started, stopped, locked or refreshed, via user interface controls. Additionally, for data streams that allow it (e.g., delayed live TV feeds), a ‘look ahead’ buffer may be enabled such that the next steps in calculations may be modeled theoretically. In a more advanced case of an active ‘look ahead,’ one function may handle calculations based on real-time data (e.g., current data) and a predictive function may run on an offset that anticipates data five seconds ahead of the current data. The results of these functions may be compared (e.g., as a delta) in a moving calculation. By combining this theoretical model with talking back to the data stream, a device (e.g., having a steering wheel) may be controlled in real time via calculations in a spreadsheet. For instance, the device may be steered through a number of obstacles using a combination of cameras streaming locations of nearby obstacles as the device moves through a course. This implementation enables a steering model in the spreadsheet to calculate the results of steering the device through the obstacle course using a feedback loop, as described above.
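The ‘look ahead’ comparison described above, in which one function evaluates current data while a predictive function runs on an offset ahead of it, may be sketched as follows. The sample values and function names are hypothetical, and the predictor is a toy stand-in for a delayed feed that exposes data several steps ahead:

```python
# Sketch of a moving delta between current and look-ahead values.
def current_value(samples, t):
    return samples[t]

def look_ahead_value(samples, t, offset=5):
    # Toy predictor: a delayed feed exposes data `offset` steps ahead;
    # clamp at the end of the buffer.
    return samples[min(t + offset, len(samples) - 1)]

samples = [10, 12, 15, 14, 13, 18, 21, 20, 19, 22]  # hypothetical feed values
deltas = [look_ahead_value(samples, t) - current_value(samples, t)
          for t in range(len(samples))]
print(deltas)
```

Each delta compares the anticipated value against the current one; in the steering example above, such deltas could feed back into a control calculation in the spreadsheet.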
Data attributes may describe the streaming data. For instance, streaming data may be transmitted based on a protocol in a format at a packet rate, and may include a packet size. In some cases, data attributes may be described by attribute-value pairs and/or attribute-type pairs. For instance, for attribute “protocol” a type may be “TCP/IP”; for attribute “packet size” a value may be “64K”; and the like. In other cases, data associated with streaming data may not be represented as name/value pairs but may simply represent “data,” e.g., an array of pixel data, an array of values, etc. In still other cases, these types of properties may be considered distinct ‘rich types’ of data, which means the value may be converted to some other value. For example, in the case of “data rate” the user may be offered an interaction to down-convert to a lower rate, e.g., 1 Mb/s to 0.5 Mb/s. This conversion may require a reduction in resolution, frame rate, or some other property of the video. In the case of video resolution, a user may start with 4K and the act of changing to 1024×768 may trigger a resample of the video. The logic to do this type of conversion may be bundled up in a user defined function (UDF), which is a customized function that may be created by a user directly within a spreadsheet. In some aspects, the value changes to resolution, size, etc., may be written back to the source, or may be a property of a ‘display format’ that is applied when the video is played back on the client side. As should be appreciated, the above examples of data attributes and other data are offered as examples only.
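A UDF bundling the down-conversion described above might look like the following sketch, which assumes (purely for illustration) that halving the data rate is achieved by scaling the frame rate proportionally:

```python
# Hypothetical UDF-style sketch: down-convert a video's data rate by
# reducing its frame rate in proportion to the requested reduction.
def DOWNCONVERT(data_rate_mbps, frame_rate_fps, target_rate_mbps):
    """Return (new data rate, new frame rate) after down-conversion."""
    if target_rate_mbps >= data_rate_mbps:
        return data_rate_mbps, frame_rate_fps  # nothing to reduce
    scale = target_rate_mbps / data_rate_mbps
    return target_rate_mbps, frame_rate_fps * scale

rate, fps = DOWNCONVERT(1.0, 30, 0.5)  # down-convert 1 Mb/s to 0.5 Mb/s
print(rate, fps)  # 0.5 15.0
```

In practice the reduction might instead come from resolution or another property, and, as noted above, the changed values might be written back to the source or applied only as a client-side ‘display format.’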
As should be further appreciated, attributes associated with objects may be organized and stored in any suitable data structure. In further examples, object attributes may be appended as metadata to the object (e.g., image file, audio file, video file, data stream, etc.). For instance, as illustrated, metadata 130A may be appended to object 126A, metadata 130B may be appended to object 126B, and metadata 130C may be appended to object 126C and stored in storage location 124. Alternatively, the object attributes may be stored in a separate location or database from the objects and may be referenced by or otherwise indexed to the objects (not shown). In at least some examples, objects may be stored in different storage locations within a distributed environment (e.g., cloud computing environment) accessible to spreadsheet application 110 over a network, e.g., network 106. As described herein, the location of an object (e.g., image file, audio file, video, streaming data, etc.) in storage may be represented by a file locator or link, which may be a URL to local storage (e.g., C:\Pictures\elephant1.jpeg), a URL to remote storage accessible over a network (e.g., http://www.pics.com/tree.png), a live link to a streaming device, etc. Additionally, an object may be referenced by name (e.g., “elephant1.jpeg”) to locate it within the local workbook file. In other aspects, the object may be referenced within a function of the spreadsheet by a globally unique name. A globally unique name can be any string, e.g., “elephant,” or “elephant1” or “elephant.jpg,” that uniquely identifies the object within the spreadsheet workbook. For instance, to call the object from another cell in the spreadsheet, the function “=elephant” may be used in the cell.
However, if the same name is used on different sheets of a spreadsheet to return different values, the name may be qualified by the sheet on which it appears in order to create a globally unique name, e.g., “Sheet1!OctoberEarnings” and “Sheet2!OctoberEarnings.” In still other aspects, an object may be referenced by a cell address. In this case, for an object added to a spreadsheet in cell A1, the formula “=A1” will simply grab the object. If the object has a bitrate property, for example, another formula such as “=A1.bitrate” (e.g., written into cell B1) would access the object in cell A1 and retrieve the bitrate field (e.g., from metadata associated with the object). If no such bitrate field exists, the formula in B1 may error out. In this way, whether or not an object has been given a globally unique name, cell address dereferencing enables formulas to be written that operate on objects within cells.
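The cell-address dereferencing described above, where “=A1” returns the object and “=A1.bitrate” retrieves a field from the object’s metadata or errors out, may be sketched as follows. The cell store, field names, and error value are assumptions made for illustration:

```python
# Illustrative sketch of cell-address dereferencing for objects in cells.
cells = {
    "A1": {"_object": "video.mp4", "bitrate": "2 Mb/s"},  # hypothetical object
}

def dereference(formula):
    """Resolve '=A1' to the object in A1, or '=A1.field' to a metadata field."""
    ref = formula.lstrip("=")
    if "." in ref:
        addr, field = ref.split(".", 1)
        obj = cells[addr]
        if field not in obj:
            return "#FIELD!"  # the formula errors out, as described above
        return obj[field]
    return cells[ref]["_object"]

print(dereference("=A1.bitrate"))   # 2 Mb/s
print(dereference("=A1.duration"))  # #FIELD! (no such field on this object)
```

This is why, whether or not an object has a globally unique name, formulas written against cell addresses can still operate on the objects within those cells.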
As illustrated in FIG. 1, the spreadsheet application 110 may include various components for charting objects and/or object attributes associated with a spreadsheet, including a selection component 112, an object identifier 114, a parameter retriever 116, a charting component 118, a UX component 120, and the like. In aspects, each component may communicate and pass data between the other components. The various components may be implemented using hardware, software, or a combination of hardware and software. Moreover, the various components may be executed on a single server computing device (e.g., server computing device 108), on multiple server computing devices (e.g., server computing devices 122A, 122B and/or 128), or locally on a client computing device (e.g., client computing device 104A or 104B).
In aspects, selection component 112 may receive a selection of one or more cells of a spreadsheet. A cell or a range of cells may be selected, either automatically (e.g., based on a function) or by user selection. That is, in some aspects, operations may call (or select) a cell without requiring user input, e.g., operations such as ‘data refresh,’ ‘import from such and such data source,’ ‘copy paste,’ etc. In other aspects, a spreadsheet application (e.g., spreadsheet application 110) may provide the spreadsheet to a user, the spreadsheet including one or more sheets, each sheet having a plurality of rows and columns of cells. Cells may be selected by highlighting or otherwise identifying the cell or range of cells using a gesture, touch, mouse click, keyboard input, and the like. When a single cell is selected, the cell may be identified in a toolbar of the user interface by a cell identifier that specifies a location of the cell within the spreadsheet. For example, a cell identifier of “A1” specifies that the cell is located in column A, row 1 of the spreadsheet, while a cell identifier of “B5” specifies that the cell is located in column B, row 5 of the spreadsheet. The cell identifier may further be displayed adjacent to a formula bar (or “fx bar”) identifying the contents of the cell in the toolbar of the user interface. When a range of cells is selected, the cell at the top left corner of the range may be displayed by a cell identifier next to the formula bar, with the range of cells being represented by a range identifier including cell identifiers for the cell at the top left corner and the cell at the bottom right corner (e.g., A1:C5).
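The mapping between cell identifiers and spreadsheet locations described above can be sketched as follows; the helper names are illustrative, not part of the disclosure.

```python
import re

def parse_cell(cell_id):
    """'B5' -> ('B', 5): column B, row 5 of the spreadsheet."""
    match = re.fullmatch(r"([A-Z]+)(\d+)", cell_id)
    if not match:
        raise ValueError(f"not a cell identifier: {cell_id!r}")
    column, row = match.groups()
    return column, int(row)

def parse_range(range_id):
    """'A1:C5' -> (top-left cell, bottom-right cell)."""
    top_left, bottom_right = range_id.split(":")
    return parse_cell(top_left), parse_cell(bottom_right)

print(parse_cell("B5"))      # ('B', 5)
print(parse_range("A1:C5"))  # (('A', 1), ('C', 5))
```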
In other aspects, selection component 112 may receive a selection of an object (e.g., an image, audio file, video, streaming data, etc.) associated with the spreadsheet, either automatically or by user selection. That is, in some aspects, operations may call (or select) an object without requiring user input, e.g., operations such as ‘data refresh,’ ‘import from such and such data source,’ ‘copy paste,’ etc. In other aspects, a spreadsheet application (e.g., spreadsheet application 110) may provide the spreadsheet to a user, the spreadsheet including one or more sheets, each sheet having a plurality of rows and columns of cells. Objects within the spreadsheet (e.g., associated with one or more cells) may be selected by highlighting, by inputting a formula referencing the object or an object name, and/or by otherwise identifying the object and/or the cell(s) with which the object is associated using a gesture, touch, mouse click, keyboard input, and the like.
In examples where an object is not directly selected, an object identifier 114 may identify one or more objects associated with the selected one or more cells. As described above, an object may be associated with one or more cells by being directly embedded into the one or more cells. For instance, in the case of an image, image data and image attributes may be embedded in the one or more cells as a value and the image may be drawn in the same pixelated space within the spreadsheet as the one or more cells. In other cases, an object may be anchored within the one or more cells based on a function that references the object (e.g., =IMAGE(“http://www.mattspics.com/weather/rainyday.png”)). In still other cases, an object may be associated with the spreadsheet without a formula, for instance, a name for the object may simply be listed without the “=IMAGE” operator, e.g., “4two.png, red.png” or “&lt;Image&gt;4two.png, red.png”. In still other cases, an object may be associated with the selected one or more cells based on being associated with a cell that is related to the selected one or more cells. A cell may be related to the selected one or more cells based on being in the same row and/or column as at least one cell within the selected one or more cells.
In aspects, parameter retriever 116 may retrieve object data and/or object attributes for each object identified by the selection component 112 or the object identifier 114. Object data may include, for instance, image data (e.g., raw pixel data, an array of pixel values, or other data for rendering the image), audio data (e.g., digital data encoding soundwaves), video data (e.g., an array of pixel values for rendering each individual frame of a video and/or modulated data for reproducing soundwaves of an audio track), data values (e.g., individual stock prices for a particular stock over a period of time, heartrates over a period of time, insulin levels over a period of time, etc.), and the like. Object attributes may be descriptors, such as image attributes (e.g., opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc.), audio attributes (e.g., frequency, amplitude, sampling rate, codec, bitrate, volume, pitch, speed, channel, audio effects, author/artist, creation date and/or time, file name, file size, duration, etc.), video attributes (e.g., frame rate, aspect ratio, duration, resolution, bits per frame, video size, synchronization data, color space, bitrate, pitch, volume, speed, etc.), data attributes (e.g., packet rate, packet size, protocol, bandwidth, etc.), and the like. In some aspects, parameter retriever 116 may retrieve object data and/or object attributes from an object file or a stream of data. In further aspects, object attributes may be retrieved from metadata associated with an object file or a stream of data. In still further aspects, object data and/or object attributes may be retrieved from a compound data type associated with the object.
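The split between an object's data payload and its descriptive attributes can be sketched as follows. The dictionary structure and field names here are assumptions for illustration, not the disclosed storage format.

```python
def retrieve_parameters(obj):
    """Return (object_data, object_attributes) for a stored object.

    Object data is the renderable payload (pixels, samples, feed values);
    object attributes are descriptors read from the attached metadata.
    """
    data = obj.get("data")
    attributes = dict(obj.get("metadata", {}))
    return data, attributes

# A hypothetical audio object with metadata appended, as in FIG. 1.
audio = {
    "kind": "audio",
    "data": b"\x00\x01\x02",  # digital data encoding soundwaves
    "metadata": {"codec": "mp3", "bitrate": 192_000, "duration": 241.0},
}
data, attrs = retrieve_parameters(audio)
print(attrs["bitrate"])  # 192000
```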
In still further aspects, objects may be passed to a third party service for processing, which may return additional object parameters for charting, such as the number of people in a photo, the number of smiling people in a photo, the names of people in a photo, the type of animal in a photo, actors identified in a video, emotions detected via face recognition from a streaming video recording of a test group (or other group) watching a video or listening to an audio recording, emotions identified among persons within a video, and the like. As should be appreciated, object parameters may be retrieved for each object via any suitable means.
A compound data type may include data types such as: image data, image attributes, alphanumeric data, audio data, audio attributes, video data, video attributes, streamed data, data attributes, and the like. In aspects, a compound data type may reference a file (e.g., an image file, audio file, video file, etc.) or a data stream (e.g., stock ticker, Twitter® feed, camera feed, wearable device feed, etc.) that includes object data (e.g., image data, audio data, video data, data feed, etc.) and/or object attributes (e.g., image attributes, audio attributes, video attributes, data attributes, etc.) in a structured format. In some cases, where an object is associated with a compound data type, the structure of a compound data type may be leveraged in complex calculations, thereby providing a model for referencing and using different aspects of the data. For example, each component of a compound data type may be represented by a formula or a function. Such individual representation of components facilitates the creation of structures in a single cell where calculations can reference back to other components of the compound data type. For instance, any of the fields of the compound data type can be dereferenced and acted on. That is, a formula may be constructed to get the value of a special field (“=GetAttribute(&lt;field name&gt;)”), an operator may be used to get the value (e.g., the dot “.” operator, “=A1.aspectratio”), or a unique name may be used to get the value (e.g., if cell A1 has a unique name, “OctoberEarningsRecording.aspectratio”). In this way, each field is available to the charting component 118 described below.
Charting component 118 may create a chart. As described above, a chart may include any type of chart, graph, table, or report, such as a bar chart, map chart, scatter plot, line graph, tree chart, pie chart, radar chart, and the like, in any suitable number of dimensions. In some aspects, charting component 118 may receive a selection of a charting function that specifies the data and/or attributes to be charted and the type of chart to be created. For instance, a chart may be created based on data associated with the selected one or more cells and/or based on object data and/or object attributes. In some cases, the object or a representation of the object may be incorporated into the chart. That is, an image, an audio file, a video, streaming data, etc., may be incorporated into the chart. In some cases, a separate window or overlay may be incorporated into the chart, e.g., for displaying streaming data including streaming audio, streaming video, a data feed (such as a stock ticker, Twitter® feed, etc.).
By way of example, for three images associated with the selected one or more cells, the GPS location where each image was created may be retrieved from metadata associated with each image and may be charted on a map chart. In this case, the GPS locations may not be generally viewable within the selected one or more cells. Moreover, each of the three images may be incorporated into the map chart at the GPS location where the image was created. In another example, data associated with the selected one or more cells (e.g., sticker price data for three makes and models of used cars) may be charted in a bar chart and an image for each make and model of used car may be incorporated into the bar representing the corresponding data for that make and model. In yet another example, a duration for each of a plurality of audio files associated with the selected one or more cells may be retrieved. The duration information may be charted on a pie chart and a visual representation for each audio file may be incorporated into an appropriate segment of the pie chart corresponding to the duration of each audio file. In yet another example, a cell within a spreadsheet may reference a link to a glucometer for receiving glucose levels for a user over a period of time. For each day of glucose monitoring, a different cell may reference the link to the glucometer. In this case, an average glucose level for each day may be calculated and presented in a bar chart. Alternatively, each glucose reading may be charted as a data point in a line graph over some period of time.
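The glucometer example above reduces to a simple aggregation: group the readings by day, then average each group for the bar chart. A sketch with made-up readings (the values and day labels are illustrative only):

```python
# Hypothetical per-day glucose readings received over a link to a glucometer.
readings_by_day = {
    "Mon": [95, 110, 140, 102],
    "Tue": [88, 131, 120],
    "Wed": [105, 99, 150, 118],
}

# Average glucose level per day, suitable as the data series for a bar chart.
daily_averages = {
    day: sum(values) / len(values) for day, values in readings_by_day.items()
}
print(daily_averages["Tue"])  # 113.0
```

Charting each raw reading in a line graph, as in the alternative described above, would use `readings_by_day` directly rather than the averaged series.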
Although specific examples for objects such as images, audio files, videos and streaming data have been described above, objects may include any discrete data structure. In this case, objects may include shapes (e.g., circle, triangle, square, etc.), diagrams (e.g., flow diagram, chart, tree, etc.), or essentially any discrete structure. For instance, a compound data type may be defined with a shape as a field. When fields of the compound data type are charted, the object (e.g., shape) could be provided within the chart as a data point. Other examples are possible and any arbitrary object may be part of the compound data type framework. As should be appreciated, the above examples are provided for purposes of explanation and should not be considered limiting.
UX component 120 may communicate with charting component 118 to provide one or more user interfaces for selecting charting functions and for presenting charts of objects associated with a spreadsheet. Selections and/or inputs of charting functions may be received by gesture, touch, mouse input, keyboard input, etc. For example, UX component 120 may provide a tool bar, popup menu, dropdown menu, ribbon, etc., that includes UI controls for selecting a charting function. For example, UI controls may be provided for specifying the data or parameters to be charted and the type of chart to be created. As should be appreciated, UX component 120 may further present the chart within the spreadsheet. For example, the chart may be displayed in a separate window, as an overlay, etc., within an interface of the spreadsheet application. In some aspects, the chart may be interactive and may be configured to launch additional interfaces upon selection (e.g., launch a separate window to play a video, view an image, access a plurality of audio tracks, etc.). UX component 120 may provide any suitable interface for viewing charted data and/or objects, as described herein.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 1 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 2 illustrates a method for charting an object associated with a spreadsheet, according to an example embodiment.
Method 200 begins with provide interface operation 202, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like. The user interface may operate in conjunction or communication with one or more other components of the spreadsheet application (e.g., selection component 112, object identifier 114, parameter retriever 116, and charting component 118) to chart objects, object data and/or object attributes associated with a spreadsheet.
At receive selection operation 204, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication. That is, in some aspects, operations may call (or select) the one or more cells without requiring user input, e.g., operations such as ‘data refresh,’ ‘import from such and such data source,’ ‘copy paste,’ etc. In other aspects, a spreadsheet application (e.g., spreadsheet application 110) may provide the spreadsheet to a user, the spreadsheet including one or more sheets, each sheet having a plurality of rows and columns of cells. The one or more cells may be selected by highlighting or otherwise identifying the cell or range of cells using a gesture, touch, mouse click, keyboard input, and the like. When a single cell is selected, the cell may be identified in a toolbar of the user interface by a cell identifier that specifies a location of the cell within the spreadsheet. For example, a cell identifier of “A1” specifies that the cell is located in column A, row 1 of the spreadsheet, while a cell identifier of “B5” specifies that the cell is located in column B, row 5 of the spreadsheet. The cell identifier may further be displayed adjacent to a formula bar (or “fx bar”) identifying the contents of the cell in the toolbar of the user interface. When a range of cells is selected, the cell at the top left corner of the range may be displayed by a cell identifier next to the formula bar, with the range of cells being represented by a range identifier including cell identifiers for the cell at the top left corner and the cell at the bottom right corner (e.g., A1:C5).
At identify object operation 206, at least one object that is associated with the selected one or more cells may be identified. For example, an object identifier (e.g., object identifier 114) may identify at least one object associated with the selected one or more cells. An object may refer to an image, an audio file, a video, streaming data, and the like. In aspects, an object may be “associated” with one or more cells by being embedded in the one or more cells, anchored to the one or more cells, referenced by a formula, name or pointer within the one or more cells, positioned within the same row or column as a selected cell of the one or more cells, and the like. As should be appreciated, at least one object associated with the selected one or more cells may be identified by any suitable means.
At retrieve parameters operation 208, one or more parameters associated with the identified object(s) may be retrieved. Retrieve parameters operation 208 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of objects may include, for instance, object data and/or object attributes. Object data may include, for instance, image data (e.g., raw pixel data, an array of pixel values, or other data for rendering the image), audio data (e.g., digital data encoding soundwaves), video data (e.g., an array of pixel values for rendering each individual frame of a video and/or modulated data for reproducing soundwaves of an audio track), data values (e.g., individual stock prices for a particular stock over a period of time, heartrates over a period of time, insulin levels over a period of time, etc.), and the like. Object attributes may be descriptors, such as image attributes (e.g., opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc.), audio attributes (e.g., frequency, amplitude, sampling rate, codec, bitrate, volume, pitch, speed, channel, audio effects, author/artist, creation date and/or time, file name, file size, duration, etc.), video attributes (e.g., frame rate, aspect ratio, duration, resolution, bits per frame, video size, synchronization data, color space, bitrate, pitch, volume, speed, etc.), data attributes (e.g., packet rate, packet size, protocol, bandwidth, etc.), and the like. In some aspects, parameters including object data and/or object attributes may be retrieved from an object file or a stream of data. In further aspects, object attributes may be retrieved from metadata associated with an object file or a stream of data.
In still further aspects, object data and/or object attributes may be retrieved from a compound data type associated with the object. As should be appreciated, one or more parameters may be retrieved for each identified object via any suitable means.
At receive charting function operation 210, a selection of a charting function may be received, either automatically or by user indication. For example, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. In some aspects, an interface such as a toolbar, ribbon, dropdown menu, etc., may be provided for receiving a selection of the charting function. For instance, the interface may display various types of reports or charts for selection, e.g., bar charts, map charts, scatter plots, line graphs, tree charts, pie charts, radar charts, and the like. Moreover, the interface may provide for selection of multiple-dimensioned charts, e.g., two-dimensional (e.g., line graph, bar chart, pie chart, map chart) and three-dimensional (e.g., surface chart, topical relief map, xyz line graph) charts may be available for selection. In other aspects, a chart may be automatically selected. For instance, in response to identifying an object that is a live data feed of heartrate data for an athlete, a line graph may automatically be selected and rendered to present the heartrate data to the athlete's trainer during a race. In further aspects, the interface for selecting the charting function may further provide for selection of one or more parameters to be charted. In some cases, the selected one or more cells may be associated with rows and/or columns of data for charting. In other cases, e.g., where the selected one or more cells are associated with a plurality of images, a selection may be received to chart parameters such as the creation date and author for each image in a scatter plot. In some aspects, where parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting these parameters for charting.
That is, some parameters may be associated with an object as metadata or as a compound data type and may not be visible as values within cells of the spreadsheet. As should be appreciated, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
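The automatic selection path described above can be sketched as a simple fallback: an explicit user choice wins, otherwise a default chart type is looked up from the kind of object identified. The mapping below is illustrative only; the disclosure does not prescribe particular defaults.

```python
# Hypothetical defaults keyed by object kind, drawn from the examples above.
DEFAULT_CHART = {
    "live_data_feed": "line graph",  # e.g., heartrate data during a race
    "image": "map chart",            # e.g., images plotted at GPS locations
    "audio": "pie chart",            # e.g., durations as pie segments
}

def select_chart(object_kind, user_choice=None):
    """Prefer an explicit user selection; otherwise fall back to a default."""
    return user_choice or DEFAULT_CHART.get(object_kind, "bar chart")

print(select_chart("live_data_feed"))         # line graph
print(select_chart("image", "scatter plot"))  # scatter plot
```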
At create chart operation 212, a chart may be created including the at least one object and/or object parameters. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. A chart may be created based on data associated with the selected one or more cells and/or based on object data and/or object attributes. For instance, as described above, object parameters that are not visible within the spreadsheet may be charted. That is, object data (such as raw pixel data, an array of pixel values, digitally-modulated sound waves, etc.) and/or object parameters (such as a bitrate, resolution, creation date and/or time, GPS location data, etc.) may be retrieved from metadata or a compound data type associated with the object and charted. In additional or alternative cases, the object or a representation of the object may be incorporated into the chart. That is, an image, an audio file, a video, streaming data, etc., may be incorporated directly into the chart, as described with respect to charting component 118 above. As should be appreciated, the above examples are provided for purposes of explanation and should not be considered limiting.
As should be further appreciated, operations202-212 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 3 illustrates a method for customizing an image incorporated into a chart, according to an example embodiment.
Method 300 begins with provide interface operation 302, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like.
At receive selection operation 304, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify image operation 306, at least one image associated with the selected one or more cells may be identified. For example, an object identifier (e.g., object identifier 114) may identify at least one image associated with the selected one or more cells. In aspects, an image may be “associated” with one or more cells by being embedded in the one or more cells, anchored to the one or more cells, referenced by a formula, name or pointer within the one or more cells, positioned within the same row or column as a selected cell of the one or more cells, and the like. As should be appreciated, at least one image associated with the selected one or more cells may be identified by any suitable means.
At retrieve parameters operation 308, one or more parameters associated with the identified image(s) may be retrieved. Retrieve parameters operation 308 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of images may include, for instance, image data and/or image attributes. Image data may include, e.g., raw pixel data, an array of pixel values, or any other data for rendering the image. Image attributes may include, e.g., opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc. In some cases, image attributes may be retrieved from metadata associated with an image file. In other cases, image data and/or image attributes may be retrieved from a compound data type associated with the image. As should be appreciated, one or more parameters may be retrieved for each identified image via any suitable means.
At receive charting function operation 310, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 210, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. In some aspects, an interface such as a toolbar, ribbon, dropdown menu, etc., may be provided for receiving a selection of the charting function. The interface may display various types of charts for selection, e.g., bar charts, map charts, scatter plots, line graphs, tree charts, pie charts, radar charts, and the like. In other aspects, a chart may be automatically selected. Further, the interface may provide for selecting one or more parameters to be charted. In some cases, e.g., where the selected one or more cells are associated with a plurality of images, a selection may be received to chart parameters such as the creation date and author for each image in a scatter plot. In some aspects, where parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting such parameters for charting. That is, some parameters may be associated with an image as metadata or as a compound data type and may not be visible as values within cells of the spreadsheet. As should be appreciated, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 312, a chart may be created incorporating the at least one image. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. In particular, the image or a representation of the image (e.g., icon) may be incorporated into the chart. For example, three images may be associated with the selected one or more cells. A GPS location corresponding to where each image was created may be retrieved from metadata associated with each image and each GPS location may be presented on a map chart. In some examples, although provided as metadata, the GPS locations may not be visible as values within the selected one or more cells. In further examples, each of the three images (or representations of the images) may be incorporated into the map chart at the GPS location where the image was created. In still further aspects, the map chart may automatically be resized so as to present each image in an appropriate size for viewing by a user. In some cases, resizing the map chart may be based at least in part on an aspect ratio for each image, which aspect ratio may be fixed or adjustable. When the map chart is displayed on a mobile device or other reduced-size display, although each image may be incorporated into the map chart, each image may be represented by a placeholder such as an icon. In this case, in response to selection of the icon, the image may be launched in a separate window, overlay, or any other suitable interface, for presentation to a user. As should be appreciated, the above examples are provided for purposes of explanation and should not be considered limiting.
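The rendering choice above can be sketched as follows. The width threshold, the target in-chart size, and the field names are assumptions for illustration; the disclosure does not fix particular values.

```python
def render_marker(image, display_width_px):
    """Decide how an image appears on a map chart for a given display."""
    if display_width_px < 600:  # reduced-size display, e.g., a mobile device
        # Placeholder icon; selecting it launches the full image.
        return {"type": "icon", "opens": image["file"]}
    # Scale to a nominal in-chart width while preserving the aspect ratio.
    target_w = 160
    target_h = round(target_w / image["aspect_ratio"])
    return {"type": "image", "file": image["file"],
            "width": target_w, "height": target_h}

photo = {"file": "elephant1.jpeg", "aspect_ratio": 4 / 3}
print(render_marker(photo, 480))   # icon placeholder
print(render_marker(photo, 1920))  # scaled image, 160 x 120
```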
At receive customization operation 314, a customization to the chart may be received. For instance, a user may apply various effects to the chart, such as shading effects, fill effects (e.g., based on different color hues), texture fill effects, gradient fill effects, transparency effects, glow and/or soft edge effects, three-dimensional effects, or some combination thereof. As should be appreciated, any customization that is made available by the spreadsheet application may be received and applied to the chart.
At apply customization operation 316, the customization may be applied to the at least one image incorporated into the chart. For instance, when a gradient, red fill effect is received and applied to a bar chart, the at least one image may also be customized with a gradient, red fill effect. Similarly, when shading effects are applied to a map chart, such shading effects may also be applied to the at least one image incorporated into the chart. However, in at least some aspects, such customizations may not be written back to the image file associated with the spreadsheet. As should be appreciated, in addition to an image associated with a spreadsheet, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
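A sketch of this display-only semantics, with illustrative structures and effect names: the customization restyles the chart and each rendered copy of the image, while the source image file is left untouched.

```python
def apply_customization(chart, effect):
    """Apply a chart effect to the chart and its embedded images (display only)."""
    chart["effects"].append(effect)
    for image in chart["images"]:
        image["display_effects"].append(effect)  # affects rendering only
        # image["file"] on disk is deliberately not modified

chart = {"type": "bar chart", "effects": [],
         "images": [{"file": "car1.png", "display_effects": []}]}
apply_customization(chart, "gradient red fill")
print(chart["images"][0]["display_effects"])  # ['gradient red fill']
```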
As should be further appreciated, operations302-316 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 4 illustrates a method for manipulating an image associated with a spreadsheet that is incorporated into a chart, according to an example embodiment.
Method 400 begins with provide interface operation 402, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like.
At receive selection operation 404, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify image operation 406, at least one image associated with the selected one or more cells may be identified. For example, an object identifier (e.g., object identifier 114) may identify at least one image associated with the selected one or more cells, as described above with respect to identify image operation 306.
At retrieve parameters operation 408, one or more parameters associated with the identified image(s) may be retrieved. Retrieve parameters operation 408 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110), as described above with respect to retrieve parameters operation 308.
At receive charting function operation 410, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 310, a selection of a charting function including a chart type and/or one or more parameters for charting may be received by a charting component (e.g., charting component 118) of a spreadsheet application. As described above, the one or more parameters associated with an image may include image data (e.g., raw pixel data, an array of pixel values, or any other data for rendering the image) and/or image attributes (e.g., opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc.). As further described above, where the one or more parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting such parameters for charting. For example, an aspect ratio of each of three images associated with the selected one or more cells may be selected for charting in a scatter plot. As should be appreciated, any of the one or more parameters associated with the three images may be selected for charting and any appropriate type of chart for presenting the one or more parameters may be selected.
At create chart operation 412, a chart may be created based at least in part on the one or more parameters. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, an aspect ratio for each of the three images associated with the selected one or more cells may be charted in a scatter plot. In further examples, each image or a representation of each image may be incorporated as a data point representing the aspect ratio for the image in the scatter plot. As should be appreciated, a chart may be created based on any parameter(s) associated with an image.
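The aspect-ratio example above can be sketched as follows. This is a minimal illustration, not the disclosed implementation; the image record layout (a "dimensions" field) and the function names are assumptions made for the example.

```python
def aspect_ratio(image):
    """Derive an aspect ratio (width / height) from an image's dimensions."""
    width, height = image["dimensions"]
    return width / height

def scatter_points(images):
    """Build one (index, aspect ratio) data point per image for a scatter plot."""
    return [(i, aspect_ratio(img)) for i, img in enumerate(images)]

# Three hypothetical images associated with the selected cells.
images = [
    {"file_name": "a.png", "dimensions": (1600, 900)},
    {"file_name": "b.png", "dimensions": (800, 800)},
    {"file_name": "c.png", "dimensions": (1024, 768)},
]
points = scatter_points(images)
```

Each data point could then be rendered with the image itself (or an icon representing it) as the marker, as described above.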
At receive adjustment operation 414, an adjustment to at least one parameter of the one or more parameters may be received in the spreadsheet application. For instance, as illustrated by FIG. 14A, a formatting menu 1456 may be provided by the spreadsheet application with various selections and input fields 1458 for adjusting parameters of an image in order to manipulate the image. For instance, an interface may be provided for adjusting image attributes such as opacity, color palette, resolution, aspect ratio, image dimensions, and the like. In aspects, such adjustments may be written to the image file. Continuing with the example above, an aspect ratio for a first image of the three images associated with the selected one or more cells may be adjusted.
At apply adjustment operation 416, the adjustment may be applied to the at least one image incorporated into the chart. For instance, when an adjustment is made to an opacity of an image within the spreadsheet application, the adjustment to the opacity may be rewritten to the image file for the image. Furthermore, if the image is incorporated into a chart, the opacity of the image as rendered within the chart may be adjusted accordingly. Continuing with the example above in which the aspect ratio for the first image has been adjusted, a position of the data point associated with the aspect ratio for the first image in the scatter plot may be adjusted accordingly. Moreover, as the adjustment to the aspect ratio received in the spreadsheet may be rewritten to the image file for the first image, the aspect ratio for the first image as rendered and incorporated into the chart may also be adjusted. In some examples, both the adjustment to the charted parameter and to the image incorporated in the chart may be automatically applied within the chart. As should be appreciated, in addition to an image associated with a spreadsheet, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
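The write-back behavior described above can be sketched as follows, again under the assumption of a hypothetical image record with a "dimensions" field: the adjustment is "rewritten" to the image record and the corresponding chart data point is repositioned to match.

```python
def apply_adjustment(image, points, index, new_dimensions):
    """Write an adjustment back to the image record and update its data point."""
    image["dimensions"] = new_dimensions     # rewrite the adjustment to the image record
    width, height = new_dimensions
    points[index] = (index, width / height)  # reposition the charted aspect ratio
    return points

# A first image charted at its original 16:9 aspect ratio.
image = {"file_name": "a.png", "dimensions": (1600, 900)}
points = [(0, 1600 / 900)]
apply_adjustment(image, points, 0, (1600, 1600))  # adjust to a square ratio
```

Because both the stored record and the chart point are updated in one step, the chart and the underlying image stay consistent, which mirrors the automatic application described in the text.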
As should be further appreciated, operations 402-416 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 5 illustrates a method for incorporating an image into a chart in response to satisfaction of a condition, according to an example embodiment.
Method 500 begins with provide interface operation 502, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like.
At receive selection operation 504, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify data operation 506, data associated with the selected one or more cells may be identified. For example, data may be represented as values within the selected one or more cells. In an example, the identified data may include a company's revenue values for successive months of a particular year. As should be appreciated, the identified data may include any values associated with the selected one or more cells.
At receive charting function operation 508, a selection of a charting function may be received, either automatically or by user indication. For example, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. In some aspects, an interface such as a toolbar, ribbon, dropdown menu, etc., may be provided for receiving a selection of the charting function. For instance, the interface may display various types of reports or charts for selection, e.g., bar charts, map charts, scatter plots, line graphs, tree charts, pie charts, radar charts, and the like, in any suitable number of dimensions. As should be appreciated, any appropriate type of chart may be selected for presenting data associated with the selected one or more cells.
At create chart operation 510, a chart may be created based at least in part on the identified data. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, a bar chart may be created that presents a company's revenue values for each month of a particular year. As should be appreciated, any suitable chart may be created based on the selected charting function and the identified data.
At determination operation 512, it may be determined whether the charted data satisfies a condition. In aspects, the condition may be specified by a user or may be automatically generated by the spreadsheet application. For instance, continuing with the example above, a user may specify a condition such that when revenue values exceed a particular threshold, e.g., $30,000, an image should be incorporated into the chart. In this case, the revenue values for each month may be evaluated against the threshold to determine whether the condition is satisfied. In another example, a standardized condition may include a condition specifying that an image alert should be incorporated into a chart each time medical data exceeds a particular standardized threshold.
At incorporate operation 514, at least one image may be incorporated into the chart. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. In particular, in response to determining that the charted data satisfies the condition, at least one image may be incorporated into the chart. As with the condition, the image may also be specified by a user or automatically generated by the spreadsheet application. For instance, a condition function may include a reference to the image or other identifier. Continuing with the example above, an image of fireworks may be incorporated into a bar representing revenue values that exceed a threshold of $30,000 for a particular month. In another example, an image of a syringe may be generated on a chart when glucose levels exceed a threshold of 300 mg/dl for more than two hours, or an image of a pill may be generated on a chart when blood pressure levels exceed a threshold of 190/120 for more than an hour, and the like. As should be appreciated, any suitable image may be incorporated into a chart upon a determination that a condition is satisfied by the charted data. Moreover, in addition to incorporating an image into a chart upon satisfaction of a condition, the above-described operations may be applied to incorporate other objects into a chart, e.g., an audio file, a video, streaming data, and the like.
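The revenue example above, where a fireworks image is attached only to bars whose values satisfy the condition, can be sketched as follows. The $30,000 threshold and image name come from the example in the text; the bar record layout is an assumption for illustration.

```python
def annotate_bars(revenues, threshold=30000, image="fireworks.png"):
    """Attach the image only to bars whose values satisfy the condition."""
    return [
        {"value": v, "image": image if v > threshold else None}
        for v in revenues
    ]

# Three months of hypothetical revenue; only February exceeds the threshold.
bars = annotate_bars([25000, 32000, 28000])
```

The same evaluation could attach a syringe or pill image to medical data that crosses a standardized threshold, as in the other examples above.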
As should be appreciated, operations 502-514 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 6 illustrates a method for selecting and incorporating an image into a chart, according to an example embodiment.
Method 600 begins with provide interface operation 602, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like.
At receive selection operation 604, at least one image associated with a spreadsheet may be selected, either automatically (e.g., based on a function) or by user indication. In some aspects, operations may call (or select) an image without requiring user input, e.g., operations such as ‘data refresh,’ ‘import from such and such data source,’ ‘copy paste,’ etc. In other aspects, a spreadsheet application (e.g., spreadsheet application 110) may provide the spreadsheet to a user within an interface and the user may select at least one image associated with the spreadsheet. Images associated with the spreadsheet may be selected by highlighting, by inputting a formula referencing the image or an image name, and/or by otherwise identifying the image and/or the cell(s) with which the image is associated using a gesture, touch, mouse click, keyboard input, and the like.
At retrieve parameters operation 606, one or more parameters associated with the selected at least one image may be retrieved. Retrieve parameters operation 606 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of images may include, for instance, image data and/or image attributes. Image data may include raw pixel data, an array of pixel values, or any other data for rendering the image. Image attributes may include opacity, color palette, resolution, aspect ratio, image dimensions, author, creation date and/or time, file name, tags, file size, GPS location information, etc. In some cases, image attributes may be retrieved from metadata associated with an image file. In other cases, image data and/or image attributes may be retrieved from a compound data type associated with the image. As should be appreciated, one or more parameters may be retrieved for each identified image via any suitable means.
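The two retrieval paths described above (a compound data type or file metadata) can be sketched as a simple fallback. The record layout, including the "compound_data" and "metadata" keys, is a hypothetical representation chosen for the example, not the disclosed format.

```python
def retrieve_parameters(image_record):
    """Prefer attributes from a compound data type; fall back to file metadata."""
    compound = image_record.get("compound_data")
    if compound is not None:
        return compound
    return image_record.get("metadata", {})

# One image carrying a compound data type, one carrying only file metadata.
with_compound = {"compound_data": {"opacity": 0.5}, "metadata": {"opacity": 1.0}}
metadata_only = {"metadata": {"aspect_ratio": 1.5}}
```

Preferring the compound data type reflects the idea that it may carry richer, spreadsheet-managed attributes than the raw file metadata.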
At receive charting function operation 608, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 210, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. As described above, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 610, a chart may be created incorporating the at least one image and/or the one or more parameters. For instance, a charting component (charting component 118) may create a chart incorporating the at least one image and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application, as described above with respect to create chart operation 312. In particular, the image or a representation of the image (e.g., an icon) may be incorporated into the chart. Alternatively, as described with respect to create chart operation 412, a charting component may create a chart based on charting the one or more parameters. As should be appreciated, a charting component may create a chart that both incorporates the at least one image and charts the one or more parameters. Moreover, in addition to an image, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
As should be further appreciated, operations 602-610 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 7 illustrates a method for playing an audio file incorporated into a chart, according to an example embodiment.
Method 700 begins with provide interface operation 702, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display. Selections and/or inputs may be received by the user interface based on gestures, touch, mouse movements, and the like.
At receive selection operation 704, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify audio file operation 706, at least one audio file associated with the selected one or more cells may be identified. For example, an object identifier (e.g., object identifier 114) may identify at least one audio file associated with the selected one or more cells. In aspects, an audio file may be “associated” with one or more cells by being embedded in the one or more cells, anchored to the one or more cells, referenced by a formula, name or pointer within the one or more cells, positioned within the same row or column as a selected cell of the one or more cells, and the like. As should be appreciated, at least one audio file associated with the selected one or more cells may be identified by any suitable means.
At retrieve parameters operation 708, one or more parameters associated with the identified at least one audio file may be retrieved. Retrieve parameters operation 708 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of audio files may include, for instance, audio data and/or audio attributes. Audio data may include, e.g., modulated data for reproducing soundwaves. Audio attributes may include, e.g., frequency, amplitude, sampling rate, codec, bitrate, volume, pitch, speed, channel, audio effects, author/artist, creation date and/or time, file name, file size, duration, etc. In some cases, audio attributes may be retrieved from metadata associated with an audio file. In other cases, audio data and/or audio attributes may be retrieved from a compound data type associated with the audio file. As should be appreciated, one or more parameters may be retrieved for each identified audio file via any suitable means.
At receive charting function operation 710, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 210, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. In some aspects, an interface such as a toolbar, ribbon, dropdown menu, etc., may be provided for receiving a selection of the charting function. The interface may display various types of charts for selection, e.g., bar charts, map charts, scatter plots, line graphs, tree charts, pie charts, radar charts, and the like. In other aspects, a chart may be automatically selected. Further, the interface may provide for selecting one or more parameters to be charted. In some cases, e.g., where the selected one or more cells are associated with a plurality of audio files, a selection may be received to chart parameters such as the duration or volume for each audio file. In some aspects, where parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting such parameters for charting. That is, some parameters may be associated with an audio file as metadata or as a compound data type and may not be visible as values within cells of the spreadsheet. As should be appreciated, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 712, a chart may be created incorporating the at least one audio file. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. In particular, the audio file or a visual representation of the audio file may be incorporated into the chart. For example, three audio files may be associated with the selected one or more cells. The audio files may correspond to three deposition recordings, two for a first witness and one for a second witness. A duration for each audio file may be retrieved from metadata associated with the audio file. The total duration of the deposition for each witness may be provided in a bar chart. In this case, the individual durations for the two audio files corresponding to the first witness may be combined into a single bar of the bar chart. In some examples, although provided as metadata, the durations for each audio file may not be visible as values within the selected one or more cells. As should be appreciated, the above examples are provided for purposes of explanation and should not be considered limiting.
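The deposition example above, where the two recordings for the first witness collapse into a single bar, reduces to summing durations per witness. The record layout and duration units (minutes) are assumptions for illustration.

```python
def duration_by_witness(recordings):
    """Total recorded duration per witness, one bar per witness."""
    totals = {}
    for rec in recordings:
        totals[rec["witness"]] = totals.get(rec["witness"], 0) + rec["duration"]
    return totals

# Three hypothetical deposition recordings; durations come from file metadata.
recordings = [
    {"file_name": "depo1.wav", "witness": "first", "duration": 30},
    {"file_name": "depo2.wav", "witness": "first", "duration": 45},
    {"file_name": "depo3.wav", "witness": "second", "duration": 60},
]
totals = duration_by_witness(recordings)
```

Note that the durations feeding this aggregation need not appear as cell values anywhere in the spreadsheet, consistent with the metadata-only case described above.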
At display operation 714, a visual representation of the at least one audio file may be displayed in the chart. Continuing with the example above, a visual representation of each of the three audio files may be incorporated into a bar representing a duration of deposition for a particular witness. For the bar that represents a combination of the durations for two deposition recordings, visual representations of both audio files may be incorporated into the bar. A visual representation may include, for instance, a speaker icon, play icon, waveform rendering, sliced waveform rendering, track name with metadata, special icon with metadata, etc. For example, the visual representation may depict the actual waveform and may be interactive. That is, a user may scroll along the visual representation and begin listening to the audio file from any position along the waveform. Alternatively, the visual representation may not depict the actual waveform but may be a standard representation of a waveform and may be overlaid with a play icon. As should be appreciated, any suitable icon or other symbol may be provided as a visual representation for the at least one audio file.
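The interactive waveform behavior, beginning playback from any position along the rendering, can be sketched as a mapping from a click position to a playback offset. The pixel-based interface here is purely an assumption for illustration; the disclosure does not specify how the waveform is hit-tested.

```python
def seek_offset(click_x, waveform_width, duration_seconds):
    """Map a horizontal click along the waveform to a position in the audio."""
    fraction = min(max(click_x / waveform_width, 0.0), 1.0)  # clamp to [0, 1]
    return fraction * duration_seconds
```

Clamping keeps clicks that land slightly outside the rendered waveform at the start or end of the file rather than producing an invalid offset.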
At receive selection operation 716, a selection of the visual representation may be received. For example, the visual representation may be selected by gesture, touch, mouse click, keyboard input, cursor hover, and the like. Selection of the visual representation may further include activating a play icon associated with the visual representation or launching a user interface by right-clicking or hovering over the visual representation, and the like. As should be appreciated, selection of the visual representation may be received by any suitable means.
At provide operation 718, one or more play controls may be provided for accessing the at least one audio file. For instance, the play controls may include controls for playing, skipping forward or back, pausing, rewinding, fast forwarding and/or stopping the at least one audio file. In response to receiving an activation of any of the play controls, the audio file may be accessed by a user. As should be appreciated, in addition to an audio file, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., a video, streaming data, and the like.
As should be further appreciated, operations 702-718 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 8 illustrates a method for manipulating an audio file associated with a spreadsheet that is incorporated into a chart, according to an example embodiment.
Method 800 begins with provide interface operation 802, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display.
At receive selection operation 804, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify audio file operation 806, at least one audio file associated with the selected one or more cells may be identified. For example, as described with respect to identify audio file operation 706, an object identifier (e.g., object identifier 114) may identify at least one audio file associated with the selected one or more cells.
At retrieve parameters operation 808, one or more parameters associated with the identified at least one audio file may be retrieved. Retrieve parameters operation 808 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110), as described above with reference to retrieve parameters operation 708.
At receive charting function operation 810, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operations 210 and 710, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. As detailed above, an interface may provide for selecting one or more parameters to be charted. In some cases, e.g., where the selected one or more cells are associated with a plurality of audio files, a selection may be received to chart parameters such as the duration or volume for each audio file. In some aspects, where parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting such parameters for charting. That is, some parameters may be associated with an audio file as metadata or as a compound data type and may not be visible as values within cells of the spreadsheet. As should be appreciated, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 812, a chart may be created based at least in part on the one or more parameters. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, a volume and a pitch for the plurality of audio files associated with the selected one or more cells may be charted in a bubble chart. In further examples, each audio file or a representation of each audio file may be incorporated as a bubble representing the volume and pitch for the audio file in the bubble chart. As should be appreciated, a chart may be created based on any parameter(s) associated with an audio file.
At receive adjustment operation 814, an adjustment to at least one parameter of the one or more parameters may be received in the spreadsheet application. For instance, as illustrated by FIG. 21A, a formatting menu 2120 may be provided by the spreadsheet application with various selections and input fields 2124 for adjusting parameters to manipulate the audio file. For instance, an interface may be provided for adjusting audio attributes such as volume, pitch, speed, bitrate type, bitrate, channel type, channel, and the like. In aspects, such adjustments may be written to the audio file. Continuing with the example above, a volume for a first audio file of the plurality of audio files associated with the selected one or more cells may be adjusted.
At apply adjustment operation 816, the adjustment may be applied to the at least one audio file incorporated into the chart. For instance, when an adjustment is made to a pitch of an audio file within the spreadsheet application, the adjustment to the pitch may be rewritten to the audio file. Furthermore, if the audio file is incorporated into a chart, the pitch of the sound in the audio file as rendered within the chart may be adjusted accordingly. Continuing with the example above in which the volume for the first audio file has been adjusted, a position and/or size of the bubble associated with the volume and pitch for the first audio file in the bubble chart may be adjusted accordingly. Moreover, as the adjustment to the pitch received in the spreadsheet may be rewritten to the first audio file, the pitch for the first audio file as rendered and incorporated into the chart may also be adjusted. In some examples, both the adjustment to the charted parameter and to the audio file incorporated in the chart may be automatically applied within the chart. As should be appreciated, in addition to an audio file, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an image, a video, streaming data, and the like.
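The bubble-chart example above can be sketched in the same write-back pattern as the image case: the adjusted attribute is rewritten to the audio record and the bubble charting (pitch, volume) for that file is refreshed. All field names here are illustrative assumptions.

```python
def apply_audio_adjustment(audio, bubbles, index, volume=None, pitch=None):
    """Write adjustments back to the audio record and refresh its bubble."""
    if volume is not None:
        audio["volume"] = volume  # rewrite the adjusted volume to the record
    if pitch is not None:
        audio["pitch"] = pitch    # rewrite the adjusted pitch to the record
    bubbles[index] = (audio["pitch"], audio["volume"])  # reposition/resize bubble
    return bubbles

# A first audio file charted at its original volume and pitch.
audio = {"file_name": "track.wav", "volume": 0.8, "pitch": 440}
bubbles = [(440, 0.8)]
apply_audio_adjustment(audio, bubbles, 0, volume=0.5)
```

As in the text, a single adjustment propagates to both the stored audio parameters and the rendered chart.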
As should be further appreciated, operations 802-816 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 9 illustrates a method for transcribing a chart into speech and incorporating an audio file of the speech into the chart, according to an example embodiment.
Method 900 begins with provide interface operation 902, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display.
At receive selection operation 904, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify data operation 906, data associated with the selected one or more cells may be identified. For example, data may be represented as values within the selected one or more cells. In an example, the identified data may include a company's revenue values for successive months of a particular year. As should be appreciated, the identified data may include any values associated with the selected one or more cells.
At receive charting function operation 908, a selection of a charting function may be received, either automatically or by user indication. For example, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application, as described above with respect to receive charting function operation 508.
At create chart operation 910, a chart may be created based at least in part on the identified data. For instance, a charting component (charting component 118) may create a chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, a bar chart may be created that presents a company's revenue values for each month of a particular year. As should be appreciated, any suitable chart may be created based on the selected charting function and the identified data.
At transcribe operation 912, the chart may be transcribed into speech to create at least one audio file. Continuing with the example above, the speech transcription may describe the company's revenue values for each month of a particular year. For instance, for low-vision users, the data within the chart may be transcribed (e.g., “Text to Speech”) and the data may be “played” for the user at any time. To further improve user experience, particularly for low-vision users, the chart may be customized to associate sounds with colors, numbers, trends, or any other aspect of the chart. Sound waves representing the speech transcription may be digitally encoded (e.g., by pulse-code modulation), in some cases processed (e.g., filtered, edited, etc.) and/or compressed (e.g., based on a codec to reduce file size), and stored as the at least one audio file.
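The first stage of this transcription, turning the charted values into the narration a text-to-speech engine would then encode into audio, can be sketched as follows. The phrasing and the "dollars" wording are assumptions made for the revenue example; the disclosure does not specify a narration format.

```python
def transcribe_chart(title, labels, values):
    """Produce the narration text for a chart, one sentence per data point."""
    sentences = [f"{title}."]
    for label, value in zip(labels, values):
        sentences.append(f"{label}: {value} dollars.")
    return " ".join(sentences)

# Narration for two months of the hypothetical revenue bar chart.
narration = transcribe_chart("Monthly revenue", ["January", "February"],
                             [28000, 32000])
```

The resulting text would then be fed to a speech synthesizer, encoded (e.g., by pulse-code modulation), and stored as the audio file described above.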
At incorporate operation 914, the at least one audio file may be incorporated into the chart. For instance, a charting component (charting component 118) may incorporate the audio file into the chart and a UX component (UX component 120) may provide the chart in an interface of the spreadsheet application. In this way, by transcribing the data into an audio file and incorporating the audio file into the chart, the chart becomes able to read its own data.
At provide operation 916, one or more play controls may be provided for accessing the at least one audio file. For instance, the play controls may include controls for playing, skipping forward or back, pausing, rewinding, fast forwarding and/or stopping the at least one audio file. In response to receiving an activation of any of the play controls, the audio file may be accessed by a user. In at least some cases, e.g., for a low-vision user, the audio file may be “played” automatically when the chart is created.
As should be appreciated, operations 902-916 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 10 illustrates a method for selecting and incorporating an audio file into a chart, according to an example embodiment.
Method1000 begins with provideinterface operation1002, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provideinterface operation202, a UX component (e.g., UX component120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application110) via a display.
At receive selection operation 1004, at least one audio file associated with a spreadsheet may be selected, either automatically (e.g., based on a function) or by user indication. In some aspects, operations may call (or select) an audio file without requiring user input, e.g., operations such as 'data refresh,' 'import from such and such data source,' 'copy paste,' etc. In other aspects, a spreadsheet application (e.g., spreadsheet application 110) may provide the spreadsheet to a user within an interface and the user may select at least one audio file associated with the spreadsheet. Audio files associated with the spreadsheet may be selected by highlighting, by inputting a formula referencing the audio file or an audio file name, and/or by otherwise identifying the audio file and/or the cell(s) with which the audio file is associated using a gesture, touch, mouse click, keyboard input, and the like.
At retrieve parameters operation 1006, one or more parameters associated with the selected at least one audio file may be retrieved. Retrieve parameters operation 1006 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of audio files may include, for instance, audio data and/or audio attributes. Audio data may include, e.g., modulated data for reproducing soundwaves. Audio attributes may include, e.g., frequency, amplitude, sampling rate, codec, bitrate, volume, pitch, speed, channel, audio effects, author/artist, creation date and/or time, file name, file size, duration, etc. In some cases, audio attributes may be retrieved from metadata associated with an audio file. In other cases, audio data and/or audio attributes may be retrieved from a compound data type associated with the audio file. As should be appreciated, one or more parameters may be retrieved for each identified audio file via any suitable means.
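The metadata-first, compound-data-fallback retrieval just described might look like the following sketch. The attribute names and dictionary representation are illustrative assumptions; the disclosure does not specify a concrete data layout.

```python
# Illustrative sketch: retrieve known audio attributes, preferring file
# metadata and falling back to a compound data type for missing attributes.
AUDIO_ATTRIBUTES = ("codec", "bitrate", "sampling_rate", "duration", "file_size")

def retrieve_audio_parameters(metadata, compound_data=None):
    """Collect audio attributes from metadata first, then compound data."""
    params = {}
    for attr in AUDIO_ATTRIBUTES:
        if attr in metadata:
            params[attr] = metadata[attr]
        elif compound_data and attr in compound_data:
            params[attr] = compound_data[attr]
    return params
```

Under this sketch, an attribute present in both sources resolves to the metadata value, while attributes only carried by the compound data type are still surfaced for charting.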
At receive charting function operation 1008, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 210, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. As described above, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 1010, a chart may be created incorporating the at least one audio file and/or the one or more parameters. For instance, a charting component (e.g., charting component 118) may create a chart incorporating the at least one audio file and a UX component (e.g., UX component 120) may provide the chart in an interface of the spreadsheet application, as described above with respect to create chart operation 712. In particular, the audio file or a visual representation of the audio file may be incorporated into the chart. Alternatively, as described with respect to create chart operation 812, a charting component may create a chart based on charting the one or more parameters. As should be appreciated, a charting component may create a chart that both incorporates the at least one audio file and charts the one or more parameters. Moreover, in addition to an audio file, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an image, a video, streaming data, and the like.
As should be further appreciated, operations 1002-1010 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 11 illustrates a method for incorporating an audio file into a chart in response to satisfaction of a condition, according to an example embodiment.
Method 1100 begins with provide interface operation 1102, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display.
At receive selection operation 1104, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify data operation 1106, data associated with the selected one or more cells may be identified. For example, data may be represented as values within the selected one or more cells. In an example, the identified data may include a company's revenue values for successive months of a particular year. As should be appreciated, the identified data may include any values associated with the selected one or more cells.
At receive charting function operation 1108, a selection of a charting function may be received, either automatically or by user indication. For example, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application, as described with respect to receive charting function operation 508.
At create chart operation 1110, a chart may be created based at least in part on the identified data. For instance, a charting component (e.g., charting component 118) may create a chart and a UX component (e.g., UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, a bar chart may be created that presents a company's revenue values for each month of a particular year. As should be appreciated, any suitable chart may be created based on the selected charting function and the identified data.
At determination operation 1112, it may be determined whether the charted data satisfies a condition. In aspects, the condition may be specified by a user or may be automatically generated by the spreadsheet application. For instance, continuing with the example above, a user may specify a condition such that when revenue values exceed a particular threshold, e.g., $30,000, an audio file should be incorporated into the chart. In this case, the revenue values for each month may be evaluated against the threshold to determine whether the condition is satisfied. In another example, a standardized condition may specify that an audio alert should be incorporated into a chart such that each time medical data exceeds a particular standardized threshold, the audio alert "plays."
At incorporate operation 1114, at least one audio file may be incorporated into the chart. For instance, in response to determining that the charted data satisfies the condition, a charting component (e.g., charting component 118) may incorporate the at least one audio file into the chart and a UX component (e.g., UX component 120) may provide the chart in an interface of the spreadsheet application. As with the condition, the audio file may also be specified by a user or automatically generated by the spreadsheet application. For instance, a condition function may include a reference to the audio file or other identifier. Continuing with the example above, an audio file encoding an audio encouragement may be incorporated into a bar representing revenue values that exceed a threshold of $30,000 for a particular month. In another example, an audio file encoding an audio alert may be played in a chart when glucose levels exceed a threshold of 300 mg/dl for more than two hours or when blood pressure levels exceed a threshold of 190/120 for more than an hour, and the like. As should be appreciated, any suitable audio file may be incorporated into a chart upon a determination that a condition is satisfied by the charted data. Moreover, in addition to an audio file, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an image, a video, streaming data, and the like.
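The threshold condition and conditional audio attachment described above might be sketched as follows. The per-bar dictionary layout, function name, and audio reference are hypothetical, offered only to make the control flow concrete.

```python
# Hypothetical sketch of the condition check: evaluate each charted value
# against a threshold and attach an audio reference to bars that satisfy it.
def apply_audio_condition(values, threshold, audio_ref):
    """Return chart entries, attaching audio where value exceeds threshold."""
    chart = []
    for label, value in values:
        entry = {"label": label, "value": value}
        if value > threshold:
            entry["audio"] = audio_ref  # e.g., an encouragement clip
        chart.append(entry)
    return chart

bars = apply_audio_condition([("Jan", 28000), ("Feb", 35000)], 30000, "cheer.wav")
```

In this sketch only the February bar carries the audio reference, mirroring the example in which encouragement plays for months whose revenue exceeds $30,000.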
As should be further appreciated, operations 1102-1114 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 12 illustrates a method for playing a video incorporated into a chart, according to an example embodiment.
Method 1200 begins with provide interface operation 1202, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display.
At receive selection operation 1204, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify video operation 1206, at least one video associated with the selected one or more cells may be identified. For example, an object identifier (e.g., object identifier 114) may identify the at least one video associated with the selected one or more cells. In aspects, a video may be "associated" with one or more cells by being embedded in the one or more cells, anchored to the one or more cells, referenced by a formula, name or pointer within the one or more cells, positioned within the same row or column as a selected cell of the one or more cells, and the like. As should be appreciated, at least one video associated with the selected one or more cells may be identified by any suitable means.
At retrieve parameters operation 1208, one or more parameters associated with the identified at least one video may be retrieved. Retrieve parameters operation 1208 may be performed by a parameter retriever (e.g., parameter retriever 116) of a spreadsheet application (e.g., spreadsheet application 110). As described above, parameters of videos may include, for instance, video data and/or video attributes. Video data may include, for instance, an array of pixel values for rendering each individual frame of a video and/or modulated data for reproducing soundwaves of an audio track. Video attributes may include, for instance, a frame rate, aspect ratio, duration, resolution, bits per frame, video size, synchronization data, etc. Video attributes may further include individual frame attributes (e.g., aspect ratio, color space, bitrate, etc.) and/or audio attributes (e.g., pitch, volume, speed, etc.). In some cases, video attributes may be retrieved from metadata associated with a video. In other cases, video data and/or video attributes may be retrieved from a compound data type associated with the video. As should be appreciated, one or more parameters may be retrieved for each identified video via any suitable means.
At receive charting function operation 1210, a selection of a charting function may be received, either automatically or by user indication. For example, as described above with respect to receive charting function operation 210, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application. In some aspects, an interface such as a toolbar, ribbon, dropdown menu, etc., may be provided for receiving a selection of the charting function. The interface may display various types of charts for selection, e.g., bar charts, map charts, scatter plots, line graphs, tree charts, pie charts, radar charts, and the like. Further, the interface may provide for selecting one or more parameters to be charted. In some cases, e.g., where the selected one or more cells are associated with a plurality of videos, a selection may be received to chart parameters such as a duration or a frame rate or a volume for each video. In some aspects, where parameters are not displayed within the spreadsheet, an additional interface (e.g., dropdown menu) may be provided for selecting such parameters for charting. That is, some parameters may be associated with a video as metadata or as a compound data type and may not be visible as values within cells of the spreadsheet. As should be appreciated, a selection of a charting function including a chart type and/or parameters for charting may be received via any suitable means.
At create chart operation 1212, a chart may be created incorporating the at least one video. For instance, a charting component (e.g., charting component 118) may create a chart and a UX component (e.g., UX component 120) may provide the chart in an interface of the spreadsheet application. In particular, the video or a visual representation of the video may be incorporated into the chart. For example, three videos may be associated with the selected one or more cells. The videos may correspond to three documentaries. A duration for each video may be retrieved from the selected one or more cells or from metadata associated with the video. The duration for each documentary may be provided in a bar chart. In some examples, although provided as metadata, the durations for each video may not be visible as values within the selected one or more cells. As should be appreciated, the above examples are provided for purposes of explanation and should not be considered limiting.
At display operation 1214, a visual representation of the at least one video may be displayed in the chart. Continuing with the example above, a visual representation of each of the three videos may be incorporated into a bar representing a duration for each documentary. A visual representation may include, for instance, a speaker icon, play icon, a single image frame overlaid with a play icon, a video name with metadata, a special icon with metadata, etc. For example, the visual representation may depict an actual image frame of the video and may be interactive. That is, a user may scroll through the video and begin watching the video from any image frame. Alternatively, the visual representation may not depict an actual image frame but may be a film poster for the video and may be overlaid with a play icon. As should be appreciated, any suitable icon or other symbol may be provided as a visual representation for the at least one video.
At receive selection operation 1216, a selection of the visual representation may be received. For example, the visual representation may be selected by gesture, touch, mouse click, keyboard input, cursor hover, and the like. Selection of the visual representation may further include activating a play icon associated with the visual representation or launching a user interface by right-clicking or hovering over the visual representation, and the like. As should be appreciated, selection of the visual representation may be received by any suitable means.
At provide operation 1218, one or more play controls may be provided for accessing the at least one video. For instance, the play controls may include controls for playing, skipping forward or back, pausing, rewinding, fast forwarding and/or stopping the at least one video. In response to receiving an activation of any of the play controls, the video may be accessed by a user. As should be appreciated, in addition to video, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, streaming data, and the like.
As should be further appreciated, operations 1202-1218 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 13 illustrates a method for charting streaming data that is associated with a spreadsheet, according to an example embodiment.
Method 1300 begins with provide interface operation 1302, where a spreadsheet application may provide (or cause to be provided) a user interface to a user. For example, as described above with reference to provide interface operation 202, a UX component (e.g., UX component 120) may facilitate a user experience (UX) by providing the user interface of a spreadsheet application (e.g., spreadsheet application 110) via a display.
At receive selection operation 1304, one or more cells of a spreadsheet may be selected, either automatically (e.g., based on a function) or by user selection. For example, as described above with respect to receive selection operation 204, a selection component (e.g., selection component 112) may receive a selection of one or more cells, either automatically or by user indication.
At identify streaming data operation 1306, streaming data associated with the selected one or more cells may be identified. As detailed above, streaming data may refer to any type of data provided via a communications connection (e.g., via Bluetooth®, cellular, WAN, LAN, wired or wireless media, etc.) over some period of time. For instance, streaming data may refer to streaming audio (e.g., podcast, music, audio book), streaming video (e.g., live sports broadcast, YouTube® video, third-party hosted video, multiple frames transmitted from a camera, recorded video transmitted from a mobile device or video recorder, etc.), data feeds (e.g., Twitter feed, stock ticker, fitness data from a wearable device, medical data from a medical device, diagnostic data from a mechanical device, etc.), and the like. For example, streaming data may be identified based on a hyperlink within the selected one or more cells.
At identify device operation 1308, at least one device associated with the streaming data may be identified. Devices capable of streaming data may include, for instance, wearable devices (including watches, fitness bands, health monitoring devices, etc.), cameras, appliances, mobile devices, automobiles, etc. In an example, the identified streaming data may include heartrate values monitored on a periodic basis by a wearable device. In some cases, the heartrate values may be substantially continuously monitored and streamed from the wearable device in near real time.
At retrieve operation 1310, a plurality of values and/or parameters may be retrieved from the streaming data. That is, discrete values for various types of streamed data may be retrieved from the stream. For instance, discrete pitch values for streaming audio, discrete stock price values from a stock ticker, discrete health monitoring values from a medical device, and the like, may be retrieved from a stream. Continuing with the example above, discrete heartrate values may be retrieved from streaming data transmitted by a wearable device. In some cases, the heartrate values may be retrieved in near real time from the stream. Alternatively, one or more parameters may be retrieved from the streaming data. Such parameters may include transmission rate, packet size, data unit (e.g., beats per second, mg/dl, etc.), statistics associated with the data (e.g., average glucose level, average stock price, etc.), and the like.
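The retrieval of discrete values and derived parameters from a stream might be sketched as below. The packet layout (dictionaries with "value" and "unit" fields) is an illustrative assumption; real streams would arrive over a communications connection in a device-specific format.

```python
# Illustrative sketch: pull discrete readings from a sequence of stream
# packets and derive simple parameters (unit, count, average) for charting.
def extract_stream_values(packets):
    """Return (values, parameters) from an iterable of reading packets."""
    values = [p["value"] for p in packets]
    params = {
        "unit": packets[0]["unit"] if packets else None,
        "count": len(values),
        "average": sum(values) / len(values) if values else None,
    }
    return values, params

readings = [{"value": 72, "unit": "bpm"}, {"value": 75, "unit": "bpm"}, {"value": 69, "unit": "bpm"}]
values, params = extract_stream_values(readings)
```

The discrete values feed the chart itself, while the derived parameters (here, the unit and a running average) correspond to the statistics the disclosure describes charting alongside or instead of the raw stream.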
At receive charting function operation 1312, a selection of a charting function may be received, either automatically or by user indication. For example, a selection of a charting function may be received by a charting component (e.g., charting component 118) of a spreadsheet application, as described above with respect to receive charting function operation 508.
At create chart operation 1314, a chart may be created based at least in part on the identified streaming data. For instance, a charting component (e.g., charting component 118) may create a chart and a UX component (e.g., UX component 120) may provide the chart in an interface of the spreadsheet application. Continuing with the example above, a line graph may be created that presents an individual's heartrate values as monitored by a wearable device in near real time. That is, the line graph may be dynamically updated to present new heartrate values as they are received from the wearable device. By way of further example, with reference to a stock ticker, a data segment associated with data values for a single trading day for a single stock may be extracted and displayed to a user (e.g., as a line graph or otherwise). Alternatively, a data segment associated with a group of stocks (e.g., a standard index or a custom group of stocks) may be extracted and displayed to a user (e.g., as a bar chart per stock, line graph of average prices-per-share, etc.). Further, each data segment representing a positive slope (e.g., increasing stock price) may be displayed over a day, a month, a year, etc. In still further examples, a live feed of stock prices can be appended during an earnings report and overlaid as a chart onto the video to show the immediate effect of the report on stock price. Additionally or alternatively, one or more parameters associated with the streaming data may be charted along with the plurality of values or within a separate chart. As should be appreciated, any suitable chart may be created based on the selected charting function and the identified streaming data.
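Extracting a single-day data segment for a single stock, as in the ticker example above, might look like the following sketch. The record fields, symbol, and timestamp format are illustrative assumptions, not a defined ticker schema.

```python
# Illustrative sketch: extract the data segment for one stock on one trading
# day from streamed ticker records before handing the segment to a chart.
def day_segment(ticks, symbol, day):
    """Filter ticker records to a single stock and trading day."""
    return [t["price"] for t in ticks
            if t["symbol"] == symbol and t["timestamp"].startswith(day)]

ticks = [
    {"symbol": "ABC", "timestamp": "2016-06-30T09:30", "price": 101.2},
    {"symbol": "ABC", "timestamp": "2016-06-30T16:00", "price": 103.8},
    {"symbol": "XYZ", "timestamp": "2016-06-30T09:30", "price": 55.0},
    {"symbol": "ABC", "timestamp": "2016-07-01T09:30", "price": 104.1},
]
segment = day_segment(ticks, "ABC", "2016-06-30")
```

The resulting segment could then be rendered as a line graph, or the same filter could be run per symbol to build the per-stock bar chart described for a group of stocks.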
At send operation 1316, control instructions may be sent to the at least one device associated with the streaming data. For instance, the data stream may be controlled to play on demand (via UI control, or via some other calling function or feature that points at that device and asks it to play). In a case where data is streamed directly from a camera, there may be additional parameters passed to control the device, e.g., OFF, STANDBY, RESET, etc. Other suitable control instructions are possible and may be sent to the device via any suitable means.
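A minimal sketch of the device-control step follows, assuming a small fixed command set like the OFF/STANDBY/RESET parameters mentioned above. The command validation, the function name, and the use of a list as a stand-in for a network transport are all hypothetical.

```python
# Hypothetical sketch of device control: validate a control instruction
# against a known command set, then "send" it (here, append to a log that
# stands in for the actual transport to the streaming device).
SUPPORTED_COMMANDS = {"PLAY", "OFF", "STANDBY", "RESET"}

def send_control(device_log, command):
    """Send a control instruction to a streaming device; reject unknown commands."""
    if command not in SUPPORTED_COMMANDS:
        raise ValueError(f"Unsupported command: {command}")
    device_log.append(command)  # stand-in for a network send
    return command

camera_log = []
send_control(camera_log, "STANDBY")
```

Validating before sending keeps a typo in a calling function or formula from reaching the device as a malformed instruction.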
As should be further appreciated, operations 1302-1316 are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps, e.g., steps may be performed in differing order, additional steps may be performed, and disclosed steps may be excluded without departing from the present disclosure.
FIG. 14A illustrates an interface showing at least one image associated with one or more cells of a spreadsheet, according to an example embodiment.
As illustrated, an interface 1400 of a spreadsheet application is provided. Interface 1400 includes a spreadsheet 1402, a navigation ribbon 1404 (including a cell identifier 1406 and a formula bar 1408), and a home toolbar 1410A. Interface 1400 further includes a plurality of tabs 1412 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab, i.e., tab 1412A entitled "Home," is selected, which is indicated as an "unshaded" tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. However, a selection of a cell may be indicated by any suitable means, such as highlighting, shading, perceived three-dimensional enlargement, and the like. As shown, a cell identifier 1406 (e.g., "D3") for the selected cell is displayed in navigation ribbon 1404. Additionally, formula bar 1408 displays a function calling a file locator, e.g., fx=IMAGE("http://www.BMWpics.com/Z4.png"), for the image (i.e., image 1414) displayed within cell D3.
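The file locator in a formula like the one shown in the formula bar can be pulled out with a small parser. The following sketch is an illustrative assumption (the pattern and function name are not part of the spreadsheet application), shown only to make the formula-to-URL step concrete.

```python
import re

# Illustrative sketch: parse the file-locator URL out of an IMAGE("...")
# spreadsheet formula such as =IMAGE("http://www.BMWpics.com/Z4.png").
def image_url(formula):
    """Extract the URL argument from an IMAGE() formula, or None if absent."""
    match = re.match(r'\s*=?\s*IMAGE\("([^"]+)"\)\s*$', formula, re.IGNORECASE)
    return match.group(1) if match else None

url = image_url('=IMAGE("http://www.BMWpics.com/Z4.png")')
```

A parser like this is one way an object identifier could recognize that a cell's formula references an image rather than an ordinary value.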
As further illustrated by FIG. 14A, images 1414-1418 are pictures of different automobiles. In this case, additional data describing images 1414-1418 within cells D3-D5 of column 1432 is stored in cells within adjacent rows and/or columns. For example, in column "A" (i.e., column 1428), cells A3, A4 and A5 contain data regarding the "makes" of the automobiles shown in images 1414, 1416, and 1418, respectively. In column "B" (i.e., column 1430), cells B3, B4 and B5 contain data regarding the "models" of the automobiles shown in images 1414, 1416 and 1418, respectively. Cells C3, C4 and C5 of column "C" (i.e., column 1420) contain data regarding the "prices" (e.g., sticker prices) of the automobiles shown in images 1414, 1416 and 1418, respectively. Furthermore, row "3" (i.e., row 1422) provides the make, model, and price related to image 1414, row "4" (i.e., row 1424) provides the make, model, and price related to image 1416, and row "5" (i.e., row 1426) provides the make, model, and price related to image 1418. In aspects, the cells within a row may include values or objects (in this case, an image) that are related.
In some aspects, in response to selecting a cell that includes an associated image, a formatting menu 1456 (or other interface) may be provided for manipulating the associated image. In other aspects, formatting menu 1456 may be provided in response to additional input (e.g., a right click) within a selected cell. Formatting menu 1456 may include a number of tabs for viewing and manipulating various image attributes. For instance, formatting menu 1456 may include a color tab, a cell tab, a sizing tab 1462, an image tab, a charting tab 1460, and the like. Sizing tab 1462 may display image attributes directed to image size and orientation such as "height," "width," "rotation," "scale height," "scale width," "aspect ratio," as well as displaying an original size for the image. Additionally, an image tab may provide image data and/or image attributes for viewing and manipulation of the image, such as resolution, sharpness, luminance, opacity, transparency, and the like (not shown). A color tab may provide image data and/or image attributes for viewing or manipulating a color palette for the image. In some cases, charting tab 1460 may be provided for selecting one or more image parameters for charting, including image data and/or image attributes of the associated image.
Formatting menu 1456 may also expose one or more operations for manipulating image data and/or image attributes of an associated image. For example, input fields 1458, which may include UI controls (e.g., +/− controls), may be provided for one or more of the image attributes displayed by sizing tab 1462. In this regard, a current value for the image data or image attribute may be provided within the input field and a user may directly overwrite the current value by inputting a new value and/or adjusting the current value up or down using the UI controls. For instance, with reference to image attributes displayed by the sizing tab described above, values for "height," "width," "rotation," "scale height," and "scale width" may be adjusted by direct input and/or adjusted up or down using +/− controls. In some aspects, a preview (not shown) of an associated image may be provided by formatting menu 1456 so that adjustments to an image may be viewed prior to acceptance. Additionally or alternatively, a "reset" button may be provided to return adjusted parameters back to an original version of the image. As should be appreciated, the above examples of image attributes are not exhaustive and any image attribute may be similarly surfaced and adjusted. Moreover, upon adjusting an image attribute via formatting menu 1456, corresponding changes may be made to the image in a chart.
Formatting menu 1456 may further provide UI controls for turning certain settings on or off. For instance, a selection may be provided for "locking (or fixing) aspect ratio" and a further selection may be provided for locking (or fixing) the aspect ratio "relative to an original size" of the image. With reference to sizing an image, additional selections may be provided to "move and size" the image with a cell, "move without sizing" the image with a cell, and "don't move or size" the image with a cell. Additional operations, such as printing an associated image or printing the image with additional data, may be provided. In further aspects, image data (e.g., an array of pixel values for rendering the image) may be surfaced and operations for manipulating the image data may be exposed to a user (not shown). In this way, image processing may be enabled for images within a spreadsheet. In some cases, some options may be disabled when certain settings are selected. For example, sizing options may be disabled when a "fit to cell" setting is selected. The above examples are provided for purposes of explanation only and should not be understood as limiting.
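The "lock aspect ratio" setting just described amounts to scaling one dimension proportionally when the other changes. The sketch below is an illustrative assumption (the function name and rounding choice are hypothetical), not the application's sizing code.

```python
# Illustrative sketch of the "lock aspect ratio" setting: changing the width
# scales the height by the same factor relative to the original dimensions.
def resize_locked(original_w, original_h, new_width):
    """Return (width, height) after a width change with aspect ratio locked."""
    scale = new_width / original_w
    return new_width, round(original_h * scale)

size = resize_locked(400, 300, 200)
```

With the companion "relative to an original size" option, the scale factor would always be computed against the stored original dimensions rather than the most recent size, so repeated adjustments do not accumulate rounding drift.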
In other aspects of the present disclosure, rather than providing formatting menu 1456 (as illustrated by FIG. 14A), a formatting toolbar may be provided (not shown). For example, the formatting toolbar may be provided upon selection of a "Pic Format" tab. Options available in a formatting toolbar associated with a pic format tab may include, for instance, moving an image from foreground to background, or background to foreground. Options may also include editing the image (e.g., touchup tools, etc.), adjusting colors, and/or adding artistic effects. Options for sizing the image, cropping the image, changing image orientation (e.g., vertically aligned to horizontally aligned), popping an image out of a cell, changing picture styles, changing picture borders, and/or changing picture layout may also be provided. Any number of options for manipulating images associated with cells may be provided in any suitable ribbon, tab, toolbar, menu, and the like. Moreover, upon manipulating an image within the spreadsheet, corresponding changes to the image may be made within a chart. As should be appreciated, in addition to an image, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 14A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 14B illustrates an interface showing a selection of cells associated with images in a spreadsheet, according to an example embodiment.
Similar to FIG. 14A, FIG. 14B shows interface 1400 of a spreadsheet application including spreadsheet 1402 and navigation ribbon 1404, which includes cell identifier 1406 and formula bar 1408. In this case, a range of cells 1436 (e.g., B3:C5) is identified as selected (e.g., by shading) and the cell at the top left corner of the range (i.e., cell B3) is identified by cell identifier 1406 (e.g., "B3") in the navigation pane 1404. The range of cells 1436 may be selected by highlighting the range of cells, touching (or swiping) the range of cells, entering "=B3:C5" into the formula bar, etc. In some cases, in response to a selection of a range of cells, formula bar 1408 may be blank (as shown).
As further illustrated, an insert tab, i.e., tab 1412C entitled "Insert," has been selected, as indicated by an unshaded tab. In response to selection of insert tab 1412C, insert toolbar 1410B is provided. Insert toolbar 1410B provides a number of options for selecting various items to insert into the spreadsheet 1402. For instance, tables including pivot tables and other tables; illustrations including clip art, pictures, shapes, SmartArt, etc.; and symbols including equations and other symbols may be inserted into spreadsheet 1402. Additionally, insert toolbar 1410B provides for selecting various charts 1434, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 1402.
As should be appreciated, the various devices, components, etc., described with respect toFIG. 14B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 14C illustrates an interface for selecting a charting function, according to an example embodiment.
Similar to FIGS. 14A and 14B, FIG. 14C shows interface 1400 of a spreadsheet application including spreadsheet 1402 and navigation ribbon 1404, which includes cell identifier 1406 and formula bar 1408. As detailed above, a range of cells 1436 (e.g., B3:C5) is identified as selected (e.g., by shading) and the cell at the top left corner of the range (i.e., cell B3) is identified by cell identifier 1406 (e.g., “B3”) in the navigation pane 1404.
As further illustrated by FIG. 14C, a bar chart icon 1438 has been selected (in particular, a column bar chart). In aspects, in response to selecting bar chart icon 1438, a dropdown menu 1440 (or other interface) may be provided for selecting different types of bar charts, such as two-dimensional (2D) bar charts, three-dimensional (3D) bar charts, cylinder bar charts, etc. In other aspects, dropdown menu 1440 may be provided in response to additional input (e.g., right click, cursor hover, etc.). As illustrated, 2D bar chart 1442 is identified as selected (e.g., by shading).
As should be appreciated, the various devices, components, etc., described with respect to FIG. 14C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 14D illustrates a bar chart incorporating images, according to an example embodiment.
Similar to FIGS. 14A-14C, FIG. 14D shows interface 1400 of a spreadsheet application including spreadsheet 1402 and navigation ribbon 1404, which includes cell identifier 1406 and formula bar 1408. As further illustrated by FIG. 14D, a bar chart icon 1438 has been selected (in particular, a column bar chart). In this case, chart 1444 has been created and inserted in spreadsheet 1402. In examples, chart 1444 may be inserted as an overlay (shown) on spreadsheet 1402 or may be launched in a separate window or interface (not shown).
Chart 1444 is a bar chart graphing automobile models 1446 versus price 1448. In this case, a first bar 1450 represents a first sticker price for model “Z4,” a second bar 1452 represents a second sticker price for model “Hardtop,” and a third bar 1454 represents a third sticker price for model “Fortwo.” As further illustrated, although images 1414, 1416 and 1418 were not within the selected range of cells 1436, these images have been incorporated into the chart. In aspects, images 1414, 1416 and 1418 may be identified as associated with the selected range of cells 1436 based on being within the same rows, respectively, as at least one cell within the selected range of cells 1436.
In particular, a first image (e.g., image 1414) of a Z4 model is incorporated into the first bar 1450, a second image (e.g., image 1416) of a Hardtop model is incorporated into the second bar 1452, and a third image (e.g., image 1418) of a Fortwo model is incorporated into the third bar 1454. As shown, the first, second and third images are incorporated at a top of the first bar 1450, the second bar 1452 and the third bar 1454, respectively. In other examples, the first, second and third images may be shown as fill for the first, second and third bars 1450-1454, or represented by a visual representation (e.g., icon) which displays the first, second or third images upon selection, or otherwise incorporated into the chart 1444. In still other examples, other objects (e.g., audio files, videos, streaming data, etc.) may be associated with the selected range of cells 1436 and may be similarly incorporated within chart 1444.
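The row-based association described above can be sketched as follows. This is an illustrative sketch only: the cell addresses, the image table, and the helper names are hypothetical and not taken from the disclosure.

```python
# Sketch: associate images with a selected range by row, as described for
# chart 1444. An image in column A is charted if its row overlaps a row of
# the selected range.

def parse_address(addr: str) -> tuple[str, int]:
    """Split a cell address like 'B3' into (column, row)."""
    col = "".join(ch for ch in addr if ch.isalpha())
    row = int("".join(ch for ch in addr if ch.isdigit()))
    return col, row

def images_for_range(selected: list[str], images_by_cell: dict[str, str]) -> list[str]:
    """Return images whose cells share a row with any selected cell."""
    selected_rows = {parse_address(a)[1] for a in selected}
    return [img for cell, img in sorted(images_by_cell.items())
            if parse_address(cell)[1] in selected_rows]

# Hypothetical layout: images in column A, data selected in B3:C5
# (the range is written out cell by cell for simplicity).
images = {"A3": "Z4.png", "A4": "Hardtop.png", "A5": "Fortwo.png", "A9": "other.png"}
selected_range = ["B3", "C3", "B4", "C4", "B5", "C5"]
associated = images_for_range(selected_range, images)
```

Here image "A9" is skipped because row 9 falls outside the selected rows 3-5, mirroring how only images 1414-1418 are pulled into the chart.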
As should be appreciated, the various devices, components, etc., described with respect to FIG. 14D are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 15A illustrates an interface showing a selected image associated with a compound data type represented by a record, according to an example embodiment.
As illustrated, an interface 1500 of a spreadsheet application is provided. Interface 1500 includes a spreadsheet 1502, a navigation ribbon 1504 (including a cell identifier 1506 and a formula bar 1508), and a home toolbar 1510A. Interface 1500 further includes a plurality of tabs 1512 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab, i.e., tab 1512A entitled “Home,” is selected, which is indicated as an “unshaded” tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. A cell identifier 1506 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 1504.
As illustrated, cell D3 contains image 1514, which depicts an automobile. In some aspects, image 1514 may be associated with a compound data type. In this case, a formula bar 1508 may display a function describing the compound data type associated with image 1514 contained in cell D3. In other aspects (not shown), formula bar 1508 for cell D3 may display a function referencing a globally unique name for the compound data type associated with image 1514 contained in cell D3.
A function representing the compound data type may be identified using a variety of syntax. For instance, the function may surface whatever image data, image attributes or additional data are stored in the compound data type and may be represented as: =IMAGE(“http://www.BMWpics.com/Z4.png”, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”). In this case, a first portion of the function may reference a file locator for image 1514, e.g., =IMAGE(“http://www.BMWpics.com/Z4.png”), and a second portion of the function may reference additional data, e.g., “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”.
As further illustrated, a record 1520 may display fields and values of the compound data type contained in cell D3. In this case, where a user combines an image with an arbitrary set of values (e.g., a record), the function may be represented as: =IMAGE(“http://www.BMWpics.com/Z4.png”, RECORD(“MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”)). In still other aspects, where an image (identified by a “.png” file extension) is added to a compound data type constructed by a user, the image would amount to a value within the compound data type (e.g., a record) and the function may be represented as: =RECORD(“Image”, Z4.png, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”). In still other aspects, a user may create a compound data type and give the compound data type a name (e.g., “AutoResearch”). The next time the compound data type is used, each attribute name is already known as a field in the “AutoResearch” compound data type and only the values need to be called out in the function, which may be represented as: =AUTORESEARCH(“http://www.BMWpics.com/Z4.png”, “BMW”, “Z4”, $49,700, 28.1 mpg, “8,763 miles”). Further, the function may simply reference attributes of the image and read the values from metadata, e.g., =RECORD(“Image”, “http://www.BMWpics.com/Z4.png”, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, Z4.price, “MPG”, Z4.mpg, “Miles”, Z4.miles). In this case, a user may provide custom fields within the compound data type (e.g., record) and, by dereferencing the ‘image’ field, values may be read from metadata and populated in the user's defined fields in the record. As should be appreciated, in addition to an image, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
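The name/value pairing behind these function representations can be sketched as string construction. This is an assumed serialization for illustration only; the actual spreadsheet syntax, quoting rules, and field set are those shown in the disclosure, not this helper.

```python
# Sketch: render a compound data type (a record of name/value pairs plus a
# file locator) as a function string of the form =IMAGE(url, "Field", value, ...).

def quote(v):
    """Quote string values; leave numbers as-is (a simplifying assumption)."""
    return f'"{v}"' if isinstance(v, str) else str(v)

def image_function(url: str, fields: dict) -> str:
    """Build the function text: first portion is the file locator, second
    portion alternates field names and values."""
    parts = [quote(url)]
    for name, value in fields.items():
        parts.append(quote(name))
        parts.append(quote(value))
    return "=IMAGE(" + ", ".join(parts) + ")"

# Two illustrative fields from record 1520.
record = {"MakeName": "BMW", "ModelName": "Z4"}
formula = image_function("http://www.BMWpics.com/Z4.png", record)
```

The same construction would apply to a nested RECORD(...) form or a named compound data type such as “AutoResearch”; only the leading function name and grouping change.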
As should be appreciated, users may add objects to a spreadsheet as native object types. In this case, code may be written that represents an object, and that code may be added to a cell, to a file, or at some URL, which the spreadsheet application may traverse to retrieve the object. In this case, the object may be defined by its internal representation, e.g., its code. For instance, the object may have JSON properties. The formula bar for a cell in which the object resides may then display an icon for that object type, or text that indicates the object is of a particular type, etc. Dereferencing the properties of the object, e.g., through A1.&lt;propertyName&gt;, can be used so long as the developer implemented such names/properties, or more accurately, implemented an interface that allows the spreadsheet to retrieve such object parameters. In aspects, a developer may be able to define literally any property for an object, e.g., a “shoe size” property for a video. Additionally or alternatively, a user interface may be provided that surfaces properties for the object to the user in a dialog. That is, spreadsheet functionality may be provided such that developers are not required to make objects with function representations, i.e., objects without such representations may still be compatible. In some implementations, an object may be built using a general card UI for specifying the object, a name for the object, arbitrary name/value pairs, etc. In some cases, while the data structures and behavior of the object may be the same, this implementation may not generate a function representation for the object, providing flexibility to the developer.
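The property-retrieval interface described above can be sketched as a minimal class. Everything here is hypothetical: the class name, the grid dictionary, and the example properties (including the “shoe size” property mentioned above) are illustrative stand-ins, not an actual spreadsheet API.

```python
# Sketch: a native object type exposing a property interface so the
# spreadsheet can resolve a reference like A1.duration.

class NativeObject:
    """An object defined by its internal (e.g., JSON-like) representation."""
    def __init__(self, type_name: str, properties: dict):
        self.type_name = type_name    # could surface as an icon/text in the formula bar
        self.properties = properties  # arbitrary developer-defined name/value pairs

    def get_property(self, name: str):
        """The interface the spreadsheet calls when dereferencing A1.<name>."""
        if name not in self.properties:
            raise KeyError(f"{self.type_name} has no property '{name}'")
        return self.properties[name]

# Hypothetical grid: a video object residing in cell A1.
grid = {"A1": NativeObject("video", {"duration": 120, "shoe size": 9})}

def deref(cell: str, prop: str):
    """Resolve a reference of the form <cell>.<property>."""
    return grid[cell].get_property(prop)
```

Because the lookup goes through `get_property` rather than a fixed schema, any developer-defined property name resolves the same way, which matches the point that objects need not carry function representations to be compatible.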
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 15A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 15B illustrates an interface for selecting a charting function, according to an example embodiment.
Similar to FIG. 15A, FIG. 15B shows interface 1500 of a spreadsheet application including spreadsheet 1502 and navigation ribbon 1504, which includes cell identifier 1506 and formula bar 1508. In this case, a range of cells 1522 (e.g., D3:D5) is identified as selected (e.g., by outlining) and the cell at the top of the range (i.e., cell D3) is identified by cell identifier 1506 (e.g., “D3”) in the navigation pane 1504. The range of cells 1522 may be selected by highlighting the range of cells, touching (or swiping) the range of cells, entering “=D3:D5” into the formula bar, etc. In some cases, in response to a selection of a range of cells, formula bar 1508 may be blank (shown).
As further illustrated, an insert tab, i.e., tab 1512B entitled “Insert,” has been selected, as indicated by an unshaded tab. In response to selection of insert tab 1512B, insert toolbar 1510B is provided. Insert toolbar 1510B provides a number of options for selecting various items to insert into the spreadsheet 1502, as described above with respect to FIG. 14C. For instance, insert toolbar 1510B provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 1502.
As further illustrated by FIG. 15B, a line graph icon 1524 is identified as selected (e.g., by shading). In aspects, in response to selecting line graph icon 1524, a dropdown menu 1526 (or other interface) may be provided for selecting different types of line graphs, such as two-dimensional (2D) line graphs, three-dimensional (3D) line graphs, etc. In other aspects, dropdown menu 1526 may be provided in response to additional input (e.g., right click, cursor hover, etc.). As illustrated, 2D line graph 1528 is identified as selected (e.g., by shading).
As should be appreciated, the various devices, components, etc., described with respect to FIG. 15B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 15C illustrates a line graph charting values associated with images in compound data types, according to an example embodiment.
Similar to FIGS. 15A-15B, FIG. 15C shows interface 1500 of a spreadsheet application including spreadsheet 1502 and navigation ribbon 1504, which includes cell identifier 1506 and formula bar 1508. As further illustrated by FIG. 15C, a line graph icon 1524 has been selected (as indicated by shading). In this case, chart 1530 has been created and inserted in spreadsheet 1502. In examples, chart 1530 may be inserted as an overlay (shown) on spreadsheet 1502 or may be launched in a separate window or interface (not shown).
Chart 1530 is a line graph charting miles per gallon (mpg) 1532 versus price 1534. That is, although values for miles per gallon for each automobile were not represented within the selected range of cells 1522, this data was nonetheless charted. In this regard, a value for mpg may be represented in a compound data type associated with each image, as illustrated for image 1514 in FIG. 15A, and may be identified and charted. In this case, a first image (e.g., image 1514) represents a first data point for model “Z4,” a second image (e.g., image 1516) represents a second data point for model “Hardtop,” and a third image (e.g., image 1518) represents a third data point for model “Fortwo” on the chart 1530. In other examples, rather than providing images 1514-1518 as data points, a visual representation (e.g., icon) for each image 1514-1518 may be provided as a data point. In response to selection of the visual representations, one or more of images 1514-1518 may be displayed as an overlay to chart 1530, within a separate window, or otherwise. In further aspects, data and/or parameters associated with compound data types for other objects (e.g., audio files, videos, streaming data) within a spreadsheet may be similarly charted and such objects may be similarly incorporated into a chart such as chart 1530.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 15C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 16 illustrates a bar chart incorporating images in response to satisfaction of a condition, according to a first example embodiment.
FIG. 16 shows interface 1600 of a spreadsheet application including spreadsheet 1602 and navigation ribbon 1604, which includes cell identifier 1606 and formula bar 1608. As further illustrated, an insert tab, i.e., tab 1612 entitled “Insert,” has been selected, as indicated by an unshaded tab. In response to selection of insert tab 1612, insert toolbar 1610 is provided. Insert toolbar 1610 provides a number of options for selecting various items to insert into the spreadsheet 1602. For instance, insert toolbar 1610 provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 1602.
As further illustrated by FIG. 16, a bar chart icon 1614 has been selected (in particular, a column bar chart). In this case, chart 1616 has been created and inserted in spreadsheet 1602. In examples, chart 1616 may be inserted as an overlay (shown) on spreadsheet 1602 or may be launched in a separate window or interface (not shown). Chart 1616 is a bar chart graphing revenue values 1620 for months 1618. In this case, a first bar 1622 represents a first revenue value for January, a second bar 1624 represents a second revenue value for February, and a third bar 1626 represents a third revenue value for March. In this example, underlying spreadsheet data is not shown.
As illustrated, a first image 1628A of fireworks is incorporated into the second bar 1624 and a second image 1628B of fireworks is incorporated into the third bar 1626. In aspects, a condition may be specified by a user or may be automatically generated by the spreadsheet application. For instance, the condition may specify that when revenue values exceed a particular threshold, e.g., $30,000, an image 1628 of fireworks should be incorporated into a corresponding bar or data point of the chart. In this case, the revenue values for each month may be evaluated against the threshold to determine whether the condition is satisfied. In aspects, a file locator, link, reference or pointer to image 1628 may be included in a condition function. As illustrated, the first revenue value for January does not exceed the threshold, whereas the second and third revenue values (represented by the second and third bars 1624-1626, respectively) for February and March exceed the threshold of $30,000. Accordingly, image 1628 of fireworks is not incorporated into the first bar 1622 and is incorporated into each of the second and third bars 1624-1626 as first image 1628A and second image 1628B.
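The threshold condition described above can be sketched as a simple evaluation pass over the bar values. The revenue figures and the image file name below are hypothetical; the disclosure does not specify a condition-function syntax, so this is an illustrative assumption.

```python
# Sketch: evaluate each bar's value against a threshold condition and
# attach an image reference (e.g., fireworks) to the bars that satisfy it.

def annotate_bars(values: dict, threshold: float, image_ref: str) -> dict:
    """Map each category to the image reference if its value exceeds the threshold."""
    return {label: image_ref
            for label, value in values.items()
            if value > threshold}

# Hypothetical monthly revenue values; only January falls below $30,000.
revenue = {"January": 25000, "February": 34000, "March": 41000}
annotations = annotate_bars(revenue, 30000, "fireworks.png")
```

Consistent with the figure, January receives no annotation while February and March each receive the fireworks image reference, which the charting step would then render at the top of the corresponding bar.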
As shown, the first image 1628A and the second image 1628B are incorporated at a top of the second bar 1624 and the third bar 1626, respectively. In other examples, the first and second images 1628A-B may be shown as fill for the second and third bars 1624-1626, or represented by a visual representation (e.g., icon) which displays the first and second images 1628A-B upon selection. In other examples, other objects (e.g., audio files, videos, streaming data) may be similarly incorporated within chart 1616 upon satisfaction of a condition.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 16 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 17 illustrates a bar chart incorporating a plurality of images within a single bar, according to an example embodiment.
FIG. 17 shows interface 1700 of a spreadsheet application including spreadsheet 1702 and navigation ribbon 1704, which includes cell identifier 1706 and formula bar 1708. As further illustrated, an insert tab, i.e., tab 1712 entitled “Insert,” has been selected, as indicated by an unshaded tab. In response to selection of insert tab 1712, insert toolbar 1710 is provided. Insert toolbar 1710 provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 1702.
As further illustrated by FIG. 17, a bar chart icon 1714 has been selected (in particular, a column bar chart). In this regard, chart 1716 has been created and inserted in spreadsheet 1702. In examples, chart 1716 may be inserted as an overlay (shown) on spreadsheet 1702 or may be launched in a separate window or interface (not shown). Chart 1716 is a bar chart graphing a number of students 1720 having grades 1718 for period 1 of a class. In this case, a first bar 1722 represents a first number of students (i.e., two students) having “A's”, a second bar 1724 represents a second number of students (i.e., three students) having “B's”, a third bar 1726 represents a third number of students (i.e., three students) having “C's”, and a fourth bar 1728 represents a fourth number of students (i.e., one student) having a “D”. In this example, underlying spreadsheet data is not shown.
As illustrated, two images (one image corresponding to each of the two students having A's) are incorporated into the first bar 1722, three images (one image corresponding to each of the three students having B's) are incorporated into the second bar 1724, three images (one image corresponding to each of the three students having C's) are incorporated into the third bar 1726, and one image corresponding to the student having a D is incorporated into the fourth bar 1728. In aspects, images corresponding to each student may be associated with one or more cells selected for charting (not shown) in chart 1716, or images corresponding to each student may be associated with at least one cell that is within the same row as at least one cell of the selected one or more cells. In further aspects, selection of a charting function may include an indication to incorporate the images corresponding to the students into the chart. Alternatively, the images corresponding to the students may automatically be incorporated into the chart. As shown, images corresponding to the students are sized to fit within first, second, third and fourth bars 1722-1728. In other examples, the images may be represented by visual representations (e.g., icons) that display the images corresponding to the students upon selection. As should be appreciated, the above-described operations may be applied to other objects associated with a spreadsheet, e.g., an audio file, a video, streaming data, and the like.
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 17 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 18 illustrates a bar chart incorporating an image in response to satisfaction of a condition, according to a second example embodiment.
FIG. 18 shows interface 1800 of a spreadsheet application including spreadsheet 1802 and navigation ribbon 1804, which includes cell identifier 1806 and formula bar 1808. As further illustrated, an insert tab, i.e., tab 1812 entitled “Insert,” has been selected, as indicated by an unshaded tab. In response to selection of insert tab 1812, insert toolbar 1810 is provided. Insert toolbar 1810 provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 1802.
As further illustrated by FIG. 18, a bar chart icon 1814 has been selected (in particular, a column bar chart). In this case, chart 1816 has been created and inserted in spreadsheet 1802. In examples, chart 1816 may be inserted as an overlay (shown) on spreadsheet 1802 or may be launched in a separate window or interface (not shown). Chart 1816 is a bar chart graphing average stock prices 1820 (e.g., for a particular stock or a group of stocks) for months 1818. In this case, a first bar 1822 represents a first average stock price for January, a second bar 1824 represents a second average stock price for February, a third bar 1826 represents a third average stock price for March, and a fourth bar 1828 represents a fourth average stock price for April. In this example, underlying spreadsheet data is not shown.
As illustrated, a single image of a bull (i.e., image 1830) is incorporated as fill across each of the first, second, third and fourth bars 1822-1828. In aspects, a condition may be specified by a user or may be automatically generated by the spreadsheet application. For instance, the condition may specify that when average stock prices are increasing month over month (e.g., for a particular stock or a group of stocks), image 1830 of a bull should be incorporated as fill into the bars of the chart. Conversely, the same or another condition may specify that when average stock prices are decreasing month over month (e.g., for a particular stock or a group of stocks), an image of a bear (not shown) should be incorporated as fill into the bars of the chart.
As shown, the average stock prices for each month may be evaluated to determine whether they are increasing or decreasing in order to determine whether the condition is satisfied. In aspects, a file locator, link, reference or pointer to image 1830 may be included in a condition function. As illustrated, the first average stock price for January is less than the second average stock price for February, which is less than the third average stock price for March, which is less than the fourth average stock price for April. In this case, the average stock prices are increasing month-over-month between January and April. Accordingly, image 1830 of a bull is incorporated as fill in each of the first, second, third and fourth bars 1822-1828. In contrast, had the average stock price decreased between March and April, based on the condition, an image of a bear would have been incorporated as fill in the fourth bar 1828 for April (not shown). As should be appreciated, other objects may be incorporated into a chart based on a condition, e.g., an audio file, a video, streaming data, and the like.
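The month-over-month evaluation described above can be sketched as a comparison over the ordered price series. The price values and image file names below are hypothetical, and the whole-series check is a simplification; a per-bar variant (as in the March-to-April example) would compare each month only against the prior month.

```python
# Sketch: choose a fill image based on whether average stock prices are
# increasing month over month across the charted period.

def fill_image(prices: list[float]) -> str:
    """Return a 'bull' fill if prices strictly increase month over month,
    otherwise a 'bear' fill."""
    increasing = all(prev < cur for prev, cur in zip(prices, prices[1:]))
    return "bull.png" if increasing else "bear.png"

# Hypothetical January-April averages: each month exceeds the prior month,
# so the condition selects the bull image as fill.
jan_to_apr = [100.0, 105.5, 112.0, 118.25]
chosen = fill_image(jan_to_apr)
```

With a decline inserted (e.g., a March-to-April drop), the same check would select the bear image instead.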
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 18 are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 19A illustrates an interface showing a selected image associated with a compound data type represented by a record, according to an example embodiment.
As illustrated, an interface 1900 of a spreadsheet application is provided. Interface 1900 includes a spreadsheet 1902, a navigation ribbon 1904 (including a cell identifier 1906 and a formula bar 1908), and a home toolbar 1910A. Interface 1900 further includes a plurality of tabs 1912 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab, i.e., tab 1912A entitled “Home,” is selected, which is indicated as an “unshaded” tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. A cell identifier 1906 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 1904.
As illustrated, cell D3 contains image 1914, which depicts an automobile. In some aspects, image 1914 may be associated with a compound data type. In this case, a formula bar 1908 may display a function describing the compound data type associated with image 1914 contained in cell D3. In other aspects (not shown), formula bar 1908 for cell D3 may display a function referencing a globally unique name for the compound data type associated with image 1914 contained in cell D3.
A function representing the compound data type may be identified using a variety of syntax. For instance, the function may surface whatever image data, image attributes or additional data are stored in the compound data type and may be represented as: =IMAGE(“http://www.BMWpics.com/Z4.png”, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”, “GPS”, 32.7767° N, 96.7970° W). In this case, a first portion of the function may reference a file locator for image 1914, e.g., =IMAGE(“http://www.BMWpics.com/Z4.png”), and a second portion of the function may reference additional data, e.g., (“MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”, “GPS”, 32.7767° N, 96.7970° W).
As further illustrated, a record 1920 may display fields and values of the compound data type contained in cell D3. In this case, where a user combines an image with an arbitrary set of values (e.g., a record), the function may be represented as: =IMAGE(“http://www.BMWpics.com/Z4.png”, RECORD(“MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”, “GPS”, 32.7767° N, 96.7970° W)). In still other aspects, where an image (identified by a “.png” file extension) is added to a compound data type constructed by a user, the image would amount to a value within the compound data type (e.g., a record) and the function may be represented as: =RECORD(“Image”, Z4.png, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, $49,700, “MPG”, 28.1 mpg, “Miles”, “8,763 miles”, “GPS”, 32.7767° N, 96.7970° W). In still other aspects, a user may create a compound data type and give the compound data type a name (e.g., “AutoResearch”). The next time the compound data type is used, each attribute name is already known as a field in the “AutoResearch” compound data type and only the values need to be called out in the function, which may be represented as: =AUTORESEARCH(“http://www.BMWpics.com/Z4.png”, “BMW”, “Z4”, $49,700, 28.1 mpg, “8,763 miles”, 32.7767° N, 96.7970° W). Further, the function may simply reference attributes of the image and read the values from metadata, e.g., =RECORD(“Image”, “http://www.BMWpics.com/Z4.png”, “MakeName”, “BMW”, “ModelName”, “Z4”, “ModelPrice”, Z4.price, “MPG”, Z4.mpg, “Miles”, Z4.miles, “GPS”, Z4.gps). In this case, a user may provide custom fields within the compound data type (e.g., record) and, by dereferencing the ‘image’ field, values may be read from metadata and populated in the user's defined fields in the record.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 19A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 19B illustrates a popup interface for selecting a charting function for images in a spreadsheet, according to an example embodiment.
Similar toFIG. 19A,FIG. 19B showsinterface1900 of a spreadsheetapplication including spreadsheet1902 andnavigation ribbon1904, which includescell identifier1906 andformula bar1908. In this case, a range of cells1922 (e.g., D3:D5) is identified as selected (e.g., by outlining) and the cell at the top of the range (i.e., cell D3) is identified by cell identifier1906 (e.g., “D3”) in thenavigation pane1904. The range ofcells1922 may be selected by highlighting the range of cells, touching (or swiping) the range of cells, entering “=D3:D5” into the formula bar, etc. In some cases, in response to a selection of a range of cells,formula bar1908 may be blank (shown).
As further illustrated, an insert tab, i.e.,tab1912B entitled “Insert,” has been selected, as indicated by an unshaded tab. In response to selection ofinsert tab1912B, inserttoolbar1910B is provided.Insert toolbar1910B provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion intospreadsheet1902. In aspects, in response to selecting the range ofcells1922, a first popup menu1924 (or other interface) may be provided with any number of options for manipulating data or objects associated with the selected range ofcells1922.
As illustrated,first popup menu1924 provides options including “Cut,” “Copy” and “Paste.” In this case, data or objects may be cut from a cell, copied in a cell and/or pasted to a cell. Additionally,first popup menu1924 provides an option to “Float an image on grid,” an “Insert” option, and a “Delete” option. The “Insert” option may enable a user to associate data, images or other objects with the selected range ofcells1922. In contrast to the “Insert” option, the “Delete” option may enable a user to delete data, images or other objects from the selected range ofcells1922. In addition,first popup menu1924 may provide “Filter” and “Sort” options, an “Insert Comment” option and a “Format Cells” option. A “Define Name” option may enable a globally unique name to be assigned to an image or other object.First popup menu1924 may further provide a “Hyperlink” option for inserting a hyperlink to a file, a webpage, third-party streaming service, data streaming device, or otherwise.
Additionally, a “Chart” option 1926 may be provided for selecting a charting function for application to the selected range of cells 1922. In response to selecting “Chart” option 1926, a second popup menu 1928 may be provided. The second popup menu 1928 may provide options for selecting different types of charts, such as bar charts, line graphs, map charts, pie charts, etc. As illustrated, map chart option 1930 is identified as selected (e.g., by outlining).
As should be appreciated, the various devices, components, etc., described with respect to FIG. 19B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 19C illustrates a map chart with a card view of an image, according to an example embodiment.
Similar to FIGS. 19A-19B, FIG. 19C shows interface 1900 of a spreadsheet application including spreadsheet 1902 and navigation ribbon 1904, which includes cell identifier 1906 and formula bar 1908. As further illustrated by FIG. 19C, chart 1932 has been created and inserted in spreadsheet 1902. In examples, chart 1932 may be inserted as an overlay (shown) on spreadsheet 1902 or may be launched in a separate window or interface (not shown).
Chart 1932 is a map chart graphing GPS locations for images. That is, although GPS locations for each of the images 1914-1918 were not represented within the selected range of cells 1922, this data was nonetheless charted. In this regard, a GPS location for each image (e.g., corresponding to where the image was taken) may be represented in a compound data type associated with the image, as illustrated for image 1914 in FIG. 19A. The spreadsheet application may extract and identify such data within the compound data type associated with each image (e.g., via a parameter retriever 116) and may chart the data (e.g., via a charting component 118). For example, a first image (e.g., image 1914) may be represented by a first GPS location 1934, a second image (e.g., image 1916) may be represented by a second GPS location 1936, and a third image (e.g., image 1918) may be represented by a third GPS location 1938 on the chart 1932.
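The extraction step described above can be sketched in Python. This is an illustrative model only, not the patented implementation: the record layout, field name “GPS,” and coordinate values are assumptions made for the example.

```python
# Minimal sketch: pull a nested "GPS" field out of the compound data
# type associated with each image cell, then collect the points that
# a map chart could plot. Field names and values are illustrative.
def extract_gps(record):
    """Return the (lat, lon) stored under the 'GPS' key, if present."""
    gps = record.get("GPS")
    return tuple(gps) if gps else None

# Hypothetical compound data types for the three image cells D3:D5.
cells = {
    "D3": {"Make": "BMW", "Model": "Z4", "GPS": (47.64, -122.13)},
    "D4": {"Make": "Audi", "Model": "TT", "GPS": (47.61, -122.33)},
    "D5": {"Make": "Ford", "Model": "GT", "GPS": (47.67, -122.12)},
}

# Chart only the cells whose compound data type carries a GPS value.
points = {cell: extract_gps(rec)
          for cell, rec in cells.items() if extract_gps(rec)}
```

The key point the sketch illustrates is that the charted values come from the compound data type, not from any cell of the selected range itself.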
Furthermore, a visual representation (e.g., active data point) at each GPS location may be selectable to view the corresponding image. For instance, in response to selection of a visual representation, a corresponding image (e.g., one of images 1914-1918) may be displayed as an overlay to chart 1932, within a separate window, or otherwise. In further examples, a “card view” of a corresponding image may be displayed in response to selection of a visual representation. As illustrated, a visual representation (e.g., active data point) for GPS location 1934 has been selected by hovering (as indicated by cursor icon 1940 shown near or over GPS location 1934). In response to selecting the visual representation, card 1942 is displayed as an overlay to chart 1932. Card 1942 may be provided in an organized and stylized layout, including a formatted header (e.g., “D3”), image 1914 displayed within an interactive insert (e.g., including a view control 1944 allowing for 360° views of the automobile), and additional data 1946 (e.g., including formatted data descriptors for each piece of information). In the illustrated aspect, card 1942 is entitled “D3,” which corresponds to cell D3 within which image 1914 is located in the spreadsheet 1902. Alternatively, card 1942 may be entitled “Z4” (the model of the automobile depicted by image 1914) or “BMW” (the make of the automobile depicted by image 1914) or otherwise.
Additional data 1946 corresponds to at least a portion of the data contained within the compound data type associated with image 1914, as illustrated by record 1920. In particular, additional data 1946 provides information regarding the automobile depicted by image 1914, including the make, model, price, mpg, miles, and GPS location. As illustrated by FIG. 19A, some of this information is provided as values within the selected range of cells 1922, whereas other information is available within the record 1920 of the compound data type associated with image 1914. As detailed above, the spreadsheet may identify and chart data from either source. In some cases, card 1942 may provide information obtained via a link associated with an image. For instance, 360° views of the automobile depicted by image 1914 may be obtained by following the link to the image, e.g., http://www.BMWpics.com/Z4.png, or otherwise, and may be provided in card 1942. In this regard, card 1942 provides a user-friendly interface (e.g., organized and stylized) for viewing additional data associated with image 1914 in response to user selection. In further aspects, data and/or parameters associated with compound data types for other objects (e.g., audio files, videos, streaming data) within a spreadsheet may be similarly charted and such objects may be similarly incorporated into a chart such as chart 1932.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 19C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 20A illustrates an interface showing audio files associated with cells of a spreadsheet, according to an example embodiment.
As illustrated, an interface 2000 of a spreadsheet application is provided. Interface 2000 includes a spreadsheet 2002, a navigation ribbon 2004 (including a cell identifier 2006 and a formula bar 2008), and a home toolbar 2010A. Interface 2000 further includes a plurality of tabs 2012 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab, i.e., tab 2012A entitled “Home,” is selected, which is indicated as an “unshaded” tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. As shown, a cell identifier 2006 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 2004. Additionally, formula bar 2008 displays a function calling a file locator, e.g., fx=GETAUDIO(“C:\Depos\Key\20131203.wav”), for the audio file (i.e., audio file 2014) displayed within cell D3.
As further illustrated, a visual representation 2036 of audio file 2014 is displayed in cell D3. A visual representation of an audio file may be provided as any suitable identifier of an audio file, e.g., a waveform rendering (shown), a speaker icon, a play icon, a special icon with metadata, a file name, and the like. FIG. 20A also illustrates a play control 2020 displayed over visual representation 2040 of audio file 2018. In response to activating play control 2020, audio file 2018 may be played. Similarly, play controls are displayed over visual representations 2036 and 2038 for playing audio files 2014 and 2016, respectively. As further illustrated by FIG. 20A, additional data describing audio files 2014, 2016 and 2018 is stored in cells within adjacent rows and/or columns. For example, in column A (i.e., column 2028), cells A3, A4 and A5 contain data regarding the deposition “dates” of the depositions recorded in audio files 2014, 2016, and 2018, respectively. In column B (i.e., column 2030), cells B3, B4 and B5 contain data regarding the “location” of the depositions recorded in audio files 2014, 2016 and 2018, respectively. In column C (i.e., column 2032), cells C3, C4 and C5 contain data regarding the “deponent” in the depositions recorded in audio files 2014, 2016 and 2018, respectively.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 20A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 20B illustrates an interface showing a selected audio file associated with a compound data type represented by a record, according to an example embodiment.
Similar to FIG. 20A, FIG. 20B shows interface 2000 of a spreadsheet application including spreadsheet 2002, navigation ribbon 2004 (which includes cell identifier 2006 and formula bar 2008), and an insert toolbar 2010B. Interface 2000 further includes a plurality of tabs 2012 for accessing various aspects and operations of the spreadsheet application. As illustrated, an insert tab, i.e., tab 2012B entitled “Insert,” is selected, which is indicated as an unshaded tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. A cell identifier 2006 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 2004.
As illustrated, a visual representation 2036 of audio file 2014 (which is a deposition recording) is displayed in cell D3. In some aspects, audio file 2014 may be associated with a compound data type. In this case, a formula bar 2008 may display a function describing the compound data type associated with audio file 2014 contained in cell D3. In other aspects (not shown), formula bar 2008 for cell D3 may display a function referencing a globally unique name for the compound data type associated with audio file 2014 contained in cell D3.
A function representing the compound data type may be identified using a variety of syntax. For instance, the function may surface whatever audio data, audio attributes or additional data are stored in the compound data type and may be represented as: =GETAUDIO(“C:\Depos\Key\20131203.wav”, “Title”, “Depo1”, “DeponentName”, “Mr. Key”, “Date”, Dec. 3, 2013, “LocationName”, “Chicago”, “Duration”, “05:42”). In this case, a first portion of the function may reference a file locator for audio file 2014, e.g., =GETAUDIO(“C:\Depos\Key\20131203.wav”), and a second portion of the function may reference additional data, e.g., (“Title”, “Depo1”, “DeponentName”, “Mr. Key”, “Date”, Dec. 3, 2013, “LocationName”, “Chicago”, “Duration”, “05:42”).
As further illustrated, a record 2042 may display fields and values of the compound data type contained in cell D3. In this case, where a user combines an audio file with an arbitrary set of values (e.g., a record), the function may be represented as: =GETAUDIO(“C:\depos\key\2013203.wav”, RECORD(“Title”, “Depo1”, “DeponentName”, “Mr. Key”, “Date”, Dec. 3, 2013, “LocationName”, “Chicago”, “Duration”, “05:42”)). In still other aspects, where an audio file (identified by a “.wav” file extension) is added to a compound data type constructed by a user, the audio file would amount to a value within the compound data type (e.g., a record) and the function may be represented as: =RECORD(“Audio”, “C:\depos\key\2013203.wav”, “Title”, “Depo1”, “DeponentName”, “Mr. Key”, “Date”, Dec. 3, 2013, “LocationName”, “Chicago”, “Duration”, “05:42”). In still other aspects, a user may create a compound data type and give the compound data type a name (e.g., “Deposition”). The next time the compound data type is used, each attribute name is already known as a field in the “Deposition” compound data type and only the values need to be called out in the function, which may be represented as: =DEPOSITION(“C:\depos\key\2013203.wav”, “Depo1”, “Mr. Key”, Dec. 3, 2013, “LocationName”, “Chicago”, “Duration”, “05:42”). Further, the function may simply reference attributes of the audio file and read the values from metadata, e.g., =RECORD(“Audio”, “C:\depos\key\2013203.wav”, “Title”, Audio.Title, “DeponentName”, “Mr. Key”, “Date”, Audio.CreatedDate, “LocationName”, Audio.CreatedPlace, “Duration”, Audio.Time). In this case, a user may provide custom fields within the compound data type (e.g., record) and, by dereferencing the ‘audio’ field, values may be read from metadata and populated in the user's defined fields in the record.
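The metadata-dereferencing form described above can be modeled with a short Python sketch. This is an illustrative analogy, not the disclosed implementation: the `make_record` helper, the `Audio.`-prefixed sentinel convention, and the metadata dictionary are all assumptions made for the example.

```python
# Illustrative model of a record-style compound data type: named
# fields plus a file locator, where a value written as "Audio.<attr>"
# is resolved from the audio file's metadata (hypothetical layout).
def make_record(audio_path, metadata, **fields):
    record = {"Audio": audio_path}
    for name, value in fields.items():
        # A sentinel like "Audio.Title" reads through to file metadata.
        if isinstance(value, str) and value.startswith("Audio."):
            record[name] = metadata.get(value.split(".", 1)[1])
        else:
            record[name] = value
    return record

# Hypothetical metadata extracted from the .wav file.
metadata = {"Title": "Depo1", "CreatedDate": "Dec. 3, 2013", "Time": "05:42"}

rec = make_record(
    r"C:\Depos\Key\20131203.wav", metadata,
    Title="Audio.Title", DeponentName="Mr. Key",
    Date="Audio.CreatedDate", Duration="Audio.Time",
)
```

In this sketch, explicitly supplied values (“Mr. Key”) pass through unchanged while dereferenced fields are filled from metadata, mirroring the mixed forms of the =RECORD(...) function above.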
As should be appreciated, the various devices, components, etc., described with respect to FIG. 20B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 20C illustrates a bar chart incorporating a plurality of audio files within a single bar, according to an example embodiment.
Similar to FIGS. 20A-20B, FIG. 20C shows interface 2000 of a spreadsheet application including spreadsheet 2002, navigation ribbon 2004 (which includes cell identifier 2006 and formula bar 2008), and insert toolbar 2010B. As illustrated, insert tab 2012B is selected, which is identified as an unshaded tab. In response to selection of insert tab 2012B, insert toolbar 2010B is displayed and provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 2002.
As further illustrated by FIG. 20C, bar chart icon 2044 has been selected (in particular, a column bar chart). In this regard, chart 2046 has been created and inserted in spreadsheet 2002. In examples, chart 2046 may be inserted as an overlay (shown) on spreadsheet 2002 or may be launched in a separate window or interface (not shown). Chart 2046 is a bar chart graphing total deposition time in hours for two deponents. In this case, a first bar 2058 represents total deposition time for a first deponent, Mr. Key, which includes two segments corresponding to two depositions as recorded on audio files 2014 and 2016. A second bar 2060 represents total deposition time for a second deponent, Ms. Block, corresponding to a deposition as recorded on audio file 2018.
As illustrated, a visual representation 2048 corresponding to audio file 2014 and a visual representation 2050 corresponding to audio file 2016 are incorporated into the first bar 2058. Additionally, a visual representation 2052 corresponding to audio file 2018 is incorporated into the second bar 2060. In this case, the visual representations are in the form of speaker icons. As detailed above, a visual representation of an audio file may be provided as any suitable identifier of an audio file, e.g., a waveform rendering, a speaker icon, a play icon, a special icon with metadata, a file name, and the like.
Although values for the durations of each audio file are not represented within spreadsheet 2002, these values were charted nonetheless. That is, as illustrated by FIG. 20B, each audio file may be associated with a compound data type storing audio data, audio attributes and/or additional data. As further illustrated by FIG. 20B, audio file 2014 is associated with a compound data type including a duration of 5 hours and 42 minutes (i.e., 05:42) for the recording of the deposition. While not illustrated by FIG. 20B, audio files 2016 and 2018 may also be associated with compound data types, which may each include a value for a duration of the corresponding deposition recording.
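The per-deponent aggregation behind the bar chart can be sketched as follows. This is a hedged illustration: only the Mr. Key 05:42 duration comes from the text; the second and third durations, and the grouping logic, are assumptions made for the example.

```python
# Sketch: total deposition time per deponent, summed from the
# "Duration" field of each audio file's compound data type.
from collections import defaultdict

def to_hours(hhmm):
    """Convert an 'HH:MM' duration string to fractional hours."""
    h, m = hhmm.split(":")
    return int(h) + int(m) / 60

# Hypothetical records; only the first duration appears in the text.
audio_records = [
    {"DeponentName": "Mr. Key", "Duration": "05:42"},
    {"DeponentName": "Mr. Key", "Duration": "03:18"},
    {"DeponentName": "Ms. Block", "Duration": "04:30"},
]

# One bar per deponent; multiple recordings stack as segments.
totals = defaultdict(float)
for rec in audio_records:
    totals[rec["DeponentName"]] += to_hours(rec["Duration"])
```

Each dictionary entry corresponds to one bar; the two Mr. Key records sum into a single bar with two segments, as described for first bar 2058.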
As further illustrated, visual representation 2050 (e.g., speaker icon) corresponding to audio file 2016 has been selected by hovering (as indicated by cursor icon 2056 shown near or over visual representation 2050). In response to selecting the visual representation, play controls 2054 are displayed for accessing audio file 2016. In particular, play controls 2054 include controls for playing, skipping forward or back, pausing and rewinding audio file 2016. As should be appreciated, in addition to an audio file, the above description may be applied to other objects associated with a spreadsheet, e.g., an image, a video, streaming data, and the like.
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 20C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 21A illustrates an interface showing an audio file associated with a spreadsheet, the spreadsheet providing a charting menu specific to the audio file, according to an example embodiment.
As illustrated, an interface 2100 of a spreadsheet application is provided. Interface 2100 includes a spreadsheet 2102, a navigation ribbon 2104 (including a cell identifier 2106 and a formula bar 2108), and a home toolbar 2110A. Interface 2100 further includes a plurality of tabs 2112 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab 2112A is selected, which is indicated as an unshaded tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. A cell identifier 2106 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 2104. Additionally, formula bar 2108 displays a function calling a file locator, e.g., fx=GETAUDIO(“C:\Depos\Key\20131203.wav”), for audio file 2114 displayed within cell D3.
As further illustrated, a visual representation 2116 of audio file 2114 is displayed in cell D3. A visual representation of an audio file may be provided as any suitable identifier of an audio file, e.g., a waveform rendering (shown), a speaker icon, a play icon, a special icon with metadata, a file name, and the like. FIG. 21A also illustrates a play control 2118 displayed over visual representation 2116 of audio file 2114. In response to activating play control 2118, audio file 2114 may be played or otherwise accessed. Additionally, audio file 2124 is associated with cell D4 and audio file 2126 is associated with cell D5.
In aspects, audio file 2114 may be associated with a compound data type storing audio parameters (e.g., audio data, audio attributes and/or additional data), as described above. In additional or alternative aspects, audio file 2114 may be associated with metadata storing audio parameters. As illustrated, in response to selection of cell D3 containing audio file 2114, a formatting menu 2120 may be provided by spreadsheet 2102. The formatting menu 2120 may be customized for audio file 2114 based on audio parameters (e.g., audio data, audio attributes and/or additional data) retrieved by the spreadsheet application for audio file 2114, e.g., from an associated compound data type and/or metadata. Formatting menu 2120 may include a number of tabs for viewing and manipulating various audio parameters, e.g., a color tab, a cell tab, a sizing tab, an audio tab, a charting tab 2122, and the like. For instance, the audio tab may provide audio data and/or audio attributes for viewing and manipulation of the audio file, such as volume, pitch, speed, bitrate type, bitrate, channel type, channel, and the like (not shown). As illustrated, charting tab 2122 is provided for selecting one or more audio parameters for charting, including audio data and/or audio attributes retrieved from a compound data type and/or metadata associated with the audio file 2114.
Charting tab 2122 provides, for example, options for selecting and charting audio parameters such as “volume,” “pitch,” “speed,” “bitrate type,” “bitrate,” “channel type,” and/or “channel.” Current values of each of the above parameters for audio file 2114 may also be provided. Charting tab 2122 may also provide for selecting a cell, a range of cells, and/or all audio files within a workbook (which may include one or more spreadsheets) for charting against the selected audio parameters. As illustrated, a selection for charting speed versus bitrate over a range of cells (i.e., D3:D5) has been made. In this way, charting one or more audio parameters of one or more audio files may be performed by a spreadsheet application. The above example is provided for purposes of explanation only and should not be understood to be limiting.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 21A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 21B illustrates a scatter plot incorporating visual representations for a plurality of audio files as data points, according to an example embodiment.
Similar to FIG. 21A, FIG. 21B shows interface 2100 of a spreadsheet application including spreadsheet 2102 and navigation ribbon 2104 (which includes cell identifier 2106 and formula bar 2108). As illustrated, insert tab 2112B is selected, which is identified as an unshaded tab. In response to selection of insert tab 2112B, insert toolbar 2110B is displayed and provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 2102.
As further illustrated by FIG. 21B, scatter plot icon 2130 has been selected. In this regard, chart 2132 has been created and inserted in spreadsheet 2102. In examples, chart 2132 may be inserted as an overlay (shown) on spreadsheet 2102 or may be launched in a separate window or interface (not shown). Chart 2132 is a scatter plot graphing speed (%) versus bitrate (kbit/s) for the range of cells D3:D5, each of which includes an audio file (see FIG. 21A). In this case, a first data point comprising a first visual representation 2138 (i.e., a speaker icon with a “D3” identifier) represents a speed of 115% and an encoding bitrate of 32 kbit/s for audio file 2114, as illustrated in FIG. 21A by charting tab 2122. Similarly, a second data point comprising a second visual representation 2140 (i.e., a speaker icon with a “D4” identifier) represents a speed of 100% and an encoding bitrate of 96 kbit/s for audio file 2124 (audio parameters not shown in FIG. 21A), and a third data point comprising a third visual representation 2142 (i.e., a speaker icon with a “D5” identifier) represents a speed of 110% and an encoding bitrate of 128 kbit/s for audio file 2126 (audio parameters not shown in FIG. 21A).
Although values for the speeds and encoding bitrates of each audio file are not represented within spreadsheet 2102, these values were charted nonetheless. That is, as illustrated by FIG. 21A, each audio file may be associated with metadata and/or a compound data type storing audio data, audio attributes and/or additional data. As further illustrated by FIG. 21A, a spreadsheet application may extract values for audio parameters from metadata and/or a compound data type for an audio file and provide such values in a formatting menu (e.g., formatting menu 2120) for viewing and/or manipulation. Furthermore, such values of audio parameters may be selected for charting via charting tab 2122. The above example is provided for purposes of explanation only and should not be understood to be limiting.
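Assembling the scatter-plot series described above can be sketched as follows. The speed and bitrate values match the example in the text; the per-cell metadata dictionary and the `series` helper are illustrative assumptions, not the disclosed implementation.

```python
# Sketch: for each cell in the selected range, read the two audio
# parameters chosen in the charting tab ("speed", "bitrate") from
# that cell's extracted metadata, producing (cell, x, y) data points.
audio_metadata = {
    "D3": {"speed": 115, "bitrate": 32},
    "D4": {"speed": 100, "bitrate": 96},
    "D5": {"speed": 110, "bitrate": 128},
}

def series(cells, x_param, y_param):
    """Build scatter-plot points for the selected cells and parameters."""
    return [(c, audio_metadata[c][x_param], audio_metadata[c][y_param])
            for c in cells]

points = series(["D3", "D4", "D5"], "speed", "bitrate")
```

Each tuple corresponds to one speaker-icon data point in chart 2132, labeled by its source cell.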
As should be appreciated, the various devices, components, etc., described with respect to FIG. 21B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 21C illustrates a scatter plot with one or more popup menus for performing transcription, according to an example embodiment.
Similar to FIGS. 21A-21B, FIG. 21C shows interface 2100 of a spreadsheet application including spreadsheet 2102 and navigation ribbon 2104 (which includes cell identifier 2106 and formula bar 2108). As illustrated, insert tab 2112B is selected, which is identified as an unshaded tab. In response to selection of insert tab 2112B, insert toolbar 2110B is displayed and provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 2102.
As further illustrated by FIG. 21C, chart 2132 has been created and inserted in spreadsheet 2102. Chart 2132 is a scatter plot graphing speed (%) versus bitrate (kbit/s) for the range of cells D3:D5, each of which includes an audio file (see FIG. 21A). As further illustrated, visual representation 2138 (e.g., a speaker icon with a “D3” identifier) corresponding to audio file 2114 has been selected by hovering (as indicated by cursor icon 2144 shown near or over visual representation 2138). In response to selecting the visual representation, a first popup menu 2146 (or other interface) may be provided with any number of options for manipulating audio file 2114. In other aspects, first popup menu 2146 may be provided in response to additional input (e.g., right click, etc.).
As illustrated, first popup menu 2146 provides options including “Cut,” “Copy” and “Paste.” Additionally, first popup menu 2146 provides transcribe option 2148, which is selected as evidenced by shading. In aspects, an audio file may be transcribed into alphanumeric or symbolic data and/or may be created from alphanumeric or symbolic data. In response to selection of transcribe option 2148, a second popup menu 2150 may be displayed. Second popup menu 2150 may provide a number of options for transcription, for example, “Speech to Text,” “Text to Speech,” “Music to Score,” or “Score to Music,” etc. As illustrated, the “Speech to Text” option 2152 has been selected (e.g., as evidenced by shading).
For example, an audio file including speech may be converted into a text transcription (e.g., “Speech to Text”), which is a textual representation of each word or sound in the audio file. Conversely, a textual document may be converted into an audio file (e.g., “Text to Speech”), e.g., spoken words may be generated that correspond to the text of the document. In further aspects, an audio file of music may be transcribed into a musical score (e.g., “Music to Score”), including musical notes, bars, frames, and/or musical notations representing the music of the audio file. Alternatively, a musical score may be converted by optical character recognition (OCR) into an audio file (e.g., “Score to Music”) encoding data for producing sound waves representative of the musical score. Alternatively, a function may calculate a translation from a musical score for piano to a musical score for saxophone (e.g., a scale and note translation may result in new values for each note in the music, where the ‘result’ is the new score).
In further aspects, e.g., for low vision users, data within a chart may be transcribed (e.g., “Text to Speech”) and the data may be “played” for the user at any time. To further improve user experience, particularly for low vision users, the chart may be customized to associate sounds with colors, numbers, trends, or any other aspect of the chart. Similarly, by transcribing alphanumeric or other data into an audio file (e.g., “Text to Speech”) and associating the audio file with the chart, a spreadsheet application becomes able to read its own data. As should be appreciated, transcription may include converting an audio file into alphanumeric or symbolic data and/or creating an audio file from alphanumeric or symbolic data according to any suitable means. Additionally, among other options, first popup menu 2146 may include an “Insert Audio Note” option 2156 that may enable a user to create (e.g., record) and associate an audio note with a chart.
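The menu of transcription options above amounts to a dispatch from option name to converter. The sketch below is a hedged illustration only: the converter functions are placeholders, and a real implementation would call an actual speech-recognition or synthesis service rather than return formatted strings.

```python
# Illustrative dispatch for the transcription options. The converter
# bodies are stand-ins; they mark where a recognition/synthesis
# service would be invoked.
def speech_to_text(audio_path):
    return f"[transcript of {audio_path}]"

def text_to_speech(text):
    return f"[audio rendering of {text!r}]"

TRANSCRIBERS = {
    "Speech to Text": speech_to_text,
    "Text to Speech": text_to_speech,
    # "Music to Score" and "Score to Music" would register similarly.
}

def transcribe(option, payload):
    """Route a popup-menu selection to the matching converter."""
    return TRANSCRIBERS[option](payload)
```

A table-driven dispatch like this keeps each menu entry independently extensible, which fits the open-ended list of options described above.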
As should be appreciated, the various devices, components, etc., described with respect to FIG. 21C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 21D illustrates a scatter plot showing an audio transcription associated with a data point, according to an example embodiment.
Similar to FIGS. 21A-21C, FIG. 21D shows interface 2100 of a spreadsheet application including spreadsheet 2102 and navigation ribbon 2104 (which includes cell identifier 2106 and formula bar 2108). As illustrated, insert tab 2112B is selected, which is identified as an unshaded tab, and insert toolbar 2110B is displayed. As further illustrated by FIG. 21D, chart 2132 has been created and inserted in spreadsheet 2102. Chart 2132 is a scatter plot graphing speed (%) versus bitrate (kbit/s) for the range of cells D3:D5, each of which includes an audio file (see FIG. 21A).
As further illustrated, visual representation 2138 (e.g., a speaker icon with a “D3” identifier) corresponding to audio file 2114 is associated with a text transcription 2154 that specifies the encoding bitrate (e.g., 32 kbit/s) and speed (e.g., 115%) for audio file 2114, which records Mr. Key's deposition. In some cases, text transcription 2154 may be created and persistently displayed in chart 2132 in response to selection of the “Speech to Text” option 2152. In other cases, text transcription 2154 may be created and associated with visual representation 2138 in response to selection of the “Speech to Text” option 2152 but may be displayed in chart 2132 upon hovering over visual representation 2138. In this regard, different inputs may cause visual representation 2138 to perform different functionality. For example, visual representation 2138 may display text transcription 2154 in response to a cursor hover and may play audio file 2114 in response to a click input.
Similarly, in response to selection of the “Insert Audio Note” option 2156, an audio note may be created (e.g., recorded) and associated with visual representation 2138 (not shown). As with text transcription 2154, the audio note may be played, or transcribed and displayed, upon selection (not shown). As should be appreciated, the above examples may be similarly applied to other objects associated with a spreadsheet (e.g., images, videos, streaming data, etc.) and should not be considered limiting.
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 21D are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 22A illustrates an interface showing a UI element for viewing and interacting with a plurality of images associated with a cell in a spreadsheet, according to an example embodiment.
As illustrated, an interface 2200 of a spreadsheet application is provided. Interface 2200 includes a spreadsheet 2202, a navigation ribbon 2204 (including a cell identifier 2206 and a formula bar 2208), and a home toolbar 2210A. Interface 2200 further includes a plurality of tabs 2212 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab 2212A is selected, which is indicated as an unshaded tab. As further illustrated, cell C5 is selected, as indicated by thickened and/or colored outlining of the cell border of cell C5. A cell identifier 2206 (e.g., “C5”) for the selected cell is displayed in navigation ribbon 2204. Additionally, formula bar 2208 displays a function calling file locators for a plurality of images, e.g., =IMAGE(“http://www.autopics.com/Smartcars/4two.png”, “http://www.autopics.com/Smartcars/4two/red.png”, etc.) associated with cell C5.
FIG. 22A illustrates a first image 2214 with scroll control 2216, which indicates that a plurality of images is associated with cell C5. In some aspects, the scroll control 2216 may enable a user to scroll backward (e.g., by activating the back arrow) or forward (e.g., by activating the forward arrow) one at a time through the plurality of images within the cell. Alternatively, the scroll control 2216 or another UI control may be activated to launch user interface 2218 for displaying and/or interacting with each of the plurality of images.
As illustrated, user interface 2218 displays each of the plurality of images associated with cell C5 in a grid configuration. In aspects, user interface 2218 may display the plurality of images in any suitable configuration, e.g., linear, carousel, etc. User interface 2218 may further provide options for performing operations on the plurality of images. For instance, a “Delete” option may be provided for removing one or more images from the array and an “Insert” option may be provided for adding one or more images to the array. Translation control 2220 enables a user to translate through the images to a position (identified by place marker 2222) for inserting a new image. User interface 2218 may further provide an “Edit” option for manipulating one or more images of the array and a “Set timer” option for cycling display of each image one at a time within cell C5. In still further aspects, a “Spill” option may be provided for spilling the array of images into separate cells.
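The array operations described above can be sketched in code. The following is a minimal, hypothetical model of a cell holding an array of image file locators with the “Insert,” “Delete,” and “Spill” operations; the class and method names are illustrative and not part of the disclosure.

```python
# Hypothetical sketch of a cell holding an array of image references.
# Names are illustrative, not from the disclosure.

class ImageArrayCell:
    def __init__(self, locators):
        self.locators = list(locators)  # file locator for each image

    def insert(self, position, locator):
        # Add a new image at the marked position (cf. place marker 2222).
        self.locators.insert(position, locator)

    def delete(self, position):
        # Remove one image from the array ("Delete" option).
        self.locators.pop(position)

    def spill(self, start_row):
        # "Spill" the array into separate (row, locator) cell assignments.
        return [(start_row + i, loc) for i, loc in enumerate(self.locators)]

cell = ImageArrayCell([
    "http://www.autopics.com/Smartcars/4two.png",
    "http://www.autopics.com/Smartcars/4two/red.png",
])
cell.insert(1, "http://www.autopics.com/Smartcars/4two/blue.png")
spilled = cell.spill(start_row=5)
```

In this sketch, spilling a three-image array starting at row 5 yields assignments for rows 5, 6, and 7, one image per cell.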
In further aspects, a “Chart” option 2224 may be provided. In response to selecting “Chart” option 2224, a popup menu 2226 may be provided. The popup menu 2226 may provide options for selecting different types of charts, such as bar charts, line graphs, map charts, pie charts, etc. As illustrated, map chart option 2228 is identified as selected (e.g., by outlining). The above examples of options for viewing and interacting with a plurality of images are not intended to be exhaustive and should not be understood to be limiting.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 22A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 22B illustrates a map chart with an incorporated image, according to an example embodiment.
Similar to FIG. 22A, FIG. 22B shows interface 2200 of a spreadsheet application including spreadsheet 2202 and navigation ribbon 2204, which includes cell identifier 2206 and formula bar 2208. As further illustrated by FIG. 22B, chart 2234 has been created and inserted in spreadsheet 2202. In examples, chart 2234 may be inserted as an overlay (shown) on spreadsheet 2202 or may be launched in a separate window or interface (not shown).
Chart 2234 is a map chart graphing GPS locations for the plurality of images associated with cell C5 (see FIG. 22A). That is, although GPS locations for each of the plurality of images were not represented within the selected cell C5, this data was nonetheless charted. In this regard, a GPS location for each image (e.g., corresponding to where the image was taken) may be represented in a compound data type associated with cell C5 (not shown). In some aspects, the compound data type may include each image within the plurality of images displayed by user interface 2218. The compound data type may further include image parameters (e.g., image data, image attributes and/or additional data) for each of the plurality of images, including a GPS location corresponding to where each image was taken. In further aspects, the spreadsheet application may extract and identify such parameters within the compound data type associated with cell C5 (e.g., via a parameter retriever 116) and may chart the parameters (e.g., via a charting component 118). For example, first image 2214 may be represented by a first GPS location 2236, second image 2230 may be represented by a second GPS location 2238, and a third image 2232 may be represented by a third GPS location 2240 on the chart 2234.
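The relationship between the compound data type, the parameter retriever, and the charting component can be sketched as follows. The field names (“images,” “gps,” “locator”) and the coordinate values are assumptions made for illustration only.

```python
# Minimal sketch of a compound data type for a cell's image array and a
# parameter retriever that pulls GPS locations for charting.
# Field names and values are illustrative assumptions.

compound_value = {
    "images": [
        {"locator": "4two.png",     "gps": (47.61, -122.33)},
        {"locator": "4two/red.png", "gps": (45.52, -122.68)},
        {"locator": "4two/blu.png", "gps": (37.77, -122.42)},
    ]
}

def retrieve_parameter(compound, name):
    # Extract one named parameter per image from the compound value,
    # mirroring the role of the parameter retriever (e.g., 116).
    return [img[name] for img in compound["images"]]

def chart_points(compound):
    # Pair each GPS location with its image locator so a map chart can
    # render a selectable data point per image (charting component 118).
    return list(zip(retrieve_parameter(compound, "gps"),
                    retrieve_parameter(compound, "locator")))

points = chart_points(compound_value)
```

Each resulting pair corresponds to one active data point on the map chart, with the locator available so that selecting the point can display the underlying image.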
Furthermore, a visual representation (e.g., active data point) at each GPS location may be selectable to view the corresponding image. For instance, in response to selection of a visual representation associated with GPS location 2240, a corresponding image (e.g., third image 2232) may be displayed as an overlay to chart 2234, within a separate window, or otherwise. As illustrated, the visual representation associated with GPS location 2240 has been selected by hovering (as indicated by cursor icon 2242 shown near or over GPS location 2240). In further aspects, one or more parameters associated with third image 2232 may be provided with the third image 2232 (e.g., GPS coordinates for GPS location 2240) (not shown).
As should be appreciated, the various devices, components, etc., described with respect to FIG. 22B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 22C illustrates a map chart with one or more popup menus for performing transcription, according to an example embodiment.
Similar to FIGS. 22A-22B, FIG. 22C shows interface 2200 of a spreadsheet application including spreadsheet 2202 and navigation ribbon 2204 (which includes cell identifier 2206 and formula bar 2208). As further illustrated by FIG. 22C, chart 2234 has been created and inserted in spreadsheet 2202. Chart 2234 is a map chart graphing GPS locations for the plurality of images associated with cell C5 (see FIG. 22A).
As further illustrated, a visual representation (e.g., an active data point) associated with second GPS location 2238 corresponding to second image 2230 has been selected by hovering (as indicated by cursor icon 2244 shown near or over second GPS location 2238). In response to selecting the visual representation, a first popup menu 2246 (or other interface) may be provided with any number of options for manipulating second image 2230. In other aspects, first popup menu 2246 may be provided in response to additional input (e.g., right click, etc.).
For example, first popup menu 2246 provides options including “Cut,” “Copy” and “Paste.” Additionally, first popup menu 2246 provides transcribe option 2248, which is selected as evidenced by shading. In aspects, a chart may be transcribed into speech, e.g., for low-vision users. In response to selection of transcribe option 2248, a second popup menu 2250 may be displayed. Second popup menu 2250 may provide a number of options for transcription, for example, “Speech to Text,” “Text to Speech,” “Music to Score,” or “Score to Music,” etc. As illustrated, the “Text to Speech” option 2252 has been selected (e.g., evidenced by shading).
In this regard, a textual document such as a chart may be converted into an audio file (e.g., “Text to Speech”), e.g., spoken words may be generated that correspond to the alphanumeric text and data of the chart and may be “played” for the user at any time. With respect to the illustrated example, information regarding the second GPS location 2238 and/or the second image 2230 may be transcribed into speech and stored as an audio file. In this example, a visual indicator of the audio file may be associated with the chart at or near the second GPS location 2238. Alternatively, in response to selection of the “Text to Speech” option 2252, all or a substantial portion of the alphanumeric text and data within chart 2234 may be transcribed into speech. That is, information regarding each GPS location and/or each corresponding image, cities or states associated with the GPS locations, etc., may be transcribed into speech and stored as an audio file. To further improve user experience, the chart may be customized to associate sounds with colors, numbers, trends, or any other aspect of the chart. In this regard, a visual indicator of the audio file may be associated with the chart in any suitable location and/or the audio file may automatically play in response to opening chart 2234.
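The “Text to Speech” path can be sketched as a first step that flattens a chart's alphanumeric data into a narration string, which a text-to-speech engine could then render to an audio file. The narration wording and the tuple layout below are illustrative assumptions, not the disclosed implementation.

```python
# Hedged sketch of the "Text to Speech" transcription step: flatten a
# chart's data points into a narration string for a TTS engine.
# Wording and data layout are illustrative assumptions.

def narrate_chart(title, points):
    # points: (label, latitude, longitude) tuples, one per data point.
    lines = [f"Chart: {title}."]
    for label, lat, lon in points:
        lines.append(f"{label} at latitude {lat}, longitude {lon}.")
    return " ".join(lines)

speech_text = narrate_chart(
    "Image locations",
    [("First image", 47.61, -122.33), ("Second image", 45.52, -122.68)],
)
```

The resulting string would be handed to a speech synthesizer; storing the synthesized audio and associating its visual indicator with the chart would follow as described above.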
As should be appreciated, the various devices, components, etc., described with respect to FIG. 22C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 22D illustrates a map chart including an audio transcription of the map chart, according to an example embodiment.
Similar to FIGS. 22A-22C, FIG. 22D shows interface 2200 of a spreadsheet application including spreadsheet 2202 and navigation ribbon 2204 (which includes cell identifier 2206 and formula bar 2208). As further illustrated by FIG. 22D, chart 2234 has been created and inserted in spreadsheet 2202. Chart 2234 is a map chart graphing GPS locations for the plurality of images associated with cell C5 (see FIG. 22A).
As further illustrated, a visual representation (e.g., a speaker icon) corresponding to an audio file 2254 has been created and inserted in chart 2234. In aspects, audio file 2254 is a transcription of the chart 2234 that was created by generating spoken words corresponding to alphanumeric text and data of chart 2234. As described above, particularly for low-vision users, a chart may be transcribed into speech so that the spreadsheet application can, in effect, read its own data to a reader, either automatically or upon selection of the visual representation. For instance, audio file 2254 may be set to play automatically for a low-vision user, either when the chart is created or opened. Alternatively, as illustrated, the visual representation of audio file 2254 may be selected by hovering (e.g., identified by cursor icon 2256 shown near or over the visual representation). In other aspects, the visual representation may be selected by right click, keyboard input, and the like. As should be appreciated, the above examples may be similarly applied to other objects associated with a spreadsheet (e.g., audio files, videos, streaming data, etc.) and should not be considered limiting.
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 22D are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 23A illustrates a popup interface for selecting a charting function for images in a spreadsheet, according to an example embodiment.
As illustrated, an interface 2300 of a spreadsheet application is provided. Interface 2300 includes a spreadsheet 2302, a navigation ribbon 2304 (including a cell identifier 2306 and a formula bar 2308), and an insert toolbar 2310. Interface 2300 further includes a plurality of tabs 2312 for accessing various aspects and operations of the spreadsheet application. As illustrated, an insert tab 2312A is selected, which is indicated as an unshaded tab. In this case, a range of cells 2322 (e.g., D3:D5) is identified as selected (e.g., by outlining) and the cell at the top of the range (i.e., cell D3) is identified by cell identifier 2306 (e.g., “D3”) in the navigation ribbon 2304. The range of cells 2322 may be selected by highlighting the range of cells, touching (or swiping) the range of cells, entering “=D3:D5” into the formula bar, etc. In some cases, in response to a selection of a range of cells, formula bar 2308 may be blank (shown).
FIG. 23A illustrates an image 2318 with scroll control 2320, which indicates that a plurality of images is associated with cell D5. In some aspects, the scroll control 2320 may enable a user to scroll backward (e.g., by activating the back arrow) or forward (e.g., by activating the forward arrow) one at a time through the plurality of images within the cell. Alternatively, the scroll control 2320 or another UI control may be activated to launch a user interface for displaying and/or interacting with each of the plurality of images.
In aspects, in response to selecting the range of cells 2322, a first popup menu 2324 (or other interface) may be provided with any number of options for manipulating data or objects associated with the selected range of cells 2322. As illustrated, first popup menu 2324 provides options including “Cut,” “Copy” and “Paste,” “Float an image on grid,” “Insert” and “Delete,” “Filter” and “Sort,” “Insert Comment,” “Format Cells,” “Define Name” and “Hyperlink,” as described above with respect to FIG. 19B.
Additionally, a “Chart” option 2326 may be provided for selecting a charting function for application to the selected range of cells 2322. In response to selecting “Chart” option 2326, a second popup menu 2328 may be provided. The second popup menu 2328 may provide options for selecting different types of charts, such as bar charts, line graphs, map charts, pie charts, etc. As illustrated, bar chart option 2330 is identified as selected (e.g., by outlining).
As should be appreciated, the various devices, components, etc., described with respect to FIG. 23A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 23B illustrates a bar chart incorporating a plurality of images in a single bar, according to an example embodiment.
Similar to FIG. 23A, FIG. 23B shows interface 2300 of a spreadsheet application including spreadsheet 2302 and navigation ribbon 2304, which includes cell identifier 2306 and formula bar 2308. As further illustrated by FIG. 23B, chart 2332 has been created and inserted in spreadsheet 2302. In examples, chart 2332 may be inserted as an overlay (shown) on spreadsheet 2302 or may be launched in a separate window or interface (not shown).
Chart 2332 is a bar chart graphing a number of available used automobiles for different makes of automobiles associated with the selected range of cells 2322 (see FIG. 23A). Furthermore, an image of each available used automobile is incorporated into a bar corresponding to the make of automobile depicted by the image. As described above with respect to FIG. 23A, scroll control 2320 is displayed over image 2318, which indicates that a plurality of images is associated with cell D5. As illustrated, cell D5 is included in row 5 of the spreadsheet, which row includes data relating to “Smart Cars.” In aspects, the plurality of images associated with cell D5 are images depicting different Smart Cars. As further illustrated, cell D3 includes a single image (e.g., image 2314) within row 3, which row includes data relating to “BMWs,” and cell D4 includes a single image (e.g., image 2316) within row 4, which row includes data relating to “Mini Coopers.”
Based on the above illustrated example, in response to charting the selected range of cells (i.e., D3:D5), image 2314 is incorporated into bar 2342 corresponding to BMWs, image 2316 is incorporated into bar 2346 corresponding to Mini Coopers, and the plurality of images associated with cell D5 are provided within a single bar (e.g., bar 2344) corresponding to Smart Cars. In particular, a visual representation for each image is provided within the corresponding bar. That is, in response to selection of visual representation 2338, image 2314 may be displayed; and in response to selection of visual representation 2340, image 2316 may be displayed. As further illustrated, bar 2344 includes a plurality of visual representations corresponding to the plurality of images associated with cell D5. For instance, in response to selection of visual representation 2334 (e.g., as indicated by cursor icon 2336 provided over or near visual representation 2334), image 2318 is displayed within chart 2332 as an overlay. Similarly, in response to selection of any of the plurality of visual representations incorporated into bar 2344, a corresponding image of the plurality of images may be displayed. As should be appreciated, the above examples may be similarly applied to other objects associated with a spreadsheet that may be provided in arrays of objects (e.g., audio files, videos, etc.) and should not be considered limiting.
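The grouping described above can be sketched as building one bar per row, each carrying the row's numeric value together with the image or array of images associated with that row's cell. The counts and file names below are illustrative assumptions.

```python
# Sketch of grouping each row's image(s) under the bar for that row's
# category, so a single bar (cf. bar 2344) can carry several selectable
# visual representations. Counts and file names are illustrative.

rows = [
    ("BMWs",         12, ["bmw.png"]),
    ("Mini Coopers",  8, ["mini.png"]),
    ("Smart Cars",   15, ["4two.png", "4two_red.png", "4two_blu.png"]),
]

def build_bars(rows):
    # One bar per make: bar height plus the images embedded in the bar.
    return {make: {"count": count, "images": images}
            for make, count, images in rows}

bars = build_bars(rows)
```

Rendering would then draw each bar at its count and place one selectable visual representation per embedded image, so selecting a representation displays the corresponding image.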
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 23B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 24A illustrates an interface showing videos and additional data associated with one or more cells of a spreadsheet, according to an example embodiment.
As illustrated, an interface 2400 of a spreadsheet application is provided. Interface 2400 includes a spreadsheet 2402, a navigation ribbon 2404 (including a cell identifier 2406 and a formula bar 2408), and a home toolbar 2410A. Interface 2400 further includes a plurality of tabs 2412 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab 2412A is selected, which is indicated as an unshaded tab. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. As shown, a cell identifier 2406 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 2404. Additionally, formula bar 2408 displays a function calling a file locator, e.g., =GETVIDEO(“C:\Pictures\Flight&Fight\20120612.mp4”) for video 2414 displayed within cell D3.
In some aspects, in response to selecting cell D3 and/or selecting video 2414 (e.g., by right click, cursor hover, keyboard input, etc.), a card view of video 2414 may be displayed. For instance, in response to a selection, card 2416 is displayed as an overlay on spreadsheet 2402. As illustrated, card 2416 displays a visual representation 2418 of video 2414, along with additional data 2422, in an organized and formatted layout. For instance, card 2416 includes a full title (e.g., “Flight & Fight”) in a header portion. Visual representation 2418 of video 2414 includes a play control 2424 and an interactive play bar 2420, which provides controls for “fast rewind” (or “skip back”), “rewind,” “pause,” “play,” and “fast forward” (or “skip forward”). Further, additional data 2422 includes formatted data descriptors (e.g., bolded) for each piece of information. In this regard, card 2416 provides a user-friendly interface (e.g., organized and stylized) for viewing additional data associated with video 2414, e.g., via a compound data type.
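The card layout can be sketched as rendering a compound data type that carries the video plus its additional data. The field names and values below (e.g., "duration_hours", "format") are assumed for illustration; the disclosure does not prescribe a particular schema.

```python
# Sketch of a compound data type carrying a video plus additional data,
# rendered as the card layout described above. Field names are assumed.

video_value = {
    "title": "Flight & Fight",
    "locator": r"C:\Pictures\Flight&Fight\20120612.mp4",
    "duration_hours": 1.5,
    "format": "mp4",
}

def render_card(value):
    # Header (full title) plus formatted "Descriptor: value" lines,
    # loosely mirroring the layout of card 2416.
    lines = [value["title"]]
    for key in ("duration_hours", "format"):
        lines.append(f"{key.replace('_', ' ').title()}: {value[key]}")
    return "\n".join(lines)

card_text = render_card(video_value)
```

A real card would also embed the playable visual representation with its play bar; the sketch covers only the organized, formatted additional data.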
As should be appreciated, the various devices, components, etc., described with respect to FIG. 24A are not intended to limit the systems and methods to the particular components described. Accordingly, additional configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 24B illustrates a bar chart incorporating a video, according to an example embodiment.
Similar to FIG. 24A, FIG. 24B shows interface 2400 of a spreadsheet application including spreadsheet 2402 and navigation ribbon 2404 (which includes cell identifier 2406 and formula bar 2408). As illustrated, insert tab 2412B is selected, which is identified as an unshaded tab. In response to selection of insert tab 2412B, insert toolbar 2410B is displayed and provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 2402.
As further illustrated by FIG. 24B, bar chart icon 2426 has been selected (in particular, a column bar chart). In this regard, chart 2428 has been created and inserted in spreadsheet 2402. In examples, chart 2428 may be inserted as an overlay (shown) on spreadsheet 2402 or may be launched in a separate window or interface (not shown). Chart 2428 is a bar chart graphing duration in hours for each of three documentary videos. In this case, a first bar 2430 represents the duration for a first documentary video (e.g., first video 2414) entitled “Flight & Fight,” a second bar 2432 represents the duration for a second documentary video (e.g., second video 2436) entitled “Fast Cars,” and a third bar 2434 represents the duration for a third documentary video (e.g., third video 2438) entitled “Run Free.”
As illustrated, a first visual representation 2440 corresponding to first video 2414 is incorporated into the first bar 2430, a second visual representation 2442 corresponding to second video 2436 is incorporated into the second bar 2432 and a third visual representation 2444 corresponding to third video 2438 is incorporated into the third bar 2434. In aspects, in response to selection of a visual representation, card 2416 may be displayed (not shown) or a miniaturized video card 2448 with play bar 2446 may be displayed (shown). As should be appreciated, any suitable interface for accessing a video may be provided in response to selection of a visual representation of the video within a chart. However, in some aspects, a miniaturized video card 2448 may not obscure chart data while still allowing access to the video. Moreover, the above examples may be similarly applied to other objects associated with a spreadsheet (e.g., images, audio files, streaming data, etc.) and should not be considered limiting.
As should be appreciated, the various devices, components, etc., described with respect to FIG. 24B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 25A illustrates an interface for associating streaming data within a cell, according to a first example embodiment.
As illustrated, an interface 2500 of a spreadsheet application is provided. Interface 2500 includes a spreadsheet 2502 and a navigation ribbon 2504 (including a cell identifier 2506 and a formula bar 2508). Interface 2500 further includes a plurality of tabs 2512 for accessing various aspects and operations of the spreadsheet application. As illustrated, a home tab 2512A is selected, which is indicated as an unshaded tab, and a home toolbar 2510A is displayed. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. As shown, a cell identifier 2506 (e.g., “D3”) for the selected cell is displayed in navigation ribbon 2504. Additionally, formula bar 2508 displays a function calling streaming data to cell D3, e.g., =GETFEED(“E:/body/log/heartrate/date/20131105.json”).
In aspects, an interface 2514 may be provided in a toolbar (e.g., home toolbar 2510A) of the spreadsheet application and may enable association of streaming data with a selected cell 2524 (i.e., cell D3) or a range of cells. Interface 2514 may include a number of access points for associating different types of streaming data. For example, an iFrame interface 2526, a Bluetooth® interface 2516, etc., may be available from interface 2514. In some aspects, these interfaces may be provided in a mobile spreadsheet application and may enable association of streaming data with a selected cell or cells using a mobile device (not shown).
Bluetooth® interface 2516 may further display discovered devices in a window 2518. In the illustrated example, devices “Tom's Fitbit®” and “Karen's iPhone®” have been discovered and a selection of “Tom's Fitbit®” has been received, as indicated by shading. In aspects, in response to selecting a discovered device and activating “Pair” control 2522, the discovered device may be paired to the spreadsheet application. In further aspects, depending on the types of data that are available for streaming from the device, a get request for a specific type of data may be made. For instance, in the example of a wearable fitness device, the device may monitor and store heartrate data, blood pressure data, pedometer data, and the like. In this case, the spreadsheet may retrieve data directly from the device or may call an application program interface (API) associated with the device. In the illustrated example, “Tom's Fitbit®” is paired to the spreadsheet (e.g., via Bluetooth®) and streaming data for monitored heartrate values may be retrieved directly from the paired device using a get request such as: =GETFEED(“E:/body/log/heartrate/date/20131105.json”). In particular, the get request specifies a date (e.g., Nov. 5, 2013) and a type of data (e.g., heartrate) for retrieval from the paired device. In at least some aspects, monitored data may be streamed from the device to the spreadsheet in near real time, e.g., as the monitored data is measured, stored and streamed by the device.
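The way the get request encodes the data type and date can be sketched as parsing the feed locator. The path layout follows the example above; the parsing logic itself is an assumption for illustration, since the disclosure does not specify how the locator is decomposed.

```python
# Illustrative sketch of parsing a GETFEED locator like
# "E:/body/log/heartrate/date/20131105.json" into the data type and
# date used for the get request. The parsing rules are an assumption
# based on the example path layout.

def parse_feed_locator(locator):
    parts = locator.replace("\\", "/").split("/")
    data_type = parts[-3]              # e.g., "heartrate"
    date = parts[-1].split(".")[0]     # e.g., "20131105"
    return {"type": data_type, "date": date}

request = parse_feed_locator("E:/body/log/heartrate/date/20131105.json")
```

With the request decomposed this way, the spreadsheet (or a device API) can select the matching stream, e.g., heartrate samples recorded on Nov. 5, 2013.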
As should be appreciated, the various devices, components, etc., described with respect to FIG. 25A are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 25B illustrates an interface for associating streaming data within a cell, according to a second example embodiment.
Similar to FIG. 25A, FIG. 25B shows interface 2500 of a spreadsheet application. Interface 2500 includes a spreadsheet 2502 and a navigation ribbon 2504 (including a cell identifier 2506 and a formula bar 2508). As illustrated, a home tab 2512A is selected, which is indicated as an unshaded tab, and a home toolbar 2510A is displayed. As further illustrated, cell D3 is selected, as indicated by thickened and/or colored outlining of the cell border of cell D3. In this case, formula bar 2508 displays a different function calling streaming data to cell D3, e.g., =GETFEED(“https://api.fitbit.com/1/user/34288/body/log/heartrate/date/20131105.json”).
As described above, interface 2514 may include an iFrame interface 2526, a Bluetooth® interface 2516, etc., for associating streaming data with a spreadsheet. In some aspects, these interfaces may be provided in a mobile spreadsheet application and may enable association of streaming data with a selected cell or cells using a mobile device (not shown). In the illustrated example, iFrame interface 2526 is selected, as indicated by shading, and provides an input field 2530 for referencing a URL for the streaming data. As illustrated, URL 2528 calls an API associated with a fitness device and has been entered into input field 2530, e.g., =GETFEED(“https://api.fitbit.com/1/user/34288/body/log/heartrate/date/20131105.json”). In particular, URL 2528 specifies a user identifier (e.g., 34288), a date (e.g., Nov. 5, 2013) and a type of data (e.g., heartrate) for retrieval. As further illustrated, iFrame interface 2526 includes an “Insert” control 2532, for associating the streaming data retrieved by the URL 2528 into the selected cell 2524 (e.g., cell D3). In at least some aspects, monitored data may be streamed from the device to the spreadsheet in near real time, e.g., as the monitored data is measured, stored and streamed by the device. As should be appreciated, additional examples for associating streaming data with a cell are possible and the above examples are offered for purposes of explanation and should not be understood as limiting.
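Conversely, the API URL above can be assembled from its three parameters. Treating the example URL as a fill-in template is an assumption made for illustration; the actual API contract is defined by the device vendor, not the disclosure.

```python
# Sketch of assembling the kind of API URL shown above from a user
# identifier, data type, and date. Treating the example URL as a
# template is an illustrative assumption.

def build_feed_url(user_id, data_type, date):
    return (f"https://api.fitbit.com/1/user/{user_id}"
            f"/body/log/{data_type}/date/{date}.json")

url = build_feed_url(34288, "heartrate", "20131105")
```

The assembled string matches the URL 2528 entered into input field 2530, so the same builder could back any of the user/date/type combinations the interface accepts.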
As should be further appreciated, the various devices, components, etc., described with respect to FIG. 25B are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIG. 25C illustrates a line graph of heartrate values streamed from a device, according to an example embodiment.
Similar to FIGS. 25A-25B, FIG. 25C shows interface 2500 of a spreadsheet application including spreadsheet 2502 and navigation ribbon 2504 (which includes cell identifier 2506 and formula bar 2508). As illustrated, insert tab 2512B is selected, which is identified as an unshaded tab. In response to selection of insert tab 2512B, insert toolbar 2510B is displayed and provides for selecting various charts, including bar charts, line graphs, pie charts, scatter plots, area graphs, etc., for insertion into spreadsheet 2502.
As further illustrated by FIG. 25C, line graph icon 2534 has been selected, as indicated by shading. In this regard, chart 2536 has been created and inserted in spreadsheet 2502. Chart 2536 is a line graph charting heartrate in beats per minute taken at various times on Nov. 5, 2013. In this case, each data point (e.g., data point 2538) represents a heartrate measurement taken at a particular time on Nov. 5, 2013. As should be appreciated, heartrate measurements may be streamed and charted in near real time by associating streaming data with a cell of a spreadsheet. In further aspects, the streamed data may be charted automatically in response to associating the streaming data with the spreadsheet. The above example is provided for purposes of explanation only and should not be understood to be limiting.
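One way to model the near-real-time charting is a bounded buffer: each arriving sample is appended, and the line graph re-reads a snapshot of the buffer on each redraw. The class name, the window size, and the sample values are illustrative assumptions.

```python
# Minimal sketch of near-real-time charting: new heartrate samples are
# appended to a bounded buffer and the line graph re-reads the buffer
# on each update. Names, window size, and values are illustrative.
from collections import deque

class StreamingSeries:
    def __init__(self, window=100):
        self.points = deque(maxlen=window)   # (time, beats per minute)

    def on_sample(self, timestamp, bpm):
        # Called as each measurement arrives from the paired device.
        self.points.append((timestamp, bpm))

    def chart_data(self):
        # Snapshot handed to the charting component for redraw.
        return list(self.points)

series = StreamingSeries(window=3)
for t, bpm in [("09:00", 62), ("09:05", 64), ("09:10", 71), ("09:15", 68)]:
    series.on_sample(t, bpm)
data = series.chart_data()
```

The bounded window keeps redraws cheap for long-running streams; with `window=3`, the oldest sample is evicted once a fourth arrives, so the chart always shows the most recent measurements.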
As should be appreciated, the various devices, components, etc., described with respect to FIG. 25C are not intended to limit the systems and methods to the particular components described. Accordingly, additional topology configurations may be used to practice the methods and systems herein and/or some components described may be excluded without departing from the methods and systems disclosed herein.
FIGS. 26-29 and the associated descriptions provide a discussion of a variety of operating environments in which aspects of the disclosure may be practiced. However, the devices and systems illustrated and discussed with respect to FIGS. 26-29 are for purposes of example and illustration and are not limiting of a vast number of computing device configurations that may be utilized for practicing aspects of the disclosure, as described herein.
FIG. 26 is a block diagram illustrating physical components (e.g., hardware) of a computing device 2600 with which aspects of the disclosure may be practiced.
The computing device components described below may have computer executable instructions for implementing a spreadsheet application 2620 on a computing device (e.g., server computing device 108 and/or client computing device 104), including computer executable instructions for spreadsheet application 2620 that can be executed to implement the methods disclosed herein. In a basic configuration, the computing device 2600 may include at least one processing unit 2602 and a system memory 2604. Depending on the configuration and type of computing device, the system memory 2604 may comprise, but is not limited to, volatile storage (e.g., random access memory), non-volatile storage (e.g., read-only memory), flash memory, or any combination of such memories. The system memory 2604 may include an operating system 2605 and one or more program modules 2606 suitable for running spreadsheet application 2620, such as one or more components with regard to FIG. 1 and, in particular, selection component 2611 (e.g., corresponding to selection component 112), object identifier 2613 (e.g., including object identifier 114), parameter retriever 2615 (e.g., corresponding to parameter retriever 116), and/or UX component 2617 (e.g., including charting component 118 and UX component 120).
The operating system 2605, for example, may be suitable for controlling the operation of the computing device 2600. Furthermore, embodiments of the disclosure may be practiced in conjunction with a graphics library, other operating systems, or any other application program and is not limited to any particular application or system. This basic configuration is illustrated in FIG. 26 by those components within a dashed line 2608. The computing device 2600 may have additional features or functionality. For example, the computing device 2600 may also include additional data storage devices (removable and/or non-removable) such as, for example, magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 26 by a removable storage device 2609 and a non-removable storage device 2610.
As stated above, a number of program modules and data files may be stored in the system memory 2604. While executing on the processing unit 2602, the program modules 2606 (e.g., spreadsheet application 2620) may perform processes including, but not limited to, the aspects described herein. Other program modules that may be used in accordance with aspects of the present disclosure, and in particular for charting objects (e.g., images, audio files, videos, streaming data, etc.) associated with a spreadsheet, may include selection component 2611, object identifier 2613, parameter retriever 2615, and/or UX component 2617, etc.
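The division of labor among these modules can be sketched as follows. This is a hypothetical illustration, not code from the disclosure: every class, function, and field name below (Cell, StreamingObject, select_cells, etc.) is invented for the sketch. It simply mirrors the pipeline named above — a selection component resolving a selected range, an object identifier finding cells with associated objects, a parameter retriever pulling a named parameter (e.g., packet rate) from object metadata, and a charting component packaging the result for rendering.

```python
# Hypothetical sketch of the charting pipeline; all names are illustrative,
# not taken from the disclosure.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class StreamingObject:
    """An object associated with a cell, carrying metadata parameters."""
    name: str
    metadata: dict = field(default_factory=dict)


@dataclass
class Cell:
    ref: str                                 # e.g., "A1"
    value: object = None                     # ordinary cell data, if any
    obj: Optional[StreamingObject] = None    # associated object, if any


def select_cells(sheet, refs):
    """Selection component: resolve a selected range to its cells."""
    return [sheet[r] for r in refs if r in sheet]


def identify_objects(cells):
    """Object identifier: keep only cells whose data is an associated object."""
    return [c.obj for c in cells if c.obj is not None]


def retrieve_parameters(objects, parameter):
    """Parameter retriever: pull a named parameter from each object's metadata."""
    return {o.name: o.metadata.get(parameter) for o in objects}


def chart(series, parameter):
    """Charting component: here, just package the series for rendering."""
    return {"title": parameter, "series": series}


# Usage: chart the packet rate of two streams associated with cells A1 and A2.
sheet = {
    "A1": Cell("A1", obj=StreamingObject("cam-1", {"packet_rate": 30})),
    "A2": Cell("A2", obj=StreamingObject("cam-2", {"packet_rate": 24})),
    "A3": Cell("A3", value=42),  # ordinary data, skipped by the identifier
}
cells = select_cells(sheet, ["A1", "A2", "A3"])
objs = identify_objects(cells)
result = chart(retrieve_parameters(objs, "packet_rate"), "packet_rate")
```

In this sketch the charting component only assembles a title and data series; an actual implementation would hand that structure to a rendering surface (the UX component above).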
Furthermore, embodiments of the disclosure may be practiced in an electrical circuit comprising discrete electronic elements, packaged or integrated electronic chips containing logic gates, a circuit utilizing a microprocessor, or on a single chip containing electronic elements or microprocessors. For example, embodiments of the disclosure may be practiced via a system-on-a-chip (SOC) where each or many of the components illustrated in FIG. 26 may be integrated onto a single integrated circuit. Such an SOC device may include one or more processing units, graphics units, communications units, system virtualization units, and various application functionality, all of which are integrated (or “burned”) onto the chip substrate as a single integrated circuit. When operating via an SOC, the functionality described herein with respect to the capability of the client to switch protocols may be operated via application-specific logic integrated with other components of the computing device 2600 on the single integrated circuit (chip). Embodiments of the disclosure may also be practiced using other technologies capable of performing logical operations such as, for example, AND, OR, and NOT, including but not limited to mechanical, optical, fluidic, and quantum technologies. In addition, embodiments of the disclosure may be practiced within a general purpose computer or in any other circuits or systems.
The computing device 2600 may also have one or more input device(s) 2612, such as a keyboard, a mouse, a pen, a sound or voice input device, a touch or swipe input device, etc. The output device(s) 2614, such as a display, speakers, a printer, etc., may also be included. The aforementioned devices are examples and others may be used. The computing device 2600 may include one or more communication connections 2616 allowing communications with other computing devices 2650. Examples of suitable communication connections 2616 include, but are not limited to, radio frequency (RF) transmitter, receiver, and/or transceiver circuitry; universal serial bus (USB), parallel, and/or serial ports.
The term computer readable media as used herein may include computer storage media. Computer storage media may include non-transitory, volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information, such as computer readable instructions, data structures, or program modules. The system memory 2604, the removable storage device 2609, and the non-removable storage device 2610 are all examples of computer storage media (e.g., memory storage). Computer storage media may include RAM, ROM, electrically erasable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other article of manufacture which can be used to store information and which can be accessed by the computing device 2600. Any such computer storage media may be part of the computing device 2600. Computer storage media does not include a carrier wave or other propagated or modulated data signal.
Communication media may be embodied by computer readable instructions, data structures, program modules, or other data in a modulated data signal, such as a carrier wave or other transport mechanism, and includes any information delivery media. The term “modulated data signal” may describe a signal that has one or more characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media may include wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, radio frequency (RF), infrared, and other wireless media.
FIGS. 27A and 27B are simplified block diagrams of a mobile computing device with which aspects of the present disclosure may be practiced.
FIGS. 27A and 27B illustrate a mobile computing device 2700, for example, a mobile telephone, a smart phone, a wearable computer (such as a smart watch), a tablet computer, a laptop computer, and the like, with which embodiments of the disclosure may be practiced. In some aspects, the client may be a mobile computing device. With reference to FIG. 27A, one aspect of a mobile computing device 2700 for implementing the aspects is illustrated. In a basic configuration, the mobile computing device 2700 is a handheld computer having both input elements and output elements. The mobile computing device 2700 typically includes a display 2705 and one or more input buttons 2710 that allow the user to enter information into the mobile computing device 2700. The display 2705 of the mobile computing device 2700 may also function as an input device (e.g., a touch screen display). If included, an optional side input element 2715 allows further user input. The side input element 2715 may be a rotary switch, a button, or any other type of manual input element. In alternative aspects, the mobile computing device 2700 may incorporate more or fewer input elements. For example, the display 2705 may not be a touch screen in some embodiments. In yet another alternative embodiment, the mobile computing device 2700 is a portable phone system, such as a cellular phone. The mobile computing device 2700 may also include an optional keypad 2735. Optional keypad 2735 may be a physical keypad or a “soft” keypad generated on the touch screen display. In various embodiments, the output elements include the display 2705 for showing a graphical user interface (GUI), a visual indicator 2720 (e.g., a light emitting diode), and/or an audio transducer 2725 (e.g., a speaker). In some aspects, the mobile computing device 2700 incorporates a vibration transducer for providing the user with tactile feedback.
In yet another aspect, the mobile computing device 2700 incorporates input and/or output ports, such as an audio input (e.g., a microphone jack), an audio output (e.g., a headphone jack), and a video output (e.g., an HDMI port) for sending signals to or receiving signals from an external device.
FIG. 27B is a block diagram illustrating the architecture of one aspect of a mobile computing device. That is, the mobile computing device 2700 can incorporate a system (e.g., an architecture) 2702 to implement some aspects. In one embodiment, the system 2702 is implemented as a “smart phone” capable of running one or more applications (e.g., browser, e-mail, calendaring, contact managers, messaging clients, games, and media clients/players). In some aspects, the system 2702 is integrated as a computing device, such as an integrated personal digital assistant (PDA) and wireless phone.
One or more application programs 2766 may be loaded into the memory 2762 and run on or in association with the operating system 2764. Examples of the application programs include phone dialer programs, e-mail programs, personal information management (PIM) programs, word processing programs, spreadsheet programs, Internet browser programs, messaging programs, and so forth. The system 2702 also includes a non-volatile storage area 2768 within the memory 2762. The non-volatile storage area 2768 may be used to store persistent information that should not be lost if the system 2702 is powered down. The application programs 2766 may use and store information in the non-volatile storage area 2768, such as email or other messages used by an email application, and the like. A synchronization application (not shown) also resides on the system 2702 and is programmed to interact with a corresponding synchronization application resident on a host computer to keep the information stored in the non-volatile storage area 2768 synchronized with corresponding information stored at the host computer. As should be appreciated, other applications may be loaded into the memory 2762 and run on the mobile computing device 2700, including the instructions for charting objects associated with a spreadsheet as described herein (e.g., selection component, object identifier, parameter retriever, charting component, and/or UX component, etc.).
The system 2702 has a power supply 2770, which may be implemented as one or more batteries. The power supply 2770 may further include an external power source, such as an AC adapter or a powered docking cradle that supplements or recharges the batteries. The system 2702 may also include a radio interface layer 2772 that performs the function of transmitting and receiving radio frequency communications. The radio interface layer 2772 facilitates wireless connectivity between the system 2702 and the “outside world,” via a communications carrier or service provider. Transmissions to and from the radio interface layer 2772 are conducted under control of the operating system 2764. In other words, communications received by the radio interface layer 2772 may be disseminated to the application programs 2766 via the operating system 2764, and vice versa.
The visual indicator 2720 may be used to provide visual notifications, and/or an audio interface 2774 may be used for producing audible notifications via an audio transducer 2725 (e.g., audio transducer 2725 illustrated in FIG. 27A). In the illustrated embodiment, the visual indicator 2720 is a light emitting diode (LED) and the audio transducer 2725 may be a speaker. These devices may be directly coupled to the power supply 2770 so that when activated, they remain on for a duration dictated by the notification mechanism even though the processor 2760 and other components might shut down for conserving battery power. The LED may be programmed to remain on indefinitely until the user takes action to indicate the powered-on status of the device. The audio interface 2774 is used to provide audible signals to and receive audible signals from the user. For example, in addition to being coupled to the audio transducer 2725, the audio interface 2774 may also be coupled to a microphone to receive audible input, such as to facilitate a telephone conversation. In accordance with embodiments of the present disclosure, the microphone may also serve as an audio sensor to facilitate control of notifications, as will be described below. The system 2702 may further include a video interface 2776 that enables an operation of peripheral device 2730 (e.g., an on-board camera) to record still images, video streams, and the like.
A mobile computing device 2700 implementing the system 2702 may have additional features or functionality. For example, the mobile computing device 2700 may also include additional data storage devices (removable and/or non-removable) such as magnetic disks, optical disks, or tape. Such additional storage is illustrated in FIG. 27B by the non-volatile storage area 2768.
Data/information generated or captured by the mobile computing device 2700 and stored via the system 2702 may be stored locally on the mobile computing device 2700, as described above, or the data may be stored on any number of storage media that may be accessed by the device via the radio interface layer 2772 or via a wired connection between the mobile computing device 2700 and a separate computing device associated with the mobile computing device 2700, for example, a server computer in a distributed computing network, such as the Internet. As should be appreciated, such data/information may be accessed via the mobile computing device 2700 via the radio interface layer 2772 or via a distributed computing network. Similarly, such data/information may be readily transferred between computing devices for storage and use according to well-known data/information transfer and storage means, including electronic mail and collaborative data/information sharing systems.
As should be appreciated, FIGS. 27A and 27B are described for purposes of illustrating the present methods and systems and are not intended to limit the disclosure to a particular sequence of steps or a particular combination of hardware or software components.
FIG. 28 is a simplified block diagram of a distributed computing system in which aspects of the present disclosure may be practiced.
FIG. 28 illustrates one aspect of the architecture of a system for processing data received at a computing system from a remote source, such as a general computing device 2804 (e.g., a personal computer), a tablet computing device 2806, or a mobile computing device 2808, as described above. Content displayed at server device 2802 may be stored in different communication channels or other storage types. For example, various documents may be stored using a directory service 2822, a web portal 2824, a mailbox service 2826, an instant messaging store 2828, or a social networking service 2830. The spreadsheet application 2821 may be employed by a client that communicates with server device 2802, and/or the spreadsheet application 2820 may be employed by server device 2802. The server device 2802 may provide data to and from a client computing device such as a general computing device 2804, a tablet computing device 2806, and/or a mobile computing device 2808 (e.g., a smart phone) through a network 2815. By way of example, the computer system described above with respect to FIGS. 1-25 may be embodied in a general computing device 2804 (e.g., a personal computer), a tablet computing device 2806, and/or a mobile computing device 2808 (e.g., a smart phone). Any of these embodiments of the computing devices may obtain content from the store 2816, in addition to receiving graphical data useable to either be pre-processed at a graphic-originating system or post-processed at a receiving computing system.
As should be appreciated, FIG. 28 is described for purposes of illustrating the present methods and systems and is not intended to limit the disclosure to a particular sequence of steps or a particular combination of hardware or software components.
FIG. 29 illustrates an exemplary tablet computing device 2900 that may execute one or more aspects disclosed herein. In addition, the aspects and functionalities described herein may operate over distributed systems (e.g., cloud-based computing systems), where application functionality, memory, data storage and retrieval, and various processing functions may be operated remotely from each other over a distributed computing network, such as the Internet or an intranet. User interfaces and information of various types may be displayed via on-board computing device displays or via remote display units associated with one or more computing devices. For example, user interfaces and information of various types may be displayed and interacted with on a wall surface onto which user interfaces and information of various types are projected. Interaction with the multitude of computing systems with which embodiments of the invention may be practiced includes keystroke entry, touch screen entry, voice or other audio entry, gesture entry (where an associated computing device is equipped with detection (e.g., camera) functionality for capturing and interpreting user gestures for controlling the functionality of the computing device), and the like.
As should be appreciated,FIG. 29 is described for purposes of illustrating the present methods and systems and is not intended to limit the disclosure to a particular sequence of steps or a particular combination of hardware or software components.
Aspects of the present disclosure, for example, are described above with reference to block diagrams and/or operational illustrations of methods, systems, and computer program products according to aspects of the disclosure. The functions/acts noted in the blocks may occur out of the order shown in any flowchart. For example, two blocks shown in succession may in fact be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved.
The description and illustration of one or more aspects provided in this application are not intended to limit or restrict the scope of the disclosure as claimed in any way. The aspects, examples, and details provided in this application are considered sufficient to convey possession and enable others to make and use the best mode of the claimed disclosure. The claimed disclosure should not be construed as being limited to any aspect, example, or detail provided in this application. Regardless of whether shown and described in combination or separately, the various features (both structural and methodological) are intended to be selectively included or omitted to produce an embodiment with a particular set of features. Having been provided with the description and illustration of the present application, one skilled in the art may envision variations, modifications, and alternate aspects falling within the spirit of the broader aspects of the general inventive concept embodied in this application that do not depart from the broader scope of the claimed disclosure.