CROSS REFERENCE TO RELATED APPLICATIONS- This patent application claims priority to Provisional Patent Application 61/648,593 filed May 18, 2012, Provisional Patent Application 61/670,754 filed Jul. 12, 2012, Provisional Patent Application 61/705,051 filed Sep. 24, 2012, Provisional Patent Application 61/771,629 filed Mar. 1, 2013, Provisional Patent Application 61/771,646 filed Mar. 1, 2013, Provisional Patent Application 61/771,690 filed Mar. 1, 2013, and Provisional Patent Application 61/771,704 filed Mar. 1, 2013, the disclosure of each of which is incorporated by reference herein in its entirety. 
BACKGROUND- 1. Field of the Invention 
- The present subject matter relates to providing a shared experience to audience members communicated from a central server by creating rule-based individual and shared interactions with portable interactive devices of audience members. 
- 2. Related Art 
- In order to enhance an audience's involvement in a live event such as a concert, video displays may be combined with a concert performance. 
- For example, United States Patent Application Publication No. 201200239526 discloses an interactive method and apparatus which provide limited interaction between a performer and concert attendees. The performer enters concert information into a server, which is then accessed wirelessly by an electronic device of a concert attendee. Animations from the server are dynamically displayed on the electronic device. In this arrangement, attendees may select a song to download or view its lyrics. The user may select an encore screening to vote on a song to be played during an encore performance. In this arrangement, the attendee interacts only with previously stored information. No new information is generated to enhance the performance. In order to combine further information sources, whether local or accessed through the Internet, the system must provide sufficient bandwidth, or delays and gaps in the data will occur. In the past, it has generally been impossible to provide sufficient bandwidth through a venue connection. Possible interactions between a performer and an audience are therefore greatly limited. 
- United States Published Patent Application No. 20070292832 discloses a system for creating sound using visual images. Various controls and features are provided for the selection, editing, and arrangement of the visual images and tones used to create a sound presentation. Visual image characteristics such as shape, speed of movement, direction of movement, quantity, and location can be set by a user. This system does not provide for interaction with the audience. 
- To the extent that audience interaction and displays constructed for particular users have been provided, they have had very limited capabilities. 
- U.S. Pat. No. 8,090,321 discloses a method and system for wirelessly providing venue-based data to one or more hand held devices. The venue-based data can be authenticated and wirelessly transmitted to one or more hand held devices through one or more wireless telecommunications networks in response to authenticating the venue-based data. This method and system provide data to hand held devices. However, an interaction between a device and the venue data source is not disclosed. 
- United States Published Patent Application No. 20130080348 describes capturing event feedback and providing a representation of feedback results generated using the feedback indicia. The capturing of event feedback involves storing a plurality of different event records for each of a plurality of different events. The information is used by a program presenter for determining audience behavior in response to transmitted content. Production of an enhanced concert experience is not disclosed. 
- U.S. Pat. No. 7,796,162 discloses a system in which one set of cameras generates multiple synchronized camera views for broadcast of a live activity from a venue to remote viewers. A user chooses which view to follow. However, there is no provision for varying the sets of images sent to users, and there is no uploading capability for users. 
- U.S. Pat. No. 6,731,940 discloses methods of using wireless geolocation to customize content and delivery of information to wireless communication devices. The communication devices send signals to a central control system. The method uses an RF receiving site including an antenna array and a mobile device operated by a user. At least one p-dimensional array vector is derived from RF signals sampled from p antennas of an array, where p is an integer, and is used to derive a location of the mobile device. The device addresses a data source in order to customize information in correspondence with the location. The customized information is transmitted to a user. A significant application of this system is to send and/or receive location-specific information of interest, such as targeted advertisements and special services, to travelers and shoppers. A control function is not provided for generating a display comprising an entertainment performance. 
- United States Published Patent Application No. 20110075612 discloses a system in which content is venue-cast. The content is sent to a plurality of receiving access terminals comprising portable interactive devices within a venue boundary. Content generated at an access terminal is transmitted to a venue-cast server. A venue-specific network could comprise a wide area network (WAN) or a Wi-Fi hotspot deployment. The system provides “unscheduled ad hoc deliveries” of content via the venue transmission system to provide venue visitors with venue related information. Content is specific to the venue and is not related to groups of users within the venue. The only function provided is a venue cast. 
SUMMARY- Briefly stated, in accordance with the present subject matter, there are provided a system, method, and machine-readable medium comprising instructions to be executed on a digital processor for permitting a system to create cooperatively determined video compositions and interaction between audience members to produce a composite result. A central server provides commands through which information is sent to audience members, gathered from them, or both. Information received from audience members is processed to determine relationships between data from individual users. More specifically, in one form, rules may be implemented in a processor to enable a game played by audience members which is coordinated by a central control system. 
- In one preferred form, a Wi-Fi link is provided in a venue so that the ability to communicate is not limited by Internet or cellular system bandwidth constraints. 
- A program may issue commands to all interactive devices in an audience to produce a composite result. The present subject matter can direct portable interactive devices to perform functions in response to received signals from a central source including a central server. The central server may interact with a portable interactive device through an app created in accordance with the present subject matter. The app is installed in the portable interactive device. 
- Another composite result is implementation of a game in which the central server sends commands to gather selected information from each portable interactive device. The information is processed according to a preselected rule, and results are provided to users. A plurality of iterations of information gathering and processing may be performed. Various forms of information may be gathered and processed in accordance with different preselected rules in order to implement different games or information transmission. 
BRIEF DESCRIPTION OF THE DRAWINGS- The present subject matter may be further understood by reference to the following description taken in connection with the following drawings: 
- FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of the method and apparatus of the present subject matter operating in a venue; 
- FIG. 2 is a block diagram of the system illustrated in FIG. 1; 
- FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network; 
- FIG. 4 is a block diagram of a smartphone; 
- FIG. 5 is a flow chart illustrating one form of app for enabling portable user devices to interact in the system; 
- FIG. 6 is a block diagram illustrating the central server acting as a source communicating via a server with portable interactive devices; 
- FIGS. 7, 8, and 9 are each a flow chart illustrating interactive applications within the present system; 
- FIG. 10 is a flow chart illustrating a further use of data gathered from interactive devices; and 
- FIG. 11 is an illustration of a display of forms of interaction performed by the system such as a game or shared activity. 
DETAILED DESCRIPTION- FIG. 1, consisting of FIGS. 1A and 1B, is an illustration of a venue 10 housing a system 2 in accordance with the present subject matter. FIG. 2 is a high-level block diagram of communication paths in the system illustrated in FIG. 1. FIGS. 1 and 2 are discussed at the same time. The system 2 may be used in conjunction with a live event, for example a concert. Two-way interactivity is provided between a central server 8 and individual audience members 4 who may each have a portable interactive device 6. The portable interactive device 6 may be a smartphone, tablet, or other device. 
- A central clock 9 synchronizes operations. The venue 10 may include a stage 12, an audience area 14, a control room 16, and a media system 18 which may be located in the control room 16. The media system 18 receives audio, video, and intelligence from sources and may be operated to perform control room functions such as mixing, selecting, and processing. A video program 20 is shown on a display 22. 
- The media system 18 is used to couple outputs from a video source 26, a sound source 28, and other intelligence source 30. The video source 26 may comprise one or more television cameras 24. In the present illustration, a media source 34 includes the video source 26, sound source 28, and other intelligence source 30. The sound source 28 comprises audio output from a live performance provided by a performer or performers 40 coupled by transducers 42, such as microphones. Alternatively, one or more of the video source 26, the sound source 28, and other intelligence source 30 may comprise sources of streaming content, prerecorded content, stored data, or currently processed content from any source. These sources may be local, remote, or both. 
- In one preferred form, the display 22 is a screen 50 that comprises a backdrop for the stage 12. The display 22 could comprise an array 52 of screens over which the video program 20 is distributed. In another form, often used in arenas, the display 22 could comprise a display unit 56 which includes a plurality of monitors 58 on one support 60, with each monitor 58 facing in a different direction. Examples of the display unit 56 are available under the trademark Jumbotron®. 
- The media system 18 is operated by a VJ 70. The VJ 70 may comprise one or more personnel or a programmed computer. It is not essential that the control room 16 be located at the venue 10. The media system 18 provides content to a concert network controller 100. The concert network controller 100 may both receive and transmit information. The concert network controller 100 provides an input to a display link 102, which is coupled by a patch panel 104 to the display unit 56. 
- The concert network controller 100 may also comprise a Wi-Fi hotspot 120 providing signals to and receiving signals from the audience area 14. As further described below, content may be provided both to and from audience members 4. Audience members may also be informed of information relating to the composite responses from all audience members. The concert network controller 100 may also interact with remote participants 140. In another form, a Wi-Fi system 124, discussed below with respect to FIG. 2, couples audience members 4 to interact with the system 2. 
- The concert network controller 100 is preferably wirelessly connected to an event server 130, which can provide communications between remote participants 140 and the concert network controller 100. The event server 130 is coupled to a content editor 134, which interacts with a staging server 136. The staging server 136 may be coupled to the remote participants 140 by a network, for example, the Internet 144. 
- Communications will be provided between a target system and a source system. In the present description, a "source system" is a device that wishes to send a message to a "target system." The target system is a device that is configured to receive sent messages via the network connection subsystem provided by its operating system. The business logic running on a device can elect to operate as either the target or the source system at any moment. Operating as a source system or target system for a particular messaging transaction does not preclude operating as the other system for a different messaging transaction simultaneously. 
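- For illustration only, the following is a minimal sketch, in Python, of the source/target roles described above. The class and field names (Message, Device, inbox, and so on) are assumptions made for this example and are not part of the disclosed system; a device simply acts as the source for transactions it originates and as the target for transactions it receives, and it may do both simultaneously.

```python
from dataclasses import dataclass, field
import time


@dataclass
class Message:
    source_id: str           # device acting as the source system for this transaction
    target_id: str           # device acting as the target system for this transaction
    payload: dict = field(default_factory=dict)
    timestamp: float = field(default_factory=time.time)


class Device:
    """A device may act as source and target simultaneously for different transactions."""

    def __init__(self, uid: str):
        self.uid = uid
        self.inbox = []

    def send(self, target: "Device", payload: dict) -> None:
        # Acting as the source system for this transaction.
        target.receive(Message(source_id=self.uid, target_id=target.uid, payload=payload))

    def receive(self, message: Message) -> None:
        # Acting as the target system; the OS-provided network connection
        # subsystem would deliver the message, modeled here as a simple queue.
        self.inbox.append(message)
```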
- In a nominal application, thousands of portable user devices 6 may communicate with the concert network controller 100. The communication will provide interaction for the intended uses of the system 2. This alone could strain resources and require expensive T1 access lines far beyond the capacity normally utilized within a concert venue. Providing such capacity would be both expensive and impractical. Additionally, users 4 have the option to operate their portable user devices 6 in order to access the Internet and cell phone services. It is important to limit bandwidth requirements to accommodate a large number of portable user devices 6. This can be accomplished by disabling access to applications that are not part of the entertainment functions of the system 2. For purposes of the present description, the applications contributing to the functioning of the system 2 are referred to as business logic. 
- FIG. 3 is a block diagram illustrating one preferred form of effectively connecting client devices in a network. In this embodiment, individual portable interactive devices 300-1 through 300-n, such as smartphones or tablet computers, each store an application, or app. Each portable interactive device contains its own library 302 and a program memory 304 storing an app 306. The portable interactive device 300 includes a processor 310. Additionally, each portable interactive device may include a display 316 and a graphical user interface 320 of the type normally found on a smartphone. The portable interactive devices 300 each interact via a communications link 330 with a video processor 340. In one preferred form, the communications link 330 comprises the Wi-Fi link 120. Interaction is commanded from the central server 8. 
- The enhanced shared experience of the users 4 may include receiving common displays from the concert controller 100. An example might be each portable interactive device 300 being commanded to display the same solid color. Another alternative is providing different displays to each of a group 662 (FIG. 6) of portable interactive devices 300. As further described below, inputs from the portable interactive devices 300 may be read by the central server 8. Commands may be provided from the central server 8 to respond to the mood of an event and to create an evolving experience for all participants. 
- In accordance with a further aspect of the present subject matter, the app 306 is provided to coordinate the functions described herein. The description of operation herein also serves as a description of the software architecture of the app 306. The app 306 may be supplied to each portable interactive device through a number of means. It may be supplied via the Internet 350 or via a cell phone connection 352 from a software company 360. The software company may comprise a software developer. Apps 306 that are written by developers may be downloaded from a source such as the iTunes store for iOS phones or Google Play for Android phones. 
- FIG. 4 is a block diagram of the internal circuitry of a nominal smartphone 400 utilizing an app in accordance with the present subject matter. A processor 410 controls operation of the smartphone 400 and communications with the system 2 (FIG. 1). Wi-Fi communication is made through an RF module 414 coupled to the processor 410. The smartphone 400 is preferably of a type equipped with transducers. In one form, the processor 410 receives condition-responsive inputs from many different sources. These sources may include a camera lens 420. Ambient physical parameter sensors may include a humidity sensor 422, gyroscope 424, digital compass 426, atmospheric pressure sensor 428, and temperature sensor 430. An accelerometer 432 and a capacitive touch sensor 434 sense movements made by a user 4. 
- The processor 410 also interacts with an audio module 440 coupled to a microphone 442 and to a speaker 444. Functions connected with the use of the camera lens 420 and associated circuitry within the processor 410 include an ambient light sensor 450, flash lamp 452, and optical proximity sensor 454. Memories associated with the processor 410 include a data memory 460 and a program memory 470. The data memory 460 stores data and provides data as commanded by a program in the program memory 470. 
- The app 306 is loaded into the program memory 470 when downloaded by a user 4. When the app 306 is activated by the user 4, the smartphone 400 will respond to commands from the central server 8. Information will be uploaded from or downloaded to the smartphone 400. When downloading an app, a user grants permissions for the app to access and upload data, control selected functions, and download data. This grant of permissions may be explicit or it may be made by default. 
- Permissions utilized by the app 306 may include permission to: modify or delete USB storage contents; discover a list of accounts known by the phone; view information about the state of Wi-Fi in the phone; create network sockets for Internet communication; receive data from the Internet; locate the smartphone 400, either by reading a coarse network location or a fine GPS location; read the state of the phone, including whether a call is active, the number called, and the serial number of the smartphone 400; modify global system settings; retrieve information about currently and recently running tasks; kill background processes; or discover private information about other applications. Consequently, an app 306 can be provided that will readily access the forms of information discussed below from a smartphone 400. The serial number of the smartphone 400 may be used to compose a Unique Identification Number (UID). Other ways of assigning unique identification numbers include assigning numbers as users 4 log on to a session. 
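- As a hedged sketch only, one way to compose a UID from a device serial number, or alternatively to assign numbers as users log on to a session, is shown below in Python. The hashing step and the sequential counter are illustrative assumptions rather than a required implementation.

```python
import hashlib
import itertools

# Session-wide counter used by the log-on alternative (illustrative only).
_session_counter = itertools.count(1)


def uid_from_serial(serial_number: str) -> str:
    # Compose a UID from the device serial number; hashing avoids exposing
    # the raw hardware identifier over the venue network.
    return hashlib.sha256(serial_number.encode("utf-8")).hexdigest()[:16]


def uid_from_logon() -> str:
    # Alternative: assign sequential numbers in the order users log on.
    return f"session-{next(_session_counter):06d}"
```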
- FIG. 5 is a flowchart illustrating one form of software used for the app 306. The app 306 may also enable functions further described below. The app 306 may take many different forms in order to provide the functions of enabling interaction of the portable interactive devices 300 (FIG. 3) and the central server 8. At block 500, a command input is created and sent to the central server 8. The command may be created by the VJ 70 or invoked by an automated program. At block 502, the command input accesses a stored command which is transmitted from the central server 8 via the Wi-Fi transmitter 120 (FIG. 1A). At block 504, the command signal is received by the RF module 414 (FIG. 4) in the smartphone 400. The command signal is translated to the program memory 470 at block 506 in order to access a command. At block 508, entitled "access data," the command is provided from the program memory 470 in order to access appropriate locations in the data memory 460 for uploading or downloading information. At block 510, the data that is the subject of the command is exchanged between the central server 8 and the smartphone 400. 
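- A minimal sketch of the device-side command handling of FIG. 5 is given below in Python, assuming a simple newline-delimited JSON protocol over the venue Wi-Fi link. The socket address, message format, and handler names are assumptions for illustration and are not the disclosed protocol.

```python
import json
import socket

COMMAND_HANDLERS = {}                     # commands stored with the app (program memory)


def register(name):
    def wrapper(fn):
        COMMAND_HANDLERS[name] = fn
        return fn
    return wrapper


@register("read_sensor")
def read_sensor(params, data_memory):
    # Block 508: access the requested location in data memory for uploading.
    return {"value": data_memory.get(params.get("key"))}


def command_loop(server_address, data_memory):
    # Block 504: receive command signals over the venue Wi-Fi link.
    with socket.create_connection(server_address) as sock:
        stream = sock.makefile("rwb")
        for line in stream:
            command = json.loads(line)                                  # block 506
            handler = COMMAND_HANDLERS.get(command.get("name"))
            if handler is None:
                continue
            result = handler(command.get("params", {}), data_memory)    # block 508
            stream.write((json.dumps(result) + "\n").encode())          # block 510
            stream.flush()
```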
- FIG. 6 is a detailed partial view of the block diagram in FIG. 1 illustrating the central server 8 and its interconnections. The central server 8 is coupled via a data bus 600 to the media system 18 in order to receive preprogrammed selections or selections made by the VJ 70. The central server 8 is also coupled via the data bus 600 to the concert network controller 100. The central server 8 sends commands and information via the concert network controller 100 to the portable interactive devices 6 and receives information from the portable interactive devices 6. The central server 8 comprises a data memory 610 and a program memory 620. A processor 630 coordinates operations. The specific requests may be made by the VJ 70 through a GUI 650 at the media system 18. 
- Particular parameters to be requested in order to achieve various composite results are explained further with respect to FIGS. 7-9. 
- The central server 8 may permit users to log in to their Facebook accounts when joining a session. Using the permissions described above, the central server 8 scrapes demographic information about each user, such as age, gender, and location, as well as stored information from the device, such as favorite musical artist, the number of prior shared enhanced-experience events attended, or other information. Selected information may be shown graphically on the display 22, as shown in FIG. 11, to inform the users 4 of audience information. 
- The system illustrated in FIG. 6 supports messaging between individuals 4 in the audience. Users 4 are allowed to initiate contact with other audience members. A user 4 may operate the GUI 320 (FIG. 3) to select a particular audience member, to select an entire category, or to request the server to produce a new category in accordance with the wishes of the user 4. Information is gathered and used as described with respect to FIG. 6 above and FIGS. 10 and 11 below to allow a user 4 to search for characteristics of other users 4. For example, a user 4 can request a display of a graph of gender and age and choose to see only the list of men between the ages of 30 and 50 who have joined the session. The user 4 may post a text message to only that subset of users. 
- In order to group devices, a device family segregation method is employed. The controller-only framework includes provision to segregate client devices into arbitrary groupings based on any provided metric. Groups 662 may be generated from any available information. For example, a group 662 may include selected identification numbers of portable interactive devices 6. A group 662 may include location data of each user device 6 as received from the device. Another group 662 may comprise demographic information gathered from the user devices 6. In order to address a selected segment of an audience, data from a selected group 662 is mapped into an address list. This selection may be made by the VJ 70 via the central server 8. Displays may be provided to the user devices 6 so that an individual group 662 receives an input. Additionally, different groups may be provided with different information or displays in order to produce a composite display for all or part of a venue 10. Groups 662 may also consist of individual users 4. 
- Criteria for establishing a group 662 include the role of a component or of other users in interacting with the group. Components or other users to be factored into the criteria include the concert network controller, Jumbotron®, OCS, remote client, gender, age, location within the venue, device type (e.g., iOS, Android, HTML5, Windows, OSX, or Linux), device family (including iPhone 3G, 3GS, 4/4S, 5, iPad, iPad 2, and iPad 3), and random selection. In addition, the framework allows the creation, collection, and persistence of arbitrary metrics for use as segregation criteria. Individual devices can be added to or removed from any grouping as desired. Groups 662 may be mutually exclusive or can overlap in any manner desired. Once defined, a grouping can be used as the target for specific commands. 
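- Under assumed data structures, the grouping step can be sketched as a filter over the known devices followed by a mapping of the resulting group into an address list, as below; the field names (address, os, section) are illustrative only and are not part of the disclosed framework.

```python
from typing import Callable, Dict, List


def build_group(devices: List[Dict], predicate: Callable[[Dict], bool]) -> List[Dict]:
    """Return the subset of devices satisfying an arbitrary segregation metric."""
    return [d for d in devices if predicate(d)]


def address_list(group: List[Dict]) -> List[str]:
    """Map a group into the address list used to target commands."""
    return [d["address"] for d in group]


# Example: all Android devices reported in a hypothetical venue section "A".
devices = [
    {"uid": "u1", "address": "10.0.0.11", "os": "Android", "section": "A"},
    {"uid": "u2", "address": "10.0.0.12", "os": "iOS", "section": "B"},
]
android_in_a = build_group(devices, lambda d: d["os"] == "Android" and d["section"] == "A")
targets = address_list(android_in_a)   # ["10.0.0.11"]
```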
- Simplified access to selected users 4 is provided. Additionally, users 4 can be enabled to request specific data. The sorting function supports the creation of arbitrary data elements such as groupings or commands for later reference by name. This can be used to send a specific complex or often used command or to define a grouping of devices. 
- FIGS. 7, 8, and 9 are each a flow diagram illustrating interactive applications within the present system. 
- A first example of an interaction producing a composite result is illustrated in FIG. 7. In this case, the VJ 70 addresses the central server 8 via the GUI 650 (FIG. 6) in order to select a "timed group photo command." This initiates a countdown period at the end of which the portable interactive devices 6 will be commanded to take a flash picture. At block 700, the VJ 70 initiates the command. At block 702, the command is translated via the data bus 600 to address the processor 630 (FIG. 6). At block 704, the timed group photo command is accessed from the program memory 620. Also at block 704, the command is coupled to the concert controller 100 for transmission by the Wi-Fi transceiver 124 (FIG. 1A). At block 706, the command is received by the RF module 414 (FIG. 4) in the smartphone 400. At block 708, the command is coupled via the processor 410 to access appropriate commands from the program memory 470, and more specifically from the app 306. The program memory 470 operates the smartphone 400 by coupling signals to appropriate modules at block 720, which contains blocks 722 and 724. A first portion of the command within block 720 is at block 722, at which a counter in the processor 410 initiates a countdown and produces an image on the display 416. The portable interactive devices 6 play a synchronized countdown message, such as "3 . . . 2 . . . 1 . . . " At the end of the countdown, operation proceeds to a second portion of the block 720, namely block 724. At block 724, the flash 452 in each smartphone 400 is activated. The smartphones 400 all flash at substantially the same time. A system and method for executing a command at the same time is disclosed in commonly owned patent application serial number 2053U11, the disclosure of which is incorporated herein by reference. The processor 410 enables optical information from the camera lens 420 to be loaded into the data memory 460 to store a picture. 
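- The device-side portion of the timed group photo command (blocks 722 and 724) can be sketched as follows, assuming hypothetical display, flash, and capture helpers supplied by the platform; these names are placeholders for illustration, not an actual smartphone API.

```python
import time


def timed_group_photo(fire_at: float, display, camera):
    """Count down to a shared fire time, then flash and capture a picture."""
    while True:
        remaining = fire_at - time.time()
        if remaining <= 0:
            break
        display.show(f"{int(remaining) + 1} ...")   # block 722: countdown on the display
        time.sleep(min(remaining, 0.1))
    camera.flash()                                  # block 724: all flashes fire together
    return camera.capture()                         # picture later uploaded (block 730)
```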
- Further in accordance with the timed group photo command, at block 730 each picture is transmitted via the RF module 414 back to the event server 130, which contains memory for storing the received pictures. At block 732, various tags may be added to each picture. Most commonly, the tag will comprise a timestamp. Processing is performed at block 734 in order to obtain a result which comprises a composite of the interaction of the portable interactive devices 6 and the system 2. This result may take many forms. For example, the VJ 70, at block 734, can create collages and other photographic displays from the resulting images. As indicated by the loop connection from block 736 back to block 734, this operation can be repeated over the course of an event. This allows accumulation of a large photo archive of the event itself on the event server 130 or the central server 8. The stored images can also be mined for souvenir material after the event. This data can be used to create a searchable, time-stamped record of the event which can be published later. 
- Another form of interaction is illustrated in FIG. 8. At block 760, the VJ initiates a command file via the graphical user interface 650 (FIG. 6) in order to choose a set of portable interactive devices 6 to be commanded. At block 762, a command is issued for enabling the imaging function on the selected portable interactive devices 6 and for commanding activation of the camera flash 452, indicating the location of the active cameras within the audience. At block 764, the signal is transmitted to the RF module 414 in each selected portable interactive device 6. At block 766, the command signal is translated to the processor 410, and the imaging function is executed. 
- At block 770, the images obtained are sent to the event server 130 and may be sent to the data memory 610 in the central server 8. At a processing block 772, the VJ 70 processes the images. At block 774, images are displayed on the big screen 50 or sent to all of the audience or to selected members of the audience. 
- The system will provide the ability for users who employ "client devices" to upload content to the system. Examples of uploaded content include photos or short video clips. Users may access content for uploading in a variety of ways. Users may take pictures from their client devices. They may access shared material from social media. The user may access a storage area of a client device and may select pictures or other items from storage files. When a first and a second user are connected in selected social applications, each user may also choose content from the other user's library. 
- Uploaded content may be reviewed at the controller 100 (FIG. 1A). Content review may take many forms. Content review may be manual, automated, or both. The VJ 70 and automated criteria measurement subsystems can browse through uploaded submissions. The uploaded submissions may be handled individually or in groups. 
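- One possible arrangement of the review step, combining an automated criteria check with a manual pass by the VJ, is sketched below; the duration criterion and the data structures are assumptions made only for illustration.

```python
from collections import deque

review_queue = deque()


def automated_check(item: dict) -> bool:
    # Example automated criterion: reject clips longer than 15 seconds.
    return item.get("duration_s", 0) <= 15


def submit_upload(item: dict) -> None:
    # Items passing the automated check are held for manual review by the VJ.
    if automated_check(item):
        review_queue.append(item)


def vj_review(approve) -> list:
    """Manually browse queued submissions, individually or in groups."""
    return [item for item in review_queue if approve(item)]
```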
- FIG. 9 illustrates a further form of composite result. A game is implemented based on physical actions of audience members. In one form, the game compares how much energy users 4 can apply to their respective smartphones 400. 
- At block 800, a command is selected by the VJ 70. Many forms of "game-like" interactions can be commanded. Upon a command from the controller 100 at block 802, an interaction called "Shaking" is initiated when a command is issued to enable reading of the sensor accelerometers 432 in the smartphones 400. Each device 400 provides a message to its user on its display 416 that says "Shake your phone!" Each user 4 then begins shaking the respective smartphone 400. The issued command derives output from the accelerometer 432 of each smartphone 400 and transmits data back to the controller 100. At block 804, the accelerometers 432 are read and the information is sent back to the central server 8. The accelerometers 432 of the smartphones 400 provide a substantially real-time dataset of the rates of motion and amounts of kinetic energy being expended by the audience members. Tags may be attached at block 806. Data is stored at the central server 8 at block 808. In a loop, data from the central server 8 is integrated at block 810 and stored again at block 808 to provide updated processed data. Rule-based processing is used at block 812 to determine preselected information derived from processing the stored data. For example, data indicative of a user 4's movements may be used to characterize the kinds of kinetic energy being created by users who are dancing, swaying, waving their hands, or standing idly. 
- A rule is applied over successive operating cycles, usually clock periods, to update status and keep games current. 
- This information can be processed in a number of ways. For example, a ranking can be assigned based on physical attributes, such as which user applies the most energy to a smartphone, maintains the most constant rhythm, or performs best according to some other criterion. This process can produce a ranking of all the participating devices. The VJ 70 can then command the display of the "winners" or "leaders" of the ranking. 
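- A sketch of one possible rule for the "Shaking" interaction is given below: accelerometer samples are reduced to a rough kinetic-energy score per device, and the devices are ranked. The data layout and the scoring rule are assumptions for illustration; the actual rule is whatever the VJ 70 selects.

```python
from math import sqrt
from typing import Dict, List, Tuple


def kinetic_score(samples: List[Tuple[float, float, float]]) -> float:
    """Sum of acceleration magnitudes as a rough proxy for expended kinetic energy."""
    return sum(sqrt(x * x + y * y + z * z) for x, y, z in samples)


def rank_shakers(readings: Dict[str, List[Tuple[float, float, float]]], top_n: int = 5):
    # Score each device, then sort descending to obtain the "leaders" to display.
    scores = {uid: kinetic_score(samples) for uid, samples in readings.items()}
    leaders = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    return leaders[:top_n]
```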
- The central clock 9 also allows solution of problems found in traditional distributed multiplayer gaming. Consider a shooting game in which players are instructed to "draw and shoot" their weapons as soon as they see a special signal appear either on the big screen 50 at the venue 10 or on their respective portable user devices 6. A timestamp signal from the central clock 9 may be associated with each "BANG" message at the time a response is commanded from a user device 6. A winner is determined by comparison of timestamps rather than by the arrival time of the "BANG" messages at the central server 8. 
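- Resolving the draw-and-shoot game by central-clock timestamps rather than by message arrival order can be sketched as follows; the message layout is an assumption for illustration.

```python
def fastest_draw(bang_messages):
    """bang_messages: iterable of (device_uid, central_clock_timestamp) pairs."""
    # The earliest central-clock timestamp wins, regardless of network delay.
    return min(bang_messages, key=lambda msg: msg[1])


# Example: u2 wins even if its "BANG" message reaches the server last.
winner_uid, winner_time = fastest_draw([("u1", 12.431), ("u2", 12.405), ("u3", 12.512)])
```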
- Another game is a scavenger hunt. In one form of scavenger hunt game, a group 662 holds a code which another group 662 requires to complete a game or puzzle. That is, a "scavenger hunt" may be implemented by providing codes associated with one subset of users 4 which another subset of users 4 requires to complete the game or puzzle. The second subset of users 4 has to "ping" other users to access the required code. 
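- A hypothetical sketch of the scavenger hunt exchange is shown below: members of one group hold code fragments, and members of the second group "ping" them to collect the fragments required to complete the puzzle. All names and codes are illustrative only.

```python
from typing import Optional


def ping_for_code(code_holders: dict, holder_uid: str) -> Optional[str]:
    """A seeker "pings" a holder; the holder's code fragment is returned if present."""
    return code_holders.get(holder_uid)


def puzzle_complete(collected: set, required: set) -> bool:
    return required.issubset(collected)


# Example with illustrative codes held by two members of the first group.
code_holders = {"u7": "RAVE", "u9": "2013"}
collected = {ping_for_code(code_holders, "u7"), ping_for_code(code_holders, "u9")}
print(puzzle_complete(collected, {"RAVE", "2013"}))   # True
```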
- These interactions promote interplay and communication in the physical space among people attending the event. By addressing messages to individual devices 6, lotteries can be conducted and winners and losers can be informed of individual outcomes. 
- Utilizing device addressing also facilitates messaging between individuals in the audience, whether one-on-one, one-to-many, or many-to-one. In this context, "many" comprises a group 662. 
- FIG. 10 is a flow chart illustrating the gathering of data from interactive devices 6. FIG. 11 is an illustration of a display 950 which may be produced by the method of FIG. 10. FIGS. 10 and 11 are discussed together. This technique may be used to produce "dynamic statistics." At block 900, the VJ 70 issues a command to gather data. The selected systems within the smartphones 400 are queried at block 902. At block 904, data is received and sent to the data memory 610 in the central server 8. 
- At block 906, selected data can be arranged for display. At block 908, the data is displayed. For example, plot 954 is an illustration of distributions of kinetic energy readings received from the smartphones 400. Displays may be provided on the large screen 50 as well as on the displays 416 of the smartphones 400. Since the app 306 in one preferred form has a wide range of permissions, virtually any data within a smartphone 400 can be accessed. This data can include information scraped from the Facebook social graph, such as a map of the home locations represented in a crowd, as seen in the map graphic 956. Statistics may be repeatedly collected from the contents of libraries 302 across all the devices, from local light and sound levels over time, or from accelerometer information over time. The repeated collection updates the computed values, thus providing dynamic statistics. 
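- The "dynamic statistics" loop can be sketched as repeated collection and re-display of a chosen metric, as below; the query_devices and render callables, the metric name, and the polling interval are assumptions for illustration rather than the disclosed implementation.

```python
import time
from statistics import mean


def collect_dynamic_statistics(query_devices, render, metric="kinetic_energy",
                               interval_s=5.0, cycles=10):
    """Repeatedly poll devices for a metric and update the displayed summary."""
    history = []
    for _ in range(cycles):
        readings = query_devices(metric)             # block 902: query selected systems
        if readings:
            history.append({"mean": mean(readings),  # block 906: arrange data for display
                            "max": max(readings),
                            "count": len(readings)})
            render(history[-1])                      # block 908: update the display
        time.sleep(interval_s)                       # repeated collection keeps values current
    return history
```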
- For example, the act of dancing may be measured by the accelerometer 432 instrumentation in the smartphones 400. The processor may register measurements to determine the top five most active "dancers" in the audience. By virtue of the downloading of social network information corresponding to particular users, the system may access the Facebook picture of each of the five most active "dancers," as seen in panel 960. Another form of statistical information can be gathered by the geolocation transducers in the smartphone 400. The system can measure the amount of physical movement of each smartphone 400 and then display a list 962 of the most "restless" users. 
- The above description is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these aspects will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other aspects without departing from the spirit or scope of the invention. For example, one or more elements can be rearranged and/or combined, or additional elements may be added. A wide range of systems may be provided consistent with the principles and novel features disclosed herein.