This application claims the benefit under 35 U.S.C. § 119(e) of the following U.S. Provisional Applications: No. 61/389,000, filed October 1, 2010, entitled "Dual Screen Window System"; No. 61/389,117, filed October 1, 2010, entitled "Multiple Operating System Portable Connection Device"; No. 61/389,087, filed October 1, 2010, entitled "Tablet Computer User Interface"; No. 61/458,150, filed November 17, 2010, entitled "Dual Screen Email Client"; and No. XX/XXXXXX, filed September XX, 2011, entitled "Mobile Device". Each of the aforementioned documents is incorporated herein by reference in its entirety for all purposes.
Embodiments
Presented herein are embodiments of a device. The device can be a communications device, such as a cellular telephone, or another smart device. The device can include two screens that are oriented to provide several unique display configurations. In addition, the device can receive user input in unique ways. The overall design and functionality of the device provide an enhanced user experience, making the device more useful and more efficient.
Mechanical features:
Figures 1A-1J illustrate a device 100 in accordance with embodiments of the present disclosure. As described in greater detail below, device 100 can be positioned in a number of different ways, each of which provides different functionality to a user. Device 100 is a multi-screen device that includes a primary screen 104 and a secondary screen 108, both of which are touch sensitive. In embodiments, the entire front surfaces of screens 104 and 108 may be touch sensitive and capable of receiving input from a user touching those front surfaces. Primary screen 104 includes a touch-sensitive display 110, which, in addition to being touch sensitive, displays information to the user. Secondary screen 108 includes a touch-sensitive display 114, which also displays information to the user. In other embodiments, screens 104 and 108 may include more than one display area.
Primary screen 104 also includes a configurable area 112 that has been configured for specific inputs when the user touches portions of the configurable area 112. Secondary screen 108 likewise includes a configurable area 116 that has been configured for specific inputs. Areas 112a and 116a have been configured to receive a "back" input indicating that a user would like to view previously displayed information. Areas 112b and 116b have been configured to receive a "menu" input indicating that the user would like to view menu options. Areas 112c and 116c have been configured to receive a "home" input indicating that the user would like to view information associated with a "home" view. In other embodiments, in addition to the configurations described above, areas 112a-c and 116a-c may be configured for other types of specific inputs, including controlling features of device 100; some non-limiting examples include adjusting overall system power, adjusting the volume, adjusting the brightness, adjusting the vibration, selecting displayed items (on either screen 104 or 108), operating a camera, operating a microphone, and initiating/terminating telephone calls. Also, in some embodiments, areas 112a-c and 116a-c may be configured for specific inputs depending upon the application running on device 100 and/or the information displayed on touch-sensitive displays 110 and/or 114.
In addition to touch sensing, primary screen 104 and secondary screen 108 may also include areas that receive input from a user without requiring the user to touch a display area of the screen. For example, primary screen 104 includes a gesture capture region 120, and secondary screen 108 includes a gesture capture region 124. These regions can receive input by recognizing gestures made by the user without the need for the user to actually touch the surface of the display area. In comparison to touch-sensitive displays 110 and 114, gesture capture regions 120 and 124 are commonly not capable of rendering a displayed image.
As best illustrated in Fig. 1C (showing a back view of device 100), the two screens 104 and 108 are connected together by a hinge 128. Hinge 128, in the embodiment shown in Figures 1A-1J, is a center hinge that connects screens 104 and 108 so that when the hinge is closed, screens 104 and 108 are juxtaposed (i.e., positioned side by side), as shown in Fig. 1B (illustrating a front view of device 100). Hinge 128 can be opened to position the two screens 104 and 108 in different relative positions. As described in greater detail below, device 100 may have different functionalities depending on the relative positions of screens 104 and 108.
Fig. 1D illustrates the right side of device 100. As shown in Fig. 1D, secondary screen 108 also includes a card slot 132 and a port 136 on its side. Card slot 132, in embodiments, accommodates different types of cards, including a subscriber identity module (SIM). Port 136, in embodiments, is an input/output port (I/O port) that allows device 100 to be connected to other peripheral devices, such as a display, keyboard, or printing device. As can be appreciated, these are merely some examples, and in other embodiments device 100 may include other slots and ports, such as slots and ports for accommodating additional memory devices and/or for connecting other peripheral devices. Also shown in Fig. 1D is an audio jack 140 that accommodates, for example, a tip, ring, sleeve (TRS) connector to allow a user to utilize headphones or a headset.
Device 100 also includes a number of buttons 158. For example, Fig. 1E illustrates the left side of device 100. As shown in Fig. 1E, the side of primary screen 104 includes three buttons 144, 148, and 152, which can be configured for specific inputs. For example, buttons 144, 148, and 152 may be configured, in combination or alone, to control a number of aspects of device 100. Some non-limiting examples include overall system power, volume, brightness, vibration, selection of displayed items (on either screen 104 or 108), a camera, a microphone, and initiation/termination of telephone calls. In some embodiments, instead of separate buttons, two buttons may be combined into a rocker button. This arrangement is useful in situations where the buttons are configured to control a feature such as volume or brightness. In addition to buttons 144, 148, and 152, device 100 also includes a button 156, shown in Fig. 1F, which illustrates the top of device 100. In one embodiment, button 156 is configured as an on/off button used to control overall system power to device 100. In other embodiments, button 156 is configured, in addition to or in lieu of controlling system power, to control other aspects of device 100. In some embodiments, one or more of the buttons 144, 148, 152, and 156 are capable of supporting different user commands. By way of example, a normal press has a duration commonly of less than about 1 second and resembles a quick tap. A medium press has a duration commonly of 1 second or more but less than about 12 seconds. A long press has a duration commonly of about 12 seconds or more. The function of the buttons is normally specific to the application that is currently in focus on the respective display 110 and 114. In a telephony application, for instance, and depending on the particular button, a normal, medium, or long press can mean end call, increase call volume, decrease call volume, and toggle microphone mute. In a camera or video application, for instance, and depending on the particular button, a normal, medium, or long press can mean increase zoom, decrease zoom, and take photograph or record video.
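The press-duration distinction described above can be illustrated with a minimal sketch. The function name, the exact thresholds, and the telephony mapping shown are chosen here for illustration only (the text gives approximate durations and says the mapping depends on the specific button and application):

```python
def classify_press(duration_s):
    """Classify a button press by how long it is held.

    Thresholds follow the description above: a normal press lasts
    less than about 1 second, a medium press from about 1 second to
    less than about 12 seconds, and a long press about 12 seconds
    or more.
    """
    if duration_s < 1.0:
        return "normal"
    if duration_s < 12.0:
        return "medium"
    return "long"


# One illustrative (assumed) per-class mapping for a telephony
# application; the actual assignment depends on the specific button.
PHONE_ACTIONS = {
    "normal": "end call",
    "medium": "increase call volume",
    "long": "toggle microphone mute",
}
```

A button handler would time the interval between press and release events, call `classify_press`, and dispatch through a per-application table such as `PHONE_ACTIONS`.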
There are also a number of hardware components within device 100. As illustrated in Fig. 1C, device 100 includes a speaker 160 and a microphone 164. Device 100 also includes a camera 168 (Fig. 1B). Additionally, device 100 includes two position sensors 172A and 172B, which are used to determine the relative positions of screens 104 and 108. In one embodiment, position sensors 172A and 172B are Hall effect sensors. However, in other embodiments, other sensors can be used in addition to or in lieu of the Hall effect sensors. An accelerometer 176 may also be included as part of device 100 to determine the orientation of device 100 and/or the orientation of screens 104 and 108. Additional internal hardware components that may be included in device 100 are described below with respect to Fig. 2.
The overall design of device 100 allows it to provide additional functionality not available in other communication devices. Some of this functionality is based on the various positions and orientations that device 100 can have. As shown in Figures 1B-1G, device 100 can be operated in an "open" position where screens 104 and 108 are side by side. This position allows a large display area for displaying information to the user. When position sensors 172A and 172B determine that device 100 is in the open position, they can generate signals that can be used to trigger different events, such as displaying information on both screens 104 and 108. Additional events may be triggered if accelerometer 176 determines that device 100 is in a portrait position (Fig. 1B) as opposed to a landscape position (not shown).
In addition to the open position, device 100 may also have a "closed" position, illustrated in Fig. 1H. Here again, position sensors 172A and 172B can generate a signal indicating that device 100 is in the "closed" position. This can trigger an event that results in a change of the information displayed on screen 104 and/or 108. For example, device 100 may be programmed to stop displaying information on one of the screens (e.g., screen 108), since a user can only view one screen at a time when device 100 is in the "closed" position. In other embodiments, the signal generated by position sensors 172A and 172B indicating that device 100 is in the "closed" position can trigger device 100 to answer an incoming telephone call. The "closed" position can also be a preferred position for utilizing device 100 as a mobile phone.
Device 100 can also be used in an "easel" position, illustrated in Fig. 1I. In the "easel" position, screens 104 and 108 are angled with respect to each other and facing outward, with the edges of screens 104 and 108 substantially horizontal. In this position, device 100 can be configured to display information on both screens 104 and 108 to allow two users to simultaneously interact with device 100. When device 100 is in the "easel" position, sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other, and accelerometer 176 can generate a signal indicating that device 100 has been placed so that the edges of screens 104 and 108 are substantially horizontal. The signals can then be used in combination to generate events that trigger changes in the display of information on screens 104 and 108.
Fig. 1J illustrates device 100 in a "modified easel" position. In the "modified easel" position, one of screens 104 or 108 serves as a base and faces down on the surface of an object such as a table. This position provides a convenient way for information to be displayed to a user in landscape orientation. Similar to the easel position, when device 100 is in the "modified easel" position, position sensors 172A and 172B generate a signal indicating that screens 104 and 108 are positioned at an angle to each other. Accelerometer 176 generates a signal indicating that device 100 has been positioned so that one of screens 104 and 108 is facing downward and is substantially horizontal. The signals can then be used to generate events that trigger changes in the display of information on screens 104 and 108. For example, information may not be displayed on the face-down screen, since the user cannot see that screen.
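The sensor fusion described for the two stand-like positions above can be sketched as follows. This is a simplified illustration with assumed function and argument names, not an implementation from the disclosure:

```python
def position_event(screens_at_angle, edges_horizontal, screen_face_down):
    """Map sensor indications to a physical-position event.

    screens_at_angle:  from position sensors 172A and 172B
    edges_horizontal:  from accelerometer 176 (both screen edges level)
    screen_face_down:  from accelerometer 176 (one screen flat on a surface)
    """
    if screens_at_angle and edges_horizontal:
        return "easel"
    if screens_at_angle and screen_face_down:
        return "modified_easel"
    return None  # no stand-family event; other positions handled elsewhere
```

The returned event would then be used to trigger the corresponding change in what is displayed on screens 104 and 108, such as suppressing output on a face-down screen.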
Transitional states are also possible. When the position sensors 172A and 172B and/or the accelerometer indicate that the screens are being closed or folded (from the open position), a closing transitional state is recognized. Conversely, when the position sensors 172A and 172B indicate that the screens are being opened or folded (from the closed position), an opening transitional state is recognized. The closing and opening transitional states are typically time-based, or have a maximum duration from a sensed starting point. Normally, no user input is possible while one of the closing and opening states is in effect. In this manner, incidental user contact with a screen during the closing or opening function is not misinterpreted as user input. In embodiments, another transitional state is possible when device 100 is closed. This additional transitional state allows the display to switch from one screen 104 to the second screen 108 when device 100 is closed, based on some user input, for example, a double tap on the screen 110, 114.
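The timed transitional states above, during which incidental contact is ignored, might be sketched as follows. The class name and the maximum-duration value are assumptions; the text says only that the states are time-based with a maximum duration from the sensed starting point:

```python
class TransitionGuard:
    """While a closing or opening transition is active, incidental
    user contacts are ignored rather than interpreted as input."""

    MAX_TRANSITION_S = 1.5  # assumed value; the text gives no figure

    def __init__(self):
        self._started = None

    def begin(self, now):
        # Called when the sensors first indicate a fold/unfold.
        self._started = now

    def active(self, now):
        return (self._started is not None
                and now - self._started < self.MAX_TRANSITION_S)

    def accept_touch(self, now):
        # No user input is possible while a transition is in effect.
        return not self.active(now)
```

A touch dispatcher would consult `accept_touch` before routing contacts to the displays or gesture capture regions.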
As can be appreciated, the description of device 100 is made for illustrative purposes only, and the embodiments are not limited to the specific mechanical features shown in Figures 1A-1J and described above. In other embodiments, device 100 may include additional features, including one or more additional buttons, slots, display areas, hinges, and/or locking mechanisms. Additionally, in embodiments, the features described above may be located in different parts of device 100 and still provide similar functionality. Therefore, Figures 1A-1J and the description provided above are non-limiting.
Hardware features:
Fig. 2 illustrates components of device 100 in accordance with embodiments of the present disclosure. In general, device 100 includes a primary screen 104 and a secondary screen 108. While the primary screen 104 and its components are normally enabled in both the open and closed positions or states, the secondary screen 108 and its components are normally enabled in the open state but disabled in the closed state. However, even when in the closed state, a user- or application-triggered interrupt (for example, in response to the operation of a telephony or camera application) can, by suitable command, flip the active screen, or disable the primary screen 104 and enable the secondary screen 108. Each screen 104, 108 can be touch sensitive and can include different operative areas. For example, a first operative area within each touch-sensitive screen 104 and 108 may comprise a touch-sensitive display 110, 114. In general, the touch-sensitive display 110, 114 may comprise a full-color touch-sensitive display. A second operative area within each touch-sensitive screen 104 and 108 may comprise a gesture capture region 120, 124. The gesture capture region 120, 124 may comprise an area or region outside of the touch-sensitive display 110, 114 area that is capable of receiving input, for example in the form of gestures provided by a user. However, the gesture capture region 120, 124 does not include pixels that can perform a display function or capability.
A third region of the touch-sensitive screens 104 and 108 may comprise a configurable area 112, 116. The configurable area 112, 116 is capable of receiving input and has display or limited display capabilities. In embodiments, the configurable area 112, 116 may present different input options to the user. For example, the configurable area 112, 116 may display buttons or other relatable items. Moreover, the identity of displayed buttons, or whether any buttons are displayed at all within the configurable area 112, 116 of a touch-sensitive screen 104 or 108, may be determined from the context in which device 100 is used and/or operated. In an exemplary embodiment, the touch-sensitive screens 104 and 108 comprise a liquid crystal display extending across at least those regions of the touch-sensitive screens 104 and 108 that are capable of providing visual output to the user, and a capacitive input matrix over those regions of the touch-sensitive screens 104 and 108 that are capable of receiving input from the user.
One or more display controllers 216a, 216b may be provided for controlling the operation of the touch-sensitive screens 104 and 108, including input (touch-sensing) and output (display) functions. In the exemplary embodiment illustrated in Fig. 2, a separate touch-screen controller 216a or 216b is provided for each touch screen 104 and 108. In accordance with alternative embodiments, a common or shared touch-screen controller 216 may be used to control each of the included touch-sensitive screens 104 and 108. In accordance with still other embodiments, the functions of the touch-screen controller 216 may be incorporated into other components, such as processor 204.
Processor 204 may comprise a general purpose programmable processor or controller for executing application programming or instructions. In accordance with at least some embodiments, processor 204 may include multiple processor cores and/or implement multiple virtual processors. In accordance with other embodiments, processor 204 may include multiple physical processors. As particular examples, processor 204 may comprise a specially configured application-specific integrated circuit (ASIC) or other integrated circuit, a digital signal processor, a controller, a hardwired electronic or logic circuit, a programmable logic device or gate array, a special purpose computer, or the like. Processor 204 generally functions to run programming code or instructions implementing various functions of device 100.
Communication device 100 may also include memory 208 for use in connection with the execution of application programming or instructions by processor 204, and for the temporary or long-term storage of program instructions and/or data. As examples, memory 208 may comprise RAM, DRAM, SDRAM, or other solid-state memory. Alternatively or additionally, data storage 212 may be provided. Like memory 208, data storage 212 may comprise a solid-state memory device. Alternatively or additionally, data storage 212 may comprise a hard disk drive or other random access memory.
In support of communication functions or capabilities, device 100 can include a cellular telephony module 228. As examples, cellular telephony module 228 can comprise a GSM, CDMA, FDMA, and/or analog cellular telephony transceiver capable of supporting voice, multimedia, and/or data transfers over a cellular network. Alternatively or additionally, device 100 can include an additional or other wireless communications module 232. As examples, the other wireless communications module 232 can comprise a Wi-Fi, Bluetooth™, WiMax, infrared, or other wireless communications link. Each of the cellular telephony module 228 and the other wireless communications module 232 can be associated with a shared or a dedicated antenna 224.
A port interface 252 may be included. Port interface 252 may include a proprietary or universal port to support the interconnection of device 100 to other devices or components, such as a dock, which may or may not include additional or different capabilities from those integral to device 100. In addition to supporting an exchange of communication signals between device 100 and another device or component, docking port 136 and/or port interface 252 can support the supply of power to or from device 100. Port interface 252 also comprises an intelligent element that includes a docking module for controlling communications or other interactions between device 100 and a connected device or component.
An input/output module 248 and associated ports may be included to support communications over wired networks or links, for example with other communication devices, server devices, and/or peripheral devices. Examples of an input/output module 248 include an Ethernet port, a Universal Serial Bus (USB) port, an Institute of Electrical and Electronics Engineers (IEEE) 1394 interface, or another interface.
An audio input/output interface/device(s) 244 can be included to provide analog audio to an interconnected speaker or other device, and to receive analog audio input from a connected microphone or other device. As an example, the audio input/output interface/device(s) 244 may comprise an associated amplifier and analog-to-digital converter. Alternatively or additionally, device 100 can include an integrated audio input/output device 256 and/or an audio jack for interconnecting an external speaker or microphone. For example, an integrated speaker and an integrated microphone can be provided to support close-talk or speakerphone operations.
Hardware buttons 158 can be included, for example, for use in connection with certain control operations. Examples include a master power switch, volume control, etc., as described in conjunction with Figures 1A through 1J. One or more image capture interfaces/devices 240, such as a camera, can be included for capturing still and/or video images. Alternatively or additionally, an image capture interface/device 240 can include a scanner or code reader. An image capture interface/device 240 can include or be associated with additional elements, such as a flash or other light source.
Device 100 can also include a global positioning system (GPS) receiver 236. In accordance with embodiments of the present invention, GPS receiver 236 may further comprise a GPS module that is capable of providing absolute location information to other components of device 100. An accelerometer 176 can also be included. For example, in connection with the display of information to a user and/or other functions, a signal from the accelerometer 176 can be used to determine an orientation and/or format in which to display that information to the user.
Embodiments of the present invention can also include one or more position sensors 172. The position sensor 172 can provide a signal indicating the position of the touch-sensitive screens 104 and 108 relative to one another. This information can be provided as an input, for example, to a user interface application, to determine an operating mode, characteristics of the touch-sensitive displays 110, 114, and/or other device 100 operations. As examples, a screen position sensor 172 can comprise a series of Hall effect sensors, a multiple-position switch, an optical switch, a Wheatstone bridge, a potentiometer, or another arrangement capable of providing a signal indicating the multiple relative positions in which the touch screens are placed.
Communications between the various components of device 100 can be carried by one or more buses 222. In addition, power can be supplied to the components of device 100 from a power source and/or power control module 260. The power control module 260 can, for example, include a battery, an AC-to-DC converter, power control logic, and/or ports for interconnecting device 100 to an external source of power.
Device states:
Figures 3A and 3B represent illustrative states of device 100. While a number of illustrative states are shown, along with transitions from a first state to a second state, it is to be appreciated that the illustrative state diagram may not encompass all possible states and/or all possible transitions from a first state to a second state. As illustrated in Fig. 3, the various arrows between the states (illustrated by the states being surrounded by a circle) represent a physical change that occurs to device 100, that is detected by one or more items of hardware and software, the detection triggering one or more interrupts in hardware and/or software, and the interrupts being used to control and/or manage one or more functions of device 100.
As illustrated in Fig. 3A, there are twelve exemplary "physical" states: closed 304, transition 308 (or opening transitional state), easel 312, modified easel 316, open 320, inbound/outbound call or communication 324, image/video capture 328, transition 332 (or closing transitional state), landscape 340, docked 336, docked 344, and landscape 348. Next to each illustrative state is a representation of the physical state of device 100, with the exception of states 324 and 328, where the state is generally symbolized by the international icon for a telephone and the icon for a camera, respectively.
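For reference, the twelve states and their reference numerals can be tabulated as a plain mapping; the short names are paraphrases of the states listed above:

```python
# The twelve exemplary physical states of Fig. 3A, keyed by numeral.
PHYSICAL_STATES = {
    304: "closed",
    308: "transition (opening)",
    312: "easel",
    316: "modified easel",
    320: "open",
    324: "inbound/outbound call",
    328: "image/video capture",
    332: "transition (closing)",
    336: "docked",
    340: "landscape",
    344: "docked",
    348: "landscape",
}
```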
In state 304, the device is in a closed state, with device 100 generally oriented in the portrait direction and primary screen 104 and secondary screen 108 back to back in different planes (see Fig. 1H). From the closed state, device 100 can enter, for example, docked state 336, where device 100 is coupled with a docking station, a docking cable, or otherwise docked or associated with one or more other devices or peripherals, or landscape state 340, where device 100 is generally oriented with primary screen 104 facing the user and primary screen 104 and secondary screen 108 back to back.
In the closed state, the device can also move to a transitional state in which the device remains closed but the display is moved from one screen 104 to another screen 108 based on a user input, for example, a double tap on the screen 110, 114. Still another embodiment includes a bilateral state. In the bilateral state, the device remains closed, but a single application displays at least one window on both the first display 110 and the second display 114. The windows shown on the first and second displays 110, 114 may be the same or different based on the application and the state of that application. For example, while acquiring an image with a camera, the device may display the viewfinder on the first display 110 and a preview for the photograph subject (full screen and mirrored left to right) on the second display 114.
In state 308, a transitional state from the closed state 304 to the semi-open or easel state 312, device 100 is shown opening, with primary screen 104 and secondary screen 108 being rotated about a point on the axis coincident with the hinge. Upon entering the easel state 312, primary screen 104 and secondary screen 108 are separated from one another such that, for example, device 100 can sit on a surface in an easel-like structure.
In state 316, known as the modified easel position, device 100 has primary screen 104 and secondary screen 108 in a relative relationship to one another similar to that of the easel state 312, with the difference being that one of primary screen 104 or secondary screen 108 is placed on a surface, as shown.
State 320 is the open state, where primary screen 104 and secondary screen 108 are generally in the same plane. From the open state, device 100 can transition to docked state 344 or open landscape state 348. In open state 320, primary screen 104 and secondary screen 108 are generally in similar portrait-like orientations, while in landscape state 348 primary screen 104 and secondary screen 108 are generally in similar landscape-like orientations.
State 324 is illustrative of a communication state, such as when device 100 is receiving an inbound call or placing an outbound call, respectively. While not illustrated for clarity, it should be appreciated that device 100 can transition to the inbound/outbound call state 324 from any state illustrated in Fig. 3. In a similar manner, the image/video capture state 328 can be entered from any other state in Fig. 3, with the image/video capture state 328 allowing device 100 to take one or more images via a camera and/or capture video with a video capture device 240.
Transition state 332 illustratively shows primary screen 104 and secondary screen 108 being closed upon one another for entry into, for example, the closed state 304.
Fig. 3, with reference to the key, further illustrates the inputs that are received to detect a transition from a first state to a second state. In Fig. 3B, various combinations of states are shown, with, in general, a portion of the columns being directed toward a portrait state 352 and a landscape state 356, and a portion of the rows being directed toward a portrait state 360 and a landscape state 364.
In Fig. 3B, the key indicates that "H" represents an input from one or more Hall effect sensors, "A" represents an input from one or more accelerometers, "T" represents an input from a timer, "P" represents a communications trigger input, and "I" represents an image and/or video capture request input. Thus, in the center portion 376 of the chart, an input, or combination of inputs, is shown that represents how device 100 detects a transition from a first physical state to a second physical state.
As discussed, in the center portion of the chart 376, the inputs that are received enable the detection of a transition from, for example, a portrait open state to a landscape easel state ("HAT", shown in bold). For this exemplary transition from the portrait open state to the landscape easel state, a Hall effect sensor input ("H"), an accelerometer input ("A"), and a timer input ("T") may be needed. The timer input can be derived from, for example, a clock associated with the processor.
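The Fig. 3B lookup can be sketched as a small table. Only the one example the text gives (the bold "HAT" entry) is filled in; the key strings and any unlisted entries are assumptions for illustration:

```python
def transition_inputs(from_state, to_state):
    """Return the set of sensor inputs consulted to detect a given
    state-to-state transition, per the Fig. 3B key: "H" = Hall effect
    sensor, "A" = accelerometer, "T" = timer."""
    table = {
        # Portrait open -> landscape easel requires H, A, and T inputs.
        ("portrait_open", "landscape_easel"): {"H", "A", "T"},
    }
    return table.get((from_state, to_state), set())
```

A state machine would poll or subscribe to exactly these sources when watching for the corresponding transition, ignoring the others.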
In addition to the portrait and landscape states, a docked state 368 is also shown, which is triggered based on the receipt of a docking signal 372. As discussed above in relation to Fig. 3, the docking signal can be triggered by the association of device 100 with one or more other devices 100, accessories, peripherals, smart docks, or the like.
User interaction:
Figures 4A through 4H depict various graphical representations of gesture inputs that may be recognized by the screens 104, 108. The gestures may be performed not only by a user's body part, such as a digit, but also by other devices, such as a stylus, that may be sensed by the contact-sensing portion(s) of a screen 104, 108. In general, gestures are interpreted differently based on where the gestures are performed (either directly on the display 110, 114 or in the gesture capture region 120, 124). For example, gestures in the display 110, 114 may be directed to a desktop or application, and gestures in the gesture capture region 120, 124 may be interpreted as being for the system.
With reference to Figures 4A-4H, a first type of gesture, a touch gesture 420, is substantially stationary on a screen 104, 108 for a selected length of time. A circle 428 represents a touch or other contact type received at a particular location of a contact-sensing portion of the screen. The circle 428 may include a border 432, the thickness of which indicates the length of time that the contact is held substantially stationary at the contact location. For instance, a tap 420 (or short press) has a thinner border 432a than the border 432b for a long press 424 (or normal press). The long press 424 may involve a contact that remains substantially stationary on the screen for a longer time period than for a tap 420. As will be appreciated, differently defined gestures may be registered depending upon the length of time that the touch remains stationary prior to contact cessation or movement on the screen.
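Registering a tap versus a long press, as described above, could be sketched as follows. The 0.5 second threshold is an assumed value chosen for illustration; the text says only that the definitions depend on how long the touch remains stationary:

```python
def classify_touch(hold_duration_s, moved):
    """Distinguish a tap 420 from a long press 424: both are
    substantially stationary contacts, differing in how long the
    touch is held before it ends or begins to move."""
    if moved:
        return None  # not a touch gesture; handled as a drag or flick
    return "long_press" if hold_duration_s >= 0.5 else "tap"
```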
With reference to Fig. 4C, a drag gesture 400 on the screen 104, 108 is an initial contact (represented by circle 428) with contact movement 436 in a selected direction. The initial contact 428 may remain stationary on the screen 104, 108 for a certain amount of time, represented by the border 432. The drag gesture typically requires the user to contact an icon, window, or other displayed image at a first location, followed by movement of the contact in a drag direction to a new second location desired for the selected displayed image. The contact movement need not be in a straight line, but may have any path of movement, so long as the contact is substantially continuous from the first location to the second location.
With reference to Fig. 4D, a flick gesture 404 on the screen 104, 108 is an initial contact (represented by circle 428) with truncated contact movement 436 in a selected direction (relative to a drag gesture). In embodiments, a flick has a higher exit velocity for the last movement in the gesture compared to the drag gesture. The flick gesture can, for instance, be a finger snap following initial contact. Compared to a drag gesture, a flick gesture generally does not require continual contact with the screen 104, 108 from the first location of a displayed image to a predetermined second location. The contacted displayed image is moved by the flick gesture in the direction of the flick gesture to the predetermined second location. Although both gestures commonly can move a displayed image from a first location to a second location, the temporal duration and distance of travel of the contact on the screen are generally less for a flick than for a drag gesture.
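The drag/flick distinction described above might be sketched as below. The exit-velocity threshold (in screen-widths per second) is an assumed tuning value, not a figure from the text:

```python
def classify_swipe(distance, liftoff_speed, speed_threshold=1.0):
    """Distinguish a drag 400 from a flick 404.  Both are a contact
    plus movement in a selected direction; the flick ends with a
    markedly higher exit velocity and generally covers less distance
    on the screen."""
    if distance <= 0:
        return None  # stationary contact; handled as a tap/long press
    return "flick" if liftoff_speed >= speed_threshold else "drag"
```

In practice, the liftoff speed would be estimated from the last few contact samples reported by the touch-screen controller before the contact ends.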
With reference to Fig. 4E, a pinch gesture 408 on the screen 104, 108 is depicted. The pinch gesture 408 may be initiated by a first contact 428a to the screen 104, 108 (by a first finger, for example) and a second contact 428b to the screen 104, 108 (by a second finger, for example). The first and second contacts 428a,b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104 or 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, as represented by the border 432a, and the second contact 428b is held for a second amount of time, as represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a,b generally occur substantially simultaneously. The first and second contacts 428a,b generally also include corresponding first and second contact movements 436a,b, respectively. The first and second contact movements 436a,b are generally in opposing directions. Stated another way, the first contact movement 436a is toward the second contact 436b, and the second contact movement 436b is toward the first contact 436a. More simply stated, the pinch gesture 408 may be accomplished by a user's digits touching the screen 104, 108 in a pinching motion.
With reference to Fig. 4F, a spread gesture 410 on the screen 104, 108 is depicted. The spread gesture 410 may be initiated by a first contact 428a to the screen 104, 108 (by a first finger, for example) and a second contact 428b to the screen 104, 108 (by a second finger, for example). The first and second contacts 428a,b may be detected by a common contact-sensing portion of a common screen 104, 108, by different contact-sensing portions of a common screen 104, 108, or by different contact-sensing portions of different screens. The first contact 428a is held for a first amount of time, as represented by the border 432a, and the second contact 428b is held for a second amount of time, as represented by the border 432b. The first and second amounts of time are generally substantially the same, and the first and second contacts 428a,b generally occur substantially simultaneously. The first and second contacts 428a,b generally also include corresponding first and second contact movements 436a,b, respectively. The first and second contact movements 436a,b are generally in a common direction. Stated another way, the first and second contact movements 436a,b are away from the first and second contacts 428a,b. More simply stated, the spread gesture 410 may be accomplished by a user's digits touching the screen 104, 108 in a spreading motion.
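The opposing-motion property that separates the pinch 408 from the spread 410 can be checked by comparing the distance between the two contacts at the start and end of their movements. The sketch below is an illustrative assumption about how such a check might look, not the specification's implementation.

```python
import math

# Illustrative sketch: a pinch brings the two contacts 428a,b closer
# together; a spread moves them apart.

def classify_two_finger(start_a, start_b, end_a, end_b):
    """Classify a two-contact gesture from start/end (x, y) positions."""
    d0 = math.dist(start_a, start_b)  # initial separation of the contacts
    d1 = math.dist(end_a, end_b)      # separation after movements 436a,b
    if d1 < d0:
        return "pinch"   # gesture 408: movements in opposing directions, inward
    if d1 > d0:
        return "spread"  # gesture 410: movements away from the contacts
    return "none"
```

Two fingers converging register as a pinch; the same fingers diverging register as a spread.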
The above gestures may be combined in any manner, such as those shown by Figs. 4G and 4H, to produce a determined functional result. For example, in Fig. 4G, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction away from the tap gesture 420. In Fig. 4H, a tap gesture 420 is combined with a drag or flick gesture 412 in a direction toward the tap gesture 420.
The functional result of receiving a gesture can vary depending on a number of factors, including a state of the device 100, display 110, 114, or screen 104, 108, a context associated with the gesture, or a sensed location of the gesture. The state of the device commonly refers to one or more of a configuration of the device 100, a display orientation, and user and other inputs received by the device 100. Context commonly refers to one or more of the particular application(s) selected by the gesture and the portion(s) of the application currently executing, whether the application is a single- or multi-screen application, and whether the application is a multi-screen application displaying one or more windows in one or more stacks or in one or more screens. A sensed location of the gesture commonly refers to whether the set of sensed gesture location coordinates are on a touch sensitive display 110, 114 or on a gesture capture region 120, 124, whether the set of sensed gesture location coordinates are associated with a common or different display or screen 104, 108, and/or what portion of the gesture capture region contains the set of sensed gesture location coordinates.
A tap, when received by a touch sensitive display 110, 114, can be used, for instance, to select an icon to initiate or terminate execution of a corresponding application, to maximize or minimize a window, to reorder windows in a stack, and to provide user input such as by a keyboard display or other displayed image. A drag, when received by a touch sensitive display 110, 114, can be used, for instance, to relocate an icon or window to a desired location within a display, to reorder a window stack on a display, or to span both displays (such that the selected window occupies a portion of each display simultaneously). A flick, when received by a touch sensitive display 110, 114 or a gesture capture region 120, 124, can be used to relocate a window from a first display to a second display or to span both displays (such that the selected window occupies a portion of each display simultaneously). Unlike the drag gesture, however, the flick gesture is generally not used to move the displayed image to a specific user-selected location but to a default location that is not configurable by the user.
The pinch gesture, when received by a touch sensitive display 110, 114 or a gesture capture region 120, 124, can be used to minimize or increase the displayed area or size of a window (typically when received entirely by a common display), to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (a "pop-up window" that displays the windows in the stack). The spread gesture, when received by a touch sensitive display 110, 114 or a gesture capture region 120, 124, can be used to maximize or reduce the displayed area or size of a window, to switch windows displayed at the top of the stack on each display to the top of the stack of the other display (typically when received by different displays or screens), or to display an application manager (typically when received by an off-screen gesture capture region on the same or different screens).
The combined gestures of Fig. 4G, when received by a common display capture region in a common display or screen 104, 108, can be used to hold a first window stack location in a first stack constant for a display receiving the gesture while reordering a second window stack location in a second window stack to include a window in the display receiving the gesture. The combined gestures of Fig. 4H, when received by different display capture regions in a common display or screen 104, 108 or in different displays or screens, can be used to hold a first window stack location in a first window stack constant for a display receiving the tap part of the gesture while reordering a second window stack location in a second window stack to include a window in the display receiving the flick or drag gesture. Although specific gestures and gesture capture regions in the preceding examples have been associated with corresponding sets of functional results, it is to be appreciated that these associations can be redefined in any manner to produce differing associations between gestures and/or gesture capture regions and/or functional results.
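The redefinable association between a received gesture, its receiving region, and the resulting function can be pictured as a lookup table. The entries below are a simplified, assumed subset of the behaviors described above, for illustration only.

```python
# Illustrative sketch: a redefinable mapping from (gesture, region) pairs
# to functional results, mirroring a subset of the behaviors above.

ASSOCIATIONS = {
    ("tap", "display"): "select icon or provide input",
    ("drag", "display"): "relocate window to user-selected location",
    ("flick", "display"): "relocate window to default location",
    ("flick", "gesture_capture"): "relocate window to default location",
    ("pinch", "display"): "resize window or switch stack tops",
    ("spread", "display"): "resize window or switch stack tops",
}

def function_result(gesture, region):
    """Look up the functional result; unknown pairs produce no action."""
    return ASSOCIATIONS.get((gesture, region), "no action")
```

Because the mapping is plain data, the associations can be redefined in any manner, as the text notes.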
Firmware and Software:
The memory 508 may store, and the processor 504 may execute, one or more software components. These components can include at least one operating system (OS) 516, an application manager 562, a desktop 566, and/or one or more applications 564a and/or 564b from an application store 560. The OS 516 can include a framework 520, one or more frame buffers 548, one or more drivers 512, previously described in conjunction with Fig. 2, and/or a kernel 518. The OS 516 can be any software, consisting of programs and data, that manages computer hardware resources and provides common services for the execution of various applications 564. The OS 516 can be any operating system and, at least in some embodiments, dedicated to mobile devices, including, but not limited to, Linux, ANDROID™, iPhone OS (IOS™), WINDOWS PHONE 7™, etc. The OS 516 is operable to provide functionality to the phone by executing one or more operations, as described herein.
The applications 564 can be any higher level software that executes particular functionality for the user. Applications 564 can include programs such as email clients, web browsers, texting applications, games, media players, office suites, etc. The applications 564 can be stored in an application store 560, which may represent any memory or data storage, and the management software associated therewith, for storing the applications 564. Once executed, the applications 564 may run in a different area of memory 508.
The framework 520 may be any software or data that allows the multiple tasks running on the device to interact. In embodiments, at least portions of the framework 520 and the discrete components described hereinafter may be considered part of the OS 516 or an application 564. However, these portions will be described as part of the framework 520, although those components are not so limited. The framework 520 can include, but is not limited to, a Multi-Display Management (MDM) module 524, a Surface Cache module 528, a Window Management module 532, an Input Management module 536, a Task Management module 540, an Application Model Manager 542, a Display Controller, one or more frame buffers 548, a task stack 552, one or more window stacks 550 (which are logical arrangements of windows and/or desktops in a display area), and/or an event buffer 556.
The MDM module 524 includes one or more modules operable to manage the display of applications or other data on the screens of the device. An embodiment of the MDM module 524 is described in conjunction with Fig. 5B. In embodiments, the MDM module 524 receives inputs from the other OS 516 components, such as the drivers 512, and from the applications 564 to continually determine the state of the device 100. The inputs assist the MDM module 524 in determining how to configure and allocate the displays according to the applications' preferences and requirements and the user's actions. Once a determination of the display configuration is made, the MDM module 524 can bind the applications 564 to a display. The configuration may then be provided to one or more other components to generate a window with a display.
The Surface Cache module 528 includes any memory or storage, and the software associated therewith, to store or cache one or more images of windows. A series of active and/or non-active windows (or other display objects, such as a desktop display) can be associated with each display. An active window (or other display object) is currently displayed. A non-active window (or other display object) was opened and, at some time, displayed but is now not displayed. To enhance the user experience, before a window transitions from an active state to an inactive state, a "screen shot" of a last generated image of the window (or other display object) can be stored. The Surface Cache module 528 may be operable to store a bitmap of the last active image of a window (or other display object) not currently displayed. Thus, the Surface Cache module 528 stores the images of non-active windows (or other display objects) in a data store.
In embodiments, the Window Management module 532 is operable to manage active and non-active windows (or other display objects) on each of the displays. The Window Management module 532, based on information from the MDM module 524, the OS 516, or other components, determines when a window (or other display object) is visible or not active. The Window Management module 532 may then put a non-visible window (or other display object) in a "not active state" and, in conjunction with the Task Management module 540, suspend the application's operation. Further, the Window Management module 532 may assign, through collaborative interaction with the MDM module 524, a display identifier to the window (or other display object) or manage one or more other items of data associated with the window (or other display object). The Window Management module 532 may also provide the stored information to the application 564, the Task Management module 540, or other components interacting with or associated with the window (or other display object). The Window Management module 532 can also associate an input task with a window based on window focus and display coordinates within the motion space.
The Input Management module 536 is operable to manage events that occur with the device. An event is any input into the window environment, for example, a user interface interaction with a user. The Input Management module 536 receives the events and logically stores the events in an event buffer 556. Events can include such user interface interactions as a "down event," which occurs when a screen 104, 108 receives a touch signal from a user, a "move event," which occurs when the screen 104, 108 determines that a user's finger is moving across the screen, an "up event," which occurs when the screen 104, 108 determines that the user has stopped touching the screen 104, 108, etc. These events are received, stored, and forwarded to other modules by the Input Management module 536. The Input Management module 536 may also map screen inputs to a motion space, which is the culmination of all physical and virtual displays available on the device.
The motion space is a virtualized space that includes all touch sensitive displays 110, 114 "tiled" together to mimic the physical dimensions of the device 100. For example, when the device 100 is unfolded, the motion space size may be 960 x 800, which may be the number of pixels in the combined display area of both touch sensitive displays 110, 114. If a user touches the first touch sensitive display 110 at location (40, 40), a full-screen window can receive a touch event with location (40, 40). If a user touches the second touch sensitive display 114 at location (40, 40), the full-screen window can receive a touch event with location (520, 40), because the second touch sensitive display 114 is to the right of the first touch sensitive display 110, so the device 100 can offset the touch by the width of the first touch sensitive display 110, which is 480 pixels. When a hardware event occurs with location information from the driver 512, the framework 520 can up-scale the physical location to the motion space, because the location of the event may differ based on the device orientation and state. The motion space may be as described in U.S. Patent Application No. 13/187,026, filed July 20, 2011, entitled "Systems and Methods for Receiving Gesture Inputs Spanning Multiple Input Devices," which is hereby incorporated by reference in its entirety for all that it teaches and for all purposes.
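The coordinate offset described above can be sketched directly from the example numbers in the text (two 480-pixel-wide displays tiled into a 960 x 800 motion space). The function name and display indexing below are illustrative assumptions.

```python
# Illustrative sketch of the motion space: the two displays are tiled
# side by side, so a touch on the second display is offset by the
# 480 px width of the first display, per the example in the text.

DISPLAY_WIDTH = 480  # pixels per display when the device 100 is unfolded

def to_motion_space(display_index, x, y):
    """Map a physical touch (x, y) on display 0 or 1 to motion-space
    coordinates within the combined 960 x 800 virtual space."""
    return (x + display_index * DISPLAY_WIDTH, y)
```

Using the text's example: a touch at (40, 40) on the first display maps to (40, 40), while the same touch on the second display maps to (520, 40).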
A task can be an application, and a sub-task can be an application component that provides a window with which users can interact to do something, such as dial the phone, take a photo, send an email, or view a map. Each task may be given a window in which to draw a user interface. The window typically fills a display (for example, touch sensitive display 110, 114) but may be smaller than the display 110, 114 and float on top of other windows. An application usually consists of multiple sub-tasks that are loosely bound to each other. Typically, one task in an application is specified as the "main" task, which is presented to the user when launching the application for the first time. Each task can then start another task or sub-task to perform different actions.
The Task Management module 540 is operable to manage the operation of one or more applications 564 that may be executed by the device. Thus, the Task Management module 540 can receive signals to launch, suspend, terminate, etc. an application or application sub-task stored in the application store 560. The Task Management module 540 may then instantiate one or more tasks or sub-tasks of the application 564 to begin operation of the application 564. Further, the Task Management module 540 may launch, suspend, or terminate a task or sub-task as a result of user input or as a result of a signal from a collaborating framework 520 component. The Task Management module 540 is responsible for managing the lifecycle of applications (tasks and sub-tasks), from when the application is launched to when the application is terminated.
The processing of the Task Management module 540 is facilitated by a task stack 552, which is a logical structure associated with the Task Management module 540. The task stack 552 maintains the state of all tasks and sub-tasks on the device 100. When some component of the OS 516 requires a task or sub-task to transition in its lifecycle, the OS 516 component can notify the Task Management module 540. The Task Management module 540 may then locate the task or sub-task in the task stack 552 using identification information and send a signal to the task or sub-task indicating what kind of lifecycle transition the task needs to execute. Informing the task or sub-task of the transition allows the task or sub-task to prepare for the lifecycle state transition. The Task Management module 540 can then execute the state transition for the task or sub-task. In embodiments, the state transition may entail triggering the OS kernel 518 to terminate the task when termination is required.
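The locate-then-signal flow described above can be pictured with a minimal sketch. The class names, state names, and method signatures below are assumptions made for illustration, not the specification's implementation.

```python
# Illustrative sketch: the Task Management module 540 locating a task
# in the task stack 552 by its identification information and signaling
# a lifecycle transition. State names are assumed.

class Task:
    def __init__(self, task_id):
        self.task_id = task_id
        self.state = "stopped"

    def transition(self, new_state):
        # The notified task prepares for, then performs, the transition.
        self.state = new_state

class TaskStack:
    """Logical structure maintaining the state of all tasks on the device."""
    def __init__(self):
        self._tasks = {}

    def add(self, task):
        self._tasks[task.task_id] = task

    def signal_transition(self, task_id, new_state):
        task = self._tasks[task_id]   # locate using identification information
        task.transition(new_state)    # indicate the required lifecycle change
        return task.state
```

An OS component that needs, say, an email task suspended would look it up by identifier and signal the new state.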
Further, the Task Management module 540 may suspend the application 564 based on information from the Window Management module 532. Suspending the application 564 may maintain application data in memory but may limit or stop the application 564 from rendering a window or user interface. Once the application becomes active again, the Task Management module 540 can again trigger the application to render its user interface. In embodiments, if a task is suspended, the task may save the task's state in case the task is terminated. In the suspended state, the application task may not receive input, because the application window is not visible to the user.
The frame buffer 548 is a logical structure that is used to render the user interface. The frame buffer 548 can be created and destroyed by the OS kernel 518. However, the Display Controller 544 can write the image data, for the visible windows, into the frame buffer 548. A frame buffer 548 can be associated with one screen or multiple screens. The association of a frame buffer 548 with a screen can be controlled dynamically by interaction with the OS kernel 518. A composite display may be created by associating multiple screens with a single frame buffer 548. Graphical data used to render an application's window user interface may then be written to the single frame buffer 548, for the composite display, which is output to the multiple screens 104, 108. The Display Controller 544 can direct an application's user interface to a portion of the frame buffer 548 that is mapped to a particular display 110, 114, thus displaying the user interface on only one screen 104 or 108. The Display Controller 544 can extend the control over user interfaces to multiple applications, controlling the user interfaces for as many displays as are associated with a frame buffer 548 or a portion thereof. This approach compensates for the multiple physical screens 104, 108 that are in use by the software components above the Display Controller 544.
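The composite-display arrangement above, where one frame buffer serves both screens and each screen is mapped to a portion of it, can be sketched as follows. The class shape, field names, and slicing rule are assumptions for illustration.

```python
# Illustrative sketch: a single frame buffer 548 serving a composite
# display, with each screen mapped to a horizontal slice of the buffer.
# Dimensions follow the 960 x 800 motion-space example in the text.

class FrameBuffer:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.screens = []  # screens dynamically associated with this buffer

    def associate(self, screen_id, x_offset, width):
        """Map a screen to the slice of the buffer starting at x_offset."""
        self.screens.append(
            {"screen": screen_id, "x_offset": x_offset, "width": width})

# Composite display: both screens share one buffer, side by side.
composite = FrameBuffer(960, 800)
composite.associate(104, x_offset=0, width=480)    # primary screen 104
composite.associate(108, x_offset=480, width=480)  # secondary screen 108
```

Writing a window's graphics into only the 0-479 slice would, under this arrangement, display it on screen 104 alone.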
The Application Manager 562 is an application that provides a presentation layer for the window environment. Thus, the Application Manager 562 provides the graphical model for rendering by the Task Management module 540. Likewise, the Desktop 566 provides the presentation layer for the application store 560. Thus, the desktop provides a graphical model of a surface having selectable application icons for the applications 564 in the application store 560 that can be provided to the window manager 556 for rendering.
Further, the framework can include an Application Model Manager (AMM) 542. The Application Manager 562 may interface with the AMM 542. In embodiments, the AMM 542 receives state change information from the device 100 regarding the state of applications (which are running or suspended). The AMM 542 can associate bitmap images from the Surface Cache module 528 with the tasks that are alive (running or suspended). Further, the AMM 542 can convert the logical window stack maintained in the Task Management module 540 to a linear ("film strip" or "deck of cards") organization that the user perceives when using the off-screen gesture capture region 120 to sort through the windows. Further, the AMM 542 may provide a list of executing applications to the Application Manager 562.
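The conversion from the logical window stacks to the linear "film strip" organization described above can be sketched as a simple flattening step. The function name and the top-of-stack-first ordering are illustrative assumptions.

```python
# Illustrative sketch: the AMM 542 converting per-display logical window
# stacks into the single linear ("film strip" / "deck of cards") order the
# user perceives when sorting windows via the gesture capture region.

def to_linear_organization(stacks):
    """Flatten a list of window stacks (each listed top-of-stack first)
    into one linear sequence, left-to-right across the stacks."""
    linear = []
    for stack in stacks:
        linear.extend(stack)
    return linear
```

For example, stacks [["mail", "browser"], ["maps"]] would be perceived as the single strip ["mail", "browser", "maps"].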
An embodiment of the MDM module 524 is shown in Fig. 5B. The MDM module 524 is operable to determine the state of the environment for the device, including, but not limited to, the orientation of the device, whether the device 100 is opened or closed, what applications 564 are executing, how the applications 564 are to be displayed, what actions the user is conducting, the tasks being displayed, etc. To configure the display, the MDM module 524 interprets these environmental factors and determines a display configuration, as described in conjunction with Figs. 6A-6J. Then, the MDM module 524 can bind the applications 564 or other device components to the displays. The configuration may then be sent to the Display Controller 544 and/or the other components within the OS 516 to generate the display. The MDM module 524 can include one or more of, but is not limited to, a Display Configuration module 568, a Preferences module 572, a Device State module 574, a Gesture module 576, a Requirements module 580, an Event module 584, and/or a Binding module 588.
The Display Configuration module 568 determines the layout for the display. In embodiments, the Display Configuration module 568 can determine the environmental factors. The environmental factors may be received from one or more other MDM modules 524 or from other sources. The Display Configuration module 568 can then determine, from the list of factors, the best configuration for the display. Some embodiments of the possible configurations, and the factors associated therewith, are described in conjunction with Figs. 6A-6F.
The Preferences module 572 is operable to determine display preferences for an application 564 or other component. For example, an application can have a preference for single or dual displays. The Preferences module 572 can determine an application's display preference (e.g., by inspecting the application's preference settings) and may allow the application 564 to change to a mode (e.g., single screen, dual screen, max, etc.) if the device 100 is in a state that can accommodate the preferred mode. However, some user interface policies may disallow a mode even though the mode is available. As the configuration of the device changes, the preferences may be reviewed to determine if a better display configuration can be achieved for the application 564.
The Device State module 574 is operable to determine or receive the state of the device. The state of the device can be as described in conjunction with Figs. 3A and 3B. The state of the device can be used by the Display Configuration module 568 to determine the configuration for the display. As such, the Device State module 574 may receive inputs and interpret the state of the device. The state information is then provided to the Display Configuration module 568.
The Gesture module 576 is shown as part of the MDM module 524, but, in embodiments, the Gesture module 576 may be a separate framework 520 component that is distinct from the MDM module 524. In embodiments, the Gesture module 576 is operable to determine if the user is conducting any actions on any part of the user interface. In alternative embodiments, the Gesture module 576 receives user interface actions from the configurable areas 112, 116 only. The Gesture module 576 can receive touch events that occur on the configurable areas 112, 116 (or possibly other user interface areas) by way of the Input Management module 536 and may interpret the touch events (with direction, speed, distance, duration, and various other parameters) to determine what kind of gesture the user is performing. When a gesture is interpreted, the Gesture module 576 can initiate the processing of the gesture and, by collaborating with other framework 520 components, can manage the required window animation. The Gesture module 576 collaborates with the Application Model Manager 542 to collect state information with respect to which applications are running (active or paused) when a user gesture is performed and the order in which applications must appear. The Gesture module 576 may also receive references to bitmaps (from the Surface Cache module 528) and live windows so that, when a gesture occurs, it can instruct the Display Controller 544 how to move the window(s) across the display 110, 114. Thus, suspended applications may appear to be running when their windows are moved across the display 110, 114.
Further, the Gesture module 576 can receive task information either from the Task Management module 540 or the Input Management module 536. The gestures may be as defined in conjunction with Figs. 4A through 4H. For example, moving a window causes the display to render a series of display frames that illustrate the window moving. The gesture associated with such a user interface interaction can be received and interpreted by the Gesture module 576. The information about the user gesture is then sent to the Task Management module 540 to modify the display binding of the task.
The Requirements module 580, similar to the Preferences module 572, is operable to determine display requirements for an application 564 or other component. An application can have a set of display requirements that must be observed. Some applications require a particular display orientation. For example, the application "Angry Birds" can only be displayed in landscape orientation. This type of display requirement can be determined or received by the Requirements module 580. As the orientation of the device changes, the Requirements module 580 can re-determine the display requirements for the application 564. The Display Configuration module 568 can generate a display configuration that is in accordance with the application display requirements, as provided by the Requirements module 580.
The Event module 584, similar to the Gesture module 576, is operable to determine one or more events occurring with an application or other component that can affect the user interface. Thus, the Event module 584 can receive event information either from the event buffer 556 or the Task Management module 540. These events can change how the tasks are bound to the displays. The Event module 584 can collect state change information from other framework 520 components and act upon that state change information. In an example, when the phone is opened or closed, or when an orientation change has occurred, a new message may be rendered on a secondary screen. The state change based on the event can be received and interpreted by the Event module 584. The information about the events may then be sent to the Display Configuration module 568 to modify the configuration of the display.
The Binding module 588 is operable to bind the applications 564 or the other components to the configuration determined by the Display Configuration module 568. A binding associates, in memory, the display configuration for each application with the display and mode of the application. Thus, the Binding module 588 can associate an application with a display configuration for the application (e.g., landscape, portrait, multi-screen, etc.). Then, the Binding module 588 may assign a display identifier to the display. The display identifier associates the application with a particular display of the device 100. This binding is then stored and provided to the Display Controller 544, the other components of the OS 516, or other components to properly render the display. The binding is dynamic and can change or be updated based on configuration changes associated with events, gestures, state changes, application preferences or requirements, etc.
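The binding record described above, which associates an application with a mode and a display identifier, can be sketched minimally. The record shape and field names are illustrative assumptions.

```python
# Illustrative sketch: the Binding module 588 associating an application
# with its display mode and a display identifier. Field names are assumed.

VALID_MODES = ("portrait", "landscape", "multi-screen")

def bind(app_name, mode, display_id):
    """Return a binding record associating an application with a display
    mode and the identifier of the display that will render it."""
    if mode not in VALID_MODES:
        raise ValueError(f"unknown display mode: {mode}")
    return {"app": app_name, "mode": mode, "display": display_id}
```

Because the binding is plain data, it can be stored, handed to the display controller, and replaced dynamically when an event or state change calls for an update.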
User Interface Configurations:
With reference now to Figs. 6A-J, various types of output configurations made possible by the device 100 will be described hereinafter.
Figs. 6A and 6B depict two different output configurations of the device 100 being in a first state. Specifically, Fig. 6A depicts the device 100 being in a closed portrait state 304 where the data is displayed on the primary screen 104. In this example, the device 100 displays data via the touch sensitive display 110 in a first portrait configuration 604. As can be appreciated, the first portrait configuration 604 may only display a desktop or operating system home screen. Alternatively, one or more windows may be presented in a portrait orientation while the device 100 is displaying data in the first portrait configuration 604.
Fig. 6B depicts the device 100 still being in the closed portrait state 304, but instead data is displayed on the secondary screen 108. In this example, the device 100 displays data via the touch sensitive display 114 in a second portrait configuration 608.
It may be possible to display similar or different data in either the first or second portrait configuration 604, 608. It may also be possible to transition between the first portrait configuration 604 and the second portrait configuration 608 by providing the device 100 with a user gesture (e.g., a double tap gesture), a menu selection, or other means. Other suitable gestures may also be employed to transition between configurations. Furthermore, it may also be possible to transition the device 100 from the first or second portrait configuration 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to.
An alternative output configuration may be accommodated by the device 100 being in a second state. Specifically, Fig. 6C depicts a third portrait configuration where data is displayed simultaneously on both the primary screen 104 and the secondary screen 108. The third portrait configuration may be referred to as a Dual-Portrait (PD) output configuration. In the PD output configuration, the touch sensitive display 110 of the primary screen 104 depicts data in the first portrait configuration 604 while the touch sensitive display 114 of the secondary screen 108 depicts data in the second portrait configuration 608. The simultaneous presentation of the first portrait configuration 604 and the second portrait configuration 608 may occur when the device 100 is in an open portrait state 320. In this configuration, the device 100 may display one application window in one display 110 or 114, two application windows (one in each display 110 and 114), one application window and one desktop, or one desktop. Other configurations may be possible. It should be appreciated that it may also be possible to transition the device 100 from the simultaneous display of configurations 604, 608 to any other configuration described herein, depending upon which state the device 100 is moved to. Furthermore, while in this state, an application's display preference may place the device into a bilateral mode, in which both displays are active to display different windows in the same application. For example, a camera application may display a viewfinder and controls on one side, while the other side displays a mirrored preview that can be seen by the photo subjects. Games involving simultaneous play by two players may also take advantage of the bilateral mode.
Figs. 6D and 6E depict two further output configurations of the device 100 in a third state. Specifically, Fig. 6D depicts the device 100 in the closed landscape state 340 with data displayed on the main screen 104. In this example, the device 100 displays data via the touch-sensitive display 110 in a first landscape configuration 612. Much like the other configurations described herein, the first landscape configuration 612 may display a desktop, a home screen, one or more windows displaying application data, and so on.
Fig. 6E depicts the device 100 still in the closed landscape state 340, but with data displayed on the auxiliary screen 108. In this example, the device 100 displays data via the touch-sensitive display 114 in a second landscape configuration 616. The first or second landscape configuration 612, 616 may display similar or different data. Transitioning between the first landscape configuration 612 and the second landscape configuration 616 may be accomplished by providing the device 100 with one or both of a twist-and-tap gesture or a flick-and-slide gesture. Other suitable gestures may also be employed to transition between configurations. Furthermore, the device 100 may also be transitioned from the first or second landscape configuration 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6F depicts a third landscape configuration in which data is displayed simultaneously on the main screen 104 and the auxiliary screen 108. The third landscape configuration may be referred to as a Landscape-Dual (LD) output configuration. In the LD output configuration, the touch-sensitive display 110 of the main screen 104 depicts data in the first landscape configuration 612, while the touch-sensitive display 114 of the auxiliary screen 108 depicts data in the second landscape configuration 616. The simultaneous presentation of the first landscape configuration 612 and the second landscape configuration 616 may occur when the device 100 is in the open landscape state 348. It should be appreciated that the device 100 may also be transitioned from the simultaneous display of configurations 612, 616 to any other configuration described herein, depending upon which state the device 100 is moved to.
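The relationships among the device states and output configurations described above can be summarized, purely for illustration, as a small state-to-configuration mapping. The `DeviceState` and `OutputConfig` names and the mapping function below are illustrative assumptions, not part of the disclosure:

```python
# Illustrative sketch: which output configurations are available in which
# device state. Names are hypothetical; numeric values echo the reference
# numerals used in the figures where one exists.
from enum import Enum, auto

class DeviceState(Enum):
    CLOSED_PORTRAIT = auto()
    OPEN_PORTRAIT = auto()       # open portrait state 320
    CLOSED_LANDSCAPE = auto()    # closed landscape state 340
    OPEN_LANDSCAPE = auto()      # open landscape state 348

class OutputConfig(Enum):
    PORTRAIT_FIRST = 604         # first portrait configuration
    PORTRAIT_SECOND = 608        # second portrait configuration
    PORTRAIT_DUAL = 1            # PD: both displays in portrait
    LANDSCAPE_FIRST = 612        # first landscape configuration
    LANDSCAPE_SECOND = 616       # second landscape configuration
    LANDSCAPE_DUAL = 2           # LD: both displays in landscape

def configs_for_state(state):
    """Return the output configurations a given device state accommodates."""
    if state is DeviceState.CLOSED_PORTRAIT:
        return [OutputConfig.PORTRAIT_FIRST, OutputConfig.PORTRAIT_SECOND]
    if state is DeviceState.OPEN_PORTRAIT:
        return [OutputConfig.PORTRAIT_DUAL]
    if state is DeviceState.CLOSED_LANDSCAPE:
        return [OutputConfig.LANDSCAPE_FIRST, OutputConfig.LANDSCAPE_SECOND]
    return [OutputConfig.LANDSCAPE_DUAL]
```

An orientation-change event would, under this sketch, simply look up the configurations for the new state and pick one to render.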
Figs. 6G and 6H depict two views of the device 100 in yet another state. Specifically, the device 100 is depicted as being in the support state 312. Fig. 6G shows that a first support output configuration 618 may be displayed on the touch-sensitive display 110. Fig. 6H shows that a second support output configuration 620 may be displayed on the touch-sensitive display 114.
The device 100 may be configured to depict either the first support output configuration 618 or the second support output configuration 620 individually. Alternatively, both support output configurations 618, 620 may be presented simultaneously. In some embodiments, the support output configurations 618, 620 may be similar or identical to the landscape output configurations 612, 616. The device 100 may also be configured to display one or both of the support output configurations 618, 620 while in the modified support state 316. It should be appreciated that simultaneous utilization of the support output configurations 618, 620 may facilitate two-player games (e.g., chess, Chinese checkers, etc.), multi-user conferences in which two or more users share the same device 100, and other applications. As can be appreciated, the device 100 may also be transitioned from the display of one or both configurations 618, 620 to any other configuration described herein, depending upon which state the device 100 is moved to.
Fig. 6I depicts yet another output configuration that may be accommodated while the device 100 is in the open portrait state 320. Specifically, the device 100 may be configured to present a single continuous image spanning both touch-sensitive displays 110, 114 in a portrait configuration referred to herein as a Portrait-Max (PMax) configuration 624. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split such that a portion of the data is displayed on one of the touch-sensitive displays while the remainder of the data is displayed on the other touch-sensitive display. The PMax configuration 624 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, the device 100 may be transitioned from the PMax configuration 624 to any other output configuration described herein, depending upon which state the device 100 is moved to.
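A minimal sketch of the splitting behavior in a "max" configuration — one continuous image divided between the two touch-sensitive displays — might look like the following. The function name, the `seam` parameter, and the nested-list stand-in for pixel data are illustrative assumptions:

```python
# Illustrative sketch: split a single continuous image at the seam between
# the two touch-sensitive displays, as in the PMax/LMax configurations.
def split_for_max_config(image_rows, seam):
    """Split each row of pixels at the seam.

    Returns (left_half, right_half): the portion to render on the
    touch-sensitive display 110 and the portion for the touch-sensitive
    display 114.
    """
    left = [row[:seam] for row in image_rows]
    right = [row[seam:] for row in image_rows]
    return left, right

# A 2x4 "image"; with the seam at column 2, each display shows a 2x2 half.
image = [[0, 1, 2, 3],
         [4, 5, 6, 7]]
left, right = split_for_max_config(image, seam=2)
```

The same split applies in either orientation; only the axis along which the seam runs differs between the PMax and LMax cases.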
Fig. 6J depicts yet another output configuration that may be accommodated while the device 100 is in the open landscape state 348. Specifically, the device 100 may be configured to present a single continuous image spanning both touch-sensitive displays 110, 114 in a landscape configuration referred to herein as a Landscape-Max (LMax) configuration 628. In this configuration, data (e.g., a single image, application, window, icon, video, etc.) may be split such that a portion of the data is displayed on one of the touch-sensitive displays while the remainder of the data is displayed on the other touch-sensitive display. The LMax configuration 628 may facilitate a larger display and/or better resolution for displaying a particular image on the device 100. Similar to the other output configurations, the device 100 may be transitioned from the LMax configuration 628 to any other output configuration described herein, depending upon which state the device 100 is moved to.
As shown in Figs. 7A and 7B, the device 100 manages desktops and/or windows with at least one window stack 700. The window stack 700 is a logical arrangement of active and/or inactive windows or display objects for a multi-screen device. For example, as shown in Figs. 7A and 7B, the window stack 700 may be logically similar to a deck of cards or a stack of bricks, in which one or more windows or display objects (e.g., desktops) are arranged in order. An active window is a window currently being displayed on at least one of the touch-sensitive displays 110, 114. For example, window 1 708 is an active window and is displayed on at least one of the touch-sensitive displays 110, 114. In the embodiment shown in Fig. 7A, the device 100 is in the closed state 304, displaying window 1 708 in configuration 608. An inactive window is a window that was opened and displayed but is now "behind" an active window and is not being displayed. In embodiments, an inactive window may belong to an application that is suspended, and thus the window is not displaying active content. For example, window 2 712 and window 3 716 are inactive windows.
The window stack 700 may have various arrangements or organizational structures. In the embodiment shown in Fig. 7A, with the device 100 in the closed state 304 in the second portrait configuration, the device 100 includes a first stack 704 associated with the first touch-sensitive display 114. Thus, each touch-sensitive display 114 can have an associated window stack 704. In other embodiments, there is a single window stack that encompasses the entire composite display. The composite display is a logical structure that defines the entire display space comprising both touch-sensitive displays 110, 114. The device 100 may have a single stack for the composite display, in which windows or display objects are sized to occupy some or all of the composite display. Thus, the stack 704 may represent a portion of a larger composite window stack, a portion of which is not displayed because the device 100 is in the closed state 304.
In Fig. 7B, the two window stacks (or the two portions of a window stack) 704, 724 may have different numbers of windows or display objects arranged in each stack 704, 724. Further, the two window stacks 704, 724 may also be identified differently and managed separately. As shown in Fig. 7A, the first window stack 704 may be arranged in order from a first window 708 to a next window 712 to a last window 716 and finally to a desktop 720, which, in embodiments, is at the "bottom" of the window stack 704. In embodiments, the desktop 720 is not always at the "bottom," as application windows may be arranged below the desktop 720 in the window stack, and the desktop 720 may be brought to the "top" of the stack, above the other windows, during a desktop reveal or other orientation change. For example, as shown in Fig. 7B, the device 100 transitions to the open state 320, and the device 100 transitions to the display configuration shown in Fig. 6C. Thus, there is no window associated with the touch-sensitive display 110. Accordingly, the desktop 720 is displayed for the touch-sensitive display 110. Thus, a second stack 724 may include the desktop 720, which, in embodiments, is a single desktop area underlying all the windows in both the window stack 704 and the window stack 724. A logical data structure for managing the two window stacks, or a single window stack having two portions 704, 724, is described in conjunction with Fig. 8.
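The ordering described for the first window stack 704 — window 1 708 at the top, the desktop 720 at the bottom, with the desktop able to be brought to the top — can be sketched as a simple ordered list. The list representation and the helper functions are illustrative assumptions:

```python
# Illustrative sketch of window stack 704: index 0 is the "top" (active)
# window; the desktop sits at the "bottom" by default.
stack_704 = ["window_1_708", "window_2_712", "window_3_716", "desktop_720"]

def active_window(stack):
    """The active window is the one at the top of the stack."""
    return stack[0]

def bring_to_top(stack, name):
    """Re-order the stack so the named window (or desktop) becomes active,
    e.g., during a desktop reveal."""
    stack.remove(name)
    stack.insert(0, name)
```

Under this sketch, a desktop reveal is just `bring_to_top(stack_704, "desktop_720")`; the other windows keep their relative order below it.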
The arrangement of the stack 704 after the device is opened is shown in Figs. 7C through 7E. The window stack 704/724 is shown in three "elevation" views. In Fig. 7C, the top of the window stack 704/724 is shown. Two adjacent sides of the window stack 704/724 are shown in Figs. 7D and 7E. In this embodiment, the window stack 704/724 resembles a pile of bricks, with the windows stacked upon one another. Viewing the window stack 704/724 from the top, as in Fig. 7C, only the windows at the top of the window stack 704/724 are seen in the different portions of the composite display 728. The desktop 720 or a window can occupy some or all of the composite display 728. Fig. 7C shows the composite display 728, which encompasses or includes the display area of both touch-sensitive displays 110, 114. The size of the composite display 728 may change based on the orientation of the device 100. For example, as shown in Fig. 7A, when the device 100 is in the closed state, the composite display 728 may include only the area of one of the touch-sensitive displays 110 or 114. When the device 100 is opened, as shown in Figs. 7B and 7C, the composite display 728 expands to include both touch-sensitive displays 110, 114. As the composite display 728 changes size, windows or display objects associated with the composite display 728 may also be resized. In embodiments, one such display object can be the desktop 720, which may be resized to fill the composite display 728 when the device is opened.
In the embodiment shown, the desktop 720 is the lowest display object, window, or "brick" in the window stack 704/724. Window 1 708, window 2 712, and window 3 716 are stacked upon it. Window 1 708, window 2 712, and window 3 716 occupy only a portion of the composite display 728. Thus, another portion of the stack 724 includes only the desktop 718. Only the top window or display object in any portion of the composite display 728 is actually rendered and displayed. Thus, as shown in the top view of Fig. 7C, window 1 708 and the desktop 718 are displayed at the top of different portions of the window stack 704/724. A window can be sized to occupy only a portion of the composite display 728 to "reveal" windows lower in the window stack 704. For example, the desktop 718 is lower in the stack than window 1 708, window 2 712, and window 3 716, but is still displayed. This arrangement of windows and desktop occurs when a device that is displaying the top window in the window stack 704 is opened.
Where the desktop 718 is positioned in the stack, and how it is placed when the device 100 is opened, can be a function of the orientation of the device 100, the context of which programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the device 100 is opened, and so on. When the device 100 is opened, the logical data structure associated with the desktop 718 or the other windows may not change, but the logical data structure can determine how the windows and desktop are displayed. When a user interface or other event or task changes the arrangement of the stack, the logical structure of the windows or desktop can be changed to reflect the change in arrangement.
Figs. 7F through 7J show another embodiment of a window stack arrangement 724 that, as the device is opened, changes from an arrangement different from that of Figs. 7A through 7C. In the embodiment shown in Fig. 7F, the device 100 is in the closed state 304, displaying window 1 708 in configuration 604. Window 2 712 and window 3 716 are inactive windows. In the embodiment shown in Fig. 7F, with the device 100 in the closed state 304 in the first portrait configuration 604, the device 100 includes a second stack 724 associated with the second touch-sensitive display 110. Thus, the touch-sensitive display 110 can have an associated window stack 724. In other embodiments, there is a single window stack that encompasses the entire composite display. The device 100 may have a single stack for the composite display, in which windows or display objects are sized to occupy some or all of the composite display. Thus, the stack 724 may represent a portion of a larger composite window stack, a portion of which is not displayed because the device 100 is in the closed state 304.
In Fig. 7G, the two window stacks (or the two portions of a window stack) 704, 724 may have different numbers of windows or display objects arranged in each stack 704, 724. Further, the two window stacks 704, 724 may also be identified differently and managed separately. As shown in Fig. 7F, the second window stack 724 may be arranged in order from a first window 708 to a next window 712 to a last window 716 and finally to a desktop 720, which, in embodiments, is at the "bottom" of the window stack 704. In embodiments, the desktop 720 is not always at the "bottom," as application windows may be arranged below the desktop 720 in the window stack, and the desktop 720 may be brought to the "top" of the stack, above the other windows, during a desktop reveal or other orientation change. For example, as shown in Fig. 7G, the device 100 transitions to the open state 320, and the device 100 transitions to the display configuration shown in Fig. 6C. Thus, there is no window associated with the touch-sensitive display 114. Accordingly, the desktop 720 is displayed for the touch-sensitive display 114. Thus, the stack 704 may include the desktop 720, which, in embodiments, is a single desktop area underlying all the windows in both the window stack 704 and the window stack 724. A logical data structure for managing the two window stacks, or a single window stack having two portions 704, 724, is described in conjunction with Fig. 8.
The arrangement of the window stack 724 after the device is opened is shown in Figs. 7H through 7J. The window stack 704/724 is shown in three "elevation" views. In Fig. 7H, the top of the window stack 704/724 is shown. Two adjacent "sides" of the window stack 704/724 are shown in Figs. 7I and 7J. In this embodiment, the window stack 704/724 resembles a pile of bricks, with the windows stacked upon one another. Viewing the window stack 704/724 from the top, as in Fig. 7H, only the windows or display objects at the top of the window stack 704/724 are seen in the different portions of the composite display 728. The desktop 720 or a window can occupy some or all of the composite display 728.
In the embodiment shown, the desktop 720 is the lowest display object, window, or "brick" in the window stack 704/724. Window 1 708, window 2 712, and window 3 716 are stacked upon it. Window 1 708, window 2 712, and window 3 716 occupy only a portion of the composite display 728. Thus, another portion of the stack 724 includes only the desktop 718. Only the top window or display object in any portion of the composite display 728 is actually rendered and displayed. Thus, as shown in the top view of Fig. 7H, window 1 708 and the desktop 720 are displayed at the top of different portions of the window stack 704/724. A window can be sized to occupy only a portion of the composite display 728 to "reveal" windows lower in the window stack 704. For example, the desktop 718 is lower in the stack than window 1 708, window 2 712, and window 3 716, but is still displayed. This arrangement of windows and desktop occurs when a device that is displaying the top window in the window stack 704 is opened.
Where the desktop 718 is positioned in the stack, and how it is placed when the device 100 is opened, can be a function of the orientation of the device 100, the context of which programs, functions, software, etc. are being executed on the device 100, how the stack is positioned when the device 100 is opened, and so on. When the device 100 is opened, the logical data structure associated with the desktop 718 or the other windows may not change, but the logical data structure can determine how the windows and desktop are displayed. When a user interface or other event or task changes the arrangement of the stack, the logical structure of the windows or desktop can be changed to reflect the change in arrangement.
A logical data structure 800 for managing the arrangement of windows or desktops in a window stack is shown in Fig. 8. The logical data structure 800 can be any data structure used to store data, whether an object, record, file, etc. The logical data structure 800 can be stored in any type of database or data storage system, regardless of protocol or standard. In embodiments, the logical data structure 800 includes one or more portions, fields, attributes, etc. that store data in a logical arrangement that allows for easy storage and retrieval of the information. Hereinafter, these one or more portions, fields, attributes, etc. shall be described simply as fields. The fields can store data for a window identifier 804, dimensions 808, a stack position identifier 812, a display identifier 816, and/or an active indicator 820. Each window in a window stack can have an associated logical data structure 800. While only a single logical data structure 800 is shown in Fig. 8, there may be more or fewer logical data structures 800 used with a window stack 700 (based on the number of windows or desktops in the stack), as represented by ellipsis 828. Further, as represented by ellipsis 828, there may be more or fewer fields than those shown in Fig. 8.
The window identifier 804 can include any identifier (ID) that uniquely identifies the associated window or display object in relation to the other windows or display objects in the window stack. The window identifier 804 can be a globally unique identifier (GUID), a numeric ID, an alphanumeric ID, or another type of identifier. In embodiments, the window identifier 804 can be one, two, or any number of digits, based on the number of windows or display objects that can be opened. In alternative embodiments, the size of the window identifier 804 can change based on the number of windows or display objects that are open. While a window or display object is open, the window identifier 804 can be static and remain unchanged.
The dimensions 808 can include the dimensions of the window or display object in the composite display 728. For example, the dimensions 808 can include coordinates for two or more corners of the window or display object, or can include one coordinate together with the width and height of the window or display object. These dimensions 808 can delineate what portion of the composite display 728 the window or display object may occupy, which may be the entire composite display 728 or only a portion of the composite display 728. For example, as shown in Figs. 7C and 7H, window 1 708 can have dimensions 808 indicating that window 1 708 will occupy only a portion of the display area of the composite display 728. As windows or display objects are moved or inserted in the window stack, the dimensions 808 can change.
The stack position identifier 812 can be any identifier that can identify the position of the window or display object in the stack, or the stack position identifier 812 can be inferred from the window's control record within a data structure, such as a list or a stack. The stack position identifier 812 can be a GUID, a numeric ID, an alphanumeric ID, or another type of identifier. Each window or display object can include a stack position identifier 812. For example, as shown in Fig. 7A, window 1 708 in stack 1 704 can have a stack position identifier 812 identifying window 708 as the first window in the stack 704 and as the active window. Similarly, window 2 712 can have a stack position identifier 812 representing that window 2 712 is the second window in the stack 704. Thus, depending on the type of stack, the stack position identifier 812 can represent a window's or display object's position in the stack.
The display identifier 816 can identify that the window or display object is associated with a particular display, such as the first display 110 or the second display 114, or with the composite display 728 composed of both displays. While the display identifier 816 may not be needed for a multi-stack system, as shown in Fig. 7A, the display identifier 816 can indicate whether a window or display object in the serial stack of Fig. 7 is displayed on a particular display. Thus, the desktop 720 can have two portions, as shown in Fig. 7C. The first portion can have a display identifier 816 for the first display 110, while the second portion can have a display identifier 816 for the second display 114. However, in alternative embodiments, the desktop 720 can have a single display identifier 816 that identifies the composite display 728.
Similar to the display identifier 816, an active indicator 820 may not be needed in the dual-stack system of Fig. 7A, as the window or display object in stack position 1 is active and displayed. In alternative embodiments, the active indicator 820 can indicate which window(s) in the stack are being displayed. Thus, window 1 708 can have an active indicator 820 because it is being displayed. The active indicator 820 can be a simple flag or bit that represents whether a window or display object is active or displayed.
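The fields of the logical data structure 800 described above can be sketched as a simple record. The field names, types, and the dataclass representation are illustrative assumptions about one possible encoding, not the disclosed structure itself:

```python
# Illustrative sketch of logical data structure 800 as a record; one such
# record per window or display object in the stack.
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class LogicalDataStructure800:
    window_id: str                     # window identifier 804 (e.g., a GUID)
    dimensions: Tuple[int, int, int, int]  # dimensions 808: x, y, width, height
    stack_position: int                # stack position identifier 812 (0 = top)
    display_id: Optional[int]          # display identifier 816: 110, 114, or
                                       # None for the composite display 728
    active: bool = False               # active indicator 820

# Window 1 708: first (active) window, shown on touch-sensitive display 114.
win1 = LogicalDataStructure800("win-1-708", (0, 0, 480, 800), 0, 114, active=True)
```

A stack of such records, ordered by `stack_position`, is enough to drive the display arrangements described in Figs. 7A through 7J.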
An embodiment of a method 900 for changing a window stack is shown in Fig. 9. While a general order of the steps of the method 900 is shown in Fig. 9, the method 900 generally starts with a start operation 904 and ends with an end operation 924. The method 900 can include more or fewer steps, or the order of the steps can be arranged differently than shown in Fig. 9. The method 900 can be executed as a set of computer-executable instructions, executed by a computer system and encoded or stored on a computer-readable medium. Hereinafter, the method 900 shall be explained with reference to the systems, components, modules, software, data structures, and user interfaces described in conjunction with Figs. 1-8.
The multi-screen device 100 can receive an orientation change, as described in Figs. 3A-3B, that changes the device 100 from the closed state 304 to the open state 320, in step 908. The orientation change can be detected and received as a signal from a hardware input, such as a Hall-effect sensor, a timer, etc. The orientation change can be received by a task management module 540 and sent to a multi-display management module 524. The multi-display management module 524 can interpret this change to alter the configuration of the displays from the closed portrait displays 604, 608 to an open portrait display (as shown in Fig. 6C), or from the closed landscape displays 612, 616 to an open landscape configuration (as shown in Fig. 6F, as described in conjunction with Figs. 6A to 6F). In embodiments, the task management module 540 places the user interface interaction in a task stack 552 to be acted upon by the multi-display management module 524. Further, the task management module 540 waits for information from the multi-display management module 524 to send instructions to a window management module 532 to create a window in the window stack 704.
Upon receiving an instruction from the task management module 540, the multi-display management module 524 determines whether the desktop 786 is to be revealed, in step 912. In embodiments, before the device 100 is opened, the desktop 786 may be at the bottom of the window stack 704/724, while the device 100 displays the last window in the stack 704/724. In other words, the top window in the stack 704/724 is displayed, and no window is in position to fill the display newly revealed in the open state. For example, in Fig. 7A, there is no window to the "left" of window 1. Thus, as shown in Fig. 7B, the multi-display management module 524 needs to reveal the desktop 720 in the display 110. Because the desktop 720 generally expands to span the composite display 728, the desktop 720 is always displayed on an opened device 100 unless another window covers the desktop 720. As shown in Figs. 7B and 7F, because no window covers the desktop 720, the multi-display management module 524 determines that the desktop 720 is to be displayed in the newly opened display.
In embodiments, a device state module 574 of the multi-display management module 524 can determine how the device is oriented or in what state the device is, e.g., open, closed, portrait, etc. Further, a preferences module 572 and/or a requirements module 580 can determine how the desktop 786 is to be displayed, based on the preferences of the desktop 786. A display configuration module 568 can then use the input from the device state module 574, the preferences module 572, and/or the other framework components 520 to evaluate the current window stack 704/724. Revealing the desktop 720 in the new display generally does not affect the other windows in the window stack 704/724, as no other window is moved. However, the dimensions 808 of the desktop 720 may change, as the desktop 720 is modified to fill the composite display 728 and expands upon the device 100 being opened.
In embodiments, a visibility algorithm determines which window(s) or display object(s) are at the top of the stack 704/724 for all portions of the composite display 728. For example, as shown in Fig. 7B or 7F, the visibility algorithm determines that, after the device 100 is opened, the desktop 786 is revealed in one portion of the stack 704. Further, as shown in Fig. 7B or 7F, window 1 708 is displayed in the other portion of the stack 704. Having determined where to reveal the desktop 720, the display configuration module 568 can change the dimensions 808, the display identifier 816, and/or the stack position identifier 812 of the desktop 720. The multi-display management module 524 can then send the dimensions 808, the display identifier 816, and/or the stack position identifier 812 back to the task management module 540.
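A hedged sketch of a visibility algorithm of the kind described — walking each portion of the composite display 728 down the stack until a window or display object that covers it is found — follows. The region labels and the `(name, covered_regions)` data shape are illustrative assumptions:

```python
# Illustrative visibility sketch: for each region of the composite display,
# the first stack entry (top to bottom) that covers the region is visible.
def topmost_for_region(stack, region):
    """stack: list of (name, covered_regions) ordered top to bottom."""
    for name, covered in stack:
        if region in covered:
            return name
    return None

# After the device opens: window 1 708 still covers only the right display,
# so the newly revealed left display shows the desktop.
stack = [
    ("window_1_708", {"right"}),
    ("window_2_712", {"right"}),
    ("desktop_720", {"left", "right"}),
]
visible = {r: topmost_for_region(stack, r) for r in ("left", "right")}
```

This reproduces the outcome of Figs. 7B and 7F: the desktop is revealed on one portion of the composite display while window 1 708 remains visible on the other.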
In embodiments, the task management module 540 sends the dimensions 808, the display identifier 816, and/or the stack position identifier 812, and/or other information and instructions for presenting the desktop 786, to the window management module 532. The window management module 532 and the task management module 540 can change the logical data structure 800, in step 920. Both the task management module 540 and the window management module 532 can maintain copies of the window stack 704/724. These copies of the window stack 704/724 can be synchronized, or kept similar, through communications between the window management module 532 and the task management module 540. Thus, based on the information determined by the multi-display management module 524, the window management module 532 and the task management module 540 can change the dimensions 808, the display identifier 816, and/or the stack position identifier 812 of the desktop 720 and of one or more windows. The logical data structure 800 can then be stored by both the window management module 532 and the task management module 540. Further, the window management module 532 and the task management module 540 can thereafter manage the window stack 704/724 and the logical data structure(s) 800.
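The update performed at step 920 — the desktop's dimensions and display identifier changing so the desktop fills the composite display, while the other windows' records are left alone — can be sketched as follows, with a plain dict standing in for the logical data structure 800 and a hypothetical composite size:

```python
# Illustrative sketch of the step-920 update when the device opens.
# The composite-display size and field names are assumptions.
COMPOSITE_728 = {"width": 960, "height": 800}

def reveal_desktop(desktop_record):
    """Resize the desktop's record to fill the composite display 728 and
    re-associate it with the composite display (None = composite)."""
    desktop_record["dimensions"] = (0, 0, COMPOSITE_728["width"],
                                    COMPOSITE_728["height"])
    desktop_record["display_id"] = None
    return desktop_record

# Before opening, the desktop occupies a single display (114 here).
desktop = {"window_id": "desktop-720", "dimensions": (0, 0, 480, 800),
           "display_id": 114, "stack_position": 3}
reveal_desktop(desktop)
```

Both module-side copies of the stack would apply the same change and then be kept synchronized, as the text describes.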
The exemplary systems and methods of this disclosure have been described in relation to window stacks associated with a multi-screen device. However, to avoid unnecessarily obscuring the present disclosure, the preceding description omits a number of known structures and devices. This omission is not to be construed as a limitation of the scope of the claims. Specific details are set forth to provide an understanding of the present disclosure. It should, however, be appreciated that the present disclosure may be practiced in a variety of ways beyond the specific details set forth herein.
Furthermore, while the exemplary aspects, embodiments, and/or configurations illustrated herein show the various components of the system collocated, certain components of the system can be located remotely, at distant portions of a distributed network, such as a LAN and/or the Internet, or within a dedicated system. Thus, it should be appreciated that the components of the system can be combined into one or more devices, such as a computer, laptop, netbook, tablet, smart phone, mobile device, etc., or collocated on a particular node of a distributed network, such as an analog and/or digital telecommunications network, a packet-switched network, or a circuit-switched network. It will be appreciated from the preceding description, and for reasons of computational efficiency, that the components of the system can be arranged at any location within a distributed network of components without affecting the operation of the system. For example, the various components can be located in a switch such as a PBX and media server, a gateway, in one or more communication devices, at one or more users' premises, or some combination thereof. Similarly, one or more functional portions of the system could be distributed between a telecommunications device and an associated computing device.
Furthermore, it should be appreciated that the various links connecting the elements can be wired or wireless links, or any combination thereof, or any other known or later-developed element(s) capable of supplying and/or communicating data to and from the connected elements. These wired or wireless links can also be secure links and may be capable of communicating encrypted information. Transmission media used as links, for example, can be any suitable carrier for electrical signals, including coaxial cables, copper wire, and fiber optics, and may take the form of acoustic or light waves, such as those generated during radio-wave and infrared data communications.
Also, while the flowcharts have been discussed and illustrated in relation to a particular sequence of events, it should be appreciated that changes, additions, and omissions to this sequence can occur without materially affecting the operation of the disclosed embodiments, configurations, and aspects.
A number of variations and modifications of the disclosure can be used. It would be possible to provide for some features of the disclosure without providing others. For example, in an alternative embodiment, the window stack is arranged like a conveyor belt or carousel rather than a deck of cards. Thus, windows can circulate from one touch-sensitive display 110 to another touch-sensitive display 114. For instance, a window can be pushed to the right and end up at the end of the stack, behind another window. If the stack continues to move to the right, the window would eventually appear on the second touch-sensitive display 114, even though the window was opened on the first touch-sensitive display 110. These movements and changes in the stack can be managed using the methods and logical data structures discussed above. In other alternative embodiments, there may be other arrangements for the window stack.
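The conveyor-belt alternative described above can be sketched as a rotation of the same ordered stack; after enough rotations, a window opened at one end of the arrangement reaches the other. The rotation helper below is an illustrative assumption:

```python
# Illustrative sketch of the carousel/conveyor-belt window stack variant:
# rotating the stack "to the right" moves each window one position, so a
# window opened for display 110 eventually surfaces on display 114.
from collections import deque

def rotate_stack(stack, steps=1):
    """Return the stack rotated `steps` positions to the right."""
    d = deque(stack)
    d.rotate(steps)
    return list(d)

stack = ["w1", "w2", "w3", "w4"]
```

Rotating a four-window stack four times returns it to its original order, so the same logical data structures can track positions across any number of rotations.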
In yet another embodiment, the systems and methods of this disclosure can be implemented in conjunction with a special-purpose computer, a programmed microprocessor or microcontroller and peripheral integrated circuit element(s), an ASIC or other integrated circuit, a digital signal processor, a hard-wired electronic or logic circuit such as a discrete element circuit, a programmable logic device or gate array such as a PLD, PLA, FPGA, or PAL, a special-purpose computer, any comparable means, or the like. In general, any device(s) capable of implementing the methodology illustrated herein can be used to implement the various aspects of this disclosure. Exemplary hardware that can be used for the disclosed embodiments, configurations, and aspects includes computers, handheld devices, telephones (e.g., cellular, Internet-enabled, digital, analog, hybrids, and others), and other hardware known in the art. Some of these devices include processors (e.g., a single or multiple microprocessors), memory, non-volatile storage, input devices, and output devices. Furthermore, alternative software implementations, including but not limited to distributed processing or component/object distributed processing, parallel processing, or virtual machine processing, can also be constructed to implement the methods described herein.
In yet another embodiment, the disclosed methods may be readily implemented in conjunction with software using an object or object-oriented software development environment that provides portable source code usable on a variety of computer or workstation platforms. Alternatively, the disclosed system may be implemented partially or fully in hardware using standard logic circuits or VLSI design. Whether software or hardware is used to implement the systems in accordance with this disclosure depends on the speed and/or efficiency requirements of the system, the particular function, and the particular software or hardware systems or microprocessor or microcomputer systems being utilized.
In yet another embodiment, the disclosed methods may be partially implemented in software that can be stored on a storage medium and executed on a programmed general-purpose computer with the cooperation of a controller and memory, a special purpose computer, a microprocessor, or the like. In these instances, the systems and methods of this disclosure can be implemented as a program embedded on a personal computer, such as an applet or CGI script, as a resource residing on a server or computer workstation, as a routine embedded in a dedicated measurement system, as a system component, or the like. The system can also be implemented by physically incorporating the system and/or method into a software and/or hardware system.
Although the present disclosure describes components and functions implemented in the aspects, embodiments, and/or configurations with reference to particular standards and protocols, the aspects, embodiments, and/or configurations are not limited to such standards and protocols. Other similar standards and protocols not mentioned herein are in existence and are considered to be included in the present disclosure. Moreover, the standards and protocols mentioned herein, and other similar standards and protocols not mentioned herein, are periodically superseded by faster or more effective equivalents having essentially the same functions. Such replacement standards and protocols having the same functions are considered equivalents included in the present disclosure.
The present disclosure, in various aspects, embodiments, and/or configurations, includes components, methods, processes, systems, and/or apparatus substantially as depicted and described herein, including various aspects, embodiments, configurations, subcombinations, and/or subsets thereof. Those of skill in the art will understand how to make and use the disclosed aspects, embodiments, and/or configurations after understanding the present disclosure. The present disclosure, in various aspects, embodiments, and/or configurations, includes providing devices and processes in the absence of items not depicted and/or described herein, or in various aspects, embodiments, and/or configurations hereof, including in the absence of such items as may have been used in previous devices or processes, e.g., for improving performance, achieving ease of implementation, and/or reducing cost of implementation.
The foregoing discussion has been presented for purposes of illustration and description. The foregoing is not intended to limit the disclosure to the form or forms disclosed herein. In the foregoing Detailed Description, for example, various features of the disclosure are grouped together in one or more aspects, embodiments, and/or configurations for the purpose of streamlining the disclosure. The features of the aspects, embodiments, and/or configurations of the disclosure may be combined in alternate aspects, embodiments, and/or configurations other than those discussed above. This method of disclosure is not to be interpreted as reflecting an intention that the claims require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed aspect, embodiment, and/or configuration. Thus, the following claims are hereby incorporated into this Detailed Description, with each claim standing on its own as a separate preferred embodiment of the disclosure.
Moreover, although the description has included description of one or more aspects, embodiments, and/or configurations, and certain variations and modifications, other variations, combinations, and modifications are within the scope of the disclosure, e.g., as may be within the skill and knowledge of those in the art after understanding the present disclosure. It is intended to obtain rights which include alternative aspects, embodiments, and/or configurations to the extent permitted, including alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps to those claimed, whether or not such alternate, interchangeable, and/or equivalent structures, functions, ranges, or steps are disclosed herein, and without intending to publicly dedicate any patentable subject matter.