EP3285238A2 - Automation system user interface - Google Patents

Automation system user interface
Download PDF

Info

Publication number
EP3285238A2
Authority
EP
European Patent Office
Prior art keywords
security
gateway
floor
devices
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
EP17186497.8A
Other languages
German (de)
French (fr)
Other versions
EP3285238A3 (en)
Inventor
Ken Sundermeyer
Jim Fulker
Matt Davidson
Paul Dawes
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
IControl Networks Inc
Original Assignee
IControl Networks Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 15/237,873 (external priority patent US20170185277A1)
Application filed by IControl Networks Inc
Publication of EP3285238A2
Publication of EP3285238A3
Current legal status: Ceased


Abstract

Systems and methods include an automation network comprising a gateway located at/in a premises. The gateway is coupled to a remote network and is configured to control components at the premises including premises devices and a security system comprising security system components. The components include at least one camera. A sensor user interface (SUI) is coupled to the gateway and presented to a user via remote client devices. The SUI includes display elements for managing and receiving data of the premises components agnostically across the remote client devices. The display elements include a timeline user interface comprising event data of the components positioned at a time corresponding to events.

Description

 for (i = 0; i < tilesArr.length; i++) {
   if (tilesArr[i].length > 4) {
     x = tilesArr[i][1];
     y = tilesArr[i][2];
     w = tilesArr[i][3];
     h = tilesArr[i][4];
     // save individual tile data for editing
     for (row = y; row < (y + h) && row < this.numTiles; row++) {
       for (col = x; col < (x + w) && col < this.numTiles; col++)
         this.t[row][col].shown = true; // turn on tile for each value in vector
     }
     // remember full tile blocks, ONLY for superfast rendering (not edit mode, where segs are being changed constantly)
     point0 = this.pSkewXY(x * this.tileWidth + this.startPosX, y * this.tileWidth + this.startPosY);
     point1 = this.pSkewXY((x + w) * this.tileWidth + this.startPosX, y * this.tileWidth + this.startPosY);
     point2 = this.pSkewXY((x + w) * this.tileWidth + this.startPosX, (y + h) * this.tileWidth + this.startPosY);
     point3 = this.pSkewXY(x * this.tileWidth + this.startPosX, (y + h) * this.tileWidth + this.startPosY);
     this.tFastRender.push([point0, point1, point2, point3]);
   }
 }
  • For example, if the data included taadc, that becomes an array [0,0,3,2], meaning draw a rectangle from the origin, three tiles wide and two tiles high. The above code computes the true pixel position for those locations, converting the parameters to four (x,y) corners of the rectangle to render: point0 (x0,y0) at the top-left, point1 (x1,y1) at the top-right, point3 (x3,y3) at the bottom-left, and point2 (x2,y2) at the bottom-right.
    Figure imgb0002
  • The actual pixel location of each x,y coordinate is computed by taking the abstract grid location and turning it into pixels. Each location is multiplied by the tileWidth, then offset by the rendering start positions startPosX and startPosY that account for gutters. To compute an abstract position like (3,2), the params are multiplied by the pixel width of a tile and offset by the pixel start positions: pixelPosition for (x,y) = (x*this.tileWidth + this.startPosX, y*this.tileWidth + this.startPosY)
    Figure imgb0003
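  • The corner computation above can be sketched as a standalone helper (hypothetical function name and example tile/gutter sizes; the patent's code reads these values from the floor object):

```javascript
// Hypothetical standalone version of the corner computation: convert a tile
// block [x, y, w, h] into the four pixel corners passed to the renderer.
function tileRectCorners(x, y, w, h, tileWidth, startPosX, startPosY) {
  var x0 = x * tileWidth + startPosX;        // left edge in pixels
  var y0 = y * tileWidth + startPosY;        // top edge in pixels
  var x1 = (x + w) * tileWidth + startPosX;  // right edge
  var y1 = (y + h) * tileWidth + startPosY;  // bottom edge
  return [
    { x: x0, y: y0 },  // point0: top-left
    { x: x1, y: y0 },  // point1: top-right
    { x: x1, y: y1 },  // point2: bottom-right
    { x: x0, y: y1 }   // point3: bottom-left
  ];
}

// The [0,0,3,2] block decoded from "taadc", with 20px tiles and a 10px gutter:
var corners = tileRectCorners(0, 0, 3, 2, 20, 10, 10);
// corners[0] is {x: 10, y: 10} and corners[2] is {x: 70, y: 50}
```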
  • For 2D rendering, the pSkewXY function does not alter these pixel positions, but returns them. For 3D rendering, each x,y position gets altered in several ways as follows, but the embodiment is not so limited:
    1. If there are multiple floors, each y position is scaled vertically (for example, if there are 2 floors, every y value is divided by 2). The first floor would be drawn from the origin, but the 2nd floor would also be offset vertically so it draws halfway down. In addition, the vertical offset is altered to provide a gap between floors.
    2. If there is a single floor, each position is scaled vertically to 60% of its height and offset to be vertically centered. This is controlled by a ppref.
    3. All x positions are altered by shifting them toward the vertical midline. For example, in a 100px canvas, an x value of 50 is unchanged. However, if x is 0, it needs to be skewed 20% toward the center. Since the back row is to be scaled to 80% width, we bring x to 80% of its distance from the vertical midline. In this example, x would change to (50 - abs(x-50)*0.8). So an x at 0 shifted 20% to the midline becomes x=10. This effect is reduced as we render lower rows (toward the front edge of the floor). The back row is squeezed to 80%, and the front row is not horizontally squeezed at all, so it keeps 100% of its original position.
    4. A front-to-back scaling factor must be computed for later shrinking of device icons and label text. Devices in the back (top) row are scaled to 80%, halfway back 90%, and front-edge (bottom) devices are 100%.
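  • The midline skew in step 3 can be sketched as a small helper (hypothetical function and parameter names; the patent's implementation folds this into pSkewXY):

```javascript
// Hypothetical sketch of step 3: shift x toward the vertical midline, scaled
// by how far "back" the row is. rowFactor is 0 for the front (bottom) row and
// 1 for the back (top) row; backRowScale is the horizontal scale of the back
// row (e.g. 0.8 for 80% width).
function skewTowardMidline(x, canvasWidth, rowFactor, backRowScale) {
  var mid = canvasWidth / 2;
  // interpolate the horizontal scale: backRowScale at the back, 1.0 at the front
  var scale = 1 - (1 - backRowScale) * rowFactor;
  return mid - (mid - x) * scale;
}

// Back row of a 100px canvas with an 80% back-row scale:
skewTowardMidline(50, 100, 1, 0.8);  // midline: unchanged, 50
skewTowardMidline(0, 100, 1, 0.8);   // left edge: shifted 20% toward center, 10
skewTowardMidline(0, 100, 0, 0.8);   // front row: unaffected, 0
```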
  • An example follows of the core skewing algorithm of an embodiment, in code, but the embodiment is not so limited:
  •
 //------------------------------------------------------------------
 // pSkewXY
 // arguments: absolute canvas x and y positions
 // return: object with x and y properties with new, skewed values
 //
 // In general, y skew is scaled by # floors (2 floors means y = y/2). X skew is more subtle.
 // If x is about halfway across, it's unaffected. And if y is the max, x is in the front row and
 // unaffected. But the farther "back" you go, the more skewed x is. For example, in the first
 // row, x==0 will be bent in by the 80% factor, or 10% increased towards the middle.
 //------------------------------------------------------------------
 ic_hvwFloorData.prototype.pSkewXY = function (px, py) {
   var devScale = 1; // computed amount to scale devices for each location. 1 for front edge (bottom row), .80 for back edge (top row)
   try {
     if (this.render3D) { // skewing ONLY affects render mode, not editor
       if (!this.cache) { // to ensure this is fast, precompute everything possible, only once per floor
         var scaleBackRowByPct = this.threeDScaleBackRowByPct, // horizontally scale back wall (and icons and text) this %
             vertScaleSingleFloorPct = this.threeDVertScaleSingleFloorPct, // if rendering single floor 3D, scale vert by this %
             floorGapInTiles = this.threeDVertFloorGapInTiles, // vert gap btwn floors, height is this many tiles (scaled by # floors)
             gapBetweenFloors = (this.numFloors > 1) ? (floorGapInTiles * this.tileHeight / this.numFloors) : 0; // gap in pixels if 3D & >1 flr
         this.cache = {}; // create or clear cache object
         this.cache.xSkewFactor = (1 - scaleBackRowByPct); // constant controls amount of skew, such as .8 = 80% horiz scale
         this.cache.ySkewFactor = (1 - (this.numFloors - 1) * (floorGapInTiles / this.numTiles)) / this.numFloors;
         this.cache.drawWidth = this.tileWidth * this.numTiles;
         this.cache.drawHeight = this.tileHeight * this.numTiles;
         this.cache.yOffset = ((this.numFloors - 1) - this.floorNum) *              // amount to shift each floor down
                              ((this.cache.drawHeight / this.numFloors) + gapBetweenFloors); // offset by # floors + gap
         if (this.numFloors == 1) { // if single floor
           this.cache.ySkewFactor *= vertScaleSingleFloorPct; // scale vertically
           this.cache.yOffset = ((1 - vertScaleSingleFloorPct) / 2) * this.cache.drawHeight; // and offset vertically so centered
         }
         this.cache.halfDrawWidth = this.cache.drawWidth / 2; // precompute for speed
         this.cache.xSkewMultiplier = this.cache.drawHeight * this.cache.xSkewFactor / 2;
         this.cache.yScaleFactor = (1 - (this.cache.xSkewFactor) * (this.cache.drawHeight - this.startPosY) / this.cache.drawHeight);
       }
       // compute skewed x, y positions, and scale for this row
       devScale = py * this.cache.xSkewFactor / this.cache.drawHeight + this.cache.yScaleFactor; // device scale: compute before altering py
       px += (1 - (px - this.startPosX) / this.cache.halfDrawWidth) * // add normal X factor skewing
             (1 - (py - this.startPosY) / this.cache.drawHeight) *    // but diminished by Y factor
             this.cache.xSkewMultiplier;                              // then scale overall
       py = (py - this.startPosY) * this.cache.ySkewFactor + this.startPosY + this.cache.yOffset; // remove start pos, skew, then add back
     }
   }
   catch (ev) {
     //console.log("Home View: pSkew failed " + ev);
   }
   return {x: px, y: py, scaleFor3D: devScale};
 };
  • Tapering of the floors in Home View 3D, as described in detail herein, means that the top floor is rendered slightly wider than the bottom floor. Since the render naturally has vertical gutters on the left and right edge, and these gutters are wider than needed since the floors are skewed and smaller, the algorithm of an embodiment renders the bottom floor with gutter unchanged, and reduces the top floor gutter to approximately 35% of its normal width, as an example.
  • Before computing all the locations for rendering a floor, an embodiment shrinks this gutter for the higher floors. For example, with 3 floors, the gutters are approximately 35%, 57%, and 100% of their typical width, but are not so limited. Since the gutters are smaller, the floors are wider, so an embodiment grows the tile widths by that same approximate percent. An example algorithm is as follows, but is not so limited:
  •
 if (render3D) { // This block makes the higher floors a bit wider then tapers inward to enhance 3D illusion
   this.cache = null; // need to clear pre-computed cache from lower floors
   var gutterPct = 0.35 + 0.65 * ((numFloors - 1) - floorNum) / ((numFloors > 1) ? (numFloors - 1) : 1); // top floor: 35% gutter, bottom floor: 100% gutter
   this.startPosX *= gutterPct; // shrink startPosX that % to shift closer to edge
   this.tileWidth *= 1 + (2 * (startPosX - this.startPosX) / (this.numTiles * this.tileWidth)); // grow tileWidth by same percent gutter shrank
 }
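  • Evaluating the gutter formula above for a three-floor premises illustrates the taper (hypothetical helper name; the exact per-floor percentages depend on the embodiment's tuning constants):

```javascript
// Hypothetical wrapper around the gutter formula above, for a given floor
// (floorNum 0 = bottom floor) out of numFloors total.
function gutterPctFor(floorNum, numFloors) {
  return 0.35 + 0.65 * ((numFloors - 1) - floorNum) / ((numFloors > 1) ? (numFloors - 1) : 1);
}

// For a 3-floor home, the formula as written yields:
gutterPctFor(2, 3); // top floor:    0.35  (35% of the normal gutter)
gutterPctFor(1, 3); // middle floor: 0.675
gutterPctFor(0, 3); // bottom floor: 1.0   (full gutter)
```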
  • Embodiments of the integrated system described herein include a user interface (UI) that is a cross-platform UI providing control over home automation and security systems and devices from client devices including but not limited to tablets, smart phones, iOS devices, and Android devices. While conventional UIs for accessing live video and captured camera content are not integrated and do not allow for future support of Continuous Video Recording (CVR) content, an embodiment includes a Video Timeline UI that is a consistent and seamless cross-platform UI for accessing live video, saved clips and saved pictures and CVR content in the future.
  • Figure 94A is a flow diagram showing an example flow for accessing camera data via a smart phone (e.g., iPhone), under an embodiment. Access to camera data via Home View involves a user selecting a camera icon displayed on the "Home View" UI to show the Home View camera popup ("HV Popup"). Tapping a specified icon (e.g., ">" icon, etc.) on the "HV Popup" causes full-screen live video to be displayed. Access to camera data via a displayed list of cameras ("Camera List") involves a user selecting a "LIVE" icon displayed on the camera list, which results in the display of full screen live video corresponding to the selected camera. Alternatively, selecting the camera name takes a user to a clips/pictures viewer ("Clips/Pic Viewer").
  • Figure 94B is a flow diagram showing an example flow for accessing camera data via a tablet device (e.g., iPad), under an embodiment. Access to camera data via Home View involves a user selecting a camera icon displayed on the "Home View" UI, which results in presentation of a live video preview popup ("HV Popup"). Selecting a particular live video preview on the "HV Popup" initiates display of full-screen live video. Alternatively, tapping the history icon takes a user to a clips/pictures viewer ("Clip/Pic Viewer"). Access to camera data is also available via a camera list comprising a carousel ("Camera Carousel") of live video preview "scones". Starting from the displayed camera list, the user selects a particular live video preview to show full-screen video ("Full-Screen Live"). Alternatively, selecting the history icon takes a user to a clips/pictures viewer ("Clip/Pic Viewer").
  • The video window of an embodiment renders or presents video in landscape mode but is not so limited. The UI elements (e.g., top bar, bottom bar, paging dots, etc.) are shown by default. The UI is configured so tapping of the video window once causes the UI to be hidden, while tapping again returns the UI to the display. Selecting a "Done" icon returns the UI to the camera list.
  • The UI is configured so a swipe switches between cameras. Swiping pauses playback of a current camera maintaining zoom level. A new camera resumes live playback when fully snapped to full-screen video. Swipe shows the UI if it is hidden, and the UI hides again after a predetermined period of time (e.g., 5 seconds, etc.).
  • Live video is shown when first viewing a camera full-screen. Figure 95 is an example of a live view including the UI, under an embodiment. The UI is configured so a tap of a "Capture" icon displays live video capture options (e.g., take clip, take picture, etc.). The UI is configured so swiping left-right results in a switch between live camera feeds. Page dots indicate the position of the current camera in the list. The portion of the timeline right of the playback head ("LIVE") represents the future and is indicated by a grey patterned area, so that swiping to this area is not possible but a drag can rubber-band into it temporarily.
  • The UI is configured so a tap detected on the video section hides the UI (e.g., top bar, bottom bar, pagination dots, etc.). Figure 96 is an example of a live view with the UI hidden, under an embodiment.
  • If a camera event occurs while viewing live video, the event notification is displayed beneath the top bar. Figure 97 is an example of a live view with an event notification ("Motion detected") displayed during live viewing, and with the UI displayed, under an embodiment. The notification or message bar lasts for a pre-specified period of time (e.g., 5 seconds, etc.) and then disappears.
  • If a camera event occurs while viewing live video, and the UI is hidden, the event notification is displayed at the top of the screen or display. Figure 98 is an example of a live view with an event notification ("Motion detected") displayed during live viewing, and with the UI hidden, under an embodiment. The notification or message bar lasts for a predetermined period of time (e.g., 5 seconds, etc.) and then disappears.
  • The UI of an embodiment includes a Timeline. Figure 99 is an example of a UI including a live camera view and the Timeline, under an embodiment. The Timeline of an embodiment is configured to provide seamless navigation between live video, stored pictures and clips, and CVR data. The Timeline is presented or displayed over pictures and video clips, but is not so limited. The Timeline of an embodiment represents clips, pictures, and CVR data for a pre-specified period of time (e.g., 10 days, 30 days, etc.). The Timeline includes data for a pre-specified period of time (e.g., one 24-hour day, etc.) on screen at a time and includes one or more of the following time indicators or markings: Playback head (e.g., fixed at the center of the timeline and reflects the point in time currently being viewed; when first shown, the timeline is positioned at the "LIVE" position); Live video (e.g., indicated by a vertical bar (e.g., labeled "LIVE"); when centered at the playback position, the label and bar are red); Future (e.g., indicated by a patterned grey area to the right of the Live video indicator); Day marks (e.g., indicated by vertical grey bars labeled in a format (e.g., day name, month, date)); Time stamp (e.g., shown above the timeline while being actively dragged; time displayed in the format hour:minute:AM/PM).
  • The Timeline of an embodiment includes camera-related events comprising one or more of captured clips, captured pictures or images, and motion events, but the embodiment is not so limited. The captured clips are indicated by a blue square with arrow (see Figure 108, element 9902), centered at the time of capture, but are not so limited. Captured pictures of an embodiment are indicated by a blue open square (see Figure 108, element 9904), centered on the time of capture. Motion events are indicated in the Timeline of an embodiment by a motion icon (see Figure 108, element 9906), centered on the time of capture, positioned on a higher "track" than the camera content, but are not limited to this embodiment.
  • If pictures, clips and CVR data are not available, a message is displayed (e.g., "No saved videos or photos", etc.). Figure 100 is an example of a UI including the live camera view and Timeline, and a message regarding data, under an embodiment. The Timeline is configured so swiping left-right changes playback position for browsing stored video content. When the Timeline position indicator is no longer at the LIVE viewing position, the marker and LIVE label become grey.
  • Figure 101 is an example of a UI including the Timeline offset ("5:19 PM") from the live viewing position, under an embodiment. As the Timeline of an embodiment is actively dragged, the timestamp of the media playback is shown above the playback head, and is updated while dragging. The Timeline is configured so that tapping or dragging anywhere on the Timeline moves that point to the playback head, and the timeline will snap to the nearest clip or picture. If CVR data is available, the tapping or dragging action goes to the selected point on the Timeline with no snapping. The Timeline is configured so that tapping the LIVE area at the far right edge of the timeline or swiping right-left until the live indicator reaches the playback head returns the UI to the live camera view.
  • The Timeline of an embodiment is configured so that in response to a tap or release from a dragging operation, the timeline snaps to the nearest saved picture or clip. The screen dims and the loading spinner is displayed as the clip or picture is loaded, if necessary. Figure 102 is an example of a UI as a clip or picture is loaded, under an embodiment. Once the clip or picture is loaded, the spinner stops and the screen changes out of the dimmed state, and the selected clip or picture is displayed. The Timeline is configured so swiping left or right navigates to the beginning of the next or previous picture or clip.
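  • The snapping behavior described herein can be sketched as follows (hypothetical function and argument names, assuming a sorted array of event timestamps; not the patent's code):

```javascript
// Hypothetical sketch of the snapping rule: when CVR is available the
// playhead lands exactly where tapped; otherwise it snaps to the nearest
// saved clip or picture. eventTimes holds capture timestamps.
function resolvePlayhead(tappedTime, eventTimes, cvrAvailable) {
  if (cvrAvailable || eventTimes.length === 0) return tappedTime; // no snapping
  var nearest = eventTimes[0];
  for (var i = 1; i < eventTimes.length; i++) {
    if (Math.abs(eventTimes[i] - tappedTime) < Math.abs(nearest - tappedTime))
      nearest = eventTimes[i]; // keep the closest event to the tap
  }
  return nearest;
}

resolvePlayhead(130, [100, 200, 300], false); // → 100 (snaps to nearest saved event)
resolvePlayhead(130, [100, 200, 300], true);  // → 130 (CVR: no snapping)
```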
  • Figure 103 is an example of a UI displaying a loaded picture (Timeline position indicator located at captured picture indicator), under an embodiment. Figure 104 is an example of a UI displaying a loaded video clip (Timeline position indicator located at captured clip indicator), under an embodiment. Once the clip is loaded, the spinner stops and the screen changes out of the dimmed state, and the selected clip is automatically played. The progress of the video clip download is indicated by a faint fill within the playback track, and the current position of playback is indicated by a playback thumb, which is draggable in an embodiment. The time remaining in the clip is also displayed (e.g., in a left portion of the UI) in an embodiment. The time of capture is shown (e.g., above the playback track). Tapping anywhere on the playback track will resume clip playback from that point.
  • The UI includes a pause icon, and playback can be paused by tapping the pause icon. Figure 105 is an example of a UI displaying a paused video clip, under an embodiment. When pausing, the pause icon changes to a play icon and a pause symbol is shown on screen for a pre-specified period of time (e.g., 1 second, etc.), fading away automatically. When playing, the play icon changes to a pause icon and a play symbol is shown on screen for a pre-specified period of time, fading away automatically.
  • Figure 106 is an example of a UI display having completed play of a video clip, under an embodiment. At the end of playback of a video clip, the pause/play icon becomes a replay icon.
  • The UI of an embodiment includes Timeline zooming, in which pinch gestures detected or received change the scale of the Timeline and reveal or hide timeline event icons. The UI is configured to pinch-out to zoom in to the Timeline. Zooming in spreads apart overlapping events, and allows finer control over moving the playhead. As the timeline zooms in, tick marks fade in and out to show the most appropriate time measurements.
  • Zoom Out is realized in a direction opposite that of zoom in (e.g., pinch-in). Zooming out can cause nearby event icons to overlap, but is not so limited. The most recent event is positioned on top (thus tappable).
  • The UI of an embodiment includes Timeline scaling in which zooming in and out scales the timeline to seamlessly cross-fade to the new scale. New labels, icons and tick marks fade in, and original labels, icons and tick marks fade out. The Timeline of an example embodiment presents 24 hours on-screen, with tick marks every twelve hours, and any zoom-in causes the Timeline to add hour tick marks (e.g., zoom-in more than 2x (12 hours on screen) adds half-hour tick marks, etc.). In this example, the largest scale is a 5-day view showing five days of information, with tick marks presented in 24-hour increments. Another Timeline scale is a 24-hour view that presents one day of information, with tick marks presented twelve hours apart. A one-hour increment view presents approximately two days of information, with tick marks presented in one-hour increments. A five-minute increment view presents approximately two hours of information, with tick marks presented in five-minute increments. The scaling of an embodiment also includes a one-minute increment view that includes tick marks presented in one-minute increments.
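  • The scale tiers described herein can be sketched as a function choosing a tick interval from the hours visible on screen (hypothetical name and thresholds, for illustration only):

```javascript
// Hypothetical sketch: pick a tick interval (in minutes) from the number of
// hours visible on screen, so tick marks cross-fade between tiers as the
// pinch zoom changes scale. Thresholds are illustrative example values.
function tickIntervalMinutes(hoursOnScreen) {
  if (hoursOnScreen > 48) return 24 * 60;  // 5-day view: daily ticks
  if (hoursOnScreen > 12) return 12 * 60;  // 24-hour view: 12-hour ticks
  if (hoursOnScreen > 2)  return 60;       // zoomed in: hourly ticks
  if (hoursOnScreen > 0.5) return 5;       // ~2 hours on screen: 5-minute ticks
  return 1;                                // closest zoom: 1-minute ticks
}

tickIntervalMinutes(24); // → 720 (12-hour ticks for the default 24-hour view)
tickIntervalMinutes(1);  // → 5 (5-minute ticks)
```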
  • The UI of an embodiment includes a zoom map but is not so limited. The zoom map, which is positioned in a portion or region of the UI display, repositions itself depending on the size of the top bar. Figure 107 is an example of a UI having no top bar and on which the zoom map is positioned in a top region of the display, under an embodiment. Figure 108 is an example of a UI having a relatively minimal top bar, with the zoom map positioned on the display just below the top bar, under an embodiment. Figure 109 is an example of a UI having a relatively large top bar, with the zoom map positioned on the display just below the top bar, under an embodiment.
  • Figure 110 is an example of a UI including the Timeline with CVR data, under an embodiment. The UI of an embodiment indicates CVR content or data on the Timeline using filled portions. The filled portions may not be contiguous, but the embodiment is not so limited. The UI is configured so that a swipe or tap detected at any location on the timeline where CVR data is indicated causes play of the CVR data. On release, the screen dims and the loading spinner is shown. When loaded, the spinner stops, the screen returns to normal contrast, and the video plays. When CVR is available, an embodiment shows the CVR material instead of concurrent pictures and/or video clips from the server, but the embodiment is not so limited. When a camera event occurs, the event notification is overlaid on the video clip, and an event marker is presented or displayed on the timeline.
  • A typical swipe on the Timeline provides 1:1 movement of the Timeline. A relatively fast swipe provides accelerated scrolling with inertia, allowing access to the entire span of the timeline within a pre-specified number of swipes (e.g., 10 swipes). Conversely, a slow drag provides fine-grained control with magnification, enabling selection of individual, closely spaced captured clips or pictures. Slow drag in an embodiment is activated or triggered by a long press; the Timeline remains magnified while dragging until touch end, but is not so limited.
  • The Timeline of an embodiment is configured for magnification. Figure 111 is an example of a UI including the Timeline with magnification, under an embodiment. When magnified, a center portion (e.g., center 50%, etc.) of the Timeline zooms in to a pre-specified magnification (e.g., 24x magnification, etc.), decreasing its scope to show a pre-specified period of time (e.g., one hour, etc.). The non-magnified portion of the Timeline continues to move during dragging, but moves at a much slower rate than the zoomed portion of the Timeline. The relative rates of movement of the magnified and unmagnified portions of the Timeline are proportional to the relative sizes of the magnified and unmagnified portions. While magnified, tick marks or indicators presented on the Timeline become visible, time-interval indicators or time stamps become visible, and the unmagnified portion of the timeline is slightly tinted.
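  • The magnified-drag math can be sketched as follows (hypothetical helper; the 24x factor is the example magnification given above, and the names are illustrative):

```javascript
// Hypothetical sketch: how far in time a drag moves the playhead. In the
// magnified center region the pixels-per-hour density is multiplied by the
// magnification factor, so the same drag covers much less time.
function timeDeltaForDrag(dragPx, timelineWidthPx, hoursOnScreen, magnified) {
  var pxPerHour = timelineWidthPx / hoursOnScreen; // normal (unmagnified) scale
  if (magnified) pxPerHour *= 24;                  // example 24x magnification
  return dragPx / pxPerHour;                       // hours moved by this drag
}

// A 100px drag on a 1200px-wide, 24-hour timeline:
timeDeltaForDrag(100, 1200, 24, false); // → 2 hours
timeDeltaForDrag(100, 1200, 24, true);  // → 1/12 hour, i.e. 5 minutes
```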
  • Figure 112 is an example of a UI configured to include thumbnail images in the Timeline, under an embodiment. When the Timeline is configured to include thumbnail images, the picture and/or video clip icons are replaced with actual thumbnail images representing the image or video data.
    Embodiments include a system comprising an automation network comprising a gateway at a premises, wherein the gateway is coupled to a remote network, wherein the gateway is configured to control a plurality of components at the premises including premises devices and a security system comprising security system components, wherein the plurality of components include at least one camera and a sensor user interface (SUI) coupled to the gateway and presented to a user via a plurality of remote client devices, wherein the SUI includes a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices, wherein the plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • The timeline user interface comprises a variable-length timeline.
  • A time scale of the timeline user interface can be dynamically changed.
  • The event data may comprise component state of the plurality of components presented in a timeline.
  • The event data of the plurality of components may include at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The timeline user interface may include at least one icon corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The timeline user interface may include at least one thumbnail image corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • A captured clip may include continuous video recording for a period of time.
  • An event captured in at least one of the live video, captured clips and captured pictures may be depicted on the timeline user interface using icons.
  • The event data of the plurality of components may include live video of the at least one camera.
  • The event data of the plurality of components may include captured clips of the at least one camera.
  • The event data of the plurality of components may include captured pictures of the at least one camera.
  • The event data of the plurality of components may include motion events of the at least one camera.
  • The timeline user interface may be configured to control navigation between live video, captured clips and captured pictures of the at least one camera.
  • A tap detected at a position on the timeline user interface may cause the timeline user interface to snap to and display one of a captured clip and captured picture nearest the position.
  • The timeline user interface may be configured to display concurrent ones of the captured clip and captured picture nearest the position.
  • When continuous video recording is available at the position, the continuous video recording may be presented instead of concurrent ones of the captured clip and captured picture nearest the position.
  • The system may include a dedicated coupling between a processor of the gateway and a controller of the security system, wherein the controller is coupled to the security system components.
  • The controlling of the plurality of components at the premises may include controlling interoperability among the plurality of components.
  • The gateway may be configured using data of the plurality of components.
  • At least one of the gateway and the plurality of remote devices may be configured to perform a synchronization to associate the plurality of remote devices with the plurality of components.
  • The plurality of remote devices may include applications that receive the data from and transmit control instructions to the plurality of components via the gateway.
  • The gateway may be coupled to the security system via a first network.
  • The first network may be a dedicated network.
  • The gateway may be coupled to the premises devices via a second network.
  • The plurality of remote client devices may include one or more of a smart phone, a mobile phone, a cellular phone, a tablet computer, a personal computer, and a touchscreen device.
  • The plurality of display elements may include an icon that visually indicates a state of the plurality of components.
  • The icon may be configured to control the plurality of components.
  • The plurality of display elements may include at least one warning that is an informational warning of the plurality of components.
  • The at least one warning may correspond to at least one of a camera device, a lighting device, a lock device, and a thermostat device.
  • The plurality of display elements may include display elements comprising a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.
  • The floor plan layout may visually and separately indicate a location and a state of the plurality of components, wherein the state includes current state and historical state.
  • The floor plan layout may include a three-dimensional representation of the floor plan.
  • The floor plan layout may include a two-dimensional representation of the floor plan.
  • The floor plan layout may include configuration data for each of the plurality of components.
  • Embodiments include a system comprising an automation network including a gateway at a premises. The gateway is coupled to a remote network. The gateway is configured to control a plurality of components at the premises including premises devices and a security system comprising security system components. The plurality of components includes at least one camera. The system includes a sensor user interface (SUI) coupled to the gateway and presented to a user via a plurality of remote client devices. The SUI includes a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices. The plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • Embodiments include a system comprising: an automation network comprising a gateway at a premises, wherein the gateway is coupled to a remote network, wherein the gateway is configured to control a plurality of components at the premises including premises devices and a security system comprising security system components, wherein the plurality of components includes at least one camera; and a sensor user interface (SUI) coupled to the gateway and presented to a user via a plurality of remote client devices, wherein the SUI includes a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices, wherein the plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • The timeline user interface comprises a variable-length timeline.
  • A time scale of the timeline user interface can be dynamically changed.
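A variable-length timeline with a dynamically changeable time scale amounts to a mapping from timestamps to screen positions whose zoom factor can change at runtime. The sketch below (hypothetical names, assumed pixel-based layout) illustrates one such mapping, anchored so that the time under the user's gesture stays fixed while zooming.

```python
from dataclasses import dataclass

@dataclass
class TimelineScale:
    """Maps event timestamps to horizontal pixel positions on a timeline."""
    start_ts: float        # left edge of the visible window (epoch seconds)
    seconds_per_px: float  # current zoom level; smaller = more zoomed in

    def x_for(self, ts: float) -> float:
        """Pixel offset of a timestamp at the current scale."""
        return (ts - self.start_ts) / self.seconds_per_px

    def zoom(self, factor: float, anchor_px: float) -> None:
        """Change the time scale, keeping the time under anchor_px fixed."""
        anchor_ts = self.start_ts + anchor_px * self.seconds_per_px
        self.seconds_per_px *= factor
        self.start_ts = anchor_ts - anchor_px * self.seconds_per_px

scale = TimelineScale(start_ts=0.0, seconds_per_px=60.0)
assert scale.x_for(3600) == 60.0  # one hour maps to 60 px at 60 s/px
scale.zoom(0.5, anchor_px=60.0)   # zoom in 2x around the one-hour mark
assert scale.x_for(3600) == 60.0  # the anchored time stays put
```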
  • The event data may comprise component state of the plurality of components presented in a timeline.
  • The event data of the plurality of components may include at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The timeline user interface may include at least one icon corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The timeline user interface may include at least one thumbnail image corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • A captured clip may include continuous video recording for a period of time.
  • An event captured in at least one of the live video, captured clips and captured pictures may be depicted on the timeline user interface using icons.
  • The event data of the plurality of components may include live video of the at least one camera.
  • The event data of the plurality of components may include captured clips of the at least one camera.
  • The event data of the plurality of components may include captured pictures of the at least one camera.
  • The event data of the plurality of components may include motion events of the at least one camera.
  • The timeline user interface may be configured to control navigation between live video, captured clips and captured pictures of the at least one camera.
  • A tap detected at a position on the timeline user interface may cause the timeline user interface to snap to and display one of a captured clip and captured picture nearest the position.
  • The timeline user interface may be configured to display concurrent ones of the captured clip and captured picture nearest the position.
  • When continuous video recording is available at the position, the continuous video recording may be presented instead of concurrent ones of the captured clip and captured picture nearest the position.
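The tap behavior described above — prefer continuous video recording when it covers the tapped time, otherwise snap to the nearest captured clip or picture — can be sketched as a small resolution function. Names and return values are hypothetical; this is one plausible reading of the behavior, not the disclosed code.

```python
def snap_to_event(tap_ts, clip_times, picture_times, cvr_ranges):
    """Resolve a tap on the timeline to the media that should be shown.

    Prefers a continuous video recording (CVR) segment covering the tapped
    time; otherwise snaps to the captured clip or picture nearest the tap.
    """
    # If a CVR segment covers the tapped time, present it directly.
    for start, end in cvr_ranges:
        if start <= tap_ts <= end:
            return ("cvr", start)
    # Otherwise snap to the nearest captured clip or picture.
    candidates = [("clip", t) for t in clip_times]
    candidates += [("picture", t) for t in picture_times]
    if not candidates:
        return None
    return min(candidates, key=lambda kind_t: abs(kind_t[1] - tap_ts))

assert snap_to_event(105, [100, 200], [130], []) == ("clip", 100)
assert snap_to_event(125, [100, 200], [130], []) == ("picture", 130)
assert snap_to_event(150, [100, 200], [130], [(140, 160)]) == ("cvr", 140)
```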
  • The system may include a dedicated coupling between a processor of the gateway and a controller of the security system, wherein the controller is coupled to the security system components.
  • The controlling of the plurality of components at the premises may include controlling interoperability among the plurality of components.
  • The gateway may be configured using data of the plurality of components.
  • At least one of the gateway and the plurality of remote devices may be configured to perform a synchronization to associate the plurality of remote devices with the plurality of components.
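A synchronization that associates remote devices with the gateway's components might, in the simplest case, diff the client's cached component list against the gateway's registry and apply the delta. The function and field names below are hypothetical; this is an illustrative sketch of one such one-shot sync.

```python
def sync_device(gateway_components: dict, device_cache: dict) -> dict:
    """Bring a remote client's cached component list in line with the
    gateway's registry, returning the delta that was applied."""
    added = {cid: meta for cid, meta in gateway_components.items()
             if cid not in device_cache}
    removed = [cid for cid in device_cache if cid not in gateway_components]
    updated = {cid: meta for cid, meta in gateway_components.items()
               if cid in device_cache and device_cache[cid] != meta}
    device_cache.update(added)
    device_cache.update(updated)
    for cid in removed:
        del device_cache[cid]
    return {"added": added, "removed": removed, "updated": updated}

gateway = {"cam1": {"type": "camera"}, "lock1": {"type": "lock"}}
cache = {"cam1": {"type": "camera"}, "old_sensor": {"type": "motion"}}
delta = sync_device(gateway, cache)
assert cache == gateway
assert delta["removed"] == ["old_sensor"]
```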
  • The plurality of remote devices may include applications that receive the data from and transmit control instructions to the plurality of components via the gateway.
  • The gateway may be coupled to the security system via a first network.
  • The first network may be a dedicated network.
  • The gateway may be coupled to the premises devices via a second network.
  • The plurality of remote client devices may include one or more of a smart phone, a mobile phone, a cellular phone, a tablet computer, a personal computer, and a touchscreen device.
  • The plurality of display elements may include an icon that visually indicates a state of the plurality of components.
  • The icon may be configured to control the plurality of components.
  • The plurality of display elements may include at least one warning that is an informational warning of the plurality of components.
  • The at least one warning may correspond to at least one of a camera device, a lighting device, a lock device, and a thermostat device.
  • The plurality of display elements may include display elements comprising a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.
  • The floor plan layout may visually and separately indicate a location and a state of the plurality of components, wherein the state includes current state and historical state.
  • The floor plan layout may include a three-dimensional representation of the floor plan.
  • The floor plan layout may include a two-dimensional representation of the floor plan.
  • The floor plan layout may include configuration data for each of the plurality of components.
  • Embodiments include a method comprising: configuring an automation network to include a gateway at a premises, wherein the gateway is coupled to a remote network; configuring the gateway to control a plurality of components at the premises including premises devices and a security system comprising security system components, wherein the plurality of components includes at least one camera; and configuring a sensor user interface (SUI) to include a plurality of display elements for managing and receiving data of the plurality of components agnostically across a plurality of remote client devices, wherein the SUI is coupled to the gateway and presented to a user via the plurality of remote client devices, wherein the plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • The event data may comprise component state of the plurality of components presented in a timeline.
  • The event data of the plurality of components may include at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The method may include configuring the timeline user interface to include at least one icon corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The method may include configuring the timeline user interface to include at least one thumbnail image corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • A captured clip may include continuous video recording for a period of time.
  • The method may include configuring the timeline user interface to depict an event captured in at least one of the live video, captured clips and captured pictures using icons.
  • The event data of the plurality of components may include live video of the at least one camera.
  • The event data of the plurality of components may include captured clips of the at least one camera.
  • The event data of the plurality of components may include captured pictures of the at least one camera.
  • The event data of the plurality of components may include motion events of the at least one camera.
  • The method may include configuring the timeline user interface to control navigation between live video, captured clips and captured pictures of the at least one camera.
  • A tap detected at a position on the timeline user interface may cause the timeline user interface to snap to and display one of a captured clip and captured picture nearest the position.
  • The method may include configuring the timeline user interface to display concurrent ones of the captured clip and captured picture nearest the position.
  • When continuous video recording is available at the position, the continuous video recording may be presented instead of concurrent ones of the captured clip and captured picture nearest the position.
  • The method may include a dedicated coupling between a processor of the gateway and a controller of the security system, wherein the controller is coupled to the security system components.
  • The controlling of the plurality of components at the premises may include controlling interoperability among the plurality of components.
  • The method may include configuring the gateway using data of the plurality of components.
  • The method may include configuring at least one of the gateway and the plurality of remote devices to perform a synchronization to associate the plurality of remote devices with the plurality of components.
  • The plurality of remote devices may include applications that receive the data from and transmit control instructions to the plurality of components via the gateway.
  • The gateway may be coupled to the security system via a first network, and is coupled to the premises devices via a second network.
  • The method may include configuring the plurality of display elements to include a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.
  • The floor plan layout may visually and separately indicate a location and a state of the plurality of components, wherein the state includes current state and historical state.
  • The floor plan layout may include a three-dimensional representation of the floor plan.
  • Embodiments include a method comprising configuring an automation network to include a gateway at a premises. The gateway is coupled to a remote network. The method includes configuring the gateway to control a plurality of components at the premises including premises devices and a security system comprising security system components. The plurality of components includes at least one camera. The method includes configuring a sensor user interface (SUI) to include a plurality of display elements for managing and receiving data of the plurality of components agnostically across the plurality of remote client devices. The SUI is coupled to the gateway and presented to a user via a plurality of remote client devices. The plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • Embodiments include a method comprising: configuring an automation network to include a gateway at a premises, wherein the gateway is coupled to a remote network; configuring the gateway to control a plurality of components at the premises including premises devices and a security system comprising security system components, wherein the plurality of components includes at least one camera; configuring a sensor user interface (SUI) to include a plurality of display elements for managing and receiving data of the plurality of components agnostically across a plurality of remote client devices, wherein the SUI is coupled to the gateway and presented to a user via the plurality of remote client devices, wherein the plurality of display elements includes a timeline user interface comprising event data of the plurality of components positioned at a time corresponding to events.
  • The event data may comprise component state of the plurality of components presented in a timeline.
  • The event data of the plurality of components may include at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The method may include configuring the timeline user interface to include at least one icon corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • The method may include configuring the timeline user interface to include at least one thumbnail image corresponding to the at least one of live video, captured clips, captured pictures, and motion events of the at least one camera.
  • A captured clip may include continuous video recording for a period of time.
  • The method may include configuring the timeline user interface to depict an event captured in at least one of the live video, captured clips and captured pictures using icons.
  • The event data of the plurality of components may include live video of the at least one camera.
  • The event data of the plurality of components may include captured clips of the at least one camera.
  • The event data of the plurality of components may include captured pictures of the at least one camera.
  • The event data of the plurality of components may include motion events of the at least one camera.
  • The method may include configuring the timeline user interface to control navigation between live video, captured clips and captured pictures of the at least one camera.
  • A tap detected at a position on the timeline user interface may cause the timeline user interface to snap to and display one of a captured clip and captured picture nearest the position.
  • The method may include configuring the timeline user interface to display concurrent ones of the captured clip and captured picture nearest the position.
  • When continuous video recording is available at the position, the continuous video recording may be presented instead of concurrent ones of the captured clip and captured picture nearest the position.
  • The method may include a dedicated coupling between a processor of the gateway and a controller of the security system, wherein the controller is coupled to the security system components.
  • The controlling of the plurality of components at the premises may include controlling interoperability among the plurality of components.
  • The method may include configuring the gateway using data of the plurality of components.
  • The method may include configuring at least one of the gateway and the plurality of remote devices to perform a synchronization to associate the plurality of remote devices with the plurality of components.
  • The plurality of remote devices may include applications that receive the data from and transmit control instructions to the plurality of components via the gateway.
  • The gateway may be coupled to the security system via a first network, and is coupled to the premises devices via a second network.
  • The method may include configuring the plurality of display elements to include a representation of a floor plan layout of the premises, wherein the floor plan layout includes representations of the plurality of components.
  • The floor plan layout may visually and separately indicate a location and a state of the plurality of components, wherein the state includes current state and historical state.
  • The floor plan layout may include a three-dimensional representation of the floor plan.
  • As described above, computer networks suitable for use with the embodiments described herein include local area networks (LANs), wide area networks (WANs), the Internet, or other connection services and network variations such as the world wide web, the public internet, a private internet, a private computer network, a public network, a mobile network, a cellular network, a value-added network, and the like. Computing devices coupled or connected to the network may be any microprocessor-controlled device that permits access to the network, including terminal devices, such as personal computers, workstations, servers, mini computers, main-frame computers, laptop computers, mobile computers, palm top computers, hand held computers, mobile phones, TV set-top boxes, or combinations thereof. The computer network may include one or more LANs, WANs, Internets, and computers. The computers may serve as servers, clients, or a combination thereof.
  • The integrated security system can be a component of a single system, multiple systems, and/or geographically separate systems. The integrated security system can also be a subcomponent or subsystem of a single system, multiple systems, and/or geographically separate systems. The integrated security system can be coupled to one or more other components (not shown) of a host system or a system coupled to the host system.
  • One or more components of the integrated security system and/or a corresponding system or application to which the integrated security system is coupled or connected includes and/or runs under and/or in association with a processing system. The processing system includes any collection of processor-based devices or computing devices operating together, or components of processing systems or devices, as is known in the art. For example, the processing system can include one or more of a portable computer, portable communication device operating in a communication network, and/or a network server. The portable computer can be any of a number and/or combination of devices selected from among personal computers, personal digital assistants, portable computing devices, and portable communication devices, but is not so limited. The processing system can include components within a larger computer system.
  • The processing system of an embodiment includes at least one processor and at least one memory device or subsystem. The processing system can also include or be coupled to at least one database. The term "processor" as generally used herein refers to any logic processing unit, such as one or more central processing units (CPUs), digital signal processors (DSPs), application-specific integrated circuits (ASIC), etc. The processor and memory can be monolithically integrated onto a single chip, distributed among a number of chips or components, and/or provided by some combination of algorithms. The methods described herein can be implemented in one or more of software algorithm(s), programs, firmware, hardware, components, circuitry, in any combination.
  • The components of any system that includes the integrated security system can be located together or in separate locations. Communication paths couple the components and include any medium for communicating or transferring files among the components. The communication paths include wireless connections, wired connections, and hybrid wireless/wired connections. The communication paths also include couplings or connections to networks including local area networks (LANs), metropolitan area networks (MANs), wide area networks (WANs), proprietary networks, interoffice or backend networks, and the Internet. Furthermore, the communication paths include removable and fixed media such as floppy disks, hard disk drives, and CD-ROM disks, as well as flash RAM, Universal Serial Bus (USB) connections, RS-232 connections, telephone lines, buses, and electronic mail messages.
  • Aspects of the integrated security system and corresponding systems and methods described herein may be implemented as functionality programmed into any of a variety of circuitry, including programmable logic devices (PLDs), such as field programmable gate arrays (FPGAs), programmable array logic (PAL) devices, electrically programmable logic and memory devices and standard cell-based devices, as well as application specific integrated circuits (ASICs). Some other possibilities for implementing aspects of the integrated security system and corresponding systems and methods include: microcontrollers with memory (such as electronically erasable programmable read only memory (EEPROM)), embedded microprocessors, firmware, software, etc. Furthermore, aspects of the integrated security system and corresponding systems and methods may be embodied in microprocessors having software-based circuit emulation, discrete logic (sequential and combinatorial), custom devices, fuzzy (neural) logic, quantum devices, and hybrids of any of the above device types. Of course the underlying device technologies may be provided in a variety of component types, e.g., metal-oxide semiconductor field-effect transistor (MOSFET) technologies like complementary metal-oxide semiconductor (CMOS), bipolar technologies like emitter-coupled logic (ECL), polymer technologies (e.g., silicon-conjugated polymer and metal-conjugated polymer-metal structures), mixed analog and digital, etc.
  • It should be noted that any system, method, and/or other components disclosed herein may be described using computer aided design tools and expressed (or represented), as data and/or instructions embodied in various computer-readable media, in terms of their behavioral, register transfer, logic component, transistor, layout geometries, and/or other characteristics. Computer-readable media in which such formatted data and/or instructions may be embodied include, but are not limited to, nonvolatile storage media in various forms (e.g., optical, magnetic or semiconductor storage media) and carrier waves that may be used to transfer such formatted data and/or instructions through wireless, optical, or wired signaling media or any combination thereof. Examples of transfers of such formatted data and/or instructions by carrier waves include, but are not limited to, transfers (uploads, downloads, e-mail, etc.) over the Internet and/or other computer networks via one or more data transfer protocols (e.g., HTTP, FTP, SMTP, etc.). When received within a computer system via one or more computer-readable media, such data and/or instruction-based expressions of the above described components may be processed by a processing entity (e.g., one or more processors) within the computer system in conjunction with execution of one or more other computer programs.
  • Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense as opposed to an exclusive or exhaustive sense; that is to say, in a sense of "including, but not limited to." Words using the singular or plural number also include the plural or singular number, respectively. Additionally, the words "herein," "hereunder," "above," "below," and words of similar import, when used in this application, refer to this application as a whole and not to any particular portions of this application. When the word "or" is used in reference to a list of two or more items, that word covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
  • The above description of embodiments of the integrated security system and corresponding systems and methods is not intended to be exhaustive or to limit the systems and methods to the precise forms disclosed. While specific embodiments of, and examples for, the integrated security system and corresponding systems and methods are described herein for illustrative purposes, various equivalent modifications are possible within the scope of the systems and methods, as those skilled in the relevant art will recognize. The teachings of the integrated security system and corresponding systems and methods provided herein can be applied to other systems and methods, not only for the systems and methods described above.
  • The elements and acts of the various embodiments described above can be combined to provide further embodiments. These and other changes can be made to the integrated security system and corresponding systems and methods in light of the above detailed description.
  • Claims (15)

    1. A method comprising:
      receiving, from a camera of a system located at a premises, event data;
      displaying a user interface associated with the system, wherein the user interface comprises:
      a timeline comprising a time period indicator corresponding, at least in part, to a time period represented by the event data; and
      an icon at a location on the timeline, wherein the location is indicative of a time point within the time period, wherein the icon is associated with a portion of the event data, wherein the portion of the event data is selected based at least on the time point; and
      causing, responsive to an activation of the icon, the user interface to display the portion of the event data associated with the icon.
    2. The method of claim 1, wherein the event data comprises at least one of live video, a captured clip, a still image, and an indication of a motion event from the camera.
    3. The method of claim 1 or 2, wherein the event data comprises at least one of concurrent data or continuous data for the time period.
    4. The method of claim 1, 2, or 3, wherein the activation of the icon is based at least on a tapping or a dragging via the user interface.
    5. The method of one of claims 1-4, wherein the event data comprises an image and a video captured concurrently by the camera.
    6. The method of one of claims 1-5, wherein the method further comprises transmitting the event data to at least one remote device.
    7. The method of one of claims 1-6, wherein the timeline comprises a variable-length timeline.
    8. The method of one of claims 1-7, wherein the timeline comprises a scale, and wherein the scale can be dynamically changed.
    9. The method of one of claims 1-8, wherein the method further comprises displaying, via the user interface, an indication of movement detected by the camera.
    10. The method of one of claims 1-9, wherein the portion of the event data associated with the icon comprises an indication of an event associated with the time point.
    11. The method of claim 10, wherein the portion of the event data is selected based at least on the event.
    12. The method of one of claims 1-11, wherein the event data comprises portions of data, wherein each of the portions of data is associated with a respective time interval.
    13. The method of claim 12, wherein the portion of the event data associated with the icon comprises at least one of the portions of data, and wherein the at least one of the portions of data is selected based at least on the respective time interval of the at least one of the portions of data being near the time point associated with the icon on the timeline.
    14. An apparatus configured to implement the method of any one of claims 1-13.
    15. A device comprising:
      one or more processors; and
      memory storing instructions that, when executed by the one or more processors, cause the device to perform the method of any one of claims 1-13.
    EP17186497.8A — 2016-08-16 / 2017-08-16 — Automation system user interface — Ceased — EP3285238A3 (en)

    Applications Claiming Priority (1)

    Application Number | Priority Date | Filing Date | Title
    US15/237,873 (US20170185277A1) | 2008-08-11 | 2016-08-16 | Automation system user interface

    Publications (2)

    Publication Number | Publication Date
    EP3285238A2 (en) | 2018-02-21
    EP3285238A3 (en) | 2018-02-28

    Family

    ID=59799196

    Family Applications (1)

    Application Number | Status | Publication
    EP17186497.8A | Ceased | EP3285238A3 (en)

    Country Status (2)

    Country | Link
    EP (1) | EP3285238A3 (en)
    CA (1) | CA2976682A1 (en)

    Cited By (82)

    * Cited by examiner, † Cited by third party
    Publication numberPriority datePublication dateAssigneeTitle
    US20160274759A1 (en)2008-08-252016-09-22Paul J. DawesSecurity system with networked touchscreen and gateway
    US10127801B2 (en)2005-03-162018-11-13Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US10127802B2 (en)2010-09-282018-11-13Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US10142166B2 (en)2004-03-162018-11-27Icontrol Networks, Inc.Takeover of security network
    US10140840B2 (en)2007-04-232018-11-27Icontrol Networks, Inc.Method and system for providing alternate network access
    US10142392B2 (en)2007-01-242018-11-27Icontrol Networks, Inc.Methods and systems for improved system performance
    US10156831B2 (en)2004-03-162018-12-18Icontrol Networks, Inc.Automation system with mobile interface
    US10156959B2 (en)2005-03-162018-12-18Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    US10237237B2 (en)2007-06-122019-03-19Icontrol Networks, Inc.Communication protocols in integrated systems
    US10237806B2 (en)2009-04-302019-03-19Icontrol Networks, Inc.Activation of a home automation controller
    US10313303B2 (en)2007-06-122019-06-04Icontrol Networks, Inc.Forming a security network including integrated security system components and network devices
    US10348575B2 (en)2013-06-272019-07-09Icontrol Networks, Inc.Control system user interface
    US10365810B2 (en)2007-06-122019-07-30Icontrol Networks, Inc.Control system user interface
    US10389736B2 (en)2007-06-122019-08-20Icontrol Networks, Inc.Communication protocols in integrated systems
    US10423309B2 (en)2007-06-122019-09-24Icontrol Networks, Inc.Device integration framework
    US10447491B2 (en)2004-03-162019-10-15Icontrol Networks, Inc.Premises system management using status signal
    US10498830B2 (en)2007-06-122019-12-03Icontrol Networks, Inc.Wi-Fi-to-serial encapsulation in systems
    US10523689B2 (en)2007-06-122019-12-31Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US10522026B2 (en)2008-08-112019-12-31Icontrol Networks, Inc.Automation system user interface with three-dimensional display
    US10530839B2 (en)2008-08-112020-01-07Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US10559193B2 (en)2002-02-012020-02-11Comcast Cable Communications, LlcPremises management systems
    US10616244B2 (en)2006-06-122020-04-07Icontrol Networks, Inc.Activation of gateway device
    US10616075B2 (en)2007-06-122020-04-07Icontrol Networks, Inc.Communication protocols in integrated systems
    CN111080170A (en)*2019-12-302020-04-28北京云享智胜科技有限公司Workflow modeling method and device, electronic equipment and storage medium
    US10666523B2 (en)2007-06-122020-05-26Icontrol Networks, Inc.Communication protocols in integrated systems
    US10692356B2 (en)2004-03-162020-06-23Icontrol Networks, Inc.Control system user interface
    US10721087B2 (en)2005-03-162020-07-21Icontrol Networks, Inc.Method for networked touchscreen with integrated interfaces
    US10741057B2 (en)2010-12-172020-08-11Icontrol Networks, Inc.Method and system for processing security event data
    US10747216B2 (en)2007-02-282020-08-18Icontrol Networks, Inc.Method and system for communicating with and controlling an alarm system from a remote server
    US10785319B2 (en)2006-06-122020-09-22Icontrol Networks, Inc.IP device discovery systems and methods
    US10841381B2 (en)2005-03-162020-11-17Icontrol Networks, Inc.Security system with networked touchscreen
    US10930136B2 (en)2005-03-162021-02-23Icontrol Networks, Inc.Premise management systems and methods
    US10979389B2 (en)2004-03-162021-04-13Icontrol Networks, Inc.Premises management configuration and control
    US10992784B2 (en)2004-03-162021-04-27Control Networks, Inc.Communication protocols over internet protocol (IP) networks
    US10999254B2 (en)2005-03-162021-05-04Icontrol Networks, Inc.System for data routing in networks
    US11146637B2 (en)2014-03-032021-10-12Icontrol Networks, Inc.Media content management
    US11153266B2 (en)2004-03-162021-10-19Icontrol Networks, Inc.Gateway registry methods and systems
    US11182060B2 (en)2004-03-162021-11-23Icontrol Networks, Inc.Networked touchscreen with integrated interfaces
    US11201755B2 (en)2004-03-162021-12-14Icontrol Networks, Inc.Premises system management using status signal
    US11212192B2 (en)2007-06-122021-12-28Icontrol Networks, Inc.Communication protocols in integrated systems
    US11218878B2 (en)2007-06-122022-01-04Icontrol Networks, Inc.Communication protocols in integrated systems
    US11240059B2 (en)2010-12-202022-02-01Icontrol Networks, Inc.Defining and implementing sensor triggered response rules
    US11237714B2 (en)2007-06-122022-02-01Icontrol Networks, Inc.Control system user interface
    US11244545B2 (en)2004-03-162022-02-08Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    US11258625B2 (en)2008-08-112022-02-22Icontrol Networks, Inc.Mobile premises automation platform
    US11277465B2 (en)2004-03-162022-03-15Icontrol Networks, Inc.Generating risk profile using data of home monitoring and security system
    US11310199B2 (en)2004-03-162022-04-19Icontrol Networks, Inc.Premises management configuration and control
    US11316958B2 (en)2008-08-112022-04-26Icontrol Networks, Inc.Virtual device systems and methods
    US11316753B2 (en)2007-06-122022-04-26Icontrol Networks, Inc.Communication protocols in integrated systems
    US11343380B2 (en)2004-03-162022-05-24Icontrol Networks, Inc.Premises system automation
    US11368327B2 (en)2008-08-112022-06-21Icontrol Networks, Inc.Integrated cloud system for premises automation
    US11398147B2 (en)2010-09-282022-07-26Icontrol Networks, Inc.Method, system and apparatus for automated reporting of account and sensor zone information to a central station
    US11405463B2 (en)2014-03-032022-08-02Icontrol Networks, Inc.Media content management
    US11423756B2 (en)2007-06-122022-08-23Icontrol Networks, Inc.Communication protocols in integrated systems
    US11424980B2 (en)2005-03-162022-08-23Icontrol Networks, Inc.Forming a security network including integrated security system components
    US11451409B2 (en)2005-03-162022-09-20Icontrol Networks, Inc.Security network integrating security system and network devices
    US11489812B2 (en)2004-03-162022-11-01Icontrol Networks, Inc.Forming a security network including integrated security system components and network devices
    US11496568B2 (en)2005-03-162022-11-08Icontrol Networks, Inc.Security system with networked touchscreen
    US11582065B2 (en)2007-06-122023-02-14Icontrol Networks, Inc.Systems and methods for device communication
    US11601810B2 (en)2007-06-122023-03-07Icontrol Networks, Inc.Communication protocols in integrated systems
    US11615697B2 (en)2005-03-162023-03-28Icontrol Networks, Inc.Premise management systems and methods
    US11646907B2 (en)2007-06-122023-05-09Icontrol Networks, Inc.Communication protocols in integrated systems
    US11677577B2 (en)2004-03-162023-06-13Icontrol Networks, Inc.Premises system management using status signal
    US11700142B2 (en)2005-03-162023-07-11Icontrol Networks, Inc.Security network integrating security system and network devices
    US11706279B2 (en)2007-01-242023-07-18Icontrol Networks, Inc.Methods and systems for data communication
    US11706045B2 (en)2005-03-162023-07-18Icontrol Networks, Inc.Modular electronic display platform
    US11729255B2 (en)2008-08-112023-08-15Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11758026B2 (en)2008-08-112023-09-12Icontrol Networks, Inc.Virtual device systems and methods
    WO2023158854A3 (en)*2022-02-212023-09-28Rf Code, Inc.System, apparatus, and method for monitoring edge compute sites
    US11792036B2 (en)2008-08-112023-10-17Icontrol Networks, Inc.Mobile premises automation platform
    US11811845B2 (en)2004-03-162023-11-07Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US11816323B2 (en)2008-06-252023-11-14Icontrol Networks, Inc.Automation system user interface
    US11831462B2 (en)2007-08-242023-11-28Icontrol Networks, Inc.Controlling data routing in premises management systems
    US11916870B2 (en)2004-03-162024-02-27Icontrol Networks, Inc.Gateway registry methods and systems
    US11916928B2 (en)2008-01-242024-02-27Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    CN118018716A (en)*2024-04-102024-05-10睿云联(厦门)网络通讯技术有限公司Cross-platform building intercom audio and video test method, device and medium
    US12003387B2 (en)2012-06-272024-06-04Comcast Cable Communications, LlcControl system user interface
    US12063220B2 (en)2004-03-162024-08-13Icontrol Networks, Inc.Communication protocols in integrated systems
    US12063221B2 (en)2006-06-122024-08-13Icontrol Networks, Inc.Activation of gateway device
    US12088425B2 (en)2010-12-162024-09-10Icontrol Networks, Inc.Bidirectional security sensor communication for a premises security system
    US12184443B2 (en)2007-06-122024-12-31Icontrol Networks, Inc.Controlling data routing among networks
    US12283172B2 (en)2007-06-122025-04-22Icontrol Networks, Inc.Communication protocols in integrated systems

    Families Citing this family (1)

    * Cited by examiner, † Cited by third party
    Publication number | Priority date | Publication date | Assignee | Title
    US11089122B2 (en)2007-06-122021-08-10Icontrol Networks, Inc.Controlling data routing among networks

    Family Cites Families (5)

    * Cited by examiner, † Cited by third party
    Publication number | Priority date | Publication date | Assignee | Title
    US7996771B2 (en)*2005-06-172011-08-09Fuji Xerox Co., Ltd.Methods and interfaces for event timeline and logs of video streams
    US20070257986A1 (en)*2006-05-052007-11-08Ivanov Yuri AMethod for processing queries for surveillance tasks
    WO2010019624A1 (en)*2008-08-112010-02-18Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    KR102092316B1 (en)*2013-10-142020-03-23한화테크윈 주식회사Method for monitoring
    US9544636B2 (en)*2014-07-072017-01-10Google Inc.Method and system for editing event categories

    Non-Patent Citations (1)

    * Cited by examiner, † Cited by third party
    Title
    None

    Cited By (169)

    * Cited by examiner, † Cited by third party
    Publication number | Priority date | Publication date | Assignee | Title
    US10559193B2 (en)2002-02-012020-02-11Comcast Cable Communications, LlcPremises management systems
    US11182060B2 (en)2004-03-162021-11-23Icontrol Networks, Inc.Networked touchscreen with integrated interfaces
    US12253833B2 (en)2004-03-162025-03-18Icontrol Networks, Inc.Automation system with mobile interface
    US10142166B2 (en)2004-03-162018-11-27Icontrol Networks, Inc.Takeover of security network
    US11368429B2 (en)2004-03-162022-06-21Icontrol Networks, Inc.Premises management configuration and control
    US11916870B2 (en)2004-03-162024-02-27Icontrol Networks, Inc.Gateway registry methods and systems
    US10156831B2 (en)2004-03-162018-12-18Icontrol Networks, Inc.Automation system with mobile interface
    US11343380B2 (en)2004-03-162022-05-24Icontrol Networks, Inc.Premises system automation
    US11378922B2 (en)2004-03-162022-07-05Icontrol Networks, Inc.Automation system with mobile interface
    US11782394B2 (en)2004-03-162023-10-10Icontrol Networks, Inc.Automation system with mobile interface
    US11893874B2 (en)2004-03-162024-02-06Icontrol Networks, Inc.Networked touchscreen with integrated interfaces
    US11757834B2 (en)2004-03-162023-09-12Icontrol Networks, Inc.Communication protocols in integrated systems
    US11310199B2 (en)2004-03-162022-04-19Icontrol Networks, Inc.Premises management configuration and control
    US11410531B2 (en)2004-03-162022-08-09Icontrol Networks, Inc.Automation system user interface with three-dimensional display
    US11277465B2 (en)2004-03-162022-03-15Icontrol Networks, Inc.Generating risk profile using data of home monitoring and security system
    US11244545B2 (en)2004-03-162022-02-08Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    US11677577B2 (en)2004-03-162023-06-13Icontrol Networks, Inc.Premises system management using status signal
    US11991306B2 (en)2004-03-162024-05-21Icontrol Networks, Inc.Premises system automation
    US11656667B2 (en)2004-03-162023-05-23Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US11626006B2 (en)2004-03-162023-04-11Icontrol Networks, Inc.Management of a security system at a premises
    US10447491B2 (en)2004-03-162019-10-15Icontrol Networks, Inc.Premises system management using status signal
    US11625008B2 (en)2004-03-162023-04-11Icontrol Networks, Inc.Premises management networking
    US12063220B2 (en)2004-03-162024-08-13Icontrol Networks, Inc.Communication protocols in integrated systems
    US11601397B2 (en)2004-03-162023-03-07Icontrol Networks, Inc.Premises management configuration and control
    US11201755B2 (en)2004-03-162021-12-14Icontrol Networks, Inc.Premises system management using status signal
    US11184322B2 (en)2004-03-162021-11-23Icontrol Networks, Inc.Communication protocols in integrated systems
    US11175793B2 (en)2004-03-162021-11-16Icontrol Networks, Inc.User interface in a premises network
    US11159484B2 (en)2004-03-162021-10-26Icontrol Networks, Inc.Forming a security network including integrated security system components and network devices
    US11588787B2 (en)2004-03-162023-02-21Icontrol Networks, Inc.Premises management configuration and control
    US10754304B2 (en)2004-03-162020-08-25Icontrol Networks, Inc.Automation system with mobile interface
    US11153266B2 (en)2004-03-162021-10-19Icontrol Networks, Inc.Gateway registry methods and systems
    US11537186B2 (en)2004-03-162022-12-27Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US11082395B2 (en)2004-03-162021-08-03Icontrol Networks, Inc.Premises management configuration and control
    US11043112B2 (en)2004-03-162021-06-22Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US10691295B2 (en)2004-03-162020-06-23Icontrol Networks, Inc.User interface in a premises network
    US10692356B2 (en)2004-03-162020-06-23Icontrol Networks, Inc.Control system user interface
    US11037433B2 (en)2004-03-162021-06-15Icontrol Networks, Inc.Management of a security system at a premises
    US10735249B2 (en)2004-03-162020-08-04Icontrol Networks, Inc.Management of a security system at a premises
    US11489812B2 (en)2004-03-162022-11-01Icontrol Networks, Inc.Forming a security network including integrated security system components and network devices
    US11810445B2 (en)2004-03-162023-11-07Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    US10992784B2 (en)2004-03-162021-04-27Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US11811845B2 (en)2004-03-162023-11-07Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US10796557B2 (en)2004-03-162020-10-06Icontrol Networks, Inc.Automation system user interface with three-dimensional display
    US11449012B2 (en)2004-03-162022-09-20Icontrol Networks, Inc.Premises management networking
    US10979389B2 (en)2004-03-162021-04-13Icontrol Networks, Inc.Premises management configuration and control
    US10890881B2 (en)2004-03-162021-01-12Icontrol Networks, Inc.Premises management networking
    US10127801B2 (en)2005-03-162018-11-13Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US11706045B2 (en)2005-03-162023-07-18Icontrol Networks, Inc.Modular electronic display platform
    US12277853B2 (en)2005-03-162025-04-15Icontrol Networks, Inc.Gateway integrated with premises security system
    US10999254B2 (en)2005-03-162021-05-04Icontrol Networks, Inc.System for data routing in networks
    US10721087B2 (en)2005-03-162020-07-21Icontrol Networks, Inc.Method for networked touchscreen with integrated interfaces
    US10156959B2 (en)2005-03-162018-12-18Icontrol Networks, Inc.Cross-client sensor user interface in an integrated security network
    US11496568B2 (en)2005-03-162022-11-08Icontrol Networks, Inc.Security system with networked touchscreen
    US11424980B2 (en)2005-03-162022-08-23Icontrol Networks, Inc.Forming a security network including integrated security system components
    US11700142B2 (en)2005-03-162023-07-11Icontrol Networks, Inc.Security network integrating security system and network devices
    US11367340B2 (en)2005-03-162022-06-21Icontrol Networks, Inc.Premise management systems and methods
    US11595364B2 (en)2005-03-162023-02-28Icontrol Networks, Inc.System for data routing in networks
    US11824675B2 (en)2005-03-162023-11-21Icontrol Networks, Inc.Networked touchscreen with integrated interfaces
    US10841381B2 (en)2005-03-162020-11-17Icontrol Networks, Inc.Security system with networked touchscreen
    US11615697B2 (en)2005-03-162023-03-28Icontrol Networks, Inc.Premise management systems and methods
    US11451409B2 (en)2005-03-162022-09-20Icontrol Networks, Inc.Security network integrating security system and network devices
    US10930136B2 (en)2005-03-162021-02-23Icontrol Networks, Inc.Premise management systems and methods
    US10785319B2 (en)2006-06-122020-09-22Icontrol Networks, Inc.IP device discovery systems and methods
    US10616244B2 (en)2006-06-122020-04-07Icontrol Networks, Inc.Activation of gateway device
    US12063221B2 (en)2006-06-122024-08-13Icontrol Networks, Inc.Activation of gateway device
    US11418518B2 (en)2006-06-122022-08-16Icontrol Networks, Inc.Activation of gateway device
    US12120171B2 (en)2007-01-242024-10-15Icontrol Networks, Inc.Methods and systems for data communication
    US11418572B2 (en)2007-01-242022-08-16Icontrol Networks, Inc.Methods and systems for improved system performance
    US11412027B2 (en)2007-01-242022-08-09Icontrol Networks, Inc.Methods and systems for data communication
    US10225314B2 (en)2007-01-242019-03-05Icontrol Networks, Inc.Methods and systems for improved system performance
    US11706279B2 (en)2007-01-242023-07-18Icontrol Networks, Inc.Methods and systems for data communication
    US10142392B2 (en)2007-01-242018-11-27Icontrol Networks, Inc.Methods and systems for improved system performance
    US11194320B2 (en)2007-02-282021-12-07Icontrol Networks, Inc.Method and system for managing communication connectivity
    US10657794B1 (en)2007-02-282020-05-19Icontrol Networks, Inc.Security, monitoring and automation controller access and use of legacy security control panel information
    US11809174B2 (en)2007-02-282023-11-07Icontrol Networks, Inc.Method and system for managing communication connectivity
    US10747216B2 (en)2007-02-282020-08-18Icontrol Networks, Inc.Method and system for communicating with and controlling an alarm system from a remote server
    US11663902B2 (en)2007-04-232023-05-30Icontrol Networks, Inc.Method and system for providing alternate network access
    US11132888B2 (en)2007-04-232021-09-28Icontrol Networks, Inc.Method and system for providing alternate network access
    US10672254B2 (en)2007-04-232020-06-02Icontrol Networks, Inc.Method and system for providing alternate network access
    US10140840B2 (en)2007-04-232018-11-27Icontrol Networks, Inc.Method and system for providing alternate network access
    US11212192B2 (en)2007-06-122021-12-28Icontrol Networks, Inc.Communication protocols in integrated systems
    US10666523B2 (en)2007-06-122020-05-26Icontrol Networks, Inc.Communication protocols in integrated systems
    US11646907B2 (en)2007-06-122023-05-09Icontrol Networks, Inc.Communication protocols in integrated systems
    US11894986B2 (en)2007-06-122024-02-06Icontrol Networks, Inc.Communication protocols in integrated systems
    US11316753B2 (en)2007-06-122022-04-26Icontrol Networks, Inc.Communication protocols in integrated systems
    US11632308B2 (en)2007-06-122023-04-18Icontrol Networks, Inc.Communication protocols in integrated systems
    US10237237B2 (en)2007-06-122019-03-19Icontrol Networks, Inc.Communication protocols in integrated systems
    US12283172B2 (en)2007-06-122025-04-22Icontrol Networks, Inc.Communication protocols in integrated systems
    US11237714B2 (en)2007-06-122022-02-01Icontrol Networks, Inc.Control system user interface
    US11423756B2 (en)2007-06-122022-08-23Icontrol Networks, Inc.Communication protocols in integrated systems
    US11218878B2 (en)2007-06-122022-01-04Icontrol Networks, Inc.Communication protocols in integrated systems
    US10423309B2 (en)2007-06-122019-09-24Icontrol Networks, Inc.Device integration framework
    US12184443B2 (en)2007-06-122024-12-31Icontrol Networks, Inc.Controlling data routing among networks
    US11722896B2 (en)2007-06-122023-08-08Icontrol Networks, Inc.Communication protocols in integrated systems
    US12250547B2 (en)2007-06-122025-03-11Icontrol Networks, Inc.Communication protocols in integrated systems
    US10389736B2 (en)2007-06-122019-08-20Icontrol Networks, Inc.Communication protocols in integrated systems
    US11625161B2 (en)2007-06-122023-04-11Icontrol Networks, Inc.Control system user interface
    US11582065B2 (en)2007-06-122023-02-14Icontrol Networks, Inc.Systems and methods for device communication
    US10616075B2 (en)2007-06-122020-04-07Icontrol Networks, Inc.Communication protocols in integrated systems
    US12284057B2 (en)2007-06-122025-04-22Icontrol Networks, Inc.Systems and methods for device communication
    US10313303B2 (en)2007-06-122019-06-04Icontrol Networks, Inc.Forming a security network including integrated security system components and network devices
    US10523689B2 (en)2007-06-122019-12-31Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US11601810B2 (en)2007-06-122023-03-07Icontrol Networks, Inc.Communication protocols in integrated systems
    US11611568B2 (en)2007-06-122023-03-21Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US10498830B2 (en)2007-06-122019-12-03Icontrol Networks, Inc.Wi-Fi-to-serial encapsulation in systems
    US10365810B2 (en)2007-06-122019-07-30Icontrol Networks, Inc.Control system user interface
    US10444964B2 (en)2007-06-122019-10-15Icontrol Networks, Inc.Control system user interface
    US11815969B2 (en)2007-08-102023-11-14Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US11831462B2 (en)2007-08-242023-11-28Icontrol Networks, Inc.Controlling data routing in premises management systems
    US12301379B2 (en)2007-08-242025-05-13Icontrol Networks, Inc.Controlling data routing in premises management systems
    US11916928B2 (en)2008-01-242024-02-27Icontrol Networks, Inc.Communication protocols over internet protocol (IP) networks
    US11816323B2 (en)2008-06-252023-11-14Icontrol Networks, Inc.Automation system user interface
    US12244663B2 (en)2008-08-112025-03-04Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11729255B2 (en)2008-08-112023-08-15Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11641391B2 (en)2008-08-112023-05-02Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11616659B2 (en)2008-08-112023-03-28Icontrol Networks, Inc.Integrated cloud system for premises automation
    US12267385B2 (en)2008-08-112025-04-01Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US10522026B2 (en)2008-08-112019-12-31Icontrol Networks, Inc.Automation system user interface with three-dimensional display
    US10530839B2 (en)2008-08-112020-01-07Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11711234B2 (en)2008-08-112023-07-25Icontrol Networks, Inc.Integrated cloud system for premises automation
    US11368327B2 (en)2008-08-112022-06-21Icontrol Networks, Inc.Integrated cloud system for premises automation
    US12341865B2 (en)2008-08-112025-06-24Icontrol Networks, Inc.Virtual device systems and methods
    US11190578B2 (en)2008-08-112021-11-30Icontrol Networks, Inc.Integrated cloud system with lightweight gateway for premises automation
    US11758026B2 (en)2008-08-112023-09-12Icontrol Networks, Inc.Virtual device systems and methods
    US11258625B2 (en)2008-08-112022-02-22Icontrol Networks, Inc.Mobile premises automation platform
    US11962672B2 (en)2008-08-112024-04-16Icontrol Networks, Inc.Virtual device systems and methods
    US11792036B2 (en)2008-08-112023-10-17Icontrol Networks, Inc.Mobile premises automation platform
    US11316958B2 (en)2008-08-112022-04-26Icontrol Networks, Inc.Virtual device systems and methods
    US20160274759A1 (en)2008-08-252016-09-22Paul J. DawesSecurity system with networked touchscreen and gateway
    US10375253B2 (en)2008-08-252019-08-06Icontrol Networks, Inc.Security system with networked touchscreen and gateway
    US11778534B2 (en)2009-04-302023-10-03Icontrol Networks, Inc.Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces
    US10813034B2 (en)2009-04-302020-10-20Icontrol Networks, Inc.Method, system and apparatus for management of applications for an SMA controller
    US10237806B2 (en)2009-04-302019-03-19Icontrol Networks, Inc.Activation of a home automation controller
    US11665617B2 (en)2009-04-302023-05-30Icontrol Networks, Inc.Server-based notification of alarm event subsequent to communication failure with armed security system
    US11601865B2 (en)2009-04-302023-03-07Icontrol Networks, Inc.Server-based notification of alarm event subsequent to communication failure with armed security system
    US11553399B2 (en)2009-04-302023-01-10Icontrol Networks, Inc.Custom content for premises management
    US11856502B2 (en)2009-04-302023-12-26Icontrol Networks, Inc.Method, system and apparatus for automated inventory reporting of security, monitoring and automation hardware and software at customer premises
    US10275999B2 (en)2009-04-302019-04-30Icontrol Networks, Inc.Server-based notification of alarm event subsequent to communication failure with armed security system
    US10332363B2 (en)2009-04-302019-06-25Icontrol Networks, Inc.Controller and interface for home security, monitoring and automation having customizable audio alerts for SMA events
    US10674428B2 (en)2009-04-302020-06-02Icontrol Networks, Inc.Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces
    US11356926B2 (en)2009-04-302022-06-07Icontrol Networks, Inc.Hardware configurable security, monitoring and automation controller having modular communication protocol interfaces
    US12245131B2 (en)2009-04-302025-03-04Icontrol Networks, Inc.Security, monitoring and automation controller access and use of legacy security control panel information
    US12127095B2 (en)2009-04-302024-10-22Icontrol Networks, Inc.Custom content for premises management
    US11129084B2 (en)2009-04-302021-09-21Icontrol Networks, Inc.Notification of event subsequent to communication failure with security system
    US11223998B2 (en)2009-04-302022-01-11Icontrol Networks, Inc.Security, monitoring and automation controller access and use of legacy security control panel information
    US11997584B2 (en)2009-04-302024-05-28Icontrol Networks, Inc.Activation of a home automation controller
    US10127802B2 (en)2010-09-282018-11-13Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US10223903B2 (en)2010-09-282019-03-05Icontrol Networks, Inc.Integrated security system with parallel processing architecture
    US11398147B2 (en)2010-09-282022-07-26Icontrol Networks, Inc.Method, system and apparatus for automated reporting of account and sensor zone information to a central station
    US11900790B2 (en)2010-09-282024-02-13Icontrol Networks, Inc.Method, system and apparatus for automated reporting of account and sensor zone information to a central station
    US12088425B2 (en)2010-12-162024-09-10Icontrol Networks, Inc.Bidirectional security sensor communication for a premises security system
    US11341840B2 (en)2010-12-172022-05-24Icontrol Networks, Inc.Method and system for processing security event data
    US12100287B2 (en)2010-12-172024-09-24Icontrol Networks, Inc.Method and system for processing security event data
    US10741057B2 (en)2010-12-172020-08-11Icontrol Networks, Inc.Method and system for processing security event data
    US12021649B2 (en)2010-12-202024-06-25Icontrol Networks, Inc.Defining and implementing sensor triggered response rules
    US11240059B2 (en)2010-12-202022-02-01Icontrol Networks, Inc.Defining and implementing sensor triggered response rules
    US12003387B2 (en)2012-06-272024-06-04Comcast Cable Communications, LlcControl system user interface
    US11296950B2 (en)2013-06-272022-04-05Icontrol Networks, Inc.Control system user interface
    US10348575B2 (en)2013-06-272019-07-09Icontrol Networks, Inc.Control system user interface
    US11943301B2 (en)2014-03-032024-03-26Icontrol Networks, Inc.Media content management
    US11146637B2 (en)2014-03-032021-10-12Icontrol Networks, Inc.Media content management
    US11405463B2 (en)2014-03-032022-08-02Icontrol Networks, Inc.Media content management
    CN111080170A (en)*2019-12-302020-04-28北京云享智胜科技有限公司Workflow modeling method and device, electronic equipment and storage medium
    CN111080170B (en)*2019-12-302023-09-05北京云享智胜科技有限公司Workflow modeling method and device, electronic equipment and storage medium
    WO2023158854A3 (en)*2022-02-212023-09-28Rf Code, Inc.System, apparatus, and method for monitoring edge compute sites
    US11922698B2 (en)2022-02-212024-03-05Rf Code, Inc.System, apparatus, and method for monitoring edge compute sites
    US12080070B2 (en)2022-02-212024-09-03Rf Code, Inc.System, apparatus, and method for monitoring edge compute sites
    CN118018716B (en)*2024-04-102024-05-31睿云联(厦门)网络通讯技术有限公司Cross-platform building intercom audio and video test method, device and medium
    CN118018716A (en)*2024-04-102024-05-10睿云联(厦门)网络通讯技术有限公司Cross-platform building intercom audio and video test method, device and medium

    Also Published As

    Publication number | Publication date
    CA2976682A1 (en)2018-02-16
    EP3285238A3 (en)2018-02-28

    Similar Documents

    Publication | Publication Date | Title
    US11816323B2 (en)Automation system user interface
    US11410531B2 (en)Automation system user interface with three-dimensional display
    US11625161B2 (en)Control system user interface
    EP3285238A2 (en)Automation system user interface
    US11296950B2 (en)Control system user interface
    US20170185277A1 (en)Automation system user interface
    US10692356B2 (en)Control system user interface
    US10365810B2 (en)Control system user interface
    CA2878117C (en)Control system user interface
    CA2976802A1 (en)Automation system user interface
    US12003387B2 (en)Control system user interface

    Legal Events

    Date | Code | Title | Description
    PUAIPublic reference made under article 153(3) epc to a published international application that has entered the european phase

    Free format text:ORIGINAL CODE: 0009012

    STAAInformation on the status of an ep patent application or granted ep patent

    Free format text:STATUS: THE APPLICATION HAS BEEN PUBLISHED

    PUALSearch report despatched

    Free format text:ORIGINAL CODE: 0009013

    AKDesignated contracting states

    Kind code of ref document:A2

    Designated state(s):AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

    AXRequest for extension of the european patent

    Extension state:BA ME

    AKDesignated contracting states

    Kind code of ref document:A3

    Designated state(s):AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

    AXRequest for extension of the european patent

    Extension state:BA ME

    RIC1Information provided on ipc code assigned before grant

    Ipc:G08B 13/196 20060101AFI20180122BHEP

    STAAInformation on the status of an ep patent application or granted ep patent

    Free format text:STATUS: REQUEST FOR EXAMINATION WAS MADE

    STAAInformation on the status of an ep patent application or granted ep patent

    Free format text:STATUS: EXAMINATION IS IN PROGRESS

    17PRequest for examination filed

    Effective date:20180828

    RBVDesignated contracting states (corrected)

    Designated state(s):AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

    17QFirst examination report despatched

    Effective date:20180919

    STAAInformation on the status of an ep patent application or granted ep patent

    Free format text:STATUS: THE APPLICATION HAS BEEN REFUSED

    18RApplication refused

    Effective date:20211214
