TECHNICAL FIELD

The present invention relates generally to network analyzers, and more particularly to an augmented reality/virtual reality platform for a network analyzer.
BACKGROUND

Network analyzers are instruments that measure the network parameters of networks. Network analyzers help network designers and administrators optimize the design of the network, detect system and device issues, improve system performance and monitor the system for security threats. For example, the network analyzer may be used to monitor the network to protect against malicious activity as well as to monitor the performance of the components of the network so that the network can be designed for optimum performance. For instance, a network analyzer may be used to measure the amplitude and phase properties of a network.
Currently, network analyzers only provide a two-dimensional view of the critical monitored network data. That is, currently, network analyzers present the monitored data on a two-dimensional screen of a computer device, such as a laptop or desktop computer.
As a result, the understanding of and interaction with the presented data are limited.
SUMMARY

In one embodiment of the present invention, a method for overlaying network data on a physical space and/or network devices comprises receiving network data captured from a network device. The method further comprises applying rules to the captured network data to determine network data to be visualized by a user. The method additionally comprises selecting network data in the captured network data to be visualized by the user based on applying the rules. Furthermore, the method comprises enhancing the selected network data with additional information. Additionally, the method comprises adapting speed of transmission and/or volume of the enhanced network data according to human brain frame rate and user preferences to generate processed network data. In addition, the method comprises creating holograms of the processed network data for visualization at an augmented reality/virtual reality device, where the augmented reality/virtual reality device overlays the holograms on top of the physical space encompassing the network device and/or the network device.
Other forms of the embodiment of the method described above are in a system and in a computer program product.
In another embodiment of the present invention, a method for executing commands on a network device using holograms comprises receiving a command to be performed on a network device from an augmented reality/virtual reality device, where the augmented reality/virtual reality device identifies a gesture performed by a user in a hologram and translates the identified gesture into the command, and where the hologram overlays a physical space encompassing the network device and/or the network device. The method further comprises receiving an identifier of the hologram from the augmented reality/virtual reality device. The method additionally comprises identifying the network device associated with the hologram using the identifier of the hologram. Furthermore, the method comprises issuing the command to the identified network device to be executed by the identified network device.
Other forms of the embodiment of the method described above are in a system and in a computer program product.
The foregoing has outlined rather generally the features and technical advantages of one or more embodiments of the present invention in order that the detailed description of the present invention that follows may be better understood. Additional features and advantages of the present invention will be described hereinafter which may form the subject of the claims of the present invention.
BRIEF DESCRIPTION OF THE DRAWINGS

A better understanding of the present invention can be obtained when the following detailed description is considered in conjunction with the following drawings, in which:
FIG. 1 illustrates an embodiment of the AR/VR network analyzer system in accordance with an embodiment of the present invention;
FIG. 2 illustrates a hardware configuration of a computing device which is representative of a hardware environment for practicing the present invention;
FIG. 3 is a flowchart of a method for overlaying network data on top of a physical space and/or network devices in accordance with an embodiment of the present invention; and
FIG. 4 is a flowchart of a method for executing commands on network devices by interacting with the holograms overlaying the network devices and/or the physical space encompassing the network devices in accordance with an embodiment of the present invention.
DETAILED DESCRIPTION

In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention. However, it will be apparent to those skilled in the art that the present invention may be practiced without such specific details. In other instances, well-known circuits have been shown in block diagram form in order not to obscure the present invention in unnecessary detail. For the most part, details concerning timing considerations and the like have been omitted inasmuch as such details are not necessary to obtain a complete understanding of the present invention and are within the skills of persons of ordinary skill in the relevant art.
While the following discusses the present invention in connection with utilizing an augmented reality/virtual reality platform for network analyzers, the principles of the present invention may utilize other types of reality platforms, such as a mixed reality platform. A person of ordinary skill in the art would be capable of applying the principles of the present invention to such implementations. Further, embodiments applying the principles of the present invention to such implementations would fall within the scope of the present invention.
The present invention presents network information to users leveraging augmented reality/virtual reality technology to allow users of network analyzers to visualize the data in the context of the physical network they are managing, such as via a hologram. Using an augmented reality/virtual reality device, network administrators can see data in real-time, overlaid onto network devices, adding richer content and deeper understanding by enhancing situational awareness. Additionally, the present invention uses a fourth (4th) dimension of time thereby allowing the user to visualize, replay, and interact with the time sequence of network packets and data transmission. Data presented in this way would improve security threat detection as well as improve understanding of overall network health and assist in network management.
The present invention creates a state-of-the-art technology that integrates networking equipment with augmented reality/virtual reality (“AR/VR”) software to create a “four-dimensional” (“4D”) experience of interacting with the network in the close vicinity of the user, using a “man-in-the-loop” concept. The 4D experience will provide real-time spatiotemporal visualization and allow the user to participate and interact with the network in a highly intuitive fashion. The three-dimensional (“3D”) spatial characteristic will enable the user to distinguish directional packets, flows, and transmissions between network-resident devices, such as routers and switches. The additional 4th dimension (time) will enable the user to visualize, store/replay, and interact directly with the time sequence of packets, flows, transmissions, and other network contexts.
In one embodiment, the present invention overlays the network related data on top of the network devices (e.g., servers, switches, routers, modems, etc.) in the physical world, such as via holograms. In one embodiment, the network data (e.g., sensor data, protocol used for transmission (e.g., Zigbee, TCP/IP, HTTP, etc.), packet format, frame structure) may be user-specific. For example, the user of the system selects the type of data to be visualized as holograms on top of the network devices. In certain instances, the application of the system of the present invention can be considered as a “network-based situational awareness” application directed at threat management.
In one embodiment, the system of the present invention includes a network sniffer engine that is responsible for capturing network related data from the network devices. The system of the present invention may also include a rules engine that is responsible for applying various rules, which may be defined and stored by the user in a rules database, on the captured network data.
In one embodiment, the system of the present invention also includes an enhancement engine that is responsible for integrating data from external databases and/or systems, or for inserting identifiers into the data for better hologram-to-data association. For example, the Internet Protocol (IP) address in the network data may be correlated with the physical geographical coordinates of the corresponding server.
In one embodiment, the system of the present invention also includes a time translation engine that is responsible for modifying and adapting the high speed of network data transfer to human-centric frame rates that can be processed by the brain. For example, a typical human brain is able to visualize and understand up to 120 frames per second. The time translation engine is also capable of performing store/replay of data and allowing an interaction directly with the time sequence of packets, flows, transmissions, and other network contexts.
Furthermore, in one embodiment, the system of the present invention also includes a holographic engine that is responsible for converting the raw data into holograms.
Additionally, in one embodiment, the system of the present invention includes an augmented engine that is responsible for performing the spatial mapping and overlaying the holograms on the physical space. In one embodiment, such functionality is performed by the use of commodity AR/VR devices.
Referring now to the Figures in detail, FIG. 1 illustrates an embodiment of the augmented reality/virtual reality (“AR/VR”) network analyzer system 100 in accordance with an embodiment of the present invention. As shown in FIG. 1, system 100 includes a network device 101, a network sniffer 102, a communication network 103 (identified as “Network 2” in FIG. 1), a transformation engine 104, a communication network 105 (identified as “Network 3” in FIG. 1) and an AR/VR device 106. Network device 101 is communicatively connected with network sniffer 102; network sniffer 102 is communicatively connected with transformation engine 104; and transformation engine 104 is communicatively connected with AR/VR device 106.
In one embodiment, network device 101 includes a computing unit 107 which is connected to a network communication module 108 (identified as “net comm” in FIG. 1) that sends and/or receives network data from various networks. For example, network communication module 108 may send data to an external system through “Network 1” 109 and/or to the proposed system through network 2 103. Furthermore, network communication module 108 is communicatively connected with network sniffer 102.
In one embodiment, network sniffer 102 is a software layer that can operate within device 101. An example of network sniffer 102 is the tcpdump software in Linux®. In one embodiment, network sniffer 102 captures the network related data that is generated by network device 101. Examples of network data include sensor readings (e.g., temperature, energy, humidity, location, health data, accelerometer, etc.), protocols (e.g., transmission control protocol (TCP), user datagram protocol (UDP), Internet protocol (IP), hypertext transfer protocol (HTTP), etc.), network processes (e.g., ping, traceroute, acknowledgement, network topology, routing table, etc.), packets (e.g., frame, payload, header, etc.), etc.
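As a non-limiting illustration of the kind of output a sniffer such as network sniffer 102 might produce, the sketch below normalizes raw capture tuples into per-packet records. All field names and values here are hypothetical assumptions for illustration, not part of the claimed system or of any real capture tool's API.

```python
# Hypothetical sketch: normalize raw capture tuples (as a sniffer might
# emit them) into dictionaries suitable for downstream rule processing.
# Field names are illustrative assumptions.

def parse_capture(raw_records):
    """Turn (timestamp, protocol, src_ip, dst_ip, payload) tuples into dicts."""
    parsed = []
    for timestamp, protocol, src_ip, dst_ip, payload in raw_records:
        parsed.append({
            "timestamp": timestamp,      # seconds since capture start
            "protocol": protocol,        # e.g., "TCP", "UDP"
            "src_ip": src_ip,
            "dst_ip": dst_ip,
            "payload_len": len(payload), # bytes of payload captured
        })
    return parsed

records = parse_capture([
    (0.001, "TCP", "10.0.0.5", "10.0.0.9", b"\x01\x02\x03"),
    (0.004, "UDP", "10.0.0.7", "10.0.0.9", b"\x00" * 8),
])
```

Such records could then be forwarded to a transformation stage for rule filtering and enhancement.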
In one embodiment, network sniffer 102 is responsible for capturing the required network data and sending the captured network data to transformation engine 104 for further processing. The data transmission may occur over the same network or a different network (in the case of the same network, network 109 and network 103 are the same network) than the one used by network device 101 to send and receive data from third parties.
In one embodiment, transformation engine 104 includes a rules engine 110 connected to a rules database 111, an enhancement engine 112 connected to an enhancement database 113, a time translation engine 114 connected to a time translation database 115 and a holographic engine 116 connected to a holographic database 117. These elements are communicatively connected to each other as shown in FIG. 1.
In one embodiment, the network data captured by network sniffer 102 is sent to transformation engine 104 for processing and transformation. Transformation engine 104 is responsible for transforming the raw network data into holograms that are realistically overlaid on the physical space and/or the network devices (e.g., network device 101). Transformation engine 104 provides real-time spatiotemporal visualization and allows the user to participate and interact with the network in a highly intuitive fashion.
In one embodiment, the network data from network sniffer 102 is received by rules engine 110. Rules engine 110 processes the network data to identify the preferences of the user of the system from rules database 111. For example, one user may only wish to observe the temperature reading from a sensor. Other users may wish to visualize the network topology or the routing table of the communication network; still other users may want to visualize the packet structure or the angle of arrival of the packets in three-dimensional (3D) space. In one embodiment, the rules applied by rules engine 110 are user-selected.
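The rule application described above can be sketched as a simple predicate filter: each user-selected rule is a condition, and only records satisfying every condition are passed on. The rule representation below is an illustrative assumption, not the claimed implementation of rules engine 110.

```python
# Minimal sketch of rule-based filtering: user-defined predicates select
# which captured records proceed to visualization. Rule structure is an
# assumption for illustration.

def apply_rules(records, rules):
    """Keep only the records that satisfy every predicate in `rules`."""
    selected = []
    for record in records:
        if all(rule(record) for rule in rules):
            selected.append(record)
    return selected

# Example: a user who only wishes to visualize TCP traffic.
rules = [lambda r: r["protocol"] == "TCP"]
records = [
    {"protocol": "TCP", "src_ip": "10.0.0.5"},
    {"protocol": "UDP", "src_ip": "10.0.0.7"},
]
tcp_only = apply_rules(records, rules)
```

A rules database could then be a stored collection of such per-user predicates.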
In one embodiment, the network data that is filtered by rules engine 110 is fed to enhancement engine 112. Enhancement engine 112 communicates with database 113 or a third party (not shown) and is responsible for enhancing the network data with additional information or for inserting identifiers into the data for better hologram-to-data association. For example, a user may wish to see the direction of arrival of packets from different servers. In that case, enhancement engine 112 will communicate with third parties or database 113 to convert the IP address of the server to a geographical location and coordinate value. In the overall experience of the user, the system will overlay hologram packets arriving from the relevant directions. The output of enhancement engine 112 is the network data from the output of rules engine 110 as well as the enhanced data from the processing of enhancement engine 112. These data are entered into time translation engine 114.
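The IP-to-coordinates enhancement described above can be sketched with a local lookup table standing in for the enhancement database or a third-party geolocation service. The table contents and coordinates below are made up for illustration.

```python
# Sketch of the enhancement step: annotate each record with the
# geographical coordinates of its source server. GEO_TABLE stands in for
# an enhancement database or third-party service; all entries are
# illustrative assumptions.

GEO_TABLE = {
    "10.0.0.5": (40.71, -74.01),  # hypothetical server location
    "10.0.0.7": (51.51, -0.13),
}

def enhance(records, geo_table):
    """Add a 'coords' field resolved from the source IP (None if unknown)."""
    for record in records:
        record["coords"] = geo_table.get(record["src_ip"])
    return records

enhanced = enhance(
    [{"src_ip": "10.0.0.5"}, {"src_ip": "192.168.1.1"}],
    GEO_TABLE,
)
```

In a full system, the resolved coordinates would drive the direction from which hologram packets appear to arrive.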
In one embodiment, enhancement engine 112 stores a table, such as a data structure within enhancement database 113, that includes a mapping of the holograms to the network devices encompassed by the holograms, including the geographic locations of the network devices within the hologram if there are multiple network devices overlaid by the hologram.
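One way to picture the mapping table described above is a dictionary keyed by hologram identifier, where each entry lists the overlaid device(s) and their locations. Identifier strings, device names, and coordinates below are illustrative assumptions.

```python
# Sketch of a hologram-to-device mapping table: each hologram identifier
# maps to the network device(s) it overlays, with per-device locations for
# holograms that cover several devices. All values are hypothetical.

hologram_map = {
    "holo-001": [{"device": "switch-a", "location": (2.0, 0.5, 1.1)}],
    "holo-002": [
        {"device": "server-1", "location": (4.0, 0.5, 0.3)},
        {"device": "server-2", "location": (4.0, 0.5, 0.9)},
    ],
}

def devices_for_hologram(hologram_id, table):
    """Return the device names encompassed by a given hologram."""
    return [entry["device"] for entry in table.get(hologram_id, [])]
```

Such a table is what would later let a gesture on a hologram be resolved back to a concrete network device.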
In one embodiment, time translation engine 114 communicates with time translation database 115 and is responsible for adapting the high speed and volume of packet transmission of the network data according to human brain frame rate and user preferences. For example, a packet rate on the order of kilopackets per second or megapackets per second cannot be perceived by the human brain. Thus, a time translation between the computer speed of communication and human-based criteria needs to take place. Time translation engine 114 provides the additional 4th dimension (time) that enables the user to store/replay, and interact directly with, the time sequence of packets, flows, transmissions, and other network contexts. Time translation engine 114 communicates with time translation database 115, which holds user-centric criteria for the time translation. For example, a user may wish to see the direction of packet reception every 10 seconds but the sensor reading every 1 second.
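The adaptation described above can be sketched as rate-limited downsampling: events arriving at machine speed are thinned so each data type is emitted no faster than the user's chosen interval. The preference values mirror the example in the text (packet direction every 10 s, sensor readings every 1 s); the data shapes are assumptions for illustration.

```python
# Sketch of time translation: thin a machine-rate event stream so each data
# type is shown no faster than the user's preferred interval, with a
# human-perceivable cap (~120 fps) as the default. Structure is assumed.

def time_translate(events, intervals):
    """events: list of (timestamp_seconds, data_type), sorted by timestamp.
    intervals: data_type -> minimum seconds between emitted events."""
    last_emitted = {}
    emitted = []
    for timestamp, data_type in events:
        minimum = intervals.get(data_type, 1 / 120)  # fall back to ~120 fps
        if timestamp - last_emitted.get(data_type, float("-inf")) >= minimum:
            emitted.append((timestamp, data_type))
            last_emitted[data_type] = timestamp
    return emitted

# User-centric criteria mirroring the example in the text above.
prefs = {"packet_direction": 10.0, "sensor_reading": 1.0}
```

A store/replay feature could reuse the same function over a recorded event list with a slower "playback clock."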
In one embodiment, holographic engine 116 receives the processed data from time translation engine 114 and creates the holograms of the selected network data for visualization at AR/VR device 106. Holographic engine 116 is connected to a holographic database 117 that provides a selection of holograms, such as lines, graphics, numbers, etc. Holographic engine 116 also adapts the format of the hologram to meet the needs of AR/VR device 106.
In one embodiment, each hologram generated by holographic engine 116 is assigned an identifier that is passed to AR/VR device 106. In one embodiment, such identifiers are stored in a table (e.g., a table within holographic database 117) that maps each identifier to a designated hologram.
In one embodiment, AR/VR device 106 is connected to holographic engine 116 and receives the holograms through communication network 105. AR/VR device 106 includes an augmented engine 118 and a projector engine 119. Augmented engine 118 is configured to receive the holograms from transformation engine 104 and perform the spatial mapping for the overlay of the holograms onto the physical space. Augmented engine 118 may also send data for object detection to an external database and/or parties using different networks, such as network 120 (identified as “Network 4” in FIG. 1). On some occasions, in order to reduce overall latency, a caching server (“cache”) 121 may be used to minimize the costs of the object detection process.
In one embodiment, projector engine 119 is configured to project the holograms in the vicinity of the user.
In one embodiment, AR/VR device 106 is configured to identify a gesture made by a user to a projected hologram and to translate the identified gesture into a command. For example, a user may desire to block the reception of network packets from a network device. As a result, the user may move his/her hand in a “cut” motion or gesture toward the hologram encompassing the network device. In one embodiment, AR/VR device 106 stores a table mapping the detected motions to particular commands. In one embodiment, such a table is a data structure that is stored in a data storage device 122 (e.g., memory, disk unit) of AR/VR device 106.
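The gesture-to-command table described above can be sketched as a simple lookup. The gesture names and command strings below are illustrative assumptions; only the "cut"-blocks-packets pairing comes from the example in the text.

```python
# Sketch of a gesture-to-command table as an AR/VR device might store it.
# Names are hypothetical except the "cut" example from the text above.

GESTURE_COMMANDS = {
    "cut": "block_packets",       # "cut" motion blocks packet reception
    "tap": "show_details",        # assumed additional gestures
    "swipe_up": "replay_sequence",
}

def translate_gesture(gesture):
    """Translate a recognized gesture into a command, or None if unmapped."""
    return GESTURE_COMMANDS.get(gesture)
```

Unmapped gestures returning None lets the device simply ignore motions it does not recognize rather than issuing spurious commands.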
System 100 is not to be limited in scope to any one particular architecture. System 100 may include any number of network devices 101, network sniffers 102, networks 103, 105, 109, 120, transformation engines 104, AR/VR devices 106 and caches 121. Furthermore, system 100 may include fewer devices than depicted, such as utilizing only a single network. Furthermore, while FIG. 1 illustrates transformation engine 104 as being a separate device, some or all of the functionality of transformation engine 104, as discussed herein, may reside in network device 101 and/or AR/VR device 106.
In one embodiment, transformation engine 104 may be implemented as software components (e.g., application 204 of computing device 200 as discussed further below) executed by a processor of a computing device, including network device 101 and/or AR/VR device 106. A description of an embodiment of the hardware configuration of such a computing device is provided below in connection with FIG. 2.
Referring now to FIG. 2, FIG. 2 illustrates a hardware configuration of computing device 200 which is representative of a hardware environment for practicing the present invention. Referring to FIG. 2, computing device 200 has a processor 201 connected to various other components by system bus 202. An operating system 203 runs on processor 201 and provides control and coordinates the functions of the various components of FIG. 2. An application 204 in accordance with the principles of the present invention runs in conjunction with operating system 203 and provides calls to operating system 203, where the calls implement the various functions or services to be performed by application 204. Application 204 may include, for example, a program for overlaying network data on top of a physical space and/or network devices (e.g., servers, switches, routers, modems, etc.) in the physical world, such as via holograms, as discussed further below in connection with FIG. 3. Furthermore, application 204 may include a program for executing commands on network devices by the user interacting with the holograms overlaying the network devices and/or the physical space encompassing the network devices, as discussed further below in connection with FIG. 4.
Referring again to FIG. 2, read-only memory (“ROM”) 205 is connected to system bus 202 and includes a basic input/output system (“BIOS”) that controls certain basic functions of computing device 200. Random access memory (“RAM”) 206 and disk adapter 207 are also connected to system bus 202. It should be noted that software components including operating system 203 and application 204 may be loaded into RAM 206, which may be computing device's 200 main memory for execution. Disk adapter 207 may be an integrated drive electronics (“IDE”) adapter that communicates with a disk unit 208, e.g., a disk drive.
Computing device 200 may further include a communications adapter 209 connected to bus 202. Communications adapter 209 interconnects bus 202 with an outside network, thereby enabling computing device 200 to communicate with other devices.
Computing device 200 of FIG. 2 is not to be limited in scope to the elements depicted in FIG. 2 and may include fewer or additional elements than depicted in FIG. 2.
The present invention may be a system, a method, and/or a computer program product. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present invention.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non-exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present invention may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++ or the like, and conventional procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some embodiments, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present invention.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
These computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
As stated in the Background section, currently, network analyzers only provide a two-dimensional view of the critical monitored network data. That is, currently, network analyzers present the monitored data on a two-dimensional screen of a computer device, such as a laptop or desktop computer. As a result, the understanding of and interaction with the presented data are limited.
The principles of the present invention improve the understanding and interaction with the presented data by providing a three-dimensional or a four-dimensional view, such as by overlaying the network data on top of a physical space and/or the network devices (e.g., servers, switches, routers, modems, etc.) in the physical world, such as via holograms, as discussed below in connection with FIG. 3.
FIG. 3 is a flowchart of a method 300 for overlaying network data on top of a physical space and/or network devices (e.g., network device 101 of FIG. 1) in accordance with an embodiment of the present invention.
Referring to FIG. 3, in conjunction with FIGS. 1-2, in step 301, AR/VR device 106 recognizes a network device(s), such as network device 101, in the physical space. For example, a user wearing AR/VR device 106 looks toward a network device(s), such as network device 101, which may be one of the network devices of a datacenter. In one embodiment, AR/VR device 106 (augmented engine 118) scans the physical space and recognizes network device(s) 101.
In step 302, transformation engine 104, and more precisely rules engine 110, receives the identification of network device(s) 101 from AR/VR device 106, and rules are applied by rules engine 110. As previously discussed, such rules are used to determine which network data from the captured network data, as discussed below, is to be visualized by the user of AR/VR device 106, such as TCP/IP packets. For example, the rules applied to the identified network device (e.g., a server) may indicate that the user would like to view its TCP/IP packets.
In step 303, network sniffer 102 captures network critical data from the identified network device(s) 101. For example, network sniffer 102 may capture the TCP/IP packets transmitted and received as well as the IP addresses.
In step 304, transformation engine 104 receives the captured network data from network sniffer 102, and rules are applied by rules engine 110 to determine the network data to be visualized by the user of AR/VR device 106. For example, suppose that the user has selected to visualize holograms of the packets and their angle of arrival in three-dimensional space based on geographical coordinates. Thus, rules engine 110 selects the transmitted and received TCP/IP packets and their IP addresses.
In step 305, transformation engine 104 selects the network data from the captured network data to be visualized by the user of AR/VR device 106 based on applying the rules by rules engine 110.
In step 306, the data selected by rules engine 110 is transferred to enhancement engine 112 to enhance the selected network data with additional information (e.g., a geographical location, a name of a server vendor, an Internet service provider, identifiers for data-to-hologram association). For example, as discussed above, enhancement engine 112 may communicate with a third party or database 113 to convert the IP address to real geographical coordinates.
In step 307, enhancement engine 112 transfers the enhanced network data to time translation engine 114.
In step 308, time translation engine 114 adapts the speed of the transmission and/or volume of the enhanced network data according to the human brain frame rate and user preferences to generate processed network data.
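One way to realize the time translation of step 308 is to throttle the event stream so that no more items reach the viewer per second than a human can follow. This is a minimal sketch under assumptions: the 24-events-per-second ceiling is an assumed user preference, not a value taken from the specification.

```python
def adapt_to_frame_rate(events, max_per_second=24):
    """Keep at most max_per_second events from each one-second window.

    Each event is a (timestamp_seconds, payload) pair. Events beyond the
    per-window budget are dropped to match a human-perceivable rate.
    """
    kept, counts = [], {}
    for ts, payload in events:
        window = int(ts)
        if counts.get(window, 0) < max_per_second:
            counts[window] = counts.get(window, 0) + 1
            kept.append((ts, payload))
    return kept

# A burst of 100 packets inside one second is reduced to 24 events.
burst = [(i / 100.0, {"pkt": i}) for i in range(100)]
processed = adapt_to_frame_rate(burst)
```

Instead of dropping events, a variant could slow playback (stretching timestamps) so that no information is lost, at the cost of the display lagging real time.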
In step 309, time translation engine 114 transmits the processed network data to holographic engine 116.
In step 310, holographic engine 116 creates the holograms of the processed network data for visualization at AR/VR device 106. In one embodiment, holographic engine 116 provides a selection of holograms, such as lines, graphics, numbers, etc. Holographic engine 116 also adapts the format of the hologram to meet the needs of AR/VR device 106. For example, holographic engine 116 creates holograms to represent the packet information and enable real visualization in the physical space.
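The hologram creation of step 310 can be sketched as building a renderable descriptor per processed data item. The descriptor fields (shape, position, label) are illustrative assumptions; a real holographic engine would emit whatever format the target AR/VR device's rendering API requires.

```python
def create_hologram(item, shape="line"):
    """Build a hologram descriptor from a processed, enhanced data item."""
    return {
        "id": item["hologram_id"],          # identifier passed to the AR/VR device
        "shape": shape,                      # e.g., line, graphic, number
        "position": (item["lat"], item["lon"]),  # placement from geo coordinates
        "label": item["protocol"],
    }

holo = create_hologram(
    {"hologram_id": 1, "lat": 40.7128, "lon": -74.0060, "protocol": "TCP"}
)
```

Note that the identifier carried on each descriptor is what later lets a gesture on a hologram be traced back to a network device, as described in connection with FIG. 4.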
In step 311, holographic engine 116 transmits the holograms to AR/VR device 106, which receives the holograms.
In step 312, AR/VR device 106 overlays the holograms on top of the physical space encompassing the network device(s) and/or the network device(s).
With this system, the user is provided a four-dimensional ("4D") experience and a new way to interact with networks and networking devices. The user can more easily identify threats or understand network processes because all data is overlaid on the physical space, which the user can understand better than merely monitoring a screen.
Upon overlaying the holograms on top of the physical space encompassing the network device(s) and/or the network device(s), a user may interact with the holograms to execute commands on the appropriate network device(s) as discussed below in connection with FIG. 4.
FIG. 4 is a flowchart of a method 400 for executing commands on network devices (e.g., network device 101) by interacting with the holograms overlaying the network devices and/or the physical space encompassing the network devices in accordance with an embodiment of the present invention.
Referring to FIG. 4, in conjunction with FIGS. 1-3, in step 401, a user interacts with a hologram overlaying the physical space encompassing the network device(s) and/or the network device(s), such as via a gesture (e.g., a movement of a hand to express an idea or meaning).
In step 402, AR/VR device 106 identifies the gesture performed by the user of AR/VR device 106 in the hologram and translates the identified gesture into a command. For example, a user may desire to block the reception of network packets from a network device. As a result, the user may move his/her hand in a "cut" motion or gesture to the hologram encompassing the network device. In one embodiment, AR/VR device 106 stores a table mapping detected motions to particular commands. In one embodiment, such a table is a data structure that is stored in a data storage device 122 of AR/VR device 106.
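The gesture-to-command table of step 402 can be sketched as a simple mapping. The gesture names and command strings here are illustrative assumptions; only the "cut" gesture blocking packet reception is drawn from the example above.

```python
# Hypothetical table, standing in for the data structure held in
# data storage device 122: detected motion -> command to issue.
GESTURE_COMMANDS = {
    "cut": "block_reception",   # from the example: block incoming packets
    "tap": "show_details",      # assumed additional gesture
    "swipe_up": "allow_traffic",  # assumed additional gesture
}

def translate_gesture(gesture):
    """Translate a detected gesture into a command, or None if unrecognized."""
    return GESTURE_COMMANDS.get(gesture)

cmd = translate_gesture("cut")
```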
In step 403, transformation engine 104 receives from AR/VR device 106 the command issued by the user to be performed on the network device that is encompassed by the designated hologram.
In step404,transformation engine104 receives an identifier of the hologram interacted with by the user from AR/VR device106. In one embodiment, each hologram generated byholographic engine116 is assigned an identifier that is passed to AR/VR device106. Hence, when AR/VR device106 sends the command totransformation engine104 to be performed on the network device, AR/VR device106 sends the identifier of the hologram that encompasses that network device.
In step405, transformation engine104 (enhancement engine112) identifies the network device associated with the hologram (the hologram in which the user made a gesture) using the identifier of the hologram. In one embodiment,enhancement engine112 stores a table, such as a data structure, that includes a mapping of the holograms to the network devices encompassed by the holograms, including the geographic locations of the network devices within the hologram if there are multiple network devices overlaid by the hologram. In such a scenario,transformation engine104 may also receive the geographic location of the gesture thereby pinpointing the particular network device within the hologram that contains multiple network devices.
In step 406, transformation engine 104 issues the command to the network device (e.g., network device 101). In one embodiment, the command issued by transformation engine 104 takes into consideration the network data selected to be visualized by the user of AR/VR device 106. For example, the issued command may be to block the reception of a particular type of network packet that was selected to be visualized by the user of AR/VR device 106.
In step 407, the network device (e.g., network device 101) executes the command.
The present invention improves the technology or technical field involving network analyzers. As discussed above, network analyzers are instruments that measure the network parameters of networks. Network analyzers are used to help network designers and administrators to optimize the design of the network, detect system and device issues, improve system performance and monitor the system for security threats. For example, the network analyzer may be used to monitor the network to protect against malicious activity as well as monitor the performance of the components of the network so that the network can be designed for optimum performance. For instance, a network analyzer may be used to measure the amplitude and phase properties of a network. Currently, network analyzers only provide a two-dimensional view of the critical monitored network data. That is, currently, network analyzers present the monitored data on a two-dimensional screen of a computer device, such as a laptop or desktop computer. As a result, the understanding and interaction with the presented data is limited.
The present invention improves such technology by improving the understanding and interaction with the presented data by providing a three-dimensional or a four-dimensional view, such as by overlaying the network data on top of a physical space and/or the network devices (e.g., servers, switches, routers, modems, etc.) in the physical world, such as via holograms. For instance, embodiments of the present invention may create a four-dimensional ("4D") experience and a new way to interact with networks and networking devices. The user can more easily identify threats or understand network processes because all data is overlaid on the physical space, which the user can understand better than merely monitoring a screen. Additionally, in one embodiment, upon overlaying the holograms on top of the network device(s) and/or the physical space encompassing the network device(s), a user may interact with the holograms to execute commands on the appropriate network device(s).
The descriptions of the various embodiments of the present invention have been presented for purposes of illustration, but are not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application or technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.